Science.gov

Sample records for applied systems analysis

  1. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and applied to a vehicle as a case study. A previous study emphasized the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we have developed a guideline for analyzing the dynamical behavior of multibody systems, mainly for validation and verification of the realistic mathematical model and partly for the design of alternative optimum vehicle parameters.

  2. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits, where it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The SA method attempts to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog with the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement techniques such as HAZOP and FMEA by providing systematic procedures for identifying a class of potential problems that is not well covered by any other method.

  3. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  4. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating (MU) performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657

  5. Dynamical systems analysis applied to working memory data

    PubMed Central

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M.; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating (MU) performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
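The second-order linear differential equation used in the two records above can be read as a damped linear oscillator, x'' = ηx + ζx', where a significant negative frequency parameter η implies cyclic behavior. The sketch below is a minimal illustration with invented coefficients and simple Euler integration; it is not the authors' estimation procedure (which used time-delay embedding, B-spline imputation, and multilevel modeling), but it shows why a negative η produces oscillation with period near 2π/√(-η):

```python
import math

DT = 0.001  # integration step

def simulate_lde(eta, zeta, x0=1.0, v0=0.0, steps=20000):
    """Integrate x'' = eta*x + zeta*x' with semi-implicit Euler."""
    x, v = x0, v0
    xs = []
    for _ in range(steps):
        v += (eta * x + zeta * v) * DT
        x += v * DT
        xs.append(x)
    return xs

# Negative frequency parameter eta -> oscillation; zeta < 0 -> damping.
eta, zeta = -4.0, -0.1
xs = simulate_lde(eta, zeta)

# Estimate the period from successive downward zero crossings.
crossings = [i for i in range(1, len(xs)) if xs[i - 1] > 0 >= xs[i]]
period = (crossings[-1] - crossings[0]) / (len(crossings) - 1) * DT
print(round(period, 2))   # close to 2*pi/sqrt(-eta) = pi for eta = -4
```

A less negative η would lengthen the period, which is the "slower frequency of oscillation" the multilevel analysis associates with higher initial performance.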

  6. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  7. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  8. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
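The system-level gradient iterations described above can be caricatured in a few lines. SSA proper assembles the sensitivity derivatives semi-analytically from disciplinary sensitivities; the sketch below substitutes plain finite differences and a toy, invented "dry weight" surrogate of two design variables, purely to show the role the gradients play in the system-level iterations:

```python
def finite_difference_gradient(f, x, h=1e-6):
    """Forward-difference sensitivity derivatives, standing in for
    SSA's analytically assembled system-level derivatives."""
    fx = f(x)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp) - fx) / h)
    return g

def dry_weight(x):
    """Toy quadratic surrogate for vehicle dry weight (invented numbers)."""
    return (x[0] - 2.0) ** 2 + 2.0 * (x[1] + 1.0) ** 2 + 50.0

x = [0.0, 0.0]
for _ in range(200):                        # system-level iterations
    g = finite_difference_gradient(dry_weight, x)
    x = [xi - 0.05 * gi for xi, gi in zip(x, g)]
print([round(v, 2) for v in x])             # approaches the minimiser [2, -1]
```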

  9. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
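Crisp-set QCA of the kind applied above starts from a truth table of binary condition configurations and checks which configurations are consistently linked to the outcome. The sketch below uses wholly invented cases and condition names (the study's real data covered 19 systems with different conditions); it shows only the truth-table and consistency step, omitting the Boolean minimization that full QCA performs afterwards:

```python
from collections import defaultdict

# Hypothetical cases: each surveillance system coded with crisp (0/1)
# conditions and an outcome ("timely detection"). All values invented.
cases = [
    # (non_clinical_data, automated, multiple_syndromes) -> outcome
    ((1, 1, 1), 1),
    ((1, 1, 0), 1),
    ((1, 0, 1), 0),
    ((0, 1, 1), 0),
    ((1, 1, 0), 1),
    ((0, 0, 0), 0),
]

# Build the truth table: configuration -> outcomes observed.
table = defaultdict(list)
for config, outcome in cases:
    table[config].append(outcome)

# A configuration is treated as sufficient when it is fully consistent,
# i.e. every case sharing it shows the outcome.
sufficient = [c for c, outs in table.items() if all(outs)]
print(sorted(sufficient))
```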

  10. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  11. A Meta-Analysis of Instructional Systems Applied in Science Teaching.

    ERIC Educational Resources Information Center

    Willett, John B.; And Others

    1983-01-01

    Reports results of the meta-analysis of 130 research studies on effects of instructional systems used in science teaching. Studies coded gave rise to 341 effect sizes. Systems examined included audio-tutorial, computer-linked, individualized, and media-based instructional systems, personalized system of instruction, mastery learning, self-directed…

  12. Thermoeconomics applied to HVAC systems

    SciTech Connect

    Tozer, R.; Valero, A.; Lozano, M.A.

    1999-07-01

    Thermoeconomics uses thermodynamics in conjunction with economic analysis to achieve improved cost benefit and quality in design. The thermodynamic analysis uses the second law and availability or exergy, which is a measure of the usefulness of energy. This paper uses exergy rather than energy in the thermoeconomic analysis and system optimization. The objective of thermoeconomics is to minimize a cost function, which is usually capital or life-cycle costs, which are expressed in terms of thermodynamic variables of the system. This will establish the most cost-effective design parameters for the specific design configuration analyzed. This paper presents the methods for applying a thermoeconomic analysis to several HVAC systems. The design conditions are analyzed, providing detailed exergy costing related to the plant investment and operating costs of the base design. By reviewing these, the most appropriate system variables are selected and the system cost is optimized, therefore achieving important life-cycle cost and capital investment savings.
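The exergy accounting underlying such a thermoeconomic analysis rests on the specific flow exergy, ex = (h - h0) - T0(s - s0), evaluated against a dead state. A small sketch with approximate water-property values (illustrative numbers, not from the paper) shows why a chilled-water stream in an HVAC plant carries positive exergy even though its enthalpy is below the dead state:

```python
def flow_exergy(h, s, h0, s0, T0):
    """Specific flow exergy [kJ/kg] relative to a dead state (h0, s0)
    at ambient temperature T0 [K], neglecting kinetic/potential terms."""
    return (h - h0) - T0 * (s - s0)

# Chilled water at ~5 C vs. a 25 C dead state (approximate steam-table
# values: h in kJ/kg, s in kJ/(kg K)).
ex = flow_exergy(h=21.0, s=0.0763, h0=104.9, s0=0.3672, T0=298.15)
print(round(ex, 1))   # positive: the cold stream is useful for cooling
```

It is this exergy, rather than energy, that thermoeconomic costing allocates capital and operating costs to.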

  13. Model-Based Systems Engineering With the Architecture Analysis and Design Language (AADL) Applied to NASA Mission Operations

    NASA Technical Reports Server (NTRS)

    Munoz Fernandez, Michela Miche

    2014-01-01

    The potential of Model-Based Systems Engineering (MBSE) using the Architecture Analysis and Design Language (AADL) applied to space systems will be described. AADL modeling is applicable to real-time embedded systems, the types of systems NASA builds. A case study with the Juno mission to Jupiter showcases how this work would enable future missions to benefit from using these models throughout their life cycle, from design to flight operations.

  14. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  15. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

    Numerical models are invaluable for predicting water fluxes in the vadose zone, and more specifically in Soil-Vegetation-Atmosphere (SVA) systems. Such simulations require robust models and representative soil hydraulic parameters. Calibration of unsaturated hydraulic properties is known to be a difficult optimization problem due to the high non-linearity of the water flow equations. Robust methods are therefore needed to prevent the optimization process from converging to non-optimal parameters. Evolutionary algorithms, and specifically genetic algorithms (GAs), are well suited to such complex parameter optimization problems. Additionally, GAs offer the opportunity to assess the confidence in the hydraulic parameter estimates, because of the large number of model realizations. The SVA system in this study is a pine stand on a heterogeneous sandy soil (podzol) in the Campine region in the north of Belgium. Throughfall, other meteorological data, and water contents at different soil depths were recorded at a daily time step over one year in two lysimeters. The water table level, which varies between 95 and 170 cm, was recorded at 0.5-hour intervals. The leaf area index was also measured at selected times during the year in order to evaluate the energy reaching the soil and to deduce the potential evaporation. Based on the profile description, five soil layers were distinguished in the podzol. Two models were used for simulating water fluxes: (i) a mechanistic model, HYDRUS-1D, which solves the Richards equation, and (ii) a compartmental model, which treats the soil profile as a bucket into which water flows until its maximum capacity is reached. A global sensitivity analysis (Morris' one-at-a-time sensitivity analysis) was run prior to the calibration, in order to check the sensitivity in the chosen parameter search space.
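Morris-style one-at-a-time screening, as mentioned above, ranks parameters by the mean absolute elementary effect (often written mu*). The sketch below uses a simplified radial OAT design rather than the classic Morris trajectories, and a toy linear model with invented coefficients, purely to illustrate how the ranking emerges:

```python
import random

def morris_mu_star(model, k, r=20, delta=0.5, seed=1):
    """Simplified one-at-a-time screening on [0, 1]^k: for r random base
    points, perturb each factor by delta and record the elementary effect.
    Returns mu* (mean absolute effect) per factor."""
    rng = random.Random(seed)
    effects = [[] for _ in range(k)]
    for _ in range(r):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        y0 = model(x)
        for i in range(k):
            xp = list(x)
            xp[i] += delta
            effects[i].append((model(xp) - y0) / delta)
    return [sum(abs(e) for e in es) / len(es) for es in effects]

# Toy model: factor 0 strong, factor 1 weak, factor 2 inactive.
mu_star = morris_mu_star(
    lambda x: 10.0 * x[0] + 1.0 * x[1] + 0.0 * x[2], k=3)
print([round(m, 1) for m in mu_star])   # ranks factor 0 >> factor 1 > factor 2
```

In a calibration setting, factors with negligible mu* can be fixed, shrinking the search space the genetic algorithm must explore.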

  16. Energy analysis of facade-integrated photovoltaic systems applied to UAE commercial buildings

    SciTech Connect

    Radhi, Hassan

    2010-12-15

    Developments in the design and manufacture of photovoltaic cells have recently attracted growing attention in the UAE. At present, the embodied energy pay-back time (EPBT) is the criterion used for comparing the viability of such technology against other forms. However, the impact of PV technology on the thermal performance of buildings is not considered at the time of EPBT estimation. If additional energy savings gained over the PV system life are also included, the total EPBT could be shorter. This paper explores the variation of the total energy of building-integrated photovoltaic (BiPV) systems used as wall cladding in the UAE commercial sector and shows that the ratio between PV output and energy savings due to PV panels is within the range of 1:3-1:4. The results indicate that for the southern and western facades in the UAE, the embodied energy pay-back time for the photovoltaic system is within the range of 12-13 years. When reductions in operational energy are considered, the pay-back time is reduced to 3.0-3.2 years. This study concludes that the reduction in operational energy due to PV panels represents an important factor in the estimation of EPBT. (author)
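The pay-back arithmetic above follows directly from the definition EPBT = embodied energy / annual energy gain. With illustrative numbers (invented, not the paper's data) chosen to match the reported 1:3 output-to-savings ratio, the shortening from roughly 12.5 to roughly 3.1 years can be reproduced:

```python
def epbt(embodied_energy, annual_pv_output, annual_operational_savings=0.0):
    """Energy pay-back time in years: embodied energy divided by the
    annual energy gain (PV output plus any operational savings)."""
    return embodied_energy / (annual_pv_output + annual_operational_savings)

# Hypothetical facade: 2500 kWh/m2 embodied energy, 200 kWh/m2/yr PV
# output, and operational savings of 3x the output (the 1:3 ratio).
e_emb, pv_out = 2500.0, 200.0
print(round(epbt(e_emb, pv_out), 1))                 # output alone: 12.5
print(round(epbt(e_emb, pv_out, 3.0 * pv_out), 1))   # with savings: 3.1
```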

  17. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, and of the efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  18. Characteristic analysis of the lower limb muscular strength training system applied with MR dampers.

    PubMed

    Yu, Chang Ho; Piao, Young Jun; Kim, Kyung; Kwon, Tae Kyu

    2014-01-01

    A new training system that can adjust training intensity and indicate the center of pressure of a subject was proposed by applying controlled electric current to a magnetorheological (MR) damper. Experimental studies of muscular activity in the lower extremities were performed during maintaining and moving exercises, carried out on an unstable platform equipped with MR dampers and displayed on a monitor. The electromyography (EMG) signals of eight muscles of the lower extremities were recorded and analyzed in the time and frequency domains. The muscles investigated were the rectus femoris (RF), biceps femoris (BF), tensor fasciae latae (TFL), vastus lateralis (VL), vastus medialis (VM), gastrocnemius (Ga), tibialis anterior (TA), and soleus (So). Differences in muscular activity across the four moving exercises were observed in our experimental results. The rate of increase of muscular activity was affected by the condition of the unstable platform with MR dampers, suggesting that different moving exercises could selectively train each muscle with varying intensities. These findings also suggest that this training system can improve the ability of postural balance.

  19. Technical, economic and environmental analysis of a MSW kerbside separate collection system applied to small communities.

    PubMed

    De Feo, G; Malvano, C

    2012-10-01

    The main aim of this study was to evaluate the costs and environmental impacts induced by a fixed model of MSW kerbside separate collection system for communities of up to 10,000 inhabitants, in order to evaluate the convenience for smaller municipalities of uniting to form more economically and environmentally sound systems. This topic is important not only because of the large number of small municipalities (e.g. in Italy, 72% of municipalities have fewer than 5000 inhabitants) but also because separate collection systems are typically designed to take into account only technical and economic aspects, a practice that is not acceptable in the light of the sustainable development paradigm. In economic terms, between 1000 and 4000 inhabitants, the annual per capita cost for vehicles and personnel decreased, with a maximum of approximately 180 €/inhabitant/year; from 5000 up to 10,000 inhabitants, the annual per capita cost was practically constant at about 80 €/inhabitant/year. For municipalities of fewer than 5000 inhabitants, aggregation is always advantageous from an economic point of view. The environmental impacts were calculated by means of the Life Cycle Assessment tool SimaPro 7.1, while the economic-environmental convenience was evaluated by combining, in a simple multicriteria analysis, the annual total per capita cost (€/inhabitant/year) and the annual total per capita environmental impact (kEco-indicator point/inhabitant/year), giving the same importance to each criterion. The analysis was performed by means of the Paired Comparison Technique using the Simple Additive Weighting method. The economic and environmental convenience of aggregation diminishes with the size of the municipalities: for fewer than 4000 inhabitants, aggregation was almost always advantageous (91.7%), while for 5000 inhabitants or more it was convenient in only 33.3% of cases.
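The Simple Additive Weighting step used above combines normalized criterion scores under equal weights. A minimal sketch with invented option names and values (min-max normalisation; lower cost and lower environmental impact are both preferred) illustrates the mechanics:

```python
def saw_score(cost, impact, costs, impacts, w_cost=0.5, w_impact=0.5):
    """Simple Additive Weighting with min-max normalisation, where
    lower cost and lower impact both score higher."""
    def norm(v, vs):
        lo, hi = min(vs), max(vs)
        return (hi - v) / (hi - lo)
    return w_cost * norm(cost, costs) + w_impact * norm(impact, impacts)

# Hypothetical options: (cost in euro/inhabitant/yr, kEco-pt/inhabitant/yr)
options = {"separate": (80.0, 1.2),
           "aggregated": (60.0, 0.9),
           "status quo": (100.0, 1.5)}
costs = [c for c, _ in options.values()]
impacts = [i for _, i in options.values()]
best = max(options, key=lambda k: saw_score(*options[k], costs, impacts))
print(best)   # "aggregated" dominates on both criteria here
```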

  20. Underdetermined system theory applied to qualitative analysis of response caused by attenuating plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1989-05-01

    A qualitative analysis of the mechanical response of rate-dependent media caused by a one-dimensional plane smooth wave front and by a continuous wave front attenuating in the media is performed using an underdetermined system of nonlinear partial differential equations. The analysis reveals that the smooth strain, particle velocity, and stress profiles of the smooth wave front are not similar, and that the wave front is composed of several partial waves having different properties, where a property is represented by a set of strain rate, acceleration, and stress rate. The wave front derived here from the analysis is composed of four different partial waves. The front of the wave front is necessarily a contraction wave in which strain, particle velocity, and stress increase with time, while the rear is a rarefaction wave in which they all decrease with time. Between these two wave fronts there are two remaining wave fronts, which we call mesocontraction waves I and II. Wave front I is a wave in which stress decreases notwithstanding the increase in strain and particle velocity with time; it is followed by wave front II, in which particle velocity and stress decrease with time in spite of the increase in strain. The continuous wave front having continuous and nonsmooth profiles of strain, particle velocity, and stress can also be composed of four waves. These waves possess the same properties as the corresponding waves in the smooth wave front mentioned above. The velocities at the three boundaries between the waves are discontinuous. Therefore, these four wave fronts are independent waves, just like a shock wave and a rarefaction wave. Specifically, the front wave, i.e., a contraction wave front, is being outrun by a second wave front, the second is being outrun by a third wave front, and the third is being outrun by a fourth wave front, i.e., a rarefaction wave. We call the second wave front degenerate contraction wave I.

  1. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Pickl, Stefan

    2010-11-01

    The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. With a System Dynamics approach these effects can be examined. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not easy to find a balance between economic growth and emission reduction. Our System Dynamics model simulation suggests that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
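The time-discrete stock-flow feedback the abstract describes can be caricatured in a few lines: emissions above the cap create permit scarcity, scarcity raises the permit price, and a higher price drives abatement. All coefficients below are invented; this is a sketch of the feedback structure, not the authors' model:

```python
def cap_and_trade(initial_emissions, cap, years,
                  abatement_rate=0.05, price_sensitivity=0.02):
    """Minimal time-discrete stock-flow sketch of a cap-and-trade loop:
    excess emissions raise the permit price, which drives abatement."""
    emissions, price = initial_emissions, 10.0
    trajectory = []
    for _ in range(years):
        excess = max(emissions - cap, 0.0)
        price *= (1.0 + price_sensitivity * excess)     # scarcity raises price
        emissions *= (1.0 - abatement_rate * min(price / 100.0, 1.0))
        trajectory.append(emissions)
    return trajectory

traj = cap_and_trade(initial_emissions=100.0, cap=80.0, years=30)
print(traj[-1] < traj[0])   # True: emissions decline toward the cap
```

The abatement term is also where the model's growth-inhibiting effect of a high permit price would enter in a fuller formulation.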

  2. Comparison of complexity measures using two complex system analysis methods applied to the epileptic ECoG

    NASA Astrophysics Data System (ADS)

    Janjarasjitt, Suparerk; Loparo, Kenneth A.

    2013-10-01

    Complex system analysis has been widely applied to examine the characteristics of an electroencephalogram (EEG) in health and disease, as well as the dynamics of the brain. In this study, two complexity measures, the correlation dimension and the spectral exponent, are applied to electrocorticogram (ECoG) data from subjects with epilepsy obtained during different states (seizure and non-seizure) and from different brain regions, and the complexities of the ECoG data are examined across states and regions. From the computational results, the spectral exponent obtained from the wavelet-based fractal analysis is observed to provide information complementary to the correlation dimension derived from the nonlinear dynamical-systems analysis. ECoG data obtained during seizure activity have smoother temporal patterns and are less complex than data obtained during non-seizure activity. In addition, significant differences between these two ECoG complexity measures exist when applied to ECoG data obtained from different brain regions of subjects with epilepsy.
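The spectral exponent mentioned above is, in its simplest reading, the slope beta of log power versus log frequency (power ~ f^beta); smoother, "redder" signals have a more negative beta. A brute-force periodogram sketch on synthetic signals (plain DFT and least-squares slope; the paper used a wavelet-based estimator) illustrates the idea:

```python
import cmath
import math
import random
from itertools import accumulate

def spectral_exponent(x):
    """Estimate beta from the least-squares slope of log-power
    vs log-frequency of a naive DFT periodogram."""
    n = len(x)
    pts = []
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        pts.append((math.log(k / n), math.log(abs(s) ** 2 / n)))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts)
            / sum((a - mx) ** 2 for a, _ in pts))

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(128)]   # white: beta near 0
walk = list(accumulate(noise))                      # Brownian-like: beta < 0
b_noise = spectral_exponent(noise)
b_walk = spectral_exponent(walk)
print(b_noise > b_walk)   # True: the smoother walk has the steeper slope
```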

  3. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care

    PubMed Central

    McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Introduction: Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested “guiding tools” based on human factors principles. Methods: Mixed-methods development of guiding tools (Personal Booklet—to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad—to guide a team-based systems analysis; and a written Report Format) by a multiprofessional “expert” group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaire, telephone interviews, and thematic analysis of SEA reports. Results: Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Discussion: Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement. PMID:27583996

  4. Course Modularization Applied: The Interface System and Its Implications For Sequence Control and Data Analysis.

    ERIC Educational Resources Information Center

    Schneider, E. W.

    The Interface System is a comprehensive method for developing and managing computer-assisted instructional courses or computer-managed instructional courses composed of sets of instructional modules. Each module is defined by one or more behavioral objectives and by a list of prerequisite modules that must be completed successfully before the…

  5. Matrix effects in applying mono- and polyclonal ELISA systems to the analysis of weathered oils in contaminated soil.

    PubMed

    Pollard, S J T; Farmer, J G; Knight, D M; Young, P J

    2002-01-01

    Commercial mono- and polyclonal enzyme-linked immunosorbent assay (ELISA) systems were applied to the on-site analysis of weathered hydrocarbon-contaminated soils at a former integrated steelworks. Comparisons were made between concentrations of solvent extractable matter (SEM) determined gravimetrically by Soxhlet (dichloromethane) extraction and those estimated immunologically by ELISA determination over a concentration range of 2000-330,000 mg SEM/kg soil dry weight. Both ELISA systems under-reported for the more weathered soil samples. Results suggest this is due to matrix effects in the sample rather than any inherent bias in the ELISA systems, and it is concluded that, for weathered hydrocarbons typical of steelworks and coke production sites, the use of ELISA requires careful consideration as a field technique. Consideration of the target analyte relative to the composition of the hydrocarbon waste encountered appears critical.

  6. Comparing Waste-to-Energy technologies by applying energy system analysis.

    PubMed

    Münster, Marie; Lund, Henrik

    2010-07-01

    Even when policies of waste prevention, re-use and recycling are prioritised, a fraction of waste will still be left which can be used for energy recovery. This article asks the question: How to utilise waste for energy in the best way seen from an energy system perspective? Eight different Waste-to-Energy technologies are compared with a focus on fuel efficiency, CO(2) reductions and costs. The comparison is carried out by conducting detailed energy system analyses of the present as well as a potential future Danish energy system with a large share of combined heat and power as well as wind power. The study shows potential of using waste for the production of transport fuels. Biogas and thermal gasification technologies are hence interesting alternatives to waste incineration and it is recommended to support the use of biogas based on manure and organic waste. It is also recommended to support research into gasification of waste without the addition of coal and biomass. Together the two solutions may contribute to alternate use of one third of the waste which is currently incinerated. The remaining fractions should still be incinerated with priority to combined heat and power plants with high electric efficiency. PMID:19700298

  8. Underdetermined system theory applied to quantitative analysis of responses caused by unsteady smooth-plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1993-01-01

    The mechanical responses of rate-dependent media caused by unsteady smooth-plane waves are quantitatively analyzed by an underdetermined system of equations without using any constitutive relation of the media; that is, by using the particle velocity field expressed by an algebraic equation that is derived from the mass conservation equation, and the stress field expressed by an algebraic equation that is derived from the momentum conservation equation. First of all, this approach for analyzing unsteady wave motion is justified by the verification of various inferences such as the existence of the nonindependent elementary waves by Sano [J. Appl. Phys. 65, 3857 (1989)] and the degradation process by Sano [J. Appl. Phys. 67, 4072 (1990)]. Second, the situation under which a spike arises in particle velocity-time and stress-time profiles, and the reason for its arising, are clarified. Third, the influence of the spike on stress-particle velocity and stress-strain paths is examined. The spike induced in the profiles by a growing wave greatly influences the paths near the impacted surface. Finally, calculated particle velocity-time profiles are compared with experimental data.
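
    The constitutive-relation-free idea above can be illustrated with a small numerical sketch. Assuming Lagrangian coordinates and the sign convention rho0 * du/dt = d(sigma)/dh, a stress field can be recovered from measured particle-velocity histories alone; the function name and data below are hypothetical, not the paper's.

```python
import numpy as np

def stress_from_velocity(u, h, t, rho0, sigma_surface=0.0):
    """Recover a stress field from particle-velocity histories.

    u[i, j]: particle velocity at Lagrangian position h[i], time t[j].
    Uses only momentum conservation, rho0 * du/dt = d(sigma)/dh
    (sign convention assumed), so no constitutive relation is needed.
    """
    dudt = np.gradient(u, t, axis=1)   # material acceleration
    integrand = rho0 * dudt
    sigma = np.empty_like(u)
    sigma[0] = sigma_surface           # known stress history at h[0]
    for i in range(1, len(h)):         # trapezoidal integration in h
        sigma[i] = sigma[i - 1] + 0.5 * (integrand[i] + integrand[i - 1]) * (h[i] - h[i - 1])
    return sigma
```

    As a sanity check, a uniformly accelerating slab (u = a*t everywhere) reduces to the linear profile sigma = sigma_surface + rho0 * a * h.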

  9. NextGen Brain Microdialysis: Applying Modern Metabolomics Technology to the Analysis of Extracellular Fluid in the Central Nervous System

    PubMed Central

    Kao, Chi-Ya; Anderzhanova, Elmira; Asara, John M.; Wotjak, Carsten T.; Turck, Christoph W.

    2015-01-01

    Microdialysis is a powerful method for in vivo neurochemical analyses. It allows fluid sampling in a dynamic manner in specific brain regions over an extended period of time. A particular focus has been the neurochemical analysis of extracellular fluids to explore central nervous system functions. Brain microdialysis recovers neurotransmitters, low-molecular-weight neuromodulators and neuropeptides of special interest when studying behavior and drug effects. Other small molecules, such as central metabolites, are typically not assessed despite their potential to yield important information related to brain metabolism and activity in selected brain regions. We have implemented a liquid chromatography online mass spectrometry metabolomics platform for an expanded analysis of mouse brain microdialysates. The method is sensitive and delivers information for a far greater number of analytes than commonly used electrochemical and fluorescent detection or biochemical assays. The metabolomics platform was applied to the analysis of microdialysates in a foot shock-induced mouse model of posttraumatic stress disorder (PTSD). The rich metabolite data information was then used to delineate affected prefrontal molecular pathways that reflect individual susceptibility for developing PTSD-like symptoms. We demonstrate that hypothesis-free metabolomics can be adapted to the analysis of microdialysates for the discovery of small molecules with functional significance. PMID:27602357

  11. Frequency Domain Analysis of Beat-Less Control Method for Converter-Inverter Driving Systems Applied to AC Electric Cars

    NASA Astrophysics Data System (ADS)

    Kimura, Akira

    In inverter-converter driving systems for AC electric cars, the DC input voltage of an inverter contains a ripple component with a frequency that is twice as high as the line voltage frequency, because of a single-phase converter. The ripple component of the inverter input voltage causes pulsations on torques and currents of driving motors. To decrease the pulsations, a beat-less control method, which modifies a slip frequency depending on the ripple component, is applied to the inverter control. In the present paper, the beat-less control method was analyzed in the frequency domain. In the first step of the analysis, transfer functions, which revealed the relationship among the ripple component of the inverter input voltage, the slip frequency, the motor torque pulsation and the current pulsation, were derived with a synchronous rotating model of induction motors. An analysis model of the beat-less control method was then constructed using the transfer functions. The optimal setting of the control method was obtained according to the analysis model. The transfer functions and the analysis model were verified through simulations.
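
    The origin of the ripple described above can be checked numerically: an ideally rectified single-phase line voltage |sin(2*pi*f*t)| has its dominant AC component at exactly twice the line frequency, which is the component the beat-less control must compensate. A minimal sketch (50 Hz line assumed; not the paper's model):

```python
import numpy as np

fs = 10_000.0                     # sampling rate, Hz
f_line = 50.0                     # single-phase line frequency, Hz
t = np.arange(10_000) / fs        # one-second window -> 1 Hz FFT bins
v_dc = np.abs(np.sin(2 * np.pi * f_line * t))   # ideal full-wave rectification

# locate the dominant ripple component of the DC-link voltage
spectrum = np.abs(np.fft.rfft(v_dc - v_dc.mean()))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
ripple_freq = freqs[np.argmax(spectrum)]
```

    ripple_freq comes out at 2 * f_line = 100 Hz; the higher harmonics at 200 Hz, 300 Hz, ... fall off quickly, which is why the ripple is well described by a single beat component.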

  12. Simulating Nationwide Pandemics: Applying the Multi-scale Epidemiologic Simulation and Analysis System to Human Infectious Diseases

    SciTech Connect

    Dombroski, M; Melius, C; Edmunds, T; Banks, L E; Bates, T; Wheeler, R

    2008-09-24

    This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and enables a broader array of stochastic analyses of model runs to be conducted because of those computational advantages. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, elderly, and stay at home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally, (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people, or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. 
There are several limitations of the methodology that should be explored in future work.
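
    The split between early burn-out and sustained spread reported above is a generic property of stochastic epidemic models. The sketch below is not the MESA system; it is a minimal discrete-time chain-binomial SIR model that reproduces the same qualitative branching between extinction and large outbreaks.

```python
import numpy as np

def chain_binomial_sir(N, I0, beta, gamma, steps, rng):
    """One stochastic SIR run; returns the total number ever infected."""
    S, I, R = N - I0, I0, 0
    for _ in range(steps):
        if I == 0:
            break                               # outbreak has burned out
        p_inf = 1.0 - np.exp(-beta * I / N)     # per-susceptible infection prob.
        p_rec = 1.0 - np.exp(-gamma)            # per-infective recovery prob.
        new_inf = rng.binomial(S, p_inf)
        new_rec = rng.binomial(I, p_rec)
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    return N - S                                # cumulative infections

rng = np.random.default_rng(0)
sizes = [chain_binomial_sir(N=10_000, I0=1, beta=0.4, gamma=0.2,
                            steps=500, rng=rng) for _ in range(200)]
```

    With R0 = beta/gamma = 2 and a single index case, runs tend to split into a cluster of early extinctions (a handful of cases) and a cluster of large outbreaks, mirroring outcomes (1) and (2)/(3) in the abstract.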

  13. Conversation Analysis and Applied Linguistics.

    ERIC Educational Resources Information Center

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  14. Applied mathematics of chaotic systems

    SciTech Connect

    Jen, E.; Alber, M.; Camassa, R.; Choi, W.; Crutchfield, J.; Holm, D.; Kovacic, G.; Marsden, J.

    1996-07-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objectives of the project were to develop new mathematical techniques for describing chaotic systems and for reexpressing them in forms that can be solved analytically and computationally. The authors focused on global bifurcation analysis of rigid body motion in an ideal incompressible fluid and on an analytical technique for the exact solution of nonlinear cellular automata. For rigid-body motion, they investigated a new completely integrable partial differential equation (PDE) representing model motion of fronts in nematic crystals and studied perturbations of the integrable PDE. For cellular automata with multiple domain structures, the work has included: (1) identification of the associated set of conserved quantities for each type of domain; (2) use of the conserved quantities to construct isomorphism between the nonlinear system and a linear template; and (3) use of exact solvability methods to characterize detailed structure of equilibrium states and to derive bounds for maximal transience times.

  15. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    NASA Astrophysics Data System (ADS)

    Beraud, H.; Barroca, B.; Hubert, G.

    2012-12-01

    A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e. enabling it to maintain or recover acceptable operating levels after flooding, is primordial. To achieve this, we must understand how the system works in order to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the interest of this type of method, and the limits of its use, for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application on a study site. 1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  16. Economic impacts of bio-refinery and resource cascading systems: an applied general equilibrium analysis for Poland.

    PubMed

    Ignaciuk, Adriana M; Sanders, Johan

    2007-12-01

    Due to more stringent energy and climate policies, it is expected that many traditional chemicals will be replaced by their biomass-based substitutes, bio-chemicals. These innovations, however, can influence land allocation since the demand for land dedicated to specific crops might increase. Moreover, it can have an influence on traditional agricultural production. In this paper, we use an applied general equilibrium framework, in which we include two different bio-refinery processes and incorporate so-called cascading mechanisms. The bio-refinery processes use grass, as one of the major inputs, to produce bio-nylon and propane-diol (1,3PDO) to substitute currently produced fossil fuel-based nylon and ethane-diol. We examine the impact of specific climate policies on the bioelectricity share in total electricity production, land allocation, and production quantities and prices of selected commodities. The novel technologies become competitive with an increased stringency of climate policies. This switch, however, does not induce a higher share of bioelectricity. The cascade does stimulate the production of bioelectricity, but it induces more of a shift in inputs in the bioelectricity sector (from biomass to the cascaded bio-nylon and 1,3PDO) than an increase in production level of bioelectricity. We conclude that dedicated biomass crops will remain the main option for bioelectricity production: the contribution of the biomass systems remains limited. Moreover, the bioelectricity sector loses the competition for land for biomass production to bio-refineries. PMID:17924388

  17. A mathematical method for the 3D analysis of rotating deformable systems applied on lumen-forming MDCK cell aggregates.

    PubMed

    Marmaras, Anastasios; Berge, Ulrich; Ferrari, Aldo; Kurtcuoglu, Vartan; Poulikakos, Dimos; Kroschewski, Ruth

    2010-04-01

    Cell motility contributes to the formation of organs and tissues, into which multiple cells self-organize. However such mammalian cellular motilities are not characterized in a quantitative manner and the systemic consequences are thus unknown. A mathematical tool to decipher cell motility, accounting for changes in cell shape, within a three-dimensional (3D) cell system was missing. We report here such a tool, usable on segmented images reporting the outline of clusters (cells) and allowing the time-resolved 3D analysis of circular motility of these as parts of a system (cell aggregate). Our method can analyze circular motility in sub-cellular, cellular, multi-cellular, and also non-cellular systems for which time-resolved segmented cluster outlines are available. To exemplify, we characterized the circular motility of lumen-initiating MDCK cell aggregates, embedded in extracellular matrix. We show that the organization of the major surrounding matrix fibers was not significantly affected during this cohort rotation. Using our developed tool, we discovered two classes of circular motion, rotation and random walk, organized in three behavior patterns during lumen initiation. As rotational movements were more rapid than random walk and as both could continue during lumen initiation, we conclude that neither the class nor the rate of motion regulates lumen initiation. We thus reveal a high degree of plasticity during a developmentally critical cell polarization step, indicating that lumen initiation is a robust process. However, motility rates decreased with increasing cell number, previously shown to correlate with epithelial polarization, suggesting that migratory polarization is converted into epithelial polarization during aggregate development. PMID:20183868
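
    As a toy version of such a motion classifier, the angular progression of a cluster centroid about the aggregate centre can be computed from segmented outlines over time: a persistent drift of the unwrapped angle indicates rotation, while a near-zero mean indicates random walk. A 2D sketch (the published tool is 3D; the names and trajectory here are illustrative):

```python
import numpy as np

def mean_angular_speed(xy, center, dt):
    """Mean signed angular speed (radians per unit time) of a trajectory
    xy (shape (T, 2)) about a fixed center. np.unwrap removes the
    2*pi jumps so full revolutions accumulate instead of wrapping."""
    angles = np.unwrap(np.arctan2(xy[:, 1] - center[1], xy[:, 0] - center[0]))
    return np.diff(angles).mean() / dt

# synthetic trajectory tracing a quarter circle: clear rotation
theta = np.linspace(0.0, np.pi / 2, 10)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
omega = mean_angular_speed(circle, (0.0, 0.0), dt=1.0)
```

    A random-walk trajectory would give omega fluctuating around zero; thresholding |omega| against the spread of the per-step angular increments is one simple way to separate the two classes of motion described above.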

  18. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  19. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    SciTech Connect

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies affecting mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.

  20. 2012 International Conference on Medical Physics and Biomedical Engineering: Thermal Economic Analysis on LiBr Refrigeration-Heat Pump System Applied in CCHP System

    NASA Astrophysics Data System (ADS)

    Zhang, CuiZhen; Yang, Mo; Lu, Mei; Zhu, Jiaxian; Xu, Wendong

    The cooling water of a LiBr refrigeration system carries a large amount of low-temperature heat, which can be used to heat boiler feed water. This paper introduces a LiBr refrigeration-heat pump system that recovers heat from the LiBr refrigeration cooling water through a heat pump to heat the feed water of a boiler. A thermal economic analysis of the system has been performed based on the experimental data. Results show that the LiBr refrigeration-heat pump system brings a 26.6 percent decrease in primary energy rate consumption compared with the combined heat and power production system (CHP) with separate generation of cold.

  1. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  2. On differentiation in applied behavior analysis

    PubMed Central

    Fawcett, Stephen B.

    1985-01-01

    Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631

  3. System planning analysis applied to OTEC: initial cases by Florida Power Corporation. Task II report No. FC-5237-2

    SciTech Connect

    1980-03-01

    The objective of the task was to exercise the FPC system planning methodology on: (1) Base Case, 10 year generation expansion plan with coal plants providing base load expansion, and (2) same, but 400 MW of OTEC substituting for coal burning units with equal resultant system reliability. OTEC inputs were based on reasonable economic projections of direct capital cost and O and M costs for first-generation large commercial plants. OTEC inputs discussed in Section 2. The Base Case conditions for FPC system planning methodology involved base load coal fueled additions during the 1980's and early 1990's. The first trial runs of the PROMOD system planning model substituted OTEC for 400 MW purchases of coal generated power during 1988-1989 and then 400 MW coal capacity thereafter. Results showed higher system reliability than Base Case runs. Reruns with greater coal fueled capacity displacement showed that OTEC could substitute for 400 MW purchases in 1988-1989 and replace the 800 MW coal unit scheduled for 1990 to yield equivalent system reliability. However, a 1995 unit would need to be moved to 1994. Production costing computer model runs were used as input to Corporate Model to examine corporate financial impact. Present value of total revenue requirements was the primary indication of relative competitiveness between the Base Case and OTEC. Results show the present value of total revenue requirements unfavorable to OTEC as compared to coal units. The disparity was in excess of the allowable range for possible consideration.
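
    The decision criterion used in this report, present value of total revenue requirements, is a straightforward discounted sum. A minimal sketch (the function and the numbers are illustrative, not FPC's planning model):

```python
def present_value(revenue_requirements, discount_rate):
    """Discount a stream of annual revenue requirements to year zero.
    revenue_requirements[0] is assumed to fall at the end of year 1."""
    return sum(rr / (1.0 + discount_rate) ** year
               for year, rr in enumerate(revenue_requirements, start=1))

# two expansion plans compared on the same basis (made-up figures, $M/yr)
base_case = present_value([100.0, 100.0, 100.0], 0.10)
otec_case = present_value([90.0, 105.0, 110.0], 0.10)
```

    Whichever plan yields the lower present value of revenue requirements is the more competitive, which is the comparison the report describes as unfavorable to OTEC.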

  4. Concept analysis of culture applied to nursing.

    PubMed

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  5. Hydrogeochemical variables regionalization--applying cluster analysis for a seasonal evolution model from an estuarine system affected by AMD.

    PubMed

    Grande, J A; Carro, B; Borrego, J; de la Torre, M L; Valente, T; Santisteban, M

    2013-04-15

    This study describes the spatial evolution of the hydrogeochemical parameters which characterise an estuary strongly affected by Acid Mine Drainage (AMD). The studied estuarine system receives AMD from the Iberian Pyrite Belt (SW Spain) and, simultaneously, is affected by the presence of an industrial chemical complex. Water sampling was performed in the year of 2008, comprising four sampling campaigns, in order to represent seasonality. The results show how the estuary can be divided into three areas of different behaviour according to the concentrations of the hydrogeochemical variables that define each sampling station: on one hand, an area dominated by tidal influence; at the opposite end, a second area including the points located in the two rivers' headwaters that are not influenced by seawater; finally, the area that can be defined as the mixing zone. These areas shift along the hydrological year due to seasonal chemical variations. PMID:23453814

  6. Electrochemical analysis of acetaminophen using a boron-doped diamond thin film electrode applied to flow injection system.

    PubMed

    Wangfuengkanagul, Nattakarn; Chailapakul, Orawon

    2002-06-01

    The electrochemistry of acetaminophen in phosphate buffer solution (pH 8) was studied at a boron-doped diamond (BDD) thin film electrode using cyclic voltammetry, hydrodynamic voltammetry, and flow injection with amperometric detection. Cyclic voltammetry was used to study the reaction as a function of concentration of analyte. Comparison experiments were performed using a polished glassy carbon (GC) electrode. Acetaminophen undergoes a quasi-reversible reaction at both electrodes. The BDD and GC electrodes provided well-resolved cyclic voltammograms but the voltammetric signal-to-background ratios obtained from the diamond electrode were higher than those obtained from the GC electrode. The diamond electrode provided a linear dynamic range from 0.1 to 8 mM and a detection limit of 10 microM (S/B approximately 3) for voltammetric measurement. The flow injection analysis results at the diamond electrode indicated a linear dynamic range from 0.5 to 50 microM and a detection limit of 10 nM (S/N approximately 4). Acetaminophen in syrup samples has also been investigated. The results obtained in the recovery study (24.68+/-0.26 mg/ml) were comparable to those labeled (24 mg/ml). PMID:12039625
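
    The figures of merit quoted above (linear range, detection limit at a given S/N) come from a standard calibration workflow, sketched below with invented numbers rather than the paper's data:

```python
import numpy as np

# hypothetical flow-injection calibration: concentration (uM) vs. peak current (nA)
conc = np.array([0.5, 1.0, 5.0, 10.0, 25.0, 50.0])
current = 4.2 * conc + 0.3           # assume an ideally linear detector response

slope, intercept = np.polyfit(conc, current, 1)   # least-squares calibration line
sd_blank = 0.14                      # std. dev. of repeated blank injections (assumed)
lod = 3.0 * sd_blank / slope         # detection limit at the S/N ~ 3 convention
```

    Real calibrations would also report the residuals over the claimed linear range; deviation from linearity at the high end is what bounds the linear dynamic range.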

  7. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  8. Thermodynamic Laws Applied to Economic Systems

    ERIC Educational Resources Information Center

    González, José Villacís

    2009-01-01

    Economic activity in its different manifestations--production, exchange, consumption and, particularly, information on quantities and prices--generates and transfers energy. As a result, we can apply to it the basic laws of thermodynamics. These laws are applicable within a system, i.e., in a country or between systems and countries. To these…

  9. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  10. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156
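
    For readers unfamiliar with the SPC side of this debate, the core tool at issue is the Shewhart control chart. Below is a minimal individuals-chart (XmR) sketch of standard SPC practice; it is generic, not specific to Pfadt and Wheeler's data:

```python
import numpy as np

def xmr_limits(x):
    """3-sigma natural process limits for an individuals (XmR) chart.
    Sigma is estimated from the mean moving range using the d2 = 1.128
    bias-correction constant for subgroups of size 2."""
    x = np.asarray(x, dtype=float)
    moving_range = np.abs(np.diff(x))      # ranges of successive pairs
    sigma_hat = moving_range.mean() / 1.128
    center = x.mean()
    return center - 3.0 * sigma_hat, center + 3.0 * sigma_hat
```

    Observations outside these limits are flagged as special-cause variation. The disagreement summarized above is over whether this aggregate statistical signal is an adequate substitute for the behavior analyst's direct, continuous contact with the data.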

  11. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  12. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  13. What happened to analysis in applied behavior analysis?

    PubMed

    Pierce, W D; Epling, W F

    1980-01-01

    This paper addresses the current help-oriented focus of researchers in applied behavior analysis. Evidence from a recent volume of JABA suggests that analytic behavior is at low levels in applied analysis while cure-help behavior is at high strength. This low proportion of scientific behavior is apparently related to cure-help contingencies set by institutions and agencies of help and by the editorial policies of JABA itself. These contingencies have favored the flight to real people and a concern with client gains, evaluation and outcome strategies rather than the analysis of contingencies of reinforcement controlling human behavior. In this regard, the paper documents the current separation of applied behavior analysis from the experimental analysis of behavior. There is limited use of basic principles in applied analysis today and almost no reference to current research in the experimental analysis of behavior involving concurrent operants and adjunctive behavior. This divorce of applied behavior research from the experimental analysis of behavior will work against progress toward a powerful technology of behavior. In order to encourage a return to analysis in applied research, there is a need to reconsider the objectives of applied behavior analysis. The original purpose of behavioral technology is examined and a redefinition of the concept of "social importance" is presented which can direct applied researchers toward an analytic focus. At the same time, a change in the publication policies of applied journals such as JABA toward analytic research, and the design of new educational contingencies for students, will ensure the survival of analysis in applied behavior analysis. PMID:22478471

  14. Applied Pharmaceutical Analysis India 2014 conference report.

    PubMed

    Kole, Prashant; Barot, Deepak; Kotecha, Jignesh; Raina, Vijay; Rao, Mukkavilli; Yadav, Manish

    2014-01-01

    Applied Pharmaceutical Analysis (APA) India, 23-26 February 2014, Ahmedabad, India. The fifth Applied Pharmaceutical Analysis (APA) India meeting was held in February 2014 at the Hyatt Ahmedabad, India. With the theme of 'The Science of Measurement: Current Status and Future Trends in Bioanalysis, Biotransformation and Drug Discovery Platforms', the conference was attended by over 160 delegates. The agenda comprised advanced and relevant research topics in the key areas of bioanalysis and drug metabolism. APA India 2014 provided participants, innovators and policy-makers with a unique platform for networking and professional exchange. As part of the global research community, APA India continues to grow and to receive considerable attention from the drug discovery and development community of India.

  15. Applying neural networks in autonomous systems

    NASA Astrophysics Data System (ADS)

    Thornbrugh, Allison L.; Layne, J. D.; Wilson, James M., III

    1992-03-01

    Autonomous and teleautonomous operations have been defined in a variety of ways by different groups involved with remote robotic operations. For example, Conway describes architectures for producing intelligent actions in teleautonomous systems. Applying neural nets in such systems is similar to applying them in general. However, for autonomy, learning or learned behavior may become a significant system driver. Thus, artificial neural networks are being evaluated as components in fully autonomous and teleautonomous systems. Feedforward networks may be trained to perform adaptive signal processing, pattern recognition, data fusion, and function approximation -- as in control subsystems. Certain components of particular autonomous systems become more amenable to implementation using a neural net due to a match between the net's attributes and desired attributes of the system component. Criteria have been developed for distinguishing such applications and then implementing them. The success of hardware implementation is a crucial part of this application evaluation process. Three basic applications of neural nets -- autoassociation, classification, and function approximation -- are used to exemplify this process and to highlight procedures that are followed during the requirements, design, and implementation phases. This paper assumes some familiarity with basic neural network terminology and concentrates upon the use of different neural network types while citing references that cover the underlying mathematics and related research.

  16. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.
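
    The azimuthally symmetric part of the delay modeling described in this record maps a zenith delay to an arbitrary elevation with a continued-fraction mapping function of the form VMF1 uses. A minimal sketch follows; the a, b, c coefficients are made-up illustrative values, not real VMF1 output, which is site- and epoch-dependent:

```python
import math

def mapping_function(elev_rad, a, b, c):
    """Continued-fraction mapping function of the form used by VMF1.

    Normalized so that the mapping factor is exactly 1 at zenith
    (elevation = 90 degrees)."""
    s = math.sin(elev_rad)
    top = 1.0 + a / (1.0 + b / (1.0 + c))
    bot = s + a / (s + b / (s + c))
    return top / bot

# Illustrative hydrostatic-like coefficients (hypothetical values):
a, b, c = 1.2e-3, 2.9e-3, 62.6e-3

# Slant delay = zenith delay (here 2.3 m) times the mapping factor:
slant = 2.3 * mapping_function(math.radians(10.0), a, b, c)
```

    Raytracing, as applied in the record, replaces this symmetric approximation with an integration along the actual signal path through a 3D refractivity field.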

  17. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  18. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
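
    A Laplacian-pyramid decomposition of the kind used in this record can be sketched in a few lines. This is a generic implementation, not the authors' code; the smoothing width and number of levels are arbitrary choices:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def laplacian_pyramid(image, levels=3, sigma=1.0):
    """Decompose an image into band-pass detail levels plus a
    low-pass residual, treating each spatial scale equivalently."""
    pyramid, current = [], image.astype(float)
    for _ in range(levels):
        blurred = gaussian_filter(current, sigma)
        small = blurred[::2, ::2]                                  # downsample by 2
        up = zoom(small, 2, order=1)[:current.shape[0], :current.shape[1]]
        pyramid.append(current - up)                               # band-pass detail
        current = small
    pyramid.append(current)                                        # low-pass residual
    return pyramid

levels = laplacian_pyramid(np.random.rand(64, 64))
```

    Each entry of `levels` isolates structure at one spatial frequency band, which is the property that lets features such as filaments and clumps be extracted scale by scale.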

  19. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  20. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The first Applied Information Systems Research Program (AISRP) Workshop provided the impetus for several groups involved in information systems to review current activities. The objectives of the workshop included: (1) to provide an open forum for interaction and discussion of information systems; (2) to promote understanding by initiating a dialogue with the intended beneficiaries of the program, the scientific user community, and to discuss options for improving their support; (3) to create an advocacy by having science users and investigators of the program meet together and establish the basis for direction and growth; and (4) to support the future of the program by building collaborations and interactions that encourage an investigator working-group approach to conducting the program.

  1. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and also to fill a gap that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. PMID:23625877
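
    The mixture analysis this record describes can be illustrated for a two-component case: absorbances of the components add at each wavelength, so the Beer-Lambert-Bouguer law (A = epsilon * l * c) becomes a small linear system. The extinction coefficients and absorbances below are hypothetical illustrative values, not measured constants:

```python
import numpy as np

# Rows: wavelengths; columns: species. Each entry is a (hypothetical)
# molar extinction coefficient epsilon in M^-1 cm^-1.
E = np.array([[6600.0,   300.0],    # epsilons at wavelength 1
              [1200.0, 43824.0]])   # epsilons at wavelength 2
path_cm = 1.0                       # cuvette path length l
A = np.array([0.45, 0.80])          # measured absorbances at the two wavelengths

# Solve (E * l) @ c = A for the two molar concentrations.
conc = np.linalg.solve(E * path_cm, A)
```

    With more wavelengths than species, `np.linalg.lstsq` would give the least-squares concentrations instead of an exact solve.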

  3. A novel mating system analysis for modes of self-oriented mating applied to diploid and polyploid arctic Easter daisies (Townsendia hookeri).

    PubMed

    Thompson, S L; Ritland, K

    2006-08-01

    We have developed a new model for mating system analysis that attempts to distinguish among alternative modes of self-oriented mating within populations. This model jointly estimates the rates of outcrossing, selfing, automixis and apomixis, using the information in the family structure given by dominant genetic marker data. The method is presented, its statistical properties are evaluated, and it is applied to three arctic Easter daisy populations, one consisting of diploids, the other two of tetraploids. The tetraploids are predominantly male sterile and reported to be apomictic, while the diploids are male fertile. In each Easter daisy population, 10 maternal arrays of six progeny were assayed for amplified fragment length polymorphism markers. Estimates, confirmed with likelihood ratio tests of mating hypotheses, showed apomixis to be predominant in all populations (ca. 70%), but selfing or automixis was moderate (ca. 25%) in tetraploids. It was difficult to distinguish selfing from automixis, and simulations confirm that even with very large sample sizes the estimates have a very strong negative statistical correlation, i.e., they are not independent. No selfing or automixis was apparent in the diploid population; instead, moderate levels of outcrossing were detected (23%). Low but significant levels of outcrossing (2-4%) seemed to occur in the male-sterile tetraploid populations; this may be due to genotyping error of this level. Overall, this study shows apomixis can be partial, and provides evidence for higher levels of inbreeding in polyploids compared to diploids and for significant levels of apomixis in a diploid plant population. PMID:16721390

  4. The Applied Mathematics for Power Systems (AMPS)

    SciTech Connect

    Chertkov, Michael

    2012-07-24

    Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.

  5. Tribological systems as applied to aircraft engines

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1985-01-01

    Tribological systems as applied to aircraft are reviewed. The importance of understanding the fundamental concepts involved in such systems is discussed. Basic properties of materials which can be related to adhesion, friction and wear are presented and correlated with tribology. Surface processes including deposition and treatment are addressed in relation to their present and future application to aircraft components such as bearings, gears and seals. Lubrication of components with both liquids and solids is discussed. Advances in both new liquid molecular structures and additives for those structures are reviewed and related to the needs of advanced engines. Solids and polymer composites are suggested for increasing use and ceramic coatings containing fluoride compounds are offered for the extreme temperatures encountered in such components as advanced bearings and seals.

  6. Use of an automated chromium reduction system for hydrogen isotope ratio analysis of physiological fluids applied to doubly labeled water analysis.

    PubMed

    Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C

    2000-09-01

    The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per mil. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference: plasma by 10,000-Da exclusion filtration, saliva by sedimentation, and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction, with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water. PMID:11006607

  8. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  9. Artificial intelligence technologies applied to terrain analysis

    SciTech Connect

    Wright, J.C.; Powell, D.R.

    1990-01-01

    The US Army Training and Doctrine Command is currently developing, in cooperation with Los Alamos National Laboratory, a Corps-level combat simulation to support military analytical studies. This model emphasizes high-resolution modeling of the command and control processes, with particular attention to architectural considerations that enable extension of the model. A planned future extension is the inclusion of a computer-based planning capability for command echelons that can be dynamically invoked during the execution of the model. Command and control is the process through which the activities of military forces are directed, coordinated, and controlled to achieve the stated mission. To perform command and control, the commander must understand the mission, perform terrain analysis, and understand his own situation and capabilities as well as the enemy situation and the enemy's probable actions. To support computer-based planning, data structures must be available to support the computer's ability to "understand" the mission, terrain, own capabilities, and enemy situation. The availability of digitized terrain makes it feasible to apply artificial intelligence technologies to emulate the terrain analysis process, producing data structures for use in planning. The work done thus far to support the understanding of terrain is the topic of this paper. 13 refs., 5 figs., 6 tabs.

  10. Applying Causal Reasoning to Analyze Value Systems

    NASA Astrophysics Data System (ADS)

    Macedo, Patrícia; Camarinha-Matos, Luis M.

    Collaborative networked organizations are composed of heterogeneous and autonomous entities, so it is natural that each member has its own set of values and preferences; as a result, conflicts among partners might emerge due to misalignment of values. Tools to support the analysis of Value Systems in a collaborative context are therefore relevant to improving network management. Since a Value System reflects the set of values and preferences of an actor, which are cognitive issues, a cognitive approach based on qualitative causal maps is suggested. Qualitative inference methods are presented for assessing the potential for conflict among network members and the positive impact between members' Value Systems. The software tool developed to support the proposed framework and the qualitative inference methods is briefly presented.

  11. Systems Analysis.

    ERIC Educational Resources Information Center

    Loucks, D. P.; Bell, J. M.

    1978-01-01

    Presents a literature review of the analysis of the administrative systems of various environmental programs related to water quality and pollution policy. A list of 70 references published in 1976 and 1977 is also presented. (HM)

  12. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  13. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis.

  14. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  15. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  17. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  18. Digital photoelastic analysis applied to implant dentistry

    NASA Astrophysics Data System (ADS)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (the All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure that closely mimics all the anatomical features of the human mandible is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analyzed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. Analysis is carried out for implants in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.

  19. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  20. Liquid Chromatography Applied to Space System

    NASA Astrophysics Data System (ADS)

    Poinot, Pauline; Chazalnoel, Pascale; Geffroy, Claude; Sternberg, Robert; Carbonnier, Benjamin

    Searching for signs of past or present life in our Solar System is a real challenge that stirs up the curiosity of scientists. Until now, in situ instrumentation was designed to detect and determine concentrations of a wide range of organic biomarkers. The method which was, and still is, employed in missions dedicated to the quest for life (from Viking to ExoMars) is pyrolysis-GC-MS. Over the course of these missions, this approach has been significantly improved in terms of extraction efficiency and detection with the use of chemical derivatization agents (e.g. MTBSTFA, DMF-DMA, TMAH…), and in terms of analysis sensitivity and resolution with the development of in situ high-resolution mass spectrometers (e.g. TOF-MS). Thanks to such an approach, organic compounds such as amino acids, sugars, tholins or polycyclic aromatic hydrocarbons (PAHs) were expected to be found. However, while there is a consensus that the GC-MS instruments of the Viking, Huygens, MSL and MOMA space missions worked the way they had been designed to, pyrolysis is much more debated (Glavin et al. 2001; Navarro-González et al. 2006). Indeed, (1) it is thought to remove low levels of organics, (2) water and CO2 could interfere with the detection of likely organic pyrolysis products, and (3) only low- to mid-molecular-weight organic molecules can be detected by this technique. As a result, researchers are now focusing on other in situ techniques which are no longer based on the volatility of the organic matter, but on liquid-phase extraction and analysis. In this line, micro-fluidic systems involving sandwich and/or competitive immunoassays (e.g. LMC, SOLID; Parro et al. 2005; Sims et al. 2012), micro-chip capillary electrophoresis (e.g. MOA; Bada et al. 2008), or nanopore-based analysis (e.g. BOLD; Schulze-Makuch et al. 2012) have been conceived for in situ analysis. Thanks to such approaches, molecular biological polymers (polysaccharides, polypeptides, polynucleotides, phospholipids, glycolipids

  1. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  2. The Evolution of Fungicide Resistance Resulting from Combinations of Foliar-Acting Systemic Seed Treatments and Foliar-Applied Fungicides: A Modeling Analysis.

    PubMed

    Kitchen, James L; van den Bosch, Frank; Paveley, Neil D; Helps, Joseph; van den Berg, Femke

    2016-01-01

    For the treatment of foliar diseases of cereals, fungicides may be applied as foliar sprays or as systemic seed treatments which are translocated to leaves. Little research has been done to assess the resistance risks associated with foliar-acting systemic seed treatments when used alone or in combination with foliar sprays, even though both types of treatment may share the same mode of action. It is therefore unknown to what extent adding a systemic seed treatment to a foliar spray programme poses an additional resistance risk, and whether in the presence of a seed treatment additional resistance management strategies (such as limiting the total number of treatments) are necessary to limit the evolution of fungicide resistance. A mathematical model was developed to simulate an epidemic and the resistance evolution of Zymoseptoria tritici on winter wheat, and was used to compare different combinations of seed and foliar treatments by calculating the fungicide effective life, i.e. the number of years before effective disease control is lost to resistance. A range of parameterizations for the seed treatment fungicide and different fungicide uptake models were compared. Despite the different parameterizations, the model consistently predicted the same trends, in that (i) similar levels of efficacy delivered either by a foliar-acting seed treatment or by a foliar application resulted in broadly similar resistance selection, (ii) adding a foliar-acting seed treatment to a foliar spray programme increased resistance selection and usually decreased effective life, and (iii) splitting a given total dose (by adding a seed treatment to foliar treatments but decreasing the dose per treatment) gave effective lives that were the same as, or shorter than, those given by the spray programme alone. For our chosen plant-pathogen-fungicide system, the model results suggest that to effectively manage selection for fungicide resistance, foliar-acting systemic seed treatments should be included as
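    The notion of "effective life" used throughout this abstract can be illustrated with a deliberately minimal toy model (not the authors' model): each season the fungicide suppresses only the sensitive strain, the resistant strain's frequency rises under that selection, and effective life is the last year in which disease control stays above a threshold. Every rate and threshold below is an invented placeholder.

    ```python
    # Toy "effective life" calculation: years of effective control before
    # resistance takes over. All parameters are illustrative assumptions.

    def effective_life(efficacy=0.9, init_res=1e-6, threshold=0.5,
                       max_years=50):
        freq = init_res                        # resistant-strain frequency
        for year in range(1, max_years + 1):
            control = efficacy * (1.0 - freq)  # resistant fraction escapes
            if control < threshold:
                return year - 1                # last year of effective control
            # selection: the sensitive strain is cut back by the fungicide
            sens = (1.0 - freq) * (1.0 - efficacy)
            freq = freq / (freq + sens)
        return max_years

    print(effective_life())  # 6 with these toy parameters
    ```

    A dose-splitting strategy would be modelled by changing how efficacy enters the selection step; the point here is only the bookkeeping of "years until control drops below a threshold".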

  4. The Evolution of Fungicide Resistance Resulting from Combinations of Foliar-Acting Systemic Seed Treatments and Foliar-Applied Fungicides: A Modeling Analysis

    PubMed Central

    Kitchen, James L.; van den Bosch, Frank; Paveley, Neil D.; Helps, Joseph; van den Berg, Femke

    2016-01-01

    For the treatment of foliar diseases of cereals, fungicides may be applied as foliar sprays or systemic seed treatments which are translocated to leaves. Little research has been done to assess the resistance risks associated with foliar-acting systemic seed treatments when used alone or in combination with foliar sprays, even though both types of treatment may share the same mode of action. It is therefore unknown to what extent adding a systemic seed treatment to a foliar spray programme poses an additional resistance risk and whether in the presence of a seed treatment additional resistance management strategies (such as limiting the total number of treatments) are necessary to limit the evolution of fungicide-resistance. A mathematical model was developed to simulate an epidemic and the resistance evolution of Zymoseptoria tritici on winter wheat, which was used to compare different combinations of seed and foliar treatments by calculating the fungicide effective life, i.e. the number of years before effective disease control is lost to resistance. A range of parameterizations for the seed treatment fungicide and different fungicide uptake models were compared. Despite the different parameterizations, the model consistently predicted the same trends in that i) similar levels of efficacy delivered either by a foliar-acting seed treatment, or a foliar application, resulted in broadly similar resistance selection, ii) adding a foliar-acting seed treatment to a foliar spray programme increased resistance selection and usually decreased effective life, and iii) splitting a given total dose—by adding a seed treatment to foliar treatments, but decreasing dose per treatment—gave effective lives that were the same as, or shorter than those given by the spray programme alone. For our chosen plant-pathogen-fungicide system, the model results suggest that to effectively manage selection for fungicide-resistance, foliar acting systemic seed treatments should be included

  5. Applying time series analysis to performance logs

    NASA Astrophysics Data System (ADS)

    Kubacki, Marcin; Sosnowski, Janusz

    2015-09-01

    Contemporary computer systems provide mechanisms for monitoring various performance parameters (e.g. processor or memory usage, disc or network transfers), which are collected and stored in performance logs. An important issue is to derive characteristic features describing normal and abnormal behavior of the systems. For this purpose we use various schemes of analyzing time series. They have been adapted to the specificity of performance logs and verified using data collected from real systems. The presented approach is useful in evaluating system dependability.
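    As a minimal illustration of this kind of time-series scheme (the paper's own methods are not reproduced here), a rolling-window z-score can separate normal from abnormal samples in a performance log; the window size and threshold below are illustrative assumptions.

    ```python
    # Flag samples that deviate from a rolling mean by more than a few
    # rolling standard deviations. A hedged sketch, not the paper's method.
    from statistics import mean, stdev

    def flag_anomalies(samples, window=20, z_thresh=3.0):
        """Return indices of samples lying more than z_thresh rolling
        standard deviations away from the rolling mean."""
        anomalies = []
        for i in range(window, len(samples)):
            history = samples[i - window:i]
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(samples[i] - mu) / sigma > z_thresh:
                anomalies.append(i)
        return anomalies

    # CPU usage hovering around 40-42% with one injected spike at index 30
    log = [40.0 + (i % 3) for i in range(50)]
    log[30] = 95.0
    print(flag_anomalies(log))  # [30]
    ```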

  7. pH recycling aqueous two-phase systems applied in extraction of Maitake β-Glucan and mechanism analysis using low-field nuclear magnetic resonance.

    PubMed

    Hou, Huiyun; Cao, Xuejun

    2015-07-31

    In this paper, recycling aqueous two-phase systems (ATPS) based on the two pH-responsive copolymers PADB and PMDM were used in the purification of β-glucan from Grifola frondosa. The main parameters, such as polymer concentration, type and concentration of salt, extraction temperature and pH, were investigated to optimize the partition conditions. The results demonstrated that β-glucan was extracted into the PADB-rich phase, while impurities were extracted into the PMDM-rich phase. In this 2.5% PADB/2.5% PMDM ATPS, a partition coefficient of 7.489 and an extraction recovery of 96.92% for β-glucan were obtained in the presence of 30 mmol/L KBr, at pH 8.20 and 30°C. The phase-forming copolymers could be recycled by adjusting the pH, with recoveries of over 96.0%. Furthermore, the partition mechanism of maitake β-glucan in the PADB/PMDM aqueous two-phase systems was studied. Fourier transform infrared spectroscopy, the ForteBio Octet system and low-field nuclear magnetic resonance (LF-NMR) were introduced to elucidate the partition mechanism of β-glucan. Notably, LF-NMR was used for the first time in the analysis of partition mechanisms in aqueous two-phase systems. The change in transverse relaxation time (T2) in the ATPS reflects the interaction between the polymers and β-glucan.
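    As a numerical aside, under the standard definitions (which the abstract does not spell out): if the partition coefficient is K = C_top/C_bottom and the extraction recovery R is the fraction of total β-glucan mass in the top phase, then R = K·Vr/(K·Vr + 1), where Vr is the top/bottom volume ratio. The two reported figures can then be checked against each other; the volume ratio below is inferred, not reported.

    ```python
    # Relating the reported partition coefficient (7.489) and extraction
    # recovery (96.92%) under assumed standard definitions.

    def recovery(K, volume_ratio):
        """Top-phase mass recovery for partition coefficient K."""
        return K * volume_ratio / (K * volume_ratio + 1.0)

    def implied_volume_ratio(K, R):
        """Top/bottom volume ratio consistent with K and recovery R."""
        return R / ((1.0 - R) * K)

    vr = implied_volume_ratio(7.489, 0.9692)
    print(round(vr, 2))                   # ~4.2: a top-heavy phase system
    print(round(recovery(7.489, vr), 4))  # 0.9692, consistent by construction
    ```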

  8. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5, with additional stochastic inputs of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method, based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculations are made possible by a Python script linking MCNP and our Latin hypercube sampling code.
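    An illustrative Latin hypercube sampler in the spirit of the workflow above (the authors' own code is not reproduced here): each dimension is split into n equal-probability strata, one point is drawn inside each stratum, and the strata are shuffled independently per dimension.

    ```python
    # Minimal Latin hypercube sampling on the unit hypercube.
    import random

    def latin_hypercube(n_samples, n_dims, seed=42):
        rng = random.Random(seed)
        samples = [[0.0] * n_dims for _ in range(n_samples)]
        for d in range(n_dims):
            # one point inside each stratum [k/n, (k+1)/n), then shuffle
            strata = [(k + rng.random()) / n_samples for k in range(n_samples)]
            rng.shuffle(strata)
            for i in range(n_samples):
                samples[i][d] = strata[i]
        return samples

    pts = latin_hypercube(5, 2)
    # every column hits each of the five strata exactly once
    for d in range(2):
        print(sorted(int(p[d] * 5) for p in pts))  # [0, 1, 2, 3, 4]
    ```

    In the workflow described above, each unit-interval sample would then be mapped through the inverse cumulative distribution of the corresponding nuclide's composition before being written into the MCNP input deck.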

  9. Expert systems applied to spacecraft fire safety

    NASA Technical Reports Server (NTRS)

    Smith, Richard L.; Kashiwagi, Takashi

    1989-01-01

    Expert systems are problem-solving programs that combine a knowledge base and a reasoning mechanism to simulate a human expert. The development of an expert system to manage fire safety in spacecraft, in particular the NASA Space Station Freedom, is difficult but clearly advantageous in the long-term. Some needs in low-gravity flammability characteristics, ventilating-flow effects, fire detection, fire extinguishment, and decision models, all necessary to establish the knowledge base for an expert system, are discussed.

  10. Applying texture analysis to materials engineering problems

    NASA Astrophysics Data System (ADS)

    Knorr, D. B.; Weiland, H.; Szpunar, J. A.

    1994-09-01

    Textures in materials have been studied extensively since the 1930s following the pioneering work of Wassermann [1,2]. The modern era of texture measurement started in 1949 with the development of the x-ray pole figure technique for texture measurement by Schultz [3]. Finally, modern texture analysis was initiated with the publication by Bunge [4] and Roe [5] of a mathematical method of pole figure inversion, which is now used to calculate the orientation distribution function (ODF). This article cannot summarize such an extensive body of work, but it does endeavor to provide the background necessary to understand texture analysis; it also illustrates several applications of texture.

  11. Cost Utility Analysis Applied to Library Budgeting.

    ERIC Educational Resources Information Center

    Stitleman, Leonard

    Cost Utility Analysis (CUA) is, basically, an administrative tool to be used in situations where making a choice among meaningful programs is necessary. It does not replace the administrator, but can provide a significant source of data for the decision maker. CUA can be a guide to the selection of an optimal program in terms of available funds,…

  12. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
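    The mixture idea this abstract alludes to can be sketched as follows: by the Beer-Lambert-Bouguer law, A = ε·l·c, so measuring absorbance at two wavelengths yields two linear equations in the two unknown concentrations. The molar absorptivities below are invented illustration values, not data from the article.

    ```python
    # Two-component mixture analysis via the Beer-Lambert-Bouguer law.
    # eps[(i, j)] is the molar absorptivity of component j at wavelength i
    # (L/(mol*cm)); all numbers are made up for the example.

    def solve_two_component(A1, A2, eps, path_cm=1.0):
        """Solve {A1 = a*c1 + b*c2, A2 = c*c1 + d*c2} by Cramer's rule."""
        a, b = eps[(0, 0)] * path_cm, eps[(0, 1)] * path_cm
        c, d = eps[(1, 0)] * path_cm, eps[(1, 1)] * path_cm
        det = a * d - b * c
        return (A1 * d - A2 * b) / det, (a * A2 - c * A1) / det

    eps = {(0, 0): 15000.0, (0, 1): 3000.0,   # wavelength 1
           (1, 0): 2000.0,  (1, 1): 12000.0}  # wavelength 2
    # absorbances generated from c1 = 2e-5 M and c2 = 5e-5 M in a 1 cm cell
    A1 = 15000.0 * 2e-5 + 3000.0 * 5e-5
    A2 = 2000.0 * 2e-5 + 12000.0 * 5e-5
    c1, c2 = solve_two_component(A1, A2, eps)
    print(c1, c2)  # recovers approximately 2e-05 and 5e-05
    ```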

  13. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  14. EG&G Mound Applied Technologies payroll system

    SciTech Connect

    Not Available

    1992-02-07

    EG&G Mound Applied Technologies, Inc., manages and operates the Mound Facility, Miamisburg, Ohio, under a cost-plus-award-fee contract administered by the Department of Energy's Albuquerque Field Office. The contractor's Payroll Department is responsible for prompt payment in the proper amount to all persons entitled to be paid, in compliance with applicable laws, regulations, and legal decisions. The objective was to determine whether controls were in place to avoid erroneous payroll payments. EG&G Mound Applied Technologies, Inc., did not have all the internal controls required by General Accounting Office Title 6, "Pay, Leave, and Allowances." Specifically, it did not have computerized edits, separation of duties and responsibilities, and restricted access to payroll data files. This condition occurred because its managers were not aware of Title 6 requirements. As a result, the contractor could not assure the Department of Energy that payroll costs were processed accurately, and fraud, waste, or abuse of Department of Energy funds could go undetected. Our sample of 212 payroll transactions from a population of 66,000 in FY 1991 disclosed only two minor processing errors and no instances of fraud, waste, or abuse.

  15. Does terrestrial epidemiology apply to marine systems?

    USGS Publications Warehouse

    McCallum, Hamish I.; Kuris, Armand M.; Harvell, C. Drew; Lafferty, Kevin D.; Smith, Garriet W.; Porter, James

    2004-01-01

    Most of epidemiological theory has been developed for terrestrial systems, but the significance of disease in the ocean is now being recognized. However, the extent to which terrestrial epidemiology can be directly transferred to marine systems is uncertain. Many broad types of disease-causing organism occur both on land and in the sea, and it is clear that some emergent disease problems in marine environments are caused by pathogens moving from terrestrial to marine systems. However, marine systems are qualitatively different from terrestrial environments, and these differences affect the application of modelling and management approaches that have been developed for terrestrial systems. Phyla and body plans are more diverse in marine environments and marine organisms have different life histories and probably different disease transmission modes than many of their terrestrial counterparts. Marine populations are typically more open than terrestrial ones, with the potential for long-distance dispersal of larvae. Potentially, this might enable unusually rapid propagation of epidemics in marine systems, and there are several examples of this. Taken together, these differences will require the development of new approaches to modelling and control of infectious disease in the ocean.

  16. Thermal analysis applied to irradiated propolis

    NASA Astrophysics Data System (ADS)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, which is of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, through differential scanning calorimetry, the melting points of the irradiated and unirradiated samples were found to coincide. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated at up to 10 kGy.

  17. Multiagent cooperative systems applied to precision applications

    NASA Astrophysics Data System (ADS)

    McKay, Mark D.; Anderson, Matthew O.; Gunderson, Robert W.; Flann, Nicholas S.; Abbott, Ben A.

    1998-08-01

    Regulatory agencies are imposing limits and constraints to protect the operator and/or the environment. While generally necessary, these controls also tend to increase cost and decrease efficiency and productivity. Intelligent computer systems can be made to perform these hazardous tasks with greater efficiency and precision without danger to the operators. The Idaho National Engineering and Environmental Laboratory and the Center for Self-Organizing and Intelligent Systems at Utah State University have developed a series of autonomous all-terrain multi-agent systems capable of performing automated tasks within hazardous environments. This paper discusses the development and application of cooperative small-scale and large-scale robots for use in various activities associated with radiologically contaminated areas, prescription farming, and unexploded ordnance.

  18. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  19. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  20. Multivariate analysis applied to tomato hybrid production.

    PubMed

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic-house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each one varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crosses among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
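    The principal-components step can be sketched briefly; the 4x3 data matrix below stands in for "plots x characters" and is invented, not the study's data.

    ```python
    # Principal components from the covariance matrix of centered data.
    import numpy as np

    def principal_components(X):
        """Return eigenvalues (descending) and matching eigenvectors of
        the covariance matrix of the column-centered data matrix X."""
        Xc = X - X.mean(axis=0)
        vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
        order = np.argsort(vals)[::-1]   # eigh returns ascending order
        return vals[order], vecs[:, order]

    # hypothetical 4 plots x 3 characters (e.g. fruit mass, diameter, pH)
    X = np.array([[4.0, 2.0, 6.1],
                  [4.2, 2.1, 5.9],
                  [3.9, 2.0, 5.8],
                  [4.3, 2.1, 6.2]])
    vals, vecs = principal_components(X)
    print(vals)  # variance carried by each component, largest first
    ```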

  1. Hybrid functionals applied to extended systems

    NASA Astrophysics Data System (ADS)

    Marsman, M.; Paier, J.; Stroppa, A.; Kresse, G.

    2008-02-01

    We present an overview of the description of structural, thermochemical, and electronic properties of extended systems using several well known hybrid Hartree-Fock/density-functional-theory functionals (PBE0, HSE03, and B3LYP). In addition we address a few aspects of the evaluation of the Hartree-Fock exchange interactions in reciprocal space, relevant to all methods that employ a plane wave basis set and periodic boundary conditions.

  2. Applying Genomic Analysis to Newborn Screening

    PubMed Central

    Solomon, B.D.; Pineda-Alvarez, D.E.; Bear, K.A.; Mullikin, J.C.; Evans, J.P.

    2012-01-01

    Large-scale genomic analysis such as whole-exome and whole-genome sequencing is becoming increasingly prevalent in the research arena. Clinically, many potential uses of this technology have been proposed. One such application is the extension or augmentation of newborn screening. In order to explore this application, we examined data from 3 children with normal newborn screens who underwent whole-exome sequencing as part of research participation. We analyzed sequence information for 151 selected genes associated with conditions ascertained by newborn screening. We compared findings with publicly available databases and results from over 500 individuals who underwent whole-exome sequencing at the same facility. Novel variants were confirmed through bidirectional dideoxynucleotide sequencing. High-density microarrays (Illumina Omni1-Quad) were also performed to detect potential copy number variations affecting these genes. We detected an average of 87 genetic variants per individual. After excluding artifacts, 96% of the variants were found to be reported in public databases and have no evidence of pathogenicity. No variants were identified that would predict disease in the tested individuals, which is in accordance with their normal newborn screens. However, we identified 6 previously reported variants and 2 novel variants that, according to published literature, could result in affected offspring if the reproductive partner were also a mutation carrier; other specific molecular findings highlight additional means by which genomic testing could augment newborn screening. PMID:23112750
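    A toy version of the filtering logic described above; the gene names, variant notation, and database contents here are all invented placeholders, not the study's data.

    ```python
    # Screen observed variants against a panel of newborn-screening genes
    # and a set of variants already classified as benign. All entries
    # below are hypothetical.

    NEWBORN_PANEL = {"PAH", "GALT", "MCAD"}    # assumed panel genes
    KNOWN_BENIGN = {("PAH", "c.100A>G")}       # assumed database content

    def screen(variants):
        """Keep variants in panel genes that are not known to be benign."""
        return [(gene, var) for gene, var in variants
                if gene in NEWBORN_PANEL and (gene, var) not in KNOWN_BENIGN]

    found = [("PAH", "c.100A>G"),    # in the benign database -> filtered out
             ("GALT", "c.200G>T"),   # panel gene, unclassified -> reported
             ("TTN", "c.300C>A")]    # outside the panel -> ignored
    print(screen(found))  # [('GALT', 'c.200G>T')]
    ```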

  3. Digital Systems Analysis

    ERIC Educational Resources Information Center

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  4. Compatibility of person-centered planning and applied behavior analysis

    PubMed Central

    Holburn, Steve

    2001-01-01

    In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis. PMID:22478371

  5. Design of the optocoupler applied to medical lighting systems.

    PubMed

    Yang, Xibin; Lit, Rui; Zhu, Jianfeng; Xiong, Daxi

    2012-12-01

    A new type of optocoupler applied to medical lighting systems is proposed, and its principle, étendue considerations, and design process are introduced. With the help of TracePro, modeling and simulation of the optocoupler are conducted and the parameters are optimized, including an analysis of the factors affecting energy coupling efficiency. With a view towards the development of Ultra-High-Brightness Light-Emitting Diodes (UHB-LEDs), which play an important role as new sources of lighting in various biomedical devices, including those used in diagnosis and treatment, a series of simulations is executed and a variety of solutions are achieved. According to the simulation results, the design target for coupling efficiency is achieved and the optical uniformity is also significantly improved. Verification experiments, designed according to the theoretical analysis, confirm the simulation results. The optocoupler, with its simple structure, compact size, and low cost, is suitable for applications in the low-cost medical domain.
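    The étendue budget mentioned in the abstract can be illustrated with a short calculation. This is an illustrative sketch only, not the paper's design: the LED and light-guide geometries below are hypothetical, and the formula is the standard flat-emitter cone approximation G = pi * n^2 * A * sin^2(theta_half).

```python
import math

def etendue(area_mm2, half_angle_deg, n=1.0):
    """Etendue of a flat emitter radiating into a cone:
    G = pi * n^2 * A * sin^2(theta_half)."""
    return math.pi * n**2 * area_mm2 * math.sin(math.radians(half_angle_deg))**2

def max_coupling_efficiency(g_source, g_target):
    """Etendue conservation caps the fraction of source flux that any
    lossless optic can couple into the target: G_target / G_source, at most 1."""
    return min(1.0, g_target / g_source)

# Hypothetical geometry: a 1 mm^2 LED die emitting into a hemisphere (90 deg
# half-angle), coupled into a 2 mm^2 light guide accepting a 30 deg half-angle.
g_led = etendue(1.0, 90.0)
g_guide = etendue(2.0, 30.0)
print(f"{max_coupling_efficiency(g_led, g_guide):.2f}")  # prints 0.50
```

    Enlarging the guide's entrance area or acceptance angle raises the bound; no optic, however cleverly designed, can exceed it.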

  6. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
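    The rules-into-network mapping the abstract describes can be sketched in miniature. The consensus motif, firing threshold, and training examples below are invented for illustration, and a single perceptron unit stands in for KBANN's multi-layer rule networks:

```python
import numpy as np

BASES = "ACGT"
CONSENSUS = "ACGT"  # hypothetical 4-base consensus motif

def one_hot(seq):
    """Encode a DNA string as a flat one-hot vector (4 units per position)."""
    v = np.zeros(len(seq) * 4)
    for i, base in enumerate(seq):
        v[i * 4 + BASES.index(base)] = 1.0
    return v

# KBANN-style initialization: translate the rule "at least 3 of 4 positions
# match the consensus" into weights (+1 per consensus base) and a bias that
# places the firing threshold between 2 and 3 matches.
w = one_hot(CONSENSUS)
b = -2.5

def predict(seq):
    return 1 if one_hot(seq) @ w + b > 0 else 0

# Refinement: perceptron updates on labeled examples adjust the rule-derived
# weights; "ACTT" is a hypothetical exception the initial rule gets wrong.
examples = [("ACGT", 1), ("AAGT", 1), ("TTTT", 0), ("ACTT", 0)]
for _ in range(10):
    for seq, label in examples:
        error = label - predict(seq)
        w += 0.5 * error * one_hot(seq)
        b += 0.5 * error

print([predict(seq) for seq, _ in examples])  # prints [1, 1, 0, 0]
```

    Starting from rule-derived rather than random weights is the key idea: the network begins close to a sensible hypothesis, so a few corrective updates suffice to absorb the exceptions.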

  7. Hyperspectral imaging applied to complex particulate solids systems

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Serranti, Silvia

    2008-04-01

    HyperSpectral Imaging (HSI) is based on the utilization of an integrated hardware and software (HW&SW) platform embedding conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Although HSI was originally developed for remote sensing, it has recently emerged as a powerful process-analytical tool for non-destructive analysis in many research and industrial sectors. The possibility of applying on-line HSI-based techniques to identify and quantify specific characteristics of particulate solid systems is presented and critically evaluated. The originally developed HSI-based logics can be profitably applied to develop fast, reliable, and low-cost strategies for: i) quality control of particulate products that must comply with specific chemical, physical, and biological constraints, ii) performance evaluation of manufacturing strategies related to processing chains and/or real-time tuning of operative variables, and iii) classification-sorting actions addressed to recognize and separate different particulate solid products. Case studies related to recent advances in the application of HSI in different industrial sectors, such as agriculture, food, pharmaceuticals, and solid waste handling and recycling, and addressed to specific goals such as contaminant detection, defect identification, constituent analysis, and quality evaluation, are described according to the authors' originally developed applications.

  8. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.

  9. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  10. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  11. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  12. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  13. Applying expertise to data in the Geologist's Assistant expert system

    SciTech Connect

    Berkbigler, K.P.; Papcun, G.J.; Marusak, N.L.; Hutson, J.E.

    1988-01-01

    The Geologist's Assistant combines expert system technology with numerical pattern-matching and online communication to a large database. This paper discusses the types of rules used for the expert system, the pattern-matching technique applied, and the implementation of the system using a commercial expert system development environment. 13 refs., 8 figs.

  14. A Vision for Systems Engineering Applied to Wind Energy (Presentation)

    SciTech Connect

    Felker, F.; Dykes, K.

    2015-01-01

    This presentation was given at the Third Wind Energy Systems Engineering Workshop on January 14, 2015. Topics covered include the importance of systems engineering, a vision for systems engineering as applied to wind energy, and application of systems engineering approaches to wind energy research and development.

  15. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  16. Computer-Aided Decision Support for Melanoma Detection Applied on Melanocytic and Nonmelanocytic Skin Lesions: A Comparison of Two Systems Based on Automatic Analysis of Dermoscopic Images

    PubMed Central

    Møllersen, Kajsa; Kirchesch, Herbert; Zortea, Maciel; Schopf, Thomas R.; Hindberg, Kristian; Godtliebsen, Fred

    2015-01-01

    Commercially available clinical decision support systems (CDSSs) for skin cancer have been designed for the detection of melanoma only. Correct use of the systems requires expert knowledge, hampering their utility for nonexperts. Furthermore, there are no systems to detect other common skin cancer types, that is, nonmelanoma skin cancer (NMSC). As early diagnosis of skin cancer is essential, there is a need for a CDSS that is applicable to all types of skin lesions and is suitable for nonexperts. Nevus Doctor (ND) is a CDSS being developed by the authors. We here investigate ND's ability to detect both melanoma and NMSC and the opportunities for improvement. An independent test set of dermoscopic images of 870 skin lesions, including 44 melanomas and 101 NMSCs, were analysed by ND. Its sensitivity to melanoma and NMSC was compared to that of Mole Expert (ME), a commercially available CDSS, using the same set of lesions. ND and ME had similar sensitivity to melanoma. For ND at 95% melanoma sensitivity, the NMSC sensitivity was 100%, and the specificity was 12%. The melanomas misclassified by ND at 95% sensitivity were correctly classified by ME, and vice versa. ND is able to detect NMSC without sacrificing melanoma sensitivity. PMID:26693486

  17. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal to Noise Ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al. 1999, Forsberg et al. 2000, Kwon and Jekeli 2000, Hwang et al. 2006) are applied to smooth or remove the high-frequency "noise", even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of factor analysis is employed for the first time to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicles and aircraft will be processed as examples.
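    The core idea of factor analysis here, recovering one common latent signal from several noisy channels without a predefined basis, can be sketched with a toy example. Everything below is hypothetical (channel gains, noise level, one-factor model); it uses the simple principal-factor method, not the GRAV-D processing chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: four noisy channels all sense one common latent signal
# (the "gravity" factor) through different gains, plus independent noise.
n = 5000
factor = rng.standard_normal(n)                 # latent common factor
gains = np.array([1.0, 0.8, 0.6, 0.4])          # per-channel loadings (assumed)
noise = 0.3 * rng.standard_normal((n, 4))       # unique (per-channel) noise
X = factor[:, None] * gains + noise

# One-factor extraction via the leading eigenpair of the sample covariance
# (principal-factor method; a full EM factor-analysis fit would refine this).
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
lam, v = eigvals[-1], eigvecs[:, -1]
loadings = np.sqrt(lam) * v * np.sign(v[0])     # fix the sign convention

# Uniquenesses: per-channel variance the common factor does not explain.
uniquenesses = np.diag(cov) - loadings**2
print(np.round(loadings, 2), np.round(uniquenesses, 2))
```

    The estimated loadings recover the assumed gains up to sampling error, while the uniquenesses absorb the channel-specific noise, which is the separation a low-pass filter cannot provide.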

  18. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage, as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the subsurface causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, and geo-chemical and geo-mechanical processes, in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and the urgency of the competing processes, this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  19. Applying a toolkit for dissemination and analysis of near real-time data through the World Wide Web: integration of the Antelope Real Time System, ROADNet, and PHP

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Lindquist, K. G.; Hansen, T. S.; Vernon, F. L.; Eakins, J.; Foley, S.; Orcutt, J.

    2005-12-01

    The ROADNet project has enabled the acquisition and storage of diverse data streams through seamless integration of the Antelope Real Time System (ARTS) with (for example) ecological, seismological and geodetic instrumentation. The robust system architecture allows researchers to simply network data loggers with relational databases; however, the ability to disseminate these data to policy makers, scientists and the general public has (until recently) been provided on an 'as needed' basis. The recent development of a Datascope interface to the popular open source scripting language PHP has provided an avenue for presenting near real time data (such as integers, images and movies) from within the ARTS framework easily on the World Wide Web. The interface also indirectly provided the means to transform data types into various formats using the extensive function libraries that accompany a PHP installation (such as image creation and manipulation, data encryption for sensitive information, and XML creation for structured document interchange through the World Wide Web). Using a combination of Datascope and PHP library functions, an extensible tool-kit is being developed to allow data managers to easily present their products on the World Wide Web. The tool-kit has been modeled after the pre-existing ARTS architecture to simplify the installation, development and ease-of-use for both the seasoned researcher and the casual user. The methodology and results of building the applications that comprise the tool-kit are the focus of this presentation, including procedural vs. object oriented design, incorporation of the tool-kit into the existing contributed software libraries, and case-studies of researchers who are employing the tools to present their data. http://anf.ucsd.edu

  20. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  1. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  2. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  3. System theory as applied differential geometry. [linear system

    NASA Technical Reports Server (NTRS)

    Hermann, R.

    1979-01-01

    The invariants of input-output systems under the action of the feedback group were examined. The approach used the theory of Lie groups and concepts of modern differential geometry, and illustrated how the latter provides a basis for the discussion of the analytic structure of systems. Finite-dimensional linear systems in a single independent variable are considered. Lessons for more general situations (e.g., distributed-parameter and multidimensional systems), which are increasingly encountered as technology advances, are presented.

  4. Integrated systems analysis applied to environmental remediation

    SciTech Connect

    Thayer, G.R.; Hardie, R.W.; Catherwood, R.; Springer, E.P.

    1997-12-31

    At the request of the Congressional Task Force on the Salton Sea and the Salton Sea Authority, the authors examined various technologies that have been proposed to reduce the decline of the Salton Sea. The primary focus of the technologies was to reduce the salinity of the Salton Sea, with secondary objectives of maintaining the present shoreline and minimizing cost. The authors found that two technologies, pump-out and diking, could provide the required salinity reduction. The pump-out option would result in a smaller Sea, while the diking option would create a high-salinity impoundment area in the Sea. The costs for the two options were similar. Desalination and pump-in/pump-out options were rejected because of high costs and because they did not provide a sufficient reduction in the salinity of the Salton Sea. The end product of the project was testimony before the Subcommittee on Water and Power, U.S. House of Representatives Committee on Resources, given October 3, 1997.

  5. Complex, Dynamic Systems: A New Transdisciplinary Theme for Applied Linguistics?

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2012-01-01

    In this plenary address, I suggest that Complexity Theory has the potential to contribute a transdisciplinary theme to applied linguistics. Transdisciplinary themes supersede disciplines and spur new kinds of creative activity (Halliday 2001 [1990]). Investigating complex systems requires researchers to pay attention to system dynamics. Since…

  6. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  7. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The group's work is in Computational Fluid Dynamics tool development, driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  8. Conformity with the HIRF Environment Applied to Avionic System

    NASA Astrophysics Data System (ADS)

    Tristant, F.; Rotteleur, J. P.; Moreau, J. P.

    2012-05-01

    This paper presents the qualification and certification methodology applied to the avionic system for the HIRF and lightning environment. Several versions of this system are installed in our legacy Falcon aircraft with different variations. The paper presents the compliance process, taking into account the criticality and the complexity of the system, its installation, the level of exposure to the EM environment, and some solutions used by Dassault Aviation to demonstrate compliance.

  9. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  10. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  11. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  12. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  13. Sensitivity analysis for texture models applied to rust steel classification

    NASA Astrophysics Data System (ADS)

    Trujillo, Maite; Sadki, Mustapha

    2004-05-01

    The exposure of metallic structures to rust degradation during their operational life is a known problem affecting storage tanks, steel bridges, ships, etc. In order to prevent this degradation and the potential related catastrophes, the surfaces have to be assessed and the appropriate surface treatment and coating applied according to the corrosion time of the steel. We previously investigated the potential of image processing techniques to tackle this problem, analyzing and evaluating several mathematical methods on a database of 500 images. In this paper, we extend our previous research and provide a further analysis of mathematical texture methods for automatically estimating the rusting time of steel. Statistical descriptors are provided to evaluate the sensitivity of the results, as well as the advantages and limitations of the different methods. Finally, a selector of classifier algorithms is introduced, and the ratio between the sensitivity of the results and the time response (execution time) is analyzed to balance good classification results (high sensitivity) against an acceptable time response for the automation of the system.
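    The classifier-selection idea, trading sensitivity against execution time, can be sketched directly. The classifier names, sensitivities, and timings below are hypothetical placeholders, not the paper's benchmark results:

```python
# Hypothetical benchmark results: (name, sensitivity in [0, 1], seconds/image).
candidates = [
    ("co-occurrence matrix",  0.92, 1.80),
    ("Gabor filter bank",     0.88, 0.40),
    ("local binary patterns", 0.85, 0.05),
]

def select(candidates, min_sensitivity=0.85):
    """Among classifiers meeting a sensitivity floor, pick the one with the
    best sensitivity-to-time ratio (fast and accurate wins)."""
    eligible = [c for c in candidates if c[1] >= min_sensitivity]
    return max(eligible, key=lambda c: c[1] / c[2])

print(select(candidates)[0])  # prints local binary patterns
```

    Raising the sensitivity floor shifts the choice toward slower but more sensitive methods, which is exactly the sensitivity/time compromise the selector is meant to expose.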

  14. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  15. Applying systems engineering methodologies to the micro- and nanoscale realm

    NASA Astrophysics Data System (ADS)

    Garrison Darrin, M. Ann

    2012-06-01

    Microscale and nanoscale technology developments have the potential to revolutionize smart and small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro- and nanotechnology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to the realization of products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging in a third generation, having transitioned from creating active structures to creating systems. The emphasis on applying a systems approach follows from the lack of maturity of current nanoscale systems; the discussion therefore includes details relating to enabling roles such as product systems engineering and technology development. Classical roles such as acquisition systems engineering are not covered. The results are also targeted toward small-scale technology developers, who need to take systems engineering processes such as requirements definition, verification and validation, interface management, and risk management into account in the concept phase of technology development to maximize the likelihood of successful, cost-effective micro- and nanotechnology that increases the capability of emerging deployed systems and supports long-term growth and profits.

  16. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    Systems engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams harmonize their activities simultaneously to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives: the approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, both the networks and their subject matter experts are geographically distributed, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that captures a highly interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  17. Characterization of anomalies by applying methods of fractal analysis

    SciTech Connect

    Sakuma, M.; Kozma, R.; Kitamura, M.

    1996-01-01

    Fractal analysis is applied in a variety of research fields to characterize nonstationary data. Here, fractal analysis is used as a tool of characterization in time series. The fractal dimension is calculated by Higuchi's method, and the effect of small data size on accuracy is studied in detail. Three types of fractal-based anomaly indicators are adopted: (a) the fractal dimension, (b) the error of the fractal dimension, and (c) the chi-square value of the linear fitting of the fractal curve in the wave number domain. Fractal features of time series can be characterized by introducing these three measures. The proposed method is applied to various simulated fractal time series with ramp, random, and periodic noise anomalies and also to neutron detector signals acquired in a nuclear reactor. Fractal characterization can successfully supplement conventional signal analysis methods especially if nonstationary and non-Gaussian features of the signal become important.
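
    Higuchi's method, named in the abstract, estimates a time series' fractal dimension from the slope of a log-log fit of curve length against scale. A minimal NumPy sketch follows; the parameter choices (kmax, series lengths) are illustrative, not the paper's.

```python
# Higuchi fractal dimension: build subsampled curves at strides k = 1..kmax,
# compute their normalized lengths L(k), and fit log L(k) against log(1/k).
# A smooth curve gives FD near 1; white noise gives FD near 2.
import numpy as np

def higuchi_fd(x, kmax=8):
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)          # subsampled series x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            # curve length, normalized for the number of points actually used
            lm = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k)
            lengths.append(lm / k)
        log_k.append(np.log(1.0 / k))
        log_l.append(np.log(np.mean(lengths)))
    # fractal dimension = slope of log L(k) versus log(1/k)
    return np.polyfit(log_k, log_l, 1)[0]

rng = np.random.default_rng(0)
fd_line = higuchi_fd(np.linspace(0.0, 1.0, 1000))    # smooth ramp: FD ~ 1
fd_noise = higuchi_fd(rng.standard_normal(1000))     # white noise: FD ~ 2
```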

  18. Applying Sustainable Systems Development Approach to Educational Technology Systems

    ERIC Educational Resources Information Center

    Huang, Albert

    2012-01-01

    Information technology (IT) is an essential part of modern education. The roles and contributions of technology to education have been thoroughly documented in academic and professional literature. Despite the benefits, the use of educational technology systems (ETS) also creates a significant impact on the environment, primarily due to energy…

  19. Recent reinforcement-schedule research and applied behavior analysis

    PubMed Central

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888

  20. Applied Information Systems Research Program (AISRP) Workshop 3 meeting proceedings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The third Workshop of the Applied Information Systems Research Program (AISRP) met at the University of Colorado's Laboratory for Atmospheric and Space Physics in August of 1993. The presentations were organized into four sessions: Artificial Intelligence Techniques; Scientific Visualization; Data Management and Archiving; and Research and Technology.

  1. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  2. Applying Technology Ranking and Systems Engineering in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Luna, Bernadette (Technical Monitor)

    2000-01-01

    According to the Advanced Life Support (ALS) Program Plan, the Systems Modeling and Analysis Project (SMAP) has two important tasks: 1) prioritizing investments in ALS Research and Technology Development (R&TD), and 2) guiding the evolution of ALS systems. Investments could be prioritized simply by independently ranking different technologies, but we should also consider a technology's impact on system design. Guiding future ALS systems will require SMAP to consider many aspects of systems engineering. R&TD investments can be prioritized using familiar methods for ranking technology. The first step is gathering data on technology performance, safety, readiness level, and cost. Then the technologies are ranked using metrics or by decision analysis using net present economic value. The R&TD portfolio can be optimized to provide the maximum expected payoff in the face of uncertain future events. But more is needed. The optimum ALS system cannot be designed simply by selecting the best technology for each predefined subsystem. Incorporating a new technology, such as food plants, can change the specifications of other subsystems, such as air regeneration. Systems must be designed top-down starting from system objectives, not bottom-up from selected technologies. The familiar top-down systems engineering process includes defining mission objectives, mission design, system specification, technology analysis, preliminary design, and detail design. Technology selection is only one part of systems analysis and engineering, and it is strongly related to the subsystem definitions. ALS systems should be designed using top-down systems engineering. R&TD technology selection should consider how the technology affects ALS system design. Technology ranking is useful, but it is only a small part of systems engineering.
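
    The metric-based ranking step described above can be sketched as a weighted sum over normalized scores. Everything below is invented for illustration; SMAP's actual metrics, weights, and candidate technologies are not reproduced here.

```python
# Hypothetical weighted-metric technology ranking: score each candidate on
# performance, safety, readiness, and cost (all normalized to [0, 1], higher
# is better), then sort by the weighted total.

def rank_technologies(techs, weights):
    """techs: {name: {metric: score in [0, 1]}}; weights: {metric: weight}."""
    def total(scores):
        return sum(weights[m] * scores[m] for m in weights)
    return sorted(techs, key=lambda name: total(techs[name]), reverse=True)

weights = {"performance": 0.4, "safety": 0.3, "readiness": 0.2, "cost": 0.1}
techs = {
    # a mature physico-chemical subsystem vs. a promising but immature one
    "physico-chemical": {"performance": 0.7, "safety": 0.9, "readiness": 0.9, "cost": 0.8},
    "food plants":      {"performance": 0.9, "safety": 0.8, "readiness": 0.4, "cost": 0.3},
}
ranking = rank_technologies(techs, weights)
```

    As the abstract warns, such a standalone ranking ignores cross-subsystem effects (e.g., food plants changing air-regeneration requirements), which is why it can only feed into, not replace, top-down systems engineering.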

  3. Design considerations of HUD projection systems applied to automobile industry

    NASA Astrophysics Data System (ADS)

    Betancur, J. Alejandro; Gómez, Gilberto Osorio

    2012-06-01

    HUD systems are currently making strong inroads into the automobile industry; consequently, new ways have been proposed to understand and apply this technology in an economically viable manner. To contribute to this situation, this paper presents a case study which sets out the key parameters that should be considered in the design of an HUD, how these parameters can be configured, and how they are related. Finally, an optical design alternative is presented that meets the main requirements of an HUD system applied to mid-range automobiles. There are several ways to approach the development and construction of HUD systems; the method proposed here is intended to clarify the factors involved in this technology and to support its popularization in the automobile industry.

  4. Availability modeling methodology applied to solar power systems

    NASA Astrophysics Data System (ADS)

    Unione, A.; Burns, E.; Husseiny, A.

    1981-01-01

    Availability is discussed as a measure for estimating the expected performance for solar- and wind-powered generation systems and for identifying causes of performance loss. Applicable analysis techniques, ranging from simple system models to probabilistic fault tree analysis, are reviewed. A methodology incorporating typical availability models is developed for estimating reliable plant capacity. Examples illustrating the impact of design and configurational differences on the expected capacity of a solar-thermal power plant with a fossil-fired backup unit are given.
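
    The simplest of the availability models mentioned above can be written down directly: steady-state availability from MTBF/MTTR, combined in series for components that must all work, and in parallel for the solar unit backed up by the fossil-fired unit. The numbers below are illustrative, not the paper's.

```python
# Availability arithmetic for a solar-thermal plant with a fossil-fired backup:
# the plant is up if either the solar train (collector AND turbine in series)
# or the backup unit is up.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series(*a):            # system fails if any component fails
    p = 1.0
    for ai in a:
        p *= ai
    return p

def parallel(*a):          # system fails only if every redundant unit fails
    q = 1.0
    for ai in a:
        q *= (1.0 - ai)
    return 1.0 - q

a_solar = series(availability(800, 40), availability(2000, 20))  # collector + turbine
a_backup = availability(1500, 30)                                # fossil-fired unit
a_plant = parallel(a_solar, a_backup)
```

    The redundancy shows the configurational effect the abstract refers to: the combined plant is more available than either generation path alone.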

  5. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    PubMed

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines. PMID:21175361

  6. Applying Trusted Network Technology To Process Control Systems

    NASA Astrophysics Data System (ADS)

    Okhravi, Hamed; Nicol, David

    Interconnections between process control networks and enterprise networks expose instrumentation and control systems and the critical infrastructure components they operate to a variety of cyber attacks. Several architectural standards and security best practices have been proposed for industrial control systems. However, they are based on older architectures and do not leverage the latest hardware and software technologies. This paper describes new technologies that can be applied to the design of next generation security architectures for industrial control systems. The technologies are discussed along with their security benefits and design trade-offs.

  7. Spectrophotometric multicomponent analysis applied to trace metal determinations

    SciTech Connect

    Otto, M.; Wegscheider, W.

    1985-01-01

    Quantitative spectrometric analysis of mixture components is featured for systems with low spectral selectivity, namely, in the ultraviolet, visible, and infrared spectral range. Limitations imposed by data reduction schemes based on ordinary multiple regression are shown to be overcome by means of partial least-squares analysis in latent variables. The influences of variables such as noise, band separation, band intensity ratios, number of wavelengths, number of components, number of calibration mixtures, time drift, or deviations from Beer's law on the analytical result have been evaluated under a wide range of conditions, providing a basis to search for new systems applicable to spectrophotometric multicomponent analysis. The practical utility of the method is demonstrated for simultaneous analysis of copper, nickel, cobalt, iron, and palladium down to 2 x 10^-6 M concentrations by use of their diethyldithiocarbamate chelate complexes, with relative errors less than 6%. 26 references, 4 figures, 6 tables.
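
    The linear model underlying multicomponent spectrophotometry can be sketched with synthetic data: under Beer's law, absorbance at each wavelength is a linear mix of component concentrations, so concentrations are recoverable by regression. This shows only the ordinary least-squares baseline whose limitations the paper addresses with partial least squares; the spectra and concentrations are fabricated.

```python
# Multicomponent Beer's law: A = E c, where E holds molar absorptivities
# (wavelengths x components) and c the concentrations. Recover c from a
# measured spectrum by least squares.
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths, n_components = 30, 3
E = rng.uniform(0.1, 1.0, size=(n_wavelengths, n_components))  # absorptivities
c_true = np.array([2e-6, 5e-6, 1e-6])                          # molar concentrations
absorbance = E @ c_true                                        # unit path length

# recover concentrations from the noiseless spectrum
c_est, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
```

    With noise, drift, or Beer's-law deviations added, this plain regression degrades, which is the motivation for the latent-variable approach in the paper.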

  8. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  9. On Delayed and Anticipatory Systems in Applied Mechanics

    NASA Astrophysics Data System (ADS)

    Béda, Péter B.

    2010-11-01

    The stability of an inverted pendulum is a textbook example of control. The easiest case is to put the pendulum on a cart and apply feedback force control on it to keep the upright position stable. This paper compares the no delay case (feed-in-time control: an anticipatory effect) and the delay differential equation approach. Then we study both continuous and discrete time systems. The main aim of the work is to investigate the behaviour of such systems at the stability boundaries by using numerical simulation. The principal points of interest are how continuous time systems differ from discrete time system at a bifurcation point and how time delay or an anticipatory feed-in-time control acts on its behaviour. Other exciting questions are how sampling delay can be taken into consideration and is bifurcation a robust phenomenon.

  10. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time based characteristics yields four groups with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. ( Earth Planet Sci Lett 197:105 106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. 
Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic

  11. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  12. Applying fuzzy logic to power system protective relays

    SciTech Connect

    Kolla, S.R.

    1997-06-01

    Power systems occasionally experience faults resulting from insulation failures caused by atmospheric disturbances or switching surges. If such a fault occurs, it can cause expensive damage to equipment and substantial revenue loss due to service interruption. A faulted element must, therefore, be disconnected without unnecessary delay. For this purpose, protective relays continuously monitor system elements (synchronous generators, transformers, transmission lines, motors, etc.) and isolate faulted elements by operating circuit breakers. Originally, protective relays were designed containing electromechanical devices. Recently, however, rapid advances in digital-processor technology have prompted the application of microprocessors to protective relays. This article presents an application of a fuzzy-logic (FL) technique to microprocessor-based power system protective relays, specifically for identifying unbalanced shunt faults on a power transmission line. 10 figs.
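
    A toy sketch in the spirit of the article, not its actual relay logic: fuzzify a measured phase-current magnitude with triangular membership functions and derive a trip decision. Membership shapes and thresholds are invented for illustration.

```python
# Fuzzy fault classification: membership in NORMAL vs HIGH current sets is
# combined into a fault degree; the relay trips when that degree exceeds 0.5.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_degree(current_pu):
    """current_pu: phase current in per-unit of nominal (1.0 = normal load)."""
    normal = tri(current_pu, 0.0, 1.0, 2.0)     # membership in NORMAL
    high = tri(current_pu, 1.5, 3.0, 100.0)     # membership in HIGH
    # rule base: HIGH current -> fault; NORMAL current -> no fault
    return high / (high + normal) if (high + normal) > 0 else 1.0

trip_load = fault_degree(1.0) > 0.5        # normal load: no trip
trip_fault = fault_degree(6.0) > 0.5       # heavy overcurrent: trip
```

    A real relay would fuzzify several quantities per phase (current, voltage, impedance) and use the rule base to distinguish fault types, but the inference pattern is the same.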

  13. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical mode decomposition (EMD) applied to the analysis and denoising of electrocardiogram and phonocardiogram signals. The objective of this work is to detect cardiac anomalies of a patient automatically. As these anomalies are localized in time, the localization of all the events should be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) makes it possible to overcome the localization problem, but interpretation remains difficult when characterizing the signal precisely. In this work we propose to apply EMD, which has very significant properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals; the analysis and interpretation of these signals are given in the same section. Finally, we introduce an adaptation of the EMD algorithm which appears to be very efficient for denoising.
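
    The core of EMD is the sifting loop, which can be sketched in a simplified form: find local extrema, interpolate upper and lower envelopes, and repeatedly subtract the envelope mean to extract the first intrinsic mode function (IMF). This is an illustration under simplifying assumptions (fixed iteration count, no boundary handling), not the article's adapted algorithm.

```python
# Simplified sifting for the first IMF: on a signal mixing a fast and a slow
# tone, the first IMF should capture the fast component, and the residue the
# slow trend. By construction, imf1 + residue reconstructs the signal exactly.
import numpy as np
from scipy.interpolate import CubicSpline

def first_imf(t, x, n_sift=12):
    h = x.copy()
    for _ in range(n_sift):
        interior = np.arange(1, len(h) - 1)
        maxima = interior[(h[interior] > h[interior - 1]) & (h[interior] > h[interior + 1])]
        minima = interior[(h[interior] < h[interior - 1]) & (h[interior] < h[interior + 1])]
        if len(maxima) < 4 or len(minima) < 4:
            break                                # too few extrema to envelope
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - (upper + lower) / 2.0            # remove the local mean
    return h

t = np.linspace(0.0, 10.0, 1000)
x = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 0.5 * t)  # fast + slow tone
imf1 = first_imf(t, x)
residue = x - imf1
```

    Production EMD implementations add a sifting stop criterion and mirror-extend the signal to tame spline behavior at the boundaries; both matter for real ECG/PCG data.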

  14. Creating Learning Organizations in Higher Education: Applying a Systems Perspective

    ERIC Educational Resources Information Center

    Bui, Hong; Baruch, Yehuda

    2010-01-01

    Purpose: The purpose of this paper is to offer an application of a system model for Senge's five disciplines in higher education (HE) institutions. Design/methodology/approach: The paper utilizes a conceptual framework for the analysis of antecedents and outcomes of Senge's five disciplines, focusing on specific factors unique to the HE sector.…

  15. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  16. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.

  17. Certification methodology applied to the NASA experimental radar system

    NASA Technical Reports Server (NTRS)

    Britt, Charles L.; Switzer, George F.; Bracalente, Emedio M.

    1994-01-01

    The objective of the research is to apply selected FAA certification techniques to the NASA experimental wind shear radar system. Although there is no intent to certify the NASA system, the procedures developed may prove useful to manufacturers that plan to undergo the certification process. The certification methodology for forward-looking wind shear detection radars will require estimation of system performance in several FAA-specified microburst/clutter scenarios as well as the estimation of probabilities of missed and false hazard alerts under general operational conditions. Because of the near-impossibility of obtaining these results experimentally, analytical and simulation approaches must be used. Hazard detection algorithms were developed that derived predictive estimates of aircraft hazard from basic radar measurements of weather reflectivity and radial wind velocity. These algorithms were designed to prevent false alarms due to ground clutter while providing accurate predictions of hazard to the aircraft due to weather. A method of calculation of the probability of missed and false hazard alerts has been developed that takes into account the effect of the various algorithms used in the system and provides estimates of the probability of missed and false alerts per microburst encounter under weather conditions found at Denver, Kansas City, and Orlando. Simulation techniques have been developed that permit the proper merging of radar ground clutter data (obtained from flight tests) with simulated microburst data (obtained from microburst models) to estimate system performance using the microburst/clutter scenarios defined by the FAA.
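
    The missed/false alert arithmetic mentioned above can be sketched with a back-of-the-envelope Gaussian model: if the hazard estimate is normally distributed around zero when no hazard is present and around the true hazard value when one is, both probabilities follow from the normal CDF for a given alert threshold. The threshold and noise figures below are hypothetical, not the FAA scenario values.

```python
# For alert threshold T, hazard mean F, and estimate noise sigma:
#   P(false alert)  = P(estimate > T | no hazard) = 1 - Phi(T / sigma)
#   P(missed alert) = P(estimate < T | hazard)    = Phi((T - F) / sigma)
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def alert_probabilities(threshold, hazard_mean, sigma):
    p_false = 1.0 - normal_cdf(threshold / sigma)
    p_missed = normal_cdf((threshold - hazard_mean) / sigma)
    return p_false, p_missed

# hypothetical: hazard factor 0.13, alert threshold 0.105, noise sigma 0.02
p_false, p_missed = alert_probabilities(0.105, 0.13, 0.02)
```

    The sketch makes the design tension visible: lowering the threshold cuts missed alerts but raises false alerts, which is why the clutter-rejection algorithms and Monte Carlo simulation described in the abstract are needed to estimate both rates realistically.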

  18. An improved AVC strategy applied in distributed wind power system

    NASA Astrophysics Data System (ADS)

    Zhao, Y. N.; Liu, Q. H.; Song, S. Y.; Mao, W.

    2016-08-01

    A traditional AVC strategy is mainly used in wind farms and concerns only the grid connection point, which is not suitable for a distributed wind power system. Therefore, this paper proposes an improved AVC strategy for distributed wind power systems. The strategy takes all nodes of the distribution network into consideration and chooses the node with the most serious voltage deviation as the control point for calculating the reactive power reference. In addition, the distribution principles can be divided into two conditions: when wind generators connect to the network at a single node, the reactive power reference is distributed according to reactive power capacity; when wind generators connect at multiple nodes, the reference is distributed according to sensitivity. Simulation results show the correctness and reliability of the strategy. Compared with a traditional control strategy, the strategy described in this paper makes full use of the generators' reactive power output capability according to the voltage condition of the distribution network and effectively improves the distribution network voltage level.
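
    The single-node distribution principle stated in the abstract can be sketched directly: the reactive power reference for the control point is split among generators in proportion to each machine's reactive capacity. Values are illustrative.

```python
# Capacity-proportional reactive power dispatch: each generator's setpoint is
# its capacity share of the reference, never exceeding its own capability.

def distribute_q(q_ref, capacities):
    total = sum(capacities)
    shares = [q_ref * c / total for c in capacities]
    # no generator may exceed its own reactive capability
    return [min(s, c) for s, c in zip(shares, capacities)]

q_ref = 1.2                     # MVAr required at the worst-voltage node
capacities = [0.5, 0.8, 0.7]    # per-generator reactive capacity, MVAr
setpoints = distribute_q(q_ref, capacities)
```

    The multi-node case in the paper replaces the capacity weights with voltage-sensitivity weights, but the allocation pattern is the same.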

  19. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups, a process which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (DICE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
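
    The clustering step can be sketched with fabricated data: encode each student's answers as a vector, measure pairwise dissimilarity, and apply agglomerative (hierarchical) clustering. The thesis's actual data, coding, and distance choices are not reproduced here.

```python
# Hierarchical clustering of coded student responses: rows are students,
# columns are answer choices (coded 0/1) on six questions. The first three
# students answer alike, as do the last three, so a 2-cluster cut should
# recover the two response patterns.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

responses = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 0, 1, 0],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 0],
])
d = pdist(responses, metric="hamming")      # fraction of answers that differ
groups = fcluster(linkage(d, method="average"), t=2, criterion="maxclust")
```

    Real PER data is messier: the number of groups is not known in advance, which is where the cluster-validity considerations discussed in the thesis come in.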

  20. Integrated preclinical photosafety testing strategy for systemically applied pharmaceuticals.

    PubMed

    Schümann, Jens; Boudon, Stéphanie; Ulrich, Peter; Loll, Nathalie; Garcia, Déborah; Schaffner, René; Streich, Jeannine; Kittel, Birgit; Bauer, Daniel

    2014-05-01

    Phototoxic properties of systemically applied pharmaceuticals may be the cause of serious adverse drug reactions. Therefore, a reliable preclinical photosafety assessment strategy, combining in vitro and in vivo approaches in a quantitative manner, is important and has not been described so far. Here, we report the establishment of an optimized modified murine local lymph node assay (LLNA), adapted for phototoxicity assessment of systemically applied compounds, as well as the test results for 34 drug candidates in this in vivo photo-LLNA. The drug candidates were selected based on their ability to absorb ultraviolet/visible light and the photo irritation factors (PIFs) determined in the well-established in vitro 3T3 neutral red uptake phototoxicity test. An in vivo phototoxic potential was identified for 13 of these drug candidates. The use of multiple dose levels in the described murine in vivo phototoxicity studies enabled the establishment of no- and/or lowest-observed-adverse-effect levels (NOAELs/LOAELs), also supporting human photosafety assessment. An in vitro-in vivo correlation demonstrated that a drug candidate classified as "phototoxic" in vitro is not necessarily phototoxic in vivo. However, the probability for a drug candidate to cause phototoxicity in vivo clearly correlated with the magnitude of the phototoxicity identified in vitro.
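
The photo irritation factor (PIF) mentioned above is the ratio of the IC50 measured without irradiation to the IC50 measured with irradiation in the 3T3 NRU test. A minimal sketch using the cut-off values commonly applied with OECD TG 432; the study itself may use different internal thresholds:

```python
def photo_irritation_factor(ic50_dark, ic50_uv):
    """PIF = IC50(-UV) / IC50(+UV): how much more cytotoxic a compound
    becomes under UV/visible irradiation in the 3T3 NRU test."""
    return ic50_dark / ic50_uv

def classify(pif):
    # Cut-offs as commonly used with OECD TG 432 (assumption: the study
    # abstracted above may apply its own thresholds).
    if pif < 2:
        return "no phototoxicity"
    if pif <= 5:
        return "probable phototoxicity"
    return "phototoxic"
```

A compound whose IC50 drops from 100 to 10 under irradiation has PIF = 10 and would be flagged phototoxic in vitro, which, as the abstract notes, does not necessarily imply phototoxicity in vivo.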

  1. Performance Measurement Analysis System

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.

  2. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  4. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  5. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractal and multifractal are concepts that have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This should not pose a special problem; however, with experimental data the researcher often has to select the range of scales at which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points in the experimental data, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology that involves two basic steps: • Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, so as to account for the implications of reducing the number of points. • Evaluation of the significance of the difference between the slope fitted with the two extreme points and the slope fitted with the available points. We compare the results of applying this methodology with those of the commonly used least squares approach. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
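
One robust alternative to least squares for the slope-fitting step is the Theil-Sen estimator, the median of all pairwise slopes. This is a sketch of the general idea, not the specific robust regression procedure used in this work:

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope (sensitive to outliers)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.polyfit(x, y, 1)[0]

def theil_sen_slope(x, y):
    """Median of pairwise slopes: a robust estimator that tolerates
    outlying points without discarding them by hand."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))]
    return float(np.median(slopes))

# Hypothetical log-log scaling data with one corrupted point
# at the largest scale.
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[-1] += 15.0          # outlier

print(ols_slope(x, y), theil_sen_slope(x, y))
```

The single outlier drags the least-squares slope well above the true value of 2, while the median of pairwise slopes is essentially unaffected, which is the property the abstract exploits to avoid a subjective choice of scale range.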

  6. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of a number of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomic nervous system, and thereby the sleep stages are accompanied by different regulation regimes for the cardiovascular and respiratory systems. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with population growth and aging, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure, and respiration to detect changes of the autonomic nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.

  7. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitute a high percentage of the total solution time, and parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit and explicit. Implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require the solution of lower and upper triangular systems of equations per iteration, which are difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
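
Constructing an FSAI preconditioner is beyond a short example, but the implicit/explicit trade-off described above can be illustrated with the simplest explicit preconditioner, a Jacobi (diagonal) scaling, applied to conjugate gradients via SciPy. The badly scaled SPD test matrix is a stand-in for a stiffness matrix:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# SPD test problem: a 1-D Laplacian made badly scaled on purpose.
n = 100
s = diags(np.linspace(1.0, 1e4, n))
A = (s @ diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) @ s).tocsr()
b = np.ones(n)

counts = {"plain": 0, "jacobi": 0}
def make_counter(key):
    def cb(xk):
        counts[key] += 1
    return cb

# Unpreconditioned CG stalls on this conditioning.
x0, info0 = cg(A, b, maxiter=1000, callback=make_counter("plain"))

# Jacobi (diagonal) preconditioner: M^-1 v = v / diag(A). Like the
# explicit preconditioners discussed above, it needs only elementwise
# work and so parallelizes trivially, but it is far weaker than a
# high-quality implicit or FSAI preconditioner.
d = A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: v / d)
x1, info1 = cg(A, b, maxiter=1000, M=M, callback=make_counter("jacobi"))

print(info0, info1, counts)
```

The iteration counters show the preconditioned solve converging while the plain solve exhausts its iteration budget; FSAI aims at the same parallel structure but with much stronger preconditioning quality.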

  8. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    SciTech Connect

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
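
The Beta Factor model referenced above can be sketched in a few lines: a fraction beta of each train's failure probability is assumed to be a common-cause failure defeating all redundant trains at once. For a one-out-of-three system (all numbers hypothetical):

```python
def one_out_of_three_unavailability(q, beta):
    """Beta-factor model: a fraction `beta` of each train's failure
    probability q is a common-cause failure that takes out all trains.
    A one-out-of-three system fails only if all three trains fail."""
    q_ind = (1.0 - beta) * q      # independent part of each train
    q_ccf = beta * q              # common-cause part, shared by all
    return q_ind ** 3 + q_ccf

q, beta = 1e-2, 0.1
print(one_out_of_three_unavailability(q, beta))
```

With q = 1e-2 and beta = 0.1 the common-cause term (1e-3) dominates the independent term (about 7e-7) by three orders of magnitude, which is why refinements of the original Beta Factor method, like the one described above, can yield substantially more realistic estimates for redundant systems.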

  9. Generalized Statistical Thermodynamics Applied to Small Material Systems

    NASA Astrophysics Data System (ADS)

    Cammarata, Robert

    2012-02-01

    When characterizing the behavior of small material systems, surface effects can strongly influence the thermodynamic behavior and need to be taken into account in a complete thermal physics analysis. Although a variety of approaches have been proposed to incorporate surface effects, they are often restricted to certain types of systems (e.g., those involving incompressible phases) and often invoke thermodynamic parameters that are not well defined for the surface. It is proposed that a generalized statistical mechanics based on the concept of thermodynamic availability (exergy) can be formulated, from which the surface properties and their influence on system behavior can be naturally and rigorously obtained. This availability-based statistical thermodynamics will be presented and its use illustrated in a treatment of nucleation during crystallization.

  10. Estimating and Applying Uncertainties in Probabilistic Tsunami Hazard Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.

    2013-12-01

    An integral part of a probabilistic analysis is the formal inclusion of uncertainties, both those due to a limited understanding of the physical processes (epistemic) and those due to their natural variability (aleatory). Because of the strong non-linearity of the tsunami inundation process, it is important to understand not only the extent of the uncertainties but also how and where to apply them. We can divide the uncertainties into several stages: the source, ocean propagation, and nearshore/inundation. On the source side, many of the uncertainties are identical to those used in probabilistic seismic hazard analysis (PSHA). However, the details of slip distributions are very significant in tsunami excitation, especially for near-field tsunamis. We will show several ways of including slip variability, both stochastic and non-stochastic, by developing a probabilistic set of source scenarios. The uncertainties in ocean propagation are less significant, since modern algorithms are very successful in modeling open-ocean tsunami propagation. However, in the near-shore regime and the inundation, the situation is much more complex. Here, errors in the local elevation models, variability in bottom friction, and the omission of the built environment can lead to significant errors, and details of the implementation of the tsunami algorithms can yield different results. We will discuss the most significant sources of uncertainty and the alternative ways to implement them, using examples from the probabilistic tsunami hazard mapping that we are currently carrying out for the state of California and other regions.

  11. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. The development of a simulation model of the roof assembly enables a risk and sensitivity analysis, in which the most important varying parameters for the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and to determine the most appropriate building materials for a given climate.

  12. Hypercube expert system shell-applying production parallelism. Master's thesis

    SciTech Connect

    Harding, W.A.

    1989-12-01

    This research investigation proposes a hypercube design which supports efficient symbolic computing to permit real-time control of an air vehicle by an expert system. Design efforts are aimed at alleviating common expert system bottlenecks, such as the inefficiency of symbolic programming languages like Lisp and the disproportionate amount of computation time commonly spent in the match phase of the expert system match-select-act cycle. Faster processing of Robotic Air Vehicle (RAV) expert system software is approached through (1) fast production matching using the state-saving Rete match algorithm, (2) efficient shell implementation using the C programming language, and (3) parallel processing of the RAV using multiple copies of a serial expert system shell. In this investigation, the serial C-Language Integrated Production System shell is modified to execute in parallel on the iPSC/2 Hypercube. Speedups achieved with this architecture are quantified through theoretical timing analysis and through comparisons with serial-architecture performance results, earlier designs' results, best-case results, and goal performance.

  13. Adaptive control applied to Space Station attitude control system

    NASA Technical Reports Server (NTRS)

    Lam, Quang M.; Chipman, Richard; Hu, Tsay-Hsin G.; Holmes, Eric B.; Sunkel, John

    1992-01-01

    This paper presents an adaptive control approach to enhance the performance of current attitude control system used by the Space Station Freedom. The proposed control law was developed based on the direct adaptive control or model reference adaptive control scheme. Performance comparisons, subject to inertia variation, of the adaptive controller and the fixed-gain linear quadratic regulator currently implemented for the Space Station are conducted. Both the fixed-gain and the adaptive gain controllers are able to maintain the Station stability for inertia variations of up to 35 percent. However, when a 50 percent inertia variation is applied to the Station, only the adaptive controller is able to maintain the Station attitude.
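
The flavor of direct model reference adaptive control can be conveyed with a scalar MIT-rule example. The first-order plant, gains, and reference signal below are hypothetical and unrelated to the actual Space Station attitude controller:

```python
import math

# Scalar MIT-rule model reference adaptive control sketch:
#   plant:            dy/dt  = -y + k*u        (gain k unknown)
#   reference model:  dym/dt = -ym + r
#   control law:      u = theta * r
#   MIT rule:         dtheta/dt = -gamma * e * ym,   e = y - ym
dt, steps = 0.01, 20000
k, gamma = 2.0, 1.0            # true plant gain and adaptation gain
y = ym = theta = 0.0
for i in range(steps):
    # Slow square-wave reference command.
    r = 1.0 if math.sin(2 * math.pi * 0.05 * i * dt) >= 0 else -1.0
    u = theta * r
    e = y - ym
    y += dt * (-y + k * u)         # forward-Euler plant update
    ym += dt * (-ym + r)           # reference model update
    theta += dt * (-gamma * e * ym)  # gradient-descent gain adaptation
print(theta)
```

The adaptive gain drifts toward the matching value 1/k = 0.5, after which plant and model responses coincide; this mirrors the abstract's point that an adaptive law can recover performance when plant parameters (here the gain, there the inertia) are uncertain.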

  14. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
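
A minimal sine-taper multitaper estimate can be sketched as follows; the synthetic series is hypothetical and stands in for a helioseismic time series:

```python
import numpy as np

def sine_tapers(n, k):
    """First k orthonormal sine tapers of length n:
    v_j[t] = sqrt(2/(n+1)) * sin(pi * j * t / (n+1)), t = 1..n."""
    t = np.arange(1, n + 1)
    return np.array([np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * t / (n + 1))
                     for j in range(1, k + 1)])

def multitaper_spectrum(x, k=5):
    """Average the periodograms of k orthogonally tapered copies of x.
    The variance of the estimate drops roughly as 1/k, at the cost of
    some spectral resolution (narrow modes get broadened)."""
    x = np.asarray(x, float) - np.mean(x)
    tapers = sine_tapers(len(x), k)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return eigenspectra.mean(axis=0)

# Synthetic example: one sinusoid at normalized frequency 0.1 in noise.
rng = np.random.default_rng(0)
n, f0 = 1024, 0.1
x = np.sin(2 * np.pi * f0 * np.arange(n)) + 0.5 * rng.standard_normal(n)
spec = multitaper_spectrum(x, k=5)
freqs = np.fft.rfftfreq(n)
print(freqs[np.argmax(spec)])
```

The peak lands at the injected frequency, while the averaging over tapers smooths the noise floor relative to a single periodogram; the broadening of narrow peaks with increasing k is the effect the abstract reports for narrow low-n modes.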

  15. Numerical Contractor Renormalization applied to strongly correlated systems

    NASA Astrophysics Data System (ADS)

    Capponi, Sylvain

    2006-02-01

    We demonstrate the utility of effective Hamiltonians for studying strongly correlated systems, such as quantum spin systems. After defining local relevant degrees of freedom, the numerical Contractor Renormalization (CORE) method is applied in two steps: (i) building an effective Hamiltonian with longer-ranged interactions up to a certain cut-off using the CORE algorithm and (ii) solving this new model numerically on finite clusters by exact diagonalization and performing finite-size extrapolations to obtain results in the thermodynamic limit. This approach, giving complementary information to analytical treatments of the CORE Hamiltonian, can be used as a semi-quantitative numerical method. For ladder-type geometries, we explicitly check the accuracy of the effective models by increasing the range of the effective interactions until reaching convergence. Our results in both the doped and undoped cases are in good agreement with previously established results. In two dimensions we consider the plaquette lattice and the kagomé lattice as non-trivial test cases for the numerical CORE method. As it becomes more difficult to extend the range of the effective interactions in two dimensions, we propose diagnostic tools (such as the density matrix of the local building block) to ascertain the validity of the basis truncation. On the plaquette lattice we obtain an excellent description of the system in both the disordered and the ordered phases, thereby showing that the CORE method is able to resolve quantum phase transitions. On the kagomé lattice we find that the previously proposed twofold-degenerate S = 1/2 basis can account for a large number of phenomena of the spin-1/2 kagomé system and gives a good starting point to study the doped case.

  16. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  17. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
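
The attrition claim can be explored with a toy geometric model (an assumption for illustration, not the authors' actual analysis): if each proposal is funded independently with probability p and an investigator submits one per year, the chance of remaining unfunded after n years is (1-p)^n:

```python
def prob_no_award(p, years):
    """Chance an investigator submitting one proposal per year goes
    `years` years with no award, assuming independent outcomes."""
    return (1.0 - p) ** years

# At a 20% funding rate, after how many years of consecutive failures
# have more than half of investigators still never been funded?
p = 0.20
years = 1
while prob_no_award(p, years) > 0.5:
    years += 1
print(years)
```

At p = 0.20 the loop stops at 4 years: after 3 years 51.2% of investigators are still unfunded, so a sustained dry spell of only a few years already affects a majority, consistent with the abstract's concern about funding rates near 20%.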

  19. Analysis of possibility of applying the PVDF foil in industrial vibration sensors

    NASA Astrophysics Data System (ADS)

    Wróbel, A.

    2015-11-01

    Many machines make use of piezoelectric effects. Systems with smart materials are often used because they have high application potential; for example, transducers can be applied to obtain the required characteristics of a designed system. Every engineer and designer knows how important a proper mathematical model and method of analysis are. It is also important to consider all parameters of the analyzed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all the system's components, because omitting the influence of any one of them results in inaccuracy in the analysis of the system. The article covers the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. The method of analysis of the vibrating sensor systems will be presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on these characteristics. The main scientific point of the project is to analyze and demonstrate the possibility of applying a new construction with PVDF foil, or any other material belonging to the group of smart materials, in industrial sensors. Currently, vibration level sensors from practically all manufacturers use piezoelectric ceramic plates to generate and detect the vibration of the fork.

  20. BATSE spectroscopy analysis system

    NASA Technical Reports Server (NTRS)

    Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad

    1992-01-01

    The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.

  1. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
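
The core of the LSCE method can be sketched for a single free-decay channel as a Prony-type linear-prediction fit; this is a simplified single-signal illustration, not the multi-reference implementation used for the harp:

```python
import numpy as np

def lsce_poles(h, order, dt):
    """Least-squares complex exponential (Prony-type) sketch: fit an
    AR model h[n] = -sum_k a_k h[n-k] by least squares, then convert
    the roots of the characteristic polynomial into continuous-time
    poles s = log(z)/dt, whose imaginary parts give modal frequencies
    and whose real parts give decay rates."""
    n = len(h)
    A = np.column_stack([h[order - k - 1 : n - k - 1] for k in range(order)])
    rhs = -h[order:n]
    a, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    roots = np.roots(np.concatenate(([1.0], a)))
    return np.log(roots.astype(complex)) / dt

# Synthetic free decay: a single 5 Hz mode decaying at 2 s^-1.
dt = 0.001
t = np.arange(2000) * dt
f, sigma = 5.0, -2.0
h = np.exp(sigma * t) * np.cos(2 * np.pi * f * t)
poles = lsce_poles(h, order=2, dt=dt)
freqs = np.abs(poles.imag) / (2 * np.pi)
print(sorted(freqs))
```

The recovered pole pair reproduces the injected frequency and decay rate; on operating-response data the harmonic excitation components the abstract warns about would appear as additional, lightly damped poles that must be separated from the structural modes.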

  2. A methodology for the probabilistic assessment of system effectiveness as applied to aircraft survivability and susceptibility

    NASA Astrophysics Data System (ADS)

    Soban, Danielle Suzanne

    2001-07-01

    Significant advances have been made recently in applying probabilistic methods to aerospace vehicle concepts. Given the explosive changes in today's political, social, and technological climate, it makes practical sense to try and extrapolate these methods to the campaign analysis level. This would allow the assessment of rapidly changing threat environments as well as technological advancements, aiding today's decision makers. These decision makers use this information in three primary ways: resource allocation, requirements definition, and trade studies between system components. In effect, these decision makers are looking for a way to quantify system effectiveness. Using traditional definitions, one can categorize an aerospace concept, such as an aircraft, as the system. Design and analysis conducted on the aircraft will result in system-level Measures of Effectiveness. System effectiveness, therefore, becomes a function of only that aircraft's design variables and parameters. While this method of analysis can result in the design of a vehicle that is optimized to its own mission and performance requirements, the vehicle remains independent of the role for which it was created: the warfighting environment. It is therefore proposed that the system be redefined as the warfighting environment (campaign analysis) and the problem be considered to have a system-of-systems formulation. A methodology for the assessment of military system effectiveness is proposed. Called POSSEM (PrObabilistic System of System Effectiveness Methodology), the methodology describes the creation of an analysis pathway that links engineering-level changes to campaign-level measures of effectiveness. The methodology includes probabilistic analysis techniques in order to manage the inherent uncertainties in the problem, which are functions of human decision making, rapidly changing threats, and the incorporation of new technologies. An example problem is presented, in which aircraft

  3. Factor Analysis Applied to the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    We present a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  4. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628
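
Phase plane analysis of a management time series can be sketched by pairing each observation with its change to the next period; the monthly receivables figures below are hypothetical:

```python
def phase_plane(series):
    # Pair each value with its change to the next period; plotting these
    # (x, dx) points reveals attractors, cycles, or chaotic wandering.
    return [(x, y - x) for x, y in zip(series, series[1:])]

receivables = [100, 108, 104, 110, 103, 109]  # hypothetical monthly totals
points = phase_plane(receivables)
```

Plotted, points that circle a region suggest a stable operating regime, while points that wander suggest the nonlinear dynamics the article describes.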

  5. Space elevator systems level analysis

    SciTech Connect

    Laubscher, B. E.

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development, which in turn implies a high level of risk for the SE. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  6. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper considers material recycling, global-warming prevention, and economic efficiency across 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results show that although the Biomass Towns recycle material efficiently, global-warming prevention and business profitability are largely neglected in the Biomass Town designs. Moreover, from the viewpoint of operational efficiency, we suggest adjusting the scale of Biomass Towns to enhance efficiency, again applying DEA. We found that DEA captures more potential improvements and indicators than cost-benefit analysis or cost-effectiveness analysis.
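
For the simplest case of one input and one output, DEA efficiency reduces to each unit's output/input ratio relative to the best ratio observed in the sample; the towns and figures below are hypothetical (the general multi-input/multi-output case requires solving a linear program per unit):

```python
def dea_efficiency(inputs, outputs):
    # Single-input/single-output CCR efficiency: each unit's output/input
    # ratio relative to the best ratio in the sample (1.0 = efficient).
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical biomass towns: input = operating cost, output = tonnes recycled
eff = dea_efficiency([10, 20, 15], [8, 12, 15])
```

A score below 1.0 indicates how much the unit would need to shrink its input (or grow its output) to reach the efficient frontier.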

  7. Evaluation of atrazine degradation applied to different energy systems.

    PubMed

    Moreira, Ailton J; Pinheiro, Bianca S; Araújo, André F; Freschi, Gian P G

    2016-09-01

    Atrazine is an herbicide widely used in crops and has drawn attention due to the potential pollution it presents in soil, sediment, water, and food. Since conventional methods are not efficient at degrading persistent organic compounds, new technologies have been developed to remove them, especially advanced oxidation processes (AOPs). This work evaluates the use of different energy sources (ultraviolet (UV), microwave (MW), and combined microwave-ultraviolet (MW-UV) radiation) to degrade the herbicide atrazine through photo-oxidation. These systems achieved degradation rates of around 12 % (UV), 28 % (MW), and 83 % (MW-UV) within 120 s. After the photolytic processes, the samples were analyzed by scanning wavelengths from 190 to 300 nm, where spectral analysis of the signal was used to evaluate the degradation of atrazine and the appearance of other peaks (degradation products). The spectra resulting from the photolytic processes revealed a new signal, which was confirmed by chromatography and indicated a possible pathway of atrazine degradation under the MW-UV photolytic process, generating atrazine-2-hydroxy, atrazine-desethyl-2-hydroxy, and atrazine-desisopropyl-2-hydroxy. In all situations, the chlorine in the atrazine structure was substituted by a hydroxyl group, which lowered the toxicity of the compound through the MW-UV photolytic process. Chromatographic analysis confirmed these preliminary spectrophotometric assessments. It was also observed that the process can be optimized by adjusting the pH of the solution, as shown by a 10 % improvement in the degradation rate at a solution pH of 8.37. PMID:27289373
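
If the degradation is assumed to follow pseudo-first-order kinetics (an assumption made here for illustration, not stated in the abstract), the reported percentages convert to rate constants as follows:

```python
import math

def first_order_rate(fraction_degraded, t_seconds):
    # Assuming pseudo-first-order kinetics, C/C0 = exp(-k*t), so
    # k = -ln(1 - f) / t for a fraction f degraded after time t.
    return -math.log(1.0 - fraction_degraded) / t_seconds

k_uv    = first_order_rate(0.12, 120)  # UV alone,    ~0.0011 1/s
k_mw    = first_order_rate(0.28, 120)  # MW alone,    ~0.0027 1/s
k_mw_uv = first_order_rate(0.83, 120)  # combined MW-UV, ~0.0148 1/s
```

On this assumption the combined MW-UV process is roughly an order of magnitude faster than UV alone, consistent with the reported 83 % versus 12 % removal.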

  8. Identifying a cooperative control mechanism between an applied field and the environment of open quantum systems

    NASA Astrophysics Data System (ADS)

    Gao, Fang; Rey-de-Castro, Roberto; Wang, Yaoxiong; Rabitz, Herschel; Shuang, Feng

    2016-05-01

    Many systems under control with an applied field also interact with the surrounding environment. Understanding the control mechanisms has remained a challenge, especially the role played by the interaction between the field and the environment. In order to address this need, here we expand the scope of the Hamiltonian-encoding and observable-decoding (HE-OD) technique. HE-OD was originally introduced as a theoretical and experimental tool for revealing the mechanism induced by control fields in closed quantum systems. The results of open-system HE-OD analysis presented here provide quantitative mechanistic insights into the roles played by a Markovian environment. Two model open quantum systems are considered for illustration. In these systems, transitions are induced by either an applied field linked to a dipole operator or Lindblad operators coupled to the system. For modest control yields, the HE-OD results clearly show distinct cooperation between the dynamics induced by the optimal field and the environment. Although the HE-OD methodology introduced here is considered in simulations, it has an analogous direct experimental formulation, which we suggest may be applied to open systems in the laboratory to reveal mechanistic insights.

  9. Systems biology applied to vaccine and immunotherapy development

    PubMed Central

    2011-01-01

    Immunotherapies, including vaccines, represent a potent tool to prevent or contain disease with high morbidity or mortality such as infections and cancer. However, despite their widespread use, we still have a limited understanding of the mechanisms underlying the induction of protective immune responses. Immunity comprises a multifaceted set of integrated responses involving a dynamic interaction of thousands of molecules; among these is a growing appreciation for the role innate immunity (i.e., pathogen recognition receptors, PRRs) plays in determining the nature and duration (immune memory) of adaptive T and B cell immunity. The complex network of interactions between immune manipulation of the host (immunotherapy) on one side and innate and adaptive responses on the other might be fully understood only by employing the global level of investigation provided by systems biology. In this framework, the advancement of high-throughput technologies, together with the extensive identification of new genes, proteins and other biomolecules in the "omics" era, facilitates large-scale biological measurements. Moreover, recent development of new computational tools enables the comprehensive and quantitative analysis of the interactions between all of the components of immunity over time. Here, we review recent progress in using systems biology to study and evaluate immunotherapy and vaccine strategies for infectious and neoplastic diseases. Multi-parametric data provide novel and often unsuspected mechanistic insights while enabling the identification of common immune signatures relevant to human investigation, such as the prediction of immune responsiveness, that could improve the design of future immunotherapy trials. This supports the paradigm shift from "empirical" to "knowledge-based" conduct of medicine, and of immunotherapy in particular, leading to patient-tailored treatment. PMID:21933421

  10. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
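
The least-squares principle mentioned above can be sketched in a few lines; the entrance-score and GPA figures below are hypothetical institutional data:

```python
def ols(x, y):
    # Least-squares fit of y = a + b*x; returns intercept, slope, residuals.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return a, b, residuals

# Hypothetical institutional data: entrance score vs. first-year GPA
a, b, res = ols([50, 60, 70, 80], [2.0, 2.4, 2.8, 3.2])
```

Inspecting `res` for patterns or outliers is the residual-analysis step the abstract refers to; influence statistics and multicollinearity checks build on the same fitted model.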

  11. SUBSURFACE VISUAL ALARM SYSTEM ANALYSIS

    SciTech Connect

    D.W. Markman

    2001-08-06

    The "Subsurface Fire Hazard Analysis" (CRWMS M&O 1998, page 61) and the "Title III Evaluation Report for the Surface and Subsurface Communication System" (CRWMS M&O 1999a, pages 21 and 23) both indicate that the installed communication system is adequate to support Exploratory Studies Facility (ESF) activities, with the exception of the mine phone system for emergency notification purposes. They recommend the installation of a visual alarm system to supplement the page/party phone system. The purpose of this analysis is to identify data communication highway design approaches and provide justification for the selected or recommended alternatives for the data communication of the subsurface visual alarm system. This analysis is being prepared to document a basis for the design selection of the data communication method. It will briefly describe existing data or voice communication or monitoring systems within the ESF and examine how these may be revised or adapted to support the needed data highway of the subsurface visual alarm system. The existing PLC communication system installed in the subsurface provides data communication for the alcove No. 5 ventilation fans, south portal ventilation fans, bulkhead doors, and generator monitoring system. It is given that the data communication of the subsurface visual alarm system will be a digital-based system. It is also given that it is most feasible to take advantage of existing systems and equipment rather than consider an entirely new data communication system design and installation. The scope and primary objectives of this analysis are to: (1) Briefly review and describe existing available data communication highways or systems within the ESF. (2) Examine the technical characteristics of existing systems; disqualifying a design alternative on technical grounds is paramount in minimizing the number and depth of system reviews. (3) Apply general engineering design practices or criteria such as relative cost, and degree of

  12. Systems Analysis Sub site

    SciTech Connect

    EERE

    2012-03-16

    Systems analysis provides direction, focus, and support for the development and introduction of hydrogen production, storage, and end-use technologies, and provides a basis for recommendations on a balanced portfolio of activities.

  13. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
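
A minimal sketch of combining independent uncertainty components by root-sum-square, the standard step in such an analysis; the PV measurement components and values below are illustrative assumptions, not figures from this presentation:

```python
import math

def combined_uncertainty(components):
    # Root-sum-square combination of independent 1-sigma components.
    return math.sqrt(sum(u * u for u in components))

# Hypothetical PV module power measurement (all in watts, 1-sigma):
u_c = combined_uncertainty([0.8,   # irradiance sensor calibration
                            0.5,   # temperature correction
                            0.3])  # electronic load / DAQ
U = 2 * u_c   # expanded uncertainty, coverage factor k = 2
```

The interval `measured_power ± U` is then the stated range within which the true value is believed to lie, at roughly 95 % coverage.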

  14. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
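
The frequency and damping that the analysis compares across policies follow directly from the coefficients of a second-order system; the economic "mass", "damping", and "stiffness" values below are illustrative, with the second case standing in for a policy feedback that adds damping:

```python
import math

def frequency_and_damping(m, c, k):
    # For m*x'' + c*x' + k*x = 0: natural frequency and damping ratio.
    wn = math.sqrt(k / m)
    zeta = c / (2.0 * math.sqrt(k * m))
    return wn, zeta

# Hypothetical economy: baseline vs. a stabilizing policy feedback
wn0, z0 = frequency_and_damping(1.0, 0.2, 4.0)  # baseline: lightly damped
wn1, z1 = frequency_and_damping(1.0, 0.8, 4.0)  # with policy feedback
```

Here the feedback leaves the cycle frequency unchanged but quadruples the damping ratio, so random disturbances die out faster, which is the kind of comparison the paper draws between monetary strategies.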

  15. Launch vehicle systems design analysis

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Verderaime, V.

    1993-01-01

    Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.

  16. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…
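
Duration analysis with censored observations, as in the hazard model above, can be sketched with a product-limit (Kaplan-Meier) survival estimate; the pupil durations below are hypothetical, and ties between an event and a censoring time are not handled:

```python
def kaplan_meier(durations, events):
    # Product-limit estimate of S(t): at each observed event time,
    # multiply by (1 - 1/at-risk). events[i] is False if censored.
    data = sorted(zip(durations, events))
    n_at_risk = len(data)
    curve, s = [], 1.0
    for t, observed in data:
        if observed:
            s *= 1.0 - 1.0 / n_at_risk
            curve.append((t, s))
        n_at_risk -= 1
    return curve

# Hypothetical months until each pupil masters reading (False = still learning
# when the study ended, i.e. censored)
curve = kaplan_meier([3, 5, 6, 8, 10], [True, True, False, True, True])
```

Each pair in `curve` gives the estimated probability that mastery takes longer than that many months; covariates such as auditory discrimination would enter through a hazard model fitted on top of durations like these.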

  17. Some Applied Research Concerns Using Multiple Linear Regression Analysis.

    ERIC Educational Resources Information Center

    Newman, Isadore; Fraas, John W.

    The intention of this paper is to provide an overall reference on how a researcher can apply multiple linear regression in order to utilize the advantages that it has to offer. The advantages and some concerns expressed about the technique are examined. A number of practical ways by which researchers can deal with such concerns as…

  18. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.

  19. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  20. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  2. Data analysis systems.

    NASA Astrophysics Data System (ADS)

    Wells, Donald C.

    The following sections are included: * INTRODUCTION * WHY SYSTEMS? * PROJECT MANAGEMENT REASONS * TECHNICAL REASONS * THE TIME-SCALE PROBLEM * WHAT END-USERS WANT * DATA INTERCHANGE FORMATS * PROJECT MANAGEMENT ISSUES * THE EXPORT POLICY * GROUP PROGRAMMING PSYCHOLOGY * THE CASE FOR SYSTEM PORTABILITY * HOW TO BUILD PORTABLE DATA ANALYSIS SYSTEMS * ON THE DESIGN OF COMMAND LANGUAGES * LAYERED VIRTUAL INTERFACES IN AIPS * STATUS REPORT ON NRAO'S AIPS PROJECT * AIPS DEVELOPMENT PLANS * CONCLUSION * DISCUSSION * REFERENCES * BIBLIOGRAPHY—FOR FURTHER READING

  3. CONVEYOR SYSTEM SAFETY ANALYSIS

    SciTech Connect

    M. Salem

    1995-06-23

    The purpose and objective of this analysis is to systematically identify and evaluate hazards related to the Yucca Mountain Project Exploratory Studies Facility (ESF) surface and subsurface conveyor system (for a list of conveyor subsystems see section 3). This process is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach was used since a radiological System Safety Analysis is not required. The risk assessment in this analysis characterizes the accident scenarios associated with the conveyor structures/systems/components in terms of relative risk and includes recommendations for mitigating all identified risks. The priority for recommending and implementing mitigation control features is: (1) Incorporate measures to reduce risks and hazards into the structure/system/component (S/S/C) design, (2) add safety devices and capabilities to the designs that reduce risk, (3) provide devices that detect and warn personnel of hazardous conditions, and (4) develop procedures and conduct training to increase worker awareness of potential hazards, on methods to reduce exposure to hazards, and on the actions required to avoid accidents or correct hazardous conditions. The scope of this analysis is limited to the hazards related to the design of conveyor structures/systems/components (S/S/Cs) that occur during normal operation. Hazards occurring during assembly, test, and maintenance or "off normal" operations have not been included in this analysis. Construction-related work activities are specifically excluded per DOE Order 5481.1B section 4.c.

  4. Scale invariance analysis for genetic networks applying homogeneity.

    PubMed

    Bernuau, Emmanuel; Efimov, Denis; Perruquetti, Wilfrid

    2016-05-01

    Scalability is a property describing the change of the trajectory of a dynamical system under a scaling of the input stimulus and of the initial conditions. Particular cases of scalability include scale invariance and fold-change detection (when the scaling of the input does not influence the system output). In the present paper it is shown that homogeneous systems have this scalability property, while locally homogeneous systems approximately possess it. These facts are used for detecting scale invariance or approximate scalability (far from a steady state) in several biological systems. The results are illustrated by various regulatory networks. PMID:26304616
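
The scalability property can be checked on a toy homogeneous system, x' = -x^3, whose closed-form solution makes the state-scaling/time-dilation relation explicit. The system and numbers are chosen here purely for illustration, not taken from the paper:

```python
def trajectory(x0, t):
    # Closed-form solution of the homogeneous system x' = -x**3:
    # x(t) = x0 / sqrt(1 + 2*x0**2*t).
    return x0 / (1.0 + 2.0 * x0 * x0 * t) ** 0.5

# Homogeneity predicts: starting from lam*x0 equals scaling the state by lam
# and dilating time by lam**2 (the homogeneity degree of this system).
lam, x0, t = 3.0, 1.0, 0.7
scaled_start  = trajectory(lam * x0, t)
rescaled_time = lam * trajectory(x0, lam ** 2 * t)
```

The two quantities agree exactly, which is the trajectory-scaling behaviour the paper establishes for homogeneous systems in general.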

  5. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
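
The conversion from a measured deviation angle to a refractive-index gradient can be sketched with the small-angle schlieren relation; the 1 cm optical path and n = 1.33 below are assumed values for illustration, not taken from the flight experiment:

```python
def index_gradient(deviation_rad, path_length_m, n_medium):
    # Small-angle schlieren relation: deviation ~ (L / n) * dn/dy,
    # so the transverse refractive-index gradient is eps * n / L.
    return deviation_rad * n_medium / path_length_m

# Assumed values: 0.5 mrad angular resolution (as quoted in the abstract),
# 1 cm cell path, aqueous solution with n ~ 1.33
grad = index_gradient(0.5e-3, 0.01, 1.33)  # dn/dy, per metre
```

Mapping such gradients across the cell, and then relating refractive index to solute concentration, yields the concentration maps described above.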

  6. Coal systems analysis

    SciTech Connect

    Warwick, P.D.

    2005-07-01

    This collection of papers provides an introduction to the concept of coal systems analysis and contains examples of how coal systems analysis can be used to understand, characterize, and evaluate coal and coal gas resources. Chapter are: Coal systems analysis: A new approach to the understanding of coal formation, coal quality and environmental considerations, and coal as a source rock for hydrocarbons by Peter D. Warwick. Appalachian coal assessment: Defining the coal systems of the Appalachian Basin by Robert C. Milici. Subtle structural influences on coal thickness and distribution: Examples from the Lower Broas-Stockton coal (Middle Pennsylvanian), Eastern Kentucky Coal Field, USA by Stephen F. Greb, Cortland F. Eble, and J.C. Hower. Palynology in coal systems analysis The key to floras, climate, and stratigraphy of coal-forming environments by Douglas J. Nichols. A comparison of late Paleocene and late Eocene lignite depositional systems using palynology, upper Wilcox and upper Jackson Groups, east-central Texas by Jennifer M.K. O'Keefe, Recep H. Sancay, Anne L. Raymond, and Thomas E. Yancey. New insights on the hydrocarbon system of the Fruitland Formation coal beds, northern San Juan Basin, Colorado and New Mexico, USA by W.C. Riese, William L. Pelzmann, and Glen T. Snyder.

  7. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allows improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  8. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  9. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  10. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    PubMed

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology based on arrays of cross-sensitive chemical sensors and chemometric data processing has attracted a lot of researchers' attention over the last years. Several reported applications dealing with pharmaceutical tasks have employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrine, saccharin sodium and citric acid in various combinations. To provide for a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongues' responses to the samples. These were principal component analysis (PCA), RV' matrix correlation coefficients and Tucker's congruency coefficients. PMID:26099261
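
Principal component analysis, used above to compare the instruments' responses, can be sketched for two-dimensional data via the closed-form eigenvalues of the 2x2 covariance matrix; the two-sensor readings below are hypothetical:

```python
import math

def pca_2d(points):
    # Variances along the two principal axes of 2-D data, from the
    # closed-form eigenvalues of the 2x2 covariance matrix.
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc  # variances along PC1, PC2

# Hypothetical responses of two sensors to five samples (nearly collinear)
l1, l2 = pca_2d([(1, 2), (2, 4.1), (3, 5.9), (4, 8.2), (5, 10)])
```

Here almost all variance falls on the first component, i.e. the two sensors respond nearly redundantly; comparing such component structures across instruments is what the PCA step in the study does.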

  12. Consulting with Parents: Applying Family Systems Concepts and Techniques.

    ERIC Educational Resources Information Center

    Mullis, Fran; Edwards, Dana

    2001-01-01

    This article describes family systems concepts and techniques that school counselors, as consultants, can use to better understand the family system. The concepts are life cycle transitions and extrafamilial influences, extended family influences, boundaries, parental hierarchy and power, and triangulation. (Contains 39 references.) (GCP)

  13. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation, the formation in Hungary most suitable for the disposal of high-level radioactive waste. Rock colours are reddish brown or brownish red, or any shade between brown and red. The method presented here can be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
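A minimal sketch of such a digital colour log, assuming the scanned core arrives as an RGB array of shape (depth pixels, width pixels, 3). The per-row channel averaging and the red-fraction index below are illustrative choices for separating reddish from brownish intervals, not the authors' published algorithm.

```python
import numpy as np

def colour_log(core_image):
    """Average RGB per scan row: a 1-D colour 'log' down the core."""
    return core_image.reshape(core_image.shape[0], -1, 3).mean(axis=1)

def red_index(rgb_log):
    """Simple hue proxy: red channel as a fraction of total intensity."""
    totals = rgb_log.sum(axis=1)
    return rgb_log[:, 0] / np.where(totals == 0, 1, totals)

# synthetic core scan: a redder unit above a browner unit
scan = np.zeros((100, 40, 3))
scan[:50] = (190, 60, 45)    # "brownish red"
scan[50:] = (130, 85, 55)    # "reddish brown"
log = colour_log(scan)
index = red_index(log)
```

Gradual colour transitions would show up as smooth ramps in `index` with depth, which is the signal a cyclostratigraphic analysis would then inspect for periodicity.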

  14. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which then can be used to calculate the corrections necessary to improve coupling, dynamic aperture and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All necessary tools to make the analysis method usable at PEP-II have been implemented, and LOCO can now be used as a routine tool for lattice diagnostics.
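At its core, a LOCO-style analysis fits lattice model parameters to the measured orbit response matrix by least squares. The sketch below shows one Gauss-Newton step with a numerical Jacobian; `model(p)` and the measurement callable are placeholder assumptions standing in for the accelerator model and data, not the Matlab LOCO API.

```python
import numpy as np

def fit_gradients(measure, model, p0, delta=1e-6):
    """One Gauss-Newton step of a LOCO-like fit: adjust model parameters p
    so the model response matrix matches the measured one."""
    M = measure().ravel()
    def resid(p):
        return model(p).ravel() - M
    r0 = resid(p0)
    # numerical Jacobian of the residual w.r.t. each fit parameter
    J = np.column_stack([
        (resid(p0 + delta * e) - r0) / delta
        for e in np.eye(len(p0))
    ])
    dp, *_ = np.linalg.lstsq(J, -r0, rcond=None)
    return p0 + dp
```

For a real lattice the step is iterated until the residual stops shrinking; for a linear toy model a single step recovers the parameters exactly.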

  15. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.
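The AMMI decomposition itself is compact: fit the additive main effects of a genotype × environment yield table, then take an SVD of the interaction residual. A hedged numpy sketch of that structure follows (the study itself used the agricolae R package, not this code; the random table is purely illustrative).

```python
import numpy as np

def ammi(Y, n_terms=1):
    """Additive Main effects and Multiplicative Interaction fit of a
    genotypes x environments yield table Y."""
    grand = Y.mean()
    g = Y.mean(axis=1, keepdims=True) - grand    # genotype main effects
    e = Y.mean(axis=0, keepdims=True) - grand    # environment main effects
    additive = grand + g + e
    U, S, Vt = np.linalg.svd(Y - additive)       # interaction structure
    recon = additive + (U[:, :n_terms] * S[:n_terms]) @ Vt[:n_terms]
    return additive, recon

rng = np.random.default_rng(1)
Y = rng.normal(5.0, 1.0, size=(6, 4))            # toy 6-genotype, 4-environment trial
additive, recon = ammi(Y, n_terms=4)
```

Keeping only the first one or two multiplicative terms gives the usual AMMI1/AMMI2 biplots used to judge genotype stability across environments.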

  16. Systems biology applied to heart failure with normal ejection fraction.

    PubMed

    Mesquita, Evandro Tinoco; Jorge, Antonio Jose Lagoeiro; Souza Junior, Celso Vale de; Cassino, João Paulo Pedroza

    2014-05-01

    Heart failure with normal ejection fraction (HFNEF) is currently the most prevalent clinical phenotype of heart failure. However, the treatments available have shown no reduction in mortality so far. Advances in the omics sciences and techniques of high data processing used in molecular biology have enabled the development of an integrating approach to HFNEF based on systems biology. This study aimed at presenting a systems-biology-based HFNEF model using the bottom-up and top-down approaches. A literature search was conducted for studies published between 1991 and 2013 regarding HFNEF pathophysiology, its biomarkers and systems biology. A conceptual model was developed using bottom-up and top-down approaches of systems biology. The use of systems-biology approaches for HFNEF, a complex clinical syndrome, can be useful to better understand its pathophysiology and to discover new therapeutic targets.

  17. Systems Biology Applied to Heart Failure With Normal Ejection Fraction

    PubMed Central

    Mesquita, Evandro Tinoco; Jorge, Antonio Jose Lagoeiro; de Souza, Celso Vale; Cassino, João Paulo Pedroza

    2014-01-01

    Heart failure with normal ejection fraction (HFNEF) is currently the most prevalent clinical phenotype of heart failure. However, the treatments available have shown no reduction in mortality so far. Advances in the omics sciences and techniques of high data processing used in molecular biology have enabled the development of an integrating approach to HFNEF based on systems biology. This study aimed at presenting a systems-biology-based HFNEF model using the bottom-up and top-down approaches. A literature search was conducted for studies published between 1991 and 2013 regarding HFNEF pathophysiology, its biomarkers and systems biology. A conceptual model was developed using bottom-up and top-down approaches of systems biology. The use of systems-biology approaches for HFNEF, a complex clinical syndrome, can be useful to better understand its pathophysiology and to discover new therapeutic targets. PMID:24918915

  18. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is comprised almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  19. Biomedical systems analysis program

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Biomedical monitoring programs which were developed to provide a system analysis context for a unified hypothesis for adaptation to space flight are presented and discussed. A real-time system of data analysis and decision making to assure the greatest possible crew safety and mission success is described. Information about man's abilities, limitations, and characteristic reactions to weightless space flight was analyzed and simulation models were developed. The predictive capabilities of simulation models for fluid-electrolyte regulation, erythropoiesis regulation, and calcium regulation are discussed.

  20. Integrated analysis environment for high impact systems

    SciTech Connect

    Martinez, M.; Davis, J.; Scott, J.; Sztipanovits, J.; Karsai, G.

    1998-02-01

    Modeling and analysis of high consequence, high assurance systems requires special modeling considerations. System safety and reliability information must be captured in the models. Previously, high consequence systems were modeled using separate, disjoint models for safety, reliability, and security. The MultiGraph Architecture facilitates the implementation of a model integrated system for modeling and analysis of high assurance systems. Model integrated computing allows an integrated modeling technique to be applied to high consequence systems. Among the tools used for analyzing safety and reliability are a behavioral simulator and an automatic fault tree generation and analysis tool. Symbolic model checking techniques are used to efficiently investigate the system models. A method for converting finite state machine models to ordered binary decision diagrams allows the application of symbolic model checking routines to the integrated system models. This integrated approach to modeling and analysis of high consequence systems ensures consistency between the models and the different analysis tools.
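The symbolic model checking step described above reduces to a reachability fixed point over the system's transition relation. The sketch below performs that sweep with explicit Python sets rather than ordered binary decision diagrams; a BDD package would represent the same state sets implicitly. The toy finite state machine is invented for illustration.

```python
def reachable(init, transitions):
    """Fixed-point reachability over an explicit transition relation:
    the set-based analogue of the symbolic (BDD) reachability sweep."""
    frontier, seen = set(init), set(init)
    while frontier:
        successors = {t for s in frontier for t in transitions.get(s, ())}
        frontier = successors - seen
        seen |= frontier
    return seen

def is_safe(init, transitions, bad):
    """A safety property holds if no 'bad' state is reachable."""
    return reachable(init, transitions).isdisjoint(bad)

# toy high-consequence FSM: firing should only be reachable via 'armed'
fsm = {"idle": ("armed",), "armed": ("fire", "idle"), "fire": ("idle",)}
```

On real models the explicit sets blow up exponentially, which is exactly why the integrated environment converts finite state machine models to BDDs before running the same fixed-point computation.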

  1. Microcomputer log analysis system

    SciTech Connect

    Ostrander, C.

    1984-04-01

A comprehensive, user-friendly log analysis system for use on a microcomputer requires only average log analysis skills, whereas most systems require both a log analyst and a computer professional for operation. This one has many capabilities: (1) data entry is handled by office personnel after minimal training; (2) entered data are filed and cataloged for future retrieval and analysis; (3) the system can handle more than 9,000,000 ft (2700 km) of log data in over 60,000 files; (4) all data can be edited; (5) searches and listings can be made using factors such as formation names; (6) facsimile reproductions can be made of any log on file; (7) a screening program turns the system into a sophisticated hand calculator to quickly determine zones of interest; and (8) up to 1100 ft (335 m) of contiguous data from a well can be analyzed in one run. Innovative features include: (1) a discriminating factor to separate reservoirs for individual attention concerning rock type, fluid content and potential reserves; and (2) a written report on each reservoir using artificial intelligence. The report discusses, among other things, the rock type and its consistency, comparing the system's finding with the geologist's opinion. Differences between the two will elicit alternative analyses.

  2. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  3. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  4. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in a movie, 'The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  5. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  6. Applying New Network Security Technologies to SCADA Systems.

    SciTech Connect

    Hurd, Steven A; Stamp, Jason Edwin; Duggan, David P; Chavez, Adrian R.

    2006-11-01

Supervisory Control and Data Acquisition (SCADA) systems for automation are very important for critical infrastructure and manufacturing operations. They have been implemented to work in a number of physical environments using a variety of hardware, software, networking protocols, and communications technologies, often before security issues became of paramount concern. To offer solutions to security shortcomings in the short/medium term, this project was to identify technologies used to secure "traditional" IT networks and systems, and then assess their efficacy with respect to SCADA systems. These proposed solutions must be relatively simple to implement, reliable, and acceptable to SCADA owners and operators.

  7. Aircraft Electric Propulsion Systems Applied Research at NASA

    NASA Technical Reports Server (NTRS)

    Clarke, Sean

    2015-01-01

    Researchers at NASA are investigating the potential for electric propulsion systems to revolutionize the design of aircraft from the small-scale general aviation sector to commuter and transport-class vehicles. Electric propulsion provides new degrees of design freedom that may enable opportunities for tightly coupled design and optimization of the propulsion system with the aircraft structure and control systems. This could lead to extraordinary reductions in ownership and operating costs, greenhouse gas emissions, and noise annoyance levels. We are building testbeds, high-fidelity aircraft simulations, and the first highly distributed electric inhabited flight test vehicle to begin to explore these opportunities.

  8. [Dichotomizing method applied to calculating equilibrium constant of dimerization system].

    PubMed

    Cheng, Guo-zhong; Ye, Zhi-xiang

    2002-06-01

Arbitrary trivariate algebraic equations are formed based on the combination principle. A univariate algebraic equation for the equilibrium constant kappa of the dimerization system is obtained through a series of algebraic transformations, and whether the equation is solvable depends on the properties of monotonic functions. If the equation is solvable, the equilibrium constant of the dimerization system is obtained by dichotomy (bisection), and the final equilibrium constant is determined according to the principle of error of fitting. The equilibrium constants of trisulfophthalocyanine and biosulfophthalocyanine obtained with this method are 47,973.4 and 30,271.8, respectively. These results are much better than those reported previously.
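The dichotomizing step is the generic bisection search over a monotone function shown below. The dimerization mass balance used in the demo, m + 2Km² = C, is a standard textbook illustration and an assumption here; the paper's actual trivariate equations are not reproduced in the abstract.

```python
def bisect(f, lo, hi, tol=1e-12, max_iter=200):
    """Dichotomizing (bisection) search for the root of a monotonic f
    with f(lo) and f(hi) of opposite sign."""
    flo = f(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if fmid == 0.0 or hi - lo < tol:
            return mid
        if (flo > 0) == (fmid > 0):
            lo, flo = mid, fmid      # root lies in the upper half
        else:
            hi = mid                 # root lies in the lower half
    return 0.5 * (lo + hi)

root2 = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
# hypothetical dimerization: monomer m satisfies m + 2*K*m^2 = C
m = bisect(lambda m: m + 2 * 5000.0 * m * m - 1e-3, 0.0, 1e-3)
```

Because the bracketing interval halves every iteration, the method converges unconditionally on a monotone equation, which is why solvability in the paper hinges only on monotonicity.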

  9. EG&G Mound Applied Technologies payroll system

    SciTech Connect

    Not Available

    1992-02-07

EG&G Mound Applied Technologies, Inc., manages and operates the Mound Facility, Miamisburg, Ohio, under a cost-plus-award-fee contract administered by the Department of Energy's Albuquerque Field Office. The contractor's Payroll Department is responsible for prompt payment in the proper amount to all persons entitled to be paid, in compliance with applicable laws, regulations, and legal decisions. The objective was to determine whether controls were in place to avoid erroneous payroll payments. EG&G Mound Applied Technologies, Inc., did not have all the internal controls required by General Accounting Office Title 6, "Pay, Leave, and Allowances." Specifically, it did not have computerized edits, separation of duties and responsibilities, or restricted access to payroll data files. This condition occurred because its managers were not aware of the Title 6 requirements. As a result, the contractor could not assure the Department of Energy that payroll costs were processed accurately, and fraud, waste, or abuse of Department of Energy funds could go undetected. Our sample of 212 payroll transactions from a population of 66,000 in FY 1991 disclosed only two minor processing errors and no instances of fraud, waste, or abuse.

  10. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  11. Applied estimation for hybrid dynamical systems using perceptional information

    NASA Astrophysics Data System (ADS)

    Plotnik, Aaron M.

    This dissertation uses the motivating example of robotic tracking of mobile deep ocean animals to present innovations in robotic perception and estimation for hybrid dynamical systems. An approach to estimation for hybrid systems is presented that utilizes uncertain perceptional information about the system's mode to improve tracking of its mode and continuous states. This results in significant improvements in situations where previously reported methods of estimation for hybrid systems perform poorly due to poor distinguishability of the modes. The specific application that motivates this research is an automatic underwater robotic observation system that follows and films individual deep ocean animals. A first version of such a system has been developed jointly by the Stanford Aerospace Robotics Laboratory and Monterey Bay Aquarium Research Institute (MBARI). This robotic observation system is successfully fielded on MBARI's ROVs, but agile specimens often evade the system. When a human ROV pilot performs this task, one advantage that he has over the robotic observation system in these situations is the ability to use visual perceptional information about the target, immediately recognizing any changes in the specimen's behavior mode. With the approach of the human pilot in mind, a new version of the robotic observation system is proposed which is extended to (a) derive perceptional information (visual cues) about the behavior mode of the tracked specimen, and (b) merge this dissimilar, discrete and uncertain information with more traditional continuous noisy sensor data by extending existing algorithms for hybrid estimation. These performance enhancements are enabled by integrating techniques in hybrid estimation, computer vision and machine learning. First, real-time computer vision and classification algorithms extract a visual observation of the target's behavior mode. Existing hybrid estimation algorithms are extended to admit this uncertain but discrete
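The core of merging an uncertain, discrete visual cue with hybrid estimation is a discrete Bayes update over the behaviour modes: predict with the mode-transition matrix, then weight by the likelihood the vision classifier assigns to each mode. The sketch below uses invented modes, transition probabilities and likelihoods for illustration; it is not the dissertation's algorithm, which also couples this update to the continuous-state filters.

```python
import numpy as np

def mode_update(belief, trans, likelihood):
    """One discrete Bayes step over behaviour modes: predict with the
    mode-transition matrix, then correct with the likelihood of an
    uncertain visual mode observation."""
    predicted = trans.T @ belief        # prior after mode dynamics
    posterior = predicted * likelihood  # weight by observation likelihood
    return posterior / posterior.sum()  # renormalise

# hypothetical modes: 0 = drifting, 1 = swimming
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])          # row i: P(next mode | current mode i)
belief = np.array([0.5, 0.5])
belief = mode_update(belief, trans, np.array([0.8, 0.2]))  # vision favours "drifting"
```

Even a weak visual cue sharpens the mode belief, which is exactly the improvement over relying on poorly distinguishable continuous dynamics alone.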

  12. System Configured For Applying Multiple Modifying Agents To A Substrate.

    DOEpatents

    Propp, W. Alan; Argyle, Mark D.; Janikowski, Stuart K.; Fox, Robert V.; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Miller, David L.

    2005-11-08

    The present invention is related to the modifying of substrates with multiple modifying agents in a single continuous system. At least two processing chambers are configured for modifying the substrate in a continuous feed system. The processing chambers can be substantially isolated from one another by interstitial seals. Additionally, the two processing chambers can be substantially isolated from the surrounding atmosphere by end seals. Optionally, expansion chambers can be used to separate the seals from the processing chambers.

  13. System configured for applying multiple modifying agents to a substrate

    DOEpatents

    Propp, W. Alan; Argyle, Mark D.; Janikowski, Stuart K.; Fox, Robert V.; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Miller, David L.

    2003-11-25

    The present invention is related to the modifying of substrates with multiple modifying agents in a single continuous system. At least two processing chambers are configured for modifying the substrate in a continuous feed system. The processing chambers can be substantially isolated from one another by interstitial seals. Additionally, the two processing chambers can be substantially isolated from the surrounding atmosphere by end seals. Optionally, expansion chambers can be used to separate the seals from the processing chambers.

  14. [Theoretic and applicative aspects of applying of formulary system in military medicine].

    PubMed

    Belevitin, A E; Miroshnichenko, Iu V; Goriachev, A B; Bunin, S A; Krasavin, K D

    2010-08-01

Development of medicinal support in military medicine can be realized only through the introduction of a formulary system. This system forms the informational and methodological basis for achieving a socially necessary level of drug usage. On the basis of medical standards and morbidity analysis, a formulary of pharmaceuticals is worked out, which can help to reduce the nomenclature of drugs applied and improve the efficiency of medicinal support. The medical service of the Armed Forces of the Russian Federation has experience in the development of formularies, but it is too early to speak of the introduction of the formulary system into the routine of military medicine. Development of medicinal support in military medicine on the basis of the formulary system will help satisfy the medical and social needs of servicemen, military retirees and members of their families. PMID:21089425

  15. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  16. Analysis of Facial Aesthetics as Applied to Injectables.

    PubMed

    Lam, Samuel M; Glasgold, Robert; Glasgold, Mark

    2015-11-01

Understanding the role of volume loss in the aging face has resulted in a paradigm shift in facial rejuvenation techniques. Injection of materials for volume restoration is among the most widespread cosmetic procedures performed. A new approach to the aesthetics of facial aging is necessary to allow the greatest improvement from volumetric techniques while maintaining natural-appearing results. Examining the face in terms of facial frames and facial shadows provides the fundamental basis for our injectable analysis.

  17. Applying hydraulic transient analysis: The Grizzly Hydro Project

    SciTech Connect

Logan, T.H.; Stutsman, R.D.

    1992-04-01

No matter the size of the hydro plant, if it has a long waterway and will operate in peaking mode, the project designer needs to address the issue of hydraulic transients, known as water hammer, early in the design. This article describes the application of transient analysis to the design of a 20-MW hydro plant in California. In this case, a Howell-Bunger valve was used as a pressure-regulating valve to control transient pressures and speed rise.

  18. Applied Space Systems Engineering. Chapter 17; Manage Technical Data

    NASA Technical Reports Server (NTRS)

    Kent, Peter

    2008-01-01

    Effective space systems engineering (SSE) is conducted in a fully electronic manner. Competitive hardware, software, and system designs are created in a totally digital environment that enables rapid product design and manufacturing cycles, as well as a multitude of techniques such as modeling, simulation, and lean manufacturing that significantly reduce the lifecycle cost of systems. Because the SSE lifecycle depends on the digital environment, managing the enormous volumes of technical data needed to describe, build, deploy, and operate systems is a critical factor in the success of a project. This chapter presents the key aspects of Technical Data Management (TDM) within the SSE process. It is written from the perspective of the System Engineer tasked with establishing the TDM process and infrastructure for a major project. Additional perspectives are reflected from the point of view of the engineers on the project who work within the digital engineering environment established by the TDM toolset and infrastructure, and from the point of view of the contactors who interface via the TDM infrastructure. Table 17.1 lists the TDM process as it relates to SSE.

  19. Applied patent RFID systems for building reacting HEPA air ventilation system in hospital operation rooms.

    PubMed

    Lin, Jesun; Pai, Jar-Yuan; Chen, Chih-Cheng

    2012-12-01

RFID technology, an automatic identification and data capture technology providing identification, tracing, security and so on, has been widely applied to the healthcare industry in recent years. Employing a HEPA ventilation system in a hospital is a way to ensure healthful indoor air quality and to protect patients and healthcare workers against hospital-acquired infections. However, such systems consume a great deal of electricity at considerable cost. This study aims to apply RFID technology to offer unique identification of medical staff and patients, and a reacting HEPA air ventilation system, in order to reduce cost, save energy and prevent hospital-acquired infections. The system contains RFID tags (for medical staff and patients), sensors, and a reacting subsystem that receives information on the number of medical staff present and the status of the surgery, and controls the air volume of the HEPA ventilation system accordingly. A pilot program was carried out in a unit of operation rooms of a 1,500-bed medical center located in central Taiwan from January to August 2010. The results show that the ventilation system functioned much more efficiently with less energy consumed, while indoor air quality remained qualified and hospital-acquired infections and other occupational diseases could be prevented.
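The reacting control logic can be reduced to a setpoint rule mapping the RFID head count to an airflow command. All constants below (idle baseline, per-person increment, and maximum, in m³/h) are illustrative assumptions, not values from the pilot program.

```python
def target_airflow(occupants, base_cmh=300, per_person_cmh=120, max_cmh=1500):
    """Hypothetical setpoint rule: keep an idle baseline airflow, add a
    per-occupant increment from the RFID head count, and clamp to the
    air handler's maximum capacity."""
    return min(base_cmh + per_person_cmh * max(occupants, 0), max_cmh)
```

Driving the fan from this setpoint rather than running it at maximum around the clock is where the reported energy savings come from, while the occupied-room airflow still meets the air-quality target.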

  20. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    EPA Science Inventory

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  1. Multi-agent cooperative systems applied to precision applications

    SciTech Connect

    McKay, M.D.; Anderson, M.O.; Gunderson, R.W.; Flann, N.; Abbott, B.

    1998-03-01

Regulatory agencies are imposing limits and constraints to protect the operator and/or the environment. While generally necessary, these controls also tend to increase cost and decrease efficiency and productivity. Intelligent computer systems can be made to perform these hazardous tasks with greater efficiency and precision, without danger to the operators. The Idaho National Engineering and Environmental Laboratory and the Center for Self-Organizing and Intelligent Systems at Utah State University have developed a series of autonomous all-terrain multi-agent systems capable of performing automated tasks within hazardous environments. This paper discusses the development and application of cooperative small-scale and large-scale robots for use in various activities associated with radiologically contaminated areas, prescription farming, and unexploded ordnance.

  2. Integrated hydrogen/oxygen technology applied to auxiliary propulsion systems

    NASA Technical Reports Server (NTRS)

    Gerhardt, David L.

    1990-01-01

The purpose of the Integrated Hydrogen/Oxygen Technology (IHOT) study was to determine whether the vehicle/mission needs and technology of the 1990's support development of an all-cryogenic H2/O2 system. In order to accomplish this, IHOT adopted the approach of designing Integrated Auxiliary Propulsion Systems (IAPS) for a representative manned vehicle: the advanced manned launch system. The primary objective was to develop IAPS concepts that appeared to offer viable alternatives to state-of-the-art (i.e., hypergolic, or earth-storable) APS approaches. The IHOT study resulted in the definition of three APS concepts: two cryogenic IAPS, and a third concept utilizing hypergolic propellants.

  3. Case study: applying management policies to manage distributed queuing systems

    NASA Astrophysics Data System (ADS)

    Neumair, Bernhard; Wies, René

    1996-06-01

The increasing deployment of workstations and high-performance end systems alongside the operation of mainframe computers leads to a situation where many companies can no longer afford to let their expensive workstations run idle for long hours during the night or with little load during the daytime. Distributed queuing systems and batch systems (DQSs) provide an efficient basis for making use of these unexploited resources and allow corporations to replace expensive supercomputers with clustered workstations running DQSs. To employ these innovative DQSs on a large scale, the management policies for scheduling jobs, configuring queues, etc. must be integrated into the overall management process for the IT infrastructure. For this purpose, the concepts of application management and management policies are introduced and discussed. The definition, automatic transformation, and implementation of policies on management platforms to effectively manage DQSs will show that policy-based application management is already possible using the existing management functionality found in today's systems.

  4. Robust sliding mode control applied to double inverted pendulum system

    SciTech Connect

    Mahjoub, Sonia; Derbel, Nabil; Mnif, Faical

    2009-03-05

    A three-level hierarchical sliding mode control is presented for a class of underactuated systems that can overcome mismatched perturbations. The considered underactuated system, a double inverted pendulum (DIP), can be modeled as three subsystems. Such a structure allows the construction of several hierarchy designs for the controller. For all hierarchical designs, the asymptotic stability of every layer's sliding surface and of the subsystems' sliding surfaces is proved theoretically by Barbalat's lemma. Simulation results show the validity of these methods.
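
    As a sketch of the construction (a standard hierarchical sliding mode design, not necessarily the authors' exact formulation), the layered surfaces for three second-order subsystems with states (x_{2i-1}, x_{2i}) can be written as:

    ```latex
    % First-layer sliding surfaces, one per subsystem (c_i > 0):
    s_i = c_i\, x_{2i-1} + x_{2i}, \qquad i = 1, 2, 3
    % Upper-layer surfaces are built incrementally from the lower layers:
    S_1 = s_1, \qquad S_j = a_{j-1} S_{j-1} + s_j, \qquad j = 2, 3
    % Stability of the top-layer surface follows from the Lyapunov function
    V = \tfrac{1}{2} S_3^2, \qquad \dot{V} = S_3 \dot{S}_3 \le -\eta \lvert S_3 \rvert, \quad \eta > 0,
    % while convergence of the lower-layer surfaces is shown via Barbalat's lemma.
    ```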

  5. Robust sliding mode control applied to double inverted pendulum system

    NASA Astrophysics Data System (ADS)

    Mahjoub, Sonia; Mnif, Faiçal; Derbel, Nabil

    2009-03-01

    A three-level hierarchical sliding mode control is presented for a class of underactuated systems that can overcome mismatched perturbations. The considered underactuated system, a double inverted pendulum (DIP), can be modeled as three subsystems. Such a structure allows the construction of several hierarchy designs for the controller. For all hierarchical designs, the asymptotic stability of every layer's sliding surface and of the subsystems' sliding surfaces is proved theoretically by Barbalat's lemma. Simulation results show the validity of these methods.

  6. Arctic Climate Systems Analysis

    SciTech Connect

    Ivey, Mark D.; Robinson, David G.; Boslough, Mark B.; Backus, George A.; Peterson, Kara J.; van Bloemen Waanders, Bart G.; Swiler, Laura Painton; Desilets, Darin Maurice; Reinert, Rhonda Karen

    2015-03-01

    This study began with a challenge from program area managers at Sandia National Laboratories to technical staff in the energy, climate, and infrastructure security areas: apply a systems-level perspective to existing science and technology program areas in order to determine technology gaps, identify new technical capabilities at Sandia that could be applied to these areas, and identify opportunities for innovation. The Arctic was selected as one of these areas for systems-level analysis, and this report documents the results. In this study, emphasis was placed on the Arctic atmosphere, since Sandia has been active in atmospheric research in the Arctic since 1997. This study begins with a discussion of the challenges and benefits of analyzing the Arctic as a system. It goes on to discuss current and future needs of the defense, scientific, energy, and intelligence communities for more comprehensive data products related to the Arctic; assess the current state of atmospheric measurement resources available for the Arctic; and explain how the capabilities at Sandia National Laboratories can be used to address the identified technological, data, and modeling needs of these communities for Arctic support.

  7. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.

  8. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often go hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  9. Fundamental and Applied Investigations in Atomic Spectrometric Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Min

    Simultaneous laser-excited fluorescence and absorption measurements were performed, and the results have revealed that any interference caused by easily ionized elements does not originate from variations in analyte emission (quantum) efficiency. In a closely related area, the roles of wet and dry aerosols in the matrix interference are clarified through spatially resolved imaging of the plasma by a charge-coupled device camera. To eliminate matrix interference effects in practice, various methods have been developed based on the above studies. The use of column pre-concentration with flow injection analysis has been found to provide a simple solution for reducing interference effects and increasing the sensitivity of elemental analysis. A novel mini-spray chamber was invented. The new vertical rotary spray chamber combines gravitational, centrifugal, turbulent, and impact droplet segregation mechanisms to achieve a higher efficiency of small-droplet formation in a nebulized sample spray. As a result, it also offers higher sample-transport efficiency, lower memory effects, and improved analytical figures of merit over existing devices. This new device was employed with flow injection analysis to simulate an interface for coupling high-performance liquid chromatography (HPLC) to a microwave plasma for chromatographic detection. The detection limits for common metallic elements are in the range of 5-50 µg/mL, and are degraded only twofold when the elements are present in an organic solvent such as ethanol or methanol. Other sample-introduction schemes have also been investigated to improve sample-introduction technology. The direct coupling of hydride-generation techniques to the helium microwave plasma torch was evaluated for the determination of arsenic, antimony and tin by atomic emission spectrometry. A manually controlled peristaltic pump was modified for computer control, and continuous flow injection was evaluated for standard calibration and trace elemental

  10. Automated synchrogram analysis applied to heartbeat and reconstructed respiration

    NASA Astrophysics Data System (ADS)

    Hamann, Claudia; Bartsch, Ronny P.; Schumann, Aicko Y.; Penzel, Thomas; Havlin, Shlomo; Kantelhardt, Jan W.

    2009-03-01

    Phase synchronization between two weakly coupled oscillators has been studied in chaotic systems for a long time. However, it is difficult to unambiguously detect such synchronization in experimental data from complex physiological systems. In this paper we review our study of phase synchronization between heartbeat and respiration in 150 healthy subjects during sleep using an automated procedure for screening the synchrograms. We found that this synchronization is significantly enhanced during non-rapid-eye-movement (non-REM) sleep (deep sleep and light sleep) and is reduced during REM sleep. In addition, we show that the respiration signal can be reconstructed from the heartbeat recordings in many subjects. Our reconstruction procedure, which works particularly well during non-REM sleep, allows the detection of cardiorespiratory synchronization even if only heartbeat intervals were recorded.
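
    The synchrogram construction described above can be sketched as follows, assuming an idealized (noise-free, constant-period) respiratory phase; real analyses extract the phase from measured or reconstructed respiration, and the function name is illustrative.

```python
import numpy as np

# Synchrogram sketch: record the respiratory phase (mod 2*pi) at the time
# of each heartbeat. Assumes an idealized constant respiratory period.
def synchrogram(beat_times, resp_period):
    """Respiratory phase at each heartbeat time, wrapped to [0, 2*pi)."""
    phase = 2.0 * np.pi * np.asarray(beat_times) / resp_period
    return phase % (2.0 * np.pi)

resp_period = 4.0                          # seconds per breath
beat_times = np.arange(0.0, 40.0, 1.0)     # one heartbeat per second
phases = synchrogram(beat_times, resp_period)

# Perfect 4:1 phase locking: the plotted points fall on 4 horizontal lines.
n_lines = len(np.unique(np.round(phases, 6)))
print(n_lines)  # 4
```

    During genuine cardiorespiratory synchronization the plotted phases form such horizontal bands; during unsynchronized epochs they fill the phase strip uniformly, which is what an automated screening procedure tests for.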

  11. Productivity of Management Information Systems Researchers: Does Lotka's Law Apply?

    ERIC Educational Resources Information Center

    Nath, Ravinder; Jackson, Wade M.

    1991-01-01

    Considers the problem of bibliometric prediction and the applicability of Lotka's law regarding the number of papers written by each author. Results of a study of 899 Management Information Systems (MIS) research articles published in 10 journals between 1975 and 1987 are described. (24 references) (LRW)
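
    Lotka's law states that the number of authors producing n papers is proportional to 1/n^a, classically with a = 2. A brief sketch of the predicted author-productivity distribution (the study itself tests whether MIS authorship fits this pattern):

```python
import math

# Lotka's inverse-square law of scientific productivity: the fraction of
# authors contributing exactly n papers is C / n**a, classically a = 2.
def lotka_fraction(n, a=2.0):
    # Normalize numerically so fractions over n = 1..100000 sum to 1
    # (for a = 2 the constant approaches 6 / pi**2, about 0.608).
    C = 1.0 / sum(1.0 / k**a for k in range(1, 100001))
    return C / n**a

f1 = lotka_fraction(1)
f2 = lotka_fraction(2)
print(round(f1, 3))       # 0.608: about 61% of authors contribute one paper
print(round(f2 / f1, 2))  # 0.25: two-paper authors are a quarter as numerous
```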

  12. System Identification and POD Method Applied to Unsteady Aerodynamics

    NASA Technical Reports Server (NTRS)

    Tang, Deman; Kholodar, Denis; Juang, Jer-Nan; Dowell, Earl H.

    2001-01-01

    The representation of unsteady aerodynamic flow fields in terms of global aerodynamic modes has proven to be a useful method for reducing the size of the aerodynamic model over those representations that use local variables at discrete grid points in the flow field. Eigenmodes and Proper Orthogonal Decomposition (POD) modes have been used for this purpose with good effect. This suggests that system identification models may also be used to represent the aerodynamic flow field. Implicit in the use of a systems identification technique is the notion that a relative small state space model can be useful in describing a dynamical system. The POD model is first used to show that indeed a reduced order model can be obtained from a much larger numerical aerodynamical model (the vortex lattice method is used for illustrative purposes) and the results from the POD and the system identification methods are then compared. For the example considered, the two methods are shown to give comparable results in terms of accuracy and reduced model size. The advantages and limitations of each approach are briefly discussed. Both appear promising and complementary in their characteristics.
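
    The POD reduction step can be sketched with a snapshot SVD; the synthetic "flow field" below is illustrative and stands in for the vortex lattice model used in the paper.

```python
import numpy as np

# Proper Orthogonal Decomposition sketch: snapshots of a field are stacked
# as columns, the SVD yields an energy-ranked modal basis, and a few
# leading modes give a reduced-order representation.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 50)
# Two coherent structures plus small measurement noise:
snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * t))
             + 0.3 * np.outer(np.sin(4 * np.pi * x), np.sin(6 * np.pi * t))
             + 1e-3 * rng.standard_normal((200, 50)))

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 2                                       # retain two POD modes
reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

energy = (s[:r]**2).sum() / (s**2).sum()    # captured "energy" fraction
print(energy > 0.99)    # True: two modes capture nearly all the variance
```

    The same basis can then feed a small state-space model, which is where system identification becomes a complementary alternative.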

  13. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 13 2010-07-01 2010-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  14. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 14 2012-07-01 2011-07-01 true What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  15. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 14 2014-07-01 2014-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  16. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 13 2011-07-01 2011-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  17. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes. PMID:22672933

  18. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
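
    A minimal sketch of principal component regression, one of the factor-analysis calibration methods discussed (PLS differs in how the factors are chosen); the synthetic spectra below are illustrative, not the phosphosilicate glass data.

```python
import numpy as np

# PCR sketch for spectral calibration: project centered spectra onto the
# leading principal components, then regress concentration on the scores.
rng = np.random.default_rng(1)
n_samples, n_wavelengths = 40, 300
conc = rng.uniform(0.0, 1.0, n_samples)    # "true" concentrations
# Single Gaussian absorption band as a stand-in pure-component spectrum:
pure = np.exp(-0.5 * ((np.arange(n_wavelengths) - 150) / 20.0)**2)
spectra = np.outer(conc, pure) + 1e-3 * rng.standard_normal((n_samples, n_wavelengths))

Xc = spectra - spectra.mean(axis=0)        # mean-center calibration data
yc = conc - conc.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                      # retained principal components
scores = Xc @ Vt[:k].T
beta, *_ = np.linalg.lstsq(scores, yc, rcond=None)
pred = scores @ beta + conc.mean()

rmse = np.sqrt(np.mean((pred - conc)**2))
print(rmse < 0.01)   # True: the calibration recovers the concentrations
```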

  19. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes.

  20. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The on-going AMMON program in EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC which leads to a value of 289 pcm (1{sigma}). The Nuclear Data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 {sup 27}Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a K{sub eff} uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the Representativity method which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Begin Of Life (BOL) JHR reactivity calculated by the HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1{sigma}). (authors)
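
    The nuclear data propagation step rests on the standard "sandwich rule", variance = S^T M S, with S the sensitivity vector of the reactivity to nuclear data and M their covariance matrix. A sketch with illustrative numbers (not the JEFF 3.1.1 values used in the study):

```python
import numpy as np

# "Sandwich rule" sketch for nuclear-data uncertainty propagation:
# reactivity variance = S^T M S, where S holds relative sensitivities
# of k_eff and M is the relative covariance matrix of the nuclear data.
S = np.array([0.10, -0.05, 0.02])        # illustrative sensitivities
M = np.array([[4.0e-4, 1.0e-4, 0.0],     # illustrative covariance matrix
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

var = S @ M @ S
uncertainty_pcm = 1e5 * np.sqrt(var)     # 1e5 converts dk/k to pcm
print(round(uncertainty_pcm, 1))         # 230.0
```

    Off-diagonal covariance terms (here the 1.0e-4 entries) are exactly what a marginalization method such as the one applied to the 27Al evaluation supplies.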

  1. Parabolic dish systems at work - Applying the concepts

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1981-01-01

    An overview is given of parabolic dish solar concentrator application experiments being conducted by the U.S. Department of Energy. The 'engineering experiments' comprise the testing of (1) a small-community powerplant system, in conjunction with a grid-connected utility; (2) stand-alone applications at remote sites such as military installations, radar stations and villages; and (3) dish modules that can deliver heat for direct use in industrial processes. Applicability projections are based on a dish and receiver that use a Brayton engine with an engine/generator efficiency of 25% and a production level of up to 25,000 units per year. Analyses indicate that parabolic-dish power systems can potentially replace small, oil-fired power plants in all regions of the U.S. between 1985 and 1991.

  2. Applying Contamination Modelling to Spacecraft Propulsion Systems Designs and Operations

    NASA Technical Reports Server (NTRS)

    Chen, Philip T.; Thomson, Shaun; Woronowicz, Michael S.

    2000-01-01

    Molecular and particulate contaminants generated by the operation of a propulsion system may impinge on spacecraft critical surfaces. Plume depositions or clouds may hinder the spacecraft and instruments from performing normal operations. Firing thrusters will generate both molecular and particulate contaminants, so minimizing the contamination impact from the plume becomes critical for a successful mission. The resulting effects of molecular and particulate contamination from thruster firings are quite distinct. This paper will discuss the interconnection between spacecraft contamination modeling and propulsion system implementation. The paper will address an innovative contamination engineering approach implemented from spacecraft concept design, manufacturing, integration and test (I&T), and launch, to on-orbit operations. This paper will also summarize the implementation on several successful missions. Although there are other contamination sources, only molecular contamination will be considered here.

  3. Applying twisted boundary conditions for few-body nuclear systems

    NASA Astrophysics Data System (ADS)

    Körber, Christopher; Luu, Thomas

    2016-05-01

    We describe and implement twisted boundary conditions for the deuteron and triton systems within finite volumes using the nuclear lattice EFT formalism. We investigate the finite-volume dependence of these systems with different twist angles. We demonstrate how various finite-volume information can be used to improve calculations of binding energies in such a framework. Our results suggest that with appropriate twisting of boundaries, infinite-volume binding energies can be reliably extracted from calculations using modest volume sizes with cubic length L ≈ 8-14 fm. Of particular importance is our derivation and numerical verification of three-body analogs of "i-periodic" twist angles that eliminate the leading-order finite-volume effects to the three-body binding energy.
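
    For reference, the standard twisted-boundary-condition relations assumed in such calculations (our summary of textbook material, not a result of the paper):

    ```latex
    % A twist angle \theta_i in direction i modifies the boundary condition
    \psi(\vec{x} + L \hat{e}_i) = e^{i \theta_i} \psi(\vec{x}),
    % so the allowed single-particle momenta become
    p_i = \frac{2 \pi n_i + \theta_i}{L}, \qquad n_i \in \mathbb{Z}.
    % "i-periodic" boundary conditions correspond to the twist
    \theta_i = \frac{\pi}{2}, \qquad e^{i \theta_i} = i.
    ```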

  4. Cellular systems biology profiling applied to cellular models of disease.

    PubMed

    Giuliano, Kenneth A; Premkumar, Daniel R; Strock, Christopher J; Johnston, Patricia; Taylor, Lansing

    2009-11-01

    Building cellular models of disease based on the approach of Cellular Systems Biology (CSB) has the potential to improve the process of creating drugs as part of the continuum from early drug discovery through drug development and clinical trials and diagnostics. This paper focuses on the application of CSB to early drug discovery. We discuss the integration of protein-protein interaction biosensors with other multiplexed, functional biomarkers as an example in using CSB to optimize the identification of quality lead series compounds.

  5. Applying Skinner's analysis of verbal behavior to persons with dementia.

    PubMed

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility for teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only the acquisition of language but also the ability to recall items or objects that appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal-response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented. PMID:21292058

  6. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
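
    The distinction between good and bad returns can be made concrete with the downside deviation, the most common Downside Risk indicator; the numbers below are illustrative, whereas the paper works with the Hedge Fund index data.

```python
import numpy as np

# Downside Risk sketch: only returns below the investor's target ("bad"
# returns) contribute to risk. The downside deviation replaces the
# standard deviation in a Sortino-style performance ratio.
def downside_deviation(returns, target=0.0):
    r = np.asarray(returns, dtype=float)
    shortfall = np.minimum(r - target, 0.0)   # zero for "good" returns
    return np.sqrt(np.mean(shortfall**2))

def sortino_ratio(returns, target=0.0):
    r = np.asarray(returns, dtype=float)
    return (r.mean() - target) / downside_deviation(r, target)

returns = np.array([0.05, 0.02, -0.03, 0.04, -0.01, 0.03])
dd = downside_deviation(returns)
sr = sortino_ratio(returns)
print(round(dd, 4))  # 0.0129
print(round(sr, 2))  # 1.29

# Unlike the standard deviation, dd ignores purely upside variability:
assert downside_deviation([0.1, 0.2, 0.3]) == 0.0
```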

  7. Applying Machine Learning to GlueX Data Analysis

    NASA Astrophysics Data System (ADS)

    Boettcher, Thomas

    2014-03-01

    GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
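
    The selection problem can be illustrated with a minimal boosted-decision-tree classifier; the sketch below implements AdaBoost with depth-1 trees (decision stumps) on a synthetic one-dimensional feature, whereas the GlueX analysis uses multivariate kinematic features and production-grade libraries.

```python
import numpy as np

# AdaBoost with decision stumps: each round fits the best threshold
# classifier on reweighted data, then upweights misclassified events.
rng = np.random.default_rng(2)
n = 400
signal = rng.normal(1.0, 1.0, n)        # "signal" feature values
background = rng.normal(-1.0, 1.0, n)   # "background" feature values
X = np.concatenate([signal, background])
y = np.concatenate([np.ones(n), -np.ones(n)])

def fit_stump(X, y, w):
    """Best weighted threshold rule h(x) = sign * (+1 if x > t else -1)."""
    best = (0.0, 1, np.inf)
    for t in np.quantile(X, np.linspace(0.05, 0.95, 19)):
        for sign in (1, -1):
            pred = sign * np.where(X > t, 1.0, -1.0)
            err = w[pred != y].sum()
            if err < best[2]:
                best = (t, sign, err)
    return best

w = np.full(len(X), 1.0 / len(X))       # uniform initial event weights
stumps, alphas = [], []
for _ in range(10):
    t, sign, err = fit_stump(X, y, w)
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
    pred = sign * np.where(X > t, 1.0, -1.0)
    w = w * np.exp(-alpha * y * pred)   # upweight misclassified events
    w = w / w.sum()
    stumps.append((t, sign))
    alphas.append(alpha)

scores = sum(a * s * np.where(X > t, 1.0, -1.0)
             for (t, s), a in zip(stumps, alphas))
accuracy = float(np.mean(np.sign(scores) == y))
print(accuracy > 0.8)   # True: the ensemble separates signal from background
```

    In practice the continuous score, not its sign, is used: cutting harder on it trades selection efficiency against purity.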

  8. Naming, the formation of stimulus classes, and applied behavior analysis.

    PubMed Central

    Stromer, R; Mackay, H A; Remington, B

    1996-01-01

    The methods used in Sidman's original studies on equivalence classes provide a framework for analyzing functional verbal behavior. Sidman and others have shown how teaching receptive, name-referent matching may produce rudimentary oral reading and word comprehension skills. Eikeseth and Smith (1992) have extended these findings by showing that children with autism may acquire equivalence classes after learning to supply a common oral name to each stimulus in a potential class. A stimulus class analysis suggests ways to examine (a) the problem of programming generalization from teaching situations to other environments, (b) the expansion of the repertoires that occur in those settings, and (c) the use of naming to facilitate these forms of generalization. Such research will help to clarify and extend Horne and Lowe's recent (1996) account of the role of verbal behavior in the formation of stimulus classes. PMID:8810064

  9. NMR quantum computing: applying theoretical methods to designing enhanced systems.

    PubMed

    Mawhinney, Robert C; Schreckenbach, Georg

    2004-10-01

    Density functional theory results for chemical shifts and spin-spin coupling constants are presented for compounds currently used in NMR quantum computing experiments. Specific design criteria were examined and numerical guidelines were assessed. Using a field strength of 7.0 T, protons require a coupling constant of 4 Hz with a chemical shift separation of 0.3 ppm, whereas carbon needs a coupling constant of 25 Hz for a chemical shift difference of 10 ppm, based on the minimal coupling approximation. Using these guidelines, it was determined that 2,3-dibromothiophene is limited to only two qubits; the three qubit system bromotrifluoroethene could be expanded to five qubits and the three qubit system 2,3-dibromopropanoic acid could also be used as a six qubit system. An examination of substituent effects showed that judiciously choosing specific groups could increase the number of available qubits by removing rotational degeneracies in addition to introducing specific conformational preferences that could increase (or decrease) the magnitude of the couplings. The introduction of one site of unsaturation can lead to a marked improvement in spectroscopic properties, even increasing the number of active nuclei.
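
    The numerical guidelines quoted above can be encoded as a simple feasibility check; the thresholds are those stated for a 7.0 T field, while the table and helper function are illustrative constructions, not from the paper.

```python
# Design-guideline sketch: a qubit pair is usable when its J-coupling and
# chemical-shift separation both exceed nucleus-specific thresholds
# (values quoted for 7.0 T under the minimal coupling approximation).
THRESHOLDS = {            # nucleus: (min J in Hz, min shift separation in ppm)
    "1H": (4.0, 0.3),
    "13C": (25.0, 10.0),
}

def usable_qubit_pair(nucleus, j_hz, delta_ppm):
    min_j, min_delta = THRESHOLDS[nucleus]
    return j_hz >= min_j and delta_ppm >= min_delta

print(usable_qubit_pair("1H", 7.2, 1.1))    # True
print(usable_qubit_pair("13C", 25.0, 4.0))  # False: shift separation too small
```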

  10. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    PubMed

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic, in which KIc was measured fractographically using controlled-flaw beam-bending techniques, and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16 mm × 4 mm × 2 mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5 mm/min. Weibull probability curves (95% confidence bounds) were calculated, and a contour plot of the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except that ZCA showed pores and defects. Fracture toughness and flexural strength values did not differ among the groups, except for ZCA. The characteristic strength (p<0.05) of ZZ (η=920.4) was higher than that of ZCA (η=651.1) and similar to ZMAX (η=983.6) and ZYZ (η=1054.8). By means of quantitative and qualitative fractographic analysis, this study showed that fracture toughness and strength could be correlated with the observable microstructural features of the evaluated zirconia polycrystalline ceramics. PMID:26722988
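
    The fractographic estimate of fracture toughness follows the standard relation K_Ic = Y·σ_f·√(π·c), with σ_f the fracture stress, c the critical flaw size, and Y a geometry factor. A sketch with illustrative values; Y = 1.0 is an assumed placeholder, since real analyses use flaw-shape-specific factors.

```python
import math

# Fractographic fracture-toughness estimate: K_Ic = Y * sigma_f * sqrt(pi * c).
# Y depends on flaw shape and location; Y = 1.0 here is a placeholder.
def fracture_toughness(sigma_f_mpa, c_m, Y=1.0):
    """K_Ic in MPa*sqrt(m), from strength in MPa and flaw size in m."""
    return Y * sigma_f_mpa * math.sqrt(math.pi * c_m)

# Example: 900 MPa flexural strength with a 10-micron critical flaw:
k_ic = fracture_toughness(900.0, 10e-6)
print(round(k_ic, 2))  # 5.04, in the range typical of Y-TZP ceramics
```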

  11. Neptune Aerocapture Systems Analysis

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae

    2004-01-01

    A Neptune Aerocapture Systems Analysis is completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The high fidelity systems analysis is completed by a five center NASA team and includes the following disciplines and analyses: science; mission design; aeroshell configuration screening and definition; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and database definition; initial stability analyses; guidance development; atmospheric flight simulation; computational fluid dynamics and radiation analyses for aeroheating environment definition; thermal protection system design, concepts and sizing; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle. In addition aerocapture results in a 3-4 year reduction in trip time compared to all-propulsive systems. Aerocapture is feasible and performance is adequate for the Neptune aerocapture mission. Monte Carlo simulation results show 100% successful capture for all cases including conservative assumptions on atmosphere and navigation. Enabling technologies for this mission include TPS manufacturing; and aerothermodynamic methods and validation for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads, and the effects on surface recession.

  12. The impact of applied behavior analysis on diverse areas of research.

    PubMed

    Kazdin, A E

    1975-01-01

    The impact of applied behavior analysis on various disciplines and areas of research was assessed through two major analyses. First, the relationship of applied behavior analysis to the general area of "behavior modification" was evaluated by examining the citation characteristics of journal articles in JABA and three other behavior-modification journals. Second, the penetration of applied behavior analysis into diverse areas and disciplines, including behavior modification, psychiatry, clinical psychology, education, special education, retardation, speech and hearing, counselling, and law enforcement and correction was assessed. Twenty-five journals representing diverse research areas were evaluated from 1968 to 1974 to assess the extent to which operant techniques were applied for therapeutic, rehabilitative, and educative purposes and the degree to which methodological desiderata of applied behavior analysis were met. The analyses revealed diverse publication outlets for applied behavior analysis in various disciplines.

  13. A review of studies applying environmental impact assessment methods on fruit production systems.

    PubMed

    Cerutti, Alessandro K; Bruun, Sander; Beccaro, Gabriele L; Bounous, Giancarlo

    2011-10-01

    Although many aspects of environmental accounting methodologies in food production have already been investigated, the application of environmental indicators in the fruit sector is still rare and no consensus can be found on the preferred method. On the contrary, widely diverging approaches have been taken to several aspects of the analyses, such as data collection, handling of scaling issues, and goal and scope definition. This paper reviews studies assessing the sustainability or environmental impacts of fruit production under different conditions and identifies aspects of fruit production that are of environmental importance. Four environmental assessment methods which may be applied to assess fruit production systems are evaluated, namely Life Cycle Assessment, Ecological Footprint Analysis, Emergy Analysis and Energy Balance. In the 22 peer-reviewed journal articles and two conference articles applying one of these methods in the fruit sector that were included in this review, a total of 26 applications of environmental impact assessment methods are described. These applications differ concerning e.g. overall objective, set of environmental issues considered, definition of system boundaries and calculation algorithms. Due to the relatively high variability in study cases and approaches, it was not possible to identify any one method as being better than the others. However, remarks on methodologies and suggestions for standardisation are given and the environmental burdens of fruit systems are highlighted.

  14. Applying data mining for the analysis of breast cancer data.

    PubMed

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is at higher risk of suffering a stroke within five years; a physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, a decision tree, logistic regression, and a genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. A total of 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated into the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716, specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model, 0.9435, was the lowest of the four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of breast cancer patient data in terms of overall accuracy.
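
    The tenfold cross-validation procedure the study relies on is easy to sketch. The following is an illustrative toy, not a reproduction of the study: the Wisconsin records and the four models are replaced by synthetic one-feature data and a simple mean-threshold classifier, so only the accuracy-estimation mechanics carry over.

```python
import random

def ten_fold_accuracy(data, train_fn, predict_fn, k=10):
    """Estimate accuracy by k-fold cross-validation (k=10 as in the study)."""
    random.Random(0).shuffle(data)
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [row for j, f in enumerate(folds) if j != i for row in f]
        model = train_fn(train)
        correct = sum(predict_fn(model, x) == y for x, y in test)
        accs.append(correct / len(test))
    return sum(accs) / k

# Synthetic one-feature data: label 1 when the feature exceeds 0.5,
# with 5% label noise standing in for real clinical variability.
rng = random.Random(1)
data = []
for _ in range(200):
    x = rng.random()
    y = 1 if x > 0.5 else 0
    if rng.random() < 0.05:
        y = 1 - y
    data.append((x, y))

# A minimal "model": the training-set mean feature value used as a threshold.
train_fn = lambda train: sum(x for x, _ in train) / len(train)
predict_fn = lambda thresh, x: 1 if x > thresh else 0

acc = ten_fold_accuracy(data, train_fn, predict_fn)
```

Each record is held out exactly once, so `acc` averages ten out-of-sample accuracies, which is the estimate the study reports for each of its four models.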

  15. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change.

  16. System and method of applying energetic ions for sterilization

    DOEpatents

    Schmidt, John A.

    2003-12-23

    A method of sterilization of a container is provided whereby a cold plasma is caused to be disposed near a surface to be sterilized, and the cold plasma is then subjected to a pulsed voltage differential for producing energized ions in the plasma. Those energized ions then operate to achieve spore destruction on the surface to be sterilized. Further, a system for sterilization of a container which includes a conductive or non-conductive container, a cold plasma in proximity to the container, and a high voltage source for delivering a pulsed voltage differential between an electrode and the container and across the cold plasma, is provided.

  17. System And Method Of Applying Energetic Ions For Sterilization

    DOEpatents

    Schmidt, John A.

    2002-06-11

    A method of sterilization of a container is provided whereby a cold plasma is caused to be disposed near a surface to be sterilized, and the cold plasma is then subjected to a pulsed voltage differential for producing energized ions in the plasma. Those energized ions then operate to achieve spore destruction on the surface to be sterilized. Further, a system for sterilization of a container which includes a conductive or non-conductive container, a cold plasma in proximity to the container, and a high voltage source for delivering a pulsed voltage differential between an electrode and the container and across the cold plasma, is provided.

  18. Acton mass flow system applied to PFBC feed

    NASA Technical Reports Server (NTRS)

    Homburg, E.

    1977-01-01

    Dense phase pneumatic conveying and the Acton Mass Flow concept are defined with emphasis on the specific advantages to the coal and dolomite feed to the Pressurized Fluidized Bed Combustor. The transport and feed functions are explored with a comparison of designing the process for a combined function or for individual functions. The equipment required to accomplish these functions is described together with a typical example of sizing and air or gas requirements. A general outline of the control system required to obtain a uniform feed rate is provided. The conditions of the coal, dolomite, and conveying gas required to obtain reliable transport and feed are also discussed.

  19. Aerodynamic Reconstruction Applied to Parachute Test Vehicle Flight Data Analysis

    NASA Technical Reports Server (NTRS)

    Cassady, Leonard D.; Ray, Eric S.; Truong, Tuan H.

    2013-01-01

    The aerodynamics, both static and dynamic, of a test vehicle are critical to determining the performance of the parachute cluster in a drop test and for conducting a successful test. The Capsule Parachute Assembly System (CPAS) project is conducting tests of NASA's Orion Multi-Purpose Crew Vehicle (MPCV) parachutes at the Army Yuma Proving Ground utilizing the Parachute Test Vehicle (PTV). The PTV shape is based on the MPCV, but the height has been reduced in order to fit within the C-17 aircraft for extraction. Therefore, the aerodynamics of the PTV are similar, but not the same as, the MPCV. A small series of wind tunnel tests and computational fluid dynamics cases were run to modify the MPCV aerodynamic database for the PTV, but aerodynamic reconstruction of the flights has proven an effective source for further improvements to the database. The acceleration and rotational rates measured during free flight, before parachute inflation but during deployment, were used to confirm vehicle static aerodynamics. A multibody simulation is utilized to reconstruct the parachute portions of the flight. Aerodynamic or parachute parameters are adjusted in the simulation until the prediction reasonably matches the flight trajectory. Knowledge of the static aerodynamics is critical in the CPAS project because the parachute riser load measurements are scaled based on forebody drag. PTV dynamic damping is critical because the vehicle has no reaction control system to maintain attitude - the vehicle dynamics must be understood and modeled correctly before flight. It will be shown here that aerodynamic reconstruction has successfully contributed to the CPAS project.
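
    The adjust-until-it-matches loop described above can be sketched in miniature. This is not the CPAS multibody simulation: it is a hypothetical point-mass free-fall with quadratic drag and made-up vehicle numbers, fitting a single drag coefficient by grid search over trajectory mismatch, purely to show the reconstruction idea.

```python
import math

def simulate_speeds(cd, m=500.0, area=2.0, rho=1.2, g=9.81, dt=0.1, steps=100):
    """Forward-Euler free-fall with quadratic drag; returns the speed history."""
    v, speeds = 0.0, []
    for _ in range(steps):
        drag = 0.5 * rho * cd * area * v * v
        v += (g - drag / m) * dt
        speeds.append(v)
    return speeds

# Stand-in "flight data": speeds generated with a known drag coefficient.
observed = simulate_speeds(cd=0.8)

def mismatch(cd):
    """Sum of squared errors between simulated and observed speed histories."""
    return sum((a - b) ** 2 for a, b in zip(simulate_speeds(cd), observed))

# Reconstruction: sweep candidate coefficients and keep the best match.
best_cd = min((round(c * 0.01, 2) for c in range(10, 201)), key=mismatch)
```

Because the "flight data" here were produced by the same model, the sweep recovers cd = 0.8 exactly; with real telemetry the residual mismatch is what drives further database updates.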

  20. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
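
    The graph Laplacian property the article builds on can be checked directly in a few lines. A minimal sketch with a hypothetical 4-node path graph (the case where the article notes the "graph Fourier" analogy is exact): the Laplacian quadratic form equals the sum of squared differences across edges, and the constant vector lies in the null space.

```python
def laplacian(n_nodes, edges):
    """Unweighted graph Laplacian L = D - A as a list-of-lists matrix."""
    L = [[0] * n_nodes for _ in range(n_nodes)]
    for i, j in edges:
        L[i][i] += 1
        L[j][j] += 1
        L[i][j] -= 1
        L[j][i] -= 1
    return L

def quad_form(L, x):
    """x^T L x for a matrix stored as nested lists."""
    n = len(x)
    return sum(L[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Path graph 0-1-2-3 and an arbitrary signal on its nodes.
edges = [(0, 1), (1, 2), (2, 3)]
L = laplacian(4, edges)
x = [3.0, 1.0, 4.0, 1.0]

energy = quad_form(L, x)                              # "smoothness" of x
edge_energy = sum((x[i] - x[j]) ** 2 for i, j in edges)
null = quad_form(L, [1.0, 1.0, 1.0, 1.0])             # constant vector
```

The identity x^T L x = sum over edges of (x_i - x_j)^2 is why Laplacian eigenvectors are read as low-to-high "frequencies": smooth signals have small quadratic form, and the constant signal has eigenvalue 0.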

  1. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process makes it possible to insert a controllable amount of energetic ions into the surface layers of different materials, modifying the physical and chemical properties of the surface. Different substrates are implanted with ions accelerated from plasma produced by a terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration, and a much higher current than those obtainable from conventional accelerators. Proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  2. Near-infrared radiation curable multilayer coating systems and methods for applying same

    DOEpatents

    Bowman, Mark P; Verdun, Shelley D; Post, Gordon L

    2015-04-28

    Multilayer coating systems, methods of applying, and related substrates are disclosed. The coating system may comprise a first coating comprising a near-IR absorber, and a second coating deposited on at least a portion of the first coating. Methods of applying a multilayer coating composition to a substrate may comprise applying a first coating comprising a near-IR absorber, applying a second coating over at least a portion of the first coating, and curing the coating with near-infrared radiation.

  3. Reachability analysis of rational eigenvalue linear systems

    NASA Astrophysics Data System (ADS)

    Xu, Ming; Chen, Liangyu; Zeng, Zhenbing; Li, Zhi-bin

    2010-12-01

    One of the key problems in the safety analysis of control systems is the exact computation of reachable state spaces for continuous-time systems. Issues related to the controllability and observability of these systems are well-studied in systems theory. However, there are not many results on reachability, even for general linear systems. In this study, we present a large class of linear systems with decidable reachable state spaces. This is approached by reducing the reachability analysis to real root isolation of exponential polynomials. Furthermore, we have implemented this method in a Maple package based on symbolic computation and applied it to several examples successfully.
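
    The reduction above ends in real root isolation of exponential polynomials. The paper does this symbolically in Maple; as a simplified numeric stand-in, the sketch below brackets sign changes on a grid and refines them by bisection, for a hypothetical one-term exponential polynomial f(t) = 2e^{-t} - 1 whose zero marks when the trajectory crosses a target level.

```python
import math

def isolate_roots(f, a, b, grid=1000):
    """Bracket sign changes of f on [a, b] on a uniform grid,
    then refine each bracket by bisection."""
    xs = [a + (b - a) * k / grid for k in range(grid + 1)]
    roots = []
    for lo, hi in zip(xs, xs[1:]):
        if f(lo) == 0.0:
            roots.append(lo)
        elif f(lo) * f(hi) < 0:
            for _ in range(60):          # shrink the bracket ~2^60-fold
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append(0.5 * (lo + hi))
    return roots

# Exponential polynomial whose zero crossing is at t = ln 2.
f = lambda t: 2.0 * math.exp(-t) - 1.0
roots = isolate_roots(f, 0.0, 5.0)
```

Symbolic isolation, unlike this numeric sketch, certifies that no roots hide between grid points; that certification is what makes the reachability question decidable in the paper's setting.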

  4. Applying Real Options for Evaluating Investments in ERP Systems

    NASA Astrophysics Data System (ADS)

    Nakagane, Jun; Sekozawa, Teruji

    This paper verifies the effectiveness of the real options approach for evaluating investments in Enterprise Resource Planning (ERP) systems and shows how important it is to disclose shadow options potentially embedded in ERP investments. The net present value (NPV) method is principally adopted to evaluate the value of ERP. However, the NPV method assumes that no uncertainties exist in the object, which does not reflect current business circumstances, which are highly dynamic. Since the 1990s, the effectiveness of option pricing models for Information System (IS) investment in resolving the shortcomings of the NPV method has been discussed in the IS literature. This paper presents three business cases to review the practical advantages of such techniques for IS investments, especially ERP investments. The first case is EDI development; we evaluate the project with a new approach that highlights one of its shadow options, EDI implementation. In the second case, we reveal that an ERP investment holds an “expansion option” in a case of eliminating redundancy. The third case describes an option to contract that is deliberately slotted into ERP development in preparation for the transfer of a manufacturing facility.
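
    The gap between static NPV and option-aware valuation can be shown with a tiny numeric sketch. All figures here are hypothetical, and the one-period expansion option is valued with a given up-probability rather than the risk-neutral probabilities a full real-options treatment would use; it only illustrates why ignoring flexibility understates value.

```python
def npv(cash_flows, rate):
    """Classic NPV: discount each year's cash flow back to t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical ERP project: 100 up front, then 30/year for 4 years at 10%.
static_npv = npv([-100, 30, 30, 30, 30], 0.10)

def expand_option_value(up_payoff, down_payoff, p_up, rate):
    """One-period option to expand: at t = 1 management takes the
    expansion payoff only when it is positive (max with 0)."""
    expected = p_up * max(up_payoff, 0) + (1 - p_up) * max(down_payoff, 0)
    return expected / (1 + rate)

# If demand is high (60% chance) expansion adds 50; if low it would lose 20,
# so management simply declines -- the flexibility itself carries value.
option = expand_option_value(50, -20, 0.6, 0.10)
total = static_npv + option
```

In this toy the project is rejected on static NPV alone but accepted once the embedded "shadow option" is priced in, which is exactly the argument the paper's EDI case makes.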

  5. Nanosecond step-scan FTIR spectroscopy applied to photobiological systems

    NASA Astrophysics Data System (ADS)

    Rödig, C.; Weidlich, O.; Hackmann, C.; Siebert, F.

    1998-06-01

    Our improved step-scan FTIR instrument, capable of measuring spectra within 15 ns after the flash, is employed to measure flash-induced infrared difference spectra of bacteriorhodopsin, halorhodopsin and CO-myoglobin. For all three systems it is necessary to cover a large time range extending into several milliseconds. Therefore, the linear time base provided by the transient recorder board is converted to a quasi-logarithmic scale. Each of the three systems is characterized by several time constants extending over the large time range. For bacteriorhodopsin, it is shown that two spectral changes occur, one in the 20 and the other in the 100 ns time range. Furthermore, spectral differences between the two M states could be detected in the μs time range. For halorhodopsin, a clear batho intermediate with red-shifted ethylenic mode could be identified in the nanosecond time range. In addition, a transition corresponding to the N intermediate in bacteriorhodopsin was deduced. Further, it is shown that the millisecond time constant depends on Cl- concentration, enabling the detection of the O intermediate. In the case of CO-myoglobin, spectral differences could be identified caused by mutations of the distal histidine of the heme binding pocket.
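
    The linear-to-quasi-logarithmic time-base conversion mentioned above can be sketched as simple log-spaced bin averaging. The instrument's actual conversion scheme is not specified in the abstract; this is a generic illustration with made-up sampling parameters, showing how late millisecond bins pool many samples while early nanosecond bins keep full resolution.

```python
import math

def quasi_log_rebin(times, values, points_per_decade=10):
    """Average a linearly sampled transient into quasi-logarithmic bins."""
    t0, t1 = times[0], times[-1]
    n_bins = int(math.log10(t1 / t0) * points_per_decade) + 1
    edges = [t0 * 10 ** (k / points_per_decade) for k in range(n_bins + 1)]
    binned = []
    for lo, hi in zip(edges, edges[1:]):
        sample = [v for t, v in zip(times, values) if lo <= t < hi]
        if sample:
            # geometric mean of the edges as the bin's time stamp
            binned.append((math.sqrt(lo * hi), sum(sample) / len(sample)))
    return binned

# Linear time base in 15 ns steps (10^4 samples here to keep the sketch quick),
# carrying a decaying transient that spans several decades in time.
times = [15e-9 * (k + 1) for k in range(10_000)]
values = [math.exp(-t / 1e-5) for t in times]
log_trace = quasi_log_rebin(times, values)
```

Averaging within each bin also improves the signal-to-noise ratio at long times, which is one practical reason for converting the transient-recorder output to a quasi-logarithmic scale.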

  6. An applied study using systems engineering methods to prioritize green systems options

    SciTech Connect

    Lee, Sonya M; Macdonald, John M

    2009-01-01

    For many years, there have been questions about the effectiveness of applying different green solutions. If you're building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform the best over time? All this has to be considered within the cost and schedule of the project. The amount of information available on the topic can be overwhelming. We examine whether Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods are used to gain perspective into how to select green technologies, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe analysis. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.

  7. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question about the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedural control and data flow. It has been extended commercially to inter-procedural flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
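
    The question the tool answers is a reachability query over a dependency graph. A minimal sketch, not the tool itself: a hypothetical multi-domain system where a config file feeds a program variable that feeds a report, with the forward slice computed as a transitive closure over reversed "depends on" edges.

```python
from collections import deque

def forward_slice(depends_on, start):
    """Entities whose values can change when `start` changes:
    breadth-first transitive closure over 'X depends on Y' edges,
    traversed in reverse (from Y to the Xs that depend on it)."""
    affected_by = {}
    for node, deps in depends_on.items():
        for d in deps:
            affected_by.setdefault(d, set()).add(node)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in affected_by.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy multi-domain system: program variables and non-program entities
# (a config file, a database table) live in the same graph.
deps = {
    "var_x":  {"config.threshold"},
    "var_y":  {"var_x"},
    "report": {"var_y", "db_table"},
}
slice_of_threshold = forward_slice(deps, "config.threshold")
```

The BFS parents recoverable from `seen` give "the path in between"; modeling non-program entities as ordinary graph nodes is what lets the slice cross domain boundaries.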

  8. Network systems security analysis

    NASA Astrophysics Data System (ADS)

    Yilmaz, İsmail

    2015-05-01

    Network systems security analysis is of utmost importance in today's world. Many companies, like banks, which give priority to data management, test their own data security systems with "penetration tests" from time to time. In this context, companies must also test their own network/server systems and take precautions, as data security draws attention. Based on this idea, this study researches cyber-attacks thoroughly and examines penetration testing techniques. With this information, the cyber-attacks are classified, and the security of network systems is then tested systematically. After the testing period, all data are reported and filed for future reference. It is found that human beings are the weakest link in the chain and that simple mistakes may unintentionally cause huge problems. Thus, it is clear that some precautions, such as keeping security software up to date, must be taken to avoid such threats.

  9. Quality analysis of the solution produced by dissection algorithms applied to the traveling salesman problem

    SciTech Connect

    Cesari, G.

    1994-12-31

    The aim of this paper is to analyze experimentally the quality of the solution obtained with dissection algorithms applied to the geometric Traveling Salesman Problem. Starting from Karp's results, we apply a divide-and-conquer strategy, first dividing the plane into subregions where we calculate optimal subtours and then merging these subtours to obtain the final tour. The analysis is restricted to problem instances where points are uniformly distributed in the unit square. For relatively small sets of cities, we analyze the quality of the solution by calculating the length of the optimal tour and comparing it with our approximate solution. When the problem instance is too large, we perform an asymptotic analysis, estimating the length of the optimal tour. We apply the same dissection strategy to classical heuristics by calculating approximate subtours and comparing the results with the average quality of the heuristic. Our main result is an estimate of the rate of convergence of the approximate solution to the optimal solution as a function of the number of dissection steps, of the criterion used for the plane division, and of the quality of the subtours. We have implemented our programs on MUSIC (MUlti Signal processor system with Intelligent Communication), a Single-Program-Multiple-Data parallel computer with distributed memory developed at ETH Zurich.
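
    The divide-and-conquer scheme can be sketched compactly. This is an illustrative serial toy, not the paper's implementation: it uses a greedy nearest-neighbour subtour per cell instead of optimal subtours, and merges cells in boustrophedon order over a uniform grid on the unit square.

```python
import math
import random

def dissection_tour(points, k):
    """Divide the unit square into a k x k grid, build a greedy
    nearest-neighbour subtour in each cell, then merge the cells
    in boustrophedon (snake) order."""
    cells = {}
    for p in points:
        key = (min(int(p[0] * k), k - 1), min(int(p[1] * k), k - 1))
        cells.setdefault(key, []).append(p)
    tour = []
    for row in range(k):
        cols = range(k) if row % 2 == 0 else range(k - 1, -1, -1)
        for col in cols:
            remaining = cells.get((col, row), [])[:]
            while remaining:                  # greedy subtour within the cell
                last = tour[-1] if tour else (0.0, 0.0)
                nxt = min(remaining, key=lambda q: math.dist(last, q))
                remaining.remove(nxt)
                tour.append(nxt)
    return tour

def tour_length(tour):
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))

rng = random.Random(0)
points = [(rng.random(), rng.random()) for _ in range(400)]
approx = dissection_tour(points, k=4)
```

Visiting the points in their raw (random) order gives a tour roughly proportional to the number of cities, while the dissection tour grows like the square root of it; comparing `tour_length(points)` with `tour_length(approx)` makes the gap concrete.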

  10. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  11. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  12. Absorption and adsorption chillers applied to air conditioning systems

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Agnieszka; Szaflik, Władysław

    2010-07-01

    This work presents the possibility of applying sorption refrigerators driven by a low-temperature fluid to the air conditioning of buildings. Thermodynamic models were formulated, and an absorption LiBr-water chiller with 10 kW cooling power as well as an adsorption chiller with a silica gel bed were investigated. Both use water for the desorption process at a temperature Tdes = 80 °C. The coefficient of performance (COP) of both cooling cycles was analyzed under the same conditions of driving heat source, cooling water Tc = 25 °C, and evaporator temperature Tevap = 5 °C. In this study, the computer software EES was used to investigate the performance of the absorption heat pump system and its behaviour in configuration with a geothermal heat source.
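
    The thermodynamic ceiling for the COP at the quoted temperatures is a one-line calculation worth writing out. A minimal sketch of the standard reversible bound (a Carnot engine between Tdes and Tc driving a Carnot refrigerator between Tevap and Tc); real single-effect machines achieve COPs well below this ideal limit.

```python
def cop_ideal(t_des_c, t_c_c, t_evap_c):
    """Reversible (Carnot-limit) COP of a heat-driven chiller.
    Arguments in Celsius; converted to kelvin internally."""
    t_des, t_c, t_evap = (t + 273.15 for t in (t_des_c, t_c_c, t_evap_c))
    # Carnot engine efficiency times Carnot refrigerator COP:
    return (1 - t_c / t_des) * t_evap / (t_c - t_evap)

# Temperatures from the study: desorption 80 C, cooling water 25 C, evaporator 5 C.
cop_max = cop_ideal(80.0, 25.0, 5.0)
```

At these temperatures the bound comes out near 2.2, so the gap between it and the measured COPs quantifies the irreversibility of each sorption cycle.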

  13. Maximum likelihood estimation applied to multiepoch MEG/EEG analysis

    NASA Astrophysics Data System (ADS)

    Baryshnikov, Boris V.

    A maximum likelihood based algorithm for reducing the effects of spatially colored noise in evoked response MEG and EEG experiments is presented. The signal of interest is modeled as the low rank mean, while the noise is modeled as a Kronecker product of spatial and temporal covariance matrices. The temporal covariance is assumed known, while the spatial covariance is estimated as part of the algorithm. In contrast to prestimulus based whitening followed by principal component analysis, our algorithm does not require signal-free data for noise whitening and thus is more effective with non-stationary noise and produces better quality whitening for a given data record length. The efficacy of this approach is demonstrated using simulated and real MEG data. Next, a study in which we characterize MEG cortical response to coherent vs. incoherent motion is presented. It was found that coherent motion of the object induces not only an early sensory response around 180 ms relative to the stimulus onset but also a late field in the 250--500 ms range that has not been observed previously in similar random dot kinematogram experiments. The late field could not be resolved without signal processing using the maximum likelihood algorithm. The late activity localized to parietal areas. This is what would be expected. We believe that the late field corresponds to higher order processing related to the recognition of the moving object against the background. Finally, a maximum likelihood based dipole fitting algorithm is presented. It is suitable for dipole fitting of evoked response MEG data in the presence of spatially colored noise. The method exploits the temporal multiepoch structure of the evoked response data to estimate the spatial noise covariance matrix from the section of data being fit, eliminating the stationarity assumption implicit in prestimulus based whitening approaches. The preliminary results of the application of this algorithm to the simulated data show its

  14. DART system analysis.

    SciTech Connect

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.; Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the ''once-through'' time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the ''once-through'' time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase ''inner loop'' iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
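
    The trade-off among once-through time, iteration probability, and rework fraction can be made concrete with a simple geometric-iteration model consistent with the abstract's qualitative findings. The formula and all numbers below are illustrative assumptions, not figures from the DART study.

```python
def expected_step_time(t_once, p_iter, rework_fraction):
    """Expected time for one process step when each pass can trigger a
    backward iteration with probability p_iter, and each repeat costs
    only rework_fraction of the once-through time."""
    expected_repeats = p_iter / (1 - p_iter)   # mean number of extra passes
    return t_once * (1 + rework_fraction * expected_repeats)

# Hypothetical meshing step: 10 days once through, 30% chance of a
# backward iteration, repeats at 50% of the original effort.
base            = expected_step_time(10.0, 0.30, 0.50)
improved_rework = expected_step_time(10.0, 0.30, 0.25)  # better rework efficiency
improved_p      = expected_step_time(10.0, 0.15, 0.50)  # fewer iterations
```

Halving either the iteration probability or the rework fraction shaves a comparable amount off the expected step time in this toy, mirroring the report's finding that both levers offer real, if unequal, savings.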

  15. Hopf Method Applied to Low and High Dimensional Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Ma, Seungwook; Marston, Brad

    2004-03-01

    With an eye towards the goal of directly extracting statistical information from general circulation models (GCMs) of climate, thereby avoiding lengthy time integrations, we investigate the usage of the Hopf functional method (Uriel Frisch, Turbulence: The Legacy of A. N. Kolmogorov, Cambridge University Press, 1995, chapter 9.5). We use the method to calculate statistics over low-dimensional attractors, and for fluid flow on a rotating sphere. For the cases of the 3-dimensional Lorenz attractor, and a 5-dimensional nonlinear system introduced by Orszag as a toy model of turbulence (Steven Orszag, in Fluid Dynamics: Les Houches, 1977), a comparison of results obtained by low-order truncations of the cumulant expansion against statistics calculated by direct numerical integration forward in time shows surprisingly good agreement. The extension of the Hopf method to a high-dimensional barotropic model of inviscid fluid flow on a rotating sphere, which employs Arakawa's method to conserve energy and enstrophy (Akio Arakawa, J. Comp. Phys. 1, 119 (1966)), is discussed.

  16. GIS Application System Design Applied to Information Monitoring

    NASA Astrophysics Data System (ADS)

    Qun, Zhou; Yujin, Yuan; Yuena, Kang

    A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development, and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization rate and sharing degree of environmental information through advanced information technology, and providing as timely and scientific a foundation as possible for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a set of complete embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. The thesis exploits the advantages of dynamic plug-in loading and configuration, quickly establishing a GIS application through visualized component collaborative modeling, and realizes GIS application integration. The developed platform is applicable to any integration involving GIS applications (on the ESRI platform) and can serve as a base development platform for GIS application development.

  17. 30 CFR 260.111 - What conditions apply to the bidding systems that MMS uses?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What conditions apply to the bidding systems that MMS uses? 260.111 Section 260.111 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE... What conditions apply to the bidding systems that MMS uses? (a) For each of the bidding systems...

  18. 30 CFR 260.111 - What conditions apply to the bidding systems that MMS uses?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What conditions apply to the bidding systems that MMS uses? 260.111 Section 260.111 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION... Bidding Systems General Provisions § 260.111 What conditions apply to the bidding systems that MMS...

  19. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  20. Applying axiomatic design to a medication distribution system

    NASA Astrophysics Data System (ADS)

    Raguini, Pepito B.

    As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common error affecting patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer's needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools, and design data. We therefore propose applying the AD methodology to medical applications with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored conveniently near patients, as well as through a mobile apparatus commonly used by hospitals that can also store medications: the medication cart. Moreover, a robust methodology called the focused-store methodology will be introduced and developed for both uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.
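
AD's Independence Axiom can be made concrete with a design-matrix check: a design is "uncoupled" if the matrix relating functional requirements (FRs) to design parameters (DPs) is diagonal, "decoupled" if it is triangular, and "coupled" otherwise. A minimal sketch, not taken from the dissertation, with hypothetical matrices:

```python
def classify_design(matrix):
    """Classify a square FR-to-DP design matrix per Axiomatic Design's
    Independence Axiom: 'uncoupled' (diagonal), 'decoupled' (triangular),
    or 'coupled' (neither). Nonzero entries mark FR/DP dependencies."""
    n = len(matrix)
    diagonal = all(matrix[i][j] == 0
                   for i in range(n) for j in range(n) if i != j)
    lower = all(matrix[i][j] == 0
                for i in range(n) for j in range(n) if j > i)
    upper = all(matrix[i][j] == 0
                for i in range(n) for j in range(n) if j < i)
    if diagonal:
        return "uncoupled"
    if lower or upper:
        return "decoupled"   # satisfiable if DPs are set in sequence
    return "coupled"
```

A decoupled medication-distribution design could then be satisfied by fixing design parameters in the order the triangular structure dictates.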

  1. Study of fuel cell co-generation systems applied to a dairy industry

    NASA Astrophysics Data System (ADS)

    Leal, Elisângela M.; Silveira, José Luz

    This paper presents a methodology for the study of a molten carbonate fuel cell co-generation system. This system is applied to a dairy industry of medium size that typically demands 2100 kW of electricity, 8500 kg/h of saturated steam ( P=1.08 MPa) and 2725 kW of cold water production. Depending on the associated recuperation equipment, the co-generation system permits the recovery of waste heat, which can be used for the production of steam, hot and cold water, hot and cold air. In this study, a comparison is made between two configurations of fuel cell co-generation systems (FCCS). The plant performance has been evaluated on the basis of fuel utilisation efficiency and each system component evaluated on the basis of second law efficiency. The energy analysis presented shows a fuel utilisation efficiency of about 87% and exergy analysis shows that the irreversibilities in the combustion chamber of the plant are significant. Further, the payback period estimated for a fuel cell investment between US$ 1000 and US$ 1500/kW is about 3 and 6 years, respectively.

  2. Versatile microanalytical system with porous polypropylene capillary membrane for calibration gas generation and trace gaseous pollutants sampling applied to the analysis of formaldehyde, formic acid, acetic acid and ammonia in outdoor air.

    PubMed

    Coelho, Lúcia H G; Melchert, Wanessa R; Rocha, Flavio R; Rocha, Fábio R P; Gutz, Ivano G R

    2010-11-15

    The analytical determination of atmospheric pollutants still presents challenges due to the low-level concentrations (frequently in the μg m(-3) range) and their variations with sampling site and time. In this work, a capillary membrane diffusion scrubber (CMDS) was scaled down to match with capillary electrophoresis (CE), a quick separation technique that requires nothing more than some nanoliters of sample and, when combined with capacitively coupled contactless conductometric detection (C(4)D), is particularly favorable for ionic species that do not absorb in the UV-vis region, like the target analytes formaldehyde, formic acid, acetic acid and ammonium. The CMDS was coaxially assembled inside a PTFE tube and fed with acceptor phase (deionized water for species with a high Henry's constant such as formaldehyde and carboxylic acids, or acidic solution for ammonia sampling with equilibrium displacement to the non-volatile ammonium ion) at a low flow rate (8.3 nL s(-1)), while the sample was aspirated through the annular gap of the concentric tubes at 2.5 mL s(-1). A second unit, similar in all respects to the CMDS, was operated as a capillary membrane diffusion emitter (CMDE), generating a gas flow with known concentrations of ammonia for the evaluation of the CMDS. The fluids of the system were driven with inexpensive aquarium air pumps, and the collected samples were stored in vials cooled by a Peltier element. Complete protocols were developed for the analysis, in air, of NH(3), CH(3)COOH, HCOOH and, with a derivatization setup, CH(2)O, by associating the CMDS collection with the determination by CE-C(4)D. The ammonia concentrations obtained by electrophoresis were checked against the reference spectrophotometric method based on Berthelot's reaction. Sensitivity enhancements of this reference method were achieved by using a modified Berthelot reaction, solenoid micro-pumps for liquid propulsion and a long optical path cell based on a liquid core waveguide (LCW). All

  3. Applying machine learning techniques to DNA sequence analysis. Progress report, February 14, 1991--February 13, 1992

    SciTech Connect

    Shavlik, J.W.

    1992-04-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
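
The KBANN step described in the abstract, seeding a network from a symbolic rule and then refining the weights on training examples, can be sketched with a single rule-initialized logistic unit; the consensus motif and training examples below are hypothetical, not from the actual system:

```python
import math

def seed_weights(consensus):
    """Map a consensus-sequence rule into initial weights: +1 for the
    consensus base at each position, 0 elsewhere; the bias is set so
    that (roughly) all positions must match to fire."""
    bases = "ACGT"
    w = [[1.0 if b == c else 0.0 for b in bases] for c in consensus]
    bias = -(len(consensus) - 0.5)
    return w, bias

def predict(w, bias, seq):
    """Logistic-unit probability that seq belongs to the motif."""
    s = bias + sum(w[i]["ACGT".index(ch)] for i, ch in enumerate(seq))
    return 1.0 / (1.0 + math.exp(-s))

def refine(w, bias, examples, lr=0.5, epochs=200):
    """Stochastic gradient descent on cross-entropy loss refines the
    rule-derived weights against labeled member/nonmember examples."""
    for _ in range(epochs):
        for seq, label in examples:
            err = predict(w, bias, seq) - label
            bias -= lr * err
            for i, ch in enumerate(seq):
                w[i]["ACGT".index(ch)] -= lr * err
    return w, bias

# Hypothetical motif "ACG" plus an exception ("AGG") the raw rule misses.
examples = [("ACG", 1), ("ACT", 0), ("TCG", 0), ("AGG", 1)]
w, b = seed_weights("ACG")
w, b = refine(w, b, examples)
```

The seeded unit already classifies most examples; training then corrects the exception while preserving the rule's structure, which is the essence of the knowledge-based refinement KBANN performs at network scale.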

  4. The accident evolution and barrier function (AEB) model applied to incident analysis in the processing industries.

    PubMed

    Svenson, O

    1991-09-01

    This study develops a theoretical model for accident evolutions and how they can be arrested. The model describes the interaction between technical and human-organizational systems which may lead to an accident. The analytic tool provided by the model gives equal weight to both these types of systems and necessitates simultaneous and interactive accident analysis by engineers and human factors specialists. It can be used in predictive safety analyses as well as in post hoc incident analyses. To illustrate this, the AEB model is applied to an incident reported by the nuclear industry in Sweden. In general, application of the model will indicate where and how safety can be improved, and it also raises questions about issues such as the cost, feasibility, and effectiveness of different ways of increasing safety.

  5. Distinguishing Pattern Formation Phenotypes: Applying Minkowski Functionals to Cell Biology Systems

    NASA Astrophysics Data System (ADS)

    Rericha, Erin; Guven, Can; Parent, Carole; Losert, Wolfgang

    2011-03-01

    Spatial clustering of proteins within cells, or of cells themselves, frequently occurs in cell biology systems. However, quantifying the underlying order and determining the regulators of these cluster patterns have proved difficult due to the inherently high noise levels in these systems. For instance, the patterns formed by wild-type and cyclic-AMP regulatory mutant Dictyostelium cells are visually distinctive, yet the large error bars in measurements of the fractal number, area, Euler number, eccentricity, and wavelength make it difficult to quantitatively distinguish between the patterns. We apply a spatial analysis technique based on Minkowski functionals and develop metrics which clearly separate wild-type and mutant cell lines into distinct categories. Having such a metric facilitated the development of a computational model for cellular aggregation and its regulators. Supported by NIH-NGHS Nanotechnology (R01GM085574) and the Burroughs Wellcome Fund.
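
In two dimensions the first three Minkowski functionals reduce to area, perimeter, and Euler characteristic, which can be counted exactly from the occupied pixels of a binary pattern; a minimal sketch (not the authors' implementation):

```python
def minkowski_2d(pixels):
    """Area, perimeter, and Euler characteristic of a set of occupied
    unit pixels given as (row, col) coordinates."""
    pixels = set(pixels)
    area = len(pixels)
    # Each pixel contributes 4 edges; an edge shared by two pixels is
    # interior, so the perimeter counts edges used exactly once.
    edge_count = {}
    for (r, c) in pixels:
        for e in (((r, c), (r, c + 1)),            # top
                  ((r + 1, c), (r + 1, c + 1)),    # bottom
                  ((r, c), (r + 1, c)),            # left
                  ((r, c + 1), (r + 1, c + 1))):   # right
            edge_count[e] = edge_count.get(e, 0) + 1
    perimeter = sum(1 for n in edge_count.values() if n == 1)
    # Euler characteristic of the pixel cell complex: V - E + F.
    vertices = {v for e in edge_count for v in e}
    euler = len(vertices) - len(edge_count) + area
    return area, perimeter, euler
```

A single pixel gives (1, 4, 1); a 3x3 ring of pixels around a hole gives Euler characteristic 0, the signature of a connected pattern with one hole, which is how these functionals separate pattern phenotypes that simple area measures cannot.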

  6. Launch Vehicle Systems Analysis

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1999-01-01

    This report summarizes the key accomplishments of Georgia Tech's Space Systems Design Laboratory (SSDL) under NASA Grant NAG8-1302 from NASA - Marshall Space Flight Center. The report consists of this summary white paper, copies of technical papers written under this grant, and several viewgraph-style presentations. During the course of this grant four main tasks were completed: (1) Simulated Combined-Cycle Rocket Engine Analysis Module (SCCREAM), a computer analysis tool for predicting the performance of various RBCC engine configurations; (2) Hyperion, a single-stage-to-orbit vehicle capable of delivering 25,000-pound payloads to the International Space Station orbit; (3) Bantam-X support, a small-payload mission; (4) international trajectory support for interplanetary human Mars missions.

  7. Neutron activation analysis system

    DOEpatents

    Taylor, M.C.; Rhodes, J.R.

    1973-12-25

    A neutron activation analysis system for monitoring a generally fluid medium, such as slurries, solutions, and fluidized powders, including two separate conduit loops for circulating fluid samples within the range of radiation sources and detectors, is described. Associated with the first loop is a neutron source that emits a high flux of slow and thermal neutrons. The second loop employs a fast neutron source, the flux from which is substantially free of thermal neutrons. Adjacent to both loops are gamma counters for spectrographic determination of the fluid constituents. Other gamma sources and detectors are arranged across a portion of each loop for determining the fluid density. (Official Gazette)

  8. Beta systems error analysis

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The atmospheric backscatter coefficient, beta, measured with an airborne CO Laser Doppler Velocimeter (LDV) system operating in a continuous wave, focused mode is discussed. The Single Particle Mode (SPM) algorithm was developed from concept through analysis of an extensive amount of data obtained with the system on board a NASA aircraft. The SPM algorithm is intended to be employed in situations where one particle at a time appears in the sensitive volume of the LDV. In addition to giving the backscatter coefficient, the SPM algorithm also produces as intermediate results the aerosol density and the aerosol backscatter cross section distribution. A second method, which measures only the atmospheric backscatter coefficient, is called the Volume Mode (VM) and was simultaneously employed. The results of these two methods differed by slightly less than an order of magnitude. The measurement uncertainties or other errors in the results of the two methods are examined.

  9. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... or operate an ethylene production unit expressly referenced to this subpart XX from subpart YY...

  10. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... or operate an ethylene production unit expressly referenced to this subpart XX from subpart YY...

  11. Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning

    EPA Science Inventory

    This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...

  12. Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems

    NASA Technical Reports Server (NTRS)

    Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.

    2016-01-01

    margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time domain derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag. Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.
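
The verification logic, predict the neutral-stability gain from a linearization and then confirm by time-domain simulation that the response diverges there, can be sketched on a trivially simple discrete-time loop; the plant and gain values below are hypothetical:

```python
def simulate(gain_mult, a=0.5, b=1.0, k=0.9, steps=5000):
    """Time-domain run of the closed loop x[k+1] = (a - b*gain_mult*k)*x[k];
    returns True if the response stays bounded (stable)."""
    x = 1.0
    for _ in range(steps):
        x *= (a - b * gain_mult * k)
        if abs(x) > 1e3:
            return False
    return True

def find_gain_margin(step=0.01, m_max=3.0):
    """Sweep an incremental gain multiplier, as in the margin-verification
    runs described above, until the time-domain response diverges. For
    this loop the analytic neutral-stability point is m = (1 + a)/(b*k)."""
    m = 1.0
    while m < m_max:
        if not simulate(m):
            return m
        m += step
    return None
```

For these values the linear analysis predicts instability at a gain multiplier of 1.5/0.9 ≈ 1.667, and the time-domain sweep first diverges at essentially that value, which is the agreement the paper demonstrates for the full nonlinear time-varying system.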

  13. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 11 2012-07-01 2012-07-01 false Does this subpart apply to my heat... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply...

  14. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 11 2013-07-01 2013-07-01 false Does this subpart apply to my heat... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply...

  15. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Does this subpart apply to my heat... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply...

  16. Comparison of transverse dental changes induced by the palatally applied Frog appliance and buccally applied Karad's integrated distalizing system

    PubMed Central

    Kaygisiz, Emine; Unver, Fatih; Tortop, Tuba

    2016-01-01

    Objective To compare the transverse dental changes induced by the palatally applied Frog appliance and buccally applied Karad's integrated distalizing system (KIDS). Methods We evaluated the pre- and post distalization orthodontic models of 39 patients, including 19 treated using the Frog appliance, which is palatally positioned (Frog group), and 20 treated using KIDS, which is buccally positioned (KIDS group). Changes in intermolar and interpremolar distances and the amount of maxillary premolar and molar rotation were evaluated on model photocopies. Wilcoxon and Mann-Whitney U tests were used for statistical evaluations. A p-value of < 0.05 was considered statistically significant. Results Significant distopalatal rotation of premolars and distobuccal rotation of molars were observed in Frog group (p < 0.05), while significant distopalatal rotation of molars (p < 0.05), with no significant changes in premolars, was observed in KIDS group. The amount of second premolar and first molar rotation was significantly different between the two groups (p < 0.05 and p < 0.001, respectively). Furthermore, expansion in the region of the first molars and second premolars was significantly greater in KIDS group than in Frog group (p < 0.001 for both). Conclusions Our results suggest that the type and amount of first molar rotation and expansion vary with the design of the distalization appliance used. PMID:27019824

  17. The Research Mission of Universities of Applied Sciences and the Future Configuration of Higher Education Systems in Europe

    ERIC Educational Resources Information Center

    Lepori, Benedetto; Kyvik, Svein

    2010-01-01

    This article presents a comparative analysis of the development of research in universities of applied sciences (UAS) in eight European countries and its implications for the configuration of the higher education system. The enhancement of research has mostly been seen as a case of academic drift where UAS attempt to become more similar to…

  18. System of systems modeling and analysis.

    SciTech Connect

    Campbell, James E.; Anderson, Dennis James; Longsine, Dennis E.; Shirah, Donald N.

    2005-01-01

    This report documents the results of an LDRD program entitled 'System of Systems Modeling and Analysis' that was conducted during FY 2003 and FY 2004. Systems that themselves consist of multiple systems (referred to here as System of Systems or SoS) introduce a level of complexity to systems performance analysis and optimization that is not readily addressable by existing capabilities. The objective of the 'System of Systems Modeling and Analysis' project was to develop an integrated modeling and simulation environment that addresses the complex SoS modeling and analysis needs. The approach to meeting this objective involved two key efforts. First, a static analysis approach, called state modeling, has been developed that is useful for analyzing the average performance of systems over defined use conditions. The state modeling capability supports analysis and optimization of multiple systems and multiple performance measures or measures of effectiveness. The second effort involves time simulation which represents every system in the simulation using an encapsulated state model (State Model Object or SMO). The time simulation can analyze any number of systems including cross-platform dependencies and a detailed treatment of the logistics required to support the systems in a defined mission.

  19. Constraints to applying systems thinking concepts in health systems: A regional perspective from surveying stakeholders in Eastern Mediterranean countries

    PubMed Central

    El-Jardali, Fadi; Adam, Taghreed; Ataya, Nour; Jamal, Diana; Jaafar, Maha

    2014-01-01

    Background: Systems Thinking (ST) has recently been promoted as an important approach to health systems strengthening. However, ST is not common practice, particularly in Low- and Middle-Income Countries (LMICs). This paper seeks to explore the barriers that may hinder its application in the Eastern Mediterranean Region (EMR) and possible strategies to mitigate them. Methods: A survey consisting of open-ended questions was conducted with a purposive sample of health policy-makers such as senior officials from the Ministry of Health (MoH), researchers, and other stakeholders such as civil society groups and professional associations from ten countries in the region. A total of 62 respondents participated in the study. Thematic analysis was conducted. Results: There was strong recognition of the relevance and usefulness of ST to health systems policy-making and research, although misconceptions about what ST means were also identified. Experience with applying ST was very limited. Approaches to designing health policies in the EMR were perceived as reactive and fragmented (66%). Commonly perceived constraints to application of ST were: a perceived notion of its costliness combined with lack of the necessary funding to operationalize it (53%), competing political interests and lack of government accountability (50%), lack of awareness about relevance and value (47%), limited capacity to apply it (45%), and difficulty in coordinating and managing stakeholders (39%). Conclusion: While several strategies were proposed to mitigate most of these constraints, respondents emphasized the importance of political endorsement and adoption of ST at the leadership level, together with building the necessary capacity to apply it and applying the learning in research and practice. PMID:25489598

  20. Symmetry analysis for nonlinear time reversal methods applied to nonlinear acoustic imaging

    NASA Astrophysics Data System (ADS)

    Dos Santos, Serge; Chaline, Jennifer

    2015-10-01

    Using symmetry invariance, nonlinear Time Reversal (TR) and reciprocity properties, the classical NEWS methods are supplemented and improved by new excitations having the intrinsic property of enlarging the frequency-analysis bandwidth and time-domain scales, now with applications in both medical acoustics and electromagnetics. The analysis of invariant quantities is a well-known tool often used in nonlinear acoustics to simplify complex equations. Based on a fundamental physical principle known as symmetry analysis, this approach consists in finding judicious variables, intrinsically scale-dependent, that can describe all stages of behaviour on the same theoretical foundation. Building on previously published results in the nonlinear acoustics area, some practical implementations are proposed as a new way to define TR-NEWS based methods applied to NDT and medical bubble-based non-destructive imaging. This paper shows how symmetry analysis can help us define new methodologies and new experimental set-ups involving modern signal processing tools. Some examples of practical realizations are proposed in the context of biomedical non-destructive imaging using Ultrasound Contrast Agents (ACUs), where symmetry and invariance properties allow us to define a microscopic scale-invariant experimental set-up describing intrinsic symmetries of the microscopic complex system.

  1. Air pollution simulation and geographical information systems (GIS) applied to Athens International Airport.

    PubMed

    Theophanides, Mike; Anastassopoulou, Jane

    2009-07-01

    This study presents an improved methodology for analysing atmospheric pollution around airports using Gaussian-plume numerical simulation integrated with Geographical Information Systems (GIS). The new methodology focuses on streamlining the lengthy analysis process for Airport Environmental Impact Assessments by integrating the definition of emission sources, simulating and displaying the results in a GIS environment. One of the objectives of the research is to validate the methodology applied to the Athens International Airport, "Eleftherios Venizelos", to produce a realistic estimate of emission inventories, dispersion simulations and comparison to measured data. The methodology used a combination of the Emission Dispersion and Modelling System (EDMS) and the Atmospheric Dispersion and Modelling system (ADMS) to improve the analysis process. The second objective is to conduct numerical simulations under various adverse conditions (e.g. scenarios) and assess the dispersion in the surrounding areas. The study concludes that the use of GIS in environmental assessments provides a valuable advantage for organizing data and entering accurate geographical/topological information for the simulation engine. Emissions simulation produced estimates within 10% of published values. Dispersion simulations indicate that airport pollution will affect neighbouring cities such as Rafina and Loutsa. Presently, there are no measured controls in these areas. In some cases, airport pollution can contribute to as much as 40% of permissible EU levels in VOCs.
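
The dispersion step that tools like EDMS and ADMS perform is, at its core, the steady-state Gaussian-plume solution for a continuous point source with ground reflection; a minimal sketch with hypothetical, linearly growing dispersion coefficients (operational models use stability-class-dependent sigma curves):

```python
import math

def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
    """Steady-state Gaussian plume concentration from a point source of
    strength q (g/s) at effective height h (m), wind speed u (m/s),
    evaluated at downwind distance x, crosswind offset y, height z (m),
    with full ground reflection. Dispersion is assumed to grow linearly
    downwind: sigma_y = a*x, sigma_z = b*x (hypothetical coefficients)."""
    sy, sz = a * x, b * x
    lateral = math.exp(-y ** 2 / (2 * sy ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sz ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sz ** 2)))
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical
```

Summing this kernel over the inventory of airport sources, per wind scenario, yields the concentration fields that the GIS layer then maps onto neighbouring areas such as Rafina and Loutsa.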

  3. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  4. Applying Dynamical Systems Theory to Optimize Libration Point Orbit Stationkeeping Maneuvers for WIND

    NASA Technical Reports Server (NTRS)

    Brown, Jonathan M.; Petersen, Jeremy D.

    2014-01-01

    NASA's WIND mission has been operating in a large amplitude Lissajous orbit in the vicinity of the interior libration point of the Sun-Earth/Moon system since 2004. Regular stationkeeping maneuvers are required to maintain the orbit due to the instability around the collinear libration points. Historically these stationkeeping maneuvers have been performed by applying an incremental change in velocity, or Δv, along the spacecraft-Sun vector as projected into the ecliptic plane. Previous studies have shown that the magnitude of libration point stationkeeping maneuvers can be minimized by applying the Δv in the direction of the local stable manifold found using dynamical systems theory. This paper presents the analysis of this new maneuver strategy which shows that the magnitude of stationkeeping maneuvers can be decreased by 5 to 25 percent, depending on the location in the orbit where the maneuver is performed. The implementation of the optimized maneuver method into operations is discussed and results are presented for the first two optimized stationkeeping maneuvers executed by WIND.
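
The heart of the optimized strategy is aligning the Δv with the local stable manifold, which near the libration point is approximated by the stable eigenvector of the linearized dynamics; a toy two-dimensional sketch with a hypothetical saddle-point Jacobian (the real problem is higher-dimensional and uses the state-transition matrix):

```python
import math

def stable_direction(a11, a12, a21, a22):
    """Unit eigenvector of a 2x2 Jacobian associated with its negative
    (stable) eigenvalue, assuming a real saddle point."""
    tr, det = a11 + a22, a11 * a22 - a12 * a21
    disc = math.sqrt(tr * tr - 4 * det)
    lam = min((tr - disc) / 2, (tr + disc) / 2)   # stable eigenvalue
    # Solve (A - lam*I) v = 0 using the first row when possible.
    if abs(a12) > 1e-12:
        v = (a12, lam - a11)
    else:
        v = (lam - a22, a21)
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

def project_dv(dv, direction):
    """Project a candidate delta-v onto the stable direction, keeping
    only the component that damps the unstable drift."""
    dot = dv[0] * direction[0] + dv[1] * direction[1]
    return (dot * direction[0], dot * direction[1])
```

A maneuver aligned this way spends no propellant exciting the unstable mode, which is why the stable-manifold direction reduces maneuver magnitudes relative to the historical Sun-line direction.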

  5. Quantitative Systems Pharmacology Approaches Applied to Microphysiological Systems (MPS): Data Interpretation and Multi-MPS Integration

    PubMed Central

    Yu, J; Cilfone, NA; Large, EM; Sarkar, U; Wishnok, JS; Tannenbaum, SR; Hughes, DJ; Lauffenburger, DA; Griffith, LG; Stokes, CL; Cirit, M

    2015-01-01

    Our goal in developing Microphysiological Systems (MPS) technology is to provide an improved approach for more predictive preclinical drug discovery via a highly integrated experimental/computational paradigm. Success will require quantitative characterization of MPSs and mechanistic analysis of experimental findings sufficient to translate resulting insights from in vitro to in vivo. We describe herein a systems pharmacology approach to MPS development and utilization that incorporates more mechanistic detail than traditional pharmacokinetic/pharmacodynamic (PK/PD) models. A series of studies illustrates diverse facets of our approach. First, we demonstrate two case studies, a PK data analysis and an inflammation response, each focused on a single MPS: the liver/immune MPS. Building on the single-MPS modeling, a theoretical investigation of a four-MPS interactome then provides a quantitative way to consider several pharmacological concepts such as absorption, distribution, metabolism, and excretion in the design of multi-MPS interactome operation and experiments. PMID:26535159
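
The multi-MPS interactome analysis rests on compartmental mass-balance models of the kind used in PK/PD; a minimal two-compartment sketch with hypothetical flow, volume, and clearance values (not the paper's model):

```python
def two_mps_pk(c1_0, c2_0, q=1.0, v1=2.0, v2=5.0, clearance=0.3,
               dt=0.01, t_end=50.0):
    """Euler integration of a hypothetical two-compartment (two-MPS)
    exchange model: medium circulates between compartments at flow q
    (L/h), and compartment 2 (e.g. a liver MPS) clears drug with
    intrinsic clearance `clearance` (L/h). Volumes v1, v2 are in L,
    concentrations in arbitrary units. Returns final concentrations."""
    c1, c2 = c1_0, c2_0
    t = 0.0
    while t < t_end:
        dc1 = q * (c2 - c1) / v1                      # exchange only
        dc2 = (q * (c1 - c2) - clearance * c2) / v2   # exchange + clearance
        c1 += dc1 * dt
        c2 += dc2 * dt
        t += dt
    return c1, c2
```

Scaling this mass-balance structure from two to four compartments, with measured flows and clearances, is what lets the interactome analysis reason quantitatively about absorption, distribution, metabolism, and excretion across linked MPSs.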

  6. Applying gap analysis and a comparison index to evaluate protected areas in Thailand.

    PubMed

    Trisurat, Yongyut

    2007-02-01

    Protected areas in Thailand were first established 40 years ago. The total area of existing protected areas covers 18.2% of the country's land area and the Class 1 Watershed, another form of protection, encompasses 18.1%. The government of Thailand intends to increase protected area systems to 25% of the country in 2006 and 30% in 2016. Questions continually arise about how much protected area is enough to effectively protect biodiversity. The objective of this article is to assess the representation of ecosystems in the protected area network. This article also recommends which underrepresented ecosystems should be added to fill the gaps in representativeness. The research applies a gap analysis and a comparison index to assess the representation of ecosystems within the protected area network. The spatial analyses were applied to measure three aspects of representativeness, namely forest type, altitude, and natural land system. The analyses indicate that the existing protected area system covers 24.4% of the country's land area, nearly meeting the 25% target proposed by the National Forest Policy; and 83.8% of these areas are under forest cover. Most protected areas are situated in high altitudes, where biological diversity is less than in lowlands. Mangrove forest and riparian floodplain are extremely underrepresented in the existing system. Peat swamp forest, dry dipterocarp forest, and beach forest are relatively well represented. In addition, these five ecosystems are threatened by human pressures and natural disasters; therefore, they should be targeted as high priorities for the selection of new reserves. Future research should incorporate aquatic and marine ecosystems, as well as animal distributions, which were not included in this research due to data unavailability. PMID:17106794
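
Operationally, the gap analysis overlays each ecosystem class with the protected-area network and flags classes whose protected share falls below the coverage target; a minimal tabular sketch with hypothetical areas (not Thailand's actual figures):

```python
def gap_analysis(class_area, protected_area, target=0.25):
    """For each ecosystem class, report the fraction of its total area
    inside protected areas and whether it falls short of the coverage
    target (a representation 'gap')."""
    report = {}
    for cls, total in class_area.items():
        frac = protected_area.get(cls, 0.0) / total
        report[cls] = {"protected_fraction": round(frac, 3),
                       "gap": frac < target}
    return report

# Hypothetical areas in km^2, for illustration only.
classes = {"mangrove": 2500.0, "dry dipterocarp": 12000.0}
protected = {"mangrove": 250.0, "dry dipterocarp": 4200.0}
```

Classes flagged with a gap (here the under-protected mangrove class) become the candidates for new reserves, mirroring the article's finding that mangrove forest and riparian floodplain are the priority additions.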

  7. An introductory review of parallel independent component analysis (p-ICA) and a guide to applying p-ICA to genetic data and imaging phenotypes to identify disease-associated biological pathways and systems in common complex disorders.

    PubMed

    Pearlson, Godfrey D; Liu, Jingyu; Calhoun, Vince D

    2015-01-01

    Complex inherited phenotypes, including those for many common medical and psychiatric diseases, are most likely underpinned by multiple genes contributing to interlocking molecular biological processes, along with environmental factors (Owen et al., 2010). Despite this, genotyping strategies for complex, inherited, disease-related phenotypes mostly employ univariate analyses, e.g., genome-wide association. Such procedures most often identify isolated risk-related SNPs or loci, not the underlying biological pathways necessary to help guide the development of novel treatment approaches. This article focuses on the multivariate analysis strategy of parallel (i.e., simultaneous combination of SNP and neuroimage information) independent component analysis (p-ICA), which typically yields large clusters of functionally related SNPs statistically correlated with phenotype components, whose overall molecular biologic relevance is inferred subsequently using annotation software suites. Because this is a novel approach whose details are relatively new to the field, we summarize its underlying principles, address conceptual questions regarding interpretation of the resulting data, and provide practical illustrations of the method. PMID:26442095
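The cross-modality linking step at the heart of p-ICA can be illustrated on synthetic data. The sketch below substitutes SVD for ICA (stated plainly: it is a stand-in, not the p-ICA algorithm itself) to decompose two "modalities" that share a hidden subject factor, then correlates subject loadings across modalities; a strong correlation is what flags a linked SNP/imaging component pair. All data are synthetic.

```python
import numpy as np

# Highly simplified sketch of the linking step in parallel ICA: decompose each
# modality (here via SVD as a stand-in for ICA), then correlate subject
# loadings across modalities to find linked component pairs. Data are
# synthetic: 100 "subjects" sharing one hidden factor across both modalities.

rng = np.random.default_rng(3)
n_subj = 100
shared = rng.normal(size=n_subj)                   # hidden subject factor

# "SNP" and "imaging" matrices: shared factor times a modality-specific
# pattern, plus noise.
snp = np.outer(shared, rng.normal(size=500)) + 0.5 * rng.normal(size=(n_subj, 500))
img = np.outer(shared, rng.normal(size=200)) + 0.5 * rng.normal(size=(n_subj, 200))

# Subject loadings = left singular vectors of each (centered) modality.
U_snp = np.linalg.svd(snp - snp.mean(0), full_matrices=False)[0]
U_img = np.linalg.svd(img - img.mean(0), full_matrices=False)[0]

# Correlate the leading loadings across modalities.
r = np.corrcoef(U_snp[:, 0], U_img[:, 0])[0, 1]
print(abs(r))  # strong cross-modality correlation flags a linked pair
```

Real p-ICA additionally optimizes the two ICA decompositions jointly so that the cross-modality correlation is maximized during unmixing, rather than computed after the fact.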

  8. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  9. Integrated fluorescence analysis system

    DOEpatents

    Buican, Tudor N.; Yoshida, Thomas M.

    1992-01-01

    An integrated fluorescence analysis system enables a component part of a sample to be virtually sorted within a sample volume after a spectrum of the component part has been identified from a fluorescence spectrum of the entire sample in a flow cytometer. Birefringent optics enables the entire spectrum to be resolved into a set of numbers representing the intensity of spectral components of the spectrum. One or more spectral components are selected to program a scanning laser microscope, preferably a confocal microscope, whereby the spectrum from individual pixels or voxels in the sample can be compared. Individual pixels or voxels containing the selected spectral components are identified and an image may be formed to show the morphology of the sample with respect to only those components having the selected spectral components. There is no need for any physical sorting of the sample components to obtain the morphological information.

  10. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices to support activities in environments where work is complex, characterized by interdependence among a large number of variables, understanding how work is done not only takes on even greater importance, but also becomes a more difficult task. This study therefore presents a method for modeling work in complex systems, which improves knowledge about the way activities are performed where those activities do not simply happen by following procedures. Uniting techniques of Cognitive Task Analysis with the concept of the Work Process, this work seeks to provide a detailed and accurate view of how people perform their tasks, in order to apply information systems for supporting work in organizations.

  11. Normal vector analysis from GNSS-GPS data applied to Deception volcano surface deformation

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Prates, G.; Fernández-Ros, A.; García, A.

    2012-09-01

    Surface deformation parameters and their use in volcano monitoring have evolved from classical geodetic procedures to those based on Global Navigation Satellite Systems (GNSS), in particular the most widely used and known Global Positioning System (GPS), profiting from automated data processing, positioning precision and rates, as well as the large storage capacity and low power consumption of its equipment. These features have enabled permanent GNSS-GPS data acquisition to ensure continuous monitoring of geodetic benchmarks for the evaluation of surface deformation in active tectonic or volcanic areas. In Deception Island (Antarctica), a normal vector analysis is being used to derive surface deformation from three permanently observed GNSS-GPS benchmarks. Due to data availability, both in the past and for near real-time use, all benchmarks are inside the monitored volcanic area, although the reference benchmark, unlike the other two, is away from thermal springs and/or fumaroles. The time variation of the slope distances to the reference benchmark, and of the magnitude and inclination of the normal vector to the triangle defined by the reference benchmark and any other two, provides the spatial deformation in the volcanic area covered. The variation of the normal vector in magnitude gives information on compression or expansion, here called the spatial dilatometer, while changes in its inclination give information on relative uplift or subsidence, here called the spatial inclinometer. In geodesy, the triangle is a basic geometric unit, and areal strain is commonly applied in tectonics and volcanism. The normal vector analysis conjugates both, benefiting from the method's precision, simplicity, and ability to model the surface using several triangles. The proposed method was applied to GNSS-GPS data collected every austral summer between 2001-2002 and 2009-2010 in Deception Island. The results evidence that Deception Island acts as a strain marker in the Bransfield
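The geometry behind the "spatial dilatometer" and "spatial inclinometer" reduces to a cross product: three benchmark positions define a triangle, the normal vector's magnitude tracks area change (compression/expansion) and its inclination tracks relative uplift or subsidence. The sketch below uses hypothetical local east/north/up coordinates in metres, not real Deception Island data.

```python
import numpy as np

# Minimal sketch of the normal-vector analysis described above: three
# benchmark positions (local east, north, up in metres; values hypothetical)
# define a triangle whose normal vector is tracked over time.

def triangle_normal(p_ref, p2, p3):
    """Normal of the triangle (p_ref, p2, p3); |n| equals twice its area."""
    n = np.cross(p2 - p_ref, p3 - p_ref)
    magnitude = np.linalg.norm(n)                          # "spatial dilatometer"
    inclination = np.degrees(np.arcsin(n[2] / magnitude))  # "spatial inclinometer"
    return magnitude, inclination

# Epoch 1 vs epoch 2: the third benchmark is uplifted by 0.05 m.
ref   = np.array([0.0, 0.0, 0.0])
b2    = np.array([1000.0, 0.0, 0.0])
b3_t1 = np.array([0.0, 1000.0, 0.0])
b3_t2 = np.array([0.0, 1000.0, 0.05])

m1, i1 = triangle_normal(ref, b2, b3_t1)
m2, i2 = triangle_normal(ref, b2, b3_t2)
print(m2 - m1, i2 - i1)  # change in area proxy and in tilt (degrees)
```

With several triangles sharing the reference benchmark, the same computation yields a crude surface-deformation model of the whole monitored area.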

  12. System Identification Applied to Dynamic CFD Simulation and Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.; Vicroy, Dan D.

    2011-01-01

    Demanding aerodynamic modeling requirements for military and civilian aircraft have provided impetus for researchers to improve computational and experimental techniques. Model validation is a key component of these research endeavors, so this study is an initial effort to extend conventional time history comparisons by comparing model parameter estimates and their standard errors using system identification methods. An aerodynamic model of an aircraft performing one-degree-of-freedom roll oscillatory motion about its body axes is developed. The model includes linear aerodynamics and deficiency function parameters characterizing an unsteady effect. For estimation of the unknown parameters, two techniques, harmonic analysis and two-step linear regression, were applied to roll-oscillatory wind tunnel data and to computational fluid dynamics (CFD) simulated data. The model used for this study is a highly swept wing unmanned aerial combat vehicle. Differences in response prediction, parameter estimates, and standard errors are compared and discussed.
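The regression step can be illustrated on synthetic roll-oscillation data. The model below, C_l = C_l0 + C_l_phi·phi + C_l_p·p, is a simplified stand-in for the paper's aerodynamic model (it omits the deficiency function), and all coefficient values and noise levels are made up; the point is how least squares yields both parameter estimates and their standard errors.

```python
import numpy as np

# Hedged sketch of the linear-regression step: given simulated roll-oscillation
# data, estimate aerodynamic derivatives from the simplified model
#   C_l(t) = C_l0 + C_l_phi * phi(t) + C_l_p * p(t)
# (a stand-in for the paper's model; all values synthetic).

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
phi = 0.1 * np.sin(2.0 * t)          # roll angle (rad)
p = 0.2 * np.cos(2.0 * t)            # roll rate (rad/s)

true = np.array([0.01, -0.05, -0.30])            # C_l0, C_l_phi, C_l_p
X = np.column_stack([np.ones_like(t), phi, p])
C_l = X @ true + rng.normal(0.0, 1e-4, t.size)   # noisy "measurements"

theta, *_ = np.linalg.lstsq(X, C_l, rcond=None)
# Standard errors from the residual variance and the (X^T X)^{-1} diagonal.
resid = C_l - X @ theta
sigma2 = resid @ resid / (t.size - 3)
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
print(theta, se)
```

Comparing such estimates and standard errors across wind tunnel and CFD data sets is exactly the kind of validation comparison the abstract describes.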

  13. Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.

    PubMed

    de Viguerie, Laurence; Sole, V Armando; Walter, Philippe

    2009-12-01

    X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatments, even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: indeed, the absorption of X-rays through different layers results in a modification of the intensity ratio between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. The method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and 10% for concentrations. On a rather inhomogeneous painting test sample, the XRF analysis provides an average value. The method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
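The differential-attenuation idea can be reduced to a one-line Beer-Lambert inversion: two characteristic lines of a buried pigment are absorbed differently by the covering layer, so their measured intensity ratio encodes that layer's thickness. The attenuation coefficients, density, and intrinsic line ratio below are hypothetical round numbers, not values for any real pigment; the full fundamental-parameters method handles many layers and elements simultaneously.

```python
import math

# Illustrative sketch of the differential-attenuation idea: two characteristic
# lines of a pigment in a buried layer are absorbed differently by the
# covering layer, so their measured intensity ratio encodes its thickness.
# All coefficients here are hypothetical.

mu_line1 = 120.0   # cm^2/g, covering-layer mass attenuation at line 1 energy
mu_line2 = 60.0    # cm^2/g, at line 2 energy (higher energy -> less absorbed)
rho = 2.0          # g/cm^3, covering-layer density
R0 = 1.5           # intrinsic line-1/line-2 ratio with no covering layer

def thickness_from_ratio(R_measured):
    """Invert R = R0 * exp(-(mu1 - mu2) * rho * d) for thickness d (cm)."""
    return math.log(R0 / R_measured) / ((mu_line1 - mu_line2) * rho)

d = 0.002  # "true" thickness: 20 micrometres
R = R0 * math.exp(-(mu_line1 - mu_line2) * rho * d)
print(thickness_from_ratio(R))  # recovers 0.002 cm
```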

  14. Microfabricated Genomic Analysis System

    NASA Technical Reports Server (NTRS)

    Gonda, Steve; Elms, Rene

    2005-01-01

    Genetic sequencing and many genetic tests and assays require electrophoretic separation of DNA. In this technique, DNA fragments are separated by size as they migrate through a sieving gel under the influence of an applied electric field. In order to conduct these analyses on-orbit, it is essential to acquire the capability to efficiently perform electrophoresis in a microgravity environment. Conventional bench-top electrophoresis equipment is large and cumbersome and does not lend itself to on-orbit utilization. Much of the previous research regarding on-orbit electrophoresis involved altering conventional electrophoresis equipment for bioprocessing, purification, and/or separation technology applications. A new and more efficient approach to on-orbit electrophoresis is the use of a microfabricated electrophoresis platform. These platforms are much smaller, less expensive to produce and operate, use less power, require smaller sample sizes (nanoliters), and achieve separation over a much shorter distance (a few centimeters instead of tens or hundreds of centimeters). In contrast to previous applications, this platform would be utilized as an analytical tool for life science/medical research, environmental monitoring, and medical diagnoses. Identification of infectious agents as well as radiation-related damage is significant to NASA's efforts to maintain, study, and monitor crew health during and in support of near-Earth and interplanetary missions. The capability to perform genetic assays on-orbit is imperative to conduct relevant and insightful biological and medical research, as well as to continue NASA's search for life elsewhere. This technology would provide an essential analytical tool for research conducted in a microgravity environment (Shuttle, ISS, long-duration/interplanetary missions). In addition, this technology could serve as a critical and invaluable component of a biosentinel system to monitor space-environment genotoxic insults, including radiation.

  15. A User-Based Response System for the Applied Research Needs of Comprehensive Outcomes Assessment Activities.

    ERIC Educational Resources Information Center

    Chatman, Steve; Sprengel, Archie

    The development of a computer-based decision support system for outcomes assessment at Southeast Missouri State University is described. A menu-driven, user-based retrieval, analysis, and reporting system was designed using the Statistical Analysis System (SAS). The decision support system incorporates two major components, research group…

  16. Jacobi stability analysis of Rikitake system

    NASA Astrophysics Data System (ADS)

    Gupta, M. K.; Yadav, C. K.

    2016-06-01

    We study the Rikitake system through the method of differential geometry, i.e., Kosambi-Cartan-Chern (KCC) theory, for Jacobi stability analysis. To apply KCC theory we reformulate the Rikitake system as two second-order nonlinear differential equations. The five KCC invariants, which express the intrinsic properties of the nonlinear dynamical system, are obtained. The deviation curvature tensor and its eigenvalues, which determine the stability of the system, are derived. Jacobi stability of the equilibrium points is studied and the conditions for stability are obtained. We also study the dynamics of the Rikitake system, which shows chaotic behaviour near the equilibrium points.
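For readers unfamiliar with the system under study: the Rikitake two-disc dynamo is the first-order system x' = -mu*x + z*y, y' = -mu*y + (z - a)*x, z' = 1 - x*y. The sketch below integrates it with a plain RK4 stepper to show the trajectory whose stability the KCC analysis addresses; the parameter values and initial condition are common illustrative choices, not taken from this paper.

```python
import numpy as np

# The Rikitake two-disc dynamo equations (standard form; mu and a chosen
# illustratively), integrated with a simple RK4 stepper to exhibit the
# trajectory discussed above.

MU, A = 1.0, 5.0

def rikitake(state):
    x, y, z = state
    return np.array([-MU * x + z * y,
                     -MU * y + (z - A) * x,
                     1.0 - x * y])

def rk4_step(f, s, h):
    k1 = f(s)
    k2 = f(s + 0.5 * h * k1)
    k3 = f(s + 0.5 * h * k2)
    k4 = f(s + h * k3)
    return s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([3.0, 1.0, 6.0])
h = 0.01
trajectory = [state]
for _ in range(5000):
    state = rk4_step(rikitake, state, h)
    trajectory.append(state)
trajectory = np.array(trajectory)
print(trajectory[-1])  # state after 50 time units
```

Rewriting these first-order equations as two second-order ODEs (by differentiating and eliminating one variable) is the reformulation step the abstract mentions as the entry point for KCC theory.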

  17. A review of the technology and process on integrated circuits failure analysis applied in communications products

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technology and process for integrated circuits applied in communications products. Many technologies are available for failure analysis, including optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, micro-analysis, electrical measurement, microprobe techniques, chemical etching, and ion etching. Integrated circuit failure analysis depends on the accurate confirmation and analysis of the chip failure mode, the search for the root cause of failure, the summary of the failure mechanism, and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  18. [Risk Analysis applied to food safety in Brazil: prospects and challenges].

    PubMed

    Figueiredo, Ana Virgínia de Almeida; Miranda, Maria Spínola

    2011-04-01

    The scope of this case study is to discuss the ideas of the Brazilian Codex Alimentarius Committee (CCAB) coordinated by National Institute of Metrology, Standardization and Industrial Quality (Inmetro), with respect to the Codex Alimentarius norm on Risk Analysis (RA) applied to Food Safety. The objectives of this investigation were to identify and analyze the opinion of CCAB members on RA and to register their proposals for the application of this norm in Brazil, highlighting the local limitations and potential detected. CCAB members were found to be in favor of the Codex Alimentarius initiative of instituting an RA norm to promote the health safety of foods that circulate on the international market. There was a consensus that the Brazilian government should incorporate RA as official policy to improve the country's system of food control and leverage Brazilian food exports. They acknowledge that Brazil has the technical-scientific capacity to apply this norm, though they stressed several political and institutional limitations. The members consider RA to be a valid initiative for tackling risks in food, due to its ability to improve food safety control measures adopted by the government.

  19. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are available: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
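One of the evaluations listed above, closed-loop eigenvalues, is easy to reproduce outside MATRIXx. The sketch below forms A - BK for a toy double-integrator plant under state feedback and checks that all eigenvalues lie in the left half-plane; the matrices and gains are illustrative, not from the paper.

```python
import numpy as np

# A minimal analogue of the closed-loop eigenvalue evaluation described above,
# using numpy instead of MATRIXx: form A - B*K for a toy double-integrator
# plant with state feedback (matrices and gains are illustrative).

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # double integrator
B = np.array([[0.0],
              [1.0]])
K = np.array([[2.0, 3.0]])   # state-feedback gains (hypothetical)

A_cl = A - B @ K             # closed-loop dynamics matrix
eigvals = np.linalg.eigvals(A_cl)
stable = bool(np.all(eigvals.real < 0))
print(eigvals, stable)
```

Here the closed-loop characteristic polynomial is s^2 + 3s + 2, so the eigenvalues land at -1 and -2 and the loop is stable.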

  20. Applying Systems Engineering to Implement an Evidence-based Intervention at a Community Health Center

    PubMed Central

    Tu, Shin-Ping; Feng, Sherry; Storch, Richard; Yip, Mei-Po; Sohng, HeeYon; Fu, Mingang; Chun, Alan

    2013-01-01

    Summary Impressive results in patient care and cost reduction have increased the demand for systems-engineering methodologies in large health care systems. This Report from the Field describes the feasibility of applying systems-engineering techniques at a community health center currently lacking the dedicated expertise and resources to perform these activities. PMID:23698657

  1. Applied Systemic Theory and Educational Psychology: Can the Twain Ever Meet?

    ERIC Educational Resources Information Center

    Pellegrini, Dario W.

    2009-01-01

    This article reflects on the potential benefits of applying systemic theory to the work of educational psychologists (EPs). It reviews developments in systemic thinking over time, and discusses the differences between more directive "first order" versus collaborative "second order" approaches. It considers systemic theories and illustrates their…

  2. Using Microcomputers To Apply Statewide Standards for Schools and School Systems: Technological Changes over Five Years.

    ERIC Educational Resources Information Center

    Wu, Yi-Cheng; Hebbler, Stephen W.

    The Evaluation and Assessment Laboratory at the University of Alabama (Tuscaloosa) has contracted with the Georgia Department of Education (GDOE) to develop a microcomputer-based data management system for use in applying evaluation standards to schools and school systems. The Comprehensive Evaluation System (CES) was implemented statewide and has…

  3. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  4. Applied Behavior Analysis and the Imprisoned Adult Felon Project 1: The Cellblock Token Economy.

    ERIC Educational Resources Information Center

    Milan, Michael A.; And Others

    This report provides a technical-level analysis, discussion, and summary of five experiments in applied behavior analysis. Experiment 1 examined the token economy as a basis for motivating inmate behavior; Experiment 2 examined the relationship between magnitude of token reinforcement and level of inmate performance; Experiment 3 introduced a…

  5. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  6. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  7. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  8. Optical Image Analysis Applied to Pore Network Quantification of Sandstones Under Experimental CO2 Injection

    NASA Astrophysics Data System (ADS)

    Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.

    2015-12-01

    This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin, and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which thin sections were observed. The methodological and experimental processes of the investigation then focused on i) adjustment and calibration of the OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of the polarizers), using 7 images of the same mineral scene (6 under crossed polarizers and 1 under parallel polarizers); and iii) automated identification and segmentation of pores in 2-D mineral images, generating applications as executable macros. Finally, once the procedure protocols had been established, the compiled data were interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) corroborates the descriptions obtained by SEM and microscopic techniques, which showed an increase in porosity after CO2 treatment. Automated image identification and quantification of minerals, pores, and textures together with petrographic analysis can be applied to improve pore-system characterization in sedimentary rocks. This research offers numerical
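The pore-segmentation and porosity-quantification step can be sketched with a simple intensity threshold: pores image dark, grains bright, and porosity is the pore pixel fraction. The "images" below are small synthetic arrays, not real micrographs, and the threshold value is arbitrary; real OIA calibrates segmentation against the multi-polarization image stack.

```python
import numpy as np

# Sketch of the pore-segmentation step in optical image analysis: threshold a
# grayscale image (pores dark, grains bright) and report porosity as the pore
# pixel fraction. Synthetic data only.

def porosity(image, threshold=50):
    """Fraction of pixels classified as pore (intensity below threshold)."""
    pores = image < threshold
    return pores.sum() / pores.size

rng = np.random.default_rng(1)
before = rng.integers(0, 256, size=(100, 100))

# Simulate CO2-induced dissolution: an extra patch of pixels becomes pore.
after = before.copy()
after[:25, :10] = 10    # a new dark (pore) region of 250 pixels

print(porosity(before), porosity(after))  # porosity increases after treatment
```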

  9. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting of EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ2 statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
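The bookkeeping behind the degrees-of-freedom argument is short enough to write out. Stern's rule gives the number of independent data points as N_idp = 2·Δk·ΔR/π + 2; subtracting the number of fitted parameters yields the degrees of freedom available to the χ2 analysis. The k-range, r-range, and parameter count below are illustrative, not taken from the paper.

```python
import math

# Illustration of the bookkeeping mentioned above: independent data points
# available to an r-space EXAFS fit (Stern's rule) versus fitted parameters,
# for an illustrative k- and r-range.

def n_independent(k_min, k_max, r_min, r_max):
    """Stern's rule: N_idp = 2 * dk * dR / pi + 2."""
    return 2.0 * (k_max - k_min) * (r_max - r_min) / math.pi + 2.0

n_idp = n_independent(k_min=3.0, k_max=12.0, r_min=1.0, r_max=3.0)
n_params = 8                       # e.g. N, R, sigma^2, E0 for two shells
dof = n_idp - n_params             # degrees of freedom for the chi^2 analysis
print(n_idp, dof)
```

A fit with dof <= 0 is over-parameterized and its reported covariance-based errors are meaningless, which is one of the pitfalls the paper warns against.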

  10. Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density electroencephalogram

    NASA Astrophysics Data System (ADS)

    Wingeier, B. M.; Nunez, P. L.; Silberstein, R. B.

    2001-11-01

    We demonstrate an application of spherical harmonic decomposition to the analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to the analysis of hemispherical, irregularly sampled data. Spatial sampling requirements and performance of the methods are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wave-number relationship in some bands.

  11. A covariance analysis algorithm for interconnected systems

    NASA Technical Reports Server (NTRS)

    Cheng, Victor H. L.; Curley, Robert D.; Lin, Ching-An

    1987-01-01

    A covariance analysis algorithm for propagation of signal statistics in arbitrarily interconnected nonlinear systems is presented which is applied to six-degree-of-freedom systems. The algorithm uses statistical linearization theory to linearize the nonlinear subsystems, and the resulting linearized subsystems are considered in the original interconnection framework for propagation of the signal statistics. Some nonlinearities commonly encountered in six-degree-of-freedom space-vehicle models are referred to in order to illustrate the limitations of this method, along with problems not encountered in standard deterministic simulation analysis. Moreover, the performance of the algorithm shall be numerically exhibited by comparing results using such techniques to Monte Carlo analysis results, both applied to a simple two-dimensional space-intercept problem.
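The statistical-linearization core of such an algorithm, and its comparison against Monte Carlo, can be shown on a one-dimensional toy problem. Here a cubic nonlinearity f(x) = x^3 is replaced by its describing-function gain k = E[x f(x)]/E[x^2] for zero-mean Gaussian input (a textbook case, not the paper's six-degree-of-freedom model), and the linearized output variance is compared against a Monte Carlo estimate, mirroring the comparison the abstract describes.

```python
import numpy as np

# Sketch of statistical linearization for covariance analysis: replace the
# nonlinearity f(x) = x^3 by the equivalent gain k = E[x f(x)] / E[x^2]
# (= 3*sigma^2 for zero-mean Gaussian x), then compare the linearized output
# variance against Monte Carlo. Toy 1-D stand-in for the 6-DOF problem.

rng = np.random.default_rng(42)
sigma = 0.5
x = rng.normal(0.0, sigma, 1_000_000)

k = 3.0 * sigma**2                    # analytic gain: E[x^4]/sigma^2
var_linearized = k**2 * sigma**2      # = 9 * sigma^6
var_monte_carlo = np.var(x**3)        # true value is E[x^6] = 15 * sigma^6

print(var_linearized, var_monte_carlo)
```

The gap between 9σ^6 and 15σ^6 is a concrete instance of the limitations of statistical linearization that the paper illustrates against Monte Carlo results.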

  12. Quantum analysis applied to thermo field dynamics on dissipative systems

    SciTech Connect

    Hashizume, Yoichiro; Okamura, Soichiro; Suzuki, Masuo

    2015-03-10

    Thermo field dynamics is one of the formulations useful for treating statistical mechanics in the scheme of field theory. In the present study, we discuss dissipative thermo field dynamics of quantum damped harmonic oscillators. To treat the effective renormalization of quantum dissipation, we use the Suzuki-Takano approximation. Finally, we derive a dissipative von Neumann equation in the Lindblad form. In the present treatment, we can easily obtain the initial damping shown previously by Kubo.

  13. Euclidean geodesic loops on high-genus surfaces applied to the morphometry of vestibular systems.

    PubMed

    Xin, Shi-Qing; He, Ying; Fu, Chi-Wing; Wang, Defeng; Lin, Shi; Chu, Winnie C W; Cheng, Jack C Y; Gu, Xianfeng; Lui, Lok Ming

    2011-01-01

    This paper proposes a novel algorithm to extract feature landmarks on the vestibular system (VS) for the analysis of Adolescent Idiopathic Scoliosis (AIS). AIS is a 3-D spinal deformity commonly occurring in adolescent girls, with unclear etiology. One popular hypothesis suggests that structural changes in the VS disturb balance perception and thereby cause the spinal deformity. Morphometry of the VS to study the geometric differences between healthy and AIS groups is therefore of utmost importance. However, the VS is a genus-3 structure situated in the inner ear, and the high-genus topology of the surface poses a great challenge for shape analysis. In this work, we present a new method to compute exact geodesic loops on the VS. The resultant geodesic loops are in the Euclidean metric, thus characterizing the intrinsic geometric properties of the VS based on the real background geometry. This leads to more accurate results than existing methods, such as the hyperbolic Ricci flow method. Furthermore, our method is fully automatic and highly efficient, e.g., one order of magnitude faster. We applied our algorithm to the VS of normal and AIS subjects. The promising experimental results demonstrate the efficacy of our method and reveal more statistically significant shape differences in the VS between right-thoracic AIS and normal subjects.

  14. Applied Koopmanism

    NASA Astrophysics Data System (ADS)

    Budišić, Marko; Mohr, Ryan; Mezić, Igor

    2012-12-01

    A majority of methods from dynamical system analysis, especially those in applied settings, rely on Poincaré's geometric picture that focuses on "dynamics of states." While this picture has fueled our field for a century, it has shown difficulties in handling high-dimensional, ill-described, and uncertain systems, which are more and more common in engineered systems design and analysis of "big data" measurements. This overview article presents an alternative framework for dynamical systems, based on the "dynamics of observables" picture. The central object is the Koopman operator: an infinite-dimensional, linear operator that is nonetheless capable of capturing the full nonlinear dynamics. The first goal of this paper is to make it clear how methods that appeared in different papers and contexts all relate to each other through spectral properties of the Koopman operator. The second goal is to present these methods in a concise manner in an effort to make the framework accessible to researchers who would like to apply them, but also, expand and improve them. Finally, we aim to provide a road map through the literature where each of the topics was described in detail. We describe three main concepts: Koopman mode analysis, Koopman eigenquotients, and continuous indicators of ergodicity. For each concept, we provide a summary of theoretical concepts required to define and study them, numerical methods that have been developed for their analysis, and, when possible, applications that made use of them. The Koopman framework is showing potential for crossing over from academic and theoretical use to industrial practice. Therefore, the paper highlights its strengths in applied and numerical contexts. Additionally, we point out areas where an additional research push is needed before the approach is adopted as an off-the-shelf framework for analysis and design.
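Dynamic mode decomposition (DMD) is one of the standard numerical routes to Koopman mode analysis discussed in this literature. The sketch below runs exact DMD on snapshot pairs of a known 2x2 linear system, which is a toy stand-in for high-dimensional measurement data, and recovers the eigenvalues of the underlying evolution operator; for a linear system these coincide with Koopman eigenvalues.

```python
import numpy as np

# Exact DMD, a standard numerical route to Koopman mode analysis: from
# snapshot pairs of a (here, known linear) system, recover the eigenvalues of
# the underlying evolution operator. The 2x2 system is a toy example.

rng = np.random.default_rng(7)
A_true = np.array([[0.9, -0.2],
                   [0.2,  0.9]])     # damped rotation

# Build snapshot matrices X (states) and Y (states one step later).
x = rng.normal(size=2)
snapshots = [x]
for _ in range(50):
    x = A_true @ x
    snapshots.append(x)
snapshots = np.array(snapshots).T          # shape (2, 51)
X, Y = snapshots[:, :-1], snapshots[:, 1:]

# Exact DMD: A_dmd = Y X^+ via the SVD of X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
A_dmd = Y @ Vt.T @ np.diag(1.0 / s) @ U.T
dmd_eigs = np.linalg.eigvals(A_dmd)
print(np.sort_complex(dmd_eigs))  # should match eig(A_true)
```

For genuinely nonlinear dynamics the same computation is applied to (possibly lifted) observables rather than raw states, which is where the Koopman-operator viewpoint earns its keep.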

  15. Quasi-dynamic Material Flow Analysis applied to the Austrian Phosphorus cycle

    NASA Astrophysics Data System (ADS)

    Zoboli, Ottavia; Rechberger, Helmut

    2013-04-01

    Phosphorus (P) is one of the key elements that sustain life on earth and that allow the current high levels of food production worldwide. It is a non-renewable resource without any existing substitute. Because of its current dissipative use by mankind and its very slow geochemical cycle, this resource is rapidly depleting, a trend strongly connected to the problem of ensuring food security. Moreover, P is also associated with important environmental problems. Its extraction often generates hazardous wastes, while its accumulation in water bodies can lead to eutrophication, with consequent severe ecological damage. It is therefore necessary to analyze and understand in detail the use and management of P, to identify the processes that should be targeted in order to reduce the overall consumption of this resource. This work aims at establishing a generic quasi-dynamic model that describes the Austrian P budget and allows investigating past trends of P use as well as selected future scenarios. Given the importance of P throughout the whole anthropogenic metabolism, the model is based on a comprehensive system that encompasses several economic sectors, from agriculture and animal husbandry to industry, consumption, and waste and wastewater treatment. Furthermore, it includes the hydrosphere, to assess the losses of P into water bodies, given the importance of eutrophication problems. The methodology applied is Material Flow Analysis (MFA), a systemic approach to assess and balance the stocks and flows of a material within a system defined in space and time. The model is implemented in the software STAN, freeware tailor-made for MFA. Particular attention is paid to the characteristics and quality of the data, in order to include data uncertainty and error propagation in the dynamic balance.
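The core MFA bookkeeping is a mass balance per process, with flow uncertainties propagated into the balance. The toy example below balances phosphorus over a single hypothetical process and combines independent 1-sigma flow uncertainties in quadrature; all flow names and values are illustrative, not Austrian data, and tools like STAN additionally reconcile conflicting measurements.

```python
import math

# A toy material-flow-analysis balance in the spirit of the model described
# above: for one process, the phosphorus stock change is inflows minus
# outflows, and independent flow uncertainties combine in quadrature.
# Flow values (kt P / yr) are purely illustrative.

inflows = {"mineral fertilizer": (35.0, 3.0),   # (value, 1-sigma uncertainty)
           "manure":             (50.0, 8.0)}
outflows = {"harvest":           (60.0, 5.0),
            "losses to water":   (5.0,  2.0)}

stock_change = sum(v for v, _ in inflows.values()) - \
               sum(v for v, _ in outflows.values())
uncertainty = math.sqrt(sum(u**2 for _, u in inflows.values()) +
                        sum(u**2 for _, u in outflows.values()))
print(stock_change, uncertainty)  # net P accumulation and its 1-sigma error
```

Chaining such balances across linked processes, with stocks carried between years, is what turns the static balance into the quasi-dynamic model the abstract describes.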

  16. Adaptive illumination source for multispectral vision system applied to material discrimination

    NASA Astrophysics Data System (ADS)

    Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.

    2008-04-01

A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color, and infrared imaging have been successfully applied to this task. This work proposes a different approach, in which the wavelengths relevant to the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response over the entire range of the selected wavelengths. Finally, the resulting multispectral planes are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled, specific illumination achieves multispectral imaging with a simple monochrome camera, and cold illumination restricted to the relevant wavelengths, which is desirable in the food and beverage industry. The proposed system has been successfully tested for the automatic detection of foreign objects in the tobacco processing industry.
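The SAM classification step measures the angle between a pixel's spectrum, sampled at the selected LED wavelengths, and each reference spectrum, then assigns the material with the smallest angle. A minimal sketch with made-up four-band spectra:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between two spectra."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify(pixel, library):
    """Assign the material whose reference spectrum makes the smallest angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Illustrative reference spectra at four hypothetical LED wavelengths.
library = {
    "tobacco": np.array([0.80, 0.60, 0.30, 0.10]),
    "plastic": np.array([0.20, 0.25, 0.70, 0.90]),
}
pixel = np.array([0.75, 0.55, 0.35, 0.15])  # measured reflectance of one pixel
print(classify(pixel, library))
```

Because SAM compares directions rather than magnitudes, it is relatively insensitive to overall illumination intensity, which suits a sequentially switched LED source.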

  17. Photometric analysis of the system Kepler-1

    NASA Astrophysics Data System (ADS)

    Budding, E.; Rhodes, M. D.; Püsküllü, Ç.; Ji, Y.; Erdem, A.; Banks, T.

    2016-10-01

We have applied the close binary system analysis program WinFitter to an intensive study of Kepler-1 (= TrES-2) using all the available photometry (14 quarters; 1570640 measures) from the NASA Exoplanet Archive (NEA) at the Caltech website http://exoplanetarchive.ipac.caltech.edu. The mean individual data-point error of the normalized flux values is 0.00026, leading to the model's specification for the mean reference flux of the system to an accuracy of ~0.5 ppm. Completion of the analysis requires a number of prior quantities, relating mainly to the host star, that are adopted from relevant literature.

  18. Design of multivariable feedback control systems via spectral assignment. [as applied to aircraft flight control

    NASA Technical Reports Server (NTRS)

    Liberty, S. R.; Mielke, R. R.; Tung, L. J.

    1981-01-01

    Applied research in the area of spectral assignment in multivariable systems is reported. A frequency domain technique for determining the set of all stabilizing controllers for a single feedback loop multivariable system is described. It is shown that decoupling and tracking are achievable using this procedure. The technique is illustrated with a simple example.

  19. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To this end, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  20. Design and Analysis of a Thrust Vector Mechanism Applied in a Flying Wing

    NASA Astrophysics Data System (ADS)

    Zhu, Yanhe; Gao, Liang; Wang, Hongwei; Zhao, Jie

This paper presents the design and analysis of a thrust vector mechanism applied in a flying wing. A thrust vector mechanism driven by two servos is developed. An analysis of the difference in minimum hovering radius between a conventional flying wing and one with the thrust vector mechanism is given and validated with simulation. It is shown that thrust vectoring has clear advantages over the conventional flying wing, decreasing both the hovering radius and the roll angle. These benefits should improve maneuverability and agility.

  1. Computational analysis of fluid flow within a device for applying biaxial strain to cultured cells.

    PubMed

    Lee, Jason; Baker, Aaron B

    2015-05-01

In vitro systems for applying mechanical strain to cultured cells are commonly used to investigate cellular mechanotransduction pathways in a variety of cell types. These systems often apply mechanical forces to a flexible membrane on which cells are cultured. A consequence of the motion of the membrane in these systems is the generation of flow and the unintended application of shear stress to the cells. We recently described a flexible system for applying mechanical strain to cultured cells, which uses a linear motor to drive a piston array to create biaxial strain within multiwell culture plates. To better understand the fluidic stresses generated by this system and other systems of this type, we created a computational fluid dynamics model to simulate the flow during the mechanical loading cycle. Alterations in the frequency or maximal strain magnitude led to a linear increase in the average fluid velocity within the well and a nonlinear increase in the shear stress at the culture surface over the ranges tested (0.5-2.0 Hz and 1-10% maximal strain). For all cases, the applied shear stresses were relatively low, on the order of millipascals, with a dynamic waveform having a primary and secondary peak in the shear stress over a single mechanical strain cycle. These findings should be considered when interpreting experimental results using these devices, particularly when the cell type used is sensitive to low-magnitude oscillatory shear stresses. PMID:25611013

  2. Appliance of Independent Component Analysis to System Intrusion Analysis

    NASA Astrophysics Data System (ADS)

    Ishii, Yoshikazu; Takagi, Tarou; Nakai, Kouji

In order to analyze the output of intrusion detection systems and firewalls, we evaluated the applicability of ICA (independent component analysis). We developed a simulator for evaluating intrusion analysis methods. The simulator consists of a network model of an information system, service and vulnerability models for each server, and models of the actions performed by clients and an intruder. We applied ICA to the audit trail of the simulated information system and report the evaluation results. In the simulated case, ICA separated two attacks correctly and related an attack to the anomalies produced in a normal application under the influence of that attack.

  3. Quantitative tools for comparing animal communication systems: information theory applied to bottlenose dolphin whistle repertoires.

    PubMed

McCowan; Hanser; Doyle

    1999-02-01

Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has been generally hampered by both a lack of sufficiently extensive data sets and appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory in the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin (Tursiops truncatus) whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf, 1949, Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it potentially can be used to compare the organizational complexity of vocal repertoires across a diversity of species. Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary…
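The first-order statistics described above can be sketched directly: the Shannon entropy of a categorized repertoire, and the slope of a Zipf-type log-frequency versus log-rank diagram. The toy symbol sequence below stands in for a categorized whistle set; it is not dolphin data:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """First-order Shannon entropy (bits) of a symbol sequence."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_slope(sequence):
    """Least-squares slope of log(frequency) vs log(rank); roughly -1 for Zipf-like repertoires."""
    freqs = sorted(Counter(sequence).values(), reverse=True)
    xs = [math.log10(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log10(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

whistles = list("AAAABBBCCD")  # toy categorized whistle types
print(shannon_entropy(whistles), zipf_slope(whistles))
```

Higher-order entropies extend the same idea to n-grams of symbols, which is why they demand much larger data sets than the first-order measures shown here.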

  4. Expert systems and ballistic range data analysis

    NASA Astrophysics Data System (ADS)

    Hathaway, Wayne; Steinhoff, Mark; Whyte, Robert; Brown, David; Choate, Jeff; Adelgren, Russ

    1992-07-01

    A program aimed at the development of an expert system for the reduction of ballistic range data is described. The program applies expert system and artificial intelligence techniques to develop a mathematically complex state-of-the-art spark range data reduction procedure that includes linear theory and six-degree-of-freedom analysis. The scope of the knowledge base includes both spin and statically stable vehicles. The expert system is expected to improve the quality of the data reduction process while reducing the work load on the senior range engineer.

  5. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    PubMed

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  6. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    PubMed Central

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  7. Laser rocket system analysis

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The laser rocket systems investigated in this study were for orbital transportation using space-based, ground-based and airborne laser transmitters. The propulsion unit of these systems utilizes a continuous wave (CW) laser beam focused into a thrust chamber which initiates a plasma in the hydrogen propellant, thus heating the propellant and providing thrust through a suitably designed nozzle and expansion skirt. The specific impulse is limited only by the ability to adequately cool the thruster and the amount of laser energy entering the engine. The results of the study showed that, with advanced technology, laser rocket systems with either a space- or ground-based laser transmitter could reduce the national budget allocated to space transportation by 10 to 345 billion dollars over a 10-year life cycle when compared to advanced chemical propulsion systems (LO2-LH2) of equal capability. The variation in savings depends upon the projected mission model.

  8. Dynamics of a variable mass system applied to spacecraft rocket attitude theory

    NASA Astrophysics Data System (ADS)

    Mudge, Jason Dominic

This research project is a study of the dynamics of a variable mass system. The scope of this research project is to gain understanding of how a variable mass system behaves. The intent is to bring the level of understanding of variable mass dynamics closer to that of constant mass dynamics, in the area of spacecraft in particular. A main contribution is the finding of a set of criteria to minimize or eliminate the deviation of the nutation angle (or cone angle, or angle of attack) of spacecraft rockets passively, i.e. without active control. The motivation for this research project is the Star 48 anomaly. The Star 48 is a solid rocket motor which propelled (boosted) communication satellites from low earth orbit to a higher one during the 1980's. The anomaly is that while the spacecraft rocket is being propelled, the nutation angle may deviate excessively, which is considered undesirable. In the first part of this research project, a variable mass system is described and defined and the governing equations are derived. The type of governing equations derived are those that are most useful for analyzing the motion of a spacecraft rocket. The method of derivation makes use of the Leibnitz theorem, the divergence theorem, and Newton's second law of motion. Next, the governing equations are specialized with several assumptions which are generally accepted in the analysis of spacecraft rockets. With these assumptions, the form of the governing equations is discussed and then the equations are solved analytically for the system's angular velocity. Having solved for the angular velocity of the system, the attitude of the system is obtained using a unique method which circumvents the nonlinearities that arise when using Euler angles and their kinematical equations. The attitude is found approximately in analytical form, and a set of criteria is discussed which will minimize or eliminate the deviation of the nutation angle of a spacecraft rocket. Finally…

  9. Power Plant Systems Analysis

    NASA Technical Reports Server (NTRS)

    Williams, J. R.; Yang, Y. Y.

    1973-01-01

    Three basic thermodynamic cycles of advanced nuclear MHD power plant systems are studied. The effect of reactor exit temperature and space radiator temperature on the overall thermal efficiency of a regenerative turbine compressor power plant system is shown. The effect of MHD pressure ratio on plant efficiency is also described, along with the dependence of MHD power output, compressor power requirement, turbine power output, mass flow rate of H2, and overall plant efficiency on the reactor exit temperature for a specific configuration.

  10. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  11. Says Who?: Students Apply Their Critical-Analysis Skills to Fight Town Hall

    ERIC Educational Resources Information Center

    Trimarchi, Ruth

    2002-01-01

    For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…

  12. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  13. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  14. Applying Socio-Identity Analysis to Counseling Practice and Preparation: A Review of Four Techniques.

    ERIC Educational Resources Information Center

    Johnson, Samuel D., Jr.

    1990-01-01

    Reviews four training strategies for applying socioidentity analysis to multicultural counseling; the Clarification Group (C Group); the Personal Dimensions of Difference Self-Inventory (PDD); the Multifactor Needs Assessment; and the Cultural Grid. Each highlights a slightly different aspect of the complex matrix of relationships that define the…

  15. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) has on our understanding and treatment of it in a transdisciplinary context. PMID:22478522

  16. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  17. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  18. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
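One of the topics listed, how many runs are necessary for requirement verification, is commonly answered with the zero-failure binomial bound; the sketch below uses that standard formula, which may differ in detail from the TP's own derivation, and the reliability and confidence values are illustrative:

```python
import math

def runs_required(p_success, confidence):
    """Zero-failure ('success run') sample size: smallest n such that
    1 - p_success**n >= confidence. If all n Monte Carlo runs pass, the
    requirement p_success is demonstrated at the stated confidence level."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p_success))

# e.g. demonstrating 0.997 reliability at 90% confidence (illustrative numbers)
print(runs_required(0.997, 0.90))
```

The rapid growth of this count as p_success approaches 1 is one reason the TP discusses importance sampling for events that happen only rarely.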

  19. Applying Systems Theory to Systemic Change: A Generic Model for Educational Reform.

    ERIC Educational Resources Information Center

    Hansen, Joe B.

    Although educational reformers frequently use the words "system,""systemic change," and "systemic approach," many lack a fundamental understanding of the systems concept. This paper describes the application of systems theory to the problems of educational reform and educational assessment. It introduces basic concepts and principles and describes…

  20. The Art World's Concept of Negative Space Applied to System Safety Management

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald (Ronnie)

    2005-01-01

    Tools from several different disciplines can improve system safety management. This paper relates the Art World with our system safety world, showing useful art schools of thought applied to system safety management, developing an art theory-system safety bridge. This bridge is then used to demonstrate relations with risk management, the legal system, personnel management and basic management (establishing priorities). One goal of this presentation/paper is simply to be a fun diversion from the many technical topics presented during the conference.

  1. Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Pease, P. B.

    1989-01-01

    Version 4.1 of LAS provides flexible framework for algorithm development and processing and analysis of image data. Over 500,000 lines of code enable image repair, clustering, classification, film processing, geometric registration, radiometric correction, and manipulation of image statistics.

  2. Exergy Analysis of Rocket Systems

    NASA Technical Reports Server (NTRS)

    Gilbert, Andrew; Mesmer, Bryan; Watson, Michael D.

    2015-01-01

Exergy is defined as the useful work available from a system in a specified environment. Exergy analysis allows comparison between different system designs, and comparison of subsystem efficiencies within a design. The proposed paper explores the relationship between the fundamental rocket equation and an exergy balance equation. A previously derived exergy equation related to rocket systems is investigated, and a higher-fidelity analysis will be derived. The exergy assessments will enable informed, value-based decision making when comparing alternative rocket system designs, and will allow the most efficient configuration among candidate configurations to be determined.
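The connection between the rocket equation and an energy-based assessment can be illustrated by computing the Tsiolkovsky delta-v alongside the kinetic energy carried by the expelled propellant; the stage masses and specific impulse below are illustrative, not taken from the paper:

```python
import math

def delta_v(isp, m0, mf, g0=9.80665):
    """Tsiolkovsky rocket equation: ideal velocity change (m/s)."""
    return isp * g0 * math.log(m0 / mf)

def exhaust_kinetic_energy(m_prop, ve):
    """Kinetic energy of the expelled propellant stream (J), one simple
    measure of the work invested in propulsion."""
    return 0.5 * m_prop * ve ** 2

# Illustrative LH2/LO2-class stage: Isp 450 s, 500 t initial, 120 t final mass.
isp, m0, mf = 450.0, 500e3, 120e3
ve = isp * 9.80665                       # effective exhaust velocity
dv = delta_v(isp, m0, mf)
energy = exhaust_kinetic_energy(m0 - mf, ve)
print(round(dv), "m/s,", round(energy / 1e9, 1), "GJ")
```

Comparing candidate configurations by such energy figures, relative to the useful work delivered to the payload, is the kind of efficiency comparison an exergy balance formalizes.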

  3. Applying cognitive load theory to the redesign of a conventional database systems course

    NASA Astrophysics Data System (ADS)

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional structure for a database course, covering database design first, then database development. Analysis showed the conventional course content was appropriate but the instructional materials used were too complex, especially for novice students. The redesign of instructional materials applied CLT to remove split attention and redundancy effects, to provide suitable worked examples and sub-goals, and included an extensive re-sequencing of content. The approach was primarily directed towards mid- to lower performing students and results showed a significant improvement for this cohort with the exam failure rate reducing by 34% after the redesign on identical final exams. Student satisfaction also increased and feedback from subsequent study was very positive. The application of CLT to the design of instructional materials is discussed for delivery of technical courses.

  4. The pyramid system for multiscale raster analysis

    USGS Publications Warehouse

    De Cola, L.; Montagne, N.

    1993-01-01

Geographical research requires the management and analysis of spatial data at multiple scales. As part of the U.S. Geological Survey's global change research program, a software system has been developed that reads raster data (such as an image or digital elevation model) and produces a pyramid of aggregated lattices as well as various measurements of spatial complexity. For a given raster dataset the system uses the pyramid to report: (1) mean, (2) variance, (3) a spatial autocorrelation parameter based on multiscale analysis of variance, and (4) a monofractal scaling parameter based on the analysis of isoline lengths. The system is applied to 1-km digital elevation model (DEM) data for a 256-km² region of central California, as well as to 64 partitions of the region. PYRAMID, which offers robust descriptions of data complexity, is also used to describe the behavior of topographic aspect with scale.
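The pyramid construction, successive aggregation of the raster with per-level statistics, can be sketched as 2x2 block means; the toy grid below stands in for a DEM tile, and reporting only mean and variance is a simplification of the four statistics listed above:

```python
import numpy as np

def pyramid(raster):
    """Aggregate a square raster by 2x2 block means; report (size, mean, variance)
    for each level of the pyramid, from finest to coarsest."""
    levels = [raster.astype(float)]
    while levels[-1].shape[0] > 1:
        a = levels[-1]
        n = a.shape[0] // 2
        # Average each 2x2 block into one cell of the next-coarser lattice.
        coarse = a[:2 * n, :2 * n].reshape(n, 2, n, 2).mean(axis=(1, 3))
        levels.append(coarse)
    return [(lvl.shape[0], lvl.mean(), lvl.var()) for lvl in levels]

dem = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "elevation" grid
for size, mean, var in pyramid(dem):
    print(size, mean, var)
```

The mean is preserved across levels while the variance decays with aggregation; how fast it decays is exactly what a multiscale analysis-of-variance autocorrelation parameter summarizes.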

  5. Miniaturized flow injection analysis system

    DOEpatents

    Folta, James A.

    1997-01-01

A chemical analysis technique known as flow injection analysis, wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38×25×3 mm, but can be designed for gas analysis and be substantially smaller in construction.

  6. Miniaturized flow injection analysis system

    DOEpatents

    Folta, J.A.

    1997-07-01

A chemical analysis technique known as flow injection analysis is described, wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38×25×3 mm, but can be designed for gas analysis and be substantially smaller in construction. 9 figs.

  7. Fundamental Study on Saving Energy for Electrified Railway System Applying High Temperature Superconductor Motor and Energy Storage System

    NASA Astrophysics Data System (ADS)

    Konishi, Takeshi; Nakamura, Taketsune; Amemiya, Naoyuki

Induction motors, rather than dc motors, have been widely applied in dc electric rolling stock because of their utility and efficiency. However, further improvement of motor characteristics will be required to realize an environment-friendly dc railway system in the future, so it is important to study more efficient machines for dc electric rolling stock as part of a next-generation high-performance system. Methods to effectively reuse the regenerative energy produced by the motors are equally important. We therefore carried out a fundamental study on saving energy in electrified railway systems. As a first step, we introduced an energy storage system based on electric double-layer capacitors (EDLC), together with its control system. We then derived the specification of a high temperature superconductor induction/synchronous motor (HTS-ISM) whose performance is similar to that of conventional induction motors. Finally, we evaluated, by simulation, an electrified railway system applying the energy storage system and the HTS-ISM, and showed the effectiveness of introducing both into a dc electrified railway system.

  8. Operationalizing sustainability in urban coastal systems: a system dynamics analysis.

    PubMed

    Mavrommati, Georgia; Bithas, Kostas; Panayiotidis, Panayiotis

    2013-12-15

    We propose a system dynamics approach for Ecologically Sustainable Development (ESD) in urban coastal systems. A systematic analysis based on theoretical considerations, policy analysis and experts' knowledge is followed in order to define the concept of ESD. The principles underlying ESD feed the development of a System Dynamics Model (SDM) that connects the pollutant loads produced by urban systems' socioeconomic activities with the ecological condition of the coastal ecosystem, which is delineated in operational terms through key biological elements defined by the EU Water Framework Directive. The receiving waters of the Athens Metropolitan area, which bear the elements of a typical high-population-density Mediterranean coastal city but currently also have new dynamics induced by the ongoing financial crisis, are used as an experimental system for testing a system dynamics approach to applying the concept of ESD. Systems thinking is employed to represent the complex relationships among the components of the system, revealing the interconnections and dependencies that determine the potential for achieving ESD. The proposed system dynamics analysis can help decision makers define paths of development that comply with the principles of ESD. PMID:24200010
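
    The stock-and-flow structure at the heart of an SDM can be sketched in a few lines. The single pollutant stock, constant load, and assimilation rate below are illustrative assumptions, not parameters from the Athens study:

```python
def simulate(years=20, dt=1.0, load=100.0, assimilation=0.3):
    """Euler integration of one pollutant stock:
       d(pollutant)/dt = load - assimilation * pollutant."""
    pollutant, history = 0.0, []
    for _ in range(int(years / dt)):
        pollutant += (load - assimilation * pollutant) * dt
        history.append(pollutant)
    return history

h = simulate()
# The stock approaches its equilibrium, load / assimilation ≈ 333.3
print(round(h[-1], 1))
```

A full SDM chains many such stocks (population, economic activity, pollutant loads, biological quality elements) through feedback loops, but each link has this same accumulation form.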

  9. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    SciTech Connect

    Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
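
    The BSS idea behind ICA can be sketched with a minimal fixed-point FastICA using a cubic nonlinearity (this is a generic textbook procedure, not the authors' PSR analysis; the mixing matrix and waveforms are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)
S = np.vstack([np.sign(np.sin(3 * t)),       # square wave (sub-Gaussian)
               np.mod(2 * t, 2) - 1])        # sawtooth
S = S + 0.02 * rng.standard_normal(S.shape)  # small sensor noise
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # unknown mixing matrix
X = A @ S                                    # observed mixed signals

# Whiten the observations
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# One-unit FastICA with deflation, nonlinearity g(u) = u**3
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        w_new = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # deflate against found rows
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1) < 1e-10:
            w = w_new
            break
        w = w_new
    W[i] = w

recovered = W @ Z   # estimated sources, up to permutation/sign/scale
```

Each recovered row correlates strongly with one original source, which is the property exploited when slices along the bunch are unmixed into betatron and longitudinal signals.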

  10. Canister Model, Systems Analysis

    1993-09-29

    This package provides a computer simulation of a systems model for packaging nuclear waste and spent nuclear fuel in canisters. The canister model calculates overall programmatic cost, number of canisters, and fuel and waste inventories for the Idaho Chemical Processing Plant (other initial conditions can be entered).

  11. WASTE COMBUSTION SYSTEM ANALYSIS

    EPA Science Inventory

    The report gives results of a study of biomass combustion alternatives. The objective was to evaluate the thermal performance and costs of available and developing biomass systems. The characteristics of available biomass fuels were reviewed, and the performance parameters of alt...

  12. Expert system interaction with existing analysis codes

    SciTech Connect

    Ransom, V.H.; Fink, R.K.; Bertch, W.J.; Callow, R.A.

    1986-01-01

    Coupling expert systems with existing engineering analysis codes is a promising area in the field of artificial intelligence. The added intelligence can provide for easier and less costly use of the code and also reduce the potential for code misuse. This paper will discuss the methods available to allow interaction between an expert system and a large analysis code running on a mainframe. Concluding remarks will identify potential areas of expert system application with specific areas that are being considered in a current research program. The difficulty of interaction between an analysis code and an expert system is due to the incompatibility between the FORTRAN environment used for the analysis code and the AI environment used for the expert system. Three methods, excluding file transfer techniques, are discussed to help overcome this incompatibility. The first method is linking the FORTRAN routines to the LISP environment on the same computer. Various LISP dialects available on mainframes and their interlanguage communication capabilities are discussed. The second method involves network interaction between a LISP machine and a mainframe computer. Comparisons between the linking method and networking are noted. The third method involves the use of an expert system tool that is compatible with a FORTRAN environment. Several available tools are discussed. With the interaction methods identified, several potential application areas are considered. Selection of the specific areas that will be developed for the pilot project and applied to a thermal-hydraulic energy analysis code is noted.

  13. Integrated systems analysis of the PIUS reactor

    SciTech Connect

    Fullwood, F.; Kroeger, P.; Higgins, J.

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  14. Systems analysis-independent analysis and verification

    SciTech Connect

    Badin, J.S.; DiPietro, J.P.

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid systems for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented.
Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  15. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  16. Energy-Systems Economic Analysis

    NASA Technical Reports Server (NTRS)

    Doane, J.; Slonski, M. L.; Borden, C. S.

    1982-01-01

    The Energy Systems Economic Analysis (ESEA) program is a flexible analytical tool for rank ordering of alternative energy systems. The basic ESEA approach derives an estimate of the costs incurred as a result of purchasing, installing, and operating an energy system. These costs, suitably aggregated into yearly costs over the lifetime of the system, are divided by expected yearly energy output to determine busbar energy costs. ESEA, developed in 1979, is written in FORTRAN IV for batch execution.
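
    The busbar-cost idea (lifetime costs aggregated into yearly costs, divided by yearly energy output) can be sketched as follows. Annualizing capital with a capital recovery factor is an assumption here, and all numbers are illustrative, not ESEA's actual inputs:

```python
def busbar_cost(capital, annual_om, life_years, discount_rate, annual_kwh):
    """Yearly-cost aggregation divided by yearly energy output ($/kWh).
    Capital is annualized with a capital recovery factor (an assumption;
    ESEA's own aggregation scheme may differ in detail)."""
    q = (1 + discount_rate) ** life_years
    crf = discount_rate * q / (q - 1)          # capital recovery factor
    return (capital * crf + annual_om) / annual_kwh

# Hypothetical system: $50,000 installed, $1,200/yr O&M, 20 yr life,
# 8% discount rate, 40,000 kWh/yr output
print(f"{busbar_cost(50_000, 1_200, 20, 0.08, 40_000):.4f} $/kWh")
```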

  17. Controlled ecological life support system: Transportation analysis

    NASA Technical Reports Server (NTRS)

    Gustan, E.; Vinopal, T.

    1982-01-01

    This report discusses a study utilizing a systems analysis approach to determine which NASA missions would benefit from controlled ecological life support system (CELSS) technology. The study focuses on manned missions selected from NASA planning forecasts covering the next half century. Comparison of various life support scenarios for the selected missions and characteristics of projected transportation systems provided data for cost evaluations. This approach identified missions that derived benefits from a CELSS, showed the magnitude of the potential cost savings, and indicated which system or combination of systems would apply. This report outlines the analytical approach used in the evaluation, describes the missions and systems considered, and sets forth the benefits derived from CELSS when applicable.

  18. Applied Nuclear Accountability Systems: A Case Study in the System Architecture and Development of NuMAC

    SciTech Connect

    Campbell, Andrea Beth

    2004-07-01

    This is a case study of the NuMAC nuclear accountability system developed at a private fuel fabrication facility. This paper investigates nuclear material accountability and safeguards by researching expert knowledge applied in the system design and development. Presented is a system developed to detect and deter the theft of weapons grade nuclear material. Examined is the system architecture that includes: issues for the design and development of the system; stakeholder issues; how the system was built and evolved; software design, database design, and development tool considerations; security and computing ethics. (author)

  19. Expert Meeting Report: Recommendations for Applying Water Heaters in Combination Space and Domestic Water Heating Systems

    SciTech Connect

    Rudd, A.; Ueno, K.; Bergey, D.; Osser, R.

    2012-07-01

    The topic of this meeting was 'Recommendations For Applying Water Heaters In Combination Space And Domestic Water Heating Systems.' Presentations and discussions centered on the design, performance, and maintenance of these combination systems, with the goal of developing foundational information toward the development of a Building America Measure Guideline on this topic. The meeting was held at the Westford Regency Hotel, in Westford, Massachusetts on 7/31/2011.

  20. A generic organ based ontology system, applied to vertebrate heart anatomy, development and physiology.

    PubMed

    Bertens, Laura M F; Slob, Joris; Verbeek, Fons J

    2011-09-08

    We present a novel approach to modelling biological information using ontologies. The system interlinks three ontologies, comprising anatomical, developmental and taxonomical information, and includes instances of structures for different species. The framework is constructed for comparative analyses in the field of evolutionary development. We have applied the approach to the vertebrate heart and present four case studies of the functionality of the system, focusing on cross-species comparisons, developmental studies, physiological studies and 3D visualisation.

  1. Critical Education for Systemic Change: A World-Systems Analysis Perspective

    ERIC Educational Resources Information Center

    Griffiths, Tom G.

    2015-01-01

    This paper both draws on, and seeks to apply, world-systems analysis to a broad, critical education project that builds mass schooling's potential contribution to the process of world-systemic change. In short, this is done by first setting out the world-systems analysis account of the current state, and period of transition, of the capitalist…

  2. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to... § 63.10(b)(1). (e) The reference to the periodic report required by § 63.152(c) of subpart G of...

  3. Teaching Applied Genetics and Molecular Biology to Agriculture Engineers. Application of the European Credit Transfer System

    ERIC Educational Resources Information Center

    Weiss, J.; Egea-Cortines, M.

    2008-01-01

    We have been teaching applied molecular genetics to engineers and adapted the teaching methodology to the European Credit Transfer System. We teach core principles of genetics that are universal and form the conceptual basis of most molecular technologies. The course then teaches widely used techniques and finally shows how different techniques…

  4. Toward a Blended Ontology: Applying Knowledge Systems to Compare Therapeutic and Toxicological Nanoscale Domains

    EPA Science Inventory

    Bionanomedicine and environmental research share a need for common terms and ontologies. This study applied the knowledge systems, data mining, and bibliometrics used in nano-scale ADME research from 1991 to 2011. The prominence of nano-ADME in environmental research began to exceed the pu...

  5. Embracing Connectedness and Change: A Complex Dynamic Systems Perspective for Applied Linguistic Research

    ERIC Educational Resources Information Center

    Cameron, Lynne

    2015-01-01

    Complex dynamic systems (CDS) theory offers a powerful metaphorical model of applied linguistic processes, allowing holistic descriptions of situated phenomena, and addressing the connectedness and change that often characterise issues in our field. A recent study of Kenyan conflict transformation illustrates application of a CDS perspective. Key…

  6. Systems analysis for DSN microwave antenna holography

    NASA Technical Reports Server (NTRS)

    Rochblatt, D. J.

    1989-01-01

    Proposed systems for Deep Space Network (DSN) microwave antenna holography are analyzed. Microwave holography, as applied to antennas, is a technique which utilizes the Fourier Transform relation between the complex far-field radiation pattern of an antenna and the complex aperture field distribution to provide a methodology for the analysis and evaluation of antenna performance. Resulting aperture phase and amplitude distribution data are used to precisely characterize various crucial performance parameters, including panel alignment, subreflector position, antenna aperture illumination, directivity at various frequencies, and gravity deformation. Microwave holographic analysis provides diagnostic capacity as well as being a powerful tool for evaluating antenna design specifications and their corresponding theoretical models.
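
    The Fourier Transform relation underlying microwave holography can be illustrated numerically: a synthetic aperture field with a small phase bump (standing in for a panel misalignment; all sizes and values are invented, not DSN data) transforms to a far-field pattern, and the inverse transform recovers the aperture phase map:

```python
import numpy as np

# Synthetic 64x64 circular aperture with a localized phase bump
n = 64
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
inside = x**2 + y**2 <= 1
bump = 0.3 * np.exp(-((x - 0.4) ** 2 + y**2) / 0.02)
aperture = np.where(inside, np.exp(1j * bump), 0)

# Far-field radiation pattern = Fourier transform of the aperture field
far_field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))

# Holographic inversion: the inverse transform recovers amplitude and phase
recovered = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(far_field)))
phase_map = np.angle(recovered)  # would reveal the panel misalignment
print(np.allclose(recovered, aperture))
```

In practice only the far-field pattern is measured (with limited sampling and noise), and the recovered phase map is what feeds panel alignment and subreflector diagnostics.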

  7. Moment analysis method as applied to the 2S --> 2P transition in cryogenic alkali metal/rare gas matrices.

    PubMed

    Terrill Vosbein, Heidi A; Boatz, Jerry A; Kenney, John W

    2005-12-22

    The moment analysis method (MA) has been tested for the case of 2S --> 2P ([core]ns1 --> [core]np1) transitions of alkali metal atoms (M) doped into cryogenic rare gas (Rg) matrices using theoretically validated simulations. Theoretical/computational M/Rg system models are constructed with precisely defined parameters that closely mimic known M/Rg systems. Monte Carlo (MC) techniques are then employed to generate simulated absorption and magnetic circular dichroism (MCD) spectra of the 2S --> 2P M/Rg transition to which the MA method can be applied with the goal of seeing how effective the MA method is in re-extracting the M/Rg system parameters from these known simulated systems. The MA method is summarized in general, and an assessment is made of the use of the MA method in the rigid shift approximation typically used to evaluate M/Rg systems. The MC-MCD simulation technique is summarized, and validating evidence is presented. The simulation results and the assumptions used in applying MA to M/Rg systems are evaluated. The simulation results on Na/Ar demonstrate that the MA method does successfully re-extract the 2P spin-orbit coupling constant and Landé g-factor values initially used to build the simulations. However, assigning physical significance to the cubic and noncubic Jahn-Teller (JT) vibrational mode parameters in cryogenic M/Rg systems is not supported.

  8. Complex System Ensemble Analysis

    NASA Astrophysics Data System (ADS)

    Pearson, Carl A.

    A new measure for interaction network ensembles and their dynamics is presented: the ensemble transition matrix, T, the proportions of networks in an ensemble that support particular transitions. That presentation begins with generation of the ensemble and application of constraint perturbations to compute T, including estimating alternatives to accommodate cases where the problem size becomes computationally intractable. Then, T is used to predict ensemble dynamics properties in expected-value like calculations. Finally, analyses from the two complementary assumptions about T - that it represents uncertainty about a unique system or that it represents stochasticity around a unique constraint - are presented: entropy-based experiment selection and generalized potentials/heat dissipation of the system, respectively. Extension of these techniques to more general graph models is described, but not demonstrated. Future directions for research using T are proposed in the summary chapter. Throughout this work, the presentation of various calculations involving T are motivated by the Budding Yeast Cell Cycle example, with argument for the generality of the approaches presented by the results of their application to a database of pseudo-randomly generated dynamic constraints.
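
    The ensemble transition matrix T can be sketched for a toy ensemble of random Boolean networks (a stand-in for the dynamic constraints discussed; the network model, sizes, and seed are illustrative assumptions):

```python
import itertools
import random

N = 3  # nodes per network
STATES = list(itertools.product((0, 1), repeat=N))

def random_network(rng):
    """Synchronous Boolean network: one random truth table per node."""
    return [{s: rng.randint(0, 1) for s in STATES} for _ in range(N)]

def step(net, state):
    """One synchronous update of the whole network."""
    return tuple(rule[state] for rule in net)

rng = random.Random(42)
ensemble = [random_network(rng) for _ in range(500)]

# T[s][s']: fraction of ensemble members whose dynamics map s to s'
T = {s: {u: 0.0 for u in STATES} for s in STATES}
for net in ensemble:
    for s in STATES:
        T[s][step(net, s)] += 1.0 / len(ensemble)

print(sum(T[STATES[0]].values()))  # each row is a distribution over successors
```

Expected-value-style calculations over ensemble dynamics then amount to propagating state distributions through these rows.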

  9. Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation

    NASA Astrophysics Data System (ADS)

    Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.

    2012-12-01

    The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to the incomplete understanding of various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation would present a challenge when applied to the atmosphere, which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single particle analytical methods to relate particle physical and chemical properties with observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single particle characterization is provided by computer controlled scanning electron microscopy with energy dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor controlled cooling-stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation including immersion freezing and deposition ice nucleation as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. by the presence of insoluble particles. For T < 230 K the particles initiate deposition ice nucleation well below the homogeneous freezing limit.
Particles collected throughout one day for similar meteorological conditions show very similar

  10. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  11. Information systems vulnerability: A systems analysis perspective

    SciTech Connect

    Wyss, G.D.; Daniel, S.L.; Schriner, H.K.; Gaylor, T.R.

    1996-07-01

    Vulnerability analyses for information systems are complicated because the systems are often geographically distributed. Sandia National Laboratories has assembled an interdisciplinary team to explore the applicability of probabilistic logic modeling (PLM) techniques (including vulnerability and vital area analysis) to examine the risks associated with networked information systems. The authors have found that the reliability and failure modes of many network technologies can be effectively assessed using fault trees and other PLM methods. The results of these models are compatible with an expanded set of vital area analysis techniques that can model both physical locations and virtual (logical) locations to identify both categories of vital areas simultaneously. These results can also be used with optimization techniques to direct the analyst toward the most cost-effective security solution.
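
    For independent basic events, the fault-tree side of probabilistic logic modeling reduces to simple AND/OR gate probability algebra. The top event and failure probabilities below are hypothetical, not from the Sandia study:

```python
from functools import reduce

def p_or(*probs):
    """OR gate: probability that at least one independent event occurs."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def p_and(*probs):
    """AND gate: probability that all independent events occur."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical top event: "network service unavailable" =
#   (router fails OR primary link fails) AND backup path fails
router, link, backup = 1e-3, 5e-3, 1e-2
top = p_and(p_or(router, link), backup)
print(f"top event probability = {top:.3e}")
```

Real PLM tools layer cut-set enumeration and common-cause treatment on top of this algebra, but every gate evaluation bottoms out in combinations like these.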

  12. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  13. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included hand searches of sources (e.g., journals, reference lists), search terms (i.e., early, applied, behavioral, research, literature), inclusion criteria (e.g., the field's applied dimension), and challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  14. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, and the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity through grey-scale binning of the acquired SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is

  15. Thermal Analysis System

    NASA Technical Reports Server (NTRS)

    DiStefano, III, Frank James (Inventor); Wobick, Craig A. (Inventor); Chapman, Kirt Auldwin (Inventor); McCloud, Peter L. (Inventor)

    2014-01-01

    A thermal fluid system modeler including a plurality of individual components. A solution vector is configured and ordered as a function of one or more inlet dependencies of the plurality of individual components. A fluid flow simulator simulates thermal energy being communicated with the flowing fluid and between first and second components of the plurality of individual components. The simulation extends from an initial time to a later time step and bounds heat transfer to be substantially between the flowing fluid, walls of tubes formed in each of the individual components of the plurality, and between adjacent tubes. Component parameters of the solution vector are updated with simulation results for each of the plurality of individual components of the simulation.
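
    Ordering a solution vector as a function of inlet dependencies is essentially a topological sort of the component graph. A sketch under that reading (the component names are invented, not from this record):

```python
from graphlib import TopologicalSorter

# Hypothetical component graph: each component maps to the components
# whose outlets feed its inlet (its inlet dependencies)
inlet_deps = {
    "pump": [],
    "heater": ["pump"],
    "radiator": ["heater"],
    "coldplate": ["pump"],
    "mixer": ["radiator", "coldplate"],
}

# Solve order: every component comes after whatever feeds its inlet,
# so its inlet conditions are available when it is evaluated
order = list(TopologicalSorter(inlet_deps).static_order())
print(order)
```

With the vector ordered this way, a single sweep per time step can march the fluid state downstream from sources to sinks.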

  16. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis, and existing tools like HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  17. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  18. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    PubMed

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  19. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  20. Geospatial analysis applied to epidemiological studies of dengue: a systematic review.

    PubMed

    Oliveira, Maria Aparecida de; Ribeiro, Helena; Castillo-Salgado, Carlos

    2013-12-01

    A systematic review of the geospatial analysis methods used in the dengue fever studies published between January 2001 and March 2011 was undertaken. In accordance with specific selection criteria thirty-five studies were selected for inclusion in the review. The aim was to assess the types of spatial methods that have been used to analyze dengue transmission. We found twenty-one different methods that had been used in dengue fever epidemiological studies in that period, three of which were most frequently used. The results show that few articles had applied spatial analysis methods in dengue fever studies; however, whenever they were applied they contributed to a better understanding of dengue fever geospatial diffusion.

  1. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementing the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to implement on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets, making discrete-event systems a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the proposed robotic system with timed Petri nets offers the opportunity to view the robot's timing; from transport and transmission times measured on the spot, graphics showing the average time for each transport activity are obtained, using the parameter sets of the finished products.

  2. Leaching of Particulate and Dissolved Organic Carbon from Compost Applied to Bioretention Systems

    NASA Astrophysics Data System (ADS)

    Iqbal, Hamid; Flury, Markus; Mullane, Jessica; Baig, Muhammad

    2015-04-01

    Compost is used in bioretention systems to improve soil quality, to promote plant growth, and to remove metal contaminants from stormwater. However, compost itself, particularly when applied freshly, can be a source of contamination of the stormwater. To test the potential contamination caused by compost when applied to bioretention systems, we continuously leached a compost column with water under unsaturated conditions and characterized dissolved and particulate organic matter in the leachate. Freshly applied, mature compost leached up to 400 mg/L of dissolved organic carbon and 2,000 mg/L of suspended particulate organic carbon. It required a cumulative water flux of 4,000 mm until concentrations of dissolved and particulate organic carbon declined to levels typical for surface waters. Although dissolved and particulate organic carbon are not contaminants per se, they can facilitate the movement of metals, thereby enhancing the mobility of toxic metals present in stormwater. Therefore, we recommend that compost be washed before it is applied to bioretention systems. Keywords: compost; leachate; alkali extract; dissolved organic carbon; flux

  3. System safety engineering analysis handbook

    NASA Technical Reports Server (NTRS)

    Ijams, T. E.

    1972-01-01

    The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A textbook description of each of the methods is included.

  4. Speech Analysis Systems: An Evaluation.

    ERIC Educational Resources Information Center

    Read, Charles; And Others

    1992-01-01

    Performance characteristics are reviewed for seven computerized systems marketed for acoustic speech analysis: CSpeech, CSRE, ILS-PC, Kay Elemetrics model 550 Sona-Graph, MacSpeech Lab II, MSL, and Signalyze. Characteristics reviewed include system components, basic capabilities, documentation, user interface, data formats and journaling, and…

  5. Applying the revenge system to the criminal justice system and jury decision-making.

    PubMed

    Roberts, S Craig; Murray, Jennifer

    2013-02-01

    McCullough et al. propose an evolved cognitive revenge system which imposes retaliatory costs on aggressors. They distinguish between this and other forms of punishment (e.g., those administered by judges) which are not underpinned by a specifically designed evolutionary mechanism. Here we outline mechanisms and circumstances through which the revenge system might nonetheless infiltrate decision-making within the criminal justice system.

  6. Applying the least restrictive alternative principle to treatment decisions: A legal and behavioral analysis

    PubMed Central

    Johnston, J. M.; Sherman, Robert A.

    1993-01-01

    The least restrictive alternative concept is widely used in mental health law. This paper addresses how the concept has been applied to treatment decisions. The paper offers both a legal and a behavioral analysis to some problems that have emerged in recent years concerning the selection of behavioral procedures used to change client behavior. The paper also offers ways of improving the application of the concept, which involve developing a more behaviorally functional perspective toward restrictiveness. PMID:22478138

  7. High-resolution frequency analysis as applied to the singing voice.

    PubMed

    Morsomme, D; Remacle, M; Millet, B

    1993-01-01

    We have applied high-resolution vocal frequency analysis to a population of singing voices. Two important elements have become apparent: (1) Confirmation that the singing formant originates in the resonators. This is observed especially on a low fundamental, and it is acquired through technical skill and experience. (2) Observation of the vibrato, which, isolated from the clinical study and regarded only in its graphic presentation, could have been interpreted as 'abnormal'. PMID:8253452

  8. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  9. Intelligent user interface for expert systems applied to power plant maintenance and troubleshooting

    SciTech Connect

    Kock, C.G.; Isle, B.A.; Butler, A.W.

    1988-03-01

    A research and development project is under way to specify, design, construct, and evaluate a user interface system to meet the unique requirements of a delivery vehicle for a knowledge-based system applied to gas turbine electronics equipment maintenance and troubleshooting. The user interface is a portable device with text display, video and overlay graphics display, voice recognition and speech production, special-function keypad, and printer. A modular software structure based on a serial communications protocol between user interface device and expert system host computer provides flexibility, expandability, and a simple, effective user interface dialogue.

  10. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    PubMed

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    The present article discusses clustering analysis used in the near-infrared (NIR) spectroscopy analysis of Chinese traditional medicines, which provides a new method for their classification. The samples, selected purposely for the study, were safrole, eucalypt oil, laurel oil, turpentine, clove oil, and three samples of costmary oil from different suppliers; their absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' lab. The spectra in the range of 0.70-1.7 microm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out through different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicated that cluster analysis of the group of samples is practicable. The calculated classification of the 8 samples accorded well with their characteristics; in particular, the three samples of costmary oil fell into the closest grouping of the clustering analysis. PMID:18306778
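The grouping step described in the abstract above can be sketched with a naive single-linkage agglomeration on a correlation distance between spectra; this is an illustrative simplification (the paper's actual clustering methods are not specified here), and the toy Gaussian "spectra" are invented:

```python
import numpy as np

def correlation_distance(a, b):
    """Distance = 1 - Pearson correlation between two spectra."""
    return 1.0 - np.corrcoef(a, b)[0, 1]

def single_linkage_clusters(spectra, threshold):
    """Merge spectra whose correlation distance falls below the
    threshold (naive single-linkage agglomeration)."""
    n = len(spectra)
    labels = list(range(n))          # each spectrum starts in its own cluster
    for i in range(n):
        for j in range(i + 1, n):
            if correlation_distance(spectra[i], spectra[j]) < threshold:
                old, new = labels[j], labels[i]
                labels = [new if lab == old else lab for lab in labels]
    return labels

# Toy spectra over 0.70-1.7 um: two near-identical bands and one distinct band.
x = np.linspace(0.7, 1.7, 50)
s1 = np.exp(-(x - 1.0) ** 2 / 0.01)
s2 = np.exp(-(x - 1.02) ** 2 / 0.01)   # near-duplicate of s1
s3 = np.exp(-(x - 1.5) ** 2 / 0.01)    # different band position
print(single_linkage_clusters([s1, s2, s3], threshold=0.5))  # → [0, 0, 2]
```

The two overlapping bands merge into one cluster while the shifted band stays separate, mirroring how the three costmary oil samples ended up in the closest grouping.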

  11. Applying observations of work activity in designing prototype data analysis tools

    SciTech Connect

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools, and customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

  12. A Computer-Mediated Instruction System, Applied to Its Own Operating System and Peripheral Equipment.

    ERIC Educational Resources Information Center

    Winiecki, Roger D.

    Each semester students in the School of Health Sciences of Hunter College learn how to use a computer, how a computer system operates, and how peripheral equipment can be used. To overcome inadequate computer center services and equipment, programed subject matter and accompanying reference material were developed. The instructional system has a…

  13. Residual energy applications program systems analysis report

    SciTech Connect

    Yngve, P.W.

    1980-10-01

    Current DOE plans call for building an Energy Applied Systems Test (EAST) Facility at the Savannah River Plant in close proximity to the 140 to 150°F waste heat from one of several operating nuclear reactors. The waste water flow from each reactor, approximately 165,000 gpm, provides a unique opportunity to test the performance and operating characteristics of large-scale waste heat power generation and heat pump system concepts. This report provides a preliminary description of the potential end-use market, parametric data on heat pump and power generation system technology, a preliminary listing of EAST Facility requirements, and an example of an integrated industrial park utilizing the technology to maximize economic payback. The parametric heat pump analysis concluded that dual-fluid Rankine cycle heat pumps with capacities as high as 400 x 10^6 Btu/h can utilize large sources of low temperature residual heat to provide 300°F saturated steam for an industrial park. The before-tax return on investment for this concept is 36.2%. The analysis also concluded that smaller modular heat pumps could fulfill the same objective while sacrificing only a moderate rate of return. The parametric power generation analysis concluded that multi-pressure Rankine cycle systems not only are superior to single pressure systems, but can also be developed for large systems (approximately 17 MW(e)). This same technology is applicable to smaller systems at the sacrifice of higher investment per unit output.

  14. Galvanic Liquid Applied Coating System for Protection of Embedded Steel Surfaces from Corrosion

    NASA Technical Reports Server (NTRS)

    Curran, Joseph; MacDowell, Louis; Voska, N. (Technical Monitor)

    2002-01-01

    The corrosion of reinforcing steel in concrete is an insidious problem for the Kennedy Space Center, government agencies, and the general public. Existing corrosion protection systems on the market are costly, complex, and time-consuming to install, require continuous maintenance and monitoring, and require specialized skills for installation. NASA's galvanic liquid-applied coating offers companies the ability to conveniently protect embedded steel rebar surfaces from corrosion. The liquid-applied inorganic galvanic coating contains one or more of the following metallic particles: magnesium, zinc, or indium, and may contain moisture-attracting compounds that facilitate the protection process. The coating is applied to the outer surface of reinforced concrete so that electrical current is established between the metallic particles and the surfaces of the embedded steel rebar; this electric (ionic) current provides the necessary cathodic protection for embedded rebar surfaces.

  15. Applying FSL to the FIAC data: model-based and model-free analysis of voice and sentence repetition priming.

    PubMed

    Beckmann, Christian F; Jenkinson, Mark; Woolrich, Mark W; Behrens, Timothy E J; Flitney, David E; Devlin, Joseph T; Smith, Stephen M

    2006-05-01

    This article presents results obtained from applying various tools from FSL (FMRIB Software Library) to data from the repetition priming experiment used for the HBM'05 Functional Image Analysis Contest. We present analyses from the model-based General Linear Model (GLM) tool (FEAT) and from the model-free independent component analysis tool (MELODIC). We also discuss the application of tools for the correction of image distortions prior to the statistical analysis and the utility of recent advances in functional magnetic resonance imaging (FMRI) time series modeling and inference such as the use of optimal constrained HRF basis function modeling and mixture modeling inference. The combination of hemodynamic response function (HRF) and mixture modeling, in particular, revealed that both sentence content and speaker voice priming effects occurred bilaterally along the length of the superior temporal sulcus (STS). These results suggest that both are processed in a single underlying system without any significant asymmetries for content vs. voice processing. PMID:16565953
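The model-based GLM analysis mentioned in the abstract above can be sketched in miniature: build a block-design regressor, convolve it with a canonical double-gamma HRF, and fit the voxel time series by least squares. This is an illustrative sketch of the general FMRI GLM idea, not FSL's FEAT implementation; the TR, block timing, and simulated signal values are all invented:

```python
import numpy as np

def double_gamma_hrf(t, p1=6.0, p2=16.0, ratio=1 / 6.0):
    """Simplified canonical double-gamma haemodynamic response shape."""
    from math import gamma
    h = (t ** (p1 - 1) * np.exp(-t) / gamma(p1)
         - ratio * t ** (p2 - 1) * np.exp(-t) / gamma(p2))
    return h / h.max()

tr = 2.0                               # repetition time in seconds (assumed)
n_vols = 100
frame_times = np.arange(n_vols) * tr
stimulus = (frame_times % 40) < 20     # 20 s on / 20 s off block design
hrf = double_gamma_hrf(np.arange(0, 30, tr))
regressor = np.convolve(stimulus.astype(float), hrf)[:n_vols]

# Design matrix: task regressor plus intercept column.
X = np.column_stack([regressor, np.ones(n_vols)])

# Simulated voxel: scaled regressor plus baseline plus noise.
rng = np.random.default_rng(0)
y = 2.0 * regressor + 5.0 + 0.1 * rng.standard_normal(n_vols)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[0] recovers ~2.0 (effect size), beta[1] ~5.0 (baseline)
```

In a real analysis the same fit is repeated at every voxel and the betas feed into statistical inference; FEAT additionally handles prewhitening, registration, and the mixture-model inference the abstract describes.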

  16. Robustness of fuzzy logic power system stabilizers applied to multimachine power system

    SciTech Connect

    Hiyama, Takashi . Dept. of Electrical Engineering and Computer Science)

    1994-09-01

    This paper investigates the robustness of fuzzy logic stabilizers using the information of speed and acceleration states of a study unit. The input signals are the real power output and/or the speed of the study unit. Non-linear simulations show the robustness of the fuzzy logic power system stabilizers. Experiments are also performed by using a micro-machine system. The results show the feasibility of the proposed fuzzy logic stabilizer.

  17. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality of care and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decision between patients’ needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept, making it possible to apply it in training for different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  18. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  19. Weld analysis and control system

    NASA Technical Reports Server (NTRS)

    Kennedy, Larry Z. (Inventor); Rodgers, Michael H. (Inventor); Powell, Bradley W. (Inventor); Burroughs, Ivan A. (Inventor); Goode, K. Wayne (Inventor)

    1994-01-01

    The invention is a Weld Analysis and Control System developed for active weld system control through real time weld data acquisition. Closed-loop control is based on analysis of weld system parameters and weld geometry. The system is adapted for use with automated welding apparatus having a weld controller which is capable of active electronic control of all aspects of a welding operation. Enhanced graphics and data displays are provided for post-weld analysis. The system provides parameter acquisition, including seam location which is acquired for active torch cross-seam positioning. Torch stand-off is also monitored for control. Weld bead and parent surface geometrical parameters are acquired as an indication of weld quality. These parameters include mismatch, peaking, undercut, underfill, crown height, weld width, puddle diameter, and other measurable information about the weld puddle regions, such as puddle symmetry, etc. These parameters provide a basis for active control as well as post-weld quality analysis and verification. Weld system parameters, such as voltage, current and wire feed rate, are also monitored and archived for correlation with quality parameters.

  20. The transfer function method for gear system dynamics applied to conventional and minimum excitation gearing designs

    NASA Technical Reports Server (NTRS)

    Mark, W. D.

    1982-01-01

    A transfer function method for predicting the dynamic responses of gear systems with more than one gear mesh is developed and applied to the NASA Lewis four-square gear fatigue test apparatus. Methods for computing bearing-support force spectra and temporal histories of the total force transmitted by a gear mesh, the force transmitted by a single pair of teeth, and the maximum root stress in a single tooth are developed. Dynamic effects arising from other gear meshes in the system are included. A profile modification design method to minimize the vibration excitation arising from a pair of meshing gears is reviewed and extended. Families of tooth loading functions required for such designs are developed and examined for potential excitation of individual tooth vibrations. The profile modification design method is applied to a pair of test gears.

  1. Applying generalized stochastic Petri nets to manufacturing systems containing nonexponential transition functions

    NASA Technical Reports Server (NTRS)

    Watson, James F., III; Desrochers, Alan A.

    1991-01-01

    Generalized stochastic Petri nets (GSPNs) are applied to flexible manufacturing systems (FMSs). Throughput subnets and s-transitions are presented. Two FMS examples containing nonexponential distributions which were analyzed in previous papers by queuing theory and probability theory, respectively, are treated using GSPNs developed using throughput subnets and s-transitions. The GSPN results agree with the previous results, and developing and analyzing the GSPN models are straightforward and relatively easy compared to other methodologies.

  2. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted for quantification of applied dose for quarantine control in irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by DNA Comet Assay. Observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA% of comets were used for the interpretation of comets. Irradiated citrus fruits showed the separated tails from the head of the comet by increasing applied doses from 0.1 to 1.5 kGy. The mean tail length and mean tail moment% levels of irradiated citrus fruits at all doses are significantly different (p < 0.01) from control even for the lowest dose at 0.1 kGy. Thus, DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits since it has been possible to estimate the applied low doses as small as 0.1 kGy when it is combined with image analysis. PMID:26304361
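The comet metrics named in the abstract above (tail length, tail moment, tail DNA%) can be sketched from a one-dimensional comet intensity profile. This is an illustrative simplification (definitions of tail moment vary between laboratories; the Olive tail moment, for instance, uses the head-tail centroid distance), not the authors' image-analysis pipeline, and the toy profile values are invented:

```python
import numpy as np

def comet_metrics(profile, head_end):
    """Tail DNA %, tail length (pixels), and a simple tail moment from a
    1-D comet intensity profile along the electrophoresis axis.
    `head_end` is the index where the head region stops, assumed known
    from a prior segmentation step."""
    profile = np.asarray(profile, dtype=float)
    total = profile.sum()
    tail = profile[head_end:]
    tail_dna_pct = 100.0 * tail.sum() / total
    above = np.nonzero(tail > 0)[0]
    tail_length = (above[-1] + 1) if above.size else 0
    tail_moment = tail_length * tail_dna_pct / 100.0   # length x tail fraction
    return tail_dna_pct, tail_length, tail_moment

# Toy profile: bright 3-pixel head followed by a faint decaying tail.
head = [50, 80, 50]
tail = [10, 8, 6, 4, 2]
pct, length, moment = comet_metrics(head + tail, head_end=3)
print(round(pct, 1), length, round(moment, 2))  # → 14.3 5 0.71
```

Increasing irradiation dose increases DNA fragmentation, so more intensity migrates into the tail and all three metrics grow, which is what makes them usable as a dose indicator.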

  4. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
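The pipeline in the abstract above, wavelet-based texture features fed to a maximum-likelihood classifier, can be sketched with a one-level 2-D Haar decomposition and a diagonal Gaussian class model. This is an illustrative sketch, not the authors' system (they use wavelet frames and three soil categories); the class names and toy images below are invented:

```python
import numpy as np

def haar_subband_energies(img):
    """One-level 2-D Haar decomposition; returns the mean energies of
    the LL, LH, HL, HH subbands as a texture feature vector."""
    a = img[0::2, :] + img[1::2, :]          # vertical sums
    d = img[0::2, :] - img[1::2, :]          # vertical differences
    ll, lh = a[:, 0::2] + a[:, 1::2], a[:, 0::2] - a[:, 1::2]
    hl, hh = d[:, 0::2] + d[:, 1::2], d[:, 0::2] - d[:, 1::2]
    return np.array([np.mean(b ** 2) for b in (ll, lh, hl, hh)])

def train(feature_sets):
    """Per-class feature mean/variance (diagonal Gaussian model)."""
    return {c: (f.mean(0), f.var(0) + 1e-9) for c, f in feature_sets.items()}

def classify(model, feat):
    """Maximum-likelihood class under the diagonal Gaussian model."""
    def loglik(mu, var):
        return -0.5 * np.sum((feat - mu) ** 2 / var + np.log(var))
    return max(model, key=lambda c: loglik(*model[c]))

# Toy training data: smooth gradient images vs. high-variance noise images.
rng = np.random.default_rng(1)
smooth = np.stack([haar_subband_energies(np.outer(np.arange(8), np.ones(8))
                                         + 0.1 * rng.standard_normal((8, 8)))
                   for _ in range(20)])
rough = np.stack([haar_subband_energies(5 * rng.standard_normal((8, 8)))
                  for _ in range(20)])
model = train({"silt-like": smooth, "sand-like": rough})
test_img = 5 * rng.standard_normal((8, 8))
print(classify(model, haar_subband_energies(test_img)))
```

The high-frequency subband energies separate the rough texture from the smooth one, which is the same intuition behind using wavelet features for soil-surface texture.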

  5. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentration was <1%.
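The calibration/prediction structure described above can be sketched as a classical least-squares (CLS) model with a nonzero intercept: pure-component spectra are estimated from known-mixture spectra, then unknown concentrations are recovered from the estimated spectra. This is a minimal unweighted sketch (the paper uses weighted least squares and handles pathlength variation); the simulated spectra, baseline, and concentrations are invented:

```python
import numpy as np

# Simulated pure-component spectra: 2 components over 50 wavelengths.
rng = np.random.default_rng(2)
wl = np.linspace(0, 1, 50)
K_true = np.stack([np.exp(-(wl - 0.3) ** 2 / 0.01),
                   np.exp(-(wl - 0.7) ** 2 / 0.01)])

# Calibration spectra: known mixtures, constant baseline, small noise.
C_cal = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.3, 0.7], [0.8, 0.2]])
baseline = 0.05
A_cal = C_cal @ K_true + baseline + 0.001 * rng.standard_normal((5, 50))

# Calibration phase: estimate pure spectra AND an intercept row by
# augmenting the concentration matrix with a column of ones.
C_aug = np.column_stack([C_cal, np.ones(len(C_cal))])
K_est, *_ = np.linalg.lstsq(C_aug, A_cal, rcond=None)

# Predictive phase: concentrations of an unknown sample by least squares
# against the estimated spectra (last coefficient weights the intercept).
a_unknown = np.array([0.6, 0.4]) @ K_true + baseline
c_est, *_ = np.linalg.lstsq(K_est.T, a_unknown, rcond=None)
print(np.round(c_est[:2], 3))  # ≈ [0.6, 0.4]
```

Because the baseline enters the calibration as its own "component", it is absorbed by the intercept row rather than biasing the concentration estimates, which is the point of the nonzero-intercept model.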

  6. Applying latent semantic analysis to large-scale medical image databases.

    PubMed

    Stathopoulos, Spyridon; Kalamboukis, Theodore

    2015-01-01

    Latent Semantic Analysis (LSA), although used successfully in text retrieval, induces scalability issues when applied to CBIR with large image collections. The method has so far been used with small collections due to the high cost of storage and computational time for solving the SVD problem for a large and dense feature matrix. Here we present an effective and efficient approach to applying LSA that skips the SVD solution of the feature matrix, overcoming in this way the deficiencies of the method with large-scale datasets. Early and late fusion techniques are tested and their performance is calculated. The study demonstrates that early fusion of several composite descriptors with visual words increases retrieval effectiveness. It also combines well in a late fusion for mixed (textual and visual) ad hoc and modality classification. The results reported are comparable to state-of-the-art algorithms without including additional knowledge from the medical domain. PMID:24934416
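The abstract does not detail how the SVD of the full feature matrix is avoided; one standard route, sketched here purely as an illustration and not necessarily the authors' method, is to eigendecompose the much smaller Gram matrix F^T F when the feature dimension is far below the number of images. Its eigenvectors are the right singular vectors of F, which span the latent space:

```python
import numpy as np

rng = np.random.default_rng(3)
n_images, n_features, k = 10000, 64, 8   # many images, modest feature dim
F = rng.random((n_images, n_features))   # dense image-feature matrix

# Instead of an SVD of the 10000 x 64 matrix F, eigendecompose the
# 64 x 64 Gram matrix F^T F; eigenvalues are squared singular values
# and eigenvectors are the right singular vectors of F.
gram = F.T @ F
eigvals, eigvecs = np.linalg.eigh(gram)  # ascending eigenvalue order
V_k = eigvecs[:, -k:]                    # top-k latent directions

# Project every image's feature vector into the k-dim latent space.
latent = F @ V_k
print(latent.shape)  # → (10000, 8)
```

The cost is dominated by forming the small Gram matrix rather than factoring the full matrix, which is what makes the latent projection tractable at collection scale.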

  7. An uncertainty analysis of the PVT gauging method applied to sub-critical cryogenic propellant tanks

    NASA Astrophysics Data System (ADS)

    Van Dresar, Neil T.

    2004-06-01

    The PVT (pressure, volume, temperature) method of liquid quantity gauging in low-gravity is based on gas law calculations assuming conservation of pressurant gas within the propellant tank and the pressurant supply bottle. There is interest in applying this method to cryogenic propellant tanks since the method requires minimal additional hardware or instrumentation. To use PVT with cryogenic fluids, a non-condensable pressurant gas (helium) is required. With cryogens, there will be a significant amount of propellant vapor mixed with the pressurant gas in the tank ullage. This condition, along with the high sensitivity of propellant vapor pressure to temperature, makes the PVT method susceptible to substantially greater measurement uncertainty than is the case with less volatile propellants. A conventional uncertainty analysis is applied to example cases of liquid hydrogen and liquid oxygen tanks. It appears that the PVT method may be feasible for liquid oxygen. Acceptable accuracy will be more difficult to obtain with liquid hydrogen.
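The gas-law bookkeeping behind PVT gauging can be sketched under ideal-gas assumptions: helium moles are conserved, so the helium that left the supply bottle now occupies the tank ullage at its partial pressure (total tank pressure minus propellant vapour pressure). This is a minimal sketch of the principle, not the paper's uncertainty analysis, and the tank/bottle numbers below are hypothetical, chosen only to exercise the formula:

```python
R = 8.314  # J/(mol*K)

def pvt_liquid_fraction(V_tank, P_tank, T_tank, P_vap,
                        V_bottle, P_b0, T_b0, P_b, T_b):
    """Ideal-gas PVT gauging sketch. Helium moles that left the supply
    bottle (initial state P_b0/T_b0, final state P_b/T_b) fill the tank
    ullage at the helium partial pressure P_tank - P_vap."""
    n_he_used = V_bottle / R * (P_b0 / T_b0 - P_b / T_b)
    p_he = P_tank - P_vap                 # helium partial pressure in ullage
    V_ullage = n_he_used * R * T_tank / p_he
    return 1.0 - V_ullage / V_tank        # liquid volume fraction

# Hypothetical liquid-oxygen tank numbers (illustrative only).
frac = pvt_liquid_fraction(
    V_tank=10.0, P_tank=300e3, T_tank=95.0, P_vap=150e3,
    V_bottle=0.5, P_b0=20e6, T_b0=290.0, P_b=15e6, T_b=285.0)
print(round(frac, 3))
```

The division by `p_he` is where the cryogenic difficulty enters: when the propellant vapour pressure is a large, temperature-sensitive fraction of the total pressure, small temperature errors swing `p_he` and hence the gauged liquid fraction, which is why the method is more uncertain for hydrogen than for oxygen.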

  8. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    PubMed

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically.

  9. Automated Loads Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling of payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  10. Thermodynamic Vent System Applied as Propellant Delivery System for Air Force

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Responding to a request from the Air Force, NASA Lewis Research Center engineers designed a combination pressure control and propellant delivery system based on thermodynamic vent system (TVS) technology. The Air Force is designing a new type of orbit transfer vehicle that uses energy from sunlight to both propel and power the vehicle. Because this vehicle uses propellant at a substantially slower rate than higher-energy rockets, it needed the Lewis-developed TVS technology for long-duration storage of cryogen propellants. Lewis engineers, in conjunction with industry partners, showed how this TVS technology could also be used to deliver propellant to the thruster. The Air Force has now begun the ground test demonstration phase. After successful completion of ground testing, the Air Force plans to use this technology in a space flight as early as 1999.

  11. X-ray microfluorescence with synchrotron radiation applied in the analysis of pigments from ancient Egypt

    NASA Astrophysics Data System (ADS)

    Calza, C.; Anjos, M. J.; Mendonça de Souza, S. M. F.; Brancaglion, A., Jr.; Lopes, R. T.

    2008-01-01

    In this work, X-ray microfluorescence with synchrotron radiation was applied to the analysis of pigments found in decorative paintings on the sarcophagus of an Egyptian mummy. This female mummy, from the Roman Period and embalmed with the arms and legs swathed separately, is considered one of the most important pieces of the Egyptian Collection of the National Museum (Rio de Janeiro, Brazil). The measurements were performed at the XRF beamline D09B of the Brazilian Synchrotron Light Laboratory (LNLS), using the white beam and a Si(Li) detector with a resolution of 165 eV at 5.9 keV. The possible pigments found in the samples were: Egyptian blue, Egyptian green frit, green earth, verdigris, malachite, ochre, realgar, chalk, gypsum, bone white, ivory black and magnetite. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) were applied to the results to verify whether the samples belong to the same period as a linen wrapping fragment, whose provenance was well established.
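
    As a hedged illustration of the chemometric step described above, the snippet below clusters synthetic XRF intensity vectors with HCA (Ward linkage). The element channels and values are invented for illustration, not the museum data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic XRF intensity vectors (three element channels) for two
# hypothetical pigment groups.
rng = np.random.default_rng(0)
group_a = rng.normal([10.0, 2.0, 5.0], 0.2, size=(4, 3))  # e.g. Cu-rich
group_b = rng.normal([1.0, 8.0, 3.0], 0.2, size=(4, 3))   # e.g. Fe-rich
X = np.vstack([group_a, group_b])

# Standardize each channel, then cluster with Ward linkage and cut the
# dendrogram into two clusters.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
labels = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
```

    Samples whose spectra fall in the same cluster as the reference fragment would then be argued to share a provenance, which is the logic of the comparison in the abstract.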

  12. Polarization analysis of optical systems

    NASA Technical Reports Server (NTRS)

    Chipman, Russell A.

    1989-01-01

    For most optical systems it is typically assumed that the transmitted wavefront has uniform (or Gaussian) amplitude and constant polarization state. This is the default assumption of geometrical optics. This paper considers methods suitable for analyzing systems for which this assumption is not valid. Such methods of polarization analysis include polarization ray tracing and polarization aberration theory. Definitions of the basic classes of polarization phenomena and a review of the Jones calculus are included to form a basis for the discussion.

  13. Single CMOS sensor system for high resolution double volume measurement applied to membrane distillation system

    NASA Astrophysics Data System (ADS)

    Lorenz, M. G.; Izquierdo-Gil, M. A.; Sanchez-Reillo, R.; Fernandez-Pineda, C.

    2007-01-01

    Membrane distillation (MD) [1] is a relatively new process that is being investigated worldwide as a low-cost, energy-saving alternative to conventional separation processes such as distillation and reverse osmosis (RO). The process offers several advantages over more established separation processes: it works at near-ambient pressure and temperature; low-grade, waste, and/or alternative energy sources such as solar and geothermal energy may be used; it achieves a very high level of rejection with inorganic solutions; and small equipment can be employed. The driving force in MD processes is the vapor pressure difference across the membrane: a temperature difference is imposed across the membrane, which results in a vapor pressure difference. The principal problem in this kind of system is the accurate measurement of the change in volume of each recipient, especially at very low flows. A cathetometer, with a resolution of up to 0.05 mm, is the instrument used to take these measurements, but the human intervention it requires makes it unsuitable for automated systems. To overcome this limitation, a high-resolution system is proposed that automatically measures the volume of both recipients, cold and hot, at a rate of up to 10 times per second.
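
    The vapor-pressure driving force mentioned above can be made concrete with the Antoine equation for water. The Antoine constants below are the standard ones for water (T in °C, P in mmHg, valid roughly 1 to 100 °C); the membrane coefficient C is purely illustrative.

```python
# Water vapor pressure from the Antoine equation.
def p_vap_water(T_celsius):
    return 10 ** (8.07131 - 1730.63 / (233.426 + T_celsius))

# MD flux taken as proportional to the transmembrane vapor pressure
# difference; the coefficient C is an illustrative placeholder for the
# membrane's permeability.
def md_flux(T_hot, T_cold, C=1e-3):
    return C * (p_vap_water(T_hot) - p_vap_water(T_cold))
```

    A 60 °C feed against a 20 °C permeate, for example, gives a driving force of roughly 130 mmHg, which is why even modest temperature differences sustain useful flux.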

  14. Geophysical and astronomical models applied in the analysis of very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Ma, C.; Ryan, J. W.; Schupler, B. R.

    1980-01-01

    Very long baseline interferometry presents an opportunity to measure at the centimeter level such geodetic parameters as baseline length and instantaneous pole position. In order to achieve such precision, the geophysical and astronomical models used in data analysis must be as accurate as possible. The Mark-3 interactive data analysis system includes a number of refinements beyond conventional practice in modeling precession, nutation, diurnal polar motion, UT1, solid Earth tides, relativistic light deflection, and reduction to solar system barycentric coordinates. The algorithms and their effects on the recovered geodetic, geophysical, and astrometric parameters are discussed.

  15. Development of double-pulse lasers ablation system for generating gold ion source under applying an electric field

    NASA Astrophysics Data System (ADS)

    Khalil, A. A. I.

    2015-12-01

    A double-pulse laser ablation (DPLA) technique was developed to generate a gold (Au) ion source and produce high current under an applied electric potential in an argon ambient gas environment. Two Q-switched Nd:YAG lasers operating at 1064 and 266 nm are combined in an unconventional orthogonal (crossed-beam) double-pulse configuration at a 45° angle and focused on a gold target, with a spectrometer used for spectral analysis of the gold plasma. The properties of the gold plasma produced under double-pulse laser excitation were studied. The velocity distribution function (VDF) of the emitted plasma was measured using a dedicated Faraday-cup ion probe (FCIP) under argon gas discharge. The experimental parameters were optimized to attain the best signal-to-noise (S/N) ratio. The results show that the VDF and current signals depend on the applied discharge voltage, laser intensity, laser wavelength, and ambient argon gas pressure. A seven-fold increase in the current signal was obtained by increasing the applied discharge voltage and ion velocity under the double-pulse laser field. The plasma parameters (electron temperature and density) were also studied, and their dependence on the delay time between the excitation laser pulse and the opening of the camera shutter was investigated as well. This study could provide significant reference data for the optimization and design of DPLA systems used in laser-induced plasma deposition of thin films and plasma-facing component diagnostics.

  16. Analysis of Preconditioning and Relaxation Operators for the Discontinuous Galerkin Method Applied to Diffusion

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.; Shu, Chi-Wang

    2001-01-01

    The explicit stability constraint of the discontinuous Galerkin method applied to the diffusion operator decreases dramatically as the order of the method is increased. Block Jacobi and block Gauss-Seidel preconditioner operators are examined for their effectiveness at accelerating convergence. A Fourier analysis for methods of order 2 through 6 reveals that both preconditioner operators bound the eigenvalues of the discrete spatial operator. Additionally, in one dimension, the eigenvalues are grouped into two or three regions that are invariant with order of the method. Local relaxation methods are constructed that rapidly damp high frequencies for arbitrarily large time step.

  17. Analysis of asphalt-based roof systems using thermal analysis

    SciTech Connect

    Paroli, R.M.; Delgado, A.H.

    1996-10-01

    Asphalt is used extensively in roofing applications. Traditionally, it is used in a built-up roof system, where four or five plies are applied in conjunction with asphalt. This is labour intensive and requires good quality assurance on the roof top. Alternatively, asphalt can be used in a polymer-modified sheet, where styrene-butadiene-styrene (SBS) or atactic polypropylene (APP) is added to the asphalt, shipped in a roll to which reinforcement (e.g., glass fibre mat) has been added. Regardless of the system used, the roof must be able to withstand environmental loads such as UV, heat, etc. Thermoanalytical techniques such as DSC, DMA, TMA and TG/DTA are ideally suited to monitoring the weathering of asphalt. This paper presents data obtained using these techniques and shows how the performance of asphalt-based roof systems can be followed by thermal analysis.

  18. Multimodal tissue perfusion imaging using multi-spectral and thermographic imaging systems applied on clinical data

    NASA Astrophysics Data System (ADS)

    Klaessens, John H. G. M.; Nelisse, Martin; Verdaasdonk, Rudolf M.; Noordmans, Herke Jan

    2013-03-01

    Clinical interventions can cause changes in tissue perfusion, oxygenation or temperature. Real-time imaging of these phenomena could be useful for surgical strategy or understanding of physiological regulation mechanisms. Two noncontact imaging techniques were applied for imaging of large tissue areas: LED-based multispectral imaging (MSI, 17 different wavelengths, 370 nm-880 nm) and thermal imaging (7.5 to 13.5 μm). Oxygenation concentration changes were calculated using different analysis methods, whose advantages are presented for stationary and dynamic applications. Concentration calculations of chromophores in tissue require the right choice of wavelengths. The effects of different wavelength choices on hemoglobin concentration calculations were studied under laboratory conditions and subsequently applied in clinical studies. Corrections for interferences during the clinical registrations (ambient light fluctuations, tissue movements) were performed. The wavelength dependency of the algorithms was studied, and the wavelength sets with the best results are presented. The multispectral and thermal imaging systems were applied during clinical intervention studies: reperfusion of tissue flap transplantation (ENT), effectiveness of local anesthetic block, and open brain surgery in patients with epileptic seizures. The LED multispectral imaging system successfully imaged the perfusion and oxygenation changes during clinical interventions. The thermal images show local heat distributions over tissue areas as a result of changes in tissue perfusion. Multispectral imaging and thermal imaging provide complementary information and are promising techniques for real-time diagnostics of physiological processes in medicine.

  19. Systems analysis - independent analysis and verification

    SciTech Connect

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S.

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  20. Stakeholder analysis for industrial waste management systems.

    PubMed

    Heidrich, Oliver; Harvey, Joan; Tollin, Nicola

    2009-02-01

    Stakeholder approaches have been applied to the management of companies with a view to the improvement of all areas of performance, including economic, health and safety, waste reduction, future policies, etc. However no agreement exists regarding stakeholders, their interests and levels of importance. This paper considers stakeholder analysis with particular reference to environmental and waste management systems. It proposes a template and matrix model for identification of stakeholder roles and influences by rating the stakeholders. A case study demonstrates the use of these and their ability to be transferred to other circumstances and organizations is illustrated by using a large educational institution.
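
    The kind of rating matrix the paper describes can be illustrated with a generic power/interest grid. This sketch is a common stakeholder-classification convention, not the authors' actual template, and the stakeholders and ratings are hypothetical.

```python
# Generic power/interest grid for rating stakeholders on 1-5 scales.
def classify_stakeholder(power, interest, threshold=3):
    if power >= threshold and interest >= threshold:
        return "manage closely"
    if power >= threshold:
        return "keep satisfied"
    if interest >= threshold:
        return "keep informed"
    return "monitor"

# Hypothetical stakeholders of an industrial waste management system
ratings = {
    "regulator":        (5, 4),
    "local_residents":  (2, 5),
    "waste_contractor": (4, 2),
}
matrix = {name: classify_stakeholder(p, i) for name, (p, i) in ratings.items()}
```

    The point of such a matrix, as in the paper's template, is that the same rating procedure transfers to other organizations once their stakeholders are scored.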

  1. System Safety Common Cause Analysis

    1992-03-10

    The COMCAN fault tree analysis codes are designed to analyze complex systems such as nuclear plants for common causes of failure. A common cause event, or common mode failure, is a secondary cause that could contribute to the failure of more than one component and violates the assumption of independence. Analysis of such events is an integral part of system reliability and safety analysis. A significant common cause event is a secondary cause common to all basic events in one or more minimal cut sets. Minimal cut sets containing events from components sharing a common location or a common link are called common cause candidates. Components share a common location if no barrier insulates any one of them from the secondary cause. A common link is a dependency among components which cannot be removed by a physical barrier (e.g., a common energy source or common maintenance instructions).
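
    The definition above (cut sets whose components all share a location or a common link) translates directly into a set-intersection check. The components, locations, and links below are hypothetical, not from COMCAN itself.

```python
# Flag common cause candidates: minimal cut sets whose basic events all
# share some attribute (a location or a common link).
def common_cause_candidates(cut_sets, attributes):
    candidates = []
    for cs in cut_sets:
        shared = set.intersection(*(set(attributes[c]) for c in cs))
        if shared:  # every component shares at least one location/link
            candidates.append((cs, shared))
    return candidates

attributes = {
    "pump_A":  {"room_1", "bus_480V"},   # location and power-supply link
    "pump_B":  {"room_1", "bus_480V"},
    "valve_C": {"room_2"},
}
cut_sets = [("pump_A", "pump_B"), ("pump_A", "valve_C")]
```

    Here the pump pair is flagged (shared room and shared power bus), while the pump/valve cut set is not, since a physical barrier separates their locations.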

  2. Escalation research: Providing new frontiers for applying behavior analysis to organizational behavior

    PubMed Central

    Goltz, Sonia M.

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to “throw good money after bad,” can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis. PMID:22478347

  4. Analysis of sonic well logs applied to erosion estimates in the Bighorn Basin, Wyoming

    SciTech Connect

    Heasler, H.P.; Kharitonova, N.A.

    1996-05-01

    An improved exponential model of sonic transit time data as a function of depth takes into account the physical range of rock sonic velocities. In this way, the model is more geologically realistic for predicting compaction trends when compared to linear or simple exponential functions that fail at large depth intervals. The improved model is applied to the Bighorn basin of northwestern Wyoming for calculation of erosion amounts. This basin was chosen because of extensive geomorphic research that constrains erosion models and because of the importance of quantifying erosion amounts for basin analysis and hydrocarbon maturation prediction. Thirty-six wells were analyzed using the improved exponential model. Seven of these wells, due to limited data from the Tertiary section, were excluded from the basin erosion analysis. Erosion amounts from the remaining 29 wells ranged from 0 to 5600 ft (1700 m), with an average of 2500 ft (800 m).
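
    A sketch of the erosion calculation under a bounded-exponential compaction model of the general form dt(z) = dt_m + (dt_0 - dt_m)e^(-bz), which respects the physical range of sonic transit times. The exact functional form and the parameter values here are illustrative, not the paper's fitted model.

```python
import math

# Bounded-exponential compaction trend:
#   dt(z) = dt_m + (dt_0 - dt_m) * exp(-b * z)
# where dt_m is the matrix (fully compacted) transit time and dt_0 the
# uncompacted surface value.
def erosion_estimate(dt_surface_obs, dt_0, dt_m, b):
    """Depth shift that maps the observed surface transit time back onto
    the undisturbed compaction curve, i.e. the removed overburden."""
    return -math.log((dt_surface_obs - dt_m) / (dt_0 - dt_m)) / b
```

    With illustrative values dt_0 = 200 and dt_m = 55 microseconds/ft and b = 2e-4 per ft, an observed surface transit time of 150 microseconds/ft implies roughly 2100 ft of section removed, the same order as the basin-wide estimates quoted above.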

  5. Graphical Analysis of PET Data Applied to Reversible and Irreversible Tracers

    SciTech Connect

    Logan, Jean

    1999-11-18

    Graphical analysis refers to the transformation of multiple time measurements of plasma and tissue uptake data into a linear plot, the slope of which is related to the number of available tracer binding sites. This type of analysis allows easy comparisons among experiments. No particular model structure is assumed, however it is assumed that the tracer is given by bolus injection and that both tissue uptake and the plasma concentration of unchanged tracer are monitored following tracer injection. The requirement of plasma measurements can be eliminated in some cases when a reference region is available. There are two categories of graphical methods which apply to two general types of ligands--those which bind reversibly during the scanning procedure and those which are irreversible or trapped during the time of the scanning procedure.
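
    For reversible tracers the linearization described above is the Logan plot: plotting the normalized tissue integral against the normalized plasma integral yields a late-time line whose slope is the total distribution volume V_T. A minimal sketch, exercised on synthetic one-compartment data:

```python
import numpy as np

def cumtrapz(y, t):
    """Cumulative trapezoidal integral of y(t)."""
    return np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2.0 * np.diff(t))))

def logan_slope(t, C_t, C_p, t_star):
    """Slope of the Logan plot over frames with t >= t_star.

    x = int(C_p)/C_t and y = int(C_t)/C_t become linear at late times for
    a reversible tracer, with slope equal to the distribution volume V_T.
    """
    m = t >= t_star
    x = cumtrapz(C_p, t)[m] / C_t[m]
    y = cumtrapz(C_t, t)[m] / C_t[m]
    return np.polyfit(x, y, 1)[0]
```

    For a one-tissue-compartment model with influx K1 and efflux k2, the recovered slope should approach V_T = K1/k2 once the data are past the equilibration time t*.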

  6. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    NASA Astrophysics Data System (ADS)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  7. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  8. Galvanic Liquid Applied Coating System For Protection of Embedded Steel Surfaces from Corrosion

    NASA Technical Reports Server (NTRS)

    Curran, Joseph; Curran, Jerome; Voska, N. (Technical Monitor)

    2002-01-01

    Corrosion of reinforcing steel in concrete is an insidious problem facing Kennedy Space Center (KSC), other Government Agencies, and the general public. These problems include KSC launch support structures, highway bridge infrastructure, and building structures such as condominium balconies. Due to these problems, the development of a Galvanic Liquid Applied Coating System would be a breakthrough technology having great commercial value for the following industries: Transportation, Infrastructure, Marine Infrastructure, Civil Engineering, and the Construction Industry. This sacrificial coating system consists of a paint matrix that may include metallic components, conducting agents, and moisture attractors. Similar systems have been used in the past with varying degrees of success. These systems have no proven history of effectiveness over the long term. In addition, these types of systems have had limited success overcoming the initial resistance between the concrete/coating interface. The coating developed at KSC incorporates methods proven to overcome the barriers that previous systems could not achieve. Successful development and continued optimization of this breakthrough system would produce great interest in NASA/KSC for corrosion engineering technology and problem solutions. Commercial patents on this technology would enhance KSC's ability to attract industry partners for similar corrosion control applications.

  9. Comparison of gradient methods for gain tuning of a PD controller applied on a quadrotor system

    NASA Astrophysics Data System (ADS)

    Kim, Jinho; Wilkerson, Stephen A.; Gadsden, S. Andrew

    2016-05-01

    Many mechanical and electrical systems have utilized the proportional-integral-derivative (PID) control strategy. The concept of PID control is a classical approach, but it is easy to implement and yields very good tracking performance. Unmanned aerial vehicles (UAVs) are currently experiencing a significant growth in popularity. Due to the advantages of PID controllers, UAVs are implementing PID controllers for improved stability and performance. An important consideration for the system is the selection of PID gain values in order to achieve a safe flight and successful mission. There are a number of different algorithms that can be used for real-time tuning of gains. This paper presents two gain-tuning algorithms, based on the method of steepest descent and on Newton's method for minimizing an objective function, and compares the results of applying them in conjunction with a PD controller on a quadrotor system.
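
    A hedged sketch of the steepest-descent variant: tune PD gains by finite-difference gradient descent on the integrated squared error of a simulated step response. The double-integrator plant and all numbers are illustrative stand-ins, not the quadrotor model from the paper.

```python
import numpy as np

def step_cost(gains, dt=0.01, T=5.0):
    """Integrated squared tracking error of a PD-controlled double
    integrator following a unit step reference."""
    kp, kd = gains
    x = v = 0.0
    J = 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - x                 # unit step reference
        u = kp * e - kd * v         # PD law (derivative on measurement)
        v += u * dt                 # double-integrator dynamics
        x += v * dt
        J += e * e * dt
    return J

def steepest_descent(J, g0, step=0.05, iters=25, h=1e-3):
    """Gradient descent on J with central finite-difference gradients."""
    g = np.asarray(g0, dtype=float).copy()
    for _ in range(iters):
        grad = np.array([(J(g + h * e) - J(g - h * e)) / (2 * h)
                         for e in np.eye(len(g))])
        g -= step * grad
    return g
```

    The Newton variant in the paper would additionally estimate the Hessian of J and scale the step by its inverse, trading more computation per iteration for faster convergence near the minimum.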

  10. Plug nozzles - The ultimate customer driven propulsion system. [applied to manned lunar and Martian landers

    NASA Technical Reports Server (NTRS)

    Aukerman, Carl A.

    1991-01-01

    This paper presents the results of a study applying the plug cluster nozzle concept to the propulsion system for a typical lunar excursion vehicle. Primary attention for the design criteria is given to user defined factors such as reliability, low volume, and ease of propulsion system development. Total thrust and specific impulse are held constant in the study while other parameters are explored to minimize the design chamber pressure. A brief history of the plug nozzle concept is included to point out the advanced level of technology of the concept and the feasibility of exploiting the variables considered in the study. The plug cluster concept looks very promising as a candidate for consideration for the ultimate customer driven propulsion system.

  11. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

    This paper presents a methodology suitable for creating a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking, and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased weighting of the performance criteria, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the outcomes of the motivational system was proposed. The described methodology for calculating performance-related payment needs practical approbation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be approbation of the methodology in a healthcare facility.

  12. Sender-receiver systems and applying information theory for quantitative synthetic biology.

    PubMed

    Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark

    2015-02-01

    Sender-receiver (S-R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S-R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning.
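
    The "quantitative basis" referred to above is mutual information between sent and received symbols. The sketch below computes it from a joint distribution, using a binary symmetric channel with uniform input as an illustrative example (not a system from the paper).

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information (in bits) of a discrete joint distribution."""
    px = p_xy.sum(axis=1, keepdims=True)   # sender marginal
    py = p_xy.sum(axis=0, keepdims=True)   # receiver marginal
    mask = p_xy > 0                        # skip zero-probability cells
    ratio = p_xy / (px @ py)               # joint over product of marginals
    return float((p_xy[mask] * np.log2(ratio[mask])).sum())

# Binary symmetric channel with error rate e and uniform input; its
# mutual information is 1 - H2(e) bits per symbol.
e = 0.1
p_xy = np.array([[0.5 * (1 - e), 0.5 * e],
                 [0.5 * e, 0.5 * (1 - e)]])
```

    Maximizing this quantity over the controllable parts of a synthetic S-R circuit is one concrete reading of "maximising signalling robustness."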

  14. Feasibility Studies of Applying Kalman Filter Techniques to Power System Dynamic State Estimation

    SciTech Connect

    Huang, Zhenyu; Schneider, Kevin P.; Nieplocha, Jarek

    2007-08-01

    Lack of dynamic information in power system operations is mainly attributable to the static modeling of traditional state estimation, as state estimation is the basis driving many other operations functions. This paper investigates the feasibility of applying Kalman filter techniques to enable the inclusion of dynamic modeling in the state estimation process and the estimation of power system dynamic states. The proposed Kalman-filter-based dynamic state estimation is tested on a multi-machine system with both large and small disturbances. Sensitivity studies of the dynamic state estimation performance with respect to measurement characteristics – sampling rate and noise level – are presented as well. The study results show that there is a promising path forward to implementing Kalman-filter-based dynamic state estimation with the emerging phasor measurement technologies.
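
    A textbook linear Kalman filter illustrates the predict/update cycle the paper builds on. The two-state constant-velocity model below is a stand-in for the linearized machine dynamics, and the noise levels are invented for illustration.

```python
import numpy as np

def kalman_filter(zs, dt=0.1, q=1e-3, r=0.25):
    """Linear Kalman filter on a constant-velocity model with noisy
    position-like measurements; returns the filtered position estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
    H = np.array([[1.0, 0.0]])             # measure position only
    Q = q * np.eye(2)                      # process noise covariance
    R = np.array([[r]])                    # measurement noise covariance
    x = np.zeros((2, 1))
    P = np.eye(2)
    estimates = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return np.array(estimates)
```

    The sensitivity studies in the abstract correspond to varying dt (sampling rate) and r (noise level) and observing how the estimation error responds.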

  15. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.

  17. Applying AI systems in the T and D arena. [Artificial Intelligence, Transmission and Distribution

    SciTech Connect

    Venkata, S.S.; Liu, Chenching ); Sumic, Z. Puget Sound Power and Light Co., Bellevue, WA ); Vadari, S.V.

    1993-04-01

    The power engineering community has capitalized on various computer technologies since the early 1960s, with the most successful applications to well-defined problems that can be modeled. Although computing methods have made notable progress in the power engineering arena, there is still a class of problems that is not easy to define or formulate for conventional computational methods. In addition to being difficult to express in closed mathematical form, these problems are often characterized by the absence of one or both of the following features: a predetermined decision path from the initial state to the goal (ill-structured problems); a well-defined criterion for whether an obtained solution is acceptable (open-ended problems). Power engineers have been investigating the application of AI-based methodologies to power system problems. Most of the work in the past has been geared towards the development of expert systems as an operator's aid in energy control centers for bulk power transmission systems operating under abnormal conditions. Alarm processing, fault diagnosis, system restoration, and voltage/var control are a few key areas where significant research work has progressed to date. This research has produced more than 100 prototype expert systems for power systems throughout the US, Japan, and Europe. The objectives of this article are to: expose engineers to the benefits of using AI methods for a host of transmission and distribution (T and D) problems that need immediate attention; identify problems that could be solved more effectively by applying AI approaches; and summarize recent developments and successful AI applications in T and D.

  18. Microfabricated integrated DNA analysis systems

    SciTech Connect

    Woolley, A.T.; Mathies, R.A.; Northrup, M.A.

    1996-12-31

    Microfabrication has the potential to revolutionize chemical analysis, from reactions to separations to molecular biotechnology. Microfabricated devices allow high speed separations, automated sample handling, and the study of reactions in the pL to μL volume range. Our research has focused on microfabricated integrated DNA analysis systems (MIDAS). As a first step, we have demonstrated high-speed DNA restriction fragment sizing and DNA sequencing on microfabricated capillary electrophoresis (CE) chips. We have recently coupled microfabricated PCR reactors and CE chips to make integrated DNA analysis devices. With these devices, rapid PCR can be performed and the reaction products immediately analyzed on the CE chip, eliminating the need for manual transfer of the amplified sample. PCR amplifications have been done in less than 16 minutes, followed by CE analysis in under 100 seconds. These PCR-CE chips represent an important step towards completely integrated sample manipulation on microfabricated devices.

  19. A traffic situation analysis system

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin

    2011-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. For example, embedded vision systems built into vehicles can be used as early warning systems, and stationary camera systems can modify the switching frequency of signals at intersections. Today the automated analysis of traffic situations is still in its infancy - the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully understood by a vision system. We present steps towards such a traffic monitoring system, which is designed to detect potentially dangerous traffic situations, especially incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system is field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in an outdoor-capable housing. Two cameras run vehicle detection software including license plate detection and recognition; one camera runs a complex pedestrian detection and tracking module based on the HOG detection principle. As a supplement, all 3 cameras use additional optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. This work describes the foundation for all 3 object detection modalities (pedestrians, vehicles, license plates), and explains the system setup and its design.
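    The displacement estimation underlying the optical-flow step can be illustrated with phase correlation on a 1-D intensity profile. This is a generic stand-in for the system's actual motion module, with made-up data:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer displacement between two 1-D intensity profiles
    by phase correlation: the normalized cross-power spectrum of the two
    frames has an inverse FFT that peaks at the displacement."""
    fa = np.fft.fft(frame_a)
    fb = np.fft.fft(frame_b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12     # keep only the phase difference
    corr = np.fft.ifft(cross).real
    return int(np.argmax(corr))

profile = np.sin(np.linspace(0, 4 * np.pi, 64))
moved = np.roll(profile, 5)            # "object" moved 5 pixels between frames
shift = estimate_shift(moved, profile) # recovers 5
```

The same principle extends to 2-D blocks, where the peak location gives the motion vector of the block between frames.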

  20. Limit Cycle Analysis Applied to the Oscillations of Decelerating Blunt-Body Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Queen, Eric M.

    2008-01-01

    Many blunt-body entry vehicles have nonlinear dynamic stability characteristics that produce self-limiting oscillations in flight. Several different test techniques can be used to extract dynamic aerodynamic coefficients to predict this oscillatory behavior for planetary entry mission design and analysis. Most of these test techniques impose boundary conditions that alter the oscillatory behavior from that seen in flight. Three sets of test conditions, representing three commonly used test techniques, are presented to highlight these effects. Analytical solutions to the constant-coefficient planar equations-of-motion for each case are developed to show how the same blunt body behaves differently depending on the imposed test conditions. The energy equation is applied to further illustrate the governing dynamics. Then, the mean value theorem is applied to the energy rate equation to find the effective damping for an example blunt body with nonlinear, self-limiting dynamic characteristics. This approach is used to predict constant-energy oscillatory behavior and the equilibrium oscillation amplitudes for the various test conditions. These predictions are verified with planar simulations. The analysis presented provides an overview of dynamic stability test techniques and illustrates the effects of dynamic stability, static aerodynamics and test conditions on observed dynamic motions. It is proposed that these effects may be leveraged to develop new test techniques and refine test matrices in future tests to better define the nonlinear functional forms of blunt body dynamic stability curves.
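    The self-limiting behavior described above can be reproduced with a generic planar oscillator with nonlinear damping. The van der Pol equation below is used purely as an illustration of a limit cycle, not as the blunt-body model from the paper:

```python
import numpy as np

# Oscillator with nonlinear, self-limiting damping:
#   x'' - mu*(1 - x**2)*x' + x = 0
# Small oscillations grow and large ones decay, so the motion converges to
# a limit cycle of amplitude ~2 regardless of the initial disturbance.
def rk4_step(state, dt, mu):
    def f(s):
        x, v = s
        return np.array([v, mu * (1 - x**2) * v - x])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6) * (k1 + 2*k2 + 2*k3 + k4)

def limit_amplitude(x0, mu=0.2, dt=0.01, steps=20000):
    state = np.array([x0, 0.0])
    peaks = []
    for i in range(steps):
        state = rk4_step(state, dt, mu)
        if i > steps // 2:             # discard the transient
            peaks.append(abs(state[0]))
    return max(peaks)

# A small and a large disturbance settle to the same oscillation amplitude
a_small = limit_amplitude(0.1)
a_large = limit_amplitude(4.0)
```

This is the constant-energy, self-limiting oscillation idea in miniature: the effective damping changes sign with amplitude, pinning the motion to an equilibrium oscillation amplitude.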

  1. Lévy scaling: the diffusion entropy analysis applied to DNA sequences.

    PubMed

    Scafetta, Nicola; Latora, Vito; Grigolini, Paolo

    2002-09-01

    We address the problem of the statistical analysis of a time series generated by complex dynamics with the diffusion entropy analysis (DEA) [N. Scafetta, P. Hamilton, and P. Grigolini, Fractals 9, 193 (2001)]. This method is based on the evaluation of the Shannon entropy of the diffusion process generated by the time series imagined as a physical source of fluctuations, rather than on the measurement of the variance of this diffusion process, as done with the traditional methods. We compare the DEA to the traditional methods of scaling detection and prove that the DEA is the only method that always yields the correct scaling value, if the scaling condition applies. Furthermore, DEA detects the real scaling of a time series without requiring any form of detrending. We show that the joint use of the DEA and the variance method allows us to assess whether a time series is characterized by Lévy or Gauss statistics. We apply the DEA to the study of DNA sequences and prove that their large-time scales are characterized by Lévy statistics, regardless of whether they are coding or noncoding sequences. We show that the DEA is a reliable technique and, at the same time, we use it to confirm the validity of the dynamic approach to the DNA sequences, proposed in earlier work. PMID:12366151
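    A numerical sketch of the DEA procedure, under stated assumptions: fixed-width histogram bins and uncorrelated Gaussian noise, for which the expected scaling exponent is the ordinary diffusion value δ = 0.5:

```python
import numpy as np

# Diffusion entropy analysis sketch: treat the series as noise, diffuse it
# over windows of length t, and measure the Shannon entropy S(t) of the
# displacement distribution.  For a scaling process S(t) = A + delta*ln(t).
def diffusion_entropy(noise, window):
    # displacements of all overlapping windows of length `window`
    cumsum = np.concatenate([[0.0], np.cumsum(noise)])
    x = cumsum[window:] - cumsum[:-window]
    counts, _ = np.histogram(x, bins=np.arange(x.min(), x.max() + 1.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
noise = rng.standard_normal(40000)
windows = np.array([10, 20, 40, 80])
entropies = [diffusion_entropy(noise, w) for w in windows]
delta = np.polyfit(np.log(windows), entropies, 1)[0]   # slope ~ 0.5
```

Note that δ is read off as a slope against ln t, so no detrending or variance estimate is needed, which is the point the abstract makes.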

  2. Lévy scaling: The diffusion entropy analysis applied to DNA sequences

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola; Latora, Vito; Grigolini, Paolo

    2002-09-01

    We address the problem of the statistical analysis of a time series generated by complex dynamics with the diffusion entropy analysis (DEA) [N. Scafetta, P. Hamilton, and P. Grigolini, Fractals 9, 193 (2001)]. This method is based on the evaluation of the Shannon entropy of the diffusion process generated by the time series imagined as a physical source of fluctuations, rather than on the measurement of the variance of this diffusion process, as done with the traditional methods. We compare the DEA to the traditional methods of scaling detection and prove that the DEA is the only method that always yields the correct scaling value, if the scaling condition applies. Furthermore, DEA detects the real scaling of a time series without requiring any form of detrending. We show that the joint use of the DEA and the variance method allows us to assess whether a time series is characterized by Lévy or Gauss statistics. We apply the DEA to the study of DNA sequences and prove that their large-time scales are characterized by Lévy statistics, regardless of whether they are coding or noncoding sequences. We show that the DEA is a reliable technique and, at the same time, we use it to confirm the validity of the dynamic approach to the DNA sequences, proposed in earlier work.

  3. Applied learning-based color tone mapping for face recognition in video surveillance system

    NASA Astrophysics Data System (ADS)

    Yew, Chuu Tian; Suandi, Shahrel Azmin

    2012-04-01

    In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. This technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and remap the color or intensity of the input image so that the color or intensity statistics match those in the training dataset. It is well known that differences in commercial surveillance camera models and in the signal-processing chipsets used by different manufacturers cause the color and intensity of the images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using Multi-Class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely the SCface database, this approach is validated and compared to the results of using a holistic approach on grayscale images. The results show that this technique is suitable for improving the color or intensity quality of video surveillance systems for face recognition.
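    A much-simplified sketch of the statistics-matching idea, matching only the first two moments (the paper's learned mapping is richer, and the data here are synthetic):

```python
import numpy as np

# Learn intensity statistics (here just mean and standard deviation) from
# reference imagery, then remap a surveillance frame so its statistics
# match the reference before recognition.
def remap_intensity(frame, ref_mean, ref_std):
    mapped = (frame - frame.mean()) / (frame.std() + 1e-9)
    return np.clip(mapped * ref_std + ref_mean, 0, 255)

rng = np.random.default_rng(1)
reference = rng.normal(120, 40, size=(64, 64)).clip(0, 255)  # "training" stats
dark_frame = rng.normal(40, 10, size=(64, 64)).clip(0, 255)  # underexposed camera
corrected = remap_intensity(dark_frame, reference.mean(), reference.std())
```

After remapping, frames from cameras with different chipsets share one intensity distribution, which is what removes the cross-camera variation the abstract describes.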

  4. Multi-spectral optical simulation system applied in hardware-in-the-loop

    NASA Astrophysics Data System (ADS)

    Yu, Hong; Lei, Jie; Gao, Yang; Liu, Yang

    2009-07-01

    The Multi-spectral simulation system has been constructed at Beijing Simulation Center (BSC) for hardware-in-the-loop (HWIL) testing of optical and infrared seekers, in single-band and dual-band, or even multi-band. This multi-spectral simulation facility consists primarily of several projectors and a wide-angular simulation mechanism; the projector technologies utilized at BSC include a broadband point source collimator, a laser echo simulator and a visible scene projection system. These projectors can be used individually with the wide-angular simulation mechanism, or in any combination of two or all three according to different needs. The configuration and performance of each technology are reviewed in the paper. Future plans include two IR imaging projectors which run at high frame rates. The multi-spectral optical simulation system has been successfully applied for visible and IR imaging seeker testing in HWIL simulation. The laser echo simulator hardware will be applied soon.

  5. Magnetic Field Experiment Data Analysis System

    NASA Technical Reports Server (NTRS)

    Holland, D. B.; Zanetti, L. J.; Suther, L. L.; Potemra, T. A.; Anderson, B. J.

    1995-01-01

    The Johns Hopkins University Applied Physics Laboratory (JHU/APL) Magnetic Field Experiment Data Analysis System (MFEDAS) has been developed to process and analyze satellite magnetic field experiment data from the TRIAD, MAGSAT, AMPTE/CCE, Viking, Polar BEAR, DMSP, HILAT, UARS, and Freja satellites. The MFEDAS provides extensive data management and analysis capabilities. The system is based on standard data structures and a standard user interface. The MFEDAS has two major elements: (1) a set of satellite unique telemetry processing programs for uniform and rapid conversion of the raw data to a standard format and (2) the program Magplot which has file handling, data analysis, and data display sections. This system is an example of software reuse, allowing new data sets and software extensions to be added in a cost-effective and timely manner. Future additions to the system will include the addition of standard format file import routines, modification of the display routines to use a commercial graphics package based on X-Window protocols, and a generic utility for telemetry data access and conversion.

  6. Marine systems analysis and modeling

    NASA Astrophysics Data System (ADS)

    Fedra, K.

    1995-03-01

    Oceanography and marine ecology have a considerable history in the use of computers for modeling both physical and ecological processes. With increasing stress on the marine environment due to human activities such as fisheries and numerous forms of pollution, the analysis of marine problems must increasingly and jointly consider physical, ecological and socio-economic aspects in a broader systems framework that transcends more traditional disciplinary boundaries. This often introduces difficult-to-quantify, “soft” elements, such as values and perceptions, into formal analysis. Thus, the problem domain combines a solid foundation in the physical sciences, with strong elements of ecological, socio-economic and political considerations. At the same time, the domain is also characterized by both a very large volume of some data, and an extremely data-poor situation for other variables, as well as a very high degree of uncertainty, partly due to the temporal and spatial heterogeneity of the marine environment. Consequently, marine systems analysis and management require tools that can integrate these diverse aspects into efficient information systems that can support research as well as planning and also policy- and decision-making processes. Supporting scientific research, as well as decision-making processes and the diverse groups and actors involved, requires better access and direct understanding of the information basis as well as easy-to-use, but powerful tools for analysis. Advanced information technology provides the tools to design and implement smart software where, in a broad sense, the emphasis is on the man-machine interface. Symbolic and analogous, graphical interaction, visual representation of problems, integrated data sources, and built-in domain knowledge can effectively support users of complex and complicated software systems. Integration, interaction, visualization and intelligence are key concepts that are discussed in detail, using an

  7. Clinical usefulness of the clock drawing test applying Rasch analysis in predicting of cognitive impairment.

    PubMed

    Yoo, Doo Han; Lee, Jae Shin

    2016-07-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, which was selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
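    For reference, cutoff statistics like these are all derived from a 2x2 confusion table. With hypothetical counts (not the study's raw data) chosen to give a similar sensitivity and specificity:

```python
# Hypothetical screening counts at a test cutoff (illustrative only)
tp, fn = 95, 15   # impaired patients classified as impaired / missed
tn, fp = 65, 6    # intact patients classified as intact / flagged

sensitivity = tp / (tp + fn)               # 95/110 ~ 0.864
specificity = tn / (tn + fp)               # 65/71  ~ 0.915
youden = sensitivity + specificity - 1     # ~ 0.78
ppv = tp / (tp + fp)                       # positive predictive value
npv = tn / (tn + fn)                       # negative predictive value
```

The ROC procedure simply repeats this calculation at every candidate cutoff and picks the score (here a total CDT of 10.5) that maximises an index such as Youden's.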

  8. Testing and Analysis Validation of a Metallic Repair Applied to a PRSEUS Tension Panel

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Jegley, Dawn C.

    2013-01-01

    A design and analysis of a repair concept applicable to a stiffened composite panel based on the Pultruded Rod Stitched Efficient Unitized Structure was recently completed. The damage scenario considered was a midbay-to-midbay saw-cut with a severed stiffener, flange and skin. Advanced modeling techniques such as mesh-independent definition of compliant fasteners and elastic-plastic material properties for metal parts were utilized in the finite element analysis supporting the design effort. A bolted metallic repair was selected so that it could be easily applied in the operational environment. The present work describes results obtained from a tension panel test conducted to validate both the repair concept and finite element analysis techniques used in the design effort. The test proved that the proposed repair concept is capable of sustaining load levels that are higher than those resulting from the current working stress allowables. This conclusion enables upward revision of the stress allowables that had been kept at an overly-conservative level due to concerns associated with repairability of the panels. Correlation of test data with finite element analysis results is also presented and assessed.

  9. Multivariate Curve Resolution Applied to Hyperspectral Imaging Analysis of Chocolate Samples.

    PubMed

    Zhang, Xin; de Juan, Anna; Tauler, Romà

    2015-08-01

    This paper shows the application of Raman and infrared hyperspectral imaging combined with multivariate curve resolution (MCR) to the analysis of the constituents of commercial chocolate samples. The combination of different spectral data pretreatment methods allowed decreasing the high fluorescent Raman signal contribution of whey in the investigated chocolate samples. Using equality constraints during MCR analysis, estimations of the pure spectra of the chocolate sample constituents were improved, as well as their relative contributions and their spatial distribution on the analyzed samples. In addition, unknown constituents could also be resolved. White chocolate constituents resolved from the Raman hyperspectral image indicate that, at the macro scale, sucrose, lactose, fat, and whey constituents were intermixed in particles. Infrared hyperspectral imaging did not suffer from fluorescence and could be applied for white and milk chocolate. As a conclusion of this study, micro-hyperspectral imaging coupled to the MCR method is confirmed to be an appropriate tool for the direct analysis of the constituents of chocolate samples, and by extension, it is proposed for the analysis of other mixture constituents in commercial food samples.
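    The core of MCR is an alternating-least-squares loop. A toy sketch on synthetic two-component spectra, using clipping to enforce nonnegativity (real MCR-ALS implementations use proper constrained solvers and the equality constraints mentioned above):

```python
import numpy as np

# Resolve a mixture matrix D (pixels x channels) into concentrations C and
# pure spectra S, D ~ C @ S, by alternating nonnegative least squares.
rng = np.random.default_rng(0)
channels = np.linspace(0, 1, 50)
S_true = np.stack([np.exp(-((channels - m) / 0.08) ** 2) for m in (0.3, 0.7)])
C_true = rng.uniform(0, 1, size=(200, 2))
D = C_true @ S_true + 0.01 * rng.standard_normal((200, 50))

C = rng.uniform(0, 1, size=(200, 2))          # random initial guess
for _ in range(100):
    # spectra step: solve C @ S = D for S, clip negatives
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)
    # concentration step: solve S.T @ C.T = D.T for C, clip negatives
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
```

The loop alternates until D is reproduced down to the noise level; the rows of S are then the resolved "pure" spectra and the columns of C their spatial contribution maps.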

  10. Clinical usefulness of the clock drawing test applying Rasch analysis in predicting of cognitive impairment

    PubMed Central

    Yoo, Doo Han; Lee, Jae Shin

    2016-01-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, which was selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders. PMID:27512283

  11. Formulation of Indomethacin Colon Targeted Delivery Systems Using Polysaccharides as Carriers by Applying Liquisolid Technique

    PubMed Central

    Elkhodairy, Kadria A.; Elsaghir, Hanna A.; Al-Subayiel, Amal M.

    2014-01-01

    The present study aimed at the formulation of matrix tablets for a colon-specific drug delivery (CSDD) system of indomethacin (IDM) by applying the liquisolid (LS) technique. A CSDD system based on time-dependent polymethacrylates and enzyme degradable polysaccharides was established. Eudragit RL 100 (E-RL 100) was employed as the time-dependent polymer, whereas bacterial degradable polysaccharides were presented as LS systems loaded with the drug. Indomethacin-loaded LS systems were prepared using different polysaccharides, namely, guar gum (GG), pectin (PEC), and chitosan (CH), as carriers separately or in mixtures of different ratios of 1 : 3, 1 : 1, and 3 : 1. Liquisolid systems that displayed promising results concerning drug release rate in both pH 1.2 and pH 6.8 were compressed into tablets after the addition of the calculated amount of E-RL 100 and lubrication with magnesium stearate and talc in the ratio of 1 : 9. It was found that E-RL 100 improved the flowability and compressibility of all LS formulations. The release data revealed that all formulations succeeded in sustaining drug release over a period of 24 hours. The stability study indicated that the PEC-based LS system as well as its matrix tablets was stable over the period of storage (one year) and could provide a minimum shelf life of two years. PMID:24971345

  12. Formulation of indomethacin colon targeted delivery systems using polysaccharides as carriers by applying liquisolid technique.

    PubMed

    Elkhodairy, Kadria A; Elsaghir, Hanna A; Al-Subayiel, Amal M

    2014-01-01

    The present study aimed at the formulation of matrix tablets for a colon-specific drug delivery (CSDD) system of indomethacin (IDM) by applying the liquisolid (LS) technique. A CSDD system based on time-dependent polymethacrylates and enzyme degradable polysaccharides was established. Eudragit RL 100 (E-RL 100) was employed as the time-dependent polymer, whereas bacterial degradable polysaccharides were presented as LS systems loaded with the drug. Indomethacin-loaded LS systems were prepared using different polysaccharides, namely, guar gum (GG), pectin (PEC), and chitosan (CH), as carriers separately or in mixtures of different ratios of 1:3, 1:1, and 3:1. Liquisolid systems that displayed promising results concerning drug release rate in both pH 1.2 and pH 6.8 were compressed into tablets after the addition of the calculated amount of E-RL 100 and lubrication with magnesium stearate and talc in the ratio of 1:9. It was found that E-RL 100 improved the flowability and compressibility of all LS formulations. The release data revealed that all formulations succeeded in sustaining drug release over a period of 24 hours. The stability study indicated that the PEC-based LS system as well as its matrix tablets was stable over the period of storage (one year) and could provide a minimum shelf life of two years. PMID:24971345

  13. Dynamic sensitivity analysis of biological systems

    PubMed Central

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2008-01-01

    Background A mathematical model to understand, predict, control, or even design a real biological system is a central theme in systems biology. A dynamic biological system is always modeled as a nonlinear ordinary differential equation (ODE) system. How to simulate the dynamic behavior and dynamic parameter sensitivities of systems described by ODEs efficiently and accurately is a critical task. In many practical applications, e.g., fed-batch fermentation systems, the admissible system input (corresponding to independent variables of the system) can be time-dependent. The main difficulty in investigating the dynamic log gains of these systems is the infinite dimension due to the time-dependent input. Classical dynamic sensitivity analysis does not take this case into account for the dynamic log gains. Results We present an algorithm with adaptive step size control that can be used for computing the solution and dynamic sensitivities of an autonomous ODE system simultaneously. Although our algorithm is one of the decoupled direct methods for computing dynamic sensitivities of an ODE system, the step size determined by the model equations can be used in the computations of the time profile and dynamic sensitivities with moderate accuracy even when the sensitivity equations are stiffer than the model equations. To show that this algorithm can perform dynamic sensitivity analysis on very stiff ODE systems with moderate accuracy, it is implemented and applied to two sets of chemical reactions: pyrolysis of ethane and oxidation of formaldehyde. The accuracy of this algorithm is demonstrated by comparing the dynamic parameter sensitivities obtained from this new algorithm and from the direct method with the Rosenbrock stiff integrator based on the indirect method. The same dynamic sensitivity analysis was performed on an ethanol fed-batch fermentation system with a time-varying feed rate to evaluate the applicability of the algorithm to realistic models with time
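    The idea of integrating model and sensitivity equations together can be shown on a one-parameter linear ODE with a known analytic sensitivity. This is an illustrative fixed-step example, not the paper's adaptive-step algorithm:

```python
import numpy as np

# Forward sensitivity: for dx/dt = -k*x, the sensitivity s = dx/dk obeys
# ds/dt = -x - k*s (differentiate the model equation with respect to k).
# With x(0)=x0, s(0)=0 the exact solution is s(t) = -t * x0 * exp(-k*t),
# which lets us check the coupled integration.
def f(state, k):
    x, s = state
    return np.array([-k * x, -x - k * s])

def integrate(x0, k, dt=0.001, t_end=2.0):
    state = np.array([x0, 0.0])
    for _ in range(int(t_end / dt)):        # classical RK4 on the pair (x, s)
        k1 = f(state, k)
        k2 = f(state + 0.5 * dt * k1, k)
        k3 = f(state + 0.5 * dt * k2, k)
        k4 = f(state + dt * k3, k)
        state = state + (dt / 6) * (k1 + 2*k2 + 2*k3 + k4)
    return state

x, s = integrate(x0=1.0, k=0.8)
exact = -2.0 * 1.0 * np.exp(-0.8 * 2.0)     # analytic sensitivity at t = 2
```

Decoupled direct methods generalize exactly this construction: the sensitivity system shares the model's Jacobian, so solution and sensitivities can be advanced with one shared step-size sequence.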

  14. Solar load ratio method applied to commercial building active solar system sizing

    SciTech Connect

    Schnurr, N.M.; Hunn, B.D.; Williamson, K.D. III

    1981-01-01

    The hourly simulation procedure uses the DOE-2 building energy analysis computer program. It is capable of calculating the loads and of simulating various control strategies in detail for both residential and commercial buildings, yet is computationally efficient enough to be used for extensive parametric studies. In addition to a Building Service Hot Water (BSHW) system and a combined space heating and hot water system using liquid collectors for a commercial building analyzed previously, a space heating system using an air collector is analyzed. A series of runs is made for systems using evacuated-tube collectors for comparison to flat-plate collectors, and the effects of additional system design parameters are investigated. Also, the generic collector types are characterized by standard efficiency curves, rather than by detailed collector specifications. (MHR)

  15. Cost effectiveness as applied to the Viking Lander systems-level thermal development test program

    NASA Technical Reports Server (NTRS)

    Buna, T.; Shupert, T. C.

    1974-01-01

    The economic aspects of thermal testing at the systems-level as applied to the Viking Lander Capsule thermal development program are reviewed. The unique mission profile and pioneering scientific goals of Viking imposed novel requirements on testing, including the development of a simulation technique for the Martian thermal environment. The selected approach included modifications of an existing conventional thermal vacuum facility, and improved test-operational techniques that are applicable to the simulation of the other mission phases as well, thereby contributing significantly to the cost effectiveness of the overall thermal test program.

  16. Cluster analysis applied to the spatial and temporal variability of monthly rainfall in Mato Grosso do Sul State, Brazil

    NASA Astrophysics Data System (ADS)

    Teodoro, Paulo Eduardo; de Oliveira-Júnior, José Francisco; da Cunha, Elias Rodrigues; Correa, Caio Cezar Guedes; Torres, Francisco Eduardo; Bacani, Vitor Matheus; Gois, Givanildo; Ribeiro, Larissa Pereira

    2016-04-01

    The State of Mato Grosso do Sul (MS), located in the Brazilian Midwest, lacks climatological studies, mainly in the characterization of the rainfall regime and of the meteorological systems that produce or inhibit rain. The state has different soil and climatic characteristics distributed among three biomes: Cerrado, Atlantic Forest and Pantanal. This study aimed to apply cluster analysis using Ward's algorithm and to identify the meteorological systems that affect the rainfall regime in these biomes. The rainfall data of 32 stations (sites) of the MS State were obtained from the Agência Nacional de Águas (ANA) database, collected from 1954 to 2013. The average of each of the 384 monthly rainfall time series was calculated, and Ward's algorithm was applied to identify the spatial and temporal variability of rainfall. Bartlett's test revealed homogeneous variance at all sites only in January. The run test showed no increasing or decreasing trend in monthly rainfall. Cluster analysis identified five homogeneous rainfall regions in the MS State, together with three seasons (rainy, transitional and dry). The rainy season occurs during the months of November, December, January, February and March. The transitional season covers the months of April and May, September and October. The dry season occurs in June, July and August. The groups G1, G4 and G5 are influenced by the South Atlantic Subtropical Anticyclone (SASA), the Chaco Low (CL), the Bolivian High (BH), the Low-Level Jet (LLJ), the South Atlantic Convergence Zone (SACZ) and the Madden-Julian Oscillation (MJO). Group G2 is influenced by the Upper Tropospheric Cyclonic Vortex (UTCV) and Frontal Systems (FS). Group G3 is affected by the UTCV, FS and SACZ. The interaction of the meteorological systems operating in each biome, together with the altitude, causes the spatial and temporal diversity of rainfall in the MS State.
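    The grouping step can be sketched with Ward's algorithm on synthetic station profiles, using SciPy's implementation (the numbers below are made up, not the ANA records):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Each row is one station's 12 monthly rainfall averages (synthetic data:
# five stations with a wet regime, five with a dry one).
rng = np.random.default_rng(0)
wet = rng.normal(200, 10, size=(5, 12))
dry = rng.normal(80, 10, size=(5, 12))
stations = np.vstack([wet, dry])

# Ward's method merges the pair of clusters whose union has the smallest
# increase in within-cluster variance; cutting the tree gives the regions.
tree = linkage(stations, method="ward")
groups = fcluster(tree, t=2, criterion="maxclust")
```

Cutting the same tree at five clusters on the real 32-station data would yield the five homogeneous rainfall regions the abstract reports.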

  17. Common reduced spaces of representation applied to multispectral texture analysis in cosmetology

    NASA Astrophysics Data System (ADS)

    Corvo, Joris; Angulo, Jesus; Breugnot, Josselin; Borbes, Sylvie; Closs, Brigitte

    2016-03-01

    Principal Component Analysis (PCA) is a technique of multivariate data analysis widely used in fields such as biology, ecology or economics to reduce data dimensionality while retaining the most important information. It is becoming standard practice in multispectral/hyperspectral imaging, since such multivariate data generally suffer from a high level of redundancy. Nevertheless, by definition, PCA is meant to be applied to a single multispectral/hyperspectral image at a time. When several images have to be treated, running a PCA on each image would generate image-specific reduced spaces, which is not suitable for comparison between results. Thus, we focus on two PCA-based algorithms that define common reduced spaces of representation. The first method arises from the literature and is computed with the barycenter covariance matrix. The second algorithm is designed with the idea of correcting standard PCA using permutations and inversions of eigenvectors. These dimensionality reduction methods are used within the context of a cosmetological study of a foundation make-up. The available data are in-vivo multispectral images of skin acquired on different volunteers in time series. The main purpose of this study is to characterize the make-up degradation, especially in terms of texture analysis. The results are validated by statistical prediction of the time elapsed since the product was applied. The PCA algorithms produce eigenimages that separately enhance skin components (pores, radiance, vessels...). From these eigenimages, we extract morphological texture descriptors and attempt a time prediction. The accuracy obtained with common reduced spaces outperforms that of classical PCA. In this paper, we detail how PCA is extended to the multiple-group case and explain the advantages of common reduced spaces when studying several multispectral images.
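    The first method can be sketched directly: average the per-image covariance matrices and diagonalize the barycenter, so every image is projected onto the same eigenvectors (synthetic data; the study's actual pipeline is not reproduced):

```python
import numpy as np

def common_pca(images, n_components=2):
    """images: list of (pixels x bands) arrays sharing the same bands."""
    covs = [np.cov(img, rowvar=False) for img in images]
    barycenter = np.mean(covs, axis=0)          # average covariance matrix
    eigvals, eigvecs = np.linalg.eigh(barycenter)
    order = np.argsort(eigvals)[::-1]           # largest variance first
    return eigvecs[:, order[:n_components]]

rng = np.random.default_rng(0)
# Three synthetic 6-band "images", 500 pixels each, with correlated bands
images = [rng.standard_normal((500, 6)) @ rng.standard_normal((6, 6))
          for _ in range(3)]
basis = common_pca(images)
reduced = [(img - img.mean(axis=0)) @ basis for img in images]
```

Because all images share the single basis, eigenimage k means the same linear combination of bands in every acquisition, which is what makes the texture descriptors comparable across volunteers and time points.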

  18. Comprehensive Mechanisms for Combustion Chemistry: An Experimental and Numerical Study with Emphasis on Applied Sensitivity Analysis

    SciTech Connect

    Dryer, Frederick L.

    2009-04-10

This project was an integrated experimental/numerical effort to study pyrolysis and oxidation reactions and mechanisms for small-molecule hydrocarbon structures under conditions representative of combustion environments. The experimental aspects of the work were conducted in large-diameter flow reactors, at 0.3 to 18 atm pressure, 500 to 1100 K temperature, and 10^-2 to 2 seconds reaction time. Experiments were also conducted to determine reference laminar flame speeds using a premixed laminar stagnation flame experiment and particle image velocimetry, as well as pressurized bomb experiments. Flow reactor data for oxidation experiments include: (1) adiabatic/isothermal species time-histories of a reaction at fixed initial pressure, temperature, and composition, to determine the species present after a fixed reaction time; (2) species distributions with varying initial reaction temperature; (3) perturbations of well-defined reaction systems (e.g., CO/H2/O2 or H2/O2) by the addition of small amounts of an additive species. Radical scavenging techniques are applied to determine unimolecular decomposition rates from pyrolysis experiments. Laminar flame speed measurements are determined as a function of equivalence ratio, dilution, and unburned gas temperature at 1 atm pressure. Hierarchical, comprehensive mechanistic construction methods were applied to develop detailed kinetic mechanisms that describe the measurements and literature kinetic data. Modeling using well-defined and validated mechanisms for the CO/H2/oxidant systems, together with perturbations of oxidation experiments by small amounts of additives, was also used to derive absolute reaction rates and to investigate the compatibility of published elementary kinetic and thermochemical information. Numerical tools were developed and applied to assess the importance of individual elementary reactions to the predictive performance of the
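The sensitivity analysis emphasized in the title can be illustrated with a brute-force sketch: perturb each rate constant slightly and measure the relative change in a model output. The two-step first-order mechanism (A → B → C) and the rate values below are hypothetical stand-ins, not the project's actual kinetics.

```python
import numpy as np

def concentration_B(k, t=1.0, c0=1.0):
    """[B](t) for sequential first-order steps A -> B -> C (series solution)."""
    k1, k2 = k
    return c0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

def normalized_sensitivities(k, rel=1e-6):
    """Normalized local sensitivity coefficients S_i = (k_i / y) * dy/dk_i,
    estimated by a one-sided finite-difference perturbation of each rate."""
    y0 = concentration_B(k)
    S = []
    for i in range(len(k)):
        kp = k.copy()
        kp[i] *= (1.0 + rel)
        S.append((concentration_B(kp) - y0) / (rel * y0))
    return np.array(S)
```

Ranking reactions by |S_i| identifies the elementary steps that dominate the predictive performance of a mechanism, which is the role such tools play in mechanism validation.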

  19. A low cost concept for data acquisition systems applied to decentralized renewable energy plants.

    PubMed

    Jucá, Sandro C S; Carvalho, Paulo C M; Brito, Fábio T

    2011-01-01

The present paper describes experiences with the use of monitoring and data acquisition systems (DAS) and proposes a new concept of a low cost DAS with a USB interface applied to decentralized renewable energy (RE) plants. The use of such systems contributes to the dissemination of these plants by recognizing local energy resources in real time, monitoring energy conversion efficiency and sending information concerning failures. These aspects are important, mainly for developing countries, where decentralized power plants based on renewable sources are in some cases the best option for supplying electricity to rural areas. Nevertheless, the cost of commercial DAS is still a barrier to a greater dissemination of such systems in developing countries. The proposed USB-based DAS presents a new dual-clock operation philosophy, in which the acquisition system contains two clock sources for parallel processing of information from different communication protocols. To ensure the low cost of the DAS and to promote the dissemination of this technology in developing countries, the proposed data acquisition firmware and the software for programming USB microcontrollers are free and open source software, executable on the Linux and Windows® operating systems.
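On the host side, a DAS like this typically streams delimited text records that must be parsed before efficiency monitoring or failure detection can happen. The record layout below (`timestamp;voltage;current`) is a hypothetical example for illustration, not the format of the authors' firmware.

```python
def parse_das_record(line):
    """Parse one hypothetical DAS record of the form 'timestamp;voltage;current'.

    Returns a dict with the raw fields converted to numbers plus the derived
    instantaneous power, the quantity of interest for conversion-efficiency
    monitoring of an RE plant.
    """
    ts, volts, amps = line.strip().split(";")
    voltage = float(volts)
    current = float(amps)
    return {
        "t": int(ts),                 # device timestamp (arbitrary units)
        "voltage_V": voltage,
        "current_A": current,
        "power_W": voltage * current, # derived, not transmitted
    }
```

Keeping the wire format this simple is one way a low-cost microcontroller firmware stays small while remaining readable by any host language over a USB serial link.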
