Science.gov

Sample records for applied systems analysis

  1. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and is applied to a vehicle as a case study. A previous study emphasized the derivation of the multibody dynamics equations of motion for the bogie [2]. In this work, we have developed a guideline for analyzing the dynamical behavior of multibody systems, mainly for validation and verification of the realistic mathematical model, and partly for the design of alternative optimum vehicle parameters.

  2. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  3. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  4. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  5. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  6. Applying an Activity System to Online Collaborative Group Work Analysis

    ERIC Educational Resources Information Center

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  7. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
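    The modeling step described above can be illustrated with a rough sketch: fitting a second-order linear differential equation to a synthetic time series via finite-difference derivative estimates. The data, the shortcut fitting method, and the parameter names `eta`/`zeta` are all invented for illustration; the authors' actual procedure additionally involves time-delay embedding and B-spline imputation.

```python
import numpy as np

# Illustrative sketch (not the study's code): fit a second-order
# linear ODE  x''(t) = eta * x(t) + zeta * x'(t)  to a time series
# by regressing finite-difference derivative estimates.

def fit_second_order_ode(x, dt=1.0):
    """Estimate eta (frequency) and zeta (damping) by least squares."""
    x = np.asarray(x, dtype=float)
    x1 = np.gradient(x, dt)       # first-derivative estimate
    x2 = np.gradient(x1, dt)      # second-derivative estimate
    A = np.column_stack([x, x1])
    coef, *_ = np.linalg.lstsq(A, x2, rcond=None)
    eta, zeta = coef
    return eta, zeta

# Synthetic weekly series: a lightly damped oscillation
t = np.arange(0, 100, 1.0)
x = np.exp(-0.01 * t) * np.cos(0.5 * t)
eta, zeta = fit_second_order_ode(x)
# a negative eta corresponds to cyclical behaviour, as in the study
```

A significant negative `eta` is the analogue of the negative frequency parameter the abstract reports; `zeta` captures the damping toward equilibrium.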

  8. Experiences with Probabilistic Analysis Applied to Controlled Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Giesy, Daniel P.

    2004-01-01

    This paper presents a semi-analytic method for computing frequency dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or with multiple highly correlated uncertain parameters. The approach will be shown to not suffer from the same computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency domain performance of an optimal feed-forward disturbance rejection scheme.

  9. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  10. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  11. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
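    The gradient-generation idea behind SSA can be sketched with the Global Sensitivity Equations on a made-up two-discipline system: each discipline supplies local partial derivatives, and the coupled total derivatives follow from one linear solve. The coupling coefficients below are invented; the real application chains propulsion, performance, and weights-and-sizing codes.

```python
import numpy as np

# Hypothetical sketch: coupled analyses y1 = f1(x, y2), y2 = f2(x, y1).
# Local partials (assumed supplied by each discipline's analysis code):
df1_dx, df1_dy2 = 2.0, 0.5    # y1 = 2*x + 0.5*y2
df2_dx, df2_dy1 = 1.0, 0.25   # y2 = x + 0.25*y1

# Global Sensitivity Equations: (I - J) * dY/dx = dF/dx,
# where J holds the cross-coupling partials.
J = np.array([[0.0,     df1_dy2],
              [df2_dy1, 0.0]])
dF_dx = np.array([df1_dx, df2_dx])
dY_dx = np.linalg.solve(np.eye(2) - J, dF_dx)
# dY_dx[0] is dy1/dx, dY_dx[1] is dy2/dx, with coupling accounted for
```

Because the linear couplings here are explicit, the result can be checked by solving the two equations directly, which is what makes a toy example like this useful for validating a 'manual' SSA setup.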

  12. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
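    The crisp-set QCA mechanics described above reduce, at their core, to coding each case 0/1 on a set of conditions and the outcome, collapsing cases into a truth table, and reading off configurations consistently linked to the outcome. The case data below are invented and far simpler than the study's 19-system data set; the condition names echo its findings only loosely.

```python
from collections import defaultdict

# Invented toy cases: (automated, multi_syndrome) -> timely_awareness
cases = [
    ((1, 1), 1),
    ((1, 1), 1),
    ((1, 0), 0),
    ((0, 1), 0),
    ((0, 0), 0),
]

# Truth table: group cases by configuration of conditions
rows = defaultdict(list)
for config, outcome in cases:
    rows[config].append(outcome)

# Configurations whose cases all show the outcome (consistency = 1.0)
sufficient = [c for c, outs in rows.items() if all(outs)]
# here, only the automated + multi-syndrome configuration qualifies
```

Real csQCA additionally minimizes the sufficient configurations Boolean-algebraically (e.g., with the Quine-McCluskey procedure), a step omitted from this sketch.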

  13. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  14. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
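    The two uncertainty techniques named above, first order error analysis and Monte Carlo simulation, can be contrasted on a toy model. Everything below is invented for illustration and unrelated to WEPP or RZWQM.

```python
import numpy as np

# Toy model y = a * exp(b) with independent, normally distributed inputs
rng = np.random.default_rng(0)
mu_a, sd_a = 2.0, 0.1
mu_b, sd_b = 1.0, 0.05

# FOA: propagate variance through first-order Taylor terms at the mean
dy_da = np.exp(mu_b)            # partial derivative wrt a
dy_db = mu_a * np.exp(mu_b)     # partial derivative wrt b
var_foa = dy_da**2 * sd_a**2 + dy_db**2 * sd_b**2

# Monte Carlo: sample inputs and propagate through the full model
a = rng.normal(mu_a, sd_a, 100_000)
b = rng.normal(mu_b, sd_b, 100_000)
var_mc = np.var(a * np.exp(b))

# for this mildly nonlinear model the two variance estimates agree closely;
# FOA degrades as nonlinearity or input uncertainty grows
```

(A Latin hypercube design, as mentioned in the abstract, would replace the plain random sampling to reduce the number of model runs needed; plain sampling keeps the sketch short.)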

  15. RFI analysis applied to the TDRSS system. [Tracking and Data Relay Satellite System performance

    NASA Technical Reports Server (NTRS)

    Jenny, J. A.

    1973-01-01

    The effect of radio frequency interference (RFI) on the proposed Tracking and Data Relay Satellite System (TDRSS) was assessed. The method of assessing RFI was to create a discrete emitter listing containing all the required parameters of transmitters in the applicable VHF and UHF frequency bands. The transmitter and spacecraft receiver characteristics were used to calculate the RFI contribution due to each emitter. The individual contributions were summed to obtain the total impact in the operational bandwidth. Using an as yet incomplete emitter base, it is concluded that the 136- to 137-MHz band should be used by TDRSS rather than the whole 136- to 138-MHz band because of the higher interference levels in the 137- to 138 MHz band. Even when restricting the link to 136 to 137 MHz, the existing link design is marginal, and it is recommended that interference reduction units, such as the adaptive digital filter, be incorporated in the TDRSS ground station.
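    The aggregation step described above, summing individual emitter contributions into a total in-band interference level, amounts to converting per-emitter received powers to linear units, adding them, and converting back. A minimal sketch with made-up received-power values:

```python
import math

def total_interference_dbm(contributions_dbm):
    """Sum per-emitter received powers (dBm) into a total (dBm)."""
    total_mw = sum(10 ** (p / 10.0) for p in contributions_dbm)
    return 10.0 * math.log10(total_mw)

# two equal emitters raise the total by about 3 dB
combined = total_interference_dbm([-100.0, -100.0])
assert abs(combined - (-97.0)) < 0.02
```

The same arithmetic explains the band comparison in the abstract: a band containing more or stronger emitters accumulates a higher total, so restricting TDRSS to the quieter 136- to 137-MHz portion lowers the summed interference.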

  16. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  17. Model-Based Systems Engineering With the Architecture Analysis and Design Language (AADL) Applied to NASA Mission Operations

    NASA Technical Reports Server (NTRS)

    Munoz Fernandez, Michela Miche

    2014-01-01

    The potential of Model-Based Systems Engineering (MBSE) using the Architecture Analysis and Design Language (AADL) applied to space systems will be described. AADL modeling is applicable to real-time embedded systems, the types of systems NASA builds. A case study with the Juno mission to Jupiter showcases how this work would enable future missions to benefit from using these models throughout their life cycle from design to flight operations.

  18. Applied Behavior Analysis in Education.

    ERIC Educational Resources Information Center

    Cooper, John O.

    1982-01-01

    Applied behavior analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the areas of systematic application, direct and daily measurement, and experimental methodology. (CJ)

  19. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

    Numerical models are of great help for predicting water fluxes in the vadose zone, and more specifically in Soil-Vegetation-Atmosphere (SVA) systems. For such simulations, robust models and representative soil hydraulic parameters are required. Calibration of unsaturated hydraulic properties is known to be a difficult optimization problem due to the high non-linearity of the water flow equations. Therefore, robust methods are needed to prevent the optimization process from converging on non-optimal parameters. Evolutionary algorithms, and specifically genetic algorithms (GAs), are very well suited to such complex parameter optimization problems. Additionally, GAs offer the opportunity to assess the confidence in the hydraulic parameter estimates because of the large number of model realizations. The SVA system in this study concerns a pine stand on a heterogeneous sandy soil (podzol) in the Campine region in the north of Belgium. Throughfall, other meteorological data, and water contents at different soil depths were recorded in two lysimeters over one year at a daily time step. The water table level, which varies between 95 and 170 cm, was recorded at 0.5-hour intervals. The leaf area index was also measured at selected times during the year in order to evaluate the energy reaching the soil and to deduce the potential evaporation. Based on the profile description, five soil layers were distinguished in the podzol. Two models were used for simulating water fluxes: (i) a mechanistic model, HYDRUS-1D, which solves the Richards equation, and (ii) a compartmental model, which treats the soil profile as a bucket into which water flows until its maximum capacity is reached. A global sensitivity analysis (Morris' one-at-a-time sensitivity analysis) was run prior to calibration in order to check the sensitivity within the chosen parameter search space.
…
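    Morris' one-at-a-time screening, mentioned at the end of the abstract, can be sketched on an invented two-parameter toy model (the actual study screens HYDRUS-1D soil-hydraulic parameters): elementary effects are finite differences taken along random one-factor-at-a-time perturbations, and their mean absolute value ranks parameter influence.

```python
import numpy as np

def morris_elementary_effects(model, n_traj=50, delta=0.1, seed=1):
    """Mean absolute elementary effect per parameter (2-parameter toy)."""
    rng = np.random.default_rng(seed)
    effects = [[], []]
    for _ in range(n_traj):
        x = rng.uniform(0, 1 - delta, size=2)   # random base point in [0,1]^2
        for i in range(2):                      # perturb one factor at a time
            x_step = x.copy()
            x_step[i] += delta
            effects[i].append((model(x_step) - model(x)) / delta)
    return [float(np.mean(np.abs(e))) for e in effects]

# toy model: strongly sensitive to x[0], weakly to x[1]
mu = morris_elementary_effects(lambda x: 10 * x[0] + 0.1 * x[1] ** 2)
# mu[0] dominates mu[1], flagging x[0] as the influential parameter
```

In a calibration context like the one above, parameters with small elementary effects can be fixed, shrinking the search space handed to the genetic algorithm.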

  20. Morphological analysis of galvanized coating applied under vibrowave process system conditions

    NASA Astrophysics Data System (ADS)

    Lebedev, V. A.; Ivanov, V. V.; Fedorov, V. P.

    2016-04-01

    The article presents the morphological research results of galvanized coating applied to the metal surface in the course of mechanical and chemical synthesis realized under vibrowave process system conditions. The paper reveals the specifics of the coating morphology, its activating role in free-moving indentors formed under the impact of low-frequency vibrations and its positive influence on the operational performance of the part surface layer. The advantages of this galvanized coating application method are presented in comparison with conventional methods.

  1. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose outputs, though precise, are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  2. An image-processing system, motion analysis oriented (IPS-100), applied to microscopy.

    PubMed

    Gualtieri, P; Coltelli, P

    1991-09-01

    This paper describes a real-time video image processing system, suitable for image analysis of stationary and moving images. It consists of a high-quality microscope, a general-purpose personal computer, a commercially available image-processing hardware module plugged into the computer bus, a b/w TV-camera, video monitors and a software package. The structure and the capability of this system are explained. The software is menu-driven and performs real-time image enhancements, real-time mathematical and morphological filters, image segmentation and labelling, real-time identification of moving objects, and real-time analysis of their movements. The program is available in listing form. PMID:1760921

  3. Energy analysis of facade-integrated photovoltaic systems applied to UAE commercial buildings

    SciTech Connect

    Radhi, Hassan

    2010-12-15

    Developments in the design and manufacture of photovoltaic cells have recently attracted growing interest in the UAE. At present, the embodied energy pay-back time (EPBT) is the criterion used for comparing the viability of such technology against other forms. However, the impact of PV technology on the thermal performance of buildings is not considered at the time of EPBT estimation. If additional energy savings gained over the PV system life are also included, the total EPBT could be shorter. This paper explores the variation of the total energy of building integrated photovoltaic systems (BiPV) as a wall cladding system applied to the UAE commercial sector and shows that the ratio between PV output and saving in energy due to PV panels is within the range of 1:3-1:4. The result indicates that for the southern and western facades in the UAE, the embodied energy pay-back time for the photovoltaic system is within the range of 12-13 years. When reductions in operational energy are considered, the pay-back time is reduced to 3.0-3.2 years. This study comes to the conclusion that the reduction in operational energy due to PV panels represents an important factor in the estimation of EPBT. (author)
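    The pay-back arithmetic behind the abstract's conclusion is straightforward: EPBT is embodied energy divided by annual energy benefit, so counting the facade's operational savings alongside PV output shrinks the denominator's reciprocal effect dramatically. The figures below are invented round numbers chosen only to reproduce the qualitative effect, not the study's data.

```python
# Assumed, illustrative values (per m^2 of BiPV facade cladding)
embodied_energy = 5000.0   # kWh embodied in manufacture and installation
pv_output = 400.0          # kWh/year of electricity generated
thermal_saving = 1200.0    # kWh/year of operational (cooling) savings

# EPBT counting PV output alone vs. output plus operational savings
epbt_output_only = embodied_energy / pv_output
epbt_with_savings = embodied_energy / (pv_output + thermal_saving)

# the 1:3 output-to-saving ratio cuts the pay-back time roughly fourfold,
# mirroring the abstract's 12-13 year vs. 3.0-3.2 year comparison
```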

  4. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  5. Analysis of Godunov type schemes applied to the compressible Euler system at low Mach number

    NASA Astrophysics Data System (ADS)

    Dellacherie, Stéphane

    2010-02-01

    We propose a theoretical framework to clearly explain the inaccuracy of Godunov type schemes applied to the compressible Euler system at low Mach number on a Cartesian mesh. In particular, we clearly explain why this inaccuracy problem concerns the 2D or 3D geometry and does not concern the 1D geometry. The theoretical arguments are based on the Hodge decomposition, on the fact that an appropriate well-prepared subspace is invariant for the linear wave equation, and on the notion of first-order modified equation. This theoretical approach allows us to propose a simple modification that can be applied to any colocated scheme, of Godunov type or not, in order to define a large class of colocated schemes accurate at low Mach number on any mesh. It also allows us to justify colocated schemes that are accurate at low Mach number, such as the Roe-Turkel and AUSM+-up schemes, and to find a link with a colocated incompressible scheme stabilized with a Brezzi-Pitkäranta type stabilization. Numerical results justify the theoretical arguments proposed in this paper.

  6. Development and analysis of a resource-aware power management system as applied to small spacecraft

    NASA Astrophysics Data System (ADS)

    Shriver, Patrick

    In this thesis, an overall framework and solution method for managing the limited power resources of a small spacecraft is presented. Analogous to mobile computing technology, a primary limiting factor is the available power resources. In spite of the millions of dollars budgeted for research and development over decades, improvements in battery efficiency remain modest. This situation is exacerbated by advances in payload technology that lead to increasingly power-hungry and data-intensive instruments. The challenge for the small spacecraft is to maximize capabilities and performance while meeting difficult design requirements and small project budgets. Power management is sought as a solution that can be applied with an existing generation of batteries. Ultimately, the power management problem is one of optimizing system performance and lifetime while maintaining safe operating conditions. This problem is formulated as a constrained, multi-objective combinatorial optimization problem. The problem is argued to be computationally intractable, and a formal proof of optimal substructure is presented. A multi-agent solution paradigm is developed that implements Dynamic Programming and Compromise Programming solutions. A high-level, "black box" software simulation of a typical power system is used to evaluate the developed method. The parameters used in simulation are taken from existing satellite designs. Compared to a traditional spacecraft operations approach, the developed method is shown to be useful in maximizing the utility of the spacecraft. As a battery ages, the method also has an increasing benefit in minimizing mission risk.
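    One common way to cast this kind of constrained task selection as dynamic programming, exploiting exactly the optimal-substructure property the abstract mentions, is a 0/1 knapsack over the energy budget. The task data below are invented, and this single-objective sketch is far simpler than the thesis' multi-objective formulation:

```python
def max_utility(tasks, budget):
    """tasks: list of (energy_cost, utility) pairs; budget: integer Wh.
    Classic 0/1 knapsack DP: best[b] is the top utility within budget b."""
    best = [0] * (budget + 1)
    for cost, utility in tasks:
        # iterate budgets downward so each task is used at most once
        for b in range(budget, cost - 1, -1):
            best[b] = max(best[b], best[b - cost] + utility)
    return best[budget]

# invented spacecraft tasks: (energy in Wh, science utility score)
tasks = [(3, 5), (4, 6), (2, 3)]
# with a 7 Wh budget, the optimal plan takes the 3 Wh and 4 Wh tasks
assert max_utility(tasks, 7) == 11
```

The optimal-substructure argument is visible in the recurrence: the best plan for budget `b` either skips a task or includes it and is optimal for the remaining `b - cost`.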

  7. Underdetermined system theory applied to qualitative analysis of response caused by attenuating plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1989-05-01

    A qualitative analysis of the mechanical response of rate-dependent media caused by a one-dimensional plane smooth wave front and by a continuous wave front attenuating in the media is performed using an underdetermined system of nonlinear partial differential equations. The analysis reveals that the smooth strain, particle velocity, and stress profiles of the smooth wave front are not similar, and that the wave front is composed of several partial waves having different properties. Each property is represented by a set of strain rate, acceleration, and stress rate. The wave front derived here from the analysis is composed of four different partial waves. The front of the wave front is necessarily a contraction wave in which strain, particle velocity, and stress increase with time, while the rear is a rarefaction wave in which they all decrease with time. Between these two wave fronts there are two remaining wave fronts. We call these wave fronts mesocontraction waves I and II. Wave front I is a wave in which stress decreases notwithstanding the increase in strain and particle velocity with time; it is followed by the other, i.e., wave front II, where, with time, particle velocity and stress decrease in spite of the increase in strain. The continuous wave front having continuous and nonsmooth profiles of strain, particle velocity, and stress can also be composed of four waves. These waves possess the same properties as the corresponding waves in the smooth wave front mentioned above. The velocities at the three boundaries between the waves are discontinuous. Therefore, these four wave fronts are independent waves, just as a shock wave and a rarefaction wave are. Specifically, the front wave, i.e., a contraction wave front, is being outrun by a second wave front, the second is being outrun by a third wave front, and the third is being outrun by a fourth wave front, i.e., a rarefaction wave. We call the second wave front degenerate contraction wave I.
We also call the third

  8. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Pickl, Stefan

    2010-11-01

The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. These effects can be examined with a System Dynamics approach. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not easy to find a balance between economic growth and emission reduction. Our System Dynamics model simulation suggests that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
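The time-discrete "cap-and-trade" mechanism can be illustrated with a toy dynamic; this is a minimal sketch with invented coefficients, not the authors' System Dynamics model:

```python
# Toy time-discrete cap-and-trade dynamic (illustrative only, not the
# authors' System Dynamics model): the permit price tracks excess emissions
# over the cap, and price-driven abatement pulls emissions back to the cap.
def simulate_cap_and_trade(cap, e0, price_coef=1.0, abate_coef=0.05, steps=200):
    emissions = e0
    price = 0.0
    for _ in range(steps):
        price = price_coef * max(0.0, emissions - cap)  # price from excess demand
        emissions -= abate_coef * price                  # abatement response
    return emissions, price

e_final, p_final = simulate_cap_and_trade(cap=80.0, e0=100.0)
# Excess emissions shrink geometrically toward the cap, and the price decays with them.
```

With these coefficients the excess over the cap is multiplied by 0.95 each step, so emissions converge to the cap; richer models would add the growth-versus-reduction trade-off the abstract describes.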

  9. Hydrogeochemistry and statistical analysis applied to understand fluoride provenance in the Guarani Aquifer System, Southern Brazil.

    PubMed

    Marimon, Maria Paula C; Roisenberg, Ari; Suhogusoff, Alexandra V; Viero, Antonio Pedro

    2013-06-01

High fluoride concentrations (up to 11 mg/L) have been reported in the groundwater of the Guarani Aquifer System (Santa Maria Formation) in the central region of the state of Rio Grande do Sul, Southern Brazil. In this area, dental fluorosis is an endemic disease. This paper presents the geochemical data and combines statistical analysis (principal component and cluster analyses) with geochemical modeling to characterize the hydrogeochemistry of the groundwater, and discusses the possible origin of the fluoride. The groundwater of the Santa Maria Formation comprises four different geochemical groups. The first group corresponds to sodium chloride groundwater, which evolves into the second, sodium bicarbonate groundwater; both contain fluoride anomalies. The third group is represented by calcium bicarbonate groundwater, and in the fourth, magnesium is the distinctive parameter. The statistical and geochemical analyses, supported by isotopic measurements, indicated that the groundwater may have originated from mixtures with deeper aquifers and that the fluoride concentrations could derive from rock/water interactions (e.g., desorption from clay minerals). PMID:23149723
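The principal component step of such an analysis can be sketched in a few lines; the water-chemistry values below are invented for illustration and only loosely mimic the four facies in the abstract, not the paper's data:

```python
import numpy as np

# Hypothetical samples: columns = Na, Cl, HCO3, Ca, Mg, F (mg/L).
# Values are illustrative only, not the paper's data.
X = np.array([
    [120.0, 150.0,  60.0, 10.0,  5.0, 8.0],   # Na-Cl water, high F
    [110.0, 140.0,  65.0, 12.0,  6.0, 7.5],
    [130.0,  40.0, 220.0, 15.0,  8.0, 6.0],   # Na-HCO3 water, high F
    [125.0,  45.0, 210.0, 14.0,  7.0, 5.5],
    [ 20.0,  15.0, 180.0, 70.0, 10.0, 0.3],   # Ca-HCO3 water, low F
    [ 25.0,  18.0, 190.0, 65.0, 12.0, 0.4],
    [ 18.0,  12.0, 170.0, 30.0, 55.0, 0.2],   # Mg-rich water, low F
    [ 22.0,  14.0, 175.0, 28.0, 60.0, 0.3],
])

# Standardize each variable, then do PCA via SVD of the standardized matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt[:2].T                        # sample scores on PC1, PC2
explained = (S**2 / np.sum(S**2))[:2].sum()   # variance captured by two PCs
```

Plotting the two-component scores (and clustering them) would separate the four hydrochemical groups; the fluoride-anomalous waters fall together in PC space.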

  10. Comparison of complexity measures using two complex system analysis methods applied to the epileptic ECoG

    NASA Astrophysics Data System (ADS)

    Janjarasjitt, Suparerk; Loparo, Kenneth A.

    2013-10-01

Complex system analysis has been widely applied to examine the characteristics of the electroencephalogram (EEG) in health and disease, as well as the dynamics of the brain. In this study, two complexity measures, the correlation dimension and the spectral exponent, are applied to electrocorticogram (ECoG) data from subjects with epilepsy obtained during different states (seizure and non-seizure) and from different brain regions, and the complexities of the data are examined. From the computational results, the spectral exponent obtained from the wavelet-based fractal analysis is observed to provide information complementary to the correlation dimension derived from nonlinear dynamical-systems analysis. ECoG data obtained during seizure activity have smoother temporal patterns and are less complex than data obtained during non-seizure activity. In addition, significant differences between these two ECoG complexity measures exist when applied to ECoG data obtained from different brain regions of subjects with epilepsy.
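A spectral exponent of the kind used above can be sketched simply. The paper uses a wavelet-based fractal analysis; as a simpler stand-in, this fits the slope of log power versus log frequency of an FFT periodogram at the low-frequency end, assuming a 1/f^beta spectrum:

```python
import numpy as np

# Periodogram-slope estimate of the spectral exponent beta (1/f^beta model).
# A simple FFT stand-in for the paper's wavelet-based method.
def spectral_exponent(x):
    n = len(x)
    freqs = np.fft.rfftfreq(n)[1:]                      # drop the DC bin
    power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2
    k = max(8, n // 16)                                 # low-frequency bins only
    slope, _ = np.polyfit(np.log(freqs[:k]), np.log(power[:k] + 1e-30), 1)
    return -slope                                       # beta in 1/f^beta

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)   # flat spectrum, beta near 0
brown = np.cumsum(white)            # integrated noise, beta near 2
```

Smoother signals (like ictal ECoG in the abstract) have steeper spectra and thus larger exponents than irregular ones.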

  11. A novel image processing and measurement system applied to quantitative analysis of simulated tooth root canal shape

    NASA Astrophysics Data System (ADS)

    Yong, Tao; Yong, Wei; Jin, Guofan; Gao, Xuejun

    2005-02-01

Dental pulp is located in the root canal of the tooth. In modern root canal therapy, "root canal preparation" is the main means of debriding dental pulp infection. The shape of the root canal is changed after preparation, so, when assessing preparation instruments and techniques, root canal shaping ability, especially the apical offset, is a very important factor. In this paper, a novel digital image processing and measurement system is designed and applied to quantitative analysis of simulated canal shape. By image pretreatment, feature extraction, registration, and fusion, the variation of the root canals' characteristics (before and after preparation) can be accurately compared and measured, so as to assess the shaping ability of instruments. When the scanning resolution is 1200 dpi or higher, the registration and measurement precision of the system can reach 0.021 mm or better. The performance of the system was tested on a series of simulated root canals and stainless steel K-files.

  12. Matrix effects in applying mono- and polyclonal ELISA systems to the analysis of weathered oils in contaminated soil.

    PubMed

    Pollard, S J T; Farmer, J G; Knight, D M; Young, P J

    2002-01-01

Commercial mono- and polyclonal enzyme-linked immunosorbent assay (ELISA) systems were applied to the on-site analysis of weathered hydrocarbon-contaminated soils at a former integrated steelworks. Comparisons were made between concentrations of solvent extractable matter (SEM) determined gravimetrically by Soxhlet (dichloromethane) extraction and those estimated immunologically by ELISA determination over a concentration range of 2000-330,000 mg SEM/kg soil dry weight. Both ELISA systems under-reported for the more weathered soil samples. Results suggest this is due to matrix effects in the sample rather than any inherent bias in the ELISA systems, and it is concluded that, for weathered hydrocarbons typical of steelworks and coke production sites, the use of ELISA requires careful consideration as a field technique. Consideration of the target analyte relative to the composition of the hydrocarbon waste encountered appears critical. PMID:11858166

  13. Comparing Waste-to-Energy technologies by applying energy system analysis.

    PubMed

    Münster, Marie; Lund, Henrik

    2010-07-01

Even when policies of waste prevention, re-use and recycling are prioritised, a fraction of waste will still be left which can be used for energy recovery. This article asks the question: how can waste best be utilised for energy, seen from an energy system perspective? Eight different Waste-to-Energy technologies are compared with a focus on fuel efficiency, CO(2) reductions and costs. The comparison is carried out by conducting detailed energy system analyses of the present as well as a potential future Danish energy system with a large share of combined heat and power as well as wind power. The study shows the potential of using waste for the production of transport fuels. Biogas and thermal gasification technologies are hence interesting alternatives to waste incineration, and it is recommended to support the use of biogas based on manure and organic waste. It is also recommended to support research into gasification of waste without the addition of coal and biomass. Together the two solutions may provide an alternative use for one third of the waste which is currently incinerated. The remaining fractions should still be incinerated, with priority given to combined heat and power plants with high electric efficiency. PMID:19700298

  14. Advanced Behavioral Applications in Schools: A Review of R. Douglas Greer's "Designing Teaching Strategies: An Applied Behavior Analysis Systems Approach"

    ERIC Educational Resources Information Center

    Moxley, Roy A.

    2004-01-01

    R. Douglas Greer's "Designing Teaching Strategies" is an important book directed to advanced students in applied behavior analysis for classrooms. This review presents some of the striking features of the Comprehensive Applied Behavior Analysis to Schooling (CABAS[R]) program and the individualized instruction that the book advances. These include…

  15. Underdetermined system theory applied to quantitative analysis of responses caused by unsteady smooth-plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1993-01-01

The mechanical responses of rate-dependent media caused by unsteady smooth-plane waves are quantitatively analyzed by an underdetermined system of equations without using any constitutive relation of the media; that is, by using the particle velocity field expressed by an algebraic equation derived from the mass conservation equation, and the stress field expressed by an algebraic equation derived from the momentum conservation equation. First, this approach to analyzing unsteady wave motion is justified by the verification of various inferences, such as the existence of the nonindependent elementary waves by Sano [J. Appl. Phys. 65, 3857 (1989)] and the degradation process by Sano [J. Appl. Phys. 67, 4072 (1990)]. Second, the situation under which a spike arises in particle velocity-time and stress-time profiles, and the reason for its appearance, are clarified. Third, the influence of the spike on stress-particle velocity and stress-strain paths is examined. The spike induced in the profiles by a growing wave greatly influences the paths near the impacted surface. Finally, calculated particle velocity-time profiles are compared with experimental data.

  16. [Applying association rules to analyze adverse drug reactions of Shuxuening injection based on spontaneous reporting system data].

    PubMed

    Yang, Wei; Xie, Yan-Ming; Xiang, Yong-Yang

    2014-09-01

This research is based on the analysis of spontaneous reporting system (SRS) data comprising 9,601 case reports of Shuxuening injection adverse drug reactions (ADRs) collected by the national adverse drug reaction monitoring center during 2005-2012. Association rules were applied to analyze the relationship between Shuxuening injection's ADRs and the characteristics of the ADR reports. We found that common ADR combinations were "nausea + breath + chills + vomiting" and "nausea + chills + vomiting + palpitations", each with a confidence level of 100%. Common combinations of ADRs and case report information were "itching, and glucose and sodium chloride injection, and general ADR report, and normal dosage", "palpitation, and glucose and sodium chloride injection, and normal dosage, and new report", and "chills, and general ADR report, and normal dosage, and 0.9% sodium chloride injection", each likewise with a confidence level of 100%. The results showed that most ADRs occurring in patients using Shuxuening injection were systemic damage, skin and appendage damage, digestive system damage, etc. Most cases were general and new reports involving patients given normal dosages, and the occurrence of ADRs was little related to the solvent used. This suggests that the ADRs of Shuxuening injection are mainly related to the drug's composition. Clinical use of Shuxuening injection therefore requires close observation, attention to ADRs, and good drug risk management. PMID:25532406
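The support and confidence measures behind such association rules can be sketched in a few lines; the toy transactions below are invented, with item names echoing the abstract, not the SRS data:

```python
# Minimal association-rule sketch (support and confidence only, no full
# Apriori search). The 'reports' are invented toy transactions, not SRS data.
reports = [
    {"nausea", "chills", "vomiting", "palpitations"},
    {"nausea", "chills", "vomiting", "breath"},
    {"nausea", "chills", "vomiting", "palpitations"},
    {"itching", "normal dosage"},
    {"chills", "normal dosage"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """confidence(A -> B) = support(A union B) / support(A)."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

conf = confidence({"nausea", "chills"}, {"vomiting"}, reports)
# Every toy report containing the antecedent also contains 'vomiting',
# so the rule's confidence is 100%, like the combinations in the abstract.
```

A full Apriori pass would enumerate frequent itemsets first and then keep only rules above chosen support and confidence thresholds.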

  17. Frequency Domain Analysis of Beat-Less Control Method for Converter-Inverter Driving Systems Applied to AC Electric Cars

    NASA Astrophysics Data System (ADS)

    Kimura, Akira

    In inverter-converter driving systems for AC electric cars, the DC input voltage of an inverter contains a ripple component with a frequency that is twice as high as the line voltage frequency, because of a single-phase converter. The ripple component of the inverter input voltage causes pulsations on torques and currents of driving motors. To decrease the pulsations, a beat-less control method, which modifies a slip frequency depending on the ripple component, is applied to the inverter control. In the present paper, the beat-less control method was analyzed in the frequency domain. In the first step of the analysis, transfer functions, which revealed the relationship among the ripple component of the inverter input voltage, the slip frequency, the motor torque pulsation and the current pulsation, were derived with a synchronous rotating model of induction motors. An analysis model of the beat-less control method was then constructed using the transfer functions. The optimal setting of the control method was obtained according to the analysis model. The transfer functions and the analysis model were verified through simulations.
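The frequency-domain view can be illustrated by evaluating a transfer function's magnitude at the ripple frequency. The first-order lag below is a generic stand-in, not the induction-motor transfer functions derived in the paper:

```python
import math

# |H(j*2*pi*f)| for a generic first-order lag H(s) = 1/(1 + tau*s).
# A stand-in transfer function, not the paper's induction-motor model;
# it only shows reading off attenuation at the beat frequency.
def magnitude(tau, f):
    w = 2.0 * math.pi * f
    return 1.0 / math.sqrt(1.0 + (tau * w) ** 2)

f_ripple = 2 * 60.0                          # converter ripple at twice a 60 Hz line
att_slow = magnitude(tau=0.1, f=f_ripple)    # slow lag: ripple strongly attenuated
att_fast = magnitude(tau=0.001, f=f_ripple)  # fast lag: ripple largely passes through
```

In the paper's setting, the analogous evaluation at twice the line frequency tells how much of the DC-link ripple reaches the torque and current, and hence how to set the slip-frequency compensation.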

  18. NextGen Brain Microdialysis: Applying Modern Metabolomics Technology to the Analysis of Extracellular Fluid in the Central Nervous System

    PubMed Central

    Kao, Chi-Ya; Anderzhanova, Elmira; Asara, John M.; Wotjak, Carsten T.; Turck, Christoph W.

    2015-01-01

    Microdialysis is a powerful method for in vivo neurochemical analyses. It allows fluid sampling in a dynamic manner in specific brain regions over an extended period of time. A particular focus has been the neurochemical analysis of extracellular fluids to explore central nervous system functions. Brain microdialysis recovers neurotransmitters, low-molecular-weight neuromodulators and neuropeptides of special interest when studying behavior and drug effects. Other small molecules, such as central metabolites, are typically not assessed despite their potential to yield important information related to brain metabolism and activity in selected brain regions. We have implemented a liquid chromatography online mass spectrometry metabolomics platform for an expanded analysis of mouse brain microdialysates. The method is sensitive and delivers information for a far greater number of analytes than commonly used electrochemical and fluorescent detection or biochemical assays. The metabolomics platform was applied to the analysis of microdialysates in a foot shock-induced mouse model of posttraumatic stress disorder (PTSD). The rich metabolite data information was then used to delineate affected prefrontal molecular pathways that reflect individual susceptibility for developing PTSD-like symptoms. We demonstrate that hypothesis-free metabolomics can be adapted to the analysis of microdialysates for the discovery of small molecules with functional significance.

  19. Simulating Nationwide Pandemics: Applying the Multi-scale Epidemiologic Simulation and Analysis System to Human Infectious Diseases

    SciTech Connect

    Dombroski, M; Melius, C; Edmunds, T; Banks, L E; Bates, T; Wheeler, R

    2008-09-24

This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models, enabling a broader array of stochastic analyses of model runs to be conducted. However, it has only been demonstrated on foreign animal diseases. This paper applies the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, the elderly, and stay-at-home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis of the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally; (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people; or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak.
There are several limitations of the methodology that should be explored in future
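The burn-out versus nationwide-spread regimes can be illustrated with a minimal deterministic SIR sketch. MESA itself is a detailed stochastic, spatially structured simulation; this toy model only shows how the basic reproduction number separates the outcomes:

```python
# Minimal deterministic SIR model, Euler-stepped; illustrative only, not MESA.
def sir(beta, gamma, s0=0.999, i0=0.001, dt=0.1, days=300):
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt     # new infections this step
        new_rec = gamma * i * dt        # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r                      # peak prevalence, final attack fraction

burnout = sir(beta=0.1, gamma=0.2)   # R0 = 0.5: outbreak fizzles (regime 1)
pandemic = sir(beta=0.4, gamma=0.2)  # R0 = 2.0: sustained spread (regimes 2-3)
```

Distinguishing regimes 2 and 3 additionally requires the spatial structure (constrained geography versus air travel) that the agent-based model provides.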

  20. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    NASA Astrophysics Data System (ADS)

    Beraud, H.; Barroca, B.; Hubert, G.

    2012-12-01

A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e. enabling it to maintain or recover acceptable operating levels after flooding, is essential. To achieve this, we must understand how the system works in order to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used to understand the complexity of this type of system. The purpose of this article is to show the interest of this type of method, and the limits of its use for improving the resilience of a waste management system as well as of other urban technical systems[1], by means of theoretical modelling and its application on a study site. [1] In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  1. Applied mathematics of chaotic systems

    SciTech Connect

    Jen, E.; Alber, M.; Camassa, R.; Choi, W.; Crutchfield, J.; Holm, D.; Kovacic, G.; Marsden, J.

    1996-07-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objectives of the project were to develop new mathematical techniques for describing chaotic systems and for reexpressing them in forms that can be solved analytically and computationally. The authors focused on global bifurcation analysis of rigid body motion in an ideal incompressible fluid and on an analytical technique for the exact solution of nonlinear cellular automata. For rigid-body motion, they investigated a new completely integrable partial differential equation (PDE) representing model motion of fronts in nematic crystals and studied perturbations of the integrable PDE. For cellular automata with multiple domain structures, the work has included: (1) identification of the associated set of conserved quantities for each type of domain; (2) use of the conserved quantities to construct isomorphism between the nonlinear system and a linear template; and (3) use of exact solvability methods to characterize detailed structure of equilibrium states and to derive bounds for maximal transience times.

  2. Structured Biological Modelling: a method for the analysis and simulation of biological systems applied to oscillatory intracellular calcium waves.

    PubMed

    Kraus, M; Lais, P; Wolf, B

    1992-01-01

In biology, signal and information processing networks are widespread. Due to their inherent complexity and non-linear dynamics, the time evolution of these systems cannot be predicted by simple plausibility arguments. Fortunately, the power of modern computers allows the simulation of complex biological models. The problem therefore reduces to the question of how to develop a consistent mathematical model which comprises the essentials of the real biological system. As an interface between the phenomenological description and a computer simulation of the system, the proposed method of Structured Biological Modelling (SBM) uses top-down levelled dataflow diagrams. They serve as a powerful tool for the analysis and the mathematical description of the system in terms of a stochastic formulation. The stochastic treatment, regarding the time evolution of the system as a stochastic process governed by a master equation, circumvents most difficulties arising from high-dimensional and non-linear systems. As an application of SBM, we develop a stochastic computer model of intracellular oscillatory Ca2+ waves in non-excitable cells. As demonstrated on this example, SBM can be used for the design of computer experiments which, under certain conditions, can serve as cheap and harmless counterparts to the usual time-consuming biological experiments. PMID:1334718
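Such a master-equation treatment is typically simulated with the standard Gillespie stochastic simulation algorithm. The sketch below runs it on a generic birth-death process, not the paper's Ca2+ model:

```python
import random

# Gillespie SSA for a birth-death process: X -> X+1 at rate k_birth,
# X -> X-1 at rate k_death * X. Each run is one exact sample path of the
# underlying master equation. A generic example, not the paper's Ca2+ model.
def gillespie(k_birth, k_death, x0, t_end, seed=0):
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while t < t_end:
        a_birth, a_death = k_birth, k_death * x
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += rng.expovariate(a_total)          # exponential waiting time
        if t >= t_end:
            break
        x += 1 if rng.random() * a_total < a_birth else -1
        path.append((t, x))
    return path

path = gillespie(k_birth=50.0, k_death=1.0, x0=0, t_end=20.0)
# The stationary mean of this process is k_birth / k_death = 50;
# the sampled path fluctuates around that level.
```

The same update rule generalizes to many coupled reactions, which is how a dataflow diagram in SBM maps onto an executable stochastic model.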

  3. Economic impacts of bio-refinery and resource cascading systems: an applied general equilibrium analysis for Poland.

    PubMed

    Ignaciuk, Adriana M; Sanders, Johan

    2007-12-01

Due to more stringent energy and climate policies, it is expected that many traditional chemicals will be replaced by their biomass-based substitutes, bio-chemicals. These innovations, however, can influence land allocation, since the demand for land dedicated to specific crops might increase; they can also influence traditional agricultural production. In this paper, we use an applied general equilibrium framework in which we include two different bio-refinery processes and incorporate so-called cascading mechanisms. The bio-refinery processes use grass as one of the major inputs to produce bio-nylon and propane-diol (1,3PDO) to substitute currently produced fossil fuel-based nylon and ethane-diol. We examine the impact of specific climate policies on the bioelectricity share in total electricity production, land allocation, and production quantities and prices of selected commodities. The novel technologies become competitive with increased stringency of climate policies. This switch, however, does not induce a higher share of bioelectricity. The cascade does stimulate the production of bioelectricity, but it induces more of a shift in inputs in the bioelectricity sector (from biomass to the cascaded bio-nylon and 1,3PDO) than an increase in the production level of bioelectricity. We conclude that dedicated biomass crops will remain the main option for bioelectricity production: the contribution of the biomass systems remains limited. Moreover, the bioelectricity sector loses the competition for land for biomass production to bio-refineries. PMID:17924388

  4. A mathematical method for the 3D analysis of rotating deformable systems applied on lumen-forming MDCK cell aggregates.

    PubMed

    Marmaras, Anastasios; Berge, Ulrich; Ferrari, Aldo; Kurtcuoglu, Vartan; Poulikakos, Dimos; Kroschewski, Ruth

    2010-04-01

Cell motility contributes to the formation of organs and tissues, into which multiple cells self-organize. However, such mammalian cell motilities have not been characterized quantitatively, and their systemic consequences are thus unknown. A mathematical tool to decipher cell motility, accounting for changes in cell shape, within a three-dimensional (3D) cell system was missing. We report here such a tool, usable on segmented images reporting the outline of clusters (cells) and allowing the time-resolved 3D analysis of the circular motility of these as parts of a system (cell aggregate). Our method can analyze circular motility in sub-cellular, cellular, multi-cellular, and also non-cellular systems for which time-resolved segmented cluster outlines are available. To exemplify, we characterized the circular motility of lumen-initiating MDCK cell aggregates embedded in extracellular matrix. We show that the organization of the major surrounding matrix fibers was not significantly affected during this cohort rotation. Using our tool, we discovered two classes of circular motion, rotation and random walk, organized in three behavior patterns during lumen initiation. As rotational movements were more rapid than random walk, and as both could continue during lumen initiation, we conclude that neither the class nor the rate of motion regulates lumen initiation. We thus reveal a high degree of plasticity during a developmentally critical cell polarization step, indicating that lumen initiation is a robust process. However, motility rates decreased with increasing cell number, previously shown to correlate with epithelial polarization, suggesting that migratory polarization is converted into epithelial polarization during aggregate development. PMID:20183868
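The rotation-versus-random-walk distinction can be sketched from a time series of a centroid's angular position about the aggregate centre. This 2D toy illustrates the idea only; the paper's method is 3D and accounts for cluster shape:

```python
import math

# Net signed rotation from a series of a cell centroid's angular positions
# (radians) about the aggregate centre. A 2D toy illustration, not the
# paper's 3D, shape-aware analysis.
def net_rotation(angles):
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = (a1 - a0 + math.pi) % (2.0 * math.pi) - math.pi  # wrap to (-pi, pi]
        total += d
    return total

rotating = [0.1 * k for k in range(40)]        # steady sweep: net ~ 3.9 rad
jitter = [0.2 * (-1) ** k for k in range(40)]  # back-and-forth: net ~ 0 rad
```

A large net angle relative to the accumulated path length indicates coherent rotation; a near-zero net angle with substantial movement indicates a random walk.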

  5. Module systems applied to biomass

    SciTech Connect

    Jenkins, B.M.

    1983-12-01

Applications of cotton moduling equipment to biomass have been tested in California. A module of chopped rice straw was made to determine the physical characteristics of straw modules. A module system for tree prunings using a heavy duty module builder was tested extensively in 1983. Total direct costs to module, transport 8 km (5 mi), store, cut, tub-grind, and haul chips 50 km (30 mi) to a cogeneration plant are estimated to be $26.64/t ($24.17/t).

  6. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    SciTech Connect

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies that threaten mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to the Carnegie Mellon University Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.

  7. R-matrix analysis of reactions in the 9B compound system applied to the 7Li problem in BBN

    NASA Astrophysics Data System (ADS)

    Paris, M.; Hale, G.; Hayes-Sterbenz, A.; Jungman, G.

    2016-01-01

Recent activity in solving the ‘lithium problem’ in big bang nucleosynthesis has focused on the role that putative resonances may play in resonance-enhanced destruction of 7Li. Particular attention has been paid to the reactions involving the 9B compound nuclear system, d+7Be → 9B. These reactions are analyzed via the multichannel, two-body unitary R-matrix method using the code EDA developed by Hale and collaborators. We employ much of the known elastic and reaction data in a four-channel treatment. The data include elastic 3He+6Li differential cross sections from 0.7 to 2.0 MeV, and integrated reaction cross sections for energies from 0.7 to 5.0 MeV for 6Li(3He,p)8Be* and from 0.4 to 5.0 MeV for the 6Li(3He,d)7Be reaction. Capture data have been added to an earlier analysis, with integrated cross section measurements from 0.7 to 0.825 MeV for 6Li(3He,γ)9B. The resulting resonance parameters are compared with tabulated values, and previously unidentified resonances are noted. Our results show that there are no near-threshold d+7Be resonances with widths of tens of keV, and they reduce the likelihood that a resonance-enhanced mass-7 destruction mechanism, as suggested in recently published work, can explain the 7Li problem.

  8. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

Viewgraphs from the Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; the Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  9. The basic importance of applied behavior analysis

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650

  10. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  11. Thermal Economic Analysis on LiBr Refrigeration-Heat Pump System Applied in CCHP System (2012 International Conference on Medical Physics and Biomedical Engineering)

    NASA Astrophysics Data System (ADS)

    Zhang, CuiZhen; Yang, Mo; Lu, Mei; Zhu, Jiaxian; Xu, Wendong

The cooling water of a LiBr refrigeration unit carries a large amount of low-temperature heat, which can be used to heat boiler feed water. This paper introduces a LiBr refrigeration-heat pump system that recovers heat from the LiBr refrigeration cooling water via a heat pump to heat the boiler feed water. A thermal economic analysis of the system has been performed based on experimental data. Results show that the LiBr refrigeration-heat pump system achieves a 26.6 percent decrease in primary energy rate consumption compared with the combined heat and power production system (CHP) with separate generation of cooling.

  12. System planning analysis applied to OTEC: initial cases by Florida Power Corporation. Task II report No. FC-5237-2

    SciTech Connect

    1980-03-01

The objective of the task was to exercise the FPC system planning methodology on: (1) a Base Case, a 10 year generation expansion plan with coal plants providing base load expansion, and (2) the same, but with 400 MW of OTEC substituting for coal burning units with equal resultant system reliability. OTEC inputs were based on reasonable economic projections of direct capital cost and O and M costs for first-generation large commercial plants; they are discussed in Section 2. The Base Case conditions for the FPC system planning methodology involved base load coal fueled additions during the 1980's and early 1990's. The first trial runs of the PROMOD system planning model substituted OTEC for 400 MW purchases of coal generated power during 1988-1989 and then 400 MW of coal capacity thereafter. Results showed higher system reliability than the Base Case runs. Reruns with greater coal fueled capacity displacement showed that OTEC could substitute for 400 MW purchases in 1988-1989 and replace the 800 MW coal unit scheduled for 1990 to yield equivalent system reliability. However, a 1995 unit would need to be moved to 1994. Production costing computer model runs were used as input to the Corporate Model to examine corporate financial impact. The present value of total revenue requirements was the primary indication of relative competitiveness between the Base Case and OTEC. Results show the present value of total revenue requirements unfavorable to OTEC as compared to coal units. The disparity was in excess of the allowable range for possible consideration.
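The present-value comparison criterion can be sketched as follows; the cash-flow numbers are invented placeholders, not the study's figures:

```python
# Present value of annual revenue requirements, the study's comparison
# criterion. The cash flows below are invented placeholders ($M/yr).
def present_value(cash_flows, discount_rate):
    return sum(cf / (1.0 + discount_rate) ** (year + 1)
               for year, cf in enumerate(cash_flows))

coal_plan = [100, 100, 100, 100, 100]   # hypothetical base-case plan
otec_plan = [140, 130, 120, 110, 100]   # hypothetical OTEC plan, higher early costs

pv_coal = present_value(coal_plan, 0.10)
pv_otec = present_value(otec_plan, 0.10)
# The plan with the lower PV of revenue requirements is preferred; with these
# made-up numbers the coal plan wins, mirroring the study's qualitative result.
```

In practice the cash flows come from production-costing model runs, and the comparison is made only between plans of equal system reliability.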

  13. Electrochemical analysis of acetaminophen using a boron-doped diamond thin film electrode applied to flow injection system.

    PubMed

    Wangfuengkanagul, Nattakarn; Chailapakul, Orawon

    2002-06-01

    The electrochemistry of acetaminophen in phosphate buffer solution (pH 8) was studied at a boron-doped diamond (BDD) thin film electrode using cyclic voltammetry, hydrodynamic voltammetry, and flow injection with amperometric detection. Cyclic voltammetry was used to study the reaction as a function of concentration of analyte. Comparison experiments were performed using a polished glassy carbon (GC) electrode. Acetaminophen undergoes a quasi-reversible reaction at both electrodes. The BDD and GC electrodes provided well-resolved cyclic voltammograms, but the voltammetric signal-to-background ratios obtained from the diamond electrode were higher than those obtained from the GC electrode. The diamond electrode provided a linear dynamic range from 0.1 to 8 mM and a detection limit of 10 microM (S/B approximately 3) for voltammetric measurement. The flow injection analysis results at the diamond electrode indicated a linear dynamic range from 0.5 to 50 microM and a detection limit of 10 nM (S/N approximately 4). Acetaminophen in syrup samples has also been investigated. The results obtained in the recovery study (24.68+/-0.26 mg/ml) were comparable to those labeled (24 mg/ml). PMID:12039625
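    Detection limits of the signal-to-noise kind quoted above (S/B ~ 3, S/N ~ 4) are conventionally estimated from a calibration slope and the baseline noise; the slope and noise values below are hypothetical, not measured BDD-electrode data.

```python
# Illustrative 3-sigma-style detection-limit estimate from an
# amperometric calibration. All numbers are hypothetical.

def detection_limit(calibration_slope, baseline_noise_sd, snr=3.0):
    """Lowest concentration whose signal is snr times the baseline noise."""
    return snr * baseline_noise_sd / calibration_slope

# slope in nA per micromolar, baseline noise in nA
lod_micromolar = detection_limit(calibration_slope=2.0, baseline_noise_sd=0.02)
```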

  14. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  15. Global energy perspectives: A summary of the joint study by the International Institute for Applied Systems Analysis and World Energy Council

    SciTech Connect

    Gruebler, A.; Nakicenovic, N. |; Jefferson, M.

    1996-03-01

    This article reports a study on Global Energy Perspectives to 2050 and Beyond conducted jointly by the International Institute for Applied Systems Analysis (IIASA) and the World Energy Council (WEC). Altogether, three cases of economic and energy development were constructed, which branch into six scenarios of energy supply alternatives extending to the end of the 21st century. The internal consistency of the scenarios was assessed with the help of formal energy models. The study took close account of world population prospects, economic growth, technological advance, the energy resource base, environmental implications from the local to the global level, financing requirements, and the future prospects of both fossil and nonfossil fuels and industries. Although no analysis can turn an uncertain future into a sure thing, the study identifies patterns that are robust across a purposely broad range of scenarios. The study also enables one to relate alternative near-term research and development, technology, economic, and environmental policies to the possible long-term divergence of energy systems structures. Due to the long lead times involved in the turnover of capital stock and infrastructures of the energy system, policy would need to be implemented now in order to initiate long-term structural changes in the energy system that would, however, become significant only after the year 2020. 23 refs., 10 figs., 8 tabs.

  16. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  17. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  18. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  19. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  20. Thermodynamic Laws Applied to Economic Systems

    ERIC Educational Resources Information Center

    González, José Villacís

    2009-01-01

    Economic activity in its different manifestations--production, exchange, consumption and, particularly, information on quantities and prices--generates and transfers energy. As a result, we can apply to it the basic laws of thermodynamics. These laws are applicable within a system, i.e., in a country or between systems and countries. To these…

  1. System Applies Polymer Powder To Filament Tow

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Snoha, John J.; Marchello, Joseph M.

    1993-01-01

    Polymer powder applied uniformly and in continuous manner. Powder-coating system applies dry polymer powder to continuous fiber tow. Unique filament-spreading technique, combined with precise control of tension on fibers in system, ensures uniform application of polymer powder to web of spread filaments. Fiber tows impregnated with dry polymer powders ("towpregs") produced for preform-weaving and composite-material-molding applications. System and process valuable to prepreg industry, for production of flexible filament-windable tows and high-temperature polymer prepregs.

  2. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  3. Some still-current dimensions of applied behavior analysis

    PubMed Central

    Baer, Donald M.; Wolf, Montrose M.; Risley, Todd R.

    1987-01-01

    Twenty years ago, an anthropological note described the current dimensions of applied behavior analysis as it was prescribed and practiced in 1968: It was, or ought to become, applied, behavioral, analytic, technological, conceptual, effective, and capable of appropriately generalized outcomes. A similar anthropological note today finds the same dimensions still prescriptive, and to an increasing extent, descriptive. Several new tactics have become evident, however, some in the realm of conceptual analysis, some in the sociological status of the discipline, and some in its understanding of the necessary systemic nature of any applied discipline that is to operate in the domain of important human behaviors. PMID:16795703

  4. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.

  5. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
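    The scale-by-scale decomposition described above can be sketched with a minimal one-dimensional analog of a Laplacian pyramid: each level isolates one band of spatial frequencies. The 3-tap kernel and level count below are illustrative choices, not those of the paper.

```python
# Minimal 1-D Laplacian-pyramid sketch: repeatedly smooth, keep the
# band-pass difference, and downsample the smoothed signal.

def smooth(signal):
    """3-tap binomial smoothing with edge clamping."""
    n = len(signal)
    return [(signal[max(i - 1, 0)] + 2 * signal[i] + signal[min(i + 1, n - 1)]) / 4.0
            for i in range(n)]

def laplacian_pyramid(signal, levels=3):
    pyramid, current = [], list(signal)
    for _ in range(levels):
        low = smooth(current)
        pyramid.append([x - y for x, y in zip(current, low)])  # band-pass layer
        current = low[::2]                                     # next coarser scale
    pyramid.append(current)                                    # low-frequency residual
    return pyramid

# A constant map has no structure at any scale: every detail layer is zero.
layers = laplacian_pyramid([1.0] * 16)
```

    On a real map, the variance of each detail layer indicates how emission power is distributed across spatial scales, which is the information used to characterize filaments, fragments, and clumps.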

  6. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  7. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  8. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. PMID:23625877
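    The mixture case reduces to simultaneous equations: with two species and absorbance measured at two wavelengths, Beer-Lambert (A = epsilon c l) gives a 2x2 linear system in the unknown concentrations. The molar absorptivities and concentrations below are hypothetical teaching values.

```python
# Two-component spectrophotometric mixture analysis by solving the
# 2x2 linear system eps @ c = A (Cramer's rule). Values hypothetical.

def solve_two_component(eps, absorbances):
    (a, b), (c, d) = eps
    a1, a2 = absorbances
    det = a * d - b * c
    return ((a1 * d - b * a2) / det, (a * a2 - a1 * c) / det)

eps = [[0.9, 0.2],   # absorptivities of species 1 and 2 at wavelength 1
       [0.1, 0.8]]   # the same at wavelength 2 (path length l = 1 cm folded in)
true_conc = (0.5, 1.2)
measured = (0.9 * 0.5 + 0.2 * 1.2, 0.1 * 0.5 + 0.8 * 1.2)  # simulated A values
recovered = solve_two_component(eps, measured)
```

    Choosing wavelengths where the two absorptivity spectra differ strongly keeps the system well conditioned; if the columns of eps are nearly proportional, small absorbance errors blow up in the recovered concentrations.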

  9. A novel mating system analysis for modes of self-oriented mating applied to diploid and polyploid arctic Easter daisies (Townsendia hookeri).

    PubMed

    Thompson, S L; Ritland, K

    2006-08-01

    We have developed a new model for mating system analysis, which attempts to distinguish among alternative modes of self-oriented mating within populations. This model jointly estimates the rates of outcrossing, selfing, automixis and apomixis, through the use of information in the family structure given by dominant genetic marker data. The method is presented, its statistical properties are evaluated, and it is applied to three arctic Easter daisy populations, one consisting of diploids, the other two of tetraploids. The tetraploids are predominantly male sterile and reported to be apomictic while the diploids are male fertile. In each Easter daisy population, 10 maternal arrays of six progeny were assayed for amplified fragment length polymorphism markers. Estimates, confirmed with likelihood ratio tests of mating hypotheses, showed apomixis to be predominant in all populations (ca. 70%), but selfing or automixis was moderate (ca. 25%) in tetraploids. It was difficult to distinguish selfing from automixis, and simulations confirm that even with very large sample sizes, the estimates have a very strong negative statistical correlation, i.e., they are not independent. No selfing or automixis was apparent in the diploid population; instead, moderate levels of outcrossing were detected (23%). Low but significant levels of outcrossing (2-4%) seemed to occur in the male-sterile tetraploid populations; this may be due to genotyping error of this level. Overall, this study shows apomixis can be partial, and provides evidence for higher levels of inbreeding in polyploids compared to diploids and for significant levels of apomixis in a diploid plant population. PMID:16721390

  10. Spectral Selectivity Applied To Hybrid Concentration Systems

    NASA Astrophysics Data System (ADS)

    Hamdy, M. A.; Luttmann, F.; Osborn, D. E.; Jacobson, M. R.; MacLeod, H. A.

    1985-12-01

    The efficiency of conversion of concentrated solar energy can be improved by separating the solar spectrum into portions matched to specific photoquantum processes and the balance used for photothermal conversion. The basic approaches of spectrally selective beam splitters are presented. A detailed simulation analysis using TRNSYS is developed for a spectrally selective hybrid photovoltaic/photothermal concentrating system. The analysis shows definite benefits to a spectrally selective approach.

  11. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The first Applied Information Systems Research Program (AISRP) Workshop provided the impetus for several groups involved in information systems to review current activities. The objectives of the workshop included: (1) to provide an open forum for interaction and discussion of information systems; (2) to promote understanding by initiating a dialogue with the intended beneficiaries of the program, the scientific user community, and to discuss options for improving their support; (3) to create advocacy by having science users and investigators of the program meet together and establish the basis for direction and growth; and (4) to support the future of the program by building collaborations and interaction to encourage an investigator working group approach for conducting the program.

  12. The Applied Mathematics for Power Systems (AMPS)

    SciTech Connect

    Chertkov, Michael

    2012-07-24

    Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.

  13. Use of an automated chromium reduction system for hydrogen isotope ratio analysis of physiological fluids applied to doubly labeled water analysis.

    PubMed

    Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C

    2000-09-01

    The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput, chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference, plasma by 10000 Da exclusion filtration, saliva by sedimentation and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water. PMID:11006607
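    Isotope abundances in such work are conventionally reported in delta (per mil) notation relative to a reference water; for D/H the reference is VSMOW, whose absolute D/H ratio is 155.76 ppm. The sample ratios below are illustrative, not data from the study.

```python
# Delta (per mil) notation for hydrogen isotope ratios, as used when
# quoting sub-per-mil precision. Sample ratios are hypothetical.

VSMOW_D_H = 155.76e-6  # absolute D/H ratio of the VSMOW standard

def delta_permil(sample_ratio, standard_ratio=VSMOW_D_H):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

baseline = delta_permil(155.9e-6)   # pre-dose urine (hypothetical)
enriched = delta_permil(180.0e-6)   # post-dose urine (hypothetical)
excess = enriched - baseline        # enrichment used in the DLW calculations
```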

  14. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  15. Tribological systems as applied to aircraft engines

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1985-01-01

    Tribological systems as applied to aircraft are reviewed. The importance of understanding the fundamental concepts involved in such systems is discussed. Basic properties of materials which can be related to adhesion, friction and wear are presented and correlated with tribology. Surface processes including deposition and treatment are addressed in relation to their present and future application to aircraft components such as bearings, gears and seals. Lubrication of components with both liquids and solids is discussed. Advances in both new liquid molecular structures and additives for those structures are reviewed and related to the needs of advanced engines. Solids and polymer composites are suggested for increasing use and ceramic coatings containing fluoride compounds are offered for the extreme temperatures encountered in such components as advanced bearings and seals.

  16. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  17. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  18. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  19. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  20. System safety as applied to Skylab

    NASA Technical Reports Server (NTRS)

    Kleinknecht, K. S.; Miller, B. J.

    1974-01-01

    Procedural and organizational guidelines used in accordance with NASA safety policy for the Skylab missions are outlined. The basic areas examined in the safety program for Skylab were the crew interface, extra-vehicular activity (EVA), energy sources, spacecraft interface, and hardware complexity. Fire prevention was a primary goal, with firefighting as backup. Studies of the vectorcardiogram and sleep monitoring experiments exemplify special efforts to prevent fire and shock. The final fire control study included material review, fire detection capability, and fire extinguishing capability. Contractors had major responsibility for system safety. Failure mode and effects analysis (FMEA) and equipment criticality categories are outlined. Redundancy was provided on systems that were critical to crew survival (category I). The five key checkpoints in Skylab hardware development are explained. Skylab rescue capability was demonstrated by preparations to rescue the Skylab 3 crew after their spacecraft developed attitude control problems.

  1. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  2. Analysis on operational power and eddy current losses for applying coreless double-sided permanent magnet synchronous motor/generator to high-power flywheel energy storage system

    NASA Astrophysics Data System (ADS)

    Jang, Seok-Myeong; Park, Ji-Hoon; You, Dae-Joon; Choi, Sang-Ho

    2009-04-01

    This paper deals with an analytical approach to operational power, defined as load power, and rotor loss, represented as eddy current loss, for applying a permanent magnet (PM) synchronous motor/generator to a high-power flywheel energy storage system. The model comprises a double-sided Halbach-magnetized PM rotor and a coreless three-phase winding stator. For this motor/generator structure, we derive the magnetic field and eddy current, including space and time harmonics, via the magnetic vector potential in a two-dimensional (2D) polar coordinate system. From these, the operational power is estimated from the back-electromotive force as a function of PM rotor speed, and the rotor loss is calculated from the Poynting theorem.
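    The power relation the abstract describes can be sketched numerically. This is a minimal illustration, not the paper's actual field-harmonic model: the back-EMF constant `k_e` here is a made-up lumped parameter standing in for the integrals of the 2D vector potential solution, and all numbers are hypothetical.

```python
import math

def back_emf_per_phase(k_e, omega_m):
    """Per-phase back-EMF amplitude (V): proportional to rotor speed.

    k_e: back-EMF constant (V*s/rad), a lumped stand-in for the
    field/winding integrals of the 2-D vector potential solution.
    omega_m: mechanical angular speed (rad/s).
    """
    return k_e * omega_m

def three_phase_power(e_rms, i_rms, power_factor=1.0):
    """Electrical power of a balanced three-phase machine (W)."""
    return 3.0 * e_rms * i_rms * power_factor

# Hypothetical machine: k_e = 0.05 V*s/rad, spinning at 20,000 rpm,
# delivering 30 A RMS per phase at unity power factor.
omega = 20000 * 2 * math.pi / 60
e_peak = back_emf_per_phase(0.05, omega)
e_rms = e_peak / math.sqrt(2)
p = three_phase_power(e_rms, 30.0)
```

    The point the sketch makes is the one in the abstract: because back-EMF scales linearly with rotor speed, the deliverable load power of the flywheel motor/generator is set directly by the operating speed range.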

  3. Verifying Anonymous Credential Systems in Applied Pi Calculus

    NASA Astrophysics Data System (ADS)

    Li, Xiangxi; Zhang, Yu; Deng, Yuxin

    Anonymous credentials are widely used to certify properties of a credential owner or to support the owner in obtaining valuable services, while hiding the user's identity. A credential system (a.k.a. pseudonym system) usually consists of multiple interactive procedures between users and organizations, including generating pseudonyms, issuing credentials and verifying credentials, which are required to meet various security properties. We propose a general symbolic model (based on the applied pi calculus) for anonymous credential systems and give formal definitions of a few important security properties, including pseudonym and credential unforgeability, credential safety, and pseudonym untraceability. We specialize the general formalization and apply it to the verification of a concrete anonymous credential system proposed by Camenisch and Lysyanskaya. The analysis is done automatically with the tool ProVerif, and several security properties have been verified.

  4. Fluorescent Protein Biosensors Applied to Microphysiological Systems

    PubMed Central

    Senutovitch, Nina; Vernetti, Lawrence; Boltz, Robert; DeBiasio, Richard; Gough, Albert; Taylor, D. Lansing

    2015-01-01

    This mini-review discusses the evolution of fluorescence as a tool to study living cells and tissues in vitro and the present role of fluorescent protein biosensors (FPBs) in microphysiological systems (MPS). FPBs allow the measurement of temporal and spatial dynamics of targeted cellular events involved in normal and perturbed cellular assay systems and microphysiological systems in real time. FPBs evolved from fluorescent analog cytochemistry (FAC), which permitted the measurement of the dynamics of purified proteins covalently labeled with environmentally insensitive fluorescent dyes and then incorporated into living cells, as well as a large list of diffusible fluorescent probes engineered to measure environmental changes in living cells. In parallel, a wide range of fluorescence microscopy methods were developed to measure the chemical and molecular activities of the labeled cells, including ratio imaging, fluorescence lifetime, total internal reflection, 3D imaging, including super-resolution, as well as high content screening (HCS). FPBs evolved from FAC by combining environmentally sensitive fluorescent dyes with proteins in order to monitor specific physiological events such as post-translational modifications, production of metabolites, changes in various ion concentrations and the dynamic interaction of proteins with defined macromolecules in time and space within cells. Original FPBs involved the engineering of fluorescent dyes to sense specific activities when covalently attached to particular domains of the targeted protein. The subsequent development of fluorescent proteins (FPs), such as the green fluorescent protein (GFP), dramatically accelerated the study of living cells, since the genetic “labeling” of proteins became a relatively simple method that permitted the analysis of temporal-spatial dynamics of a wide range of proteins. Investigators subsequently engineered the fluorescence properties of the FPs for environmental

  5. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  6. The Evolution of Fungicide Resistance Resulting from Combinations of Foliar-Acting Systemic Seed Treatments and Foliar-Applied Fungicides: A Modeling Analysis.

    PubMed

    Kitchen, James L; van den Bosch, Frank; Paveley, Neil D; Helps, Joseph; van den Berg, Femke

    2016-01-01

    For the treatment of foliar diseases of cereals, fungicides may be applied as foliar sprays or systemic seed treatments which are translocated to leaves. Little research has been done to assess the resistance risks associated with foliar-acting systemic seed treatments when used alone or in combination with foliar sprays, even though both types of treatment may share the same mode of action. It is therefore unknown to what extent adding a systemic seed treatment to a foliar spray programme poses an additional resistance risk and whether, in the presence of a seed treatment, additional resistance management strategies (such as limiting the total number of treatments) are necessary to limit the evolution of fungicide resistance. A mathematical model was developed to simulate an epidemic and the resistance evolution of Zymoseptoria tritici on winter wheat, which was used to compare different combinations of seed and foliar treatments by calculating the fungicide effective life, i.e. the number of years before effective disease control is lost to resistance. A range of parameterizations for the seed treatment fungicide and different fungicide uptake models were compared. Despite the different parameterizations, the model consistently predicted the same trends, in that i) similar levels of efficacy delivered either by a foliar-acting seed treatment or a foliar application resulted in broadly similar resistance selection, ii) adding a foliar-acting seed treatment to a foliar spray programme increased resistance selection and usually decreased effective life, and iii) splitting a given total dose (adding a seed treatment to foliar treatments but decreasing the dose per treatment) gave effective lives that were the same as, or shorter than, those given by the spray programme alone. For our chosen plant-pathogen-fungicide system, the model results suggest that to effectively manage selection for fungicide resistance, foliar-acting systemic seed treatments should be included as
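    The "effective life" metric the abstract defines can be illustrated with a toy two-strain selection model. This is not the paper's epidemiological model of Zymoseptoria tritici; the selection ratio, initial frequency, and loss-of-control threshold below are all invented for illustration.

```python
def effective_life(initial_resistant=1e-6, selection_per_year=2.0,
                   control_lost_at=0.5, max_years=100):
    """Years of effective control before the resistant fraction exceeds
    the threshold at which disease control is considered lost.

    Each season of fungicide use multiplies the resistant:sensitive
    odds by a constant selection ratio (a deliberately crude model).
    """
    freq = initial_resistant
    for year in range(1, max_years + 1):
        odds = freq / (1.0 - freq) * selection_per_year
        freq = odds / (1.0 + odds)
        if freq >= control_lost_at:
            return year
    return max_years

# Doubling the selection pressure (e.g. a seed treatment sharing the
# foliar spray's mode of action) shortens the effective life:
life_single = effective_life(selection_per_year=2.0)
life_double = effective_life(selection_per_year=4.0)
```

    Even this crude model reproduces the qualitative trend reported above: stronger per-season selection, however it is delivered, shortens the number of years before control is lost.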

  8. Liquid Chromatography Applied to Space System

    NASA Astrophysics Data System (ADS)

    Poinot, Pauline; Chazalnoel, Pascale; Geffroy, Claude; Sternberg, Robert; Carbonnier, Benjamin

    Searching for signs of past or present life in our Solar System is a real challenge that stirs up the curiosity of scientists. Until now, in situ instrumentation was designed to detect and determine concentrations of a wide number of organic biomarkers. The method which was, and still is, employed in missions dedicated to the quest for life (from Viking to ExoMars) is pyrolysis-GC-MS. Over successive missions, this approach has been significantly improved in terms of extraction efficiency and detection with the use of chemical derivatization agents (e.g. MTBSTFA, DMF-DMA, TMAH…), and in terms of analysis sensitivity and resolution with the development of in situ high-resolution mass spectrometers (e.g. TOF-MS). Thanks to such an approach, organic compounds such as amino acids, sugars, tholins or polycyclic aromatic hydrocarbons (PAHs) were expected to be found. However, while there is a consensus that the GC-MS instruments of the Viking, Huygens, MSL and MOMA space missions worked the way they had been designed to, the pyrolysis step is much more debated (Glavin et al. 2001; Navarro-González et al. 2006). Indeed, (1) it is thought to remove low levels of organics, (2) water and CO2 could interfere with the detection of likely organic pyrolysis products, and (3) only low to mid-molecular weight organic molecules can be detected by this technique. As a result, researchers are now focusing on other in situ techniques which are no longer based on the volatility of the organic matter, but on liquid phase extraction and analysis. In this line, micro-fluidic systems involving sandwich and/or competitive immunoassays (e.g. LMC, SOLID; Parro et al. 2005; Sims et al. 2012), micro-chip capillary electrophoreses (e.g. MOA; Bada et al. 2008), or nanopore-based analysis (e.g. BOLD; Schulze-Makuch et al. 2012) have been conceived for in situ analysis. Thanks to such approaches, molecular biological polymers (polysaccharides, polypeptides, polynucleotides, phospholipids, glycolipids

  9. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5, with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculations are made possible by a Python script linking MCNP and our Latin hypercube sampling code.
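    The Latin hypercube step described above can be sketched in a few lines of NumPy. This is a generic stratified sampler applied to a made-up two-nuclide composition, not the authors' code, and the ±2% perturbation band is an invented stand-in for their nuclide probability density functions.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on [0, 1): each dimension is divided into
    n_samples equal strata and exactly one point falls in each stratum."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_samples, n_dims))  # position within each stratum
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (strata + u) / n_samples

# Illustrative use: perturb a nominal two-nuclide composition (weight
# fractions) by up to +/-2% relative, then renormalise to sum to 1 so
# each row is a valid stochastic input deck for the transport code.
nominal = np.array([0.0072, 0.9928])     # e.g. U-235 / U-238
lhs = latin_hypercube(50, nominal.size, seed=0)
perturbed = nominal * (1.0 + 0.02 * (2.0 * lhs - 1.0))
perturbed /= perturbed.sum(axis=1, keepdims=True)
```

    Each row of `perturbed` would then be written into an MCNP material card and the criticality run repeated, which is exactly the loop the paper automates with its script.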

  11. Robustness analysis applied to substructure controller synthesis

    NASA Technical Reports Server (NTRS)

    Gonzalez-Oberdoerffer, Marcelo F.; Craig, Roy R., Jr.

    1993-01-01

    The stability and robustness of the controlled system obtained via the substructure control synthesis (SCS) method of Su et al. (1990) were examined using a six-bay truss model, employing an LQG control design method to obtain controllers for two separate structures. It is found that the assembled controller provides stability in this instance. A qualitative assessment of the stability robustness of the system with a controller designed by the SCS method is provided by obtaining a controller using the complete truss model and comparing the robustness of the corresponding closed-loop systems.

  12. pH recycling aqueous two-phase systems applied in extraction of Maitake β-Glucan and mechanism analysis using low-field nuclear magnetic resonance.

    PubMed

    Hou, Huiyun; Cao, Xuejun

    2015-07-31

    In this paper, a recycling aqueous two-phase system (ATPS) based on two pH-responsive copolymers, PADB and PMDM, was used in the purification of β-Glucan from Grifola frondosa. The main parameters, such as polymer concentration, type and concentration of salt, extraction temperature and pH, were investigated to optimize partition conditions. The results demonstrated that β-Glucan was extracted into the PADB-rich phase, while impurities were extracted into the PMDM-rich phase. In this 2.5% PADB/2.5% PMDM ATPS, a partition coefficient of 7.489 and an extraction recovery of 96.92% for β-Glucan were obtained in the presence of 30 mmol/L KBr, at pH 8.20 and 30°C. The phase-forming copolymers could be recycled by adjusting pH, with recoveries of over 96.0%. Furthermore, the partition mechanism of Maitake β-Glucan in PADB/PMDM aqueous two-phase systems was studied. Fourier transform infrared spectra, the ForteBio Octet system and low-field nuclear magnetic resonance (LF-NMR) were introduced to elucidate the partition mechanism of β-Glucan. Notably, this is the first use of LF-NMR in mechanism analysis of partition in aqueous two-phase systems. The change of transverse relaxation time (T2) in the ATPS could reflect the interaction between the polymers and β-Glucan. PMID:26094138
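    The two figures of merit reported above are related by a simple mass balance, sketched here (the formulas are standard liquid-liquid extraction definitions, not code from the paper):

```python
def partition_coefficient(c_top, c_bottom):
    """K: solute concentration in the (PADB-rich) top phase
    divided by its concentration in the bottom phase."""
    return c_top / c_bottom

def extraction_recovery(c_top, v_top, c_bottom, v_bottom):
    """Fraction of the total solute that ends up in the top phase,
    from concentrations and phase volumes."""
    top = c_top * v_top
    return top / (top + c_bottom * v_bottom)
```

    Note that with equal phase volumes, K = 7.489 alone would give a recovery of only K/(1+K) ≈ 88%; the reported 96.92% therefore implies the top (PADB-rich) phase volume exceeded the bottom phase volume under the optimized conditions.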

  13. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  14. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…

  15. A robust multisyringe system for process flow analysis. Part II. A multi-commuted injection system applied to the photometric determination of free acidity and iron(III) in metallurgical solutions.

    PubMed

    Albertús, F; Cladera, A; Cerda, V

    2000-12-01

    A new software-controlled volume-based system for sample introduction in process flow injection analysis was developed. By using a multi-syringe burette coupled with one or two additional commutation valves, the multi-commuted injection of precise sample volumes was accomplished. The characteristics and performance of the injection system were studied by injecting an indicator into a buffered carrier. Three configurations were implemented in order to achieve two different tasks: the single injection of a sample in a two- or three-channel manifold, and dual injection into different streams. The two-channel flow system using single injection was applied to the determination of free acidity in diluted samples containing high levels of iron(III), employing the single-point titration methodology. The precipitation of ferric hydroxide was prevented using the ammonium and sodium salts of oxalate and acetate as buffer titrant. Methyl Red was employed as indicator. The procedure allows determination of acid concentration in solutions with a Fe(III)/H+ molar ratio up to 0.2. Samples with higher Fe(III)/H+ molar ratios were spiked with a known strong acid at dilution. The three-channel configuration was applied to the determination of ferric ions, using, as reagent, a merging mixture of sulfuric acid and potassium thiocyanate. The double injection system was implemented in series in a single (three-channel) manifold in such a way that a different injection volume and a different reagent were used for each analyte. It was applied to the separate or sequential determination of free acidity and ferric ions. In this configuration, iron(III) was determined using 0.5-0.7% (w/v) sodium salicylate solution as reagent. The systems can operate at up to 100, 84 and 78 injections per hour, respectively. Determinations on synthetic and process samples compared well with the reference values and procedures. Recoveries of 95-102% with a maximum RSD value of 5.4% were found for acidity

  16. Applied behavior analysis at West Virginia University: A brief history.

    PubMed

    Hawkins, R P; Chase, P N; Scotti, J R

    1993-01-01

    The development of an emphasis on applied behavior analysis in the Department of Psychology at West Virginia University is traced. The emphasis began primarily in the early 1970s, under the leadership of Roger Maley and Jon Krapfl, and has continued to expand and evolve with the participation of numerous behavior analysts and behavior therapists, both inside and outside the department. The development has been facilitated by several factors: establishment of a strong behavioral emphasis in the three Clinical graduate programs; change of the graduate program in Experimental Psychology to a program in basic Behavior Analysis; development of nonclinical applied behavior analysis within the Behavior Analysis program; establishment of a joint graduate program with Educational Psychology; establishment of a Community/Systems graduate program; and organization of numerous conferences. Several factors are described that seem to assure a stable role for behavior analysis in the department: a stable and supportive "culture" within the department; American Psychological Association accreditation of the clinical training; a good reputation both within the university and in psychology; and a broader community of behavior analysts and behavior therapists. PMID:16795816

  17. Applied methods of testing and evaluation for IR imaging system

    NASA Astrophysics Data System (ADS)

    Liao, Xiao-yue; Lu, Jin

    2009-07-01

    Different methods of testing and evaluation for IR imaging systems are used with the application of second- and third-generation infrared detectors. The performance of an IR imaging system is reflected by many specifications, such as Noise Equivalent Temperature Difference (NETD), nonuniformity, system Modulation Transfer Function (MTF), Minimum Resolvable Temperature Difference (MRTD), and Minimum Detectable Temperature Difference (MDTD). The sensitivity of IR sensors is estimated by NETD. The sensitivity and spatial resolution of thermal imaging sensors are evaluated by MRTD, which is the chief system specification. In this paper, the theoretical analysis of different testing methods is introduced, and their characteristics are analyzed and compared. Based on a discussion of the factors that affect measurement results, an applied method of testing NETD and MRTD for IR systems is proposed.
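    The NETD figure named above has a simple operational definition that can be sketched numerically. This is the textbook definition (noise divided by the signal transfer function), not the specific test procedure proposed in the paper, and the example numbers are invented.

```python
def sitf(delta_signal, delta_t):
    """Signal transfer function: output signal change per kelvin of
    blackbody target-background temperature difference (e.g. V/K)."""
    return delta_signal / delta_t

def netd(noise_rms, delta_signal, delta_t):
    """Noise-equivalent temperature difference: the temperature
    difference that produces a signal equal to the RMS noise floor."""
    return noise_rms / sitf(delta_signal, delta_t)

# Hypothetical measurement: a 5 K blackbody differential produces
# 0.80 V of signal while the temporal noise floor is 2.0 mV RMS.
netd_mk = 1000 * netd(0.002, 0.80, 5.0)   # result in millikelvin
```

    In a real NETD test, `delta_signal` comes from imaging a calibrated blackbody against an ambient background and `noise_rms` from a sequence of frames with no target; the ratio is what characterizes sensor sensitivity.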

  18. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
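    The risk calculation described above can be caricatured with a small Monte Carlo sketch. The residual-strength law, the distributions, and every number below are invented for illustration; the paper's procedure uses fatigue test data and proper crack-growth relations, not this toy model.

```python
import random

def residual_strength(s0, crack, k=0.5):
    """Illustrative model: load-carrying capability decays as the
    fatigue crack extends (not a real fracture-mechanics law)."""
    return s0 / (1.0 + k * crack)

def failure_risk(n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(peak service load exceeds the
    crack-reduced residual strength) over an inspection interval."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        s0 = rng.gauss(100.0, 10.0)        # initial static strength (scatter)
        crack = abs(rng.gauss(0.0, 0.2))   # crack length grown in the interval
        load = rng.gauss(40.0, 8.0)        # peak service load in the interval
        if load >= residual_strength(s0, crack):
            failures += 1
    return failures / n_trials
```

    The practical use mirrors the abstract: for a prescribed acceptable risk, one shortens the inspection interval (which bounds `crack`) until the estimated probability falls below the target, or sets a replacement life if inspection is not feasible.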

  19. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  20. Expert systems applied to spacecraft fire safety

    NASA Technical Reports Server (NTRS)

    Smith, Richard L.; Kashiwagi, Takashi

    1989-01-01

    Expert systems are problem-solving programs that combine a knowledge base and a reasoning mechanism to simulate a human expert. The development of an expert system to manage fire safety in spacecraft, in particular the NASA Space Station Freedom, is difficult but clearly advantageous in the long-term. Some needs in low-gravity flammability characteristics, ventilating-flow effects, fire detection, fire extinguishment, and decision models, all necessary to establish the knowledge base for an expert system, are discussed.

  1. Applying Modeling Tools to Ground System Procedures

    NASA Technical Reports Server (NTRS)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  2. EG&G Mound Applied Technologies payroll system

    SciTech Connect

    Not Available

    1992-02-07

    EG&G Mound Applied Technologies, Inc., manages and operates the Mound Facility, Miamisburg, Ohio, under a cost-plus-award-fee contract administered by the Department of Energy's Albuquerque Field Office. The contractor's Payroll Department is responsible for prompt payment in the proper amount to all persons entitled to be paid, in compliance with applicable laws, regulations, and legal decisions. The objective was to determine whether controls were in place to avoid erroneous payroll payments. EG&G Mound Applied Technologies, Inc., did not have all the internal controls required by General Accounting Office Title 6, "Pay, Leave, and Allowances." Specifically, it did not have computerized edits, separation of duties and responsibilities, and restricted access to payroll data files. This condition occurred because its managers were not aware of Title 6 requirements. As a result, the contractor could not assure the Department of Energy that payroll costs were processed accurately, and fraud, waste, or abuse of Department of Energy funds could go undetected. Our sample of 212 payroll transactions from a population of 66,000 in FY 1991 disclosed only two minor processing errors and no instances of fraud, waste or abuse.

  3. Applying QCVV protocols to real physical systems

    NASA Astrophysics Data System (ADS)

    Magesan, Easwar

    As experimental systems move closer to realizing small-scale quantum computers with high fidelity operations, errors become harder to detect and diagnose. Verification and validation protocols are becoming increasingly important for detecting and understanding the precise nature of these errors. I will outline various methods and protocols currently used to deal with errors in experimental systems. I will also discuss recent advances in implementing high fidelity operations which will help to understand some of the tools that are still needed on the road to realizing larger scale quantum systems. Work partially supported by ARO under Contract W911NF-14-1-0124.

  4. Digital Systems Analysis

    ERIC Educational Resources Information Center

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  5. Pipeline rehabilitation using field applied tape systems

    SciTech Connect

    Reeves, C.R.

    1998-12-31

    Bare steel pipelines were first installed years before the turn of the century. Pipeline operators soon realized the life of bare steel could be greatly extended by applying coatings. Thus began "pipeline rehabilitation." Many of the older pipelines were exposed, evaluated, coated and returned to service. This procedure has reached new heights in recent years as coated pipelines of the twentieth century, having lived past their original design life, are now subject to coating failure. Many operating companies with pipelines thirty years or older are faced with the choice to replace or recondition. Considering the emphasis on cost restraints and environmental issues, replacing an existing pipeline is often not the best decision; rehabilitation is the preferred solution for many operators.

  6. How Systems Thinking Applies to Education.

    ERIC Educational Resources Information Center

    Betts, Frank

    1992-01-01

    Seeds of public education's current failures are found in its past successes (transmitting culture and providing custodial care). Education is experiencing paradigm paralysis because of piecemeal reform approaches, failure to integrate solution ideas, and reductionist, boundary-limiting orientation. The old system is no longer adequate. Total…

  7. Systems biology: the reincarnation of systems theory applied in biology?

    PubMed

    Wolkenhauer, O

    2001-09-01

    With the availability of quantitative data on the transcriptome and proteome level, there is an increasing interest in formal mathematical models of gene expression and regulation. International conferences, research institutes and research groups concerned with systems biology have appeared in recent years and systems theory, the study of organisation and behaviour per se, is indeed a natural conceptual framework for such a task. This is, however, not the first time that systems theory has been applied in modelling cellular processes. Notably in the 1960s systems theory and biology enjoyed considerable interest among eminent scientists, mathematicians and engineers. Why did these early attempts vanish from research agendas? Here we shall review the domain of systems theory, its application to biology and the lessons that can be learned from the work of Robert Rosen. Rosen emerged from the early developments in the 1960s as a main critic but also developed a new alternative perspective to living systems, a concept that deserves a fresh look in the post-genome era of bioinformatics. PMID:11589586

  8. Chebyshev Expansion Applied to Dissipative Quantum Systems.

    PubMed

    Popescu, Bogdan; Rahman, Hasan; Kleinekathöfer, Ulrich

    2016-05-19

    To determine the dynamics of a molecular aggregate under the influence of a strongly time-dependent perturbation within a dissipative environment is still, in general, a challenge. The time-dependent perturbation might be, for example, due to external fields or explicitly treated fluctuations within the environment. Methods to calculate the dynamics in these cases do exist though some of these approaches assume that the corresponding correlation functions can be written as a weighted sum of exponentials. One such theory is the hierarchical equations of motion approach. If the environment, however, is described by a complex spectral density or if its temperature is low, these approaches become very inefficient. Therefore, we propose a scheme based on a Chebyshev decomposition of the bath correlation functions and detail the respective quantum master equations within second-order perturbation theory in the environmental coupling. Similar approaches have recently been proposed for systems coupled to Fermionic reservoirs. The proposed scheme is tested for a simple two-level system and compared to existing results. Furthermore, the advantages and disadvantages of the present Chebyshev approach are discussed. PMID:26845380
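
The core numerical step described above, expanding a bath correlation function in Chebyshev polynomials rather than a weighted sum of exponentials, can be sketched on a toy function. The damped oscillation below is illustrative only; it is not the paper's spectral density or a finite-temperature correlation function.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Toy bath correlation function: a damped oscillation on t in [0, 1]
# (illustrative stand-in; real correlation functions derive from a
# spectral density at a given temperature).
def corr(t):
    return np.exp(-2.0 * t) * np.cos(10.0 * t)

# Least-squares Chebyshev expansion of degree 30 on a dense grid. For a
# smooth function the Chebyshev coefficients decay rapidly, which is
# what makes this decomposition efficient.
t = np.linspace(0.0, 1.0, 400)
cheb = C.Chebyshev.fit(t, corr(t), 30)
max_err = np.max(np.abs(cheb(t) - corr(t)))
print(max_err)  # tiny: the degree-30 expansion reproduces corr(t)
```

In the full method, such expansions replace the exponential decompositions assumed by hierarchical equations of motion, which is what helps at low temperature or for complex spectral densities.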

  9. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  10. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.
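
The rule-to-network mapping at the heart of KBANN can be illustrated with a minimal sketch. The rule, weight value, and bias scheme below follow the general idea the abstract describes (heavily weighted links for rule antecedents, a bias that lets the unit fire only when all required antecedents hold); the specific numbers are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch of a KBANN-style mapping (illustrative values). A propositional
# rule such as
#   motif :- site_a, site_b   ("motif holds if both sites match")
# becomes one neuron: each positive antecedent gets a large weight w,
# and the bias places the threshold between n-1 and n true antecedents.
w = 8.0
n_required = 2
weights = np.array([w, w, 0.0])   # site_a, site_b, one irrelevant input
bias = -(n_required - 0.5) * w    # fires only if both antecedents are true

def unit(x):
    """Sigmoid unit encoding the rule; training later refines the weights."""
    return 1.0 / (1.0 + np.exp(-(weights @ x + bias)))

print(unit(np.array([1.0, 1.0, 0.0])))  # both antecedents true  -> near 1
print(unit(np.array([1.0, 0.0, 0.0])))  # one antecedent false   -> near 0
```

Because the rules are only encoded approximately (soft thresholds, small weights on irrelevant inputs), gradient-based training can subsequently correct an imperfect domain theory against the training examples.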

  11. Thermal diffusivity measurement system applied to polymers

    NASA Astrophysics Data System (ADS)

    Abad, B.; Díaz-Chao, P.; Almarza, A.; Amantia, D.; Vázquez-Campos, S.; Isoda, Y.; Shinohara, Y.; Briones, F.; Martín-González, M. S.

    2012-06-01

    In the search for cleaner energy sources, improving the efficiency of existing ones is a primary objective. Thermoelectric materials, which are able to convert waste heat into electricity, have therefore emerged as an interesting way to improve the efficiency of car engines, for example. Cost-effective energy harvesting from thermoelectric devices requires materials with high electrical conductivity and Seebeck coefficient but low thermal conductivity. Conductive polymers can fulfil these conditions if they are doped appropriately; one of the most promising is polyaniline. In this work, the thermal conductivity of polyaniline and of polyaniline-nanoclay mixtures has been studied using a new experimental set-up developed in the lab. The novel system is based on the steady-state method and is used to obtain the thermal diffusivity of the polymers and the nanocomposites.

  12. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  13. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  14. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  15. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  16. Hyperspectral imaging applied to complex particulate solids systems

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Serranti, Silvia

    2008-04-01

    HyperSpectral Imaging (HSI) is based on the utilization of an integrated hardware and software (HW&SW) platform embedding conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Although HSI was originally developed for remote sensing, it has recently emerged as a powerful process-analytical tool for non-destructive analysis in many research and industrial sectors. The possibility of applying on-line HSI-based techniques to identify and quantify specific characteristics of particulate solid systems is presented and critically evaluated. The originally developed HSI-based logics can be profitably applied to develop fast, reliable and low-cost strategies for: i) quality control of particulate products that must comply with specific chemical, physical and biological constraints, ii) performance evaluation of manufacturing strategies related to processing chains and/or real-time tuning of operative variables and iii) classification-sorting actions addressed to recognize and separate different particulate solid products. Case studies related to recent advances in the application of HSI in different industrial sectors, such as agriculture, food, pharmaceuticals, and solid waste handling and recycling, and addressed to specific goals such as contaminant detection, defect identification, constituent analysis and quality evaluation, are described, based on the authors' originally developed applications.

  17. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal-to-Noise Ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al 1999, Forsberg et al 2000, Kwon and Jekeli 2000, Hwang et al 2006) are applied to smooth or remove the high-frequency "noise" - even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or a priori model, the idea of Factor Analysis is employed for the first time to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft will be processed as examples.
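
The idea of recovering a shared noise factor without any predefined basis can be sketched on synthetic data. The three channels and their loadings below are invented for illustration; they are not GRAV-D measurements.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 2000
# Synthetic stand-in for a mobile-platform record: three channels share
# one common disturbance (e.g. platform vibration) plus independent
# sensor noise. Loadings 1.0, 0.8, -0.5 are invented for illustration.
common = rng.normal(size=n)
X = np.column_stack([
    1.0 * common + 0.1 * rng.normal(size=n),
    0.8 * common + 0.1 * rng.normal(size=n),
    -0.5 * common + 0.1 * rng.normal(size=n),
])

# A single latent factor should capture the shared disturbance; the
# fitted loadings reproduce the true ratios up to an overall sign.
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
loadings = fa.components_[0]
print(loadings)
```

In this setting the estimated factor scores would be the "underlying noise" time series, which could then be examined or removed without committing to a fixed cutoff frequency as a low-pass filter does.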

  18. Computer-Aided Decision Support for Melanoma Detection Applied on Melanocytic and Nonmelanocytic Skin Lesions: A Comparison of Two Systems Based on Automatic Analysis of Dermoscopic Images

    PubMed Central

    Møllersen, Kajsa; Kirchesch, Herbert; Zortea, Maciel; Schopf, Thomas R.; Hindberg, Kristian; Godtliebsen, Fred

    2015-01-01

    Commercially available clinical decision support systems (CDSSs) for skin cancer have been designed for the detection of melanoma only. Correct use of the systems requires expert knowledge, hampering their utility for nonexperts. Furthermore, there are no systems to detect other common skin cancer types, that is, nonmelanoma skin cancer (NMSC). As early diagnosis of skin cancer is essential, there is a need for a CDSS that is applicable to all types of skin lesions and is suitable for nonexperts. Nevus Doctor (ND) is a CDSS being developed by the authors. We here investigate ND's ability to detect both melanoma and NMSC and the opportunities for improvement. An independent test set of dermoscopic images of 870 skin lesions, including 44 melanomas and 101 NMSCs, were analysed by ND. Its sensitivity to melanoma and NMSC was compared to that of Mole Expert (ME), a commercially available CDSS, using the same set of lesions. ND and ME had similar sensitivity to melanoma. For ND at 95% melanoma sensitivity, the NMSC sensitivity was 100%, and the specificity was 12%. The melanomas misclassified by ND at 95% sensitivity were correctly classified by ME, and vice versa. ND is able to detect NMSC without sacrificing melanoma sensitivity. PMID:26693486

  19. A Vision for Systems Engineering Applied to Wind Energy (Presentation)

    SciTech Connect

    Felker, F.; Dykes, K.

    2015-01-01

    This presentation was given at the Third Wind Energy Systems Engineering Workshop on January 14, 2015. Topics covered include the importance of systems engineering, a vision for systems engineering as applied to wind energy, and application of systems engineering approaches to wind energy research and development.

  20. Applying expertise to data in the Geologist's Assistant expert system

    SciTech Connect

    Berkbigler, K.P.; Papcun, G.J.; Marusak, N.L.; Hutson, J.E.

    1988-01-01

    The Geologist's Assistant combines expert system technology with numerical pattern-matching and online communication to a large database. This paper discusses the types of rules used for the expert system, the pattern-matching technique applied, and the implementation of the system using a commercial expert system development environment. 13 refs., 8 figs.

  1. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and the urgency of the competing processes this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competing subsurface uses.

  2. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  3. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  4. Applying a toolkit for dissemination and analysis of near real-time data through the World Wide Web: integration of the Antelope Real Time System, ROADNet, and PHP

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Lindquist, K. G.; Hansen, T. S.; Vernon, F. L.; Eakins, J.; Foley, S.; Orcutt, J.

    2005-12-01

    The ROADNet project has enabled the acquisition and storage of diverse data streams through seamless integration of the Antelope Real Time System (ARTS) with (for example) ecological, seismological and geodetic instrumentation. The robust system architecture allows researchers to simply network data loggers with relational databases; however, the ability to disseminate these data to policy makers, scientists and the general public has (until recently) been provided on an 'as needed' basis. The recent development of a Datascope interface to the popular open source scripting language PHP has provided an avenue for presenting near real time data (such as integers, images and movies) from within the ARTS framework easily on the World Wide Web. The interface also indirectly provided the means to transform data types into various formats using the extensive function libraries that accompany a PHP installation (such as image creation and manipulation, data encryption for sensitive information, and XML creation for structured document interchange through the World Wide Web). Using a combination of Datascope and PHP library functions, an extensible tool-kit is being developed to allow data managers to easily present their products on the World Wide Web. The tool-kit has been modeled after the pre-existing ARTS architecture to simplify the installation, development and ease-of-use for both the seasoned researcher and the casual user. The methodology and results of building the applications that comprise the tool-kit are the focus of this presentation, including procedural vs. object oriented design, incorporation of the tool-kit into the existing contributed software libraries, and case-studies of researchers who are employing the tools to present their data. http://anf.ucsd.edu

  5. Automated speech analysis applied to laryngeal disease categorization.

    PubMed

    Gelzinis, A; Verikas, A; Bacauskiene, M

    2008-07-01

    The long-term goal of the work is a decision support system for diagnostics of laryngeal diseases. Colour images of vocal folds, a voice signal, and questionnaire data are the information sources to be used in the analysis. This paper is concerned with automated analysis of a voice signal applied to screening of laryngeal diseases. The effectiveness of 11 different feature sets in classification of voice recordings of the sustained phonation of the vowel sound /a/ into a healthy and two pathological classes, diffuse and nodular, is investigated. A k-NN classifier, an SVM, and a committee built using various aggregation options are used for the classification. The study was performed using a mixed-gender database containing 312 voice recordings. A correct classification rate of 84.6% was achieved when using an SVM committee consisting of four members. The pitch and amplitude perturbation measures, cepstral energy features, autocorrelation features as well as linear prediction cosine transform coefficients were amongst the feature sets providing the best performance. In the case of two-class classification, using recordings from 79 subjects representing the pathological and 69 the healthy class, a correct classification rate of 95.5% was obtained from a five-member committee. Again the pitch and amplitude perturbation measures provided the best performance. PMID:18346812
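
A majority-vote SVM committee of the kind described above can be mimicked with scikit-learn's voting ensemble. The synthetic features and the choice of committee members below are placeholders, not the paper's voice features or its actual four-member committee.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for voice-feature data (two classes).
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Committee members: SVMs that differ in regularization; a hard vote
# aggregates their class predictions by majority.
members = [
    (f"svc{i}", make_pipeline(StandardScaler(), SVC(C=c, gamma="scale")))
    for i, c in enumerate([0.1, 1.0, 10.0, 100.0])
]
committee = VotingClassifier(members, voting="hard")
committee.fit(Xtr, ytr)
acc = committee.score(Xte, yte)
print(acc)
```

In the paper the members differ by feature set rather than by hyperparameters; that variant would give each pipeline a column-selecting transformer in front of its SVM.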

  6. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  7. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  8. Methodology, the matching law, and applied behavior analysis

    PubMed Central

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are developed for their extension to traditional rate-of-response variables measured across time. Although steady-state and relative-rate-of-response strategies are appropriate to the experimental analysis of many behavioral phenomena, these methods are rarely used by applied behavior analysts and further separate the basic and applied areas. PMID:22478657
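
As a concrete instance of the quantitative analyses discussed above, the generalized matching law relates relative response rates to relative reinforcement rates. The sensitivity and bias values below are illustrative, not estimates from the paper.

```python
# Generalized matching law: log(B1/B2) = a*log(R1/R2) + log(b), where
# B1, B2 are response rates, R1, R2 are reinforcement rates, a is
# sensitivity and b is bias. a = 0.9 (slight undermatching) and b = 1.0
# are illustrative values.
def response_ratio(R1, R2, a=0.9, b=1.0):
    return b * (R1 / R2) ** a

ratio = response_ratio(30.0, 10.0)   # a 3:1 reinforcement-rate ratio
rel_rate = ratio / (1.0 + ratio)     # B1 / (B1 + B2), a relative measure
print(ratio, rel_rate)
```

Note that the law predicts only the ratio B1/B2, which is exactly the limitation the abstract raises: two interventions can leave this ratio unchanged while absolute response rates, the quantity applied analysts care about, change substantially.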

  9. Applied research in the solar thermal-energy-systems program

    SciTech Connect

    Brown, C. T.; Lefferdo, J. M.

    1981-03-01

    Within the Solar Thermal Research and Advanced Development (RAD) program, a coordinated effort in materials research, fuels and chemicals research, and applied research is being carried out to meet the systems' needs. Each of these three program elements is described, with particular attention given to the applied research activity.

  10. XML: How It Will Be Applied to Digital Library Systems.

    ERIC Educational Resources Information Center

    Kim, Hyun-Hee; Choi, Chang-Seok

    2000-01-01

    Shows how XML is applied to digital library systems. Compares major features of XML with those of HTML and describes an experimental XML-based metadata retrieval system, which is based on the Dublin Core and is designed as a subsystem of the Korean Virtual Library and Information System (VINIS). (Author/LRW)

  11. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  12. Correlation Network Analysis Applied to Complex Biofilm Communities

    PubMed Central

    Duran-Pinedo, Ana E.; Paster, Bruce; Teles, Ricardo; Frias-Lopez, Jorge

    2011-01-01

    The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of manipulating microbial
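
A minimal version of the correlation-network step, thresholding pairwise correlations and ranking taxa by connectivity to nominate hub (keystone) candidates, can be sketched on synthetic abundances. The module structure below is invented; it is not HOMIM or checkerboard data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_taxa = 50, 6
# Synthetic abundance table: taxa 0-2 co-vary (one "module" driven by a
# shared factor), taxa 3-5 fluctuate independently.
driver = rng.normal(size=n_samples)
X = rng.normal(scale=0.3, size=(n_samples, n_taxa))
X[:, :3] += driver[:, None]

# Build the correlation network: an edge wherever |correlation| exceeds
# a threshold, then rank taxa by degree to find hub candidates.
corr = np.corrcoef(X.T)
adj = (np.abs(corr) > 0.6) & ~np.eye(n_taxa, dtype=bool)
degree = adj.sum(axis=0)
hub = int(np.argmax(degree))   # best-connected taxon: a keystone candidate
print(degree, hub)
```

In the study the nodes are bacterial species or probes and modules are found among many more taxa, but the logic is the same: well-connected nodes within a module are the candidates for follow-up, such as the helper-enrichment culturing described above.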

  13. Applying Association Rule Discovery Algorithm to Multipoint Linkage Analysis.

    PubMed

    Mitsuhashi; Hishigaki; Takagi

    1997-01-01

    Knowledge discovery in large databases (KDD) is being performed in several application domains, for example, the analysis of sales data, and is expected to be applied to other domains. We propose a KDD approach to multipoint linkage analysis, which is a way of ordering loci on a chromosome. Strict multipoint linkage analysis based on maximum likelihood estimation is a computationally tough problem, and various kinds of approximate methods have been implemented so far. Our method, based on the discovery of associations between genetic recombinations, is so different from the others that it is useful for rechecking their results. In this paper, we describe how to apply the framework of association rule discovery to linkage analysis, and also argue that filtering the input data and interpreting the discovered rules after mining are as practically important as the data mining process itself. PMID:11072310
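
The association-rule framework the authors apply can be sketched over toy transactions. The marker intervals, data, and thresholds below are invented for illustration, not the paper's recombination data.

```python
from itertools import combinations

# Toy transactions standing in for observed recombination patterns:
# each set lists the marker intervals where a recombination occurred in
# one meiosis (invented data).
transactions = [
    {"m1", "m2"}, {"m1", "m2"}, {"m1", "m2", "m3"},
    {"m2", "m3"}, {"m1"}, {"m3"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Mine single-item rules A -> B meeting minimum support and confidence,
# where confidence(A -> B) = support(A, B) / support(A).
min_sup, min_conf = 0.3, 0.6
items = sorted({i for t in transactions for i in t})
rules = []
for a, b in combinations(items, 2):
    for lhs, rhs in [(a, b), (b, a)]:
        sup = support({lhs, rhs})
        if sup >= min_sup and sup / support({lhs}) >= min_conf:
            rules.append((lhs, rhs, sup))
print(rules)
```

Rules linking recombination events at nearby intervals would co-occur often, so discovered associations carry information about locus order, which is what makes them useful for cross-checking likelihood-based orderings.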

  14. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  15. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The group's work focuses on supporting the space transportation programs through Computational Fluid Dynamics tool development driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  16. Support systems design and analysis

    NASA Technical Reports Server (NTRS)

    Ferguson, R. M.

    1985-01-01

    The integration of Kennedy Space Center (KSC) ground support systems with the new launch processing system and new launch vehicle provided KSC with a unique challenge in system design and analysis for the Space Transportation System. Approximately 70 support systems are controlled and monitored by the launch processing system. Typical systems are main propulsion oxygen and hydrogen loading systems, environmental control life support system, hydraulics, etc. An End-to-End concept of documentation and analysis was chosen and applied to these systems. Unique problems were resolved in the areas of software analysis, safing under emergency conditions, sampling rates, and control loop analysis. New methods of performing End-to-End reliability analyses were implemented. The systems design approach selected and the resolution of major problem areas are discussed.

  17. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications. PMID:21709802

  18. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  19. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  20. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    ERIC Educational Resources Information Center

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  1. B. F. Skinner's Contributions to Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categorizes: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  2. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  3. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  4. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  5. System theory as applied differential geometry. [linear system

    NASA Technical Reports Server (NTRS)

    Hermann, R.

    1979-01-01

    The invariants of input-output systems under the action of the feedback group were examined. The approach used the theory of Lie groups and concepts of modern differential geometry, and illustrated how the latter provides a basis for discussing the analytic structure of systems. Finite-dimensional linear systems in a single independent variable are considered. Lessons for more general situations (e.g., distributed-parameter and multidimensional systems), which are increasingly encountered as technology advances, are presented.

  6. Complex, Dynamic Systems: A New Transdisciplinary Theme for Applied Linguistics?

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2012-01-01

    In this plenary address, I suggest that Complexity Theory has the potential to contribute a transdisciplinary theme to applied linguistics. Transdisciplinary themes supersede disciplines and spur new kinds of creative activity (Halliday 2001 [1990]). Investigating complex systems requires researchers to pay attention to system dynamics. Since…

  7. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN)

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  8. Treatment integrity in applied behavior analysis with children.

    PubMed

    Gresham, F M; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Behavior Analysis between 1980 and 1990 found that approximately 16% of these studies measured the accuracy of independent variable implementation. Two thirds of these studies did not operationally define the components of the independent variable. Specific recommendations for improving the accuracy of independent variable implementation and for defining independent variables are discussed. PMID:8331022

  9. Conformity with the HIRF Environment Applied to Avionic System

    NASA Astrophysics Data System (ADS)

    Tristant, F.; Rotteleur, J. P.; Moreau, J. P.

    2012-05-01

    This paper presents the qualification and certification methodology applied to an avionic system for the HIRF and lightning environment. Several versions of this system are installed in our legacy Falcon aircraft with different variations. The paper presents the compliance process, taking into account the criticality and complexity of the system, its installation, its level of exposure to the EM environment, and some solutions used by Dassault Aviation to demonstrate compliance.

  10. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  11. Applying systems engineering methodologies to the micro- and nanoscale realm

    NASA Astrophysics Data System (ADS)

    Garrison Darrin, M. Ann

    2012-06-01

    Micro- and nanoscale technology developments have the potential to revolutionize smart and small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro- and nanotechnology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to the realization of products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging into a third generation, having transitioned from completing active structures to creating systems. Given the immaturity of current nanoscale systems, the emphasis is on applying a systems approach to successful technology development; the discussion therefore includes details on enabling roles such as product systems engineering and technology development, while classical roles such as acquisition systems engineering are not covered. The results are also targeted at small-scale technology developers, who need to take systems engineering processes such as requirements definition, verification and validation, interface management, and risk management into account in the concept phase of technology development to maximize the likelihood of successful, cost-effective micro- and nanotechnology that increases the capability of emerging deployed systems and supports long-term growth and profits.

  12. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    Systems engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful, and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. That approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets, and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks and their subject matter experts are geographically distributed, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders and internal organizations, and of the impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based systems engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  13. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  14. Applying Sustainable Systems Development Approach to Educational Technology Systems

    ERIC Educational Resources Information Center

    Huang, Albert

    2012-01-01

    Information technology (IT) is an essential part of modern education. The roles and contributions of technology to education have been thoroughly documented in academic and professional literature. Despite the benefits, the use of educational technology systems (ETS) also creates a significant impact on the environment, primarily due to energy…

  15. Stratospheric Data Analysis System (STRATAN)

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Fox-Rabinovitz, Michael; Lamich, David J.; Newman, Paul A.; Pfaendtner, James W.

    1990-01-01

    State-of-the-art stratospheric analyses are produced using a coupled stratosphere/troposphere data assimilation system. These analyses can be applied to stratospheric studies of all types. Of importance to this effort is the application of the Stratospheric Data Analysis System (STRATAN) to constituent transport and chemistry problems.

  16. A layered neural network model applied to the auditory system

    NASA Astrophysics Data System (ADS)

    Travis, Bryan J.

    1986-08-01

    The structure of the auditory system is described with emphasis on the cerebral cortex. A layered neural network model incorporating much of the known structure of the cortex is applied to word discrimination. The concepts of iterated maps and attractive fixed points are used to enable the model to recognize words despite variations in pitch, intensity, and duration.

  17. Applied Information Systems Research Program (AISRP) Workshop 3 meeting proceedings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The third Workshop of the Applied Information Systems Research Program (AISRP) met at the University of Colorado's Laboratory for Atmospheric and Space Physics in August of 1993. The presentations were organized into four sessions: Artificial Intelligence Techniques; Scientific Visualization; Data Management and Archiving; and Research and Technology.

  18. Associate of Applied Science Degree in Office Systems. Proposal.

    ERIC Educational Resources Information Center

    Gallaudet Coll., Washington, DC. School of Preparatory Studies.

    This proposal culminates a 5-year study of the possibility of awarding associate degrees at Gallaudet College, a private, liberal arts college for hearing impaired adults. The proposal outlines an Associate of Applied Science degree (AAS) in Office Systems at the School of Preparatory Studies. First, introductory material provides a brief history…

  19. Applying Technology Ranking and Systems Engineering in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Luna, Bernadette (Technical Monitor)

    2000-01-01

    According to the Advanced Life Support (ALS) Program Plan, the Systems Modeling and Analysis Project (SMAP) has two important tasks: 1) prioritizing investments in ALS Research and Technology Development (R&TD), and 2) guiding the evolution of ALS systems. Investments could be prioritized simply by independently ranking different technologies, but we should also consider a technology's impact on system design. Guiding future ALS systems will require SMAP to consider many aspects of systems engineering. R&TD investments can be prioritized using familiar methods for ranking technology. The first step is gathering data on technology performance, safety, readiness level, and cost. Then the technologies are ranked using metrics or by decision analysis using net present economic value. The R&TD portfolio can be optimized to provide the maximum expected payoff in the face of uncertain future events. But more is needed. The optimum ALS system cannot be designed simply by selecting the best technology for each predefined subsystem. Incorporating a new technology, such as food plants, can change the specifications of other subsystems, such as air regeneration. Systems must be designed top-down, starting from system objectives, not bottom-up from selected technologies. The familiar top-down systems engineering process includes defining mission objectives, mission design, system specification, technology analysis, preliminary design, and detail design. Technology selection is only one part of systems analysis and engineering, and it is strongly related to the subsystem definitions. ALS systems should be designed using top-down systems engineering. R&TD technology selection should consider how the technology affects ALS system design. Technology ranking is useful, but it is only a small part of systems engineering.
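
    The metric-based ranking step described above can be illustrated with a toy weighted-sum ranker. The technology names, metric names, and weights in the example are invented for illustration and are not from the ALS program:

```python
def rank_technologies(techs, weights):
    """Rank candidate technologies by a weighted sum of normalized metrics.

    techs:   dict of name -> dict of metric -> raw score (higher is better)
    weights: dict of metric -> weight
    Illustrative sketch only; real R&TD prioritization would also handle
    cost, risk, and uncertainty (e.g., via net present value).
    """
    metrics = list(weights)
    # Normalize each metric to [0, 1] across candidates so units don't dominate.
    lo = {m: min(t[m] for t in techs.values()) for m in metrics}
    hi = {m: max(t[m] for t in techs.values()) for m in metrics}

    def norm(m, v):
        return 0.0 if hi[m] == lo[m] else (v - lo[m]) / (hi[m] - lo[m])

    scores = {
        name: sum(weights[m] * norm(m, t[m]) for m in metrics)
        for name, t in techs.items()
    }
    # Highest composite score first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

    Note the abstract's caveat still applies: such a ranking treats each technology independently and cannot capture cross-subsystem effects, which is why it is only one input to top-down systems engineering.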

  20. Applied Nonlinear Dynamics and Stochastic Systems Near The Millenium. Proceedings

    SciTech Connect

    Kadtke, J.B.; Bulsara, A.

    1997-12-01

    These proceedings represent papers presented at the Applied Nonlinear Dynamics and Stochastic Systems conference held in San Diego, California in July 1997. The conference emphasized the applications of nonlinear dynamical systems theory in fields as diverse as neuroscience and biomedical engineering, fluid dynamics, chaos control, nonlinear signal/image processing, stochastic resonance, devices, and nonlinear dynamics in socio-economic systems. There were 56 papers presented at the conference and 5 have been abstracted for the Energy Science and Technology database. (AIP)

  1. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  2. Creating a System for Data-Driven Decision-Making: Applying the Principal-Agent Framework

    ERIC Educational Resources Information Center

    Wohlstetter, Priscilla; Datnow, Amanda; Park, Vicki

    2008-01-01

    The purpose of this article is to improve our understanding of data-driven decision-making strategies that are initiated at the district or system level. We apply principal-agent theory to the analysis of qualitative data gathered in a case study of 4 urban school systems. Our findings suggest educators at the school level need not only systemic…

  3. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of shared characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age, and eruptive history. Without characteristics related to time, the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition, and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus, when characters are slightly changed, the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive-product, and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. The uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic
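
    The character-based grouping that cladistics performs can be hinted at with a toy sketch. The greedy Hamming-distance grouping below is a deliberately simplified stand-in for real parsimony analysis, and the taxa and character strings are hypothetical:

```python
def character_distance(a, b):
    """Number of differing character states between two taxa (Hamming distance)."""
    return sum(x != y for x, y in zip(a, b))

def group_by_similarity(taxa, threshold):
    """Greedy single-linkage grouping: place each taxon in the first existing
    group containing a member within `threshold` character changes.

    taxa: dict of name -> string of coded character states.
    A toy illustration of grouping by shared characteristics; true cladistic
    analysis instead searches for the most parsimonious tree.
    """
    groups = []
    for name, chars in taxa.items():
        for g in groups:
            if any(character_distance(chars, taxa[m]) <= threshold for m in g):
                g.append(name)
                break
        else:
            groups.append([name])  # no close group found: start a new one
    return groups
```

    With three coded edifices, `group_by_similarity({'V1': '00110', 'V2': '00111', 'V3': '11001'}, 1)` places V1 and V2 together and V3 alone, mirroring how shared character states drive the groupings discussed above.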

  4. Scanning proton microprobe analysis applied to wood and bark samples

    NASA Astrophysics Data System (ADS)

    Lövestam, N. E. G.; Johansson, E.-M.; Johansson, S. A. E.; Pallon, J.

    1990-04-01

    In this study the feasibility of applying scanning micro-PIXE to analysis of wood and bark samples is demonstrated. Elemental mapping of the analysed sections show the patterns of Cl, K, Ca, Mn, Fe, Cu and Zn. Some of these patterns can be related to the annual tree ring structure. It is observed that the variation of elements having an environmental character can be rather large within a single tree ring, thus illuminating possible difficulties when using tree ring sections as a pollution monitor. The variations in elemental concentrations when crossing from bark to wood are also shown to be smooth for some elements but rather abrupt for others.

  5. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  6. Systems engineering and analysis

    SciTech Connect

    Blanchard, B.S.; Fabrycky, W.J.

    1981-01-01

    An introduction to systems is provided and tools for systems analysis are considered, taking into account system definitions and concepts, approaches for bringing systems into being, models in systems analysis, economic analysis techniques, mathematical modeling and optimization, probability and statistics, queuing theory and analysis, and control concepts and techniques. The system design process is discussed along with the design for operational feasibility, systems engineering management, and system design case studies. Attention is given to conceptual design, preliminary system design, detail design and development, system test and evaluation, design for reliability, design for maintainability, design for supportability, design for economic feasibility, communication system design, finite population system design, energy storage system design, and procurement-inventory system design.

  7. Nuclear safety as applied to space power reactor systems

    SciTech Connect

    Cummings, G.E.

    1987-01-01

    To develop a strategy for incorporating and demonstrating safety, it is necessary to enumerate the unique aspects of space power reactor systems from a safety standpoint. These features must be differentiated from terrestrial nuclear power plants so that our experience can be applied properly. Some ideas can then be developed on how designs can be achieved that are both safe and perceived to be safe by the public. These ideas include operating only after achieving a stable orbit, developing an inherently safe design, "designing" in safety from the start, and managing the system development (design) so that it is perceived as safe. These and other ideas are explored further in this paper.

  8. Discrete Event Supervisory Control Applied to Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Shah, Neerav

    2005-01-01

    The theory of discrete event supervisory (DES) control was applied to the optimal control of a twin-engine aircraft propulsion system and demonstrated in a simulation. The supervisory control, which is implemented as a finite-state automaton, oversees the behavior of a system and manages it in such a way that it maximizes a performance criterion, similar to a traditional optimal control problem. DES controllers can be nested such that a high-level controller supervises multiple lower level controllers. This structure can be expanded to control huge, complex systems, providing optimal performance and increasing autonomy with each additional level. The DES control strategy for propulsion systems was validated using a distributed testbed consisting of multiple computers--each representing a module of the overall propulsion system--to simulate real-time hardware-in-the-loop testing. In the first experiment, DES control was applied to the operation of a nonlinear simulation of a turbofan engine (running in closed loop using its own feedback controller) to minimize engine structural damage caused by a combination of thermal and structural loads. This enables increased on-wing time for the engine through better management of the engine-component life usage. Thus, the engine-level DES acts as a life-extending controller through its interaction with and manipulation of the engine's operation.
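
    The core supervisory idea, a finite-state automaton that disables controllable events which would drive the plant into states violating a specification, can be sketched as follows. The states, events, and class design are invented for illustration and are not NASA's implementation:

```python
class Supervisor:
    """Minimal discrete event supervisor (illustrative sketch only).

    Tracks the plant state and disables any controllable event whose
    transition would land in a forbidden (specification-violating) state.
    """

    def __init__(self, transitions, initial, forbidden, controllable):
        self.transitions = transitions      # (state, event) -> next state
        self.state = initial
        self.forbidden = set(forbidden)     # states the specification disallows
        self.controllable = set(controllable)

    def enabled(self, event):
        nxt = self.transitions.get((self.state, event))
        if nxt is None:
            return False  # event undefined in this state
        # Controllable events leading to forbidden states are disabled.
        return not (event in self.controllable and nxt in self.forbidden)

    def step(self, event):
        if not self.enabled(event):
            raise ValueError(f"event {event!r} disabled in state {self.state!r}")
        self.state = self.transitions[(self.state, event)]
        return self.state
```

    In a toy plant with states `idle`/`run`/`damage`, the supervisor permits `start` and `stop` but disables an `overload` event from `run` because it leads to the forbidden `damage` state, the same disable-before-violation pattern a life-extending controller relies on.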

  9. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical modal decomposition (EMD) applied to the analysis and denoising of electrocardiogram and phonocardiogram signals. The objective of this work is to detect cardiac anomalies of a patient automatically. Because these anomalies are localized in time, the localization of all events should be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) overcomes the localization problem, but interpretation remains difficult and it is hard to characterize the signal precisely. We therefore propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals, together with their analysis and interpretation. Finally, we introduce an adaptation of the EMD algorithm which appears to be very efficient for denoising.
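
    The core of the EMD algorithm is the sifting step: interpolate envelopes through the local maxima and minima and subtract their mean. A minimal sketch, assuming cubic-spline envelopes anchored at the signal endpoints (real implementations treat the boundaries more carefully):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One EMD sifting step: subtract the mean of the upper and lower
    cubic-spline envelopes through the local maxima and minima."""
    i_max = [0] + [i for i in range(1, len(x) - 1) if x[i-1] < x[i] >= x[i+1]] + [len(x) - 1]
    i_min = [0] + [i for i in range(1, len(x) - 1) if x[i-1] > x[i] <= x[i+1]] + [len(x) - 1]
    upper = CubicSpline(t[i_max], x[i_max])(t)
    lower = CubicSpline(t[i_min], x[i_min])(t)
    return x - (upper + lower) / 2.0

# Pseudo-periodic test signal: a fast oscillation riding on a slow trend,
# loosely mimicking a cardiac component on baseline wander.
t = np.linspace(0.0, 1.0, 512)
x = np.sin(2 * np.pi * 5 * t) + t
imf = x
for _ in range(3):        # a few sifting iterations extract the first IMF
    imf = sift_once(t, imf)
```

    After a few iterations the extracted component tracks the oscillation while the trend is left for the residual, which is exactly the time-localized separation the abstract relies on.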

  10. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
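
    Once each pixel has been assigned a phase from its EDS spectrum, the modal-analysis step above reduces to counting labeled pixels. A toy sketch (the 4x4 "phase map" and the phase assignments are invented for illustration):

```python
import numpy as np

# Hypothetical phase-labeled SEM/EDS image:
# 0 = orthopyroxene, 1 = plagioclase, 2 = spinel
phase_map = np.array([[0, 0, 0, 1],
                      [0, 0, 1, 1],
                      [0, 0, 0, 2],
                      [0, 0, 0, 0]])
names = {0: "orthopyroxene", 1: "plagioclase", 2: "spinel"}

labels, counts = np.unique(phase_map, return_counts=True)
modes = {names[l]: 100.0 * c / phase_map.size for l, c in zip(labels, counts)}
print(modes)   # area (~ volume) percentage of each phase
```

    The same counting also locates rare phases: the single spinel pixel here is the kind of low-abundance grain the abstract flags for follow-up analysis.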

  11. Performance Measurement Analysis System

    Energy Science and Technology Software Center (ESTSC)

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.

  12. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups, a process which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (FMCE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
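
    As a sketch of how such a sorting might look in code, here is agglomerative clustering of binary student-response vectors using a Jaccard distance and average linkage. The response matrix is synthetic (two invented answer patterns with small individual variation), not data from the thesis, and the thesis may well have used different distance and linkage choices:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Synthetic data: 30 students x 10 items, built from two underlying response patterns.
proto = np.array([[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
                  [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]])
students = proto.repeat(15, axis=0).copy()
students[0, 9] = 1    # a little individual variation
students[29, 0] = 1

d = pdist(students, metric="jaccard")    # distance between answer patterns
Z = linkage(d, method="average")         # agglomerative (hierarchical) clustering
groups = fcluster(Z, t=2, criterion="maxclust")
print(groups)
```

    Cutting the dendrogram at two clusters recovers the two underlying patterns despite the noisy students, which is the "thematically coherent groups" task done algorithmically rather than by eye.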

  13. Soft tissue cephalometric analysis applied to regional Indian population

    PubMed Central

    Upadhyay, Jay S.; Maheshwari, Sandhya; Verma, Sanjeev K.; Zahid, Syed Naved

    2013-01-01

    Introduction: The importance of soft tissue considerations in establishing treatment goals for orthodontics and orthognathic surgery has been recognized, and various cephalometric analyses incorporating soft tissue parameters have evolved. Great variance exists in the soft tissue drape of the human face and in the perception of esthetics, and normative data based on one population group cannot be applied to all. The study was conducted to compare the standard soft tissue cephalometric analysis (STCA) norms with norms derived for the population of the western Uttar Pradesh region of India. Materials and Methods: The sample consisted of lateral cephalograms taken in natural head position of 33 normal subjects (16 males, 17 females). The cephalograms were analyzed with the soft tissue cephalometric analysis for orthodontic diagnosis and treatment planning, and Student's t test was used to compare the difference in means between the study population and standard STCA norms. Results: Compared with established STCA norms, females in our study had a steeper maxillary occlusal plane, more proclined mandibular incisors, and less protrusive lips. Both males and females showed an overall decrease in facial lengths, less prominent midface and mandibular structures, and a more convex profile compared with established norms for the White population. Conclusions: Statistically significant differences were found in certain key parameters of STCA for the western Uttar Pradesh population when compared with established norms. PMID:24665169
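
    The statistical comparison described here is a one-sample Student's t test of the regional sample against a published norm. A minimal sketch with invented numbers (neither the measurements nor the norm value are from the study):

```python
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(1)
# Hypothetical measurements of one cephalometric parameter for 33 subjects.
sample = rng.normal(loc=95.0, scale=3.0, size=33)
norm = 92.0                                   # illustrative published STCA norm
t_stat, p_value = ttest_1samp(sample, popmean=norm)
print(t_stat, p_value)   # significant difference from the norm at alpha = 0.05
```

    A p-value below the chosen significance level indicates that the regional mean differs from the established norm, which is how the "statistically significant differences" in the conclusions would be established parameter by parameter.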

  14. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463
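
    As a schematic of the pipeline the paper describes (geometric model, stiffness assembly, boundary conditions, solution, evaluation), here is the smallest possible finite element problem: a 1-D two-element axial bar. The material values are illustrative, and a dentoalveolar model would of course be a 3-D solid mesh:

```python
import numpy as np

E, A = 200e9, 1e-4        # illustrative modulus (Pa) and cross-section (m^2)
lengths = [0.5, 0.5]      # two bar elements (m)
F = 1000.0                # axial load at the free end (N)

n_nodes = len(lengths) + 1
K = np.zeros((n_nodes, n_nodes))
for e, L in enumerate(lengths):          # assemble the global stiffness matrix
    k = E * A / L
    K[e:e+2, e:e+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

f = np.zeros(n_nodes)
f[-1] = F
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # node 0 fixed (boundary condition)
print(u[-1])   # tip displacement; analytic value is F * L_total / (E * A)
```

    The solved displacement matches the closed-form bar solution, which is the kind of verification against a known case the methodology recommends before trusting results on complex trauma geometries.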

  15. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD. PMID:26373767

  16. Seismic analysis applied to the delimiting of a gas reservoir

    SciTech Connect

    Ronquillo, G.; Navarro, M.; Lozada, M.; Tafolla, C.

    1996-08-01

    We present the results of correlating seismic models with petrophysical parameters and well logs to mark the limits of a gas reservoir in sand lenses. To fulfill the objectives of the study, we used a data processing sequence that included wavelet manipulation, complex trace attributes and pseudovelocities inversion, along with several quality control schemes to ensure proper amplitude preservation. Based on the analysis and interpretation of the seismic sections, several areas of interest were selected to apply additional signal treatment as preconditioning for petrophysical inversion. Signal classification was performed to control the amplitudes along the horizons of interest, and to be able to find an indirect interpretation of lithologies. Additionally, seismic modeling was done to support the results obtained and to help integrate the interpretation. The study proved to be a good auxiliary tool in the location of the probable extension of the gas reservoir in sand lenses.
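
    The "complex trace attributes" mentioned above come from the analytic signal of the seismic trace. A minimal sketch on a synthetic wavelet burst (not field data), computing instantaneous amplitude and instantaneous frequency via the Hilbert transform:

```python
import numpy as np
from scipy.signal import hilbert

fs = 500                                       # samples per second
t = np.linspace(0, 1, fs, endpoint=False)
# Gaussian-modulated 30 Hz burst standing in for a reflection event.
trace = np.exp(-((t - 0.5) / 0.1) ** 2) * np.cos(2 * np.pi * 30 * t)

analytic = hilbert(trace)                      # complex (analytic) trace
envelope = np.abs(analytic)                    # instantaneous amplitude
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) / (2 * np.pi) * fs   # Hz
```

    The envelope peaks at the event time and the instantaneous frequency recovers the 30 Hz carrier there; in reservoir work these attributes are the amplitude-preserving quantities tracked along the horizons of interest.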

  17. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
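
    The per-pixel calibration described above amounts to fitting a curve from grey level to dye concentration for each pixel. A sketch for a single pixel with invented calibration points (a real calibration may need a nonlinear fit and the mass-conservation correction the authors apply):

```python
import numpy as np

# Hypothetical calibration for one pixel: known uniformly mixed dye
# concentrations (g/l) versus the grey value recorded at that pixel.
conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
grey = np.array([200.0, 165.0, 131.0, 95.0, 60.0])   # darker as dye increases

a, b = np.polyfit(grey, conc, 1)    # linear per-pixel calibration curve
estimate = a * 131.0 + b            # invert a measured grey value
print(estimate)                     # recovered concentration for that pixel
```

    Applying the inverted curve frame by frame turns the greyscale video into the time-varying density field used to study the current's dynamics and mixing.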

  18. Systemic toxicity of dermally applied crude oils in rats

    SciTech Connect

    Feuston, M.H.; Mackerer, C.R.; Schreiner, C.A.; Hamilton, C.E.

    1997-12-31

    Two crude oils, differing in viscosity (V) and nitrogen (N) and sulfur (S) content, were evaluated for systemic toxicity. In the Crude I (low V, low N, low S) study, the material was applied to the clipped backs of rats at dose levels of 0, 30, 125, and 500 mg/kg. In the Crude II (high V, high N, moderate S) study, the oil was applied similarly at the same dose levels. The crude oils were applied for 13 wk, 5 d/wk. Exposure sites were not occluded. Mean body weight gain (wk 1-14) was significantly reduced in male rats exposed to Crude II; body weight gain of all other animals was not adversely affected by treatment. An increase in absolute (A) and relative (R) liver weights and a decrease in A and R thymus weights were observed in male and female rats exposed to Crude II at 500 mg/kg; only liver weights (A and R) were adversely affected in male and female rats exposed to Crude I. In general, there was no consistent pattern of toxicity for serum chemistry endpoints; however, more parameters were adversely affected in Crude II-exposed female rats than in the other exposed groups. A consistent pattern of toxicity for hematology endpoints was observed among male rats exposed to Crude I and male and female rats exposed to Crude II. Parameters affected included, for Crudes I and II, red blood cell count, hemoglobin, and hematocrit, and for Crude II, platelet count. Microscopic evaluation of tissues revealed the following treatment-related findings: Crude I, treated skin, thymus, and thyroid; Crude II, bone marrow, treated skin, thymus, and thyroid. The LOEL (lowest observable effect level) for skin irritation and systemic toxicity (based on marginal effects on the thyroid) for both crude oils was 30 mg/kg; effects were more numerous and more pronounced in animals exposed to Crude II. Systemic effects are probably related to concentrations of polycyclic aromatic compounds (PAC) found in crude oil.

  19. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Earth and space science participants were able to see where current research can be applied in their disciplines, and the computer science participants could see potential areas for future application of computer and information systems research. The Earth and space science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation at the time, so this effort was not discussed at the AISRP Workshop. OSSA's other high-priority area in computer science is scientific visualization, to which the entire second day of the workshop was devoted.

  20. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractals and multifractals are concepts that have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. A common step is to calculate the slope of a linear fit, usually by the least squares method. This should not be a problem; however, with experimental data the researcher often has to select the range of scales over which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution whose presence would compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select which points of the experimental data to use, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology that involves two basic steps: • Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, thereby accounting for the implications of reducing the number of points. • Evaluation of the significance of the difference between the slope fitted with the two extreme points and the slope fitted with the available points. We compare the results of applying this methodology and the commonly used least squares one. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
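
    A minimal illustration of why a robust estimator changes the fitted slope, using the Theil-Sen estimator from SciPy as the robust method (an illustration of the general idea, not necessarily the estimator used in the study):

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(2)
x = np.arange(20, dtype=float)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 20)
y[-1] += 30.0                        # one gross outlier at the largest scale

ols_slope = np.polyfit(x, y, 1)[0]   # ordinary least squares
ts_slope = theilslopes(y, x)[0]      # robust Theil-Sen (median of pairwise slopes)
print(ols_slope, ts_slope)
```

    The least squares slope is dragged noticeably by the single outlier while the robust slope stays near the true value of 2, which is exactly the failure mode that motivates robust fitting over hand-picked scale ranges.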

  1. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit preconditioners and explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require solution of lower and upper triangular systems of equations per iteration, which are difficult to parallelize without deteriorating the convergence rate. The explicit preconditioners (e.g. polynomial preconditioners or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
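
    The trade-off described above can be seen even with the simplest explicit preconditioner. The sketch below compares plain and Jacobi (diagonal) preconditioned conjugate gradients on a small SPD system standing in for a stiffness matrix; it illustrates the effect of preconditioning generally and is not an FSAI implementation:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

n = 200
d = np.linspace(4.0, 400.0, n)    # strongly varying diagonal: ill-scaled system
A = diags([-np.ones(n - 1), d, -np.ones(n - 1)], [-1, 0, 1], format="csr")
b = np.ones(n)

def run(M=None):
    it = [0]
    x, info = cg(A, b, M=M, callback=lambda xk: it.__setitem__(0, it[0] + 1))
    return x, it[0]

x_plain, n_plain = run()
# Jacobi preconditioner: apply D^-1, a trivially parallel explicit preconditioner.
M_jac = LinearOperator((n, n), matvec=lambda v: v / d)
x_prec, n_prec = run(M_jac)
print(n_plain, n_prec)    # preconditioning cuts the iteration count
```

    Even this crude explicit preconditioner reduces iterations substantially on a badly scaled matrix; FSAI aims for much higher quality while keeping the same matrix-vector-product parallelism.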

  2. Applying principles of health system strengthening to eye care

    PubMed Central

    Blanchet, Karl; Patel, Daksha

    2012-01-01

    Health systems have now become the priority focus of researchers and policy makers, who have progressively moved away from a project-centred perspective. The new tendency is to facilitate a convergence between health system developers and disease-specific programme managers in terms of both thinking and action, and to reconcile the two approaches: one focusing on integrated health systems and improving the health status of the population, the other aiming at improving access to health care. Eye care interventions, particularly in developing countries, have generally been implemented vertically (e.g. trachoma, cataract surgeries), often with parallel organizational structures or specialised disease-specific services. With the emergence of health system strengthening in health strategies and in the service delivery of interventions, there is a need to clarify and examine inputs in terms of governance, financing and management. The present paper aims to clarify key concepts in health system strengthening and describe the various components of the framework as applied in eye care interventions. PMID:22944762

  3. Certification methodology applied to the NASA experimental radar system

    NASA Technical Reports Server (NTRS)

    Britt, Charles L.; Switzer, George F.; Bracalente, Emedio M.

    1994-01-01

    The objective of the research is to apply selected FAA certification techniques to the NASA experimental wind shear radar system. Although there is no intent to certify the NASA system, the procedures developed may prove useful to manufacturers that plan to undergo the certification process. The certification methodology for forward-looking wind shear detection radars will require estimation of system performance in several FAA-specified microburst/clutter scenarios as well as the estimation of probabilities of missed and false hazard alerts under general operational conditions. Because of the near-impossibility of obtaining these results experimentally, analytical and simulation approaches must be used. Hazard detection algorithms were developed that derived predictive estimates of aircraft hazard from basic radar measurements of weather reflectivity and radial wind velocity. These algorithms were designed to prevent false alarms due to ground clutter while providing accurate predictions of hazard to the aircraft due to weather. A method of calculation of the probability of missed and false hazard alerts has been developed that takes into account the effect of the various algorithms used in the system and provides estimates of the probability of missed and false alerts per microburst encounter under weather conditions found at Denver, Kansas City, and Orlando. Simulation techniques have been developed that permit the proper merging of radar ground clutter data (obtained from flight tests) with simulated microburst data (obtained from microburst models) to estimate system performance using the microburst/clutter scenarios defined by the FAA.

  4. Applying principles of health system strengthening to eye care.

    PubMed

    Blanchet, Karl; Patel, Daksha

    2012-01-01

    Health systems have now become the priority focus of researchers and policy makers, who have progressively moved away from a project-centred perspective. The new tendency is to facilitate a convergence between health system developers and disease-specific programme managers in terms of both thinking and action, and to reconcile the two approaches: one focusing on integrated health systems and improving the health status of the population, the other aiming at improving access to health care. Eye care interventions, particularly in developing countries, have generally been implemented vertically (e.g. trachoma, cataract surgeries), often with parallel organizational structures or specialised disease-specific services. With the emergence of health system strengthening in health strategies and in the service delivery of interventions, there is a need to clarify and examine inputs in terms of governance, financing and management. The present paper aims to clarify key concepts in health system strengthening and describe the various components of the framework as applied in eye care interventions. PMID:22944762

  5. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Development of a simulation model of the roof assembly enables a risk and sensitivity analysis in which the most influential varying parameters for the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and the most appropriate building materials for a given climate.
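
    The probabilistic approach can be sketched in a few lines: sample the varying inputs, push each sample through a hygrothermal response model, and count the fraction of outcomes that cross a damage threshold. Everything below (the response function, its weights, and the parameter ranges) is invented for illustration; a real analysis would use a validated hygrothermal model:

```python
import random

random.seed(42)

# Hypothetical screening model: relative humidity at the roof sheathing rises
# with indoor moisture excess (g/m^3) and attic-floor air leakage (toy weights).
def sheathing_rh(moisture_excess, leakage):
    return 55.0 + 6.0 * moisture_excess + 40.0 * leakage   # % RH

trials = 10_000
exceed = 0
for _ in range(trials):
    moisture = random.uniform(1.0, 4.0)   # indoor moisture excess
    leak = random.uniform(0.0, 0.5)       # leakage fraction through attic floor
    if sheathing_rh(moisture, leak) > 80.0:   # common mould-growth RH threshold
        exceed += 1

risk = exceed / trials
print(risk)   # estimated probability of exceeding the mould threshold
```

    Repeating the same loop with distributions for different climates or material choices is how such a model ranks designs by risk rather than by a single deterministic run.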

  6. System engineering applied to VLTI: a scientific success

    NASA Astrophysics Data System (ADS)

    Haguenauer, P.; Alonso, J.; Bourget, P.; Gitton, Ph.; Morel, S.; Poupar, S.; Schuhler, Nicolas

    2014-07-01

    The ESO Very Large Telescope Interferometer (VLTI) offers access to the four 8-m Unit Telescopes (UT) and the four 1.8-m Auxiliary Telescopes (AT) of the Paranal Observatory. After the first fringes obtained in 2001 with the commissioning instrument VINCI and with siderostats, the VLTI has seen an important number of system upgrades, paving the path towards reaching the infrastructure level and scientific results it had been designed for. The current status of the VLTI, operating all year round with up to four telescopes simultaneously and with real imaging capability, demonstrates the powerful interferometric infrastructure that has been delivered to the astronomical community. Reaching today's level of robustness and operability of the VLTI has been a long journey, with a lot of lessons learned and gained experience. In 2007, the Paranal Observatory recognized the need for a global system approach for the VLTI, and a dedicated system engineering team was set up to analyse the status of the interferometer, identify weak points and areas where performances were not met, and propose and apply solutions. The gains of this specific effort can be found today in the very good operability level with faster observation executions, in the decreased downtime, in the improved performances, and in the better reliability of the different systems. We will present a historical summary of the system engineering effort done at the VLTI, showing the strategy used, and the implemented upgrades and technical solutions. Improvements in terms of scientific data quality will be highlighted when possible. We will conclude on the legacy of the VLTI system engineering effort, for the VLTI and for future systems.

  7. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
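
    The multitaper estimate itself is easy to sketch: window the series with the first few Slepian (DPSS) tapers and average the resulting periodograms. This is the generic MTSA recipe on a synthetic tone in noise, not the GONG pipeline:

```python
import numpy as np
from scipy.signal.windows import dpss

fs, n = 1000, 2048
t = np.arange(n) / fs
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * 100 * t) + rng.normal(0.0, 1.0, n)   # tone buried in noise

tapers = dpss(n, NW=4, Kmax=7)          # 7 Slepian tapers, time-bandwidth product 4
spectra = [np.abs(np.fft.rfft(x * w)) ** 2 for w in tapers]
mt = np.mean(spectra, axis=0)           # multitaper estimate: reduced variance
freqs = np.fft.rfftfreq(n, 1.0 / fs)
peak = freqs[np.argmax(mt)]
print(peak)   # near the 100 Hz mode frequency
```

    Averaging over orthogonal tapers trades a slight broadening of narrow peaks for much lower variance and leakage, the same trade-off the abstract reports for narrow low-n modes when too many tapers are used.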

  8. Advanced solar irradiances applied to satellite and ionospheric operational systems

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent; Schunk, Robert; Eccles, Vince; Bouwer, Dave

    Satellite and ionospheric operational systems require solar irradiances in a variety of time scales and spectral formats. We describe the development of a system using operational grade solar irradiances that are applied to empirical thermospheric density models and physics-based ionospheric models used by operational systems that require a space weather characterization. The SOLAR2000 (S2K) and SOLARFLARE (SFLR) models developed by Space Environment Technologies (SET) provide solar irradiances from the soft X-rays (XUV) through the Far Ultraviolet (FUV) spectrum. The irradiances are provided as integrated indices for the JB2006 empirical atmosphere density models and as line/band spectral irradiances for the physics-based Ionosphere Forecast Model (IFM) developed by the Space Environment Corporation (SEC). We describe the integration of these irradiances in historical, current epoch, and forecast modes through the Communication Alert and Prediction System (CAPS). CAPS provides real-time and forecast HF radio availability for global and regional users and global total electron content (TEC) conditions.

  9. Painleve singularity analysis applied to charged particle dynamics during reconnection

    SciTech Connect

    Larson, J.W.

    1992-01-01

    For a plasma in the collisionless regime, test-particle modelling can lend some insight into the macroscopic behavior of the plasma, e.g. conductivity and heating. A common example for which this technique is used is a system with electric and magnetic fields given by B = δy x̂ + γz ŷ + y ẑ and E = ε ẑ, where δ, γ, and ε are constant parameters. This model can be used to model plasma behavior near neutral lines (γ = 0), as well as current sheets (γ = 0, δ = 0). The integrability properties of the particle motion in such fields might affect the plasma's macroscopic behavior, and the author has asked the question "For what values of δ, γ, and ε is the system integrable?" To answer this question, the author has employed Painlevé singularity analysis, which is an examination of the singularity properties of a test particle's equations of motion in the complex time plane. This analysis has identified two field geometries for which the system's particle dynamics are integrable in terms of the second Painlevé transcendent: the circular O-line case and the case of the neutral sheet configuration. These geometries yield particle dynamics that are integrable in the Liouville sense (i.e., there exist the proper number of integrals in involution) in an extended phase space which includes the time as a canonical coordinate, and this property also holds for nonzero γ. The singularity property tests also identified a large, dense set of X-line and O-line field geometries that yield dynamics that may possess the weak Painlevé property. In the case of the X-line geometries, this result shows little relevance to the physical nature of the system, but the existence of a dense set of elliptical O-line geometries with this property may be related to the fact that for ε positive, one can construct asymptotic solutions in the limit t → ∞.
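
    Whatever the integrability verdict, the test-particle trajectories themselves are easy to generate numerically. The sketch below integrates the Lorentz force with the abstract's fields read as B = (δy, γz, y) and E = (0, 0, ε) in nondimensional units with q/m = 1 (the field form is reconstructed from the garbled abstract, so treat it as illustrative), and checks the energy budget: the magnetic force does no work, so the kinetic energy gain must equal ε times the displacement along z:

```python
import numpy as np
from scipy.integrate import solve_ivp

delta, gamma, eps = 1.0, 0.0, 0.1        # illustrative neutral-line-like choice

def rhs(t, s):
    """Nondimensional Lorentz force, q/m = 1, in the reconstructed fields."""
    x, y, z, vx, vy, vz = s
    Bx, By, Bz = delta * y, gamma * z, y
    Ex, Ey, Ez = 0.0, 0.0, eps
    ax = Ex + vy * Bz - vz * By
    ay = Ey + vz * Bx - vx * Bz
    az = Ez + vx * By - vy * Bx
    return [vx, vy, vz, ax, ay, az]

sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.5, 0.0, 0.1, 0.0, 0.0],
                rtol=1e-9, atol=1e-9)

kinetic = 0.5 * (sol.y[3:] ** 2).sum(axis=0)
work = eps * (sol.y[2] - sol.y[2, 0])    # work done by E along the trajectory
```

    Ensembles of such trajectories are what test-particle studies average over to estimate conductivity and heating; the energy-budget check is a cheap sanity test on the integration.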

  10. Generalized Statistical Thermodynamics Applied to Small Material Systems

    NASA Astrophysics Data System (ADS)

    Cammarata, Robert

    2012-02-01

    When characterizing the behavior of small material systems, surface effects can strongly influence the thermodynamic behavior and need to be taken into account in a complete thermal physics analysis. Although a variety of approaches have been proposed to incorporate surface effects, they are often restricted to certain types of systems (e.g., those involving incompressible phases) and often invoke thermodynamic parameters that are not well defined for the surface. It is proposed that a generalized statistical mechanics based on the concept of thermodynamic availability (exergy) can be formulated from which the surface properties and their influence on system behavior can be naturally and rigorously obtained. This availability-based statistical thermodynamics will be presented and its use illustrated in a treatment of nucleation during crystallization.

  11. Non-Harmonic Analysis Applied to Optical Coherence Tomography Imaging

    NASA Astrophysics Data System (ADS)

    Cao, Xu; Uchida, Tetsuya; Hirobayashi, Shigeki; Chong, Changho; Morosawa, Atsushi; Totsuka, Koki; Suzuki, Takuya

    2012-02-01

    A new processing technique called non-harmonic analysis (NHA) is proposed for optical coherence tomography (OCT) imaging. Conventional Fourier-domain OCT employs the discrete Fourier transform (DFT), which depends on the window function and length. The axial resolution of the OCT image, calculated by using the DFT, is inversely proportional to the full width at half maximum (FWHM) of the wavelength range. The FWHM of the wavelength range is limited by the sweeping range of the source in swept-source OCT and by the number of CCD pixels in spectral-domain OCT. However, the NHA process does not have such constraints; NHA can resolve high frequencies irrespective of the window function and the frame length of the sampled data. In this study, the NHA process is described and applied to OCT imaging, and the results are compared with OCT images based on the DFT. To demonstrate the benefits of using NHA for OCT, we perform OCT imaging with NHA of an onion skin. The results reveal that NHA can achieve an image resolution equivalent to that of a 100-nm sweep range using a significantly reduced wavelength range. They also reveal the potential of using this technique to achieve high-resolution imaging without using a broadband source. However, the long calculation times required for NHA must be addressed if it is to be used in clinical applications.
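    The resolution argument can be illustrated with a toy experiment (a minimal sketch of the underlying principle, not the authors' NHA implementation): the DFT quantizes a frequency estimate to multiples of 1/N for an N-sample frame, while a least-squares fit of a sinusoid over a continuous frequency grid is not tied to that bin spacing.

    ```python
    import numpy as np

    N = 64                        # short frame: DFT bin spacing is 1/64 ~ 0.0156
    t = np.arange(N)
    f_true = 0.1037               # cycles/sample, deliberately between DFT bins
    x = np.cos(2 * np.pi * f_true * t)

    # DFT estimate: the spectral peak can only land on the bin grid k/N
    f_dft = np.argmax(np.abs(np.fft.rfft(x))) / N

    # Least-squares estimate: project x onto cos/sin pairs at candidate
    # frequencies and keep the frequency with the smallest residual energy
    def residual(f):
        basis = np.column_stack([np.cos(2 * np.pi * f * t),
                                 np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(basis, x, rcond=None)
        return np.sum((x - basis @ coef) ** 2)

    f_grid = np.linspace(0.05, 0.15, 2001)   # 5e-5 spacing, off the bin grid
    f_ls = f_grid[np.argmin([residual(f) for f in f_grid])]
    ```

    On this 64-sample frame the DFT peak misses the true frequency by roughly a third of a bin, while the fit recovers it to the grid resolution of the search.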

  12. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed and applied in various fields, such as Web 2.0 for Web applications and product development in industry. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club, or K-plex, are NP-complete and not easily solved on traditional computer architectures. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities, discussed in the paper, demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337
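    As a minimal illustration of why these problems strain conventional architectures (a toy brute-force search on a made-up graph, not a DNA-computing model): exact clique finding by exhaustive search must examine a number of vertex subsets that grows combinatorially with network size.

    ```python
    from itertools import combinations

    # Toy undirected graph on 5 vertices; two triangles sharing vertex 2.
    edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (2, 4)}

    def connected(u, v):
        return (u, v) in edges or (v, u) in edges

    def largest_clique(n):
        # Try subset sizes from largest to smallest: exponential work overall,
        # since there are C(n, k) candidate subsets at each size k.
        for k in range(n, 0, -1):
            for sub in combinations(range(n), k):
                if all(connected(u, v) for u, v in combinations(sub, 2)):
                    return set(sub)
        return set()
    ```

    On this graph the search returns a 3-vertex clique; doubling the vertex count roughly squares the number of subsets inspected, which is the scaling DNA computing's massive parallelism is meant to absorb.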

  13. Applying of digital signal processing to optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to a system with an optical equisignal zone, as a new method of evaluating the position of the optical equisignal zone. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the position of the optical equisignal zone is applied to processing on the array detector. This algorithm makes it possible to evaluate both the lateral displacement and the rotation angles of the receiver relative to the projector. The interrelation of the parameters of the projector and the receiver is considered. An experimental setup was built according to the described principles and then characterized. The accuracy of the position evaluation of the equisignal zone is shown to depend on the size of the equivalent entrance pupil used in processing.

  14. Adaptive control applied to Space Station attitude control system

    NASA Technical Reports Server (NTRS)

    Lam, Quang M.; Chipman, Richard; Hu, Tsay-Hsin G.; Holmes, Eric B.; Sunkel, John

    1992-01-01

    This paper presents an adaptive control approach to enhance the performance of the attitude control system currently used by Space Station Freedom. The proposed control law was developed based on the direct adaptive control, or model reference adaptive control, scheme. Performance comparisons of the adaptive controller and the fixed-gain linear quadratic regulator currently implemented for the Space Station, subject to inertia variation, are conducted. Both the fixed-gain and the adaptive-gain controllers are able to maintain Station stability for inertia variations of up to 35 percent. However, when a 50 percent inertia variation is applied to the Station, only the adaptive controller is able to maintain the Station attitude.

  15. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  16. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  17. BATSE spectroscopy analysis system

    NASA Technical Reports Server (NTRS)

    Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad

    1992-01-01

    The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.

  18. Analysis of possibility of applying the PVDF foil in industrial vibration sensors

    NASA Astrophysics Data System (ADS)

    Wróbel, A.

    2015-11-01

    Many machines make use of piezoelectric effects. Systems with smart materials are widely used because of their high application potential; for example, transducers can be applied to obtain the required characteristics of a designed system. Every engineer and designer knows how important a proper mathematical model and method of analysis are. It is also important to consider all parameters of the analyzed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all of the system's components, because omitting the influence of any one of them introduces inaccuracy into the analysis of the system. The article covers the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. A method of analysis of vibrating sensor systems is presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on these characteristics. The main scientific point of the project is to analyze and demonstrate the possibility of applying a new construction with PVDF foil, or another material belonging to the group of smart materials, in industrial sensors. Currently, practically all manufacturers of vibration level sensors use piezoelectric ceramic plates to generate and detect the vibration of the fork.

  19. Database mining applied to central nervous system (CNS) activity.

    PubMed

    Pintore, M; Taboureau, O; Ros, F; Chrétien, J R

    2001-04-01

    A data set of 389 compounds, active in the central nervous system (CNS) and divided into eight classes according to the receptor type, was extracted from the RBI database and analyzed by Self-Organizing Maps (SOM), also known as Kohonen Artificial Neural Networks. This method gives a 2D representation of the distribution of the compounds in the hyperspace derived from their molecular descriptors. As SOM belongs to the category of unsupervised techniques, it has to be combined with another method in order to generate classification models with predictive ability. The fuzzy clustering (FC) approach seems to be particularly suitable to delineate clusters in a rational way from SOM and to get an automatic objective map interpretation. Maps derived by SOM showed specific regions associated with a unique receptor type and zones in which two or more activity classes are nested. Then, the modeling ability of the proposed SOM/FC Hybrid System tools applied simultaneously to eight activity classes was validated after dividing the 389 compounds into a training set and a test set, including 259 and 130 molecules, respectively. The proper experimental activity class, among the eight possible ones, was predicted simultaneously and correctly for 81% of the test set compounds. PMID:11461760
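    The mapping step can be sketched in a few lines (a minimal Kohonen SOM on synthetic two-class "descriptor" data; the RBI descriptors, map size, and training schedule of the study are not reproduced here): compounds from distinct activity classes should occupy distinct regions of the 2D map.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for molecular descriptors: two activity classes
    # as well-separated Gaussian clusters in a 2-D descriptor space.
    class_a = rng.normal([0.0, 0.0], 0.05, size=(50, 2))
    class_b = rng.normal([1.0, 1.0], 0.05, size=(50, 2))
    data = np.vstack([class_a, class_b])

    # Minimal Kohonen SOM: a 5x5 grid of weight vectors pulled toward the
    # data, with learning rate and neighborhood radius shrinking over time.
    grid = 5
    weights = rng.random((grid, grid, 2))
    coords = np.dstack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"))

    for epoch in range(100):
        lr = 0.5 * (1 - epoch / 100)            # decaying learning rate
        sigma = 2.0 * (1 - epoch / 100) + 0.5   # decaying neighborhood radius
        for x in rng.permutation(data):
            # best-matching unit (BMU): closest weight vector to the sample
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # pull the BMU and its grid neighbors toward the sample
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)

    def bmu_of(x):
        d = np.linalg.norm(weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    # Map regions occupied by each class
    bmus_a = {bmu_of(x) for x in class_a}
    bmus_b = {bmu_of(x) for x in class_b}
    ```

    With clusters this well separated, the two classes land on disjoint sets of map units; the fuzzy clustering step of the paper then delineates such regions automatically rather than by eye.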

  20. Advanced imaging systems for diagnostic investigations applied to Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Peccenini, E.; Albertin, F.; Bettuzzi, M.; Brancaccio, R.; Casali, F.; Morigi, M. P.; Petrucci, F.

    2014-12-01

    Diagnostic investigations are an important resource in studies of Cultural Heritage, enhancing knowledge of the execution techniques, materials, and conservation status of a work of art. In this field, due to the great historical and artistic value of the objects, preservation is the main concern; for this reason, new technological equipment has been designed and developed in the Physics Departments of the Universities of Ferrara and Bologna to enhance the non-invasive approach to the study of pictorial artworks and other objects of cultural interest. Infrared (IR) reflectography, X-ray radiography and computed tomography (CT), applied to works of art, share the same goal: to obtain hidden information on execution techniques and inner structure while remaining non-invasive, although they use different setups and physical principles. In this work, transportable imaging systems for investigating large objects in museums and galleries are presented. In particular, 2D scanning devices for IR reflectography and X-ray radiography, CT systems and some applications to Cultural Heritage are described.

  1. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful for extracting the modal parameters of operating systems. These methods are particularly attractive for investigating the modal basis of string instruments during playing, avoiding certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA, due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) method and a modified least-square complex exponential method in the case of a string instrument, to identify the modal parameters of the instrument while it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify the modes particularly present in the instrument's response with a good estimation, especially if they are close to the excitation frequency, with the modified LSCE method.
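    The core of the LSCE (Prony-type) idea can be sketched on a single synthetic mode (an illustration of the principle only; the paper's modified LSCE and multi-mode, multi-channel formulation are not reproduced): a free-decay response obeys a linear recurrence whose characteristic roots are the system poles, so a least-squares AR fit recovers frequency and damping.

    ```python
    import numpy as np

    # Synthetic impulse response of one damped mode
    dt = 0.001
    t = np.arange(2000) * dt
    f_mode, zeta = 50.0, 0.02                    # modal frequency (Hz), damping ratio
    wn = 2 * np.pi * f_mode
    wd = wn * np.sqrt(1 - zeta**2)               # damped natural frequency
    h = np.exp(-zeta * wn * t) * np.sin(wd * t)

    # Least-squares AR(2) fit: h[n] = a1*h[n-1] + a2*h[n-2]
    A = np.column_stack([h[1:-1], h[:-2]])
    a, *_ = np.linalg.lstsq(A, h[2:], rcond=None)

    # Poles are the roots of z^2 - a1*z - a2; map back to frequency/damping
    roots = np.roots([1.0, -a[0], -a[1]])
    s = np.log(roots[0]) / dt                    # continuous-time pole
    f_est = abs(s.imag) / (2 * np.pi)            # estimated (damped) frequency
    zeta_est = -s.real / abs(s)                  # estimated damping ratio
    ```

    With noise-free data the fit is essentially exact; in operational data the same least-squares structure is applied to correlation functions of the measured responses instead of a known impulse response.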

  2. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    PubMed Central

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Árpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution in concentrations above their critical micellar concentration; consequently, micelles are formed, which undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between the two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis. PMID:24312804

  3. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and for the analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to a hundred microns in depth, is shown. PMID:26479612

  4. Factor Analysis Applied to the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    We present a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  5. Differential Network Analysis Applied to Preoperative Breast Cancer Chemotherapy Response

    PubMed Central

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  6. Differential network analysis applied to preoperative breast cancer chemotherapy response.

    PubMed

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  7. Space elevator systems level analysis

    SciTech Connect

    Laubscher, B. E.

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems, so the successful construction of the SE requires a significant amount of development; this in turn implies a high level of risk for the SE. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  8. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628

  9. Systems Analysis Sub site

    SciTech Connect

    EERE

    2012-03-16

    Systems analysis provides direction, focus, and support for the development and introduction of hydrogen production, storage, and end-use technologies, and provides a basis for recommendations on a balanced portfolio of activities.

  10. SUBSURFACE VISUAL ALARM SYSTEM ANALYSIS

    SciTech Connect

    D.W. Markman

    2001-08-06

    The ''Subsurface Fire Hazard Analysis'' (CRWMS M&O 1998, page 61) and the ''Title III Evaluation Report for the Surface and Subsurface Communication System'' (CRWMS M&O 1999a, pages 21 and 23) both indicate that the installed communication system is adequate to support Exploratory Studies Facility (ESF) activities, with the exception of the mine phone system for emergency notification purposes. They recommend the installation of a visual alarm system to supplement the page/party phone system. The purpose of this analysis is to identify data communication highway design approaches and provide justification for the selected or recommended alternatives for the data communication of the subsurface visual alarm system. This analysis is being prepared to document a basis for the design selection of the data communication method. It will briefly describe existing data or voice communication or monitoring systems within the ESF, and look at how these may be revised or adapted to support the needed data highway of the subsurface visual alarm system. The existing PLC communication system installed in the subsurface provides data communication for the Alcove No. 5 ventilation fans, south portal ventilation fans, bulkhead doors, and the generator monitoring system. It is given that the data communication of the subsurface visual alarm system will be a digital-based system. It is also given that it is most feasible to take advantage of existing systems and equipment rather than consider an entirely new data communication system design and installation. The scope and primary objectives of this analysis are to: (1) Briefly review and describe existing available data communication highways or systems within the ESF. (2) Examine the technical characteristics of an existing system; disqualifying a design alternative is paramount in minimizing the number and depth of system reviews. (3) Apply general engineering design practices or criteria such as relative cost, and degree of

  11. Storage battery systems analysis

    SciTech Connect

    Murphy, K.D.

    1982-01-01

    Storage Battery Systems Analysis supports the battery Exploratory Technology Development and Testing Project with technical and economic analysis of battery systems in various end-use applications. Computer modeling and simulation techniques are used in the analyses. Analysis objectives are achieved through both in-house efforts and outside contracts. In-house studies during FY82 included a study of the relationship between storage battery system reliability and cost, through cost-of-investment and cost-of-service interruption inputs; revision and update of the SOLSTOR computer code in standard FORTRAN 77 form; parametric studies of residential stand-alone photovoltaic systems using the SOLSTOR code; simulation of wind turbine collector/storage battery systems for the community of Kalaupapa, Molokai, Hawaii.

  12. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and into tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
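    The interval estimate described above can be made concrete with a standard propagation-of-uncertainty sketch (generic, with assumed numbers, not the presentation's own worked example): for a PV power reading computed as P = V * I from uncorrelated voltage and current measurements, the relative standard uncertainties combine in root-sum-square fashion.

    ```python
    import math

    # Assumed measured values and standard uncertainties (illustrative only)
    V, u_V = 20.0, 0.05   # volts
    I, u_I = 5.0, 0.02    # amperes

    P = V * I

    # Root-sum-square combination for a product of uncorrelated inputs:
    # relative uncertainties add in quadrature.
    u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)

    # Expanded uncertainty with coverage factor k = 2 (~95% interval)
    k = 2
    interval = (P - k * u_P, P + k * u_P)
    ```

    The stated result is then "P = 100.0 W with expanded uncertainty ±2u_P", which is exactly the interval about the measured value within which the true value is believed to lie.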

  13. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.

  14. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  15. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
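    How a policy feedback changes frequency and damping can be shown on a generic second-order system (illustrative only; this is not the paper's economic model or its numbers): with state x and a rate-feedback policy u = -g*x', the closed-loop eigenvalues gain damping as g grows while the natural frequency is unchanged.

    ```python
    import numpy as np

    # Open-loop plant: x'' + 2*zeta*w*x' + w^2*x = u (assumed parameters)
    w, zeta = 1.0, 0.1

    def closed_loop_modes(g):
        # State [x, x']; the feedback u = -g*x' adds g to the damping term
        A = np.array([[0.0, 1.0],
                      [-w**2, -(2 * zeta * w + g)]])
        eig = np.linalg.eigvals(A)
        wn = abs(eig[0])                 # closed-loop natural frequency
        damping = -eig[0].real / wn      # closed-loop damping ratio
        return wn, damping

    wn0, z0 = closed_loop_modes(0.0)     # no policy feedback
    wn1, z1 = closed_loop_modes(0.4)     # stabilizing rate feedback
    ```

    Here the feedback triples the damping ratio (0.1 to 0.3) without shifting the natural frequency, which is the kind of comparison the paper draws between alternative monetary strategies.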

  16. Ariel Performance Analysis System

    NASA Astrophysics Data System (ADS)

    Ariel, Gideon B.; Penny, M. A.; Saar, Dany

    1990-08-01

    The Ariel Performance Analysis System is a computer-based system for the measurement, analysis and presentation of human performance. The system is based on a proprietary technique for processing multiple high-speed film and video recordings of a subject's performance. It is noninvasive, and does not require wires, sensors, markers or reflectors. In addition, it is portable and does not require modification of the performing environment. The scale and accuracy of measurement can be set to whatever levels are required by the activity being performed.

  17. CONVEYOR SYSTEM SAFETY ANALYSIS

    SciTech Connect

    M. Salem

    1995-06-23

    The purpose and objective of this analysis is to systematically identify and evaluate hazards related to the Yucca Mountain Project Exploratory Studies Facility (ESF) surface and subsurface conveyor system (for a list of conveyor subsystems see section 3). This process is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach was used, since a radiological System Safety Analysis is not required. The risk assessment in this analysis characterizes the accident scenarios associated with the conveyor structures/systems/components in terms of relative risk and includes recommendations for mitigating all identified risks. The priority for recommending and implementing mitigation control features is: (1) incorporate measures to reduce risks and hazards into the structure/system/component (S/S/C) design; (2) add safety devices and capabilities to the designs that reduce risk; (3) provide devices that detect and warn personnel of hazardous conditions; and (4) develop procedures and conduct training to increase worker awareness of potential hazards, of methods to reduce exposure to hazards, and of the actions required to avoid accidents or correct hazardous conditions. The scope of this analysis is limited to the hazards related to the design of conveyor structures/systems/components (S/S/Cs) that occur during normal operation. Hazards occurring during assembly, test, and maintenance or off-normal operations have not been included in this analysis. Construction-related work activities are specifically excluded per DOE Order 5481.1B section 4.c.

  18. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…

  19. Optical methods of stress analysis applied to cracked components

    NASA Technical Reports Server (NTRS)

    Smith, C. W.

    1991-01-01

    After briefly describing the principles of frozen stress photoelastic and moire interferometric analyses, and the corresponding algorithms for converting optical data from each method into stress intensity factors (SIF), the methods are applied to the determination of crack shapes, SIF determination, crack closure displacement fields, and pre-crack damage mechanisms in typical aircraft component configurations.

  20. Applying Research: An Analysis of Texts for Consumers of Research.

    ERIC Educational Resources Information Center

    Erion, R. L.; Steinley, Gary

    The critical reading of research involves: (1) comprehension, (2) evaluation, and (3) application. A study examined six recently published textbooks to determine to what extent they attempt to help students learn to apply educational research; these texts were specifically designed for "consumers" of research (i.e., critical readers of research)…

  1. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  2. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  3. Identifying a cooperative control mechanism between an applied field and the environment of open quantum systems

    NASA Astrophysics Data System (ADS)

    Gao, Fang; Rey-de-Castro, Roberto; Wang, Yaoxiong; Rabitz, Herschel; Shuang, Feng

    2016-05-01

    Many systems under control with an applied field also interact with the surrounding environment. Understanding the control mechanisms has remained a challenge, especially the role played by the interaction between the field and the environment. In order to address this need, here we expand the scope of the Hamiltonian-encoding and observable-decoding (HE-OD) technique. HE-OD was originally introduced as a theoretical and experimental tool for revealing the mechanism induced by control fields in closed quantum systems. The results of open-system HE-OD analysis presented here provide quantitative mechanistic insights into the roles played by a Markovian environment. Two model open quantum systems are considered for illustration. In these systems, transitions are induced by either an applied field linked to a dipole operator or Lindblad operators coupled to the system. For modest control yields, the HE-OD results clearly show distinct cooperation between the dynamics induced by the optimal field and the environment. Although the HE-OD methodology introduced here is considered in simulations, it has an analogous direct experimental formulation, which we suggest may be applied to open systems in the laboratory to reveal mechanistic insights.

  4. Coal systems analysis

    SciTech Connect

    Warwick, P.D.

    2005-07-01

    This collection of papers provides an introduction to the concept of coal systems analysis and contains examples of how coal systems analysis can be used to understand, characterize, and evaluate coal and coal gas resources. Chapters are: Coal systems analysis: A new approach to the understanding of coal formation, coal quality and environmental considerations, and coal as a source rock for hydrocarbons, by Peter D. Warwick. Appalachian coal assessment: Defining the coal systems of the Appalachian Basin, by Robert C. Milici. Subtle structural influences on coal thickness and distribution: Examples from the Lower Broas-Stockton coal (Middle Pennsylvanian), Eastern Kentucky Coal Field, USA, by Stephen F. Greb, Cortland F. Eble, and J.C. Hower. Palynology in coal systems analysis: The key to floras, climate, and stratigraphy of coal-forming environments, by Douglas J. Nichols. A comparison of late Paleocene and late Eocene lignite depositional systems using palynology, upper Wilcox and upper Jackson Groups, east-central Texas, by Jennifer M.K. O'Keefe, Recep H. Sancay, Anne L. Raymond, and Thomas E. Yancey. New insights on the hydrocarbon system of the Fruitland Formation coal beds, northern San Juan Basin, Colorado and New Mexico, USA, by W.C. Riese, William L. Pelzmann, and Glen T. Snyder.

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis; but uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
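
    A minimal sketch of the core calculation: for a PV power measurement P = V·I, the standard uncertainties of the voltage and current readings combine by the root-sum-square propagation rule, and a coverage factor of 2 gives an approximately 95% interval. All numbers below are invented for illustration.

```python
import math

# Invented measurement values and standard uncertainties
V, u_V = 35.0, 0.05   # volts
I, u_I = 8.0, 0.02    # amps

P = V * I
# Sensitivity coefficients: dP/dV = I, dP/dI = V; combine in quadrature
u_P = math.hypot(I * u_V, V * u_I)
U = 2 * u_P           # expanded uncertainty, coverage factor k = 2 (~95%)
print(f"P = {P:.1f} W ± {U:.2f} W")
```

    The pre-test version of the same calculation (done with estimated rather than measured uncertainties) is what identifies where an experiment design needs tightening.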

  6. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  7. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  8. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
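
    The basic schlieren relation behind such measurements links the measured beam deviation angle to the refractive-index gradient integrated along the light path. The sketch below uses the 0.5-milliradian resolution quoted above, but the cell path length and base refractive index are invented, so it is illustrative only.

```python
# Schlieren relation: deviation angle eps ≈ (L / n0) * dn/dx for a thin cell,
# so the smallest resolvable gradient follows from the angular resolution.
eps = 0.5e-3          # measured deviation angle, rad (resolution quoted above)
L = 0.01              # optical path length through the cell, m (assumed)
n0 = 1.33             # base refractive index of the solution (assumed)

dn_dx = n0 * eps / L  # smallest resolvable refractive-index gradient, 1/m
print(f"dn/dx ≈ {dn_dx:.4f} per metre")
```

    Mapping such gradients across the cell, then inverting a known index-concentration relation, is what produces the solute concentration maps described in the abstract.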

  9. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide; in Mediterranean areas especially, they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes, such as abrupt changes in the yearly increment, and anatomical changes, such as reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  10. Seat belt usage: A potential target for applied behavior analysis

    PubMed Central

    Geller, E. Scott; Casali, John G.; Johnson, Richard P.

    1980-01-01

    Results of 1,579 observations of cars entering or exiting campus parking lots showed direct relationships between seat belt wearing and the intrusiveness of the engineering device designed to induce belt usage, and between device intrusiveness and system defeat. For example, all drivers with working interlocks or unlimited buzzer reminders were wearing a seat belt; but 62% of the systems with interlocks or unlimited buzzers had been defeated, and only 15.9% of the drivers in these cars were wearing a seat belt. The normative data indicated marked ineffectiveness of the negative reinforcement contingencies implied by current seat belt inducement systems; but suggested that unlimited buzzer systems would be the optimal system currently available if contingencies were developed to discourage the disconnection and circumvention of such systems. Positive reinforcement strategies are discussed that would be quite feasible for large-scale promotion of seat belt usage. PMID:16795638

  11. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  12. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  13. Biomedical systems analysis program

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Biomedical monitoring programs which were developed to provide a system analysis context for a unified hypothesis for adaptation to space flight are presented and discussed. A real-time system of data analysis and decision making to assure the greatest possible crew safety and mission success is described. Information about man's abilities, limitations, and characteristic reactions to weightless space flight was analyzed and simulation models were developed. The predictive capabilities of simulation models for fluid-electrolyte regulation, erythropoiesis regulation, and calcium regulation are discussed.

  14. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    PubMed

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology, based on arrays of cross-sensitive chemical sensors and chemometric data processing, has attracted a lot of researchers' attention in recent years. The applications reported so far for pharmaceutical tasks have employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, and one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrine, saccharin sodium and citric acid in various combinations. To provide a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongue responses to the samples: principal component analysis (PCA), RV' matrix correlation coefficients, and Tucker's congruency coefficients. PMID:26099261
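
    One of the similarity measures named here, the RV matrix correlation coefficient, can be sketched directly: it compares the sample-by-sample configuration of two data matrices measured on the same samples. The data below are random stand-ins, not e-tongue measurements.

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two data matrices with rows = common samples."""
    X = X - X.mean(axis=0)                 # column-centre both matrices
    Y = Y - Y.mean(axis=0)
    Sxy = np.trace(X @ X.T @ Y @ Y.T)
    Sxx = np.trace(X @ X.T @ X @ X.T)
    Syy = np.trace(Y @ Y.T @ Y @ Y.T)
    return Sxy / np.sqrt(Sxx * Syy)

rng = np.random.default_rng(0)
A = rng.normal(size=(9, 12))               # 9 samples x 12 sensors ("e-tongue 1")
B = A @ rng.normal(size=(12, 7))           # correlated 7-sensor system ("e-tongue 2")
print(rv_coefficient(A, B))                # high for strongly related configurations
print(rv_coefficient(A, rng.normal(size=(9, 7))))  # lower for unrelated data
```

    An RV value near 1 means the two instruments "see" the nine formulations in essentially the same relative arrangement, which is the question the comparison study asks.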

  15. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation, the formation in Hungary considered most suitable for the disposal of high-level radioactive waste. Rock colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
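
    A core-scan colour analysis of this kind can be sketched as collapsing a scanned image (rows = depth) to a per-depth colour signal, such as a redness index, which can then be compared with natural-gamma logs. The image below is synthetic and the redness index R/(R+B) is one plausible choice, not necessarily the one used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)
depth_px, width_px = 1000, 200
# Synthetic stand-in for a scanned core image: depth x width x RGB
img = rng.integers(60, 200, size=(depth_px, width_px, 3)).astype(float)

mean_rgb = img.mean(axis=1)                # one mean RGB triple per depth row
redness = mean_rgb[:, 0] / (mean_rgb[:, 0] + mean_rgb[:, 2])   # R / (R + B)

# Smooth to suppress pixel noise before looking for cyclic units downcore
kernel = np.ones(25) / 25
redness_smooth = np.convolve(redness, kernel, mode="same")
print(redness_smooth.shape)
```

    Spectral or wavelet analysis of such a downcore signal is then what reveals cycle thicknesses like the 0.64 to 13 metre units reported for core Ib-4.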

  16. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which can then be used to calculate the corrections necessary to improve coupling, dynamic aperture, and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All tools necessary to make the analysis method usable at PEP-II have been implemented, and LOCO can now be used as a routine tool for lattice diagnostics.
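
    The essence of a LOCO-style analysis can be sketched as a least-squares fit: adjust model parameters (here two "gradient errors") until the model's orbit response matrix reproduces the measured one. The toy response-matrix function below is invented to show the fitting loop; it is not PEP-II physics or the actual LOCO code.

```python
import numpy as np

def response_matrix(p):
    """Toy 3x2 orbit response matrix depending on two gradient errors."""
    g1, g2 = p
    return np.array([[1.0 + g1, 0.2],
                     [0.5, 1.0 + g2],
                     [0.3 * g1, 0.1 * g2]])

p_true = np.array([0.02, -0.01])      # "real machine" gradient errors
M_meas = response_matrix(p_true)      # stands in for the measured matrix

p = np.zeros(2)                       # start from the ideal model
for _ in range(5):                    # Gauss-Newton iterations
    r = (M_meas - response_matrix(p)).ravel()
    # Numerical Jacobian of the flattened response matrix w.r.t. p
    J = np.empty((r.size, p.size))
    for k in range(p.size):
        dp = np.zeros_like(p)
        dp[k] = 1e-6
        J[:, k] = (response_matrix(p + dp) - response_matrix(p)).ravel() / 1e-6
    p += np.linalg.lstsq(J, r, rcond=None)[0]

print(p)    # fitted parameters recover the "gradient errors"
```

    In the real analysis the parameter vector covers hundreds of gradients, skew gradients, and BPM/corrector calibrations, which is where the PEP-II scale becomes challenging.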

  17. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA), and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interaction (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package, available in the R software, for the AMMI model analysis.
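
    The AMMI decomposition itself is compact enough to sketch: fit the additive main effects (genotype and environment means) first, then take an SVD of the interaction residuals to get the multiplicative IPCA terms. The yield data below are random stand-ins, not the Portuguese oat trials.

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(loc=5.0, scale=0.5, size=(22, 6))   # genotypes x environments

grand = Y.mean()
g_eff = Y.mean(axis=1) - grand       # genotype main effects
e_eff = Y.mean(axis=0) - grand       # environment main effects
# Interaction residuals after removing the additive (main-effects) part
resid = Y - grand - g_eff[:, None] - e_eff[None, :]

# Multiplicative terms: SVD of the interaction matrix
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
ipca1_g = U[:, 0] * np.sqrt(s[0])    # genotype scores on IPCA1
ipca1_e = Vt[0] * np.sqrt(s[0])      # environment scores on IPCA1
explained = s[0] ** 2 / (s ** 2).sum()
print(f"IPCA1 explains {explained:.0%} of the interaction sum of squares")
```

    Genotypes with small IPCA scores are the stable ones; plotting `ipca1_g` against the genotype means gives the usual AMMI biplot.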

  18. Integrated analysis environment for high impact systems

    SciTech Connect

    Martinez, M.; Davis, J.; Scott, J.; Sztipanovits, J.; Karsai, G.

    1998-02-01

    Modeling and analysis of high consequence, high assurance systems requires special modeling considerations. System safety and reliability information must be captured in the models. Previously, high consequence systems were modeled using separate, disjoint models for safety, reliability, and security. The MultiGraph Architecture facilitates the implementation of a model integrated system for modeling and analysis of high assurance systems. Model integrated computing allows an integrated modeling technique to be applied to high consequence systems. Among the tools used for analyzing safety and reliability are a behavioral simulator and an automatic fault tree generation and analysis tool. Symbolic model checking techniques are used to efficiently investigate the system models. A method for converting finite state machine models to ordered binary decision diagrams allows the application of symbolic model checking routines to the integrated system models. This integrated approach to modeling and analysis of high consequence systems ensures consistency between the models and the different analysis tools.

  19. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is composed almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  20. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  1. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  2. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  3. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  4. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility in teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  5. Applying MORT maintenance safety analysis in Finnish industry

    NASA Astrophysics Data System (ADS)

    Ruuhilehto, Kaarin; Virolainen, Kimmo

    1992-02-01

    A safety analysis method based on the MORT (Management Oversight and Risk Tree) method, especially on the version developed for safety considerations in the evaluation of maintenance programs, is presented. The MORT maintenance safety analysis is intended especially for use in maintenance safety management. The analysis helps managers evaluate the goals of their safety work and the measures taken to reach them. The analysis is done by a team or teams. The team ought to have expert knowledge of the organization, both vertically and horizontally, in order to be able to identify factors that may contribute to accidents or other interruptions in the maintenance work. Identification is made by using the MORT maintenance key question set as a checklist. The questions check the way safety matters are connected with maintenance planning and management, as well as with safety management itself. In the second stage, means to eliminate the factors causing problems are developed. New practices are established to improve the safety of maintenance planning and management in the enterprise.

  6. Applied Bibliometrics: Using Citation Analysis in the Journal Submission Process.

    ERIC Educational Resources Information Center

    Robinson, Michael D.

    1991-01-01

    Discusses the use of citation analysis as an effective tool for scholars to determine what journals would be appropriate for publication of their work. Calculating citation distance is explained, and a study with economics journals is described that computed citation distance between previously published articles and journals in the field. (12…

  7. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance. It has interesting implications for training/behavior change to improve immunization rates, along with traditional medical interventions. PMID:15921143

  8. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    EPA Science Inventory

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  9. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  10. Local structural excitations in model glass systems under applied load

    NASA Astrophysics Data System (ADS)

    Swayamjyoti, S.; Löffler, J. F.; Derlet, P. M.

    2016-04-01

    The potential-energy landscape of a model binary Lennard-Jones structural glass is investigated as a function of applied external strain, in terms of how local structural excitations (LSEs) respond to the load. Using the activation relaxation technique and nudged elastic band methods, the evolving structure and barrier energy of such LSEs are studied in detail. For the case of a tensile/compressive strain, the LSE barrier energies generally decrease/increase, whereas under pure shear they may either increase or decrease, resulting in a broadening of the barrier-energy distribution. It is found that how a particular LSE responds to an applied strain is strongly controlled by the LSE's far-field internal stress signature prior to loading.