Science.gov

Sample records for applied systems analysis

  1. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

A methodology is developed for the analysis of multibody systems and applied to a vehicle as a case study. A previous study emphasized the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we develop a guideline for analyzing the dynamical behavior of multibody systems, mainly for validation and verification of a realistic mathematical model, and partly for the design of alternative optimum vehicle parameters.

  2. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits; in that field it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plants. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.
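The sneak-path idea can be sketched on a toy pseudo-electrical analog: reduce the process to a directed graph and flag any source-to-sink route that lies outside the design intent. The plant topology below is invented for illustration and is not taken from the paper.

```python
# Hedged illustration of sneak-path hunting on a toy pseudo-electrical
# analog of a process plant; nodes and edges are invented.
graph = {
    "feed_pump": ["reactor", "bypass_valve"],
    "bypass_valve": ["cooler"],
    "reactor": ["cooler"],
    "cooler": ["storage"],
    "storage": [],
}
# The single intended flow path through the plant:
intended = {("feed_pump", "reactor", "cooler", "storage")}

def all_paths(node, target, path=()):
    # enumerate all simple (cycle-free) paths from node to target
    path = path + (node,)
    if node == target:
        yield path
        return
    for nxt in graph.get(node, []):
        if nxt not in path:
            yield from all_paths(nxt, target, path)

# Any source-to-sink path that is not intended is a candidate sneak path.
sneaks = [p for p in all_paths("feed_pump", "storage") if p not in intended]
```

Real SCA adds topological "clues" (e.g. recognizing specific circuit patterns) on top of this enumeration; the graph reduction step is the part sketched here.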

  3. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  4. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  5. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  6. Applying an Activity System to Online Collaborative Group Work Analysis

    ERIC Educational Resources Information Center

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  7. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
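The modeling idea, a second-order linear differential equation whose negative frequency parameter indicates oscillation, can be sketched as follows. This is an illustrative toy with invented parameter values and simple finite-difference derivative estimates, not the authors' B-spline/multilevel procedure.

```python
# Hedged sketch: recover the parameters of x'' = eta*x + zeta*x' from a
# simulated weekly time series; eta < 0 indicates a cyclical pattern.
import numpy as np

eta_true, zeta_true = -0.25, -0.05   # invented values (eta < 0: oscillation)
dt, n = 1.0, 104                     # weekly measurements over two years

x = np.zeros(n)
v = np.zeros(n)
x[0] = 1.0
for t in range(n - 1):
    a = eta_true * x[t] + zeta_true * v[t]   # acceleration from the ODE
    v[t + 1] = v[t] + a * dt
    x[t + 1] = x[t] + v[t + 1] * dt

# Approximate x' and x'' by central differences, then recover eta and
# zeta by regressing x'' on (x, x') via least squares.
xd = (x[2:] - x[:-2]) / (2 * dt)
xdd = (x[2:] - 2 * x[1:-1] + x[:-2]) / dt ** 2
X = np.column_stack([x[1:-1], xd])
eta_hat, zeta_hat = np.linalg.lstsq(X, xdd, rcond=None)[0]
```

A significant negative `eta_hat` is what the paper interprets as cyclical weekly performance; the published analysis additionally handles missing data and person-level differences.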

  8. Experiences with Probabilistic Analysis Applied to Controlled Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Giesy, Daniel P.

    2004-01-01

    This paper presents a semi-analytic method for computing frequency dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or with multiple highly correlated uncertain parameters. The approach will be shown to not suffer from the same computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency domain performance of an optimal feed-forward disturbance rejection scheme.

  9. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  10. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  11. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
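The core of a calculus-based system sensitivity method can be illustrated with the Global Sensitivity Equations: given the partial derivatives of each coupled discipline, the total system derivatives follow from one linear solve. The coupling coefficients below are invented stand-ins, not the actual propulsion/performance/weights models.

```python
# Hedged sketch of the Global Sensitivity Equations (GSE) idea behind SSA.
import numpy as np

# Partial derivatives of two invented coupled "disciplines":
#   y1 = 0.5*x + 0.2*y2
#   y2 = 0.3*x - 0.4*y1
dy1_dx, dy1_dy2 = 0.5, 0.2
dy2_dx, dy2_dy1 = 0.3, -0.4

# GSE: (I - A) * dy/dx = b, where A holds the cross-coupling partials
# and b the direct partials with respect to the design variable x.
A = np.array([[0.0, dy1_dy2],
              [dy2_dy1, 0.0]])
b = np.array([dy1_dx, dy2_dx])
dy_dx = np.linalg.solve(np.eye(2) - A, b)
```

Solving the two toy equations directly gives y1 and y2 linear in x with slopes 0.56/1.08 and 0.10/1.08, which the linear solve reproduces; the appeal in a design setting is that only discipline-level partials are needed, not repeated converged system analyses.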

  12. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on the performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
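The crisp-set Qualitative Comparative Analysis step can be illustrated with a toy truth table. The condition names, cases, and outcomes below are invented for illustration and are not the study's data.

```python
# Hedged sketch of crisp-set QCA truth-table logic.
# Each case: binary condition values followed by the binary outcome.
# NONCLIN = non-clinical data source, AUTO = automated system,
# MULTI = analyses multiple syndromes; outcome TIMELY = timely detection.
cases = [
    # (NONCLIN, AUTO, MULTI, TIMELY)
    (1, 1, 1, 1),
    (1, 1, 0, 1),
    (1, 0, 1, 1),
    (0, 1, 1, 0),
    (0, 1, 0, 0),
    (0, 0, 1, 0),
]

def truth_table(cases, n_conditions=3):
    # consistency of each observed configuration with the outcome
    counts = {}
    for row in cases:
        config, outcome = row[:n_conditions], row[n_conditions]
        hits, total = counts.get(config, (0, 0))
        counts[config] = (hits + outcome, total + 1)
    return {cfg: hits / total for cfg, (hits, total) in counts.items()}

tt = truth_table(cases)
# Fully consistent configurations are candidate sufficient paths.
sufficient = [cfg for cfg, consistency in tt.items() if consistency == 1.0]
```

Real csQCA would then apply Boolean minimization to these configurations (in this toy table they all share NONCLIN = 1, mirroring the paper's finding that non-clinical data sources mattered).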

  13. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  14. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
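The two uncertainty techniques named above can be contrasted on a toy model. The response function and input distributions here are invented stand-ins for a complex simulator such as WEPP or RZWQM.

```python
# Hedged sketch: first-order error analysis (FOA) vs. Monte Carlo
# uncertainty propagation on an invented toy response.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # invented toy response, mildly nonlinear in two uncertain inputs
    return x[0] ** 2 + 3.0 * x[1]

mu = np.array([1.0, 2.0])      # input means
sigma = np.array([0.1, 0.2])   # input standard deviations

# FOA: var(f) ~ sum_i (df/dx_i)^2 * var(x_i), with the gradient taken
# by central finite differences at the mean point.
h = 1e-5
grad = np.array([(f(mu + h * e) - f(mu - h * e)) / (2 * h) for e in np.eye(2)])
var_foa = float(np.sum(grad ** 2 * sigma ** 2))

# Monte Carlo check with independent normal inputs.
samples = rng.normal(mu, sigma, size=(100_000, 2))
var_mc = float(f(samples.T).var())
```

For this nearly linear toy the two estimates agree closely; FOA needs only a handful of model runs, while Monte Carlo also captures nonlinearity at a much higher computational cost, which is the trade-off the framework evaluates.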

  15. RFI analysis applied to the TDRSS system. [Tracking and Data Relay Satellite System performance

    NASA Technical Reports Server (NTRS)

    Jenny, J. A.

    1973-01-01

The effect of radio frequency interference (RFI) on the proposed Tracking and Data Relay Satellite System (TDRSS) was assessed. The method of assessing RFI was to create a discrete emitter listing containing all the required parameters of transmitters in the applicable VHF and UHF frequency bands. The transmitter and spacecraft receiver characteristics were used to calculate the RFI contribution due to each emitter. The individual contributions were summed to obtain the total impact in the operational bandwidth. Using an as yet incomplete emitter base, it is concluded that the 136- to 137-MHz band should be used by TDRSS rather than the whole 136- to 138-MHz band because of the higher interference levels in the 137- to 138-MHz band. Even when restricting the link to 136 to 137 MHz, the existing link design is marginal, and it is recommended that interference reduction units, such as the adaptive digital filter, be incorporated in the TDRSS ground station.
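The summation step, combining individual emitter contributions into a total in-band interference level, works in linear power units rather than decibels. A minimal sketch with invented emitter levels:

```python
# Hedged sketch: aggregating per-emitter interference contributions.
# The received powers below are invented, not from the TDRSS emitter base.
import math

emitter_dbm = [-120.0, -115.0, -118.0, -125.0]   # per-emitter RFI, in dBm

# Powers add in linear units (milliwatts here), not in decibels,
# so convert each level to mW, sum, and convert back.
total_mw = sum(10 ** (p / 10.0) for p in emitter_dbm)
total_dbm = 10.0 * math.log10(total_mw)
```

Note that the total always exceeds the strongest single contributor, which is why a dense emitter population in the 137- to 138-MHz band can dominate the link budget.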

  16. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  17. Model-Based Systems Engineering With the Architecture Analysis and Design Language (AADL) Applied to NASA Mission Operations

    NASA Technical Reports Server (NTRS)

    Munoz Fernandez, Michela Miche

    2014-01-01

The potential of Model-Based Systems Engineering (MBSE) using the Architecture Analysis and Design Language (AADL) applied to space systems will be described. AADL modeling is applicable to real-time embedded systems, the types of systems NASA builds. A case study with the Juno mission to Jupiter showcases how this work would enable future missions to benefit from using these models throughout their life cycle from design to flight operations.

  18. Applied Behavior Analysis in Education.

    ERIC Educational Resources Information Center

    Cooper, John O.

    1982-01-01

Applied behavior analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the areas of systematic application, direct and daily measurement, and experimental methodology. (CJ)

  19. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

Numerical models are of precious help for predicting water fluxes in the vadose zone and more specifically in Soil-Vegetation-Atmosphere (SVA) systems. For such simulations, robust models and representative soil hydraulic parameters are required. Calibration of unsaturated hydraulic properties is known to be a difficult optimization problem due to the high non-linearity of the water flow equations. Therefore, robust methods are needed to prevent the optimization process from converging to non-optimal parameters. Evolutionary algorithms, and specifically genetic algorithms (GAs), are very well suited to such complex parameter optimization problems. Additionally, GAs offer the opportunity to assess the confidence in the hydraulic parameter estimates, because of the large number of model realizations. The SVA system in this study concerns a pine stand on a heterogeneous sandy soil (podzol) in the Campine region in the north of Belgium. Throughfall, other meteorological data, and water contents at different soil depths were recorded during one year at a daily time step in two lysimeters. The water table level, which varies between 95 and 170 cm, was recorded at 0.5-hour intervals. The leaf area index was also measured at selected times during the year in order to evaluate the energy reaching the soil and to deduce the potential evaporation. Based on the profile description, five soil layers have been distinguished in the podzol. Two models have been used for simulating water fluxes: (i) a mechanistic model, HYDRUS-1D, which solves the Richards equation, and (ii) a compartmental model, which treats the soil profile as a bucket into which water flows until its maximum capacity is reached. A global sensitivity analysis (Morris' one-at-a-time sensitivity analysis) was run prior to the calibration, in order to check the sensitivity in the chosen parameter search space.
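A genetic algorithm calibration loop of the kind described can be sketched as follows. The forward model, parameter bounds, and synthetic observations are invented stand-ins for HYDRUS-1D and the lysimeter data.

```python
# Hedged sketch: a minimal real-coded GA calibrating two parameters of an
# invented retention-like model against synthetic "observations".
import numpy as np

rng = np.random.default_rng(42)

def forward(theta_r, alpha, h):
    # invented toy retention curve (loosely van Genuchten shaped)
    return theta_r + (0.45 - theta_r) / (1.0 + (alpha * h) ** 2)

h_obs = np.linspace(10, 300, 30)        # pressure heads
obs = forward(0.05, 0.02, h_obs)        # synthetic measurements

def fitness(ind):
    theta_r, alpha = ind
    return -np.sum((forward(theta_r, alpha, h_obs) - obs) ** 2)  # maximize

low, high = [0.0, 0.001], [0.2, 0.1]    # parameter search space
pop = rng.uniform(low, high, size=(40, 2))
for gen in range(60):
    scores = np.array([fitness(p) for p in pop])
    new = [pop[np.argmax(scores)]]                     # elitism
    while len(new) < len(pop):
        i, j = rng.integers(len(pop), size=2)
        a = pop[i] if scores[i] > scores[j] else pop[j]   # tournament
        i, j = rng.integers(len(pop), size=2)
        b = pop[i] if scores[i] > scores[j] else pop[j]
        w = rng.uniform(size=2)
        child = w * a + (1 - w) * b                       # blend crossover
        child += rng.normal(0.0, 0.002, size=2)           # Gaussian mutation
        new.append(np.clip(child, low, high))
    pop = np.array(new)

best = pop[np.argmax([fitness(p) for p in pop])]
```

The spread of good individuals in the final population is what gives the confidence assessment mentioned in the abstract; the real study drives HYDRUS-1D rather than a closed-form curve.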

  20. Morphological analysis of galvanized coating applied under vibrowave process system conditions

    NASA Astrophysics Data System (ADS)

    Lebedev, V. A.; Ivanov, V. V.; Fedorov, V. P.

    2016-04-01

    The article presents the morphological research results of galvanized coating applied to the metal surface in the course of mechanical and chemical synthesis realized under vibrowave process system conditions. The paper reveals the specifics of the coating morphology, its activating role in free-moving indentors formed under the impact of low-frequency vibrations and its positive influence on the operational performance of the part surface layer. The advantages of this galvanized coating application method are presented in comparison with conventional methods.

  1. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  2. An image-processing system, motion analysis oriented (IPS-100), applied to microscopy.

    PubMed

    Gualtieri, P; Coltelli, P

    1991-09-01

This paper describes a real-time video image processing system, suitable for the analysis of stationary and moving images. It consists of a high-quality microscope, a general-purpose personal computer, a commercially available image-processing hardware module plugged into the computer bus, a black-and-white TV camera, video monitors and a software package. The structure and the capability of this system are explained. The software is menu-driven and performs real-time image enhancements, real-time mathematical and morphological filters, image segmentation and labelling, real-time identification of moving objects, and real-time analysis of their movements. The program is available in listing form. PMID:1760921
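Image segmentation and labelling, one of the package's functions, can be sketched as thresholding followed by connected-component labelling. The tiny image and threshold below are invented stand-ins for a microscope frame, not the IPS-100 code.

```python
# Hedged sketch: threshold a grey-level image, then label 4-connected
# foreground regions by breadth-first flood fill.
from collections import deque

image = [
    [0, 7, 7, 0, 0],
    [0, 7, 0, 0, 9],
    [0, 0, 0, 9, 9],
]
THRESH = 5   # invented grey-level threshold

rows, cols = len(image), len(image[0])
labels = [[0] * cols for _ in range(rows)]
next_label = 0
for r in range(rows):
    for c in range(cols):
        if image[r][c] > THRESH and labels[r][c] == 0:
            next_label += 1                      # start a new object
            queue = deque([(r, c)])
            labels[r][c] = next_label
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and image[ny][nx] > THRESH
                            and labels[ny][nx] == 0):
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))

n_objects = next_label
```

Tracking moving objects then reduces to matching labelled regions (e.g. by centroid) between consecutive frames.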

  3. Energy analysis of facade-integrated photovoltaic systems applied to UAE commercial buildings

    SciTech Connect

    Radhi, Hassan

    2010-12-15

Developments in the design and manufacture of photovoltaic cells have recently attracted growing attention in the UAE. At present, the embodied energy pay-back time (EPBT) is the criterion used for comparing the viability of such technology against other forms. However, the impact of PV technology on the thermal performance of buildings is not considered at the time of EPBT estimation. If additional energy savings gained over the PV system's life are also included, the total EPBT could be shorter. This paper explores the variation of the total energy of building-integrated photovoltaic systems (BiPV) used as a wall cladding system in the UAE commercial sector and shows that the ratio between PV output and energy savings due to PV panels is within the range of 1:3-1:4. The result indicates that for southern and western facades in the UAE, the embodied energy pay-back time for a photovoltaic system is within the range of 12-13 years. When reductions in operational energy are considered, the pay-back time is reduced to 3.0-3.2 years. This study comes to the conclusion that the reduction in operational energy due to PV panels represents an important factor in the estimation of EPBT. (author)

  4. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.
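A multi-criteria comparison of the kind described can be sketched as a weighted-sum scoring across a common set of metrics. The options, scores, and weights below are invented placeholders, not Sandia's data or criteria.

```python
# Hedged sketch of weighted-sum multi-attribute scoring of energy options.
# All attribute values are normalized to [0, 1], higher = better; the
# numbers are invented for illustration only.
options = {
    "wind":    {"cost": 0.7, "reliability": 0.5, "emissions": 0.9},
    "gas":     {"cost": 0.8, "reliability": 0.9, "emissions": 0.4},
    "nuclear": {"cost": 0.5, "reliability": 0.9, "emissions": 0.9},
}
weights = {"cost": 0.4, "reliability": 0.3, "emissions": 0.3}  # sum to 1

def score(attrs):
    # weighted sum of the normalized attribute values
    return sum(weights[c] * attrs[c] for c in weights)

ranking = sorted(options, key=lambda o: score(options[o]), reverse=True)
```

The interesting analytical work lies in choosing defensible weights and normalizations, which is where frameworks of this kind typically differ; the scoring arithmetic itself is this simple.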

  5. Analysis of Godunov type schemes applied to the compressible Euler system at low Mach number

    NASA Astrophysics Data System (ADS)

    Dellacherie, Stéphane

    2010-02-01

We propose a theoretical framework to clearly explain the inaccuracy of Godunov type schemes applied to the compressible Euler system at low Mach number on a Cartesian mesh. In particular, we clearly explain why this inaccuracy problem concerns the 2D or 3D geometry and does not concern the 1D geometry. The theoretical arguments are based on the Hodge decomposition, on the fact that an appropriate well-prepared subspace is invariant for the linear wave equation, and on the notion of first-order modified equation. This theoretical approach allows us to propose a simple modification that can be applied to any colocated scheme, of Godunov type or not, in order to define a large class of colocated schemes accurate at low Mach number on any mesh. It also allows us to justify colocated schemes that are accurate at low Mach number, for example, the Roe-Turkel and the AUSM+-up schemes, and to find a link with a colocated incompressible scheme stabilized with a Brezzi-Pitkäranta type stabilization. Numerical results justify the theoretical arguments proposed in this paper.

  6. Development and analysis of a resource-aware power management system as applied to small spacecraft

    NASA Astrophysics Data System (ADS)

    Shriver, Patrick

In this thesis, an overall framework and solution method for managing the limited power resources of a small spacecraft is presented. Analogous to mobile computing technology, a primary limiting factor is the available power resources. In spite of the millions of dollars budgeted for research and development over decades, improvements in battery efficiency remain low. This situation is exacerbated by advances in payload technology that lead to increasingly power-hungry and data-intensive instruments. The challenge for the small spacecraft is to maximize capabilities and performance while meeting difficult design requirements and small project budgets. Power management is sought as a solution that can be applied with the existing generation of batteries. Ultimately, the power management problem is one of optimizing system performance and lifetime while maintaining safe operating conditions. This problem is formulated as a constrained, multi-objective combinatorial optimization problem. The problem is argued to be computationally intractable, and a formal proof of optimal substructure is presented. A multi-agent solution paradigm is developed that implements Dynamic Programming and Compromise Programming solutions. A high-level, "black box" software simulation of a typical power system is used to evaluate the developed method. The parameters used in simulation are taken from existing satellite designs. Compared to a traditional spacecraft operations approach, the developed method is shown to be useful in maximizing the utility of the spacecraft. As a battery ages, the method also has an increasing benefit in minimizing mission risk.
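The Dynamic Programming component can be sketched as backward induction over discretized battery states. The task list, recharge rate, and capacity below are invented for illustration and are not taken from the thesis.

```python
# Hedged sketch: DP over battery states to schedule spacecraft tasks.
# At each step one task is offered; decide run-or-skip to maximize total
# utility while the battery level stays feasible. All numbers invented.
CAP = 10                 # battery capacity in discrete charge units
RECHARGE = 2             # solar recharge gained per step
tasks = [(3, 5.0), (4, 7.0), (2, 3.0), (5, 9.0)]   # (energy cost, utility)

# value[b] = best utility achievable from the current step onward
# when the battery holds b units; backward induction over the tasks.
value = [0.0] * (CAP + 1)
for cost, util in reversed(tasks):
    new_value = []
    for b in range(CAP + 1):
        best = value[min(b + RECHARGE, CAP)]             # skip the task
        if b >= cost:                                    # enough charge to run
            run = util + value[min(b - cost + RECHARGE, CAP)]
            best = max(best, run)
        new_value.append(best)
    value = new_value

best_utility = value[CAP]   # start fully charged
```

Optimal substructure, proved formally in the thesis, is exactly what makes this recursion valid: the best schedule from any battery level onward does not depend on how that level was reached.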

  7. Underdetermined system theory applied to qualitative analysis of response caused by attenuating plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1989-05-01

A qualitative analysis of the mechanical response of rate-dependent media caused by a one-dimensional plane smooth wave front and by a continuous wave front attenuating in the media is performed by an underdetermined system of nonlinear partial differential equations. The analysis reveals that the smooth strain, particle velocity, and stress profiles of the smooth wave front are not similar and that the wave front is composed of several partial waves having different properties. Each property is represented by a set of strain rate, acceleration, and stress rate. The wave front derived here from the analysis is composed of four different partial waves. The front of the wave front is necessarily a contraction wave in which strain, particle velocity, and stress increase with time, while the rear is a rarefaction wave where they all decrease with time. Between these two wave fronts there are two remaining wave fronts. We call these wave fronts mesocontraction waves I and II. Wave front I is a wave in which stress decreases notwithstanding the increase in strain and particle velocity with time; it is followed by the other, i.e., wave front II, where particle velocity and stress decrease with time in spite of the increase in strain. The continuous wave front having continuous and nonsmooth profiles of strain, particle velocity, and stress can also be composed of four waves. These waves possess the same properties as the corresponding waves in the smooth wave front mentioned above. The velocities at the three boundaries between the waves are discontinuous. Therefore, these four wave fronts are independent waves, just as a shock wave and a rarefaction wave are. Specifically, the front wave, i.e., a contraction wave front, is being outrun by a second wave front, the second one is being outrun by a third wave front, and the third is being outrun by a fourth wave front, i.e., a rarefaction wave. We call the second wave front degenerate contraction wave I.
We also call the third

  8. Hydrogeochemistry and statistical analysis applied to understand fluoride provenance in the Guarani Aquifer System, Southern Brazil.

    PubMed

    Marimon, Maria Paula C; Roisenberg, Ari; Suhogusoff, Alexandra V; Viero, Antonio Pedro

    2013-06-01

High fluoride concentrations (up to 11 mg/L) have been reported in the groundwater of the Guarani Aquifer System (Santa Maria Formation) in the central region of the state of Rio Grande do Sul, Southern Brazil. In this area, dental fluorosis is an endemic disease. This paper presents the geochemical data and combines statistical analysis (principal component and cluster analyses) with geochemical modeling to characterize the hydrogeochemistry of the groundwater, and discusses the possible origin of the fluoride. The groundwater of the Santa Maria Formation comprises four different geochemical groups. The first group corresponds to sodium chloride groundwater that evolves into the second, a sodium bicarbonate type; both contain fluoride anomalies. The third group is represented by calcium bicarbonate groundwater, and in the fourth, magnesium is the distinctive parameter. The statistical and geochemical analyses, supported by isotopic measurements, indicated that the groundwater may have originated from mixtures with deeper aquifers and that the fluoride concentrations could derive from rock/water interactions (e.g., desorption from clay minerals). PMID:23149723
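The statistical step this abstract describes, principal component analysis over hydrochemical variables followed by grouping, can be sketched as below. This is a minimal illustration, not the authors' workflow, and the ion concentrations are invented for the example.

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via eigendecomposition of the
    covariance matrix of standardized variables."""
    # Standardize each variable (column) to zero mean, unit variance
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # Covariance matrix and its eigendecomposition (eigh: ascending order)
    C = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]          # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Z @ eigvecs[:, :n_components]     # sample scores
    explained = eigvals / eigvals.sum()        # variance ratios
    return scores, eigvecs[:, :n_components], explained

# Hypothetical samples: columns = Na, Cl, HCO3, F concentrations (mg/L)
X = np.array([[120.0,  95.0, 210.0, 0.2],
              [150.0, 110.0, 260.0, 1.5],
              [ 30.0,  12.0, 180.0, 0.1],
              [140.0, 100.0, 250.0, 4.0],
              [ 25.0,  10.0, 170.0, 0.1]])
scores, loadings, explained = pca(X)
```

Samples that plot close together in the score space would then be grouped by cluster analysis into the geochemical groups the paper reports.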

  9. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Pickl, Stefan

    2010-11-01

The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. With a System Dynamics approach these effects can be examined. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not easy to find a balance between economic growth and emission reduction. Our System Dynamics model simulation suggests that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.

  10. Comparison of complexity measures using two complex system analysis methods applied to the epileptic ECoG

    NASA Astrophysics Data System (ADS)

    Janjarasjitt, Suparerk; Loparo, Kenneth A.

    2013-10-01

    A complex system analysis has been widely applied to examine the characteristics of an electroencephalogram (EEG) in health and disease, as well as the dynamics of the brain. In this study, two complexity measures, the correlation dimension and the spectral exponent, are applied to electrocorticogram (ECoG) data from subjects with epilepsy obtained during different states (seizure and non-seizure) and from different brain regions, and the complexities of ECoG data obtained during different states and from different brain regions are examined. From the computational results, the spectral exponent obtained from the wavelet-based fractal analysis is observed to provide information complementary to the correlation dimension derived from the nonlinear dynamical-systems analysis. ECoG data obtained during seizure activity have smoother temporal patterns and are less complex than data obtained during non-seizure activity. In addition, significant differences between these two ECoG complexity measures exist when applied to ECoG data obtained from different brain regions of subjects with epilepsy.
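The correlation dimension named above is commonly estimated from the Grassberger-Procaccia correlation sum over delay-embedded vectors. The sketch below is a minimal, assumption-laden illustration, not the authors' implementation, and it runs on synthetic noise rather than ECoG data.

```python
import numpy as np

def correlation_sum(x, m, tau, r):
    """Fraction of distinct pairs of m-dimensional delay vectors of the
    series x (delay tau) that lie closer than radius r."""
    n = len(x) - (m - 1) * tau
    # Delay-embed the scalar series into m-dimensional vectors
    V = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    d = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)               # distinct pairs only
    return np.mean(d[iu] < r)

def correlation_dimension(x, m, tau, radii):
    """Slope of log C(r) versus log r estimates the correlation dimension."""
    lr = [np.log(r) for r in radii]
    lc = [np.log(correlation_sum(x, m, tau, r)) for r in radii]
    slope, _ = np.polyfit(lr, lc, 1)
    return slope

rng = np.random.default_rng(0)
x = rng.standard_normal(400)                   # synthetic stand-in signal
c_small = correlation_sum(x, m=3, tau=1, r=0.5)
c_large = correlation_sum(x, m=3, tau=1, r=2.0)
d_est = correlation_dimension(x, m=3, tau=1, radii=(0.5, 1.0, 2.0))
```

Lower estimated dimensions indicate less complex dynamics, which is the sense in which seizure ECoG is reported as "less complex" than non-seizure ECoG.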

  11. A novel image processing and measurement system applied to quantitative analysis of simulated tooth root canal shape

    NASA Astrophysics Data System (ADS)

    Yong, Tao; Yong, Wei; Jin, Guofan; Gao, Xuejun

    2005-02-01

Dental pulp is located in the root canal of the tooth. In modern root canal therapy, root canal preparation is the main means of debriding dental pulp infection. Because the shape of the root canal is changed by preparation, the shaping ability, especially the apical offset, is a very important factor when assessing preparation instruments and techniques. In this paper, a novel digital image processing and measurement system is designed and applied to the quantitative analysis of simulated canal shape. Through image pretreatment, feature extraction, registration, and fusion, the variation of the root canals' characteristics (before and after preparation) can be accurately compared and measured, so as to assess the shaping ability of instruments. When the scanning resolution is 1200 dpi or higher, the registration and measurement precision of the system reaches 0.021 mm or better. The performance of the system was tested on a series of simulated root canals and stainless steel K-files.

  12. Matrix effects in applying mono- and polyclonal ELISA systems to the analysis of weathered oils in contaminated soil.

    PubMed

    Pollard, S J T; Farmer, J G; Knight, D M; Young, P J

    2002-01-01

Commercial mono- and polyclonal enzyme-linked immunosorbent assay (ELISA) systems were applied to the on-site analysis of weathered hydrocarbon-contaminated soils at a former integrated steelworks. Comparisons were made between concentrations of solvent extractable matter (SEM) determined gravimetrically by Soxhlet (dichloromethane) extraction and those estimated immunologically by ELISA determination over a concentration range of 2000-330,000 mg SEM/kg soil dry weight. Both ELISA systems under-reported for the more weathered soil samples. Results suggest this is due to matrix effects in the sample rather than any inherent bias in the ELISA systems, and it is concluded that, for weathered hydrocarbons typical of steelworks and coke production sites, the use of ELISA requires careful consideration as a field technique. Consideration of the target analyte relative to the composition of the hydrocarbon waste encountered appears critical. PMID:11858166

  13. Comparing Waste-to-Energy technologies by applying energy system analysis.

    PubMed

    Münster, Marie; Lund, Henrik

    2010-07-01

Even when policies of waste prevention, re-use and recycling are prioritised, a fraction of waste will still be left which can be used for energy recovery. This article asks the question: how can waste best be utilised for energy, seen from an energy system perspective? Eight different Waste-to-Energy technologies are compared with a focus on fuel efficiency, CO(2) reductions and costs. The comparison is carried out by conducting detailed energy system analyses of the present as well as a potential future Danish energy system with a large share of combined heat and power as well as wind power. The study shows the potential of using waste for the production of transport fuels. Biogas and thermal gasification technologies are hence interesting alternatives to waste incineration, and it is recommended to support the use of biogas based on manure and organic waste. It is also recommended to support research into gasification of waste without the addition of coal and biomass. Together the two solutions may provide an alternative use for one third of the waste which is currently incinerated. The remaining fractions should still be incinerated, with priority given to combined heat and power plants with high electric efficiency. PMID:19700298

  14. Advanced Behavioral Applications in Schools: A Review of R. Douglas Greer's "Designing Teaching Strategies: An Applied Behavior Analysis Systems Approach"

    ERIC Educational Resources Information Center

    Moxley, Roy A.

    2004-01-01

    R. Douglas Greer's "Designing Teaching Strategies" is an important book directed to advanced students in applied behavior analysis for classrooms. This review presents some of the striking features of the Comprehensive Applied Behavior Analysis to Schooling (CABAS[R]) program and the individualized instruction that the book advances. These include…

  15. Underdetermined system theory applied to quantitative analysis of responses caused by unsteady smooth-plane waves

    NASA Astrophysics Data System (ADS)

    Sano, Yukio

    1993-01-01

The mechanical responses of rate-dependent media caused by unsteady smooth-plane waves are quantitatively analyzed by an underdetermined system of equations without using any constitutive relation of the media; that is, by using the particle velocity field expressed by an algebraic equation derived from the mass conservation equation, and the stress field expressed by an algebraic equation derived from the momentum conservation equation. First, this approach to analyzing unsteady wave motion is justified by the verification of various inferences, such as the existence of the nonindependent elementary waves by Sano [J. Appl. Phys. 65, 3857 (1989)] and the degradation process by Sano [J. Appl. Phys. 67, 4072 (1990)]. Second, the situation under which a spike arises in particle velocity-time and stress-time profiles, and the reason it arises, are clarified. Third, the influence of the spike on stress-particle velocity and stress-strain paths is examined. The spike induced in the profiles by a growing wave greatly influences the paths near the impacted surface. Finally, calculated particle velocity-time profiles are compared with experimental data.

  16. [Applying association rules to the analysis of adverse drug reactions of Shuxuening injection based on spontaneous reporting system data].

    PubMed

    Yang, Wei; Xie, Yan-Ming; Xiang, Yong-Yang

    2014-09-01

This research analyzed spontaneous reporting system (SRS) data comprising 9,601 case reports of adverse drug reactions (ADRs) to Shuxuening injection collected by the national adverse drug reaction monitoring center during 2005-2012. Association rules were applied to analyze the relationship between the ADRs of Shuxuening injection and the characteristics of the ADR reports. We found that common ADR combinations were "nausea + breath + chills + vomiting" and "nausea + chills + vomiting + palpitations", with confidence levels of 100%. Common combinations of ADRs and case report attributes were "itching, glucose and sodium chloride injection, general ADR report, normal dosage", "palpitation, glucose and sodium chloride injection, normal dosage, new report", and "chills, general ADR report, normal dosage, 0.9% sodium chloride injection", also with confidence levels of 100%. The results showed that most ADRs occurring in patients using Shuxuening injection were systemic damage, skin and appendage damage, digestive system damage, etc. Most cases were general and new reports, with patients on normal dosage. The occurrence of ADRs was little related to the solvent, suggesting that the ADRs of Shuxuening injection are mainly related to the drug's composition. Clinical use of Shuxuening injection therefore requires close observation, with attention to ADR reactions and good drug risk management. PMID:25532406
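The support and confidence measures behind such association rules can be illustrated with a toy computation. The mini-dataset below is hypothetical and stands in for the SRS case reports; it is not the study's data.

```python
def support(reports, itemset):
    """Fraction of reports containing every item in itemset."""
    itemset = set(itemset)
    return sum(itemset <= r for r in reports) / len(reports)

def confidence(reports, antecedent, consequent):
    """confidence(A -> B) = support(A union B) / support(A)."""
    joint = set(antecedent) | set(consequent)
    return support(reports, joint) / support(reports, antecedent)

# Hypothetical symptom sets, one per case report
reports = [
    {"nausea", "chills", "vomiting", "palpitations"},
    {"nausea", "chills", "vomiting", "breath"},
    {"itching", "chills"},
    {"nausea", "vomiting"},
]
c = confidence(reports, {"nausea", "chills"}, {"vomiting"})
```

A confidence of 100% for a rule, as reported in the abstract, means every report containing the antecedent items also contains the consequent.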

  17. Frequency Domain Analysis of Beat-Less Control Method for Converter-Inverter Driving Systems Applied to AC Electric Cars

    NASA Astrophysics Data System (ADS)

    Kimura, Akira

    In inverter-converter driving systems for AC electric cars, the DC input voltage of an inverter contains a ripple component with a frequency that is twice as high as the line voltage frequency, because of a single-phase converter. The ripple component of the inverter input voltage causes pulsations on torques and currents of driving motors. To decrease the pulsations, a beat-less control method, which modifies a slip frequency depending on the ripple component, is applied to the inverter control. In the present paper, the beat-less control method was analyzed in the frequency domain. In the first step of the analysis, transfer functions, which revealed the relationship among the ripple component of the inverter input voltage, the slip frequency, the motor torque pulsation and the current pulsation, were derived with a synchronous rotating model of induction motors. An analysis model of the beat-less control method was then constructed using the transfer functions. The optimal setting of the control method was obtained according to the analysis model. The transfer functions and the analysis model were verified through simulations.

  18. NextGen Brain Microdialysis: Applying Modern Metabolomics Technology to the Analysis of Extracellular Fluid in the Central Nervous System

    PubMed Central

    Kao, Chi-Ya; Anderzhanova, Elmira; Asara, John M.; Wotjak, Carsten T.; Turck, Christoph W.

    2015-01-01

    Microdialysis is a powerful method for in vivo neurochemical analyses. It allows fluid sampling in a dynamic manner in specific brain regions over an extended period of time. A particular focus has been the neurochemical analysis of extracellular fluids to explore central nervous system functions. Brain microdialysis recovers neurotransmitters, low-molecular-weight neuromodulators and neuropeptides of special interest when studying behavior and drug effects. Other small molecules, such as central metabolites, are typically not assessed despite their potential to yield important information related to brain metabolism and activity in selected brain regions. We have implemented a liquid chromatography online mass spectrometry metabolomics platform for an expanded analysis of mouse brain microdialysates. The method is sensitive and delivers information for a far greater number of analytes than commonly used electrochemical and fluorescent detection or biochemical assays. The metabolomics platform was applied to the analysis of microdialysates in a foot shock-induced mouse model of posttraumatic stress disorder (PTSD). The rich metabolite data information was then used to delineate affected prefrontal molecular pathways that reflect individual susceptibility for developing PTSD-like symptoms. We demonstrate that hypothesis-free metabolomics can be adapted to the analysis of microdialysates for the discovery of small molecules with functional significance.

  19. Simulating Nationwide Pandemics: Applying the Multi-scale Epidemiologic Simulation and Analysis System to Human Infectious Diseases

    SciTech Connect

    Dombroski, M; Melius, C; Edmunds, T; Banks, L E; Bates, T; Wheeler, R

    2008-09-24

This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models, and those advantages enable a broader array of stochastic analyses of model runs to be conducted. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, the elderly, and stay-at-home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis of the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally; (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people; or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak.
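The MESA system itself is not shown here, but the contrast between an outbreak that burns itself out and one that becomes a pandemic can be reproduced qualitatively with a minimal compartmental SIR model. This sketch is a simple stand-in under assumed parameters, not the MESA methodology.

```python
def sir_step(s, i, r, beta, gamma, dt=0.1):
    """One Euler step of the classic SIR model on population fractions.
    beta = transmission rate, gamma = recovery rate."""
    new_inf = beta * s * i * dt
    new_rec = gamma * i * dt
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(beta, gamma, i0=1e-4, steps=5000):
    """Run the epidemic to completion; return (final attack rate,
    peak prevalence)."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return r, peak

# R0 = beta/gamma < 1: the outbreak burns itself out;
# R0 > 1: it spreads widely before being contained.
small = simulate(beta=0.1, gamma=0.2)   # R0 = 0.5
large = simulate(beta=0.4, gamma=0.2)   # R0 = 2.0
```

Agent-based systems like MESA add the spatial and demographic structure (schools, workplaces, air travel) that such a well-mixed model deliberately omits.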
There are several limitations of the methodology that should be explored in future

  20. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    NASA Astrophysics Data System (ADS)

    Beraud, H.; Barroca, B.; Hubert, G.

    2012-12-01

A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e., enabling it to maintain or recover acceptable operating levels after flooding, is essential. To achieve this, we must understand how the system works in order to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used to understand the complexity of this type of system. The purpose of this article is to show the interest of this type of method, and the limits of its use, for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application on a study site. 1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  1. Applied mathematics of chaotic systems

    SciTech Connect

    Jen, E.; Alber, M.; Camassa, R.; Choi, W.; Crutchfield, J.; Holm, D.; Kovacic, G.; Marsden, J.

    1996-07-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objectives of the project were to develop new mathematical techniques for describing chaotic systems and for reexpressing them in forms that can be solved analytically and computationally. The authors focused on global bifurcation analysis of rigid body motion in an ideal incompressible fluid and on an analytical technique for the exact solution of nonlinear cellular automata. For rigid-body motion, they investigated a new completely integrable partial differential equation (PDE) representing model motion of fronts in nematic crystals and studied perturbations of the integrable PDE. For cellular automata with multiple domain structures, the work has included: (1) identification of the associated set of conserved quantities for each type of domain; (2) use of the conserved quantities to construct isomorphism between the nonlinear system and a linear template; and (3) use of exact solvability methods to characterize detailed structure of equilibrium states and to derive bounds for maximal transience times.

  2. Structured Biological Modelling: a method for the analysis and simulation of biological systems applied to oscillatory intracellular calcium waves.

    PubMed

    Kraus, M; Lais, P; Wolf, B

    1992-01-01

In biology, signal and information processing networks are widespread. Due to their inherent complexity and non-linear dynamics, the time evolution of these systems cannot be predicted by simple plausibility arguments. Fortunately, the power of modern computers allows the simulation of complex biological models. The problem therefore reduces to the question of how to develop a consistent mathematical model which comprises the essentials of the real biological system. As an interface between the phenomenological description and a computer simulation of the system, the proposed method of Structured Biological Modelling (SBM) uses top-down levelled dataflow diagrams. They serve as a powerful tool for the analysis and the mathematical description of the system in terms of a stochastic formulation. The stochastic treatment, regarding the time evolution of the system as a stochastic process governed by a master equation, circumvents most difficulties arising from high-dimensional and non-linear systems. As an application of SBM we develop a stochastic computer model of intracellular oscillatory Ca2+ waves in non-excitable cells. As demonstrated by this example, SBM can be used for the design of computer experiments which under certain conditions can serve as cheap and harmless counterparts to the usual time-consuming biological experiments. PMID:1334718
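A master-equation formulation like the one described above is typically sampled with Gillespie's stochastic simulation algorithm. The sketch below applies it to a hypothetical birth-death model of a single messenger species; it is an illustration of the general technique, not the paper's Ca2+ wave model.

```python
import random

def gillespie(rates, stoich, state, t_max, seed=42):
    """Gillespie stochastic simulation: samples a trajectory of the
    master equation for a reaction network.
    rates:  list of propensity functions a_j(state)
    stoich: list of state-change tuples, one per reaction."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, tuple(state))]
    while t < t_max:
        a = [f(state) for f in rates]
        a0 = sum(a)
        if a0 == 0:
            break
        t += rng.expovariate(a0)          # waiting time to next reaction
        # Choose which reaction fires, proportional to its propensity
        u, j, acc = rng.random() * a0, 0, a[0]
        while acc < u:
            j += 1
            acc += a[j]
        state = [x + d for x, d in zip(state, stoich[j])]
        traj.append((t, tuple(state)))
    return traj

# Hypothetical model: production at constant rate k1,
# degradation at rate k2 * n (steady-state mean = k1 / k2 = 100)
k1, k2 = 10.0, 0.1
traj = gillespie(
    rates=[lambda s: k1, lambda s: k2 * s[0]],
    stoich=[(+1,), (-1,)],
    state=[0],
    t_max=200.0,
)
```

Each run yields one stochastic realization; averaging many runs approximates the probability distribution that the master equation evolves.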

  3. Economic impacts of bio-refinery and resource cascading systems: an applied general equilibrium analysis for Poland.

    PubMed

    Ignaciuk, Adriana M; Sanders, Johan

    2007-12-01

Due to more stringent energy and climate policies, it is expected that many traditional chemicals will be replaced by their biomass-based substitutes, bio-chemicals. These innovations, however, can influence land allocation, since the demand for land dedicated to specific crops might increase; they can also affect traditional agricultural production. In this paper, we use an applied general equilibrium framework in which we include two different bio-refinery processes and incorporate so-called cascading mechanisms. The bio-refinery processes use grass as one of the major inputs to produce bio-nylon and propane-diol (1,3PDO) to substitute for currently produced fossil fuel-based nylon and ethane-diol. We examine the impact of specific climate policies on the bioelectricity share in total electricity production, land allocation, and production quantities and prices of selected commodities. The novel technologies become competitive with increased stringency of climate policies. This switch, however, does not induce a higher share of bioelectricity. The cascade does stimulate the production of bioelectricity, but it induces more of a shift in inputs in the bioelectricity sector (from biomass to the cascaded bio-nylon and 1,3PDO) than an increase in the production level of bioelectricity. We conclude that dedicated biomass crops will remain the main option for bioelectricity production: the contribution of the biomass systems remains limited. Moreover, the bioelectricity sector loses the competition for land for biomass production to bio-refineries. PMID:17924388

  4. A mathematical method for the 3D analysis of rotating deformable systems applied on lumen-forming MDCK cell aggregates.

    PubMed

    Marmaras, Anastasios; Berge, Ulrich; Ferrari, Aldo; Kurtcuoglu, Vartan; Poulikakos, Dimos; Kroschewski, Ruth

    2010-04-01

    Cell motility contributes to the formation of organs and tissues, into which multiple cells self-organize. However such mammalian cellular motilities are not characterized in a quantitative manner and the systemic consequences are thus unknown. A mathematical tool to decipher cell motility, accounting for changes in cell shape, within a three-dimensional (3D) cell system was missing. We report here such a tool, usable on segmented images reporting the outline of clusters (cells) and allowing the time-resolved 3D analysis of circular motility of these as parts of a system (cell aggregate). Our method can analyze circular motility in sub-cellular, cellular, multi-cellular, and also non-cellular systems for which time-resolved segmented cluster outlines are available. To exemplify, we characterized the circular motility of lumen-initiating MDCK cell aggregates, embedded in extracellular matrix. We show that the organization of the major surrounding matrix fibers was not significantly affected during this cohort rotation. Using our developed tool, we discovered two classes of circular motion, rotation and random walk, organized in three behavior patterns during lumen initiation. As rotational movements were more rapid than random walk and as both could continue during lumen initiation, we conclude that neither the class nor the rate of motion regulates lumen initiation. We thus reveal a high degree of plasticity during a developmentally critical cell polarization step, indicating that lumen initiation is a robust process. However, motility rates decreased with increasing cell number, previously shown to correlate with epithelial polarization, suggesting that migratory polarization is converted into epithelial polarization during aggregate development. PMID:20183868

  5. Module systems applied to biomass

    SciTech Connect

    Jenkins, B.M.

    1983-12-01

    Applications of cotton moduling equipment to biomass have been tested in California. A module of chopped rice straw was made to determine physical characteristics of straw modules. A module system for tree prunings using a heavy duty module builder was tested extensively in 1983. Total direct costs to module, transport 8 km (5 mi), store, cut, tubgrind, and haul chips 50 km (30 mi) to a cogeneration plant is estimated to be $26.64/t ($24.17/t).

  6. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    SciTech Connect

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies that threaten mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University and Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.

  7. R-matrix analysis of reactions in the 9B compound system applied to the 7Li problem in BBN

    NASA Astrophysics Data System (ADS)

    Paris, M.; Hale, G.; Hayes-Sterbenz, A.; Jungman, G.

    2016-01-01

Recent activity in solving the ‘lithium problem’ in big bang nucleosynthesis has focused on the role that putative resonances may play in resonance-enhanced destruction of 7Li. Particular attention has been paid to the reactions involving the 9B compound nuclear system, d+7Be → 9B. These reactions are analyzed via the multichannel, two-body unitary R-matrix method using the code EDA developed by Hale and collaborators. We employ much of the known elastic and reaction data in a four-channel treatment. The data include elastic 3He +6Li differential cross sections from 0.7 to 2.0 MeV, integrated reaction cross sections for energies from 0.7 to 5.0 MeV for 6Li(3He,p)8Be* and from 0.4 to 5.0 MeV for the 6Li(3He,d)7Be reaction. Capture data have been added to an earlier analysis with integrated cross section measurements from 0.7 to 0.825 MeV for 6Li(3He,γ)9B. The resulting resonance parameters are compared with tabulated values, and previously unidentified resonances are noted. Our results show that there are no near d+7Be threshold resonances with widths of tens of keV, and they reduce the likelihood that a resonance-enhanced mass-7 destruction mechanism, as suggested in recently published work, can explain the 7Li problem.

  8. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  9. The basic importance of applied behavior analysis

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650

  10. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  11. 2012 International Conference on Medical Physics and Biomedical Engineering Thermal Economic Analysis on LiBr Refrigeration -Heat Pump System Applied in CCHP System

    NASA Astrophysics Data System (ADS)

    Zhang, CuiZhen; Yang, Mo; Lu, Mei; Zhu, Jiaxian; Xu, Wendong

The cooling water of a LiBr refrigeration unit carries a large amount of low-temperature heat, which can be used to heat boiler feed water. This paper introduces a LiBr refrigeration-heat pump system that recovers heat from the LiBr refrigeration cooling water via a heat pump to heat the boiler feed water. A thermal economic analysis of the system has been performed based on experimental data. Results show that the LiBr refrigeration-heat pump system brings a 26.6 percent decrease in primary energy rate consumption compared with the combined heat and power production system (CHP) and separate generation of cold.

  12. System planning analysis applied to OTEC: initial cases by Florida Power Corporation. Task II report No. FC-5237-2

    SciTech Connect

    1980-03-01

The objective of the task was to exercise the FPC system planning methodology on: (1) a Base Case, a 10-year generation expansion plan with coal plants providing base load expansion, and (2) the same, but with 400 MW of OTEC substituting for coal burning units with equal resultant system reliability. OTEC inputs were based on reasonable economic projections of direct capital cost and O and M costs for first-generation large commercial plants; they are discussed in Section 2. The Base Case conditions for the FPC system planning methodology involved base load coal fueled additions during the 1980's and early 1990's. The first trial runs of the PROMOD system planning model substituted OTEC for 400 MW purchases of coal generated power during 1988-1989 and then 400 MW coal capacity thereafter. Results showed higher system reliability than the Base Case runs. Reruns with greater coal fueled capacity displacement showed that OTEC could substitute for the 400 MW purchases in 1988-1989 and replace the 800 MW coal unit scheduled for 1990 to yield equivalent system reliability. However, a 1995 unit would need to be moved to 1994. Production costing computer model runs were used as input to the Corporate Model to examine corporate financial impact. The present value of total revenue requirements was the primary indication of relative competitiveness between the Base Case and OTEC. Results show the present value of total revenue requirements unfavorable to OTEC as compared to coal units. The disparity was in excess of the allowable range for possible consideration.

  13. Electrochemical analysis of acetaminophen using a boron-doped diamond thin film electrode applied to flow injection system.

    PubMed

    Wangfuengkanagul, Nattakarn; Chailapakul, Orawon

    2002-06-01

    The electrochemistry of acetaminophen in phosphate buffer solution (pH 8) was studied at a boron-doped diamond (BDD) thin film electrode using cyclic voltammetry, hydrodynamic voltammetry, and flow injection with amperometric detection. Cyclic voltammetry was used to study the reaction as a function of analyte concentration. Comparison experiments were performed using a polished glassy carbon (GC) electrode. Acetaminophen undergoes a quasi-reversible reaction at both electrodes. The BDD and GC electrodes provided well-resolved cyclic voltammograms, but the voltammetric signal-to-background ratios obtained from the diamond electrode were higher than those obtained from the GC electrode. The diamond electrode provided a linear dynamic range from 0.1 to 8 mM and a detection limit of 10 microM (S/B approximately 3) for voltammetric measurement. The flow injection analysis results at the diamond electrode indicated a linear dynamic range from 0.5 to 50 microM and a detection limit of 10 nM (S/N approximately 4). Acetaminophen in syrup samples has also been investigated. The results obtained in the recovery study (24.68+/-0.26 mg/ml) were comparable to the labeled value (24 mg/ml). PMID:12039625

  14. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criteria. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  15. Global energy perspectives: A summary of the joint study by the International Institute for Applied Systems Analysis and World Energy Council

    SciTech Connect

    Gruebler, A.; Nakicenovic, N.; Jefferson, M.

    1996-03-01

    This article reports a study on Global Energy Perspectives to 2050 and Beyond conducted jointly by the International Institute for Applied Systems Analysis (IIASA) and the World Energy Council (WEC). Altogether, three cases of economic and energy development were formulated, which unfold into six scenarios of energy supply alternatives extending to the end of the 21st century. The internal consistency of the scenarios was assessed with the help of formal energy models. The study took close account of world population prospects, economic growth, technological advance, the energy resource base, environmental implications from the local to the global level, financing requirements, and the future prospects of both fossil and nonfossil fuels and industries. Although no analysis can turn an uncertain future into a sure thing, the study identifies patterns that are robust across a purposely broad range of scenarios. The study also enables one to relate alternative near-term research and development, technology, economic, and environmental policies to the possible long-term divergence of energy systems structures. Due to the long lead times involved in the turnover of capital stock and infrastructures of the energy system, policies would need to be implemented now in order to initiate long-term structural changes in the energy system that would, however, become significant only after the year 2020. 23 refs., 10 figs., 8 tabs.

  16. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  17. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156
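
    For context, the SPC methods under discussion center on control charts: compute limits from the data, then flag points that fall outside them. A minimal sketch of a Shewhart individuals chart using the standard moving-range estimate of sigma; the data are made up for illustration:

```python
# Hypothetical behavioral measurements (e.g., daily response counts).
data = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 14.5, 10.1, 9.7]

mean = sum(data) / len(data)
# Estimate process sigma from the average moving range (d2 = 1.128 for n = 2).
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128

ucl = mean + 3 * sigma_hat  # upper control limit
lcl = mean - 3 * sigma_hat  # lower control limit
out_of_control = [x for x in data if x > ucl or x < lcl]
print(out_of_control)  # points signaling special-cause variation
```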

  18. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  19. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  20. Thermodynamic Laws Applied to Economic Systems

    ERIC Educational Resources Information Center

    González, José Villacís

    2009-01-01

    Economic activity in its different manifestations--production, exchange, consumption and, particularly, information on quantities and prices--generates and transfers energy. As a result, we can apply to it the basic laws of thermodynamics. These laws are applicable within a system, i.e., in a country or between systems and countries. To these…

  1. System Applies Polymer Powder To Filament Tow

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Snoha, John J.; Marchello, Joseph M.

    1993-01-01

    Polymer powder applied uniformly and in continuous manner. Powder-coating system applies dry polymer powder to continuous fiber tow. Unique filament-spreading technique, combined with precise control of tension on fibers in system, ensures uniform application of polymer powder to web of spread filaments. Fiber tows impregnated with dry polymer powders ("towpregs") produced for preform-weaving and composite-material-molding applications. System and process valuable to prepreg industry, for production of flexible filament-windable tows and high-temperature polymer prepregs.

  2. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  3. Some still-current dimensions of applied behavior analysis

    PubMed Central

    Baer, Donald M.; Wolf, Montrose M.; Risley, Todd R.

    1987-01-01

    Twenty years ago, an anthropological note described the current dimensions of applied behavior analysis as it was prescribed and practiced in 1968: It was, or ought to become, applied, behavioral, analytic, technological, conceptual, effective, and capable of appropriately generalized outcomes. A similar anthropological note today finds the same dimensions still prescriptive, and to an increasing extent, descriptive. Several new tactics have become evident, however, some in the realm of conceptual analysis, some in the sociological status of the discipline, and some in its understanding of the necessary systemic nature of any applied discipline that is to operate in the domain of important human behaviors. PMID:16795703

  4. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity, and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D sessions and the weekly operational R1+R4 experiment sessions. When raytraced delays were applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines than with the VMF1 mapping functions, and vertical repeatabilities were better for 65% of sites.
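
    For comparison, the azimuthally symmetric approach scales a zenith delay by an elevation-dependent mapping factor. A sketch of the three-term continued-fraction (Herring) form on which VMF1 is based; the coefficients a, b, c and the zenith delay below are illustrative placeholders, since VMF1 derives its actual coefficients from ECMWF data:

```python
import math

def mapping_factor(elev_rad: float, a: float, b: float, c: float) -> float:
    """Continued-fraction mapping function, normalized to 1 at zenith."""
    def cf(s):
        return s + a / (s + b / (s + c))
    return cf(1.0) / cf(math.sin(elev_rad))

zenith_delay_m = 2.3  # typical zenith hydrostatic delay in metres (illustrative)
for elev_deg in (90, 30, 10, 5):
    m = mapping_factor(math.radians(elev_deg), a=1.2e-3, b=2.9e-3, c=62.6e-3)
    print(f"elevation {elev_deg:2d} deg: factor {m:6.2f}, "
          f"slant delay {zenith_delay_m * m:6.2f} m")
```

    By construction the factor is identical at all azimuths for a given elevation, which is exactly the symmetry assumption that raytracing through a 3D refractivity field removes.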

  5. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
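
    The Laplacian pyramid decomposition described above is straightforward to sketch; the level count and smoothing width here are illustrative choices, not the paper's parameters:

```python
import numpy as np
from scipy import ndimage

def laplacian_pyramid(img, levels=4, sigma=1.0):
    """Decompose an image into band-pass detail levels plus a low-pass residual."""
    pyramid, current = [], img.astype(float)
    for _ in range(levels):
        blurred = ndimage.gaussian_filter(current, sigma)
        down = blurred[::2, ::2]  # decimate by two in each axis
        up = ndimage.zoom(down, 2, order=1)[:current.shape[0], :current.shape[1]]
        pyramid.append(current - up)  # band-pass detail at this scale
        current = down
    pyramid.append(current)  # low-pass residual
    return pyramid

def reconstruct(pyramid):
    """Invert the decomposition by expanding and adding back each detail level."""
    current = pyramid[-1]
    for detail in reversed(pyramid[:-1]):
        up = ndimage.zoom(current, 2, order=1)[:detail.shape[0], :detail.shape[1]]
        current = up + detail
    return current

img = np.random.rand(64, 64)
pyr = laplacian_pyramid(img)
recon = reconstruct(pyr)
print(np.abs(recon - img).max())  # reconstruction is exact up to float rounding
```

    Each pyramid level isolates structure at one spatial scale, which is what allows filaments, fragments, and clumps to be separated and analyzed scale by scale.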

  6. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  7. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  8. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA, or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. PMID:23625877
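
    The mixture problem the laboratory builds toward reduces, under the Beer-Lambert-Bouguer Law, to a linear system: with absorbance measured at as many wavelengths as there are species, all concentrations can be solved for at once. A sketch with hypothetical molar absorptivities, not measured constants:

```python
import numpy as np

path_cm = 1.0
# Rows: wavelengths; columns: species (e.g., protein, nucleic acid).
# Molar absorptivities in L mol^-1 cm^-1; values are illustrative only.
eps = np.array([[5000.0,  800.0],
                [ 600.0, 7000.0]])

true_c = np.array([2.0e-5, 4.0e-5])      # molar concentrations
absorbance = path_cm * eps @ true_c      # simulated mixture measurements

# Recover both concentrations from the two measured absorbances:
c_est = np.linalg.solve(path_cm * eps, absorbance)
print(c_est)
```

    With more wavelengths than species, `np.linalg.lstsq` gives a least-squares fit instead, which averages out measurement noise.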

  9. A novel mating system analysis for modes of self-oriented mating applied to diploid and polyploid arctic Easter daisies (Townsendia hookeri).

    PubMed

    Thompson, S L; Ritland, K

    2006-08-01

    We have developed a new model for mating system analysis, which attempts to distinguish among alternative modes of self-oriented mating within populations. This model jointly estimates the rates of outcrossing, selfing, automixis and apomixis, through the use of information in the family structure given by dominant genetic marker data. The method is presented, its statistical properties evaluated, and it is applied to three arctic Easter daisy populations, one consisting of diploids, the other two of tetraploids. The tetraploids are predominantly male sterile and reported to be apomictic, while the diploids are male fertile. In each Easter daisy population, 10 maternal arrays of six progeny were assayed for amplified fragment length polymorphism markers. Estimates, confirmed with likelihood ratio tests of mating hypotheses, showed apomixis to be predominant in all populations (ca. 70%), but selfing or automixis was moderate (ca. 25%) in tetraploids. It was difficult to distinguish selfing from automixis, and simulations confirm that even with very large sample sizes, the estimates have a very strong negative statistical correlation, i.e., they are not independent. No selfing or automixis was apparent in the diploid population; instead, moderate levels of outcrossing were detected (23%). Low but significant levels of outcrossing (2-4%) seemed to occur in the male-sterile tetraploid populations; this may be due to genotyping error of this level. Overall, this study shows apomixis can be partial, and provides evidence for higher levels of inbreeding in polyploids compared to diploids and for significant levels of apomixis in a diploid plant population. PMID:16721390

  10. Spectral Selectivity Applied To Hybrid Concentration Systems

    NASA Astrophysics Data System (ADS)

    Hamdy, M. A.; Luttmann, F.; Osborn, D. E.; Jacobson, M. R.; MacLeod, H. A.

    1985-12-01

    The efficiency of conversion of concentrated solar energy can be improved by separating the solar spectrum into portions matched to specific photoquantum processes and the balance used for photothermal conversion. The basic approaches of spectrally selective beam splitters are presented. A detailed simulation analysis using TRNSYS is developed for a spectrally selective hybrid photovoltaic/photothermal concentrating system. The analysis shows definite benefits to a spectrally selective approach.

  11. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The first Applied Information Systems Research Program (AISRP) Workshop provided the impetus for several groups involved in information systems to review current activities. The objectives of the workshop included: (1) providing an open forum for interaction and discussion of information systems; (2) promoting understanding by initiating a dialogue with the intended benefactors of the program, the scientific user community, and discussing options for improving their support; (3) creating advocacy by having science users and investigators of the program meet together and establish the basis for direction and growth; and (4) supporting the future of the program by building collaborations and interaction to encourage an investigator working group approach for conducting the program.

  12. The Applied Mathematics for Power Systems (AMPS)

    SciTech Connect

    Chertkov, Michael

    2012-07-24

    Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.

  13. Use of an automated chromium reduction system for hydrogen isotope ratio analysis of physiological fluids applied to doubly labeled water analysis.

    PubMed

    Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C

    2000-09-01

    The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput, chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference, plasma by 10000 Da exclusion filtration, saliva by sedimentation and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water. PMID:11006607

  14. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  15. Tribological systems as applied to aircraft engines

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1985-01-01

    Tribological systems as applied to aircraft are reviewed. The importance of understanding the fundamental concepts involved in such systems is discussed. Basic properties of materials which can be related to adhesion, friction and wear are presented and correlated with tribology. Surface processes including deposition and treatment are addressed in relation to their present and future application to aircraft components such as bearings, gears and seals. Lubrication of components with both liquids and solids is discussed. Advances in both new liquid molecular structures and additives for those structures are reviewed and related to the needs of advanced engines. Solids and polymer composites are suggested for increasing use and ceramic coatings containing fluoride compounds are offered for the extreme temperatures encountered in such components as advanced bearings and seals.

  16. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  17. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  18. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  19. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  20. System safety as applied to Skylab

    NASA Technical Reports Server (NTRS)

    Kleinknecht, K. S.; Miller, B. J.

    1974-01-01

    Procedural and organizational guidelines used in accordance with NASA safety policy for the Skylab missions are outlined. The basic areas examined in the safety program for Skylab were the crew interface, extra-vehicular activity (EVA), energy sources, spacecraft interface, and hardware complexity. Fire prevention was a primary goal, with firefighting as backup. Studies of the vectorcardiogram and sleep monitoring experiments exemplify special efforts to prevent fire and shock. The final fire control study included material review, fire detection capability, and fire extinguishing capability. Contractors had major responsibility for system safety. Failure mode and effects analysis (FMEA) and equipment criticality categories are outlined. Redundancy was provided on systems that were critical to crew survival (category I). The five key checkpoints in Skylab hardware development are explained. Skylab rescue capability was demonstrated by preparations to rescue the Skylab 3 crew after their spacecraft developed attitude control problems.

  1. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  2. Analysis on operational power and eddy current losses for applying coreless double-sided permanent magnet synchronous motor/generator to high-power flywheel energy storage system

    NASA Astrophysics Data System (ADS)

    Jang, Seok-Myeong; Park, Ji-Hoon; You, Dae-Joon; Choi, Sang-Ho

    2009-04-01

    This paper presents an analytical approach to the operational power, defined as the load power, and the rotor loss, represented by the eddy current loss, for applying a permanent magnet (PM) synchronous motor/generator to a high-power flywheel energy storage system. The model used is composed of a double-sided Halbach magnetized PM rotor and a coreless three-phase winding stator. For this motor/generator structure, we derive the magnetic field and eddy currents, including space and time harmonics, via the magnetic vector potential in a two-dimensional (2D) polar coordinate system. From these, the operational power is estimated from the back-electromotive force according to the PM rotor speed, and the rotor loss is calculated from the Poynting theorem.
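
    As a rough illustration of the back-EMF-based power estimate, the sketch below uses a simple sinusoidal machine model rather than the paper's 2D harmonic field solution; the flux-linkage, speed, pole-pair, and current values are hypothetical.

```python
import math

def back_emf_rms(flux_linkage_peak, rpm, pole_pairs):
    """Per-phase RMS back-EMF of a PM machine, E = omega_e * lambda_m / sqrt(2),
    where omega_e is the electrical angular speed (sinusoidal model only)."""
    omega_e = 2 * math.pi * rpm / 60 * pole_pairs
    return omega_e * flux_linkage_peak / math.sqrt(2)

def load_power(e_rms, i_rms, phases=3, power_factor=1.0):
    """Electrical load power delivered to the load."""
    return phases * e_rms * i_rms * power_factor

# hypothetical flywheel machine: 0.05 Wb peak flux linkage, 20,000 rpm, 2 pole pairs
e = back_emf_rms(0.05, 20_000, 2)   # ~148 V per phase
p = load_power(e, 50.0)             # ~22 kW at 50 A RMS
```

In the paper's actual analysis the back-EMF is built from the harmonic field solution rather than a single sinusoid, but the power bookkeeping is the same.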

  3. Verifying Anonymous Credential Systems in Applied Pi Calculus

    NASA Astrophysics Data System (ADS)

    Li, Xiangxi; Zhang, Yu; Deng, Yuxin

    Anonymous credentials are widely used to certify properties of a credential owner or to entitle the owner to valuable services, while hiding the user's identity at the same time. A credential system (a.k.a. pseudonym system) usually consists of multiple interactive procedures between users and organizations, including generating pseudonyms, issuing credentials and verifying credentials, which are required to meet various security properties. We propose a general symbolic model (based on the applied pi calculus) for anonymous credential systems and give formal definitions of a few important security properties, including pseudonym and credential unforgeability, credential safety, and pseudonym untraceability. We specialize the general formalization and apply it to the verification of a concrete anonymous credential system proposed by Camenisch and Lysyanskaya. The analysis is done automatically with the tool ProVerif, and several security properties have been verified.

  4. Fluorescent Protein Biosensors Applied to Microphysiological Systems

    PubMed Central

    Senutovitch, Nina; Vernetti, Lawrence; Boltz, Robert; DeBiasio, Richard; Gough, Albert; Taylor, D. Lansing

    2015-01-01

    This mini-review discusses the evolution of fluorescence as a tool to study living cells and tissues in vitro and the present role of fluorescent protein biosensors (FPBs) in microphysiological systems (MPS). FPBs allow the measurement of temporal and spatial dynamics of targeted cellular events involved in normal and perturbed cellular assay systems and microphysiological systems in real time. FPBs evolved from fluorescent analog cytochemistry (FAC) that permitted the measurement of the dynamics of purified proteins covalently labeled with environmentally insensitive fluorescent dyes and then incorporated into living cells, as well as a large list of diffusible fluorescent probes engineered to measure environmental changes in living cells. In parallel, a wide range of fluorescence microscopy methods were developed to measure the chemical and molecular activities of the labeled cells, including ratio imaging, fluorescence lifetime, total internal reflection, 3D imaging, including super-resolution, as well as high content screening (HCS). FPBs evolved from FAC by combining environmentally sensitive fluorescent dyes with proteins in order to monitor specific physiological events such as post-translational modifications, production of metabolites, changes in various ion concentrations and the dynamic interaction of proteins with defined macromolecules in time and space within cells. Original FPBs involved the engineering of fluorescent dyes to sense specific activities when covalently attached to particular domains of the targeted protein. The subsequent development of fluorescent proteins (FPs), such as the green fluorescent protein (GFP), dramatically accelerated the study of living cells, since the genetic “labeling” of proteins became a relatively simple method that permitted the analysis of temporal-spatial dynamics of a wide range of proteins. Investigators subsequently engineered the fluorescence properties of the FPs for environmental

  5. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  6. The Evolution of Fungicide Resistance Resulting from Combinations of Foliar-Acting Systemic Seed Treatments and Foliar-Applied Fungicides: A Modeling Analysis

    PubMed Central

    Kitchen, James L.; van den Bosch, Frank; Paveley, Neil D.; Helps, Joseph; van den Berg, Femke

    2016-01-01

    For the treatment of foliar diseases of cereals, fungicides may be applied as foliar sprays or systemic seed treatments which are translocated to leaves. Little research has been done to assess the resistance risks associated with foliar-acting systemic seed treatments when used alone or in combination with foliar sprays, even though both types of treatment may share the same mode of action. It is therefore unknown to what extent adding a systemic seed treatment to a foliar spray programme poses an additional resistance risk and whether in the presence of a seed treatment additional resistance management strategies (such as limiting the total number of treatments) are necessary to limit the evolution of fungicide-resistance. A mathematical model was developed to simulate an epidemic and the resistance evolution of Zymoseptoria tritici on winter wheat, which was used to compare different combinations of seed and foliar treatments by calculating the fungicide effective life, i.e. the number of years before effective disease control is lost to resistance. A range of parameterizations for the seed treatment fungicide and different fungicide uptake models were compared. Despite the different parameterizations, the model consistently predicted the same trends in that i) similar levels of efficacy delivered either by a foliar-acting seed treatment, or a foliar application, resulted in broadly similar resistance selection, ii) adding a foliar-acting seed treatment to a foliar spray programme increased resistance selection and usually decreased effective life, and iii) splitting a given total dose—by adding a seed treatment to foliar treatments, but decreasing dose per treatment—gave effective lives that were the same as, or shorter than those given by the spray programme alone. For our chosen plant-pathogen-fungicide system, the model results suggest that to effectively manage selection for fungicide-resistance, foliar acting systemic seed treatments should be included
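
    The idea of a fungicide effective life driven by resistance selection can be illustrated with a minimal frequency-selection sketch. This is not the authors' epidemic model of Zymoseptoria tritici: the per-treatment selection coefficient, initial resistant frequency, and failure threshold below are all hypothetical.

```python
import math

def effective_life(s_per_treatment, n_treatments, r0=1e-6, r_fail=0.5):
    """Seasons until the resistant fraction exceeds r_fail, assuming each
    treatment multiplies the odds r/(1-r) of resistance by exp(s)."""
    odds = r0 / (1 - r0)
    years = 0
    while odds / (1 + odds) < r_fail:
        odds *= math.exp(s_per_treatment * n_treatments)
        years += 1
    return years

# adding a seed treatment to two foliar sprays (2 -> 3 treatments of equal
# efficacy) increases selection per season and shortens effective life
print(effective_life(1.0, 2), effective_life(1.0, 3))  # 7 5
```

Even this toy model reproduces the qualitative trend reported above: more treatments sharing one mode of action means stronger per-season selection and a shorter effective life.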

  8. Liquid Chromatography Applied to Space System

    NASA Astrophysics Data System (ADS)

    Poinot, Pauline; Chazalnoel, Pascale; Geffroy, Claude; Sternberg, Robert; Carbonnier, Benjamin

    Searching for signs of past or present life in our Solar System is a real challenge that stirs up the curiosity of scientists. Until now, in situ instrumentation was designed to detect and determine concentrations of a wide number of organic biomarkers. The method which was and still is employed in missions dedicated to the quest for life (from Viking to ExoMars) is pyrolysis-GC-MS. Over successive missions, this approach has been significantly improved in terms of extraction efficiency and detection with the use of chemical derivatization agents (e.g. MTBSTFA, DMF-DMA, TMAH…), and in terms of analysis sensitivity and resolution with the development of in situ high-resolution mass spectrometers (e.g. TOF-MS). Thanks to such an approach, organic compounds such as amino acids, sugars, tholins or polycyclic aromatic hydrocarbons (PAHs) were expected to be found. However, while there’s a consensus that the GC-MS instruments of the Viking, Huygens, MSL and MOMA space missions worked the way they had been designed to, pyrolysis is much more contested (Glavin et al. 2001; Navarro-González et al. 2006). Indeed, (1) it is thought to remove low levels of organics, (2) water and CO2 could interfere with the detection of likely organic pyrolysis products, and (3) only low- to mid-molecular-weight organic molecules can be detected by this technique. As a result, researchers are now focusing on other in situ techniques which are no longer based on the volatility of the organic matter, but on liquid-phase extraction and analysis. In this line, micro-fluidic systems involving sandwich and/or competitive immunoassays (e.g. LMC, SOLID; Parro et al. 2005; Sims et al. 2012), micro-chip capillary electrophoreses (e.g. MOA; Bada et al. 2008), or nanopore-based analysis (e.g. BOLD; Schulze-Makuch et al. 2012) have been conceived for in situ analysis. Thanks to such approaches, molecular biological polymers (polysaccharides, polypeptides, polynucleotides, phospholipids, glycolipids

  9. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method, based on the probability density function of each nuclide composition. The automatic passing of the stochastic input to MCNP and the repeated criticality calculation are made possible by using a Python script to link MCNP and our Latin hypercube sampling code.
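
    A minimal sketch of the sampling step: Latin hypercube draws over nuclide fractions, each assumed (for illustration) to be normally distributed. The fractions and uncertainties below are hypothetical, and the MCNP coupling is omitted.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube(n_samples, means, stds, seed=None):
    """One stratified draw per [i/n, (i+1)/n) slice for each variable,
    mapped through the (assumed) normal inverse CDF of each nuclide."""
    rng = np.random.default_rng(seed)
    n_vars = len(means)
    # one uniform point inside each of the n_samples strata, per variable
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):                 # decouple the strata across variables
        u[:, j] = rng.permutation(u[:, j])
    u = np.clip(u, 1e-12, 1 - 1e-12)
    z = np.vectorize(NormalDist().inv_cdf)(u)
    return z * np.asarray(stds) + np.asarray(means)

# hypothetical U-235 / U-238 mass fractions with ~1% relative uncertainty;
# each row would become one stochastic MCNP input deck
samples = latin_hypercube(100, means=[0.045, 0.955], stds=[0.00045, 0.00955], seed=0)
```

Compared with plain random sampling, the stratification guarantees each nuclide's marginal distribution is covered evenly even for modest sample counts.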

  11. Robustness analysis applied to substructure controller synthesis

    NASA Technical Reports Server (NTRS)

    Gonzalez-Oberdoerffer, Marcelo F.; Craig, Roy R., Jr.

    1993-01-01

    The stability and robustness of the controlled system obtained via the substructure control synthesis (SCS) method of Su et al. (1990) were examined using a six-bay truss model and an LQG control design method to obtain controllers for two separate structures. It is found that the assembled controller provides stability in this instance. A qualitative assessment of the stability robustness of the system with a controller designed by the SCS method is provided by obtaining a controller using the complete truss model and comparing the robustness of the corresponding closed-loop systems.

  12. pH recycling aqueous two-phase systems applied in extraction of Maitake β-Glucan and mechanism analysis using low-field nuclear magnetic resonance.

    PubMed

    Hou, Huiyun; Cao, Xuejun

    2015-07-31

    In this paper, recycling aqueous two-phase systems (ATPS) based on two pH-responsive copolymers, PADB and PMDM, were used in the purification of β-Glucan from Grifola frondosa. The main parameters, such as polymer concentration, type and concentration of salt, extraction temperature and pH, were investigated to optimize partition conditions. The results demonstrated that β-Glucan was extracted into the PADB-rich phase, while impurities were extracted into the PMDM-rich phase. In this 2.5% PADB/2.5% PMDM ATPS, a partition coefficient of 7.489 and an extraction recovery of 96.92% for β-Glucan were obtained in the presence of 30 mmol/L KBr, at pH 8.20 and 30°C. The phase-forming copolymers could be recycled by adjusting pH, with recoveries of over 96.0%. Furthermore, the partition mechanism of Maitake β-Glucan in PADB/PMDM aqueous two-phase systems was studied. Fourier transform infrared spectra, the ForteBio Octet system and low-field nuclear magnetic resonance (LF-NMR) were used to elucidate the partition mechanism of β-Glucan. Notably, this is the first use of LF-NMR in the mechanism analysis of partitioning in aqueous two-phase systems. The change of transverse relaxation time (T2) in the ATPS could reflect the interaction between the polymers and β-Glucan. PMID:26094138
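
    The relation between a partition coefficient and extraction recovery follows from a simple mass balance over the two phases. A sketch (the phase volumes assumed here are illustrative, not the paper's actual phase ratio):

```python
def partition_recovery(K, v_top, v_bottom):
    """Fraction of solute recovered in the top phase, from the mass balance
    with partition coefficient K = C_top / C_bottom."""
    return K * v_top / (K * v_top + v_bottom)

# with K = 7.489 and equal phase volumes, recovery is ~88%; the reported
# 96.92% recovery therefore implies a larger top-phase volume ratio
print(partition_recovery(7.489, 1.0, 1.0))
```

This makes explicit why both the partition coefficient and the phase volume ratio must be reported together when quoting recoveries.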

  13. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  14. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
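
    For a two-component mixture measured at two wavelengths, the Beer-Lambert-Bouguer Law gives a linear system in the unknown concentrations. A sketch with hypothetical absorptivities and absorbances (not values from the article):

```python
import numpy as np

# molar absorptivities (M^-1 cm^-1); rows = wavelengths, cols = components
# (illustrative values, not measured constants)
E = np.array([[15000.0,  3000.0],
              [ 2000.0, 11000.0]])
path_cm = 1.0
A = np.array([0.90, 0.64])          # measured absorbances (hypothetical)

# Beer-Lambert for a mixture: A_i = sum_j E[i, j] * c_j * path
c = np.linalg.solve(E * path_cm, A)
print(c)                            # concentrations in mol/L, ~5e-5 each
```

The same approach extends to more components by measuring at at least as many wavelengths as there are absorbing species and solving in the least-squares sense.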

  15. A robust multisyringe system for process flow analysis. Part II. A multi-commuted injection system applied to the photometric determination of free acidity and iron(III) in metallurgical solutions.

    PubMed

    Albertús, F; Cladera, A; Cerda, V

    2000-12-01

    A new software-controlled volume-based system for sample introduction in process flow injection analysis was developed. By using a multi-syringe burette coupled with one or two additional commutation valves, the multi-commuted injection of precise sample volumes was accomplished. Characteristics and performance of the injection system were studied by injecting an indicator in a buffered carrier. Three configurations were implemented in order to achieve two different tasks: the single injection of a sample in a two- or three-channels manifold, and the dual injection into different streams. The two channel flow system using the single injection was applied to the determination of free acidity in diluted samples containing high levels of iron(III), by employing the single point titration methodology. The precipitation of ferric hydroxide was prevented using the ammonium and sodium salts of oxalate and acetate as buffer titrant. Methyl Red was employed as indicator. The procedure allows determination of acid concentration in solutions with a Fe(III)/H+ molar ratio up to 0.2. Samples with higher Fe(III)/H+ molar ratios were spiked with a known strong acid at dilution. The three-channel configuration was applied to the determination of ferric ions, using, as reagent, a merging mixture of sulfuric acid and potassium thiocyanate. The double injection system was implemented in series in a single (three-channel) manifold in such a way that a different injection volume and a changed reagent were used for each analyte. It was applied to the separated or sequential determination of free acidity and ferric ions. In this configuration, iron(III) was determined using 0.5-0.7% (w/v) sodium salicylate solution as reagent. The systems can operate at up to 100, 84 and 78 injections per hour, respectively. Determinations on synthetic and process samples compared well with the reference values and procedures. 
Recoveries of 95-102% with a maximum RSD value of 5.4% were found for acidity

  16. Applied behavior analysis at West Virginia University: A brief history.

    PubMed

    Hawkins, R P; Chase, P N; Scotti, J R

    1993-01-01

    The development of an emphasis on applied behavior analysis in the Department of Psychology at West Virginia University is traced. The emphasis began primarily in the early 1970s, under the leadership of Roger Maley and Jon Krapfl, and has continued to expand and evolve with the participation of numerous behavior analysts and behavior therapists, both inside and outside the department. The development has been facilitated by several factors: establishment of a strong behavioral emphasis in the three Clinical graduate programs; change of the graduate program in Experimental Psychology to a program in basic Behavior Analysis; development of nonclinical applied behavior analysis within the Behavior Analysis program; establishment of a joint graduate program with Educational Psychology; establishment of a Community/Systems graduate program; and organization of numerous conferences. Several factors are described that seem to assure a stable role for behavior analysis in the department: a stable and supportive "culture" within the department; American Psychological Association accreditation of the clinical training; a good reputation both within the university and in psychology; and a broader community of behavior analysts and behavior therapists. PMID:16795816

  17. Applied methods of testing and evaluation for IR imaging system

    NASA Astrophysics Data System (ADS)

    Liao, Xiao-yue; Lu, Jin

    2009-07-01

    Different methods of testing and evaluation for IR imaging systems have come into use with the application of 2nd- and 3rd-generation infrared detectors. The performance of an IR imaging system is reflected by many specifications, such as Noise Equivalent Temperature Difference (NETD), nonuniformity, system Modulation Transfer Function (MTF), Minimum Resolvable Temperature Difference (MRTD), and Minimum Detectable Temperature Difference (MDTD). The sensitivity of IR sensors is estimated by NETD. The sensitivity and spatial resolution of thermal imaging sensors are evaluated by MRTD, which is the chief specification of the system. In this paper, the theoretical analysis of the different testing methods is introduced, and their characteristics are analyzed and compared. Based on a discussion of the factors that affect measurement results, an applied method of testing NETD and MRTD for IR systems is proposed.
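
    An NETD measurement reduces to dividing the temporal noise by the signal transfer function obtained from a two-temperature blackbody test. A sketch (all counts and temperatures below are hypothetical):

```python
def netd(signal_hot, signal_cold, delta_T, noise_rms):
    """NETD = RMS temporal noise / SiTF, where SiTF is the detector
    response per kelvin from a blackbody differential test."""
    sitf = (signal_hot - signal_cold) / delta_T  # counts per kelvin
    return noise_rms / sitf

# 5 K blackbody differential, 480 counts of signal swing, 4.8 counts RMS noise
value = netd(1480.0, 1000.0, 5.0, 4.8)  # ~0.05 K (50 mK)
```

MRTD, by contrast, is a subjective observer measurement (resolving a 4-bar target) and cannot be reduced to a formula this simple, which is part of why it is treated as the chief system specification.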

  18. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
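
    The core risk calculation can be sketched as the probability that fatigue life falls short of the service exposure. The lognormal life distribution and all numbers below are hypothetical; the paper's procedure additionally models crack growth, inspection, and initial flaws.

```python
import math
from statistics import NormalDist

def risk_of_failure(service_hours, median_life, sigma_log10):
    """P(fatigue life < service_hours) for a lognormal life distribution
    characterized by its median and log10 standard deviation."""
    z = (math.log10(service_hours) - math.log10(median_life)) / sigma_log10
    return NormalDist().cdf(z)

# hypothetical: median test life 40,000 h, scatter sigma_log10 = 0.14;
# risk grows rapidly as service exposure approaches the test median
for hours in (10_000, 20_000, 30_000):
    print(hours, risk_of_failure(hours, 40_000, 0.14))
```

Inverting this relation (choosing the service or inspection interval so the risk stays below a prescribed level) is the planning step the abstract describes.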

  19. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  20. Expert systems applied to spacecraft fire safety

    NASA Technical Reports Server (NTRS)

    Smith, Richard L.; Kashiwagi, Takashi

    1989-01-01

    Expert systems are problem-solving programs that combine a knowledge base and a reasoning mechanism to simulate a human expert. The development of an expert system to manage fire safety in spacecraft, in particular the NASA Space Station Freedom, is difficult but clearly advantageous in the long-term. Some needs in low-gravity flammability characteristics, ventilating-flow effects, fire detection, fire extinguishment, and decision models, all necessary to establish the knowledge base for an expert system, are discussed.

  1. Applying Modeling Tools to Ground System Procedures

    NASA Technical Reports Server (NTRS)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  2. EG&G Mound Applied Technologies payroll system

    SciTech Connect

    Not Available

    1992-02-07

    EG&G Mound Applied Technologies, Inc., manages and operates the Mound Facility, Miamisburg, Ohio, under a cost-plus-award-fee contract administered by the Department of Energy's Albuquerque Field Office. The contractor's Payroll Department is responsible for prompt payment in the proper amount to all persons entitled to be paid, in compliance with applicable laws, regulations, and legal decisions. The objective was to determine whether controls were in place to avoid erroneous payroll payments. EG&G Mound Applied Technologies, Inc., did not have all the internal controls required by General Accounting Office Title 6, "Pay, Leave, and Allowances." Specifically, it did not have computerized edits, separation of duties and responsibilities, and restricted access to payroll data files. This condition occurred because its managers were not aware of Title 6 requirements. As a result, the contractor could not assure the Department of Energy that payroll costs were processed accurately, and fraud, waste, or abuse of Department of Energy funds could go undetected. Our sample of 212 payroll transactions from a population of 66,000 in FY 1991 disclosed only two minor processing errors and no instances of fraud, waste or abuse.

  3. Applying QCVV protocols to real physical systems

    NASA Astrophysics Data System (ADS)

    Magesan, Easwar

    As experimental systems move closer to realizing small-scale quantum computers with high fidelity operations, errors become harder to detect and diagnose. Verification and validation protocols are becoming increasingly important for detecting and understanding the precise nature of these errors. I will outline various methods and protocols currently used to deal with errors in experimental systems. I will also discuss recent advances in implementing high fidelity operations which will help to understand some of the tools that are still needed on the road to realizing larger scale quantum systems. Work partially supported by ARO under Contract W911NF-14-1-0124.

  4. Pipeline rehabilitation using field applied tape systems

    SciTech Connect

    Reeves, C.R.

    1998-12-31

    Bare steel pipelines were first installed years before the turn of the century. Pipeline operators soon realized the life of bare steel could be greatly extended by applying coatings. Thus began "pipeline rehabilitation." Many of the older pipelines were exposed, evaluated, coated and returned to service. This procedure has reached new heights in recent years as coated pipelines of the twentieth century, having lived past their original design life, are now subject to coating failure. Many operating companies with pipelines thirty years or older are faced with "replace or recondition." Considering the emphasis on cost restraints and environmental issues, replacing an existing pipeline is often not the best decision. Rehabilitation is a preferred solution for many operators.

  5. Digital Systems Analysis

    ERIC Educational Resources Information Center

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  6. How Systems Thinking Applies to Education.

    ERIC Educational Resources Information Center

    Betts, Frank

    1992-01-01

    Seeds of public education's current failures are found in its past successes (transmitting culture and providing custodial care). Education is experiencing paradigm paralysis because of piecemeal reform approaches, failure to integrate solution ideas, and reductionist, boundary-limiting orientation. The old system is no longer adequate. Total…

  7. Systems biology: the reincarnation of systems theory applied in biology?

    PubMed

    Wolkenhauer, O

    2001-09-01

    With the availability of quantitative data on the transcriptome and proteome level, there is an increasing interest in formal mathematical models of gene expression and regulation. International conferences, research institutes and research groups concerned with systems biology have appeared in recent years and systems theory, the study of organisation and behaviour per se, is indeed a natural conceptual framework for such a task. This is, however, not the first time that systems theory has been applied in modelling cellular processes. Notably in the 1960s systems theory and biology enjoyed considerable interest among eminent scientists, mathematicians and engineers. Why did these early attempts vanish from research agendas? Here we shall review the domain of systems theory, its application to biology and the lessons that can be learned from the work of Robert Rosen. Rosen emerged from the early developments in the 1960s as a main critic but also developed a new alternative perspective to living systems, a concept that deserves a fresh look in the post-genome era of bioinformatics. PMID:11589586

  8. Chebyshev Expansion Applied to Dissipative Quantum Systems.

    PubMed

    Popescu, Bogdan; Rahman, Hasan; Kleinekathöfer, Ulrich

    2016-05-19

    To determine the dynamics of a molecular aggregate under the influence of a strongly time-dependent perturbation within a dissipative environment is still, in general, a challenge. The time-dependent perturbation might be, for example, due to external fields or explicitly treated fluctuations within the environment. Methods to calculate the dynamics in these cases do exist though some of these approaches assume that the corresponding correlation functions can be written as a weighted sum of exponentials. One such theory is the hierarchical equations of motion approach. If the environment, however, is described by a complex spectral density or if its temperature is low, these approaches become very inefficient. Therefore, we propose a scheme based on a Chebyshev decomposition of the bath correlation functions and detail the respective quantum master equations within second-order perturbation theory in the environmental coupling. Similar approaches have recently been proposed for systems coupled to Fermionic reservoirs. The proposed scheme is tested for a simple two-level system and compared to existing results. Furthermore, the advantages and disadvantages of the present Chebyshev approach are discussed. PMID:26845380
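
The core numerical idea, expanding a bath correlation function in Chebyshev polynomials, can be sketched independently of the quantum master equation machinery. The correlation function below is an invented damped oscillation, not one from the paper, and the degree is chosen arbitrarily:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical bath correlation function: a damped oscillation (illustrative only).
def corr(t):
    return np.exp(-0.5 * t) * np.cos(2.0 * t)

t = np.linspace(0.0, 10.0, 2001)
# Fit a degree-30 Chebyshev series on [0, 10]; fit() handles the domain mapping.
cheb = C.Chebyshev.fit(t, corr(t), deg=30)

max_err = np.max(np.abs(cheb(t) - corr(t)))
print(f"max absolute error of degree-30 Chebyshev fit: {max_err:.2e}")
```

Because the function is smooth, the Chebyshev series converges rapidly, which is the property the proposed scheme exploits in place of a sum-of-exponentials form.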

  9. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  10. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
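
As a loose illustration of the KBANN idea of compiling a symbolic rule into initial network weights (the rule, feature names, and margin value below are invented, not taken from the paper):

```python
import numpy as np

# Toy KBANN-style rule compilation. Hypothetical rule:
#   promoter :- contact AND conformation.
# Each positive antecedent becomes an input with weight +w; the bias is set so
# the unit fires only when every antecedent is true, leaving a w/2 margin that
# later gradient training can adjust.
w = 4.0
antecedents = ["contact", "conformation"]
weights = np.full(len(antecedents), w)
bias = -(len(antecedents) * w - w / 2.0)   # fires iff all antecedents are on

def rule_unit(inputs):
    """inputs: 0/1 truth values for the antecedents, in order."""
    return 1 if weights @ inputs + bias > 0 else 0

print(rule_unit(np.array([1, 1])))  # both antecedents true  -> 1
print(rule_unit(np.array([1, 0])))  # one antecedent false -> 0
```

After this mapping, ordinary backpropagation on labeled sequences refines the weights, which is how the compiled rules get corrected by data.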

  11. Thermal diffusivity measurement system applied to polymers

    NASA Astrophysics Data System (ADS)

    Abad, B.; Díaz-Chao, P.; Almarza, A.; Amantia, D.; Vázquez-Campos, S.; Isoda, Y.; Shinohara, Y.; Briones, F.; Martín-González, M. S.

    2012-06-01

    In the search for cleaner energy sources, improving the efficiency of existing ones is a primary objective. Thermoelectric materials, which are able to convert waste heat into electricity, are thus an interesting route to improving the efficiency of car engines, for example. Cost-effective energy harvesting from thermoelectric devices requires materials with high electrical conductivity and Seebeck coefficient but low thermal conductivity. Conductive polymers can fulfil these conditions if they are doped appropriately, and one of the most promising is polyaniline. In this work, the thermal conductivity of polyaniline and of mixtures of polyaniline with nanoclays has been studied using a new experimental set-up developed in the lab. The novel system is based on the steady-state method and is used to obtain the thermal diffusivity of the polymers and the nanocomposites.

  12. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  13. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  14. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  15. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
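
The kind of failure-probability estimate such a code produces can be illustrated with a crude Monte Carlo sketch (NESSUS itself uses fast probability integration rather than raw sampling, and the limit state below is invented):

```python
import numpy as np

# Crude Monte Carlo reliability sketch: a hypothetical limit state
# g = capacity - demand, with failure defined as g < 0.
rng = np.random.default_rng(42)
n = 200_000
capacity = rng.normal(10.0, 1.0, n)   # assumed capacity distribution
demand = rng.normal(6.0, 1.5, n)      # assumed demand distribution
p_fail = np.mean(capacity - demand < 0.0)
print(f"estimated failure probability: {p_fail:.4f}")
```

For this normal-normal case the answer can be checked analytically (g is normal with mean 4 and standard deviation sqrt(1 + 2.25)), which is the sort of cross-check fast probability integration methods are benchmarked against.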

  16. Hyperspectral imaging applied to complex particulate solids systems

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Serranti, Silvia

    2008-04-01

    HyperSpectral Imaging (HSI) is based on the utilization of an integrated hardware and software (HW&SW) platform embedding conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Although HSI was originally developed for remote sensing, it has recently emerged as a powerful process analytical tool, for non-destructive analysis, in many research and industrial sectors. The possibility to apply on-line HSI based techniques in order to identify and quantify specific particulate solid systems characteristics is presented and critically evaluated. The originally developed HSI based logics can be profitably applied in order to develop fast, reliable and low-cost strategies for: i) quality control of particulate products that must comply with specific chemical, physical and biological constraints, ii) performance evaluation of manufacturing strategies related to processing chains and/or real-time tuning of operative variables and iii) classification-sorting actions addressed to recognize and separate different particulate solid products. Case studies, related to recent advances in the application of HSI to different industrial sectors, such as agriculture, food, pharmaceuticals, and solid waste handling and recycling, and addressed to specific goals such as contaminant detection, defect identification, constituent analysis and quality evaluation, are described, based on the authors' originally developed applications.

  17. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal to Noise Ratio (SNR) is the main barrier to achieving an accurate, high resolution gravity signal. Normally, low-pass filters (Childers et al 1999, Forsberg et al 2000, Kwon and Jekeli 2000, Hwang et al 2006) are applied to smooth or remove the high frequency "noise" - even though some of the high frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in Wavelets and Artificial Neural Networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of Factor Analysis is first employed to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft are processed as examples.
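
The underlying idea, recovering a small number of common factors from multichannel records, can be sketched loosely with synthetic data. Here eigendecomposition of the sample covariance (a PCA-style extraction) stands in for a full factor-analysis model, and the GRAV-D data themselves are not used:

```python
import numpy as np

# Four hypothetical sensor channels sharing one underlying factor plus
# independent instrument noise.
rng = np.random.default_rng(0)
n = 2000
common = rng.standard_normal(n)                       # shared factor
channels = np.stack(
    [common + 0.1 * rng.standard_normal(n) for _ in range(4)]
)

# Eigendecomposition of the sample covariance reveals how much variance the
# leading factor explains.
cov = np.cov(channels)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
explained = eigvals[0] / eigvals.sum()
print(f"variance explained by the leading factor: {explained:.1%}")
```

When one factor dominates, as constructed here, nearly all variance loads onto the leading eigenvector; real gravimetric noise would split across several factors.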

  18. Computer-Aided Decision Support for Melanoma Detection Applied on Melanocytic and Nonmelanocytic Skin Lesions: A Comparison of Two Systems Based on Automatic Analysis of Dermoscopic Images

    PubMed Central

    Møllersen, Kajsa; Kirchesch, Herbert; Zortea, Maciel; Schopf, Thomas R.; Hindberg, Kristian; Godtliebsen, Fred

    2015-01-01

    Commercially available clinical decision support systems (CDSSs) for skin cancer have been designed for the detection of melanoma only. Correct use of the systems requires expert knowledge, hampering their utility for nonexperts. Furthermore, there are no systems to detect other common skin cancer types, that is, nonmelanoma skin cancer (NMSC). As early diagnosis of skin cancer is essential, there is a need for a CDSS that is applicable to all types of skin lesions and is suitable for nonexperts. Nevus Doctor (ND) is a CDSS being developed by the authors. We here investigate ND's ability to detect both melanoma and NMSC and the opportunities for improvement. An independent test set of dermoscopic images of 870 skin lesions, including 44 melanomas and 101 NMSCs, were analysed by ND. Its sensitivity to melanoma and NMSC was compared to that of Mole Expert (ME), a commercially available CDSS, using the same set of lesions. ND and ME had similar sensitivity to melanoma. For ND at 95% melanoma sensitivity, the NMSC sensitivity was 100%, and the specificity was 12%. The melanomas misclassified by ND at 95% sensitivity were correctly classified by ME, and vice versa. ND is able to detect NMSC without sacrificing melanoma sensitivity. PMID:26693486

  19. Applying expertise to data in the Geologist's Assistant expert system

    SciTech Connect

    Berkbigler, K.P.; Papcun, G.J.; Marusak, N.L.; Hutson, J.E.

    1988-01-01

    The Geologist's Assistant combines expert system technology with numerical pattern-matching and online communication to a large database. This paper discusses the types of rules used for the expert system, the pattern-matching technique applied, and the implementation of the system using a commercial expert system development environment. 13 refs., 8 figs.

  20. A Vision for Systems Engineering Applied to Wind Energy (Presentation)

    SciTech Connect

    Felker, F.; Dykes, K.

    2015-01-01

    This presentation was given at the Third Wind Energy Systems Engineering Workshop on January 14, 2015. Topics covered include the importance of systems engineering, a vision for systems engineering as applied to wind energy, and application of systems engineering approaches to wind energy research and development.

  1. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State of the art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches have the aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and the urgency of the competing processes this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  2. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  3. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  4. Applying a toolkit for dissemination and analysis of near real-time data through the World Wide Web: integration of the Antelope Real Time System, ROADNet, and PHP

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Lindquist, K. G.; Hansen, T. S.; Vernon, F. L.; Eakins, J.; Foley, S.; Orcutt, J.

    2005-12-01

    The ROADNet project has enabled the acquisition and storage of diverse data streams through seamless integration of the Antelope Real Time System (ARTS) with (for example) ecological, seismological and geodetic instrumentation. The robust system architecture allows researchers to simply network data loggers with relational databases; however, the ability to disseminate these data to policy makers, scientists and the general public has (until recently) been provided on an 'as needed' basis. The recent development of a Datascope interface to the popular open source scripting language PHP has provided an avenue for presenting near real time data (such as integers, images and movies) from within the ARTS framework easily on the World Wide Web. The interface also indirectly provided the means to transform data types into various formats using the extensive function libraries that accompany a PHP installation (such as image creation and manipulation, data encryption for sensitive information, and XML creation for structured document interchange through the World Wide Web). Using a combination of Datascope and PHP library functions, an extensible tool-kit is being developed to allow data managers to easily present their products on the World Wide Web. The tool-kit has been modeled after the pre-existing ARTS architecture to simplify installation and development and to improve ease of use for both the seasoned researcher and the casual user. The methodology and results of building the applications that comprise the tool-kit are the focus of this presentation, including procedural vs. object oriented design, incorporation of the tool-kit into the existing contributed software libraries, and case-studies of researchers who are employing the tools to present their data. http://anf.ucsd.edu

  5. Automated speech analysis applied to laryngeal disease categorization.

    PubMed

    Gelzinis, A; Verikas, A; Bacauskiene, M

    2008-07-01

    The long-term goal of the work is a decision support system for diagnostics of laryngeal diseases. Colour images of vocal folds, a voice signal, and questionnaire data are the information sources to be used in the analysis. This paper is concerned with automated analysis of a voice signal applied to screening of laryngeal diseases. The effectiveness of 11 different feature sets in classification of voice recordings of the sustained phonation of the vowel sound /a/ into a healthy and two pathological classes, diffuse and nodular, is investigated. A k-NN classifier, SVM, and a committee built using various aggregation options are used for the classification. The study was conducted using a mixed gender database containing 312 voice recordings. A correct classification rate of 84.6% was achieved when using an SVM committee consisting of four members. The pitch and amplitude perturbation measures, cepstral energy features, autocorrelation features as well as linear prediction cosine transform coefficients were amongst the feature sets providing the best performance. In the case of two class classification, using recordings from 79 subjects representing the pathological and 69 the healthy class, a correct classification rate of 95.5% was obtained from a five member committee. Again the pitch and amplitude perturbation measures provided the best performance. PMID:18346812
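
The committee step can be sketched with a plain majority vote (the paper's actual aggregation options, features, and member classifiers are not reproduced here; the votes below are hypothetical):

```python
from collections import Counter

# Minimal committee aggregation by majority vote over member predictions.
def committee_predict(member_predictions):
    """member_predictions: list of class labels, one per committee member."""
    label, _ = Counter(member_predictions).most_common(1)[0]
    return label

# Hypothetical votes from a five-member committee for one voice recording:
votes = ["pathological", "healthy", "pathological", "pathological", "healthy"]
print(committee_predict(votes))  # -> pathological
```

Real committees often weight members by validation accuracy or combine decision values rather than hard labels, which is one of the "aggregation options" the paper compares.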

  6. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  7. Methodology, the matching law, and applied behavior analysis

    PubMed Central

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are developed for their extension to traditional rate-of-response variables measured across time. Although steady-state and relative-rate-of-response strategies are appropriate to the experimental analysis of many behavioral phenomena, these methods are rarely used by applied behavior analysts and further separate the basic and applied areas. PMID:22478657

  8. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  9. Applied research in the solar thermal-energy-systems program

    SciTech Connect

    Brown, C. T.; Lefferdo, J. M.

    1981-03-01

    Within the Solar Thermal Research and Advanced Development (RAD) program a coordinated effort in materials research, fuels and chemical research and applied research is being carried out to meet the systems' needs. Each of these three program elements is described, with particular attention given to the applied research activity.

  10. XML: How It Will Be Applied to Digital Library Systems.

    ERIC Educational Resources Information Center

    Kim, Hyun-Hee; Choi, Chang-Seok

    2000-01-01

    Shows how XML is applied to digital library systems. Compares major features of XML with those of HTML and describes an experimental XML-based metadata retrieval system, which is based on the Dublin Core and is designed as a subsystem of the Korean Virtual Library and Information System (VINIS). (Author/LRW)

  11. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  12. Correlation Network Analysis Applied to Complex Biofilm Communities

    PubMed Central

    Duran-Pinedo, Ana E.; Paster, Bruce; Teles, Ricardo; Frias-Lopez, Jorge

    2011-01-01

    The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of manipulating microbial
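
The correlation-network step described above can be sketched in a few lines: correlate abundance profiles, threshold the correlation matrix into an adjacency matrix, and read off connected components as candidate modules. The profiles and threshold below are synthetic, not the study's data:

```python
import numpy as np

# Four synthetic taxa abundance profiles forming two correlated pairs.
rng = np.random.default_rng(1)
base1, base2 = rng.standard_normal(50), rng.standard_normal(50)
profiles = np.stack([
    base1 + 0.2 * rng.standard_normal(50),   # module 1
    base1 + 0.2 * rng.standard_normal(50),   # module 1
    base2 + 0.2 * rng.standard_normal(50),   # module 2
    base2 + 0.2 * rng.standard_normal(50),   # module 2
])

# Threshold absolute correlations into an adjacency matrix.
adj = np.abs(np.corrcoef(profiles)) > 0.8
np.fill_diagonal(adj, False)

def modules(adj):
    """Connected components of the thresholded network via depth-first search."""
    seen, comps = set(), []
    for start in range(len(adj)):
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(int(i) for i in np.flatnonzero(adj[node]))
        seen |= comp
        comps.append(sorted(comp))
    return comps

print(modules(adj))
```

On this synthetic input the two constructed pairs emerge as separate modules; the study additionally identifies hub taxa within such modules, which this sketch omits.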

  13. Applying Association Rule Discovery Algorithm to Multipoint Linkage Analysis.

    PubMed

    Mitsuhashi; Hishigaki; Takagi

    1997-01-01

    Knowledge discovery in large databases (KDD) is being performed in several application domains, for example, the analysis of sales data, and is expected to be applied to other domains. We propose a KDD approach to multipoint linkage analysis, which is a way of ordering loci on a chromosome. Strict multipoint linkage analysis based on maximum likelihood estimation is a computationally tough problem, so various kinds of approximate methods have been implemented. Our method, based on the discovery of associations between genetic recombinations, differs enough from these methods that it is useful for rechecking their results. In this paper, we describe how to apply the framework of association rule discovery to linkage analysis, and we also argue that filtering input data and interpreting discovered rules after data mining are as practically important as the data mining process itself. PMID:11072310
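
Association rule discovery rests on two counts, support and confidence, which can be sketched by exhaustive counting on a toy dataset (the itemsets and thresholds below are invented and stand in for recombination events, not the paper's data):

```python
from itertools import combinations

# Toy transactions; in the linkage setting each "item" would be a
# recombination event observed in a meiosis.
transactions = [
    {"A", "B", "C"},
    {"A", "B"},
    {"A", "C"},
    {"B", "C"},
    {"A", "B", "C"},
]
min_support, min_confidence = 0.4, 0.7

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Enumerate candidate rules x -> y over item pairs and keep those passing
# both thresholds (real miners prune candidates instead of enumerating).
items = sorted(set().union(*transactions))
rules = []
for x, y in combinations(items, 2):
    s = support({x, y})
    if s >= min_support:
        conf = s / support({x})
        if conf >= min_confidence:
            rules.append((x, y, s, conf))

for x, y, s, conf in rules:
    print(f"{x} -> {y}  support={s:.2f} confidence={conf:.2f}")
```

High-confidence rules between recombination events are the associations the method mines; interpreting which locus orderings they support is the post-mining step the paper stresses.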

  14. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforcement is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  15. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (Code TD64). The group's work focuses on supporting the space transportation programs through Computational Fluid Dynamics tool development, driven by hardware design needs. The major applications for the design and analysis tools are turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  16. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications. PMID:21709802

  17. Support systems design and analysis

    NASA Technical Reports Server (NTRS)

    Ferguson, R. M.

    1985-01-01

    The integration of Kennedy Space Center (KSC) ground support systems with the new launch processing system and new launch vehicle provided KSC with a unique challenge in system design and analysis for the Space Transportation System. Approximately 70 support systems are controlled and monitored by the launch processing system. Typical systems are main propulsion oxygen and hydrogen loading systems, environmental control life support system, hydraulics, etc. An End-to-End concept of documentation and analysis was chosen and applied to these systems. Unique problems were resolved in the areas of software analysis, safing under emergency conditions, sampling rates, and control loop analysis. New methods of performing End-to-End reliability analyses were implemented. The systems design approach selected and the resolution of major problem areas are discussed.
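    The abstract does not detail the End-to-End reliability methods, but the basic arithmetic underlying such analyses (components in series, redundant components in parallel) can be sketched. The chain and the component reliabilities below are invented for illustration and do not represent any actual KSC support system.

```python
from math import prod

def series_reliability(rels):
    """A series chain works only if every element works."""
    return prod(rels)

def parallel_reliability(rels):
    """A redundant (parallel) group fails only if every element fails."""
    return 1 - prod(1 - r for r in rels)

# Hypothetical end-to-end chain: sensor -> data link -> redundant controllers.
link = series_reliability([0.99, 0.98])           # sensor and link in series
controllers = parallel_reliability([0.95, 0.95])  # two redundant controllers
end_to_end = series_reliability([link, controllers])
print(round(end_to_end, 4))
```

    Even this toy case shows why redundancy matters: the parallel controller pair is more reliable than either controller alone, while every series element drags the end-to-end figure down.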

  18. System theory as applied differential geometry. [linear system

    NASA Technical Reports Server (NTRS)

    Hermann, R.

    1979-01-01

    The invariants of input-output systems under the action of the feedback group were examined. The approach used the theory of Lie groups and concepts of modern differential geometry, and illustrated how the latter provides a basis for discussing the analytic structure of systems. Finite-dimensional linear systems in a single independent variable are considered. Lessons for more general situations (e.g., distributed-parameter and multidimensional systems), which are increasingly encountered as technology advances, are presented.

  19. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    ERIC Educational Resources Information Center

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  20. B. F. Skinner's Contributions to Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  1. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  2. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  3. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  4. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  5. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  6. Complex, Dynamic Systems: A New Transdisciplinary Theme for Applied Linguistics?

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2012-01-01

    In this plenary address, I suggest that Complexity Theory has the potential to contribute a transdisciplinary theme to applied linguistics. Transdisciplinary themes supersede disciplines and spur new kinds of creative activity (Halliday 2001 [1990]). Investigating complex systems requires researchers to pay attention to system dynamics. Since…

  7. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN)

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are not currently in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  8. Treatment integrity in applied behavior analysis with children.

    PubMed

    Gresham, F M; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Behavior Analysis between 1980 and 1990 found that approximately 16% of these studies measured the accuracy of independent variable implementation. Two thirds of these studies did not operationally define the components of the independent variable. Specific recommendations for improving the accuracy of independent variable implementation and for defining independent variables are discussed. PMID:8331022

  9. Conformity with the HIRF Environment Applied to Avionic System

    NASA Astrophysics Data System (ADS)

    Tristant, F.; Rotteleur, J. P.; Moreau, J. P.

    2012-05-01

    This paper presents the qualification and certification methodology applied to an avionic system for the HIRF and lightning environment. Several versions of this system, with different variations, are installed in our legacy Falcon aircraft. The paper presents the compliance process, taking into account the criticality and complexity of the system, its installation, the level of exposure to the EM environment, and some solutions used by Dassault Aviation to demonstrate compliance.

  10. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  11. Applying systems engineering methodologies to the micro- and nanoscale realm

    NASA Astrophysics Data System (ADS)

    Garrison Darrin, M. Ann

    2012-06-01

    Micro- and nanoscale technology developments have the potential to revolutionize smart, small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro and nano technology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to realized products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging in a third generation, having transitioned from completing active structures to creating systems. Because current nanoscale systems lack maturity, the emphasis is on applying a systems approach to successful technology development, including enabling roles such as product systems engineering and technology development; classical roles such as acquisition systems engineering are not covered. The results are also targeted toward small-scale technology developers, who need to take into account systems engineering processes such as requirements definition, verification and validation, interface management, and risk management in the concept phase of technology development. Doing so maximizes the likelihood of producing successful, cost-effective micro and nano technology that increases the capability of emerging deployed systems and supports long-term growth and profits.

  12. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams harmonize their activities simultaneously to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. That approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  13. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  14. Applying Sustainable Systems Development Approach to Educational Technology Systems

    ERIC Educational Resources Information Center

    Huang, Albert

    2012-01-01

    Information technology (IT) is an essential part of modern education. The roles and contributions of technology to education have been thoroughly documented in academic and professional literature. Despite the benefits, the use of educational technology systems (ETS) also creates a significant impact on the environment, primarily due to energy…

  15. Stratospheric Data Analysis System (STRATAN)

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Fox-Rabinovitz, Michael; Lamich, David J.; Newman, Paul A.; Pfaendtner, James W.

    1990-01-01

    State-of-the-art stratospheric analyses are produced using a coupled stratosphere/troposphere data assimilation system. These analyses can be applied to stratospheric studies of all types. Of importance to this effort is the application of the Stratospheric Data Analysis System (STRATAN) to constituent transport and chemistry problems.

  16. A layered neural network model applied to the auditory system

    NASA Astrophysics Data System (ADS)

    Travis, Bryan J.

    1986-08-01

    The structure of the auditory system is described with emphasis on the cerebral cortex. A layered neural network model incorporating much of the known structure of the cortex is applied to word discrimination. The concepts of iterated maps and attractive fixed points are used to enable the model to recognize words despite variations in pitch, intensity and duration.
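    The notion of an attractive fixed point of an iterated map can be illustrated numerically: repeated application of a contracting map drives widely different inputs to the same point, which is the stability property the model exploits for invariant word recognition. The map and tolerances below are generic illustrations, not the paper's network dynamics.

```python
import math

def iterate_to_fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Iterate x <- f(x) until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# f(x) = cos(x) has a single attractive fixed point near 0.739 (the
# Dottie number), reached from widely different starting points --
# loosely analogous to a network mapping variable inputs (pitch,
# intensity, duration) onto one stable internal representation.
a = iterate_to_fixed_point(math.cos, 0.1)
b = iterate_to_fixed_point(math.cos, 1.4)
print(round(a, 6), round(b, 6))
```

    The two runs converge to the same value despite very different starting points; in the network analogy, differently spoken versions of a word settle into the same attractor.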

  17. Associate of Applied Science Degree in Office Systems. Proposal.

    ERIC Educational Resources Information Center

    Gallaudet Coll., Washington, DC. School of Preparatory Studies.

    This proposal culminates a 5-year study of the possibility of awarding associate degrees at Gallaudet College, a private, liberal arts college for hearing-impaired adults. The proposal outlines an Associate of Applied Science degree (AAS) in Office Systems at the School of Preparatory Studies. First, introductory material provides a brief history…

  18. Applied Information Systems Research Program (AISRP) Workshop 3 meeting proceedings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The third workshop of the Applied Information Systems Research Program (AISRP) met at the University of Colorado's Laboratory for Atmospheric and Space Physics in August of 1993. The presentations were organized into four sessions: Artificial Intelligence Techniques; Scientific Visualization; Data Management and Archiving; and Research and Technology.

  19. Applying Technology Ranking and Systems Engineering in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Luna, Bernadette (Technical Monitor)

    2000-01-01

    According to the Advanced Life Support (ALS) Program Plan, the Systems Modeling and Analysis Project (SMAP) has two important tasks: 1) prioritizing investments in ALS Research and Technology Development (R&TD), and 2) guiding the evolution of ALS systems. Investments could be prioritized simply by independently ranking different technologies, but we should also consider a technology's impact on system design. Guiding future ALS systems will require SMAP to consider many aspects of systems engineering. R&TD investments can be prioritized using familiar methods for ranking technology. The first step is gathering data on technology performance, safety, readiness level, and cost. Then the technologies are ranked using metrics or by decision analysis using net present economic value. The R&TD portfolio can be optimized to provide the maximum expected payoff in the face of uncertain future events. But more is needed. The optimum ALS system can not be designed simply by selecting the best technology for each predefined subsystem. Incorporating a new technology, such as food plants, can change the specifications of other subsystems, such as air regeneration. Systems must be designed top-down starting from system objectives, not bottom-up from selected technologies. The familiar top-down systems engineering process includes defining mission objectives, mission design, system specification, technology analysis, preliminary design, and detail design. Technology selection is only one part of systems analysis and engineering, and it is strongly related to the subsystem definitions. ALS systems should be designed using top-down systems engineering. R&TD technology selection should consider how the technology affects ALS system design. Technology ranking is useful but it is only a small part of systems engineering.
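    The ranking step described above (scoring technologies on performance, safety, readiness, and cost) can be sketched as a weighted-sum decision analysis. The attributes, weights, and candidate technologies below are hypothetical; a real SMAP analysis would also apply readiness-level gating and net-present-value calculations.

```python
def rank_technologies(techs, weights):
    """Rank (name, attributes) pairs by a weighted sum of normalized
    attributes in [0, 1], higher being better. Cost is entered as a
    'lower is better' attribute, so it is inverted before weighting.
    """
    def score(attrs):
        return sum(weights[k] * (1 - v if k == "cost" else v)
                   for k, v in attrs.items())
    return sorted(techs, key=lambda t: score(t[1]), reverse=True)

# Hypothetical figures for two life-support technology options.
weights = {"performance": 0.4, "safety": 0.3, "readiness": 0.2, "cost": 0.1}
candidates = [
    ("physico-chemical", {"performance": 0.8, "safety": 0.9,
                          "readiness": 0.9, "cost": 0.6}),
    ("bioregenerative",  {"performance": 0.9, "safety": 0.7,
                          "readiness": 0.4, "cost": 0.8}),
]
ranking = rank_technologies(candidates, weights)
print([name for name, _ in ranking])
```

    As the abstract cautions, such an independent ranking is only a starting point: picking the top-scoring technology per subsystem ignores the system-level interactions that top-down systems engineering must capture.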

  20. Applied Nonlinear Dynamics and Stochastic Systems Near The Millenium. Proceedings

    SciTech Connect

    Kadtke, J.B.; Bulsara, A.

    1997-12-01

    These proceedings represent papers presented at the Applied Nonlinear Dynamics and Stochastic Systems conference held in San Diego, California in July 1997. The conference emphasized the applications of nonlinear dynamical systems theory in fields as diverse as neuroscience and biomedical engineering, fluid dynamics, chaos control, nonlinear signal/image processing, stochastic resonance, devices, and nonlinear dynamics in socio-economic systems. There were 56 papers presented at the conference and 5 have been abstracted for the Energy Science and Technology database. (AIP)

  1. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  2. Creating a System for Data-Driven Decision-Making: Applying the Principal-Agent Framework

    ERIC Educational Resources Information Center

    Wohlstetter, Priscilla; Datnow, Amanda; Park, Vicki

    2008-01-01

    The purpose of this article is to improve our understanding of data-driven decision-making strategies that are initiated at the district or system level. We apply principal-agent theory to the analysis of qualitative data gathered in a case study of 4 urban school systems. Our findings suggest educators at the school level need not only systemic…

  3. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of shared characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time, the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 of the 129 volcanic edifices. Thus, when characters are slightly changed, the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive-product and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. The uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes and could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic igneous processes.
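    A true cladistic analysis searches for the most parsimonious tree with dedicated phylogenetics software; as a rough stand-in for grouping taxa by shared characteristics, the sketch below clusters a hypothetical binary character matrix by Hamming distance. The taxa, characters, and single-linkage procedure are illustrative assumptions, not the authors' method or data.

```python
def hamming(a, b):
    """Number of differing character states between two taxa."""
    return sum(x != y for x, y in zip(a, b))

def single_linkage(taxa, n_groups):
    """Naive agglomerative clustering: repeatedly merge the two closest
    groups (minimum pairwise Hamming distance) until n_groups remain.
    A crude stand-in for parsimony-based grouping."""
    groups = [[name] for name in taxa]
    while len(groups) > n_groups:
        best = None
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                d = min(hamming(taxa[a], taxa[b])
                        for a in groups[i] for b in groups[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        groups[i] += groups[j]
        del groups[j]
    return [sorted(g) for g in groups]

# Hypothetical binary character matrix (1 = character present), loosely
# mimicking size / composition / morphology characters for five edifices.
taxa = {
    "stratovolcano_1": (1, 1, 0, 0, 1),
    "stratovolcano_2": (1, 1, 0, 1, 1),
    "shield_1":        (0, 0, 1, 0, 0),
    "shield_2":        (0, 0, 1, 1, 0),
    "dome":            (1, 0, 0, 0, 0),
}
print(single_linkage(taxa, n_groups=2))
```

    On this toy matrix the shield edifices separate cleanly from the rest, echoing the paper's finding that character-based grouping recovers coherent volcano types.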

  4. Scanning proton microprobe analysis applied to wood and bark samples

    NASA Astrophysics Data System (ADS)

    Lövestam, N. E. G.; Johansson, E.-M.; Johansson, S. A. E.; Pallon, J.

    1990-04-01

    In this study the feasibility of applying scanning micro-PIXE to the analysis of wood and bark samples is demonstrated. Elemental maps of the analysed sections show the patterns of Cl, K, Ca, Mn, Fe, Cu and Zn. Some of these patterns can be related to the annual tree-ring structure. It is observed that the variation of elements of an environmental character can be rather large within a single tree ring, illuminating possible difficulties in using tree-ring sections as a pollution monitor. The variations in elemental concentrations when crossing from bark to wood are also shown to be smooth for some elements but rather abrupt for others.

  5. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines

  6. Systems engineering and analysis

    SciTech Connect

    Blanchard, B.S.; Fabrycky, W.J.

    1981-01-01

    An introduction to systems is provided and tools for systems analysis are considered, taking into account system definitions and concepts, approaches for bringing systems into being, models in systems analysis, economic analysis techniques, mathematical modeling and optimization, probability and statistics, queuing theory and analysis, and control concepts and techniques. The system design process is discussed along with the design for operational feasibility, systems engineering management, and system design case studies. Attention is given to conceptual design, preliminary system design, detail design and development, system test and evaluation, design for reliability, design for maintainability, design for supportability, design for economic feasibility, communication system design, finite population system design, energy storage system design, and procurement-inventory system design.
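    Among the analysis tools the book covers, queuing theory lends itself to a compact worked example: the standard steady-state formulas for an M/M/1 queue (Poisson arrivals, exponential service, one server). The arrival and service rates below are arbitrary illustration values.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics for an M/M/1 queue (requires utilization < 1)."""
    rho = arrival_rate / service_rate            # server utilization
    if rho >= 1:
        raise ValueError("queue is unstable: arrivals outpace service")
    return {
        "rho": rho,
        "L":  rho / (1 - rho),                    # mean number in system
        "W":  1 / (service_rate - arrival_rate),  # mean time in system
        "Lq": rho ** 2 / (1 - rho),               # mean queue length
        "Wq": rho / (service_rate - arrival_rate) # mean wait in queue
    }

# Example: 8 jobs/hour arriving at a server handling 10 jobs/hour.
m = mm1_metrics(8, 10)
print(m["rho"], m["L"], m["W"])
```

    Note how sharply congestion grows with utilization: at rho = 0.8 an average of four jobs are already in the system, and pushing rho toward 1 makes L and W diverge.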

  7. Nuclear safety as applied to space power reactor systems

    SciTech Connect

    Cummings, G.E.

    1987-01-01

    To develop a strategy for incorporating and demonstrating safety, it is necessary to enumerate the unique aspects of space power reactor systems from a safety standpoint. These features must be differentiated from terrestrial nuclear power plants so that our experience can be applied properly. Some ideas can then be developed on how designs can be achieved that are both safe and perceived to be safe by the public. These ideas include operating only after achieving a stable orbit, developing an inherently safe design, "designing" in safety from the start, and managing the system development (design) so that it is perceived as safe. These and other ideas are explored further in this paper.

  8. Discrete Event Supervisory Control Applied to Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Shah, Neerav

    2005-01-01

    The theory of discrete event supervisory (DES) control was applied to the optimal control of a twin-engine aircraft propulsion system and demonstrated in a simulation. The supervisory control, which is implemented as a finite-state automaton, oversees the behavior of a system and manages it in such a way that it maximizes a performance criterion, similar to a traditional optimal control problem. DES controllers can be nested such that a high-level controller supervises multiple lower level controllers. This structure can be expanded to control huge, complex systems, providing optimal performance and increasing autonomy with each additional level. The DES control strategy for propulsion systems was validated using a distributed testbed consisting of multiple computers--each representing a module of the overall propulsion system--to simulate real-time hardware-in-the-loop testing. In the first experiment, DES control was applied to the operation of a nonlinear simulation of a turbofan engine (running in closed loop using its own feedback controller) to minimize engine structural damage caused by a combination of thermal and structural loads. This enables increased on-wing time for the engine through better management of the engine-component life usage. Thus, the engine-level DES acts as a life-extending controller through its interaction with and manipulation of the engine's operation.
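    The supervisory structure described above (a finite-state automaton that observes plant events and disables controllable events state by state) can be sketched minimally. The state names and the engine-life scenario are hypothetical illustrations; the actual DES controllers were synthesized formally from plant models.

```python
class Supervisor:
    """Minimal discrete event supervisor: a finite-state automaton that
    tracks observed plant events and disables controllable events
    according to its current state."""

    def __init__(self, transitions, disabled, initial):
        self.transitions = transitions  # (state, event) -> next state
        self.disabled = disabled        # state -> set of disabled events
        self.state = initial

    def allowed(self, event):
        return event not in self.disabled.get(self.state, set())

    def observe(self, event):
        # Unlisted events leave the supervisor state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)

# Hypothetical engine-life supervisor: after a "hot_section_limit" event
# it disables further "accel" commands until "cooldown" is observed.
sup = Supervisor(
    transitions={("normal", "hot_section_limit"): "protect",
                 ("protect", "cooldown"): "normal"},
    disabled={"protect": {"accel"}},
    initial="normal",
)
sup.observe("hot_section_limit")
print(sup.state, sup.allowed("accel"))
```

    Nesting, as the abstract describes, amounts to letting a higher-level automaton of this form enable or disable the events of several lower-level supervisors.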

  9. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical modal decomposition (EMD) applied to the analysis and denoising of electrocardiogram and phonocardiogram signals. The objective of this work is to automatically detect cardiac anomalies in a patient. Because these anomalies are localized in time, the localization of all events must be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) overcomes the localization problem, but interpretation of the result remains too difficult to characterize the signal precisely. In this work we propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals; the analysis and interpretation of these signals are given in the same section. Finally, we introduce an adaptation of the EMD algorithm that appears to be very efficient for denoising.
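
    As a concrete illustration of the sifting idea behind EMD (not the authors' implementation), the sketch below brackets local extrema with cubic-spline envelopes and repeatedly subtracts the envelope mean to isolate the fastest oscillatory mode. The test signal is an invented stand-in for a pseudo-periodic cardiac trace.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, t, n_iter=10):
    """Extract one intrinsic mode function by repeated sifting (simplified)."""
    h = x.astype(float).copy()
    for _ in range(n_iter):
        # interior local maxima and minima
        maxi = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mini = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxi) < 4 or len(mini) < 4:
            break
        upper = CubicSpline(t[maxi], h[maxi])(t)   # upper envelope
        lower = CubicSpline(t[mini], h[mini])(t)   # lower envelope
        h = h - (upper + lower) / 2.0              # subtract local mean
    return h

# Pseudo-periodic test signal: fast oscillation riding on a slow component.
t = np.linspace(0.0, 1.0, 2000)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
imf1 = sift(x, t)       # approximates the fast 50 Hz component
residual = x - imf1     # carries the slow 5 Hz component
```

    Real EMD adds a stopping criterion per mode and iterates on the residual to obtain the full set of modes; denoising then amounts to discarding or thresholding selected modes.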

  10. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  11. Performance Measurement Analysis System

    Energy Science and Technology Software Center (ESTSC)

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.

  12. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves administering specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups, a process which has previously been done by eye. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (DICE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
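
    A minimal sketch of this kind of clustering on synthetic binary response data. The answer patterns, flip rate, Hamming metric, and average linkage below are illustrative assumptions, not the thesis's actual choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Three hypothetical "answer patterns" to a 12-question instrument.
prototypes = np.array([
    [1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0],
    [0, 0, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1],
    [1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1],
])
rng = np.random.default_rng(0)
# 30 simulated students per pattern, each answer flipped with 5% probability.
students = np.vstack([
    np.abs(p - (rng.random((30, 12)) < 0.05)) for p in prototypes
])

# Agglomerative clustering with a distance suited to binary responses.
d = pdist(students, metric="hamming")
Z = linkage(d, method="average")
labels = fcluster(Z, t=3, criterion="maxclust")  # cut tree into 3 groups
```

    In practice the number of groups is not known in advance; inspecting where the linkage tree is cut is one of the "considerations" the abstract alludes to.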

  13. Soft tissue cephalometric analysis applied to regional Indian population

    PubMed Central

    Upadhyay, Jay S.; Maheshwari, Sandhya; Verma, Sanjeev K.; Zahid, Syed Naved

    2013-01-01

    Introduction: The importance of soft tissue considerations in establishing treatment goals for orthodontics and orthognathic surgery has been recognized, and various cephalometric analyses incorporating soft tissue parameters have evolved. Great variance exists in the soft tissue drape of the human face and in the perception of esthetics, and normative data based on one population group cannot be applied to all. This study was conducted to compare the standard soft tissue cephalometric analysis (STCA) norms with norms derived for the population of the western Uttar Pradesh region of India. Materials and Methods: The sample consisted of lateral cephalograms taken in natural head position of 33 normal subjects (16 males, 17 females). The cephalograms were analyzed with the soft tissue cephalometric analysis for orthodontic diagnosis and treatment planning, and Student's t test was used to compare the difference in means between the study population and standard STCA norms. Results: Compared with established STCA norms, females in our study had a steeper maxillary occlusal plane, more proclined mandibular incisors, and less protrusive lips. Both males and females showed an overall decrease in facial lengths, less prominent midface and mandibular structures, and a more convex profile compared with established norms for the White population. Conclusions: Statistically significant differences were found in certain key parameters of the STCA for the western Uttar Pradesh population when compared with established norms. PMID:24665169

  14. Seismic analysis applied to the delimiting of a gas reservoir

    SciTech Connect

    Ronquillo, G.; Navarro, M.; Lozada, M.; Tafolla, C.

    1996-08-01

    We present the results of correlating seismic models with petrophysical parameters and well logs to mark the limits of a gas reservoir in sand lenses. To fulfill the objectives of the study, we used a data processing sequence that included wavelet manipulation, complex trace attributes and pseudovelocity inversion, along with several quality control schemes to ensure proper amplitude preservation. Based on the analysis and interpretation of the seismic sections, several areas of interest were selected to apply additional signal treatment as preconditioning for petrophysical inversion. Signal classification was performed to control the amplitudes along the horizons of interest and to allow an indirect interpretation of lithologies. Additionally, seismic modeling was done to support the results obtained and to help integrate the interpretation. The study proved to be a good auxiliary tool in locating the probable extension of the gas reservoir in sand lenses.

  15. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  16. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD. PMID:26373767

  17. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
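
    The per-pixel calibration step can be sketched as follows. The numbers are synthetic: each pixel's greyscale response is recorded at a few known, uniformly mixed dye concentrations, and an instantaneous frame is then inverted pixel by pixel through that calibration curve.

```python
import numpy as np

# Known uniform dye concentrations used for calibration (illustrative).
known_conc = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

rng = np.random.default_rng(1)
# Synthetic calibration stack: greyscale decreases with dye concentration,
# with a slightly different gain at every pixel of a 40x60 image.
gain = 200 + 10 * rng.random((40, 60))
cal = 255 - gain[None, :, :] * known_conc[:, None, None]

def to_density(frame, cal, known_conc):
    """Map a greyscale frame to concentration, pixel by pixel."""
    out = np.empty(frame.shape, dtype=float)
    h, w = frame.shape
    for i in range(h):
        for j in range(w):
            # np.interp needs increasing x; greyscale falls as dye increases,
            # so reverse the calibration curve before interpolating.
            g = cal[::-1, i, j]
            c = known_conc[::-1]
            out[i, j] = np.interp(frame[i, j], g, c)
    return out

frame = cal[2]                       # a frame recorded at the 0.5 level
dens = to_density(frame, cal, known_conc)
```

    The mass-conservation correction mentioned in the abstract would then rescale `dens` so that the integrated dye mass in the tank stays constant over time.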

  18. Systemic toxicity of dermally applied crude oils in rats

    SciTech Connect

    Feuston, M.H.; Mackerer, C.R.; Schreiner, C.A.; Hamilton, C.E.

    1997-12-31

    Two crude oils, differing in viscosity (V) and nitrogen (N) and sulfur (S) content, were evaluated for systemic toxicity. In the Crude I (low V, low N, low S) study, the material was applied to the clipped backs of rats at dose levels of 0, 30, 125, and 500 mg/kg. In the Crude II (high V, high N, moderate S) study, the oil was applied similarly at the same dose levels. The crude oils were applied for 13 wk, 5 d/wk. Exposure sites were not occluded. Mean body weight gain (wk 1-14) was significantly reduced in male rats exposed to Crude II; body weight gain of all other animals was not adversely affected by treatment. An increase in absolute (A) and relative (R) liver weights and a decrease in A and R thymus weights were observed in male and female rats exposed to Crude II at 500 mg/kg; only liver weights (A and R) were adversely affected in male and female rats exposed to Crude I. In general, there was no consistent pattern of toxicity for serum chemistry endpoints; however, more parameters were adversely affected in Crude II-exposed female rats than in the other exposed groups. A consistent pattern of toxicity for hematology endpoints was observed among male rats exposed to Crude I and male and female rats exposed to Crude II. Parameters affected included: for Crudes I and II, red blood cell count, hemoglobin, and hematocrit; for Crude II, platelet count. Microscopic evaluation of tissues revealed the following treatment-related findings: Crude I, treated skin, thymus, and thyroid; Crude II, bone marrow, treated skin, thymus, and thyroid. The LOEL (lowest observable effect level) for skin irritation and systemic toxicity (based on marginal effects on the thyroid) for both crude oils was 30 mg/kg; effects were more numerous and more pronounced in animals exposed to Crude II. Systemic effects are probably related to concentrations of polycyclic aromatic compounds (PAC) found in crude oil.

  19. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.

  20. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractal and multifractal are concepts that have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, commonly using the least squares method. This should not pose a special problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier point is simply an extreme observation drawn from the tail of a normal distribution, which would compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points of the experimental data to be used, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology that involves two basic steps: • Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, thereby considering the implications of reducing the number of points. • Evaluation of the significance of the difference in slope between the fit including the two extreme points and the fit with the remaining available points. We compare the results of applying this methodology and the commonly used least squares one. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
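
    A minimal contrast between ordinary least squares and a robust (Huber-loss) fit on synthetic log-log scaling data. The slope, noise level, and break-of-scaling values are invented for illustration; this is a simpler stand-in for the paper's own point-selection methodology.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
# Synthetic log-log scaling data with true slope -1.6, plus two outlying
# points at the coarse end, mimicking a break of scaling.
log_eps = np.linspace(0.0, 4.0, 20)
log_N = -1.6 * log_eps + 0.3 + 0.02 * rng.standard_normal(20)
log_N[-2:] += 1.0                      # break of scaling at coarse scales

def resid(p, x, y):
    """Residuals of the straight-line model y = p[0]*x + p[1]."""
    return y - (p[0] * x + p[1])

ols = np.polyfit(log_eps, log_N, 1)                       # least squares
rob = least_squares(resid, x0=[0.0, 0.0], loss="huber", f_scale=0.1,
                    args=(log_eps, log_N)).x              # robust (Huber)
```

    The Huber loss leaves small residuals quadratic but penalizes large ones only linearly, so the two end points no longer drag the slope; the robust slope lands near -1.6 while the least squares slope is visibly biased.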

  1. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require solution of lower and upper triangular systems of equations per iteration which are difficult to parallelize without deteriorating the convergence rate. The explicit type of preconditionings (e.g. polynomial preconditioners or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
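
    The iteration these preconditioners accelerate is preconditioned conjugate gradients. The sketch below uses a simple Jacobi (diagonal) preconditioner as a stand-in, since an actual FSAI factor G would enter only through the application step M⁻¹r = Gᵀ(G r); the test matrix is a toy SPD system, not a finite element stiffness matrix.

```python
import numpy as np

def pcg(A, b, apply_Minv, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradients for a symmetric positive definite A.
    apply_Minv(r) applies the (approximate inverse) preconditioner; with an
    FSAI factor G it would be: lambda r: G.T @ (G @ r)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # conjugate search direction update
        rz = rz_new
    return x

# Toy SPD system; Jacobi scaling stands in for the FSAI factor here.
n = 200
A = np.diag(np.linspace(1.0, 100.0, n))
A[0, 1] = A[1, 0] = 0.5
b = np.ones(n)
x = pcg(A, b, lambda r: r / np.diag(A))
```

    The parallelism argument in the abstract hinges on exactly this structure: applying an explicit factor like G is a sparse matrix-vector product, whereas applying an incomplete factorization requires triangular solves that serialize poorly.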

  2. Applying principles of health system strengthening to eye care

    PubMed Central

    Blanchet, Karl; Patel, Daksha

    2012-01-01

    Health systems have now become the priority focus of researchers and policy makers, who have progressively moved away from a project-centred perspective. The new tendency is to facilitate a convergence between health system developers and disease-specific programme managers in terms of both thinking and action, and to reconcile both approaches: one focusing on integrated health systems and improving the health status of the population, the other aiming at improving access to health care. Eye care interventions, particularly in developing countries, have generally been implemented vertically (e.g. trachoma, cataract surgeries), often with parallel organizational structures or specialised disease-specific services. With the emergence of health system strengthening in health strategies and in the service delivery of interventions, there is a need to clarify and examine inputs in terms of governance, financing, and management. The present paper aims to clarify key concepts in health system strengthening and describe the various components of the framework as applied in eye care interventions. PMID:22944762

  3. Certification methodology applied to the NASA experimental radar system

    NASA Technical Reports Server (NTRS)

    Britt, Charles L.; Switzer, George F.; Bracalente, Emedio M.

    1994-01-01

    The objective of the research is to apply selected FAA certification techniques to the NASA experimental wind shear radar system. Although there is no intent to certify the NASA system, the procedures developed may prove useful to manufacturers that plan to undergo the certification process. The certification methodology for forward-looking wind shear detection radars will require estimation of system performance in several FAA-specified microburst/clutter scenarios as well as the estimation of probabilities of missed and false hazard alerts under general operational conditions. Because of the near-impossibility of obtaining these results experimentally, analytical and simulation approaches must be used. Hazard detection algorithms were developed that derived predictive estimates of aircraft hazard from basic radar measurements of weather reflectivity and radial wind velocity. These algorithms were designed to prevent false alarms due to ground clutter while providing accurate predictions of hazard to the aircraft due to weather. A method of calculation of the probability of missed and false hazard alerts has been developed that takes into account the effect of the various algorithms used in the system and provides estimates of the probability of missed and false alerts per microburst encounter under weather conditions found at Denver, Kansas City, and Orlando. Simulation techniques have been developed that permit the proper merging of radar ground clutter data (obtained from flight tests) with simulated microburst data (obtained from microburst models) to estimate system performance using the microburst/clutter scenarios defined by the FAA.
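
    The missed/false alert probability calculation described above can be illustrated with a toy threshold model. The Gaussian distributions of the estimated hazard factor under clutter-only and microburst conditions are invented for illustration; only the 0.105 alert threshold echoes the commonly cited F-factor hazard convention for wind shear.

```python
from scipy.stats import norm

# Toy model: the radar-derived hazard factor is treated as Gaussian under
# clutter-only conditions and under a genuine microburst encounter.
# Both distributions below are assumptions, not flight-test values.
mu_clutter, mu_hazard, sigma = 0.05, 0.13, 0.02
threshold = 0.105            # alert if estimated hazard factor exceeds this

# False alert: clutter alone pushes the estimate above the threshold.
p_false_alert = 1 - norm.cdf(threshold, mu_clutter, sigma)
# Missed alert: a real microburst stays below the threshold.
p_missed_alert = norm.cdf(threshold, mu_hazard, sigma)
```

    Moving the threshold trades one error for the other, which is why the certification scenarios fix the clutter environments under which both probabilities must be estimated.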

  4. Applying principles of health system strengthening to eye care.

    PubMed

    Blanchet, Karl; Patel, Daksha

    2012-01-01

    Health systems have now become the priority focus of researchers and policy makers, who have progressively moved away from a project-centred perspective. The new tendency is to facilitate a convergence between health system developers and disease-specific programme managers in terms of both thinking and action, and to reconcile both approaches: one focusing on integrated health systems and improving the health status of the population, the other aiming at improving access to health care. Eye care interventions, particularly in developing countries, have generally been implemented vertically (e.g. trachoma, cataract surgeries), often with parallel organizational structures or specialised disease-specific services. With the emergence of health system strengthening in health strategies and in the service delivery of interventions, there is a need to clarify and examine inputs in terms of governance, financing, and management. The present paper aims to clarify key concepts in health system strengthening and describe the various components of the framework as applied in eye care interventions. PMID:22944762

  5. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakages from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Development of a simulation model of the roof assembly will enable a risk and sensitivity analysis, in which the most important varying parameters for the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and to determine the most appropriate building materials for a given climate.
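
    The probabilistic approach can be sketched as follows: sample the varying inputs, evaluate a hygrothermal response model per sample, and read off the fraction of samples crossing a damage criterion. The input distributions, the linear stand-in response model, and the 80 %-RH mold criterion below are all illustrative assumptions, not the study's models.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000   # Monte Carlo samples

# Varying input parameters (distributions are invented for illustration).
indoor_excess = rng.normal(2.0, 0.8, n)     # indoor moisture excess, g/m3
leakage = rng.lognormal(-1.0, 0.5, n)       # attic-floor air leakage factor
solar_abs = rng.uniform(0.6, 0.9, n)        # roof surface solar absorptivity

# Stand-in response model: attic relative humidity rises with moisture load
# and leakage, and falls with solar-driven drying of the roof deck.
attic_rh = 55 + 8 * indoor_excess + 20 * leakage - 25 * (solar_abs - 0.6)

# Risk estimate: fraction of sampled scenarios exceeding the mold criterion.
p_mold = float(np.mean(attic_rh > 80))
```

    A sensitivity analysis then follows naturally, e.g. by correlating each sampled input with `attic_rh` to rank which varying parameters matter most.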

  6. System engineering applied to VLTI: a scientific success

    NASA Astrophysics Data System (ADS)

    Haguenauer, P.; Alonso, J.; Bourget, P.; Gitton, Ph.; Morel, S.; Poupar, S.; Schuhler, Nicolas

    2014-07-01

    The ESO Very Large Telescope Interferometer (VLTI) offers access to the four 8-m Unit Telescopes (UT) and the four 1.8-m Auxiliary Telescopes (AT) of the Paranal Observatory. After the first fringes obtained in 2001 with the commissioning instrument VINCI and with siderostats, the VLTI has seen an important number of system upgrades, paving the path towards reaching the infrastructure level and scientific results it had been designed for. The current status of the VLTI, operating all year round with up to four telescopes simultaneously and with real imaging capability, demonstrates the powerful interferometric infrastructure that has been delivered to the astronomical community. Reaching today's level of robustness and operability of the VLTI has been a long journey, with a lot of lessons learned and gained experience. In 2007, the Paranal Observatory recognized the need for a global system approach for the VLTI, and a dedicated system engineering team was set up to analyse the status of the interferometer, identify weak points and areas where performances were not met, and propose and apply solutions. The gains of this specific effort can be found today in the very good operability level with faster observation execution, in the decreased downtime, in the improved performances, and in the better reliability of the different systems. We will present a historical summary of the system engineering effort done at the VLTI, showing the strategy used and the implemented upgrades and technical solutions. Improvements in terms of scientific data quality will be highlighted where possible. We will conclude on the legacy of the VLTI system engineering effort, for the VLTI and for future systems.

  7. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-upsilon spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
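
    The core of a multitaper estimate is averaging periodograms computed with orthogonal Slepian (DPSS) tapers, which reduces the variance roughly by the number of tapers. The sketch below uses a generic sinusoid-plus-noise signal, and the time-bandwidth product NW and taper count K are illustrative choices, not GONG's.

```python
import numpy as np
from scipy.signal.windows import dpss

# Test signal: a 10 Hz line buried in white noise, sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + np.random.default_rng(3).standard_normal(t.size)

NW, K = 4, 5                        # time-bandwidth product, number of tapers
tapers = dpss(t.size, NW, Kmax=K)   # shape (K, N), orthonormal Slepian tapers

# One "eigenspectrum" per taper, then average across tapers.
eigspecs = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
mt_spec = eigspecs.mean(axis=0)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(mt_spec)]    # should sit near the 10 Hz line
```

    The broadening of narrow modes noted in the abstract is visible here too: the averaged estimate smears the line over a band of width roughly 2·NW frequency bins, which is the price paid for the variance reduction.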

  8. Advanced solar irradiances applied to satellite and ionospheric operational systems

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent; Schunk, Robert; Eccles, Vince; Bouwer, Dave

    Satellite and ionospheric operational systems require solar irradiances in a variety of time scales and spectral formats. We describe the development of a system using operational grade solar irradiances that are applied to empirical thermospheric density models and physics-based ionospheric models used by operational systems that require a space weather characterization. The SOLAR2000 (S2K) and SOLARFLARE (SFLR) models developed by Space Environment Technologies (SET) provide solar irradiances from the soft X-rays (XUV) through the Far Ultraviolet (FUV) spectrum. The irradiances are provided as integrated indices for the JB2006 empirical atmosphere density models and as line/band spectral irradiances for the physics-based Ionosphere Forecast Model (IFM) developed by the Space Environment Corporation (SEC). We describe the integration of these irradiances in historical, current epoch, and forecast modes through the Communication Alert and Prediction System (CAPS). CAPS provides real-time and forecast HF radio availability for global and regional users and global total electron content (TEC) conditions.

  9. Painleve singularity analysis applied to charged particle dynamics during reconnection

    SciTech Connect

    Larson, J.W.

    1992-01-01

    For a plasma in the collisionless regime, test-particle modelling can lend some insight into the macroscopic behavior of the plasma, e.g. conductivity and heating. A common example for which this technique is used is a system with electric and magnetic fields given by B = δy x̂ + γz ŷ + y ẑ and E = ε ẑ, where δ, γ, and ε are constant parameters. This model can be used to describe plasma behavior near neutral lines (γ = 0) as well as current sheets (γ = 0, δ = 0). The integrability properties of the particle motion in such fields might affect the plasma's macroscopic behavior, and the author has asked the question "For what values of δ, γ, and ε is the system integrable?" To answer this question, the author has employed Painlevé singularity analysis, which is an examination of the singularity properties of a test particle's equations of motion in the complex time plane. This analysis has identified two field geometries for which the system's particle dynamics are integrable in terms of the second Painlevé transcendent: the circular O-line case and the case of the neutral sheet configuration. These geometries yield particle dynamics that are integrable in the Liouville sense (i.e., there exist the proper number of integrals in involution) in an extended phase space which includes the time as a canonical coordinate, and this property also holds for nonzero γ. The singularity property tests also identified a large, dense set of X-line and O-line field geometries that yield dynamics that may possess the weak Painlevé property. In the case of the X-line geometries, this result shows little relevance to the physical nature of the system, but the existence of a dense set of elliptical O-line geometries with this property may be related to the fact that for ε positive, one can construct asymptotic solutions in the limit t → ∞.

  10. Generalized Statistical Thermodynamics Applied to Small Material Systems

    NASA Astrophysics Data System (ADS)

    Cammarata, Robert

    2012-02-01

    When characterizing the behavior of small material systems, surface effects can strongly influence the thermodynamic behavior and need to be taken into account in a complete thermal physics analysis. Although a variety of approaches have been proposed to incorporate surface effects, they are often restricted to certain types of systems (e.g., those involving incompressible phases) and frequently invoke thermodynamic parameters that are not well-defined for the surface. It is proposed that a generalized statistical mechanics based on the concept of thermodynamic availability (exergy) can be formulated, from which the surface properties and their influence on system behavior can be naturally and rigorously obtained. This availability-based statistical thermodynamics will be presented and its use illustrated in a treatment of nucleation during crystallization.

  11. Non-Harmonic Analysis Applied to Optical Coherence Tomography Imaging

    NASA Astrophysics Data System (ADS)

    Cao, Xu; Uchida, Tetsuya; Hirobayashi, Shigeki; Chong, Changho; Morosawa, Atsushi; Totsuka, Koki; Suzuki, Takuya

    2012-02-01

    A new processing technique called non-harmonic analysis (NHA) is proposed for optical coherence tomography (OCT) imaging. Conventional Fourier-domain OCT employs the discrete Fourier transform (DFT), which depends on the window function and length. The axial resolution of an OCT image calculated by using the DFT is inversely proportional to the full width at half maximum (FWHM) of the wavelength range. The FWHM of the wavelength range is limited by the sweeping range of the source in swept-source OCT and by the number of CCD pixels in spectral-domain OCT. However, the NHA process does not have such constraints; NHA can resolve high frequencies irrespective of the window function and the frame length of the sampled data. In this study, the NHA process is described and applied to OCT imaging, and the results are compared with OCT images based on the DFT. To demonstrate the benefits of using NHA for OCT, we perform OCT imaging of an onion skin with NHA. The results reveal that NHA can achieve an image resolution equivalent to that of a 100-nm sweep range while using a significantly reduced wavelength range. They also reveal the potential of this technique to achieve high-resolution imaging without using a broadband source. However, the long calculation times required for NHA must be addressed if it is to be used in clinical applications.
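
    The contrast between DFT bin-limited resolution and a fit-based approach can be sketched numerically. The snippet below is a hedged illustration, not the authors' NHA implementation: the grid-search least-squares fit and all signal parameters are assumptions chosen to show why model fitting is not tied to the DFT bin spacing.

```python
import numpy as np

def dft_peak_freq(x, fs):
    # DFT peak: frequency resolution is tied to the record length (fs / N).
    spectrum = np.abs(np.fft.rfft(x))
    return np.argmax(spectrum) * fs / len(x)

def fit_freq(x, fs, f_lo, f_hi, steps=2000):
    # Least-squares fit of one sinusoid over a fine frequency grid; the
    # candidate frequencies are not constrained to DFT bins, which is the
    # key idea behind non-harmonic (model-fitting) analysis.
    t = np.arange(len(x)) / fs
    best_f, best_err = f_lo, np.inf
    for f in np.linspace(f_lo, f_hi, steps):
        basis = np.column_stack([np.cos(2 * np.pi * f * t),
                                 np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(basis, x, rcond=None)
        err = np.sum((x - basis @ coef) ** 2)
        if err < best_err:
            best_f, best_err = f, err
    return best_f

fs, n, f_true = 1000.0, 64, 237.3        # 64 samples -> DFT bins 15.6 Hz apart
x = np.sin(2 * np.pi * f_true * np.arange(n) / fs)
f_dft = dft_peak_freq(x, fs)             # snaps to the nearest bin
f_nha = fit_freq(x, fs, 200.0, 260.0)    # resolves well below bin spacing
```

    On this noise-free record the DFT peak lands on the nearest 15.6 Hz bin, while the fit recovers the frequency to a small fraction of a bin; a real NHA implementation must also cope with noise and multiple spectral components, which is where its long calculation times arise.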

  12. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields, such as Web 2.0 for Web applications and product development in industries. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club, or K-plex, are NP-complete and are not easily solved on traditional computer architectures. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities, discussed in the paper, demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337
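
    The combinatorial blow-up that motivates the DNA-computing approach is easy to see with a naive conventional algorithm. This sketch is illustrative only; the toy graph is an assumption, and the real SNA problems cited above involve far larger networks and the N-clique/N-clan/N-club generalizations.

```python
from itertools import combinations

def cliques_of_size(adj, k):
    # Exhaustive search: test every k-subset of vertices for pairwise
    # adjacency. The number of subsets grows combinatorially with the
    # network size, which is why exact clique finding does not scale on
    # conventional architectures.
    return [set(sub) for sub in combinations(adj, k)
            if all(v in adj[u] for u, v in combinations(sub, 2))]

# Toy social network: members a-d all know each other; e knows only a.
adj = {
    "a": {"b", "c", "d", "e"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"a", "b", "c"},
    "e": {"a"},
}
largest = cliques_of_size(adj, 4)
```

    DNA computing attacks the same search space by encoding candidate subsets in strands and testing them in parallel chemistry, rather than one subset at a time as above.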

  13. Applying digital signal processing to an optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to a system with an optical equisignal zone, as a new method of evaluating the position of the equisignal zone. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the position of the optical equisignal zone is applied to processing on the array detector. This algorithm makes it possible to evaluate both the lateral displacement and the tilt angles of the receiver relative to the projector. The interrelation of the parameters of the projector and the receiver is considered. An experimental setup was built according to the described principles and then characterized. The accuracy of the position evaluation of the equisignal zone is shown to depend on the size of the equivalent entrance pupil used in processing.

  14. Adaptive control applied to Space Station attitude control system

    NASA Technical Reports Server (NTRS)

    Lam, Quang M.; Chipman, Richard; Hu, Tsay-Hsin G.; Holmes, Eric B.; Sunkel, John

    1992-01-01

    This paper presents an adaptive control approach to enhance the performance of the current attitude control system used by Space Station Freedom. The proposed control law was developed based on the direct adaptive control, or model reference adaptive control, scheme. Performance comparisons, subject to inertia variation, of the adaptive controller and the fixed-gain linear quadratic regulator currently implemented for the Space Station are conducted. Both the fixed-gain and the adaptive-gain controllers are able to maintain Station stability for inertia variations of up to 35 percent. However, when a 50 percent inertia variation is applied to the Station, only the adaptive controller is able to maintain the Station attitude.
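
    The flavor of direct model reference adaptive control can be sketched with a scalar example. This is a generic textbook Lyapunov-rule MRAC simulation, not the Space Station controller; the plant, reference model, gains, and reference command are all assumed for illustration.

```python
# Unstable scalar plant dx/dt = a*x + b*u with a, b unknown to the
# controller (only sign(b) > 0 is assumed known).
a, b = 1.0, 2.0
# Reference model dxm/dt = am*xm + bm*r encodes the desired response.
am, bm = -4.0, 4.0
gamma = 2.0                      # adaptation gain
dt, steps = 1e-3, 30000          # 30 s of simulated time, Euler integration
r = 1.0                          # constant reference command

x = xm = 0.0
kx = kr = 0.0                    # adjustable feedback / feedforward gains
for _ in range(steps):
    u = kx * x + kr * r          # control law with adapted gains
    e = x - xm                   # tracking error drives the adaptation
    # Lyapunov-rule updates (valid because sign(b) is known positive):
    kx -= gamma * e * x * dt
    kr -= gamma * e * r * dt
    x += (a * x + b * u) * dt
    xm += (am * xm + bm * r) * dt
```

    The gains drift toward the matching values kx = (am - a)/b and kr = bm/b, and the tracking error decays even though a and b are never identified explicitly; an inertia change on the real Station plays the role of the unknown plant parameters here.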

  15. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  16. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  17. BATSE spectroscopy analysis system

    NASA Technical Reports Server (NTRS)

    Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad

    1992-01-01

    The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.

  18. Analysis of possibility of applying the PVDF foil in industrial vibration sensors

    NASA Astrophysics Data System (ADS)

    Wróbel, A.

    2015-11-01

    Many machines make use of piezoelectric effects. Systems with smart materials are often used because of their broad application potential; for example, transducers can be applied to obtain the required characteristics of a designed system. Every engineer and designer knows how important a proper mathematical model and method of analysis are. It is also important to consider all parameters of the analyzed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all of the system's components; omitting the influence of any one of them results in inaccuracy in the analysis of the system. This article covers the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. A method for analyzing vibrating sensor systems is presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on those characteristics. The main scientific aim of the project is to analyze and demonstrate the possibility of applying a new construction based on PVDF foil, or another member of the smart-materials family, in industrial sensors. Currently, vibration level sensors from practically all manufacturers use piezoelectric ceramic plates to generate and detect the vibration of the fork.

  19. Advanced imaging systems for diagnostic investigations applied to Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Peccenini, E.; Albertin, F.; Bettuzzi, M.; Brancaccio, R.; Casali, F.; Morigi, M. P.; Petrucci, F.

    2014-12-01

    Diagnostic investigations are an important resource in studies of Cultural Heritage, enhancing knowledge of the execution techniques, materials, and conservation status of a work of art. In this field, due to the great historical and artistic value of the objects, preservation is the main concern; for this reason, new technological equipment has been designed and developed in the Physics Departments of the Universities of Ferrara and Bologna to enhance the non-invasive approach to the study of pictorial artworks and other objects of cultural interest. Infrared (IR) reflectography, X-ray radiography and computed tomography (CT), applied to works of art, share the same goal: to obtain hidden information on execution techniques and inner structure while remaining non-invasive, although they use different setups and physical principles. In this work, transportable imaging systems for investigating large objects in museums and galleries are presented. In particular, 2D scanning devices for IR reflectography and X-ray radiography, CT systems, and some applications to Cultural Heritage are described.

  20. Database mining applied to central nervous system (CNS) activity.

    PubMed

    Pintore, M; Taboureau, O; Ros, F; Chrétien, J R

    2001-04-01

    A data set of 389 compounds, active in the central nervous system (CNS) and divided into eight classes according to the receptor type, was extracted from the RBI database and analyzed by Self-Organizing Maps (SOM), also known as Kohonen Artificial Neural Networks. This method gives a 2D representation of the distribution of the compounds in the hyperspace derived from their molecular descriptors. As SOM belongs to the category of unsupervised techniques, it has to be combined with another method in order to generate classification models with predictive ability. The fuzzy clustering (FC) approach seems to be particularly suitable to delineate clusters in a rational way from SOM and to get an automatic objective map interpretation. Maps derived by SOM showed specific regions associated with a unique receptor type and zones in which two or more activity classes are nested. Then, the modeling ability of the proposed SOM/FC Hybrid System tools applied simultaneously to eight activity classes was validated after dividing the 389 compounds into a training set and a test set, including 259 and 130 molecules, respectively. The proper experimental activity class, among the eight possible ones, was predicted simultaneously and correctly for 81% of the test set compounds. PMID:11461760
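
    The 2D-map idea behind the SOM step can be sketched with a minimal Kohonen implementation. This is a generic sketch on synthetic data, not the RBI data set or the authors' SOM/FC hybrid; the grid size, learning schedule, and the two toy "activity classes" are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0):
    # Self-Organizing Map: each grid node holds a weight vector; the best-
    # matching unit (BMU) and its grid neighbours are pulled toward each
    # sample, so nearby nodes come to represent similar descriptors.
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), (h, w))
        lr = lr0 * np.exp(-t / iters)            # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)      # shrinking neighbourhood
        dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        nb = np.exp(-dist2 / (2 * sigma ** 2))   # neighbourhood function
        weights += lr * nb[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

# Two synthetic "activity classes" as point clouds in descriptor space.
class_a = rng.normal([0.0, 0.0, 0.0], 0.1, (50, 3))
class_b = rng.normal([1.0, 1.0, 1.0], 0.1, (50, 3))
som = train_som(np.vstack([class_a, class_b]))
```

    After training, the two classes map to different regions of the grid; the fuzzy-clustering step in the paper then delineates such regions automatically instead of by visual inspection.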

  1. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful for extracting the modal parameters of operating systems. These methods are particularly attractive for investigating the modal basis of string instruments during operation, avoiding certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA, due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) method and a modified least-square complex exponential method in the case of a string instrument, to identify the modal parameters of the instrument while it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify the modes that are particularly present in the instrument's response, with good estimates, especially close to the excitation frequency, when the modified LSCE method is used.
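
    The core of the LSCE method, fitting decaying complex exponentials through a linear-prediction (Prony-type) step, can be sketched as follows. This is a generic single-mode illustration on synthetic data; the sampling rate, mode frequency, and damping are assumed, and real harp measurements involve many modes plus the harmonic excitation the paper discusses.

```python
import numpy as np

def lsce_poles(h, dt, order):
    # Linear-prediction step of LSCE: the free decay h[k] of a sum of
    # damped exponentials satisfies an AR recursion; solving it in the
    # least-squares sense and rooting the characteristic polynomial
    # yields the system poles.
    n = len(h) - order
    H = np.column_stack([h[i:i + n] for i in range(order)])
    a, *_ = np.linalg.lstsq(H, -h[order:order + n], rcond=None)
    z = np.roots(np.concatenate(([1.0], a[::-1])))
    return np.log(z.astype(complex)) / dt   # continuous-time poles

# Synthetic free decay of one mode (assumed: 440 Hz, 1 % damping).
fs = 4000.0
f0, zeta = 440.0, 0.01
wn = 2.0 * np.pi * f0
t = np.arange(2000) / fs
h = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1.0 - zeta ** 2) * t)

poles = lsce_poles(h, 1.0 / fs, order=2)
f_est = np.abs(poles.imag).max() / (2.0 * np.pi)       # damped frequency
zeta_est = -poles.real.max() / np.abs(poles).max()     # damping ratio
```

    On clean data the recovered frequency and damping match the synthetic mode almost exactly; the modified LSCE variant in the paper is designed to keep this accuracy when the response also contains the strings' damped harmonic components.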

  2. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    PubMed Central

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Árpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques had rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between the two phases: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects of the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis. PMID:24312804
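
    The differential-partitioning picture has a simple quantitative counterpart: the commonly used MEKC retention factor corrects the ordinary chromatographic expression for the fact that the micellar pseudostationary phase itself migrates. A small sketch, where the marker times are invented example values rather than data from the paper:

```python
def mekc_retention_factor(t_r, t_0, t_mc):
    # MEKC retention factor: t_0 is the EOF (aqueous) marker migration
    # time, t_mc the micelle marker time, t_r the analyte migration time.
    # As t_mc -> infinity (a truly stationary phase) the expression
    # reduces to the ordinary chromatographic k = (t_r - t_0) / t_0.
    return (t_r - t_0) / (t_0 * (1.0 - t_r / t_mc))

# Invented example: EOF marker at 2.0 min, micelle marker at 10.0 min,
# neutral analyte eluting between the two at 5.0 min.
k = mekc_retention_factor(5.0, 2.0, 10.0)
```

    All neutral analytes elute inside the window between t_0 and t_mc, which is why the micelle migration time must appear in the expression.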

  3. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and for analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to a hundred microns in depth, is demonstrated. PMID:26479612

  4. Factor Analysis Applied to the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    We present a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  5. Differential Network Analysis Applied to Preoperative Breast Cancer Chemotherapy Response

    PubMed Central

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  6. Differential network analysis applied to preoperative breast cancer chemotherapy response.

    PubMed

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  7. Space elevator systems level analysis

    SciTech Connect

    Laubscher, B. E.

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development, and this in turn implies a high level of risk for the SE. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  8. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628

  9. Systems Analysis Sub site

    SciTech Connect

    EERE

    2012-03-16

    Systems analysis provides direction, focus, and support for the development and introduction of hydrogen production, storage, and end-use technologies, and provides a basis for recommendations on a balanced portfolio of activities.

  10. SUBSURFACE VISUAL ALARM SYSTEM ANALYSIS

    SciTech Connect

    D.W. Markman

    2001-08-06

    The ''Subsurface Fire Hazard Analysis'' (CRWMS M&O 1998, page 61) and the ''Title III Evaluation Report for the Surface and Subsurface Communication System'' (CRWMS M&O 1999a, pages 21 and 23) both indicate that the installed communication system is adequate to support Exploratory Studies Facility (ESF) activities, with the exception of the mine phone system for emergency notification purposes. They recommend the installation of a visual alarm system to supplement the page/party phone system. The purpose of this analysis is to identify data communication highway design approaches and provide justification for the selected or recommended alternatives for the data communication of the subsurface visual alarm system. This analysis is being prepared to document a basis for the design selection of the data communication method. It will briefly describe existing data, voice communication, and monitoring systems within the ESF, and look at how these may be revised or adapted to support the needed data highway of the subsurface visual alarm system. The existing PLC communication system installed in the subsurface provides data communication for the alcove No. 5 ventilation fans, south portal ventilation fans, bulkhead doors, and generator monitoring system. It is given that the data communication of the subsurface visual alarm system will be a digital system. It is also given that it is most feasible to take advantage of existing systems and equipment rather than consider an entirely new data communication system design and installation. The scope and primary objectives of this analysis are to: (1) Briefly review and describe the data communication highways or systems available within the ESF. (2) Examine the technical characteristics of existing systems; the ability to disqualify a design alternative is paramount in minimizing the number and depth of system reviews. (3) Apply general engineering design practices or criteria such as relative cost, and degree of

  11. Storage battery systems analysis

    SciTech Connect

    Murphy, K.D.

    1982-01-01

    Storage Battery Systems Analysis supports the battery Exploratory Technology Development and Testing Project with technical and economic analysis of battery systems in various end-use applications. Computer modeling and simulation techniques are used in the analyses. Analysis objectives are achieved through both in-house efforts and outside contracts. In-house studies during FY82 included a study of the relationship between storage battery system reliability and cost, through cost-of-investment and cost-of-service interruption inputs; revision and update of the SOLSTOR computer code in standard FORTRAN 77 form; parametric studies of residential stand-alone photovoltaic systems using the SOLSTOR code; simulation of wind turbine collector/storage battery systems for the community of Kalaupapa, Molokai, Hawaii.

  12. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and into tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Here, valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
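
    The interval estimate described above is typically obtained by root-sum-square propagation of the individual uncertainty components. A hedged sketch for a PV-style power measurement follows; the numbers and the simple P = V·I model are assumptions for illustration, not values from the presentation.

```python
import math

def rss_uncertainty(terms):
    # Root-sum-square (first-order Taylor) propagation: each term pairs a
    # sensitivity coefficient dP/dx_i with the standard uncertainty u_i of
    # that input; the combined standard uncertainty is the RSS of products.
    return math.sqrt(sum((dp * u) ** 2 for dp, u in terms))

# Power from voltage and current: P = V * I, so dP/dV = I and dP/dI = V.
V, I = 20.0, 5.0          # measured voltage (V) and current (A)
u_V, u_I = 0.1, 0.05      # standard uncertainties of the two measurements
u_P = rss_uncertainty([(I, u_V), (V, u_I)])
P = V * I                 # report as P within roughly +/- 2*u_P (~95 %)
```

    The pre-test analysis mentioned above amounts to running this propagation with the uncertainties you expect from your instruments, before any data are taken, to check whether the planned experiment can meet its stated objectives.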

  13. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  14. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.

  15. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
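
    The frequency-and-damping view of policy can be made concrete with a toy second-order model. Here the business cycle is represented, purely as an assumed analog in the spirit of the paper rather than its actual model, as a lightly damped oscillator, and a policy feedback on the level and rate of change shifts the closed-loop damping and frequency.

```python
import numpy as np

w, z = 1.0, 0.05   # assumed natural frequency and open-loop damping of the cycle

def closed_loop_modes(kp, kd):
    # Economy analog x'' + 2*z*w*x' + w^2*x = u with policy u = -kp*x - kd*x'.
    # In state-space form the eigenvalues of A give the closed-loop
    # natural frequency and damping ratio.
    A = np.array([[0.0, 1.0],
                  [-(w ** 2 + kp), -(2.0 * z * w + kd)]])
    pole = np.linalg.eigvals(A)[0]
    wn = np.abs(pole)                 # closed-loop natural frequency
    return wn, -pole.real / wn        # (frequency, damping ratio)

wn_open, zeta_open = closed_loop_modes(0.0, 0.0)   # no policy feedback
wn_pol, zeta_pol = closed_loop_modes(0.5, 0.4)     # stabilizing policy
```

    Raising kd (a policy that leans against the rate of change) mainly adds damping, while kp shifts the cycle frequency; with more damping, random disturbances produce smaller sustained fluctuations, which is the mechanism the paper examines for monetary policy.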

  16. Ariel Performance Analysis System

    NASA Astrophysics Data System (ADS)

    Ariel, Gideon B.; Penny, M. A.; Saar, Dany

    1990-08-01

    The Ariel Performance Analysis System is a computer-based system for the measurement, analysis and presentation of human performance. The system is based on a proprietary technique for processing multiple high-speed film and video recordings of a subject's performance. It is noninvasive, and does not require wires, sensors, markers or reflectors. In addition, it is portable and does not require modification of the performing environment. The scale and accuracy of measurement can be set to whatever levels are required by the activity being performed.

  17. CONVEYOR SYSTEM SAFETY ANALYSIS

    SciTech Connect

    M. Salem

    1995-06-23

    The purpose and objective of this analysis is to systematically identify and evaluate hazards related to the Yucca Mountain Project Exploratory Studies Facility (ESF) surface and subsurface conveyor system (for a list of conveyor subsystems see section 3). This process is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach was used since a radiological System Safety Analysis is not required. The risk assessment in this analysis characterizes the accident scenarios associated with the conveyor structures/systems/components in terms of relative risk and includes recommendations for mitigating all identified risks. The priority for recommending and implementing mitigation control features is: (1) incorporate measures to reduce risks and hazards into the structure/system/component (S/S/C) design, (2) add safety devices and capabilities to the designs that reduce risk, (3) provide devices that detect and warn personnel of hazardous conditions, and (4) develop procedures and conduct training to increase worker awareness of potential hazards, of methods to reduce exposure to hazards, and of the actions required to avoid accidents or correct hazardous conditions. The scope of this analysis is limited to the hazards related to the design of conveyor structures/systems/components (S/S/Cs) that occur during normal operation. Hazards occurring during assembly, test, and maintenance or "off-normal" operations have not been included in this analysis. Construction-related work activities are specifically excluded per DOE Order 5481.1B section 4.c.

  18. Optical methods of stress analysis applied to cracked components

    NASA Technical Reports Server (NTRS)

    Smith, C. W.

    1991-01-01

    After briefly describing the principles of frozen stress photoelastic and moire interferometric analyses, and the corresponding algorithms for converting optical data from each method into stress intensity factors (SIF), the methods are applied to the determination of crack shapes, SIF determination, crack closure displacement fields, and pre-crack damage mechanisms in typical aircraft component configurations.

  19. Applying Research: An Analysis of Texts for Consumers of Research.

    ERIC Educational Resources Information Center

    Erion, R. L.; Steinley, Gary

    The critical reading of research involves: (1) comprehension, (2) evaluation, and (3) application. A study examined six recently published textbooks to determine to what extent they attempt to help students learn to apply educational research; these texts were specifically designed for "consumers" of research (i.e., critical readers of research)…

  20. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…

  1. Identifying a cooperative control mechanism between an applied field and the environment of open quantum systems

    NASA Astrophysics Data System (ADS)

    Gao, Fang; Rey-de-Castro, Roberto; Wang, Yaoxiong; Rabitz, Herschel; Shuang, Feng

    2016-05-01

    Many systems under control with an applied field also interact with the surrounding environment. Understanding the control mechanisms has remained a challenge, especially the role played by the interaction between the field and the environment. In order to address this need, here we expand the scope of the Hamiltonian-encoding and observable-decoding (HE-OD) technique. HE-OD was originally introduced as a theoretical and experimental tool for revealing the mechanism induced by control fields in closed quantum systems. The results of open-system HE-OD analysis presented here provide quantitative mechanistic insights into the roles played by a Markovian environment. Two model open quantum systems are considered for illustration. In these systems, transitions are induced by either an applied field linked to a dipole operator or Lindblad operators coupled to the system. For modest control yields, the HE-OD results clearly show distinct cooperation between the dynamics induced by the optimal field and the environment. Although the HE-OD methodology introduced here is considered in simulations, it has an analogous direct experimental formulation, which we suggest may be applied to open systems in the laboratory to reveal mechanistic insights.
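    The Markovian environment the abstract refers to is the Lindblad setting. As a hedged illustration (this is generic master-equation arithmetic, not the HE-OD procedure itself), the sketch below Euler-integrates drho/dt = -i[H, rho] + gamma*(L rho L† - ½{L†L, rho}) for a two-level system with a single Lindblad operator L = sigma_minus modelling decay to the ground state; gamma, the level splitting, and the time step are invented.

```python
import numpy as np

# Hedged sketch: Euler integration of a Lindblad master equation for a
# two-level system coupled to a Markovian environment.  Parameters invented.
sm = np.array([[0, 1], [0, 0]], dtype=complex)      # sigma_minus = |g><e|
H0 = 0.5 * np.diag([-1.0, 1.0]).astype(complex)     # level splitting omega0 = 1

def lindblad_step(rho, H, L, gamma, dt):
    comm = -1j * (H @ rho - rho @ H)                # coherent part -i[H, rho]
    LdL = L.conj().T @ L
    diss = gamma * (L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL))
    return rho + dt * (comm + diss)

rho = np.diag([0.0, 1.0]).astype(complex)           # start fully excited
for _ in range(5000):                               # evolve to t = 5 with gamma = 1
    rho = lindblad_step(rho, H0, sm, gamma=1.0, dt=0.001)

excited_population = rho[1, 1].real                 # decays roughly as exp(-gamma*t)
```

    The dissipator is trace-preserving by construction, so the population leaves the excited state at rate gamma while tr(rho) stays at one; an applied field would enter through a time-dependent dipole term added to H.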

  2. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  3. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  4. Coal systems analysis

    SciTech Connect

    Warwick, P.D.

    2005-07-01

    This collection of papers provides an introduction to the concept of coal systems analysis and contains examples of how coal systems analysis can be used to understand, characterize, and evaluate coal and coal gas resources. Chapters are: Coal systems analysis: A new approach to the understanding of coal formation, coal quality and environmental considerations, and coal as a source rock for hydrocarbons by Peter D. Warwick. Appalachian coal assessment: Defining the coal systems of the Appalachian Basin by Robert C. Milici. Subtle structural influences on coal thickness and distribution: Examples from the Lower Broas-Stockton coal (Middle Pennsylvanian), Eastern Kentucky Coal Field, USA by Stephen F. Greb, Cortland F. Eble, and J.C. Hower. Palynology in coal systems analysis: The key to floras, climate, and stratigraphy of coal-forming environments by Douglas J. Nichols. A comparison of late Paleocene and late Eocene lignite depositional systems using palynology, upper Wilcox and upper Jackson Groups, east-central Texas by Jennifer M.K. O'Keefe, Recep H. Sancay, Anne L. Raymond, and Thomas E. Yancey. New insights on the hydrocarbon system of the Fruitland Formation coal beds, northern San Juan Basin, Colorado and New Mexico, USA by W.C. Riese, William L. Pelzmann, and Glen T. Snyder.

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
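    The core arithmetic of such an analysis is root-sum-square (quadrature) combination of the component uncertainties. As a hedged sketch (the quantity and the percentage values below are invented, not from the presentation), for a PV efficiency eta = P / (G * A) the relative combined standard uncertainty is the quadrature sum of the relative uncertainties of power, irradiance, and area:

```python
import math

# Hedged sketch: quadrature (root-sum-square) combination of relative
# standard uncertainties for an illustrative PV efficiency measurement.

def combined_relative_uncertainty(*rel_uncertainties_pct):
    """Root-sum-square combination of independent relative uncertainties, in %."""
    return math.sqrt(sum(u * u for u in rel_uncertainties_pct))

u_power, u_irradiance, u_area = 1.0, 2.0, 0.5      # assumed % standard uncertainties
u_eta = combined_relative_uncertainty(u_power, u_irradiance, u_area)
```

    Here u_eta is about 2.3%, dominated by the irradiance term, which is exactly the kind of insight a pre-test analysis uses to decide where measurement improvements pay off.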

  6. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  7. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  8. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  9. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris-transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait
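    Once events have been dated from the growth disturbances, a first-order recurrence statistic is simply the observation span divided by the number of inter-event gaps. The sketch below is illustrative only: the event years are invented, not the Pelayo River record.

```python
# Hedged sketch: mean recurrence interval from a list of dendro-dated flash
# flood years.  The years below are made up for illustration.

def mean_recurrence_interval(event_years):
    years = sorted(event_years)
    if len(years) < 2:
        raise ValueError("need at least two dated events")
    return (years[-1] - years[0]) / (len(years) - 1)

interval = mean_recurrence_interval([1930, 1944, 1958, 1965, 1981, 1997])
```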

  10. Seat belt usage: A potential target for applied behavior analysis

    PubMed Central

    Geller, E. Scott; Casali, John G.; Johnson, Richard P.

    1980-01-01

    Results of 1,579 observations of cars entering or exiting campus parking lots showed direct relationships between seat belt wearing and the intrusiveness of the engineering device designed to induce belt usage, and between device intrusiveness and system defeat. For example, all drivers with working interlocks or unlimited buzzer reminders were wearing a seat belt; but 62% of the systems with interlocks or unlimited buzzers had been defeated, and only 15.9% of the drivers in these cars were wearing a seat belt. The normative data indicated marked ineffectiveness of the negative reinforcement contingencies implied by current seat belt inducement systems; but suggested that unlimited buzzer systems would be the optimal system currently available if contingencies were developed to discourage the disconnection and circumvention of such systems. Positive reinforcement strategies are discussed that would be quite feasible for large-scale promotion of seat belt usage. PMID:16795638

  11. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  12. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  13. Biomedical systems analysis program

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Biomedical monitoring programs which were developed to provide a system analysis context for a unified hypothesis for adaptation to space flight are presented and discussed. A real-time system of data analysis and decision making to assure the greatest possible crew safety and mission success is described. Information about man's abilities, limitations, and characteristic reactions to weightless space flight was analyzed and simulation models were developed. The predictive capabilities of simulation models for fluid-electrolyte regulation, erythropoiesis regulation, and calcium regulation are discussed.

  14. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    PubMed

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology based on arrays of cross-sensitive chemical sensors and chemometric data processing has attracted a lot of researchers' attention in recent years. Several applications reported so far dealing with pharmaceutical tasks employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrine, saccharin sodium and citric acid in various combinations. To provide for a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongues' responses to the samples. These were principal component analysis (PCA), RV' matrix correlation coefficients, and Tucker's congruency coefficients. PMID:26099261
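    The PCA step the abstract mentions can be sketched as follows. This is a generic SVD-based PCA, with each row one sample and each column one sensor channel; the 6-sample, 4-sensor matrix is synthetic, not data from the study.

```python
import numpy as np

# Hedged sketch: mean-centred PCA via SVD, as used to compare multivariate
# e-tongue responses.  The data matrix below is invented.

def pca_scores(X, n_components=2):
    Xc = X - X.mean(axis=0)                   # centre each sensor channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, S        # sample scores and singular values

X = np.array([[1.0, 2.1, 0.9, 3.0],
              [1.1, 2.0, 1.0, 3.1],
              [4.0, 0.5, 3.9, 0.4],
              [4.1, 0.6, 4.0, 0.5],
              [2.5, 1.2, 2.4, 1.8],
              [2.6, 1.3, 2.5, 1.7]])
scores, singular_values = pca_scores(X)
```

    In a cross-device comparison, score plots like these from each instrument would then be compared with RV'/congruency coefficients to quantify how similarly the devices resolve the same formulations.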

  15. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board for a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 at six locations. Ferreira et al. (2006) state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.
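    The arithmetic behind AMMI is compact: fit the additive part (grand mean plus genotype and environment main effects), then apply an SVD to the residual genotype-by-environment interaction. The sketch below uses an invented 4-genotype, 3-environment yield table, far smaller than the 22-genotype oat trial, and plain NumPy rather than the agricolae package.

```python
import numpy as np

# Hedged sketch of the AMMI decomposition on an invented yield table
# (rows = genotypes, columns = environments).

Y = np.array([[5.1, 6.0, 4.8],
              [4.9, 6.4, 4.2],
              [5.6, 5.9, 5.5],
              [4.4, 6.1, 4.0]])

mu = Y.mean()
g = Y.mean(axis=1) - mu                   # genotype main effects
e = Y.mean(axis=0) - mu                   # environment main effects
interaction = Y - mu - g[:, None] - e[None, :]

U, S, Vt = np.linalg.svd(interaction, full_matrices=False)
# AMMI1 keeps only the first multiplicative term of the interaction
ammi1 = mu + g[:, None] + e[None, :] + S[0] * np.outer(U[:, 0], Vt[0])
```

    A genotype with small loadings on the leading interaction axes is the stable one in the AMMI sense: its yield is explained almost entirely by the additive main effects.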

  16. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which then can be used to calculate the corrections necessary to improve coupling, dynamic aperture and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All necessary tools to make the analysis method useable at PEP-II have been implemented and LOCO can now be used as a routine tool for lattice diagnostic.

  17. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
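    A minimal form of such a colour log can be sketched as below. This is an invented illustration, not the Ib-4 workflow: each scan line of the core image is reduced to a mean (R, G, B), and a red/green ratio traced down-core distinguishes reddish brown from brownish red and reveals gradual transitions as smooth trends rather than steps.

```python
# Hedged sketch: per-scan-line red/green ratio log for a scanned core image.
# The (R, G, B) means below are invented, not core Ib-4 data.

def red_green_ratio_log(scanlines):
    """Mean R/G ratio per scan line; rising values indicate redder intervals."""
    return [r / g for (r, g, b) in scanlines]

scanlines = [(120, 80, 60), (125, 80, 60), (135, 78, 58), (150, 75, 55)]
log = red_green_ratio_log(scanlines)
```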

  18. Integrated analysis environment for high impact systems

    SciTech Connect

    Martinez, M.; Davis, J.; Scott, J.; Sztipanovits, J.; Karsai, G.

    1998-02-01

    Modeling and analysis of high consequence, high assurance systems requires special modeling considerations. System safety and reliability information must be captured in the models. Previously, high consequence systems were modeled using separate, disjoint models for safety, reliability, and security. The MultiGraph Architecture facilitates the implementation of a model integrated system for modeling and analysis of high assurance systems. Model integrated computing allows an integrated modeling technique to be applied to high consequence systems. Among the tools used for analyzing safety and reliability are a behavioral simulator and an automatic fault tree generation and analysis tool. Symbolic model checking techniques are used to efficiently investigate the system models. A method for converting finite state machine models to ordered binary decision diagrams allows the application of symbolic model checking routines to the integrated system models. This integrated approach to modeling and analysis of high consequence systems ensures consistency between the models and the different analysis tools.

  19. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is composed almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  20. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  1. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  2. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  3. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  4. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  5. Applied Bibliometrics: Using Citation Analysis in the Journal Submission Process.

    ERIC Educational Resources Information Center

    Robinson, Michael D.

    1991-01-01

    Discusses the use of citation analysis as an effective tool for scholars to determine what journals would be appropriate for publication of their work. Calculating citation distance is explained, and a study with economics journals is described that computed citation distance between previously published articles and journals in the field. (12…
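    One plausible reading of "citation distance" (the article's exact formula is not given here, so this is an assumption) is a cosine distance between journals' citation profiles, i.e. vectors counting how often each journal cites a common set of outlets. A manuscript's references would then be matched to the nearest journals:

```python
import math

# Hedged sketch: cosine distance between citation-count vectors as a stand-in
# for the article's citation-distance measure.  All profiles are invented.

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

econ_journal = [40, 25, 5, 0]     # citations to outlets A, B, C, D
close_match = [38, 30, 4, 1]
distant_match = [2, 1, 30, 45]
```

    A small distance marks a journal whose citation behavior resembles the manuscript's reference list, and thus a plausible submission target.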

  6. Applying MORT maintenance safety analysis in Finnish industry

    NASA Astrophysics Data System (ADS)

    Ruuhilehto, Kaarin; Virolainen, Kimmo

    1992-02-01

    A safety analysis method based on the MORT (Management Oversight and Risk Tree) method, especially on the version developed for safety considerations in the evaluation of maintenance programs, is presented. The MORT maintenance safety analysis is intended especially for use in maintenance safety management. The analysis helps managers evaluate the goals of their safety work and the measures taken to reach them. The analysis is done by a team or teams. The team ought to have expert knowledge of the organization, both vertically and horizontally, in order to be able to identify factors that may contribute to accidents or other interruptions in the maintenance work. Identification is made by using the MORT maintenance key question set as a checklist. The questions check the way safety matters are connected with maintenance planning and management, as well as with safety management itself. In the second stage, means to eliminate the factors causing problems are developed. New practices are established to improve the safety of maintenance planning and management in the enterprise.

  7. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess the practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles emphasizing higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, which has interesting implications for training/behavior change to improve immunization rates, along with traditional medical interventions. PMID:15921143

  8. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133
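    The acuity figures in the abstract relate through the standard logMAR-to-Snellen conversion: a logMAR value v corresponds to a Snellen fraction of 20 / (20 * 10**v), so the reported mean of 0.2 logMAR is approximately 20/32. A one-line sketch of that arithmetic:

```python
# Hedged sketch of the standard logMAR-to-Snellen conversion used in the
# abstract (0.2 logMAR ~ 20/32).

def logmar_to_snellen_denominator(logmar):
    return 20 * 10 ** logmar

denom = logmar_to_snellen_denominator(0.2)   # ~31.7, i.e. about 20/32
```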

  9. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    EPA Science Inventory

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  10. Local structural excitations in model glass systems under applied load

    NASA Astrophysics Data System (ADS)

    Swayamjyoti, S.; Löffler, J. F.; Derlet, P. M.

    2016-04-01

    The potential-energy landscape of a model binary Lennard-Jones structural glass is investigated as a function of applied external strain, in terms of how local structural excitations (LSEs) respond to the load. Using the activation relaxation technique and nudged elastic band methods, the evolving structure and barrier energy of such LSEs are studied in detail. For the case of tensile/compressive strain, the LSE barrier energies generally decrease/increase, whereas under pure shear they may either increase or decrease, resulting in a broadening of the barrier energy distribution. It is found that how a particular LSE responds to an applied strain is strongly controlled by the LSE's far-field internal stress signature prior to loading.

  11. Object-oriented fault tree models applied to system diagnosis

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, F. A.

    1990-01-01

    When a diagnosis system is used in a dynamic environment, such as the distributed computer system planned for use on Space Station Freedom, it must execute quickly and its knowledge base must be easily updated. Representing system knowledge as object-oriented augmented fault trees provides both features. The diagnosis system described here is based on the failure cause identification process of the diagnostic system described by Narayanan and Viswanadham. Their system has been enhanced in this implementation by replacing the knowledge base of if-then rules with an object-oriented fault tree representation. This allows the system to perform its task much faster and facilitates dynamic updating of the knowledge base in a changing diagnosis environment. Accessing the information contained in the objects is more efficient than performing a lookup operation on an indexed rule base. Additionally, the object-oriented fault trees can be easily updated to represent current system status. This paper describes the fault tree representation, the diagnosis algorithm extensions, and an example application of this system. Comparisons are made between the object-oriented fault tree knowledge structure solution and one implementation of a rule-based solution. Plans for future work on this system are also discussed.
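
    The object-oriented fault tree representation described above can be sketched compactly. The classes and `diagnose` traversal below are a hypothetical Python illustration of the idea (traverse only active branches, read state directly from objects), not the NASA implementation:

```python
class Node:
    """Base class for fault tree nodes."""
    def __init__(self, name):
        self.name = name

class BasicEvent(Node):
    """Leaf node: a primary failure that is either present or not."""
    def __init__(self, name, failed=False):
        super().__init__(name)
        self.failed = failed

    def occurs(self):
        return self.failed

class Gate(Node):
    """Internal node combining child events with AND/OR logic."""
    def __init__(self, name, kind, children):
        super().__init__(name)
        self.kind = kind            # "AND" or "OR"
        self.children = children

    def occurs(self):
        results = [child.occurs() for child in self.children]
        return all(results) if self.kind == "AND" else any(results)

def diagnose(top):
    """Walk the tree and collect basic events contributing to the top event."""
    if not top.occurs():
        return []
    causes, stack = [], [top]
    while stack:
        node = stack.pop()
        if isinstance(node, BasicEvent):
            if node.failed:
                causes.append(node.name)
        elif node.occurs():         # descend only into active branches
            stack.extend(node.children)
    return causes

# Hypothetical example tree
pump = BasicEvent("pump_failure", failed=True)
valve = BasicEvent("valve_stuck")
power = BasicEvent("power_loss", failed=True)
cooling = Gate("cooling_lost", "OR", [pump, valve])
shutdown = Gate("system_down", "AND", [cooling, power])
print(sorted(diagnose(shutdown)))   # ['power_loss', 'pump_failure']
```

    Because each gate caches its structure in objects rather than in an indexed rule base, updating the tree for a changed system configuration is a matter of mutating or replacing nodes, which is the efficiency argument the abstract makes.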

  12. Nested sampling applied in Bayesian room-acoustics decay analysis.

    PubMed

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
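
    The core of nested sampling is a short loop. The sketch below is a minimal toy version, assuming a 1-D Gaussian likelihood and rejection sampling from a uniform prior, which illustrates evidence evaluation but is far simpler than a room-acoustics decay model:

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, prior_draw, n_live=100, n_iter=500):
    """Minimal nested sampling loop estimating the log-evidence log Z.

    New live points are found by rejection sampling from the prior,
    which is only workable for easy toy problems like the one below.
    """
    live = [prior_draw() for _ in range(n_live)]
    live_ll = [loglike(x) for x in live]
    logz, logx = -math.inf, 0.0            # logx: log of remaining prior volume
    for _ in range(n_iter):
        worst = min(range(n_live), key=lambda k: live_ll[k])
        threshold = live_ll[worst]
        logw = logx + math.log(1.0 - math.exp(-1.0 / n_live))  # shell width
        logz = logaddexp(logz, logw + threshold)
        logx -= 1.0 / n_live               # expected shrinkage per iteration
        while True:                        # replace worst point above threshold
            x = prior_draw()
            if loglike(x) > threshold:
                live[worst], live_ll[worst] = x, loglike(x)
                break
    for ll in live_ll:                     # remaining live points fill leftover volume
        logz = logaddexp(logz, logx - math.log(n_live) + ll)
    return logz

# Toy problem: uniform prior on [0, 1], Gaussian likelihood bump at 0.5.
# Analytically Z = integral of L over the prior ~ 0.1 * sqrt(2*pi) ~ 0.251.
random.seed(0)
logz = nested_sampling(lambda x: -(x - 0.5) ** 2 / (2 * 0.1 ** 2), random.random)
```

    Running the same loop under two competing decay models and comparing the resulting evidences is the model-selection level of inference the abstract refers to; the parameter estimates fall out of the discarded points as a by-product.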

  13. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  14. LAMQS analysis applied to ancient Egyptian bronze coins

    NASA Astrophysics Data System (ADS)

    Torrisi, L.; Caridi, F.; Giuffrida, L.; Torrisi, A.; Mondio, G.; Serafino, T.; Caltabiano, M.; Castrizio, E. D.; Paniz, E.; Salici, A.

    2010-05-01

    Some Egyptian bronze coins, dated to the 6th-7th century A.D., are analyzed using different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations were performed using micro-invasive analyses, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Scanning Electron (SEM) and Optical Microscopy, Surface Profile Analysis (SPA), and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences are evident in the constituents of the patina, the bulk alloy composition, the isotopic ratios, the density, and the surface morphology. The results agree with archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  15. Applying hydraulic transient analysis: The Grizzly Hydro Project

    SciTech Connect

    Logan, T.H.; Stutsman, R.D.

    1992-04-01

    No matter the size of the hydro plant, if it has a long waterway and will operate in peaking mode, the project designer needs to address the issue of hydraulic transients, known as water hammer, early in the design. This article describes the application of transient analysis to the design of a 20-MW hydro plant in California. In this case, a Howell-Bunger valve was used as a pressure-regulating valve to control transient pressures and speed rise.

  16. Arctic Climate Systems Analysis

    SciTech Connect

    Ivey, Mark D.; Robinson, David G.; Boslough, Mark B.; Backus, George A.; Peterson, Kara J.; van Bloemen Waanders, Bart G.; Swiler, Laura Painton; Desilets, Darin Maurice; Reinert, Rhonda Karen

    2015-03-01

    This study began with a challenge from program area managers at Sandia National Laboratories to technical staff in the energy, climate, and infrastructure security areas: apply a systems-level perspective to existing science and technology program areas in order to determine technology gaps, identify new technical capabilities at Sandia that could be applied to these areas, and identify opportunities for innovation. The Arctic was selected as one of these areas for systems level analyses, and this report documents the results. In this study, an emphasis was placed on the arctic atmosphere since Sandia has been active in atmospheric research in the Arctic since 1997. This study begins with a discussion of the challenges and benefits of analyzing the Arctic as a system. It goes on to discuss current and future needs of the defense, scientific, energy, and intelligence communities for more comprehensive data products related to the Arctic; assess the current state of atmospheric measurement resources available for the Arctic; and explain how the capabilities at Sandia National Laboratories can be used to address the identified technological, data, and modeling needs of the defense, scientific, energy, and intelligence communities for Arctic support.

  17. Applying thiouracil (TU)-tagging for mouse transcriptome analysis

    PubMed Central

    Gay, Leslie; Karfilis, Kate V.; Miller, Michael R.; Doe, Chris Q.; Stankunas, Kryn

    2014-01-01

    Transcriptional profiling is a powerful approach to study mouse development, physiology, and disease models. Here, we describe a protocol for mouse thiouracil-tagging (TU-tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification, and analysis of cell type-specific RNA. TU-tagging enables 1) the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, and 2) the identification of actively transcribed RNAs and not pre-existing transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on purification of tagged ribosomes or nuclei, TU-tagging provides a direct examination of transcriptional regulation. We describe how to: 1) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, 2) purify TU-tagged RNA and prepare libraries for Illumina sequencing, and 3) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in one day, RNA-Seq libraries generated within two days, and, following sequencing, an initial bioinformatics analysis completed in one additional day. PMID:24457332

  18. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
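
    The two network types, (i) investment ties and (ii) syndication ties, can be sketched in plain Python. The data and functions below are a hypothetical miniature, not the Chinese venture-capital dataset:

```python
from collections import defaultdict

# Hypothetical time-stamped investment ties: (year, vc_firm, portfolio_company)
events = [
    (2010, "VC_A", "Co_1"), (2010, "VC_B", "Co_1"),
    (2011, "VC_A", "Co_2"), (2011, "VC_C", "Co_2"),
    (2012, "VC_A", "Co_3"), (2012, "VC_B", "Co_3"), (2012, "VC_C", "Co_3"),
]

def snapshots(events):
    """Cumulative yearly snapshots of the bipartite VC-company network."""
    by_year = defaultdict(set)
    for year, vc, co in events:
        by_year[year].add((vc, co))
    edges, snaps = set(), {}
    for year in sorted(by_year):
        edges |= by_year[year]
        snaps[year] = set(edges)
    return snaps

def degree(edges, node):
    """Number of ties touching `node` in one snapshot."""
    return sum(1 for edge in edges if node in edge)

def syndication(edges):
    """Project co-investment (syndication) ties between VC firms."""
    by_co = defaultdict(set)
    for vc, co in edges:
        by_co[co].add(vc)
    partners = defaultdict(set)
    for vcs in by_co.values():
        for vc in vcs:
            partners[vc] |= vcs - {vc}
    return partners

snaps = snapshots(events)
growth = [degree(snaps[y], "VC_A") for y in sorted(snaps)]  # centrality over time
```

    Tracking `degree` (or any other centrality) across the ordered snapshots is the simplest form of the time-evolving analysis the abstract describes; the syndication projection yields the investor-to-investor network studied in (ii).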

  19. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.

  20. STAIRS: A Storage and Retrieval System Applied in Online Cataloging.

    ERIC Educational Resources Information Center

    Poor, William

    1982-01-01

    Describes the use of IBM's Storage and Information Retrieval System (STAIRS) in the development of an online catalog for the Business and Technical Library of the Cummins Engine Company. The functions, advantages, and disadvantages of the system are outlined. A reference list and three sample searches are attached. (JL)

  1. Intelligent monitoring system applied to super long distance telerobotic tasks

    NASA Technical Reports Server (NTRS)

    Wakita, Yujin; Hirai, Shigeoki; Machida, Kazuo

    1994-01-01

    Time delay and limited communication capacity are the primary constraints in super-long-distance telerobotic systems such as astronautical robotic tasks. Intelligent telerobotics is expected to overcome these constraints. We aim to realize such a super-long-distance telerobotic system with an object-handling knowledge base and intelligent monitoring, and we discuss the physical and technical factors involved.

  2. Systems Biology Applied to Heart Failure With Normal Ejection Fraction

    PubMed Central

    Mesquita, Evandro Tinoco; Jorge, Antonio Jose Lagoeiro; de Souza, Celso Vale; Cassino, João Paulo Pedroza

    2014-01-01

    Heart failure with normal ejection fraction (HFNEF) is currently the most prevalent clinical phenotype of heart failure. However, the treatments available have shown no reduction in mortality so far. Advances in the omics sciences and techniques of high data processing used in molecular biology have enabled the development of an integrating approach to HFNEF based on systems biology. This study aimed at presenting a systems-biology-based HFNEF model using the bottom-up and top-down approaches. A literature search was conducted for studies published between 1991 and 2013 regarding HFNEF pathophysiology, its biomarkers and systems biology. A conceptual model was developed using bottom-up and top-down approaches of systems biology. The use of systems-biology approaches for HFNEF, a complex clinical syndrome, can be useful to better understand its pathophysiology and to discover new therapeutic targets. PMID:24918915

  3. Neptune Aerocapture Systems Analysis

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae

    2004-01-01

    A Neptune Aerocapture Systems Analysis is completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The high fidelity systems analysis is completed by a five center NASA team and includes the following disciplines and analyses: science; mission design; aeroshell configuration screening and definition; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and database definition; initial stability analyses; guidance development; atmospheric flight simulation; computational fluid dynamics and radiation analyses for aeroheating environment definition; thermal protection system design, concepts and sizing; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle. In addition, aerocapture results in a 3-4 year reduction in trip time compared to all-propulsive systems. Aerocapture is feasible and performance is adequate for the Neptune aerocapture mission. Monte Carlo simulation results show 100% successful capture for all cases including conservative assumptions on atmosphere and navigation. Enabling technologies for this mission include TPS manufacturing and aerothermodynamic methods and validation for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads, and the effects on surface recession.

  4. Multivariate calibration applied to the quantitative analysis of infrared spectra

    NASA Astrophysics Data System (ADS)

    Haaland, David M.

    1992-03-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in- situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the noninvasive determination of glucose levels in diabetics is an ultimate goal of this research.
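
    One of the two factor-analysis methods named above, principal component regression, can be sketched in a few lines of NumPy. The synthetic two-band "spectra" below are illustrative only, not the paper's phosphosilicate-glass or blood data:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: regress y on the leading PCA scores of X."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                        # spectral loadings
    b = np.linalg.lstsq(Xc @ V, y - y_mean, rcond=None)[0]
    return V @ b, x_mean, y_mean                   # regression vector in wavelength space

def pcr_predict(X, coef, x_mean, y_mean):
    return (X - x_mean) @ coef + y_mean

# Synthetic "spectra": two overlapping Gaussian bands mixed by concentration
rng = np.random.default_rng(1)
wl = np.linspace(0.0, 1.0, 100)
bands = np.vstack([np.exp(-(wl - 0.30) ** 2 / 0.005),   # analyte band
                   np.exp(-(wl - 0.45) ** 2 / 0.005)])  # interfering band
conc = rng.uniform(0.0, 1.0, size=(40, 2))
spectra = conc @ bands + 0.005 * rng.standard_normal((40, 100))

coef, xm, ym = pcr_fit(spectra, conc[:, 0], n_components=2)
rmse = np.sqrt(np.mean((pcr_predict(spectra, coef, xm, ym) - conc[:, 0]) ** 2))
```

    PLS differs from this sketch in that its factors are chosen to maximize covariance with the concentrations rather than variance of the spectra alone, which is why it often needs fewer factors in practice.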

  5. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  6. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and are optical character recognized (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, the document is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.

  7. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The on-going AMMON program in the EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC, which leads to a value of 289 pcm (1σ). The nuclear data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 ²⁷Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a k_eff uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the Representativity method, which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Beginning Of Life (BOL) JHR reactivity calculated by HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1σ). (authors)
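
    At its core, propagating a nuclear-data covariance matrix through sensitivities reduces to the standard "sandwich rule", var(k)/k² = SᵀMS, with S the vector of relative sensitivities and M their relative covariance matrix. The numbers below are made up for illustration, not the JHR sensitivities or the JEFF 3.1.1 covariances:

```python
import numpy as np

# Sandwich rule: relative variance of k_eff from nuclear-data uncertainties.
# S holds relative sensitivities (dk/k per relative change in each parameter);
# M is their relative covariance matrix. Illustrative values only.
S = np.array([0.30, -0.12, 0.05])
M = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])
var = S @ M @ S
uncertainty_pcm = 1e5 * np.sqrt(var)   # 1-sigma reactivity uncertainty in pcm
print(f"{uncertainty_pcm:.0f} pcm")
```

    The off-diagonal terms of M matter: correlated cross-section uncertainties can either reinforce or partially cancel, which is why a realistic covariance matrix (here obtained by retroactive marginalization) is the key input.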

  8. EG&G Mound Applied Technologies payroll system

    SciTech Connect

    Not Available

    1992-02-07

    EG&G Mound Applied Technologies, Inc., manages and operates the Mound Facility, Miamisburg, Ohio, under a cost-plus-award-fee contract administered by the Department of Energy's Albuquerque Field Office. The contractor's Payroll Department is responsible for prompt payment in the proper amount to all persons entitled to be paid, in compliance with applicable laws, regulations, and legal decisions. The objective was to determine whether controls were in place to avoid erroneous payroll payments. EG&G Mound Applied Technologies, Inc., did not have all the internal controls required by General Accounting Office Title 6, "Pay, Leave, and Allowances." Specifically, it did not have computerized edits, separation of duties and responsibilities, or restricted access to payroll data files. This condition occurred because its managers were not aware of the Title 6 requirements. As a result, the contractor could not assure the Department of Energy that payroll costs were processed accurately, and fraud, waste, or abuse of Department of Energy funds could go undetected. Our sample of 212 payroll transactions from a population of 66,000 in FY 1991 disclosed only two minor processing errors and no instances of fraud, waste, or abuse.

  9. Aircraft Electric Propulsion Systems Applied Research at NASA

    NASA Technical Reports Server (NTRS)

    Clarke, Sean

    2015-01-01

    Researchers at NASA are investigating the potential for electric propulsion systems to revolutionize the design of aircraft from the small-scale general aviation sector to commuter and transport-class vehicles. Electric propulsion provides new degrees of design freedom that may enable opportunities for tightly coupled design and optimization of the propulsion system with the aircraft structure and control systems. This could lead to extraordinary reductions in ownership and operating costs, greenhouse gas emissions, and noise annoyance levels. We are building testbeds, high-fidelity aircraft simulations, and the first highly distributed electric inhabited flight test vehicle to begin to explore these opportunities.

  10. Applying New Network Security Technologies to SCADA Systems.

    SciTech Connect

    Hurd, Steven A.; Stamp, Jason E.; Duggan, David P.; Chavez, Adrian R.

    2006-11-01

    Supervisory Control and Data Acquisition (SCADA) systems for automation are very important for critical infrastructure and manufacturing operations. They have been implemented to work in a number of physical environments using a variety of hardware, software, networking protocols, and communications technologies, often before security issues became of paramount concern. To offer solutions to security shortcomings in the short/medium term, this project set out to identify technologies used to secure "traditional" IT networks and systems, and then assess their efficacy with respect to SCADA systems. The proposed solutions must be relatively simple to implement, reliable, and acceptable to SCADA owners and operators.

  11. Naming, the formation of stimulus classes, and applied behavior analysis.

    PubMed Central

    Stromer, R; Mackay, H A; Remington, B

    1996-01-01

    The methods used in Sidman's original studies on equivalence classes provide a framework for analyzing functional verbal behavior. Sidman and others have shown how teaching receptive, name-referent matching may produce rudimentary oral reading and word comprehension skills. Eikeseth and Smith (1992) have extended these findings by showing that children with autism may acquire equivalence classes after learning to supply a common oral name to each stimulus in a potential class. A stimulus class analysis suggests ways to examine (a) the problem of programming generalization from teaching situations to other environments, (b) the expansion of the repertoires that occur in those settings, and (c) the use of naming to facilitate these forms of generalization. Such research will help to clarify and extend Horne and Lowe's recent (1996) account of the role of verbal behavior in the formation of stimulus classes. PMID:8810064

  12. Detailed analysis of POD method applied on turbulent flow

    NASA Astrophysics Data System (ADS)

    Kellnerova, Radka; Kukacka, Libor; Uruba, Vaclav; Jurcakova, Klara; Janour, Zbynek

    2012-04-01

    Proper orthogonal decomposition (POD) of a highly turbulent flow inside a street canyon is performed. The energy contribution of each mode is obtained, and the physical meaning of the POD results is clarified: particular POD modes are assigned to particular flow events, such as a sweep event, a vortex behind a roof, or a vortex at the bottom of the street. The sensitivity of POD to the acquisition time of the data records is tested, as is the effect of decreasing the sampling frequency. Further, the POD expansion coefficients are interpolated in order to test a possible increase in sampling frequency and to extract new information about the flow from the POD analysis. Both linear and spline interpolation were tested; the linear interpolation gave slightly better results.
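
    Snapshot POD amounts to a singular value decomposition of the mean-subtracted snapshot matrix, with the squared singular values giving each mode's energy contribution. The NumPy sketch below uses synthetic data, not the street-canyon measurements:

```python
import numpy as np

def pod(snapshots):
    """POD via thin SVD of the mean-subtracted snapshot matrix.

    snapshots: (n_points, n_snapshots), one flow-field snapshot per column.
    Returns spatial modes, per-mode energy fractions, and expansion coefficients.
    """
    fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
    energy = s**2 / np.sum(s**2)      # energy fraction of each mode
    coeffs = np.diag(s) @ Vt          # temporal expansion coefficients a_k(t)
    return U, energy, coeffs

# Synthetic flow: two orthogonal "events" of different strength, plus noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = np.linspace(0.0, 1.0, 64)
field = (np.outer(np.sin(2 * np.pi * x), np.cos(t))
         + 0.3 * np.outer(np.sin(4 * np.pi * x), np.sin(3 * t))
         + 0.01 * rng.standard_normal((64, 200)))
modes, energy, coeffs = pod(field)
```

    Interpolating the rows of `coeffs` in time, as the abstract describes, is what allows the reconstruction to be evaluated between the original sampling instants.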

  13. Biophotogrammetry model of respiratory motion analysis applied to children.

    PubMed

    Ripka, W L; Ricieri, D da V; Ulbricht, L; Neves, E B; Stadnik, A M W; Romaneli, E F R

    2012-01-01

    This study aimed to test a measurement protocol based on Biophotogrammetry for Analysis of Respiratory Mechanics (BARM) in healthy children. Seventeen children with normal spirometry (six male and 11 female) were tested. They performed forced inspiratory vital capacity maneuvers, which were recorded in the supine position. The images were acquired by a digital camera placed lateral to the trunk. Surface markers allowed the files, exported to CorelDraw® software, to be processed as irregular trapezoidal paths. Compartments were defined for the thorax (TX), the abdomen (AB), and the chest wall (CW), at the end of an inspiration and of an expiration, both maximal and controlled by a digital spirometer. The results showed that the areas measured at the inspiratory and expiratory periods were statistically different (p<0.05), reflecting the mobility of the CW and its compartments. In conclusion, the proposed method can identify the breathing pattern of the measured subject using two-dimensional (2D) images. PMID:23366409

  14. Applying GIS technology to the Regional Information Sharing Systems database

    NASA Astrophysics Data System (ADS)

    Aumond, Karen L.

    1997-02-01

    The Regional Information Sharing Systems (RISS) program was formed as a partnership for information exchange between the federal government and state and local law enforcement. The six regional projects provide member law enforcement agencies in all 50 states with a broad range of intelligence and investigative support services. Recently, the existing RISS databases were redesigned to allow for connectivity among projects and the capability of a nationwide search of over 450,000 suspects. This relational database of intelligence information, along with a photographic imaging system, an operational 'critical event' database, and GIS mapping, are integrated components of RISSNET. The Geographical-Regional Information Sharing System (G-RISS) application is being prototyped by Graphic Data Systems Corporation at one RISS site, the Western States Information Network in Sacramento, California. G-RISS is a tool that will combine information from various law enforcement resources, map criminal activities to detect trends, and assist agencies in being proactive to combat these activities.

  15. Applied estimation for hybrid dynamical systems using perceptional information

    NASA Astrophysics Data System (ADS)

    Plotnik, Aaron M.

    This dissertation uses the motivating example of robotic tracking of mobile deep ocean animals to present innovations in robotic perception and estimation for hybrid dynamical systems. An approach to estimation for hybrid systems is presented that utilizes uncertain perceptional information about the system's mode to improve tracking of its mode and continuous states. This results in significant improvements in situations where previously reported methods of estimation for hybrid systems perform poorly due to poor distinguishability of the modes. The specific application that motivates this research is an automatic underwater robotic observation system that follows and films individual deep ocean animals. A first version of such a system has been developed jointly by the Stanford Aerospace Robotics Laboratory and Monterey Bay Aquarium Research Institute (MBARI). This robotic observation system is successfully fielded on MBARI's ROVs, but agile specimens often evade the system. When a human ROV pilot performs this task, one advantage that he has over the robotic observation system in these situations is the ability to use visual perceptional information about the target, immediately recognizing any changes in the specimen's behavior mode. With the approach of the human pilot in mind, a new version of the robotic observation system is proposed which is extended to (a) derive perceptional information (visual cues) about the behavior mode of the tracked specimen, and (b) merge this dissimilar, discrete and uncertain information with more traditional continuous noisy sensor data by extending existing algorithms for hybrid estimation. These performance enhancements are enabled by integrating techniques in hybrid estimation, computer vision and machine learning. First, real-time computer vision and classification algorithms extract a visual observation of the target's behavior mode. Existing hybrid estimation algorithms are extended to admit this uncertain but discrete

  16. [Theoretic and applicative aspects of applying of formulary system in military medicine].

    PubMed

    Belevitin, A E; Miroshnichenko, Iu V; Goriachev, A B; Bunin, S A; Krasavin, K D

    2010-08-01

    Development of medication support in military medicine can be achieved only through introduction of the formulary system. This system forms the informational and methodological basis for reaching a socially necessary level of drug use. On the basis of medical standards and analysis of morbidity, a formulary of pharmaceuticals has been worked out that can reduce the nomenclature of drugs in use and improve the effectiveness of medication support. The medical service of the Armed Forces of the Russian Federation has experience in developing formularies, but it is too early to speak of the formulary system as part of the routine of military medicine. Developing medication support in military medicine on the basis of the formulary system will help satisfy the medical and social needs of servicemen, military retirees, and members of their families. PMID:21089425

  17. System configured for applying multiple modifying agents to a substrate

    DOEpatents

    Propp, W. Alan; Argyle, Mark D.; Janikowski, Stuart K.; Fox, Robert V.; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Miller, David L.

    2003-11-25

    The present invention is related to the modifying of substrates with multiple modifying agents in a single continuous system. At least two processing chambers are configured for modifying the substrate in a continuous feed system. The processing chambers can be substantially isolated from one another by interstitial seals. Additionally, the two processing chambers can be substantially isolated from the surrounding atmosphere by end seals. Optionally, expansion chambers can be used to separate the seals from the processing chambers.

  18. System Configured For Applying Multiple Modifying Agents To A Substrate.

    DOEpatents

    Propp, W. Alan; Argyle, Mark D.; Janikowski, Stuart K.; Fox, Robert V.; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Miller, David L.

    2005-11-08

    The present invention is related to the modifying of substrates with multiple modifying agents in a single continuous system. At least two processing chambers are configured for modifying the substrate in a continuous feed system. The processing chambers can be substantially isolated from one another by interstitial seals. Additionally, the two processing chambers can be substantially isolated from the surrounding atmosphere by end seals. Optionally, expansion chambers can be used to separate the seals from the processing chambers.

  19. Network systems security analysis

    NASA Astrophysics Data System (ADS)

    Yilmaz, İsmail

    2015-05-01

    Network systems security analysis is of utmost importance today. Many companies that give priority to data management, such as banks, test their data security systems with penetration tests from time to time. In this context, companies must also test their network and server systems and take precautions, as data security draws increasing attention. Based on this idea, this study thoroughly researches cyber-attacks and examines penetration-testing techniques. With this information, cyber-attacks are classified and the security of network systems is then tested systematically. After the testing period, all data are reported and filed for future reference. The study finds that human beings are the weakest link in the chain and that simple mistakes can unintentionally cause huge problems. It is therefore clear that precautions, such as keeping security software up to date, must be taken to avoid such threats.

  20. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question about the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedural control and data flow, and it has been extended commercially to inter-procedural flow. We extend slicing further, to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
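
    A minimal sketch of the slicing question the tool answers ("if this variable changes, what else changes, and along what path?") can be written as a reachability search over a dependency graph. The entities and edges below are hypothetical; a real multi-domain model would mix program variables with non-program entities such as documents or hardware signals.

```python
from collections import deque

def forward_slice(deps, start):
    """deps[v] = entities whose value depends directly on v.
    Returns {affected_entity: path_from_start} via breadth-first search."""
    paths = {start: [start]}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in deps.get(v, []):
            if w not in paths:
                paths[w] = paths[v] + [w]
                queue.append(w)
    return paths

# Invented multi-domain dependencies: a config value, a program variable,
# a data product, and a display.
deps = {
    "sensor_gain": ["calibration.c:scale"],
    "calibration.c:scale": ["telemetry_packet"],
    "telemetry_packet": ["ground_display"],
}
slice_ = forward_slice(deps, "sensor_gain")
print(slice_["ground_display"])  # the path from the changed value to the display
```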

  1. Reachability analysis of rational eigenvalue linear systems

    NASA Astrophysics Data System (ADS)

    Xu, Ming; Chen, Liangyu; Zeng, Zhenbing; Li, Zhi-bin

    2010-12-01

    One of the key problems in the safety analysis of control systems is the exact computation of reachable state spaces for continuous-time systems. Issues related to the controllability and observability of these systems are well studied in systems theory. However, there are not many results on reachability, even for general linear systems. In this study, we present a large class of linear systems with decidable reachable state spaces. This is approached by reducing the reachability analysis to real root isolation of exponential polynomials. Furthermore, we have implemented this method in a Maple package based on symbolic computation and applied it to several examples successfully.
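
    The core subroutine the reduction relies on, real root isolation of an exponential polynomial, can be illustrated numerically. A symbolic package such as the authors' Maple implementation would certify the isolation; the sign-scan-and-bisect sketch below is only illustrative, and the example function f(t) = e^t - 3t is our own.

```python
import math

def isolate_roots(f, lo, hi, step=0.01, tol=1e-10):
    """Scan [lo, hi) for sign changes of f, then bisect each bracket."""
    roots = []
    t = lo
    while t < hi:
        if f(t) == 0.0:
            roots.append(t)
        elif f(t) * f(t + step) < 0:
            a, b = t, t + step
            while b - a > tol:
                m = 0.5 * (a + b)
                a, b = (m, b) if f(a) * f(m) > 0 else (a, m)
            roots.append(0.5 * (a + b))
        t += step
    return roots

# e^t = 3t has two real roots, near t = 0.619 and t = 1.512
roots = isolate_roots(lambda t: math.exp(t) - 3 * t, 0.0, 3.0)
print(roots)
```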

  2. Aerodynamic Reconstruction Applied to Parachute Test Vehicle Flight Data Analysis

    NASA Technical Reports Server (NTRS)

    Cassady, Leonard D.; Ray, Eric S.; Truong, Tuan H.

    2013-01-01

    The aerodynamics, both static and dynamic, of a test vehicle are critical to determining the performance of the parachute cluster in a drop test and to conducting a successful test. The Capsule Parachute Assembly System (CPAS) project is conducting tests of NASA's Orion Multi-Purpose Crew Vehicle (MPCV) parachutes at the Army Yuma Proving Ground utilizing the Parachute Test Vehicle (PTV). The PTV shape is based on the MPCV, but the height has been reduced in order to fit within the C-17 aircraft for extraction. Therefore, the aerodynamics of the PTV are similar to, but not the same as, those of the MPCV. A small series of wind tunnel tests and computational fluid dynamics cases were run to modify the MPCV aerodynamic database for the PTV, but aerodynamic reconstruction of the flights has proven an effective source for further improvements to the database. The acceleration and rotational rates measured during free flight, before parachute inflation but during deployment, were used to confirm vehicle static aerodynamics. A multibody simulation is utilized to reconstruct the parachute portions of the flight. Aerodynamic or parachute parameters are adjusted in the simulation until the prediction reasonably matches the flight trajectory. Knowledge of the static aerodynamics is critical in the CPAS project because the parachute riser load measurements are scaled based on forebody drag. PTV dynamic damping is critical because the vehicle has no reaction control system to maintain attitude - the vehicle dynamics must be understood and modeled correctly before flight. It is shown here that aerodynamic reconstruction has successfully contributed to the CPAS project.

  3. Applied Space Systems Engineering. Chapter 17; Manage Technical Data

    NASA Technical Reports Server (NTRS)

    Kent, Peter

    2008-01-01

    Effective space systems engineering (SSE) is conducted in a fully electronic manner. Competitive hardware, software, and system designs are created in a totally digital environment that enables rapid product design and manufacturing cycles, as well as a multitude of techniques, such as modeling, simulation, and lean manufacturing, that significantly reduce the lifecycle cost of systems. Because the SSE lifecycle depends on the digital environment, managing the enormous volumes of technical data needed to describe, build, deploy, and operate systems is a critical factor in the success of a project. This chapter presents the key aspects of Technical Data Management (TDM) within the SSE process. It is written from the perspective of the System Engineer tasked with establishing the TDM process and infrastructure for a major project. Additional perspectives are reflected from the point of view of the engineers on the project who work within the digital engineering environment established by the TDM toolset and infrastructure, and from the point of view of the contractors who interface via the TDM infrastructure. Table 17.1 lists the TDM process as it relates to SSE.

  4. Statistical methods for texture analysis applied to agronomical images

    NASA Astrophysics Data System (ADS)

    Cointault, F.; Journaux, L.; Gouton, P.

    2008-02-01

    For agronomical research institutes, field experiments are essential; they provide relevant information on crops such as disease rate, yield components, and weed rate. Although generally accurate, they are done manually and have numerous drawbacks: they are tedious and laborious, notably for wheat ear counting. In this case, the use of color and/or texture image processing to estimate the number of ears per square metre can be an improvement. Different image segmentation techniques based on feature extraction have therefore been tested using textural information with first- and higher-order statistical methods. The Run Length method gives the best results, close to manual counts, with an average error of 3%. Nevertheless, a careful justification of the hypotheses made on the values of the classification and description parameters is necessary, especially for the number of classes and the size of the analysis windows, through the estimation of a cluster validity index. The first results show that the mean number of classes in a wheat image is 11, which proves that our choice of 3 was not well adapted. To complete these results, we are currently analysing each of the classes previously extracted in order to gather together all the classes characterizing the ears.
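
    The Run Length statistics mentioned above start from a gray-level run-length matrix, which counts runs of equal gray level in a given direction; features such as short-run emphasis are then derived from it. A minimal sketch (horizontal runs only, on an invented toy image):

```python
import numpy as np

def run_length_matrix(img, levels, max_run):
    """rlm[g, r-1] = number of horizontal runs of gray level g with length r."""
    rlm = np.zeros((levels, max_run), dtype=int)
    for row in img:
        run_val, run_len = row[0], 1
        for v in row[1:]:
            if v == run_val:
                run_len += 1
            else:
                rlm[run_val, min(run_len, max_run) - 1] += 1
                run_val, run_len = v, 1
        rlm[run_val, min(run_len, max_run) - 1] += 1  # close the last run
    return rlm

img = np.array([[0, 0, 1, 1, 1],
                [2, 2, 2, 2, 0]])
rlm = run_length_matrix(img, levels=3, max_run=5)
print(rlm)
# runs found: 0s of length 2, 1s of length 3, 2s of length 4, 0s of length 1
```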

  5. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface. Different substrates are implanted with ions accelerated from plasma by a terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration, and a much higher current than those obtainable from conventional accelerators. Proton and ion acceleration driven by ultra-short high-intensity lasers is implemented by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  6. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
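
    The claim that the graph Fourier interpretation is exact for an unweighted path can be checked directly: the Laplacian eigenvalues of a path graph have the known closed form 2 - 2cos(πk/n), with DCT-like cosine eigenvectors. A small numerical check (our own illustration, not the authors' code):

```python
import numpy as np

n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1       # unweighted path graph adjacency
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian L = D - A
evals, evecs = np.linalg.eigh(L)        # eigenpairs = "graph frequencies/basis"

# Closed form for the path graph: eigenvalues 2 - 2*cos(pi*k/n), k = 0..n-1
expected = 2 - 2 * np.cos(np.pi * np.arange(n) / n)
print(np.allclose(np.sort(evals), np.sort(expected)))  # True
```

    For a general weighted graph no such closed form exists, which is why the authors caution against reading the eigenpairs as literal frequencies.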

  7. Applying Logic Analysis to Genomic Data and Phylogenetic Profiles

    NASA Astrophysics Data System (ADS)

    Yeates, Todd

    2005-03-01

    One of the main goals of comparative genomics is to understand how all the various proteins in a cell relate to each other in terms of pathways and interaction networks. Various computational ideas have been explored with this goal in mind. In the original phylogenetic profile method, `functional linkages' were inferred between pairs of proteins when the two proteins, A and B, showed identical (or statistically similar) patterns of presence vs. absence across a set of completely sequenced genomes. Here we describe a new generalization, logic analysis of phylogenetic profiles (LAPP), from which higher order relationships can be identified between three (or more) different proteins. For instance, in one type of triplet logic relation -- of which there are eight distinct types -- a protein C may be present in a genome iff proteins A and B are both present (C=AB). An application of the LAPP method identifies thousands of previously unidentified relationships between protein triplets. These higher order logic relationships offer insights -- not available from pairwise approaches -- into branching, competition, and alternate routes through cellular pathways and networks. The results also make it possible to assign tentative cellular functions to many novel proteins of unknown function. Co-authors: Peter Bowers, Shawn Cokus, Morgan Beeby, and David Eisenberg
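
    The triplet logic relation C = AB can be sketched on toy presence/absence profiles. The profiles below are invented, and a real LAPP analysis would use a statistical score over hundreds of genomes rather than an exact Boolean match:

```python
# Each profile: presence (1) / absence (0) of a protein across genomes.
profiles = {
    "A": [1, 1, 0, 1, 0, 1, 0, 0],
    "B": [1, 0, 1, 1, 0, 1, 1, 0],
    "C": [1, 0, 0, 1, 0, 1, 0, 0],
}

def matches_and(a, b, c):
    """True if C is present exactly when both A and B are (the C = AB relation,
    one of the eight distinct triplet logic types)."""
    return all(ci == (ai and bi) for ai, bi, ci in zip(a, b, c))

print(matches_and(profiles["A"], profiles["B"], profiles["C"]))  # True
```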

  8. Applied patent RFID systems for building reacting HEPA air ventilation system in hospital operation rooms.

    PubMed

    Lin, Jesun; Pai, Jar-Yuan; Chen, Chih-Cheng

    2012-12-01

    RFID technology, an automatic identification and data capture technology that provides identification, tracing, security, and so on, has been widely applied to the healthcare industry in recent years. Employing a HEPA ventilation system in a hospital is a way to ensure healthful indoor air quality and protect patients and healthcare workers against hospital-acquired infections. However, the system consumes a great deal of electricity at considerable cost. This study aims to apply RFID technology to provide unique identification of medical staff and patients, and a reacting HEPA air ventilation system, in order to reduce cost, save energy, and prevent the spread of hospital-acquired infections. The system contains RFID tags (for medical staff and patients), a sensor, and a reacting system that receives information on the number of medical staff present and the status of the surgery, and controls the air volume of the HEPA ventilation system accordingly. A pilot program was carried out in a unit of operation rooms of a 1,500-bed medical center located in central Taiwan from January to August 2010. The results showed that the air ventilation system was able to function much more efficiently with less energy consumed. Furthermore, indoor air quality remained within standards, and hospital-acquired infections and other occupational diseases could be prevented. PMID:22081235
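
    The "reacting" control rule can be sketched as a simple occupancy-driven airflow function. The thresholds, units, and scaling below are entirely invented for illustration; the paper does not publish its control law:

```python
def hepa_air_volume(n_occupants, surgery_active,
                    base=300, per_person=80, max_volume=1200):
    """Return a target airflow (illustrative units, e.g. m^3/h):
    a minimum safe exchange rate on standby, scaled up with RFID-reported
    occupancy, and capped at the fan's maximum."""
    if not surgery_active and n_occupants == 0:
        return base                               # standby: minimum exchange
    return min(base + per_person * n_occupants, max_volume)

print(hepa_air_volume(0, False))  # 300
print(hepa_air_volume(5, True))   # 700
```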

  9. DART system analysis.

    SciTech Connect

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.; Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the ''once-through'' time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the ''once-through'' time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase ''inner loop'' iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
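
    The trade-off among once-through time, iteration probability, and rework fraction can be illustrated with a toy expected-time model. The form E = t(1 + r·p/(1-p)), with a geometric number of repeat passes, is our own reconstruction for illustration, not the parameterization used in the DART study:

```python
def expected_step_time(t, p, r):
    """Expected time for one process step.

    t -- once-through time of the step
    p -- probability of a backward iteration after each pass
    r -- rework fraction: cost of a repeat pass relative to the first
    The expected number of repeat passes is p/(1-p) (geometric).
    """
    return t * (1 + r * p / (1 - p))

base    = expected_step_time(t=10.0, p=0.4, r=0.6)   # 14.0
halve_t = expected_step_time(t=5.0,  p=0.4, r=0.6)   # 7.0
halve_p = expected_step_time(t=10.0, p=0.2, r=0.6)   # 11.5
halve_r = expected_step_time(t=10.0, p=0.4, r=0.3)   # 12.0
print(base, halve_t, halve_p, halve_r)
```

    With these invented numbers, halving the rework fraction yields 80% of the benefit of halving the iteration probability, qualitatively echoing the 40% to 80% range reported in the abstract.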

  10. Integrated hydrogen/oxygen technology applied to auxiliary propulsion systems

    NASA Technical Reports Server (NTRS)

    Gerhardt, David L.

    1990-01-01

    The purpose of the Integrated Hydrogen/Oxygen Technology (IHOT) study was to determine if the vehicle/mission needs and technology of the 1990's support development of an all cryogenic H2/O2 system. In order to accomplish this, IHOT adopted the approach of designing Integrated Auxiliary Propulsion Systems (IAPS) for a representative manned vehicle; the advanced manned launch system. The primary objectives were to develop IAPS concepts which appeared to offer viable alternatives to state-of-the-art (i.e., hypergolic, or earth-storable) APS approaches. The IHOT study resulted in the definition of three APS concepts; two cryogenic IAPS, and a third concept utilizing hypergolic propellants.

  11. Multi-agent cooperative systems applied to precision applications

    SciTech Connect

    McKay, M.D.; Anderson, M.O.; Gunderson, R.W.; Flann, N.; Abbott, B.

    1998-03-01

    Regulatory agencies are imposing limits and constraints to protect the operator and/or the environment. While generally necessary, these controls also tend to increase cost and decrease efficiency and productivity. Intelligent computer systems can be made to perform these hazardous tasks with greater efficiency and precision without danger to the operators. The Idaho National Engineering and Environmental Laboratory and the Center for Self-Organizing and Intelligent Systems at Utah State University have developed a series of autonomous all-terrain multi-agent systems capable of performing automated tasks within hazardous environments. This paper discusses the development and application of cooperative small-scale and large-scale robots for use in various activities associated with radiologically contaminated areas, prescription farming, and unexploded ordnance.

  12. Applying programmable logic controllers to safety-related systems

    SciTech Connect

    Ruether, J.C. )

    1992-01-01

    Northern States Power Company (NSP) recently installed programmable logic controllers (PLCs) in two safety-related systems at its Prairie Island nuclear generating plant. The lessons learned during these applications at the 19-yr-old two-unit plant may benefit similar projects. Prairie Island responded to the station blackout (SBO) issue by upgrading its electrical distribution system. This included installing additional safeguard diesel generators (DGs), new 4160-V buses, and new 480-V buses. As part of this upgrade, PLCs were commercially dedicated for use in two safety-related applications: (1) the bus load sequencer project, and (2) the 480-V voltage regulator project.

  13. Error behaviour of multistep methods applied to unstable differential systems

    NASA Technical Reports Server (NTRS)

    Brown, R. L.

    1978-01-01

    The problem of modelling a dynamic system described by a system of ordinary differential equations which has unstable components for limited periods of time is discussed. It is shown that the global error in a multistep numerical method is the solution to a difference equation initial value problem, and the approximate solution is given for several popular multistep integration formulae. Inspection of the solution leads to the formulation of four criteria for integrators appropriate to unstable problems. A sample problem is solved numerically using three popular formulae and two different stepsizes to illustrate the appropriateness of the criteria.
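
    The qualitative point, that a multistep method can retain its order of accuracy even when the underlying mode is unstable, can be illustrated with the two-step Adams-Bashforth formula on y' = λy with λ > 0 (our own toy example, not one of the paper's test problems):

```python
import math

def ab2(lam, h, n_steps):
    """Two-step Adams-Bashforth for y' = lam*y, y(0) = 1,
    with the second starting value seeded exactly."""
    f = lambda y: lam * y
    y_prev, y_curr = 1.0, math.exp(lam * h)
    for _ in range(n_steps - 1):
        y_prev, y_curr = y_curr, y_curr + h * (1.5 * f(y_curr) - 0.5 * f(y_prev))
    return y_curr

lam = 2.0                                  # unstable: solutions grow as e^(2t)
exact = math.exp(lam * 1.0)                # y(1)
err_h  = abs(ab2(lam, 0.01, 100) - exact)
err_h2 = abs(ab2(lam, 0.005, 200) - exact)
print(err_h / err_h2)  # close to 4: halving h quarters the global error (order 2)
```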

  14. Robust sliding mode control applied to double Inverted pendulum system

    SciTech Connect

    Mahjoub, Sonia; Derbel, Nabil; Mnif, Faical

    2009-03-05

    A three-level hierarchical sliding mode control is presented for a class of underactuated systems that can overcome mismatched perturbations. The underactuated system considered, a double inverted pendulum (DIP), can be modeled as three subsystems. Such a structure allows the construction of several hierarchy designs for the controller. For all hierarchical designs, the asymptotic stability of every layer's sliding surface and of the subsystem sliding surfaces is proved theoretically by Barbalat's lemma. Simulation results show the validity of these methods.

  15. Quality analysis of the solution produced by dissection algorithms applied to the traveling salesman problem

    SciTech Connect

    Cesari, G.

    1994-12-31

    The aim of this paper is to analyze experimentally the quality of the solution obtained with dissection algorithms applied to the geometric Traveling Salesman Problem, starting from Karp's results. We apply a divide-and-conquer strategy, first dividing the plane into subregions where we calculate optimal subtours and then merging these subtours to obtain the final tour. The analysis is restricted to problem instances where points are uniformly distributed in the unit square. For relatively small sets of cities we analyze the quality of the solution by calculating the length of the optimal tour and comparing it with our approximate solution. When the problem instance is too large we perform an asymptotic analysis, estimating the length of the optimal tour. We apply the same dissection strategy to classical heuristics by calculating approximate subtours and comparing the results with the average quality of the heuristic. Our main result is an estimate of the rate of convergence of the approximate solution to the optimal solution as a function of the number of dissection steps, of the criterion used for dividing the plane, and of the quality of the subtours. We have implemented our programs on MUSIC (MUlti Signal processor system with Intelligent Communication), a Single-Program-Multiple-Data parallel computer with distributed memory developed at ETH Zurich.
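
    The dissection strategy can be sketched in a few lines: partition the unit square into a grid, build a subtour inside each cell, and stitch the cells together in snake order. The sketch below substitutes a greedy nearest-neighbor subtour for the optimal subtours used in the paper:

```python
import math, random

def dissection_tour(points, k):
    """Assign points to a k-by-k grid, then build the tour cell by cell in
    snake order, greedily (nearest neighbor) inside each cell."""
    cells = {}
    for p in points:
        key = (min(int(p[0] * k), k - 1), min(int(p[1] * k), k - 1))
        cells.setdefault(key, []).append(p)
    tour = []
    for col in range(k):
        rows = range(k) if col % 2 == 0 else range(k - 1, -1, -1)
        for row in rows:
            sub = cells.get((col, row), [])
            while sub:                       # nearest-neighbor subtour
                last = tour[-1] if tour else (0.0, 0.0)
                nxt = min(sub, key=lambda q: math.dist(last, q))
                sub.remove(nxt)
                tour.append(nxt)
    return tour

def tour_length(tour):
    return sum(math.dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(200)]
tour = dissection_tour(pts, 4)
print(round(tour_length(tour), 2))
```

    Karp-style analysis makes this precise: for uniform points the merged tour's length approaches the optimal tour's as the dissection refines.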

  16. Speckle interferometry applied to asteroids and other solar system objects

    NASA Technical Reports Server (NTRS)

    Drummond, J. D.; Hege, E. K.

    1985-01-01

    The application of speckle interferometry to asteroids and other solar system objects is discussed. The assumption of a triaxial ellipsoid rotating about its shortest axis is the standard model. Binary asteroids, 433 Eros, 532 Herculina, 511 Davida, and Pallas are discussed.

  17. System Identification and POD Method Applied to Unsteady Aerodynamics

    NASA Technical Reports Server (NTRS)

    Tang, Deman; Kholodar, Denis; Juang, Jer-Nan; Dowell, Earl H.

    2001-01-01

    The representation of unsteady aerodynamic flow fields in terms of global aerodynamic modes has proven to be a useful method for reducing the size of the aerodynamic model over those representations that use local variables at discrete grid points in the flow field. Eigenmodes and Proper Orthogonal Decomposition (POD) modes have been used for this purpose with good effect. This suggests that system identification models may also be used to represent the aerodynamic flow field. Implicit in the use of a system identification technique is the notion that a relatively small state space model can be useful in describing a dynamical system. The POD model is first used to show that indeed a reduced order model can be obtained from a much larger numerical aerodynamic model (the vortex lattice method is used for illustrative purposes), and the results from the POD and the system identification methods are then compared. For the example considered, the two methods are shown to give comparable results in terms of accuracy and reduced model size. The advantages and limitations of each approach are briefly discussed. Both appear promising and complementary in their characteristics.
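
    The POD reduction itself can be sketched with an SVD of a snapshot matrix. The synthetic "flow field" below, two coherent structures plus noise, is our own illustration; the paper's aerodynamic snapshots come from a vortex lattice model:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 50)       # 50 time snapshots
x = np.linspace(0, 1, 200)              # 200 spatial points
# two coherent structures plus small measurement noise
snapshots = (np.outer(np.sin(np.pi * x), np.sin(t))
             + 0.3 * np.outer(np.sin(2 * np.pi * x), np.cos(3 * t))
             + 0.01 * rng.standard_normal((200, 50)))

# POD modes = left singular vectors of the snapshot matrix
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 2                                   # keep only two POD modes
recon = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
rel_err = np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots)
print(rel_err)  # small: two modes capture almost all the energy
```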

  18. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 13 2011-07-01 2011-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  19. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 14 2012-07-01 2011-07-01 true What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  20. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 14 2014-07-01 2014-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  1. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 13 2010-07-01 2010-07-01 false What requirements apply to my heat... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to this subpart that apply to your heat exchange systems, except as specified in paragraphs (b) through...

  2. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  3. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  4. Simulation program of nonlinearities applied to telecommunication systems

    NASA Technical Reports Server (NTRS)

    Thomas, C.

    1979-01-01

    In any satellite communication system, the distortion created by nonlinear devices or systems must be considered. The subject of this paper is the use of the Fast Fourier Transform (FFT) in predicting the intermodulation performance of amplifiers, mixers, and filters. A memoryless nonlinear model is chosen to simulate the amplitude and phase nonlinearities of the device in the simulation program, written in FORTRAN 4. The experimentally observed nonlinearity parameters of a low-noise 3.7-4.2 GHz amplifier are related to the gain and phase coefficients of a Fourier series. The measured results are compared with those calculated from the simulation in cases where the input signal is composed of two or three carriers plus noise power density.
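
    The FFT-based prediction can be sketched for a memoryless cubic nonlinearity: a two-tone input produces third-order intermodulation lines at 2f1 - f2 and 2f2 - f1. The tone frequencies and the cubic coefficient below are invented for illustration:

```python
import numpy as np

fs, n = 1024, 1024                      # 1 Hz bin spacing, tones fall on bins
t = np.arange(n) / fs
f1, f2 = 100, 110
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
y = x - 0.05 * x**3                     # memoryless amplitude nonlinearity

spec = np.abs(np.fft.rfft(y)) / n       # |spec[k]| = A/2 for a cosine of amplitude A
im3_lo, im3_hi = 2 * f1 - f2, 2 * f2 - f1   # 90 Hz and 120 Hz
print(spec[im3_lo] > 10 * spec[im3_lo - 5])  # the IM3 line stands out of the floor
```

    Expanding (cos a + cos b)^3 shows the IM3 amplitude is 3/4 of the cubic coefficient, i.e. 0.0375 here, which the FFT recovers directly.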

  5. Applying twisted boundary conditions for few-body nuclear systems

    NASA Astrophysics Data System (ADS)

    Körber, Christopher; Luu, Thomas

    2016-05-01

    We describe and implement twisted boundary conditions for the deuteron and triton systems within finite volumes using the nuclear lattice EFT formalism. We investigate the finite-volume dependence of these systems with different twist angles. We demonstrate how various finite-volume information can be used to improve calculations of binding energies in such a framework. Our results suggest that, with appropriate twisting of boundaries, infinite-volume binding energies can be reliably extracted from calculations using modest volume sizes with cubic length L ≈ 8-14 fm. Of particular importance is our derivation and numerical verification of three-body analogs of "i-periodic" twist angles that eliminate the leading-order finite-volume effects on the three-body binding energy.

  6. Applying Contamination Modelling to Spacecraft Propulsion Systems Designs and Operations

    NASA Technical Reports Server (NTRS)

    Chen, Philip T.; Thomson, Shaun; Woronowicz, Michael S.

    2000-01-01

    Molecular and particulate contaminants generated by the operation of a propulsion system may impinge on spacecraft critical surfaces. Plume deposition or clouds may hinder the spacecraft and instruments from performing normal operations. Firing thrusters generates both molecular and particulate contaminants, so minimizing the contamination impact of the plume is critical for a successful mission. The effects of molecular and particulate contamination from thruster firings are quite distinct. This paper discusses the interconnection between spacecraft contamination modeling and propulsion system implementation. It addresses an innovative contamination engineering approach implemented from spacecraft concept design, through manufacturing, integration and test (I&T), and launch, to on-orbit operations, and summarizes its implementation on several successful missions. Although there are other contamination sources, only molecular contamination is considered here.

  7. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and be able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  8. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and be able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  9. Applying an integrated neuro-expert system model in a real-time alarm processing system

    NASA Astrophysics Data System (ADS)

    Khosla, Rajiv; Dillon, Tharam S.

    1993-03-01

    In this paper we propose an integrated model derived from the combination of a generic neuro-expert system model, an object model, and a Unix operating system process (UOSP) model. This integrated model reflects the strengths of both artificial neural nets (ANNs) and expert systems (ESs). A formalism of ES objects, ANN objects, UOSP objects, and problem domain objects is used to develop a set of generic data structures and methods. These generic data structures and methods help us build heterogeneous ES-ANN objects with a uniform communication interface. The integrated model is applied in a real-time alarm processing system for a non-trivial terminal power station. It is shown how features like hierarchical/distributed ES/ANN objects, inter-process communication, and fast concurrent execution help to cope with real-time system constraints like continuity, data variability, and fast response time.

  10. System And Method Of Applying Energetic Ions For Sterilization

    DOEpatents

    Schmidt, John A.

    2002-06-11

    A method of sterilization of a container is provided whereby a cold plasma is caused to be disposed near a surface to be sterilized, and the cold plasma is then subjected to a pulsed voltage differential for producing energized ions in the plasma. Those energized ions then operate to achieve spore destruction on the surface to be sterilized. Further, a system for sterilization of a container which includes a conductive or non-conductive container, a cold plasma in proximity to the container, and a high voltage source for delivering a pulsed voltage differential between an electrode and the container and across the cold plasma, is provided.

  11. System and method of applying energetic ions for sterilization

    DOEpatents

    Schmidt, John A.

    2003-12-23

    A method of sterilization of a container is provided whereby a cold plasma is caused to be disposed near a surface to be sterilized, and the cold plasma is then subjected to a pulsed voltage differential for producing energized ions in the plasma. Those energized ions then operate to achieve spore destruction on the surface to be sterilized. Further, a system for sterilization of a container which includes a conductive or non-conductive container, a cold plasma in proximity to the container, and a high voltage source for delivering a pulsed voltage differential between an electrode and the container and across the cold plasma, is provided.

  12. A Method to Apply Friction Modifier in Railway System

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kosuke; Suda, Yoshihiro; Iwasa, Takashi; Fujii, Takeshi; Tomeoka, Masao; Tanimoto, Masuhisa; Kishimoto, Yasushi; Nakai, Takuji

    Controlling the friction between wheel and rail is a direct and very effective measure for improving the curving performance of bogie trucks, because curving performance depends strongly on friction characteristics. The authors have proposed a method, “friction control”, which applies a friction modifier (KELTRACK™ HPF) with an onboard spraying system. With this method, not only the friction coefficient but also the friction characteristics can be controlled as intended. This paper reports the results of fundamental experiments that play an important role in realizing the new method.

  13. Applying Real Options for Evaluating Investments in ERP Systems

    NASA Astrophysics Data System (ADS)

    Nakagane, Jun; Sekozawa, Teruji

    This paper verifies the effectiveness of the real options approach for evaluating investments in Enterprise Resource Planning (ERP) systems and shows how important it is to disclose the shadow options potentially embedded in an ERP investment. The net present value (NPV) method is principally adopted to evaluate the value of ERP. However, the NPV method assumes no uncertainty in the object of evaluation, which does not reflect current business circumstances filled with dynamic issues. Since the 1990s, the effectiveness of option pricing models in overcoming the shortcomings of the NPV method for Information System (IS) investment has been discussed in the IS literature. This paper presents three business cases to review the practical advantages of such techniques for IS investments, especially ERP investments. The first case is EDI development; we evaluate the project with a new approach that highlights one of its shadow options, EDI implementation. In the second case we reveal that an ERP investment holds an “expanding option” in a case of eliminating redundancy. The third case describes an option to contract that is deliberately slotted into ERP development in preparation for transferring a manufacturing facility.
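
    The contrast between a static NPV and an option valuation can be illustrated with a standard Black-Scholes call applied to a hypothetical deferrable IS investment. All figures below are illustrative assumptions, not data from the paper's cases:

    ```python
    import math

    def bs_call(S, K, T, r, sigma):
        """Black-Scholes value of a European call. Read as: S is the present
        value of the project's expected cash flows, K the investment cost,
        T the years the decision can be deferred, r the risk-free rate and
        sigma the volatility of the project value (all hypothetical)."""
        d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
        d2 = d1 - sigma * math.sqrt(T)
        N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
        return S * N(d1) - K * math.exp(-r * T) * N(d2)

    # Static NPV ignores the value of waiting; the option valuation does not.
    npv = 100.0 - 105.0                              # negative: reject under NPV
    option_value = bs_call(100.0, 105.0, 2.0, 0.05, 0.4)
    ```

    Under NPV alone this project would be rejected, while pricing the option to defer yields a positive value; this is the kind of shadow-option value the paper argues should be disclosed.
    
    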

  14. Near-infrared radiation curable multilayer coating systems and methods for applying same

    DOEpatents

    Bowman, Mark P; Verdun, Shelley D; Post, Gordon L

    2015-04-28

    Multilayer coating systems, methods of applying them, and related substrates are disclosed. The coating system may comprise a first coating comprising a near-IR absorber, and a second coating deposited on at least a portion of the first coating. Methods of applying a multilayer coating composition to a substrate may comprise applying a first coating comprising a near-IR absorber, applying a second coating over at least a portion of the first coating, and curing the coating with near infrared radiation.

  15. An applied study using systems engineering methods to prioritize green systems options

    SciTech Connect

    Lee, Sonya M; Macdonald, John M

    2009-01-01

    For many years, there have been questions about the effectiveness of applying different green solutions. If you are building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform best over time? All of this has to be considered within the cost and schedule of the project, and the amount of information available on the topic can be overwhelming. We examine whether Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe, are used to gain insight into how to select green technologies. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.
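
    AHP, one of the methods named above, derives priorities from a matrix of pairwise comparisons. A minimal sketch with an invented three-option comparison matrix (not the study's data):

    ```python
    import numpy as np

    # Hypothetical Saaty-scale pairwise comparisons of three green options
    # (say solar, insulation, heat pump); by construction a_ji = 1/a_ij.
    A = np.array([
        [1.0,  3.0,  5.0],
        [1/3., 1.0,  2.0],
        [1/5., 1/2., 1.0],
    ])

    # AHP priorities: principal eigenvector of A, normalised to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency ratio CR = ((lambda_max - n)/(n - 1)) / RI, RI = 0.58 for n = 3;
    # judgements are usually deemed acceptable when CR < 0.1.
    n = A.shape[0]
    CR = (eigvals.real[k] - n) / (n - 1) / 0.58
    ```

    For this matrix the first option dominates the ranking and the judgements are nearly consistent, which is exactly the kind of defensible prioritization the study asks of its subjects.
    
    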

  16. Absorption and adsorption chillers applied to air conditioning systems

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Agnieszka; Szaflik, Władysław

    2010-07-01

    This work presents the possibility of applying sorption refrigerators driven by a low-temperature fluid to the air conditioning of buildings. Thermodynamic models were formulated, and an absorption LiBr-water chiller with 10 kW cooling power as well as an adsorption chiller with a silica gel bed were investigated. Both use water for the desorption process at temperature Tdes = 80 °C. The coefficient of performance (COP) of both cooling cycles was analyzed under the same conditions: the driving heat source above, cooling water Tc = 25 °C, and evaporator temperature Tevap = 5 °C. In this study, the computer software EES was used to investigate the performance of the absorption heat pump system and its behaviour in configuration with a geothermal heat source.
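
    A quick sanity check on any reported COP is the reversible (Carnot-type) upper bound for a heat-driven chiller at the study's three temperatures. This sketch assumes ideal cycles, so real LiBr-water or silica-gel machines will sit well below it:

    ```python
    def ideal_sorption_cop(t_des_c, t_cond_c, t_evap_c):
        """Carnot-type COP bound for a heat-driven (absorption or adsorption)
        chiller: a Carnot engine between T_des and T_cond driving a Carnot
        refrigerator between T_evap and T_cond. Inputs in degrees Celsius,
        converted to kelvins internally."""
        t_des, t_cond, t_evap = (t + 273.15 for t in (t_des_c, t_cond_c, t_evap_c))
        return (1.0 - t_cond / t_des) * t_evap / (t_cond - t_evap)

    # Conditions from the study: Tdes = 80 C, Tc = 25 C, Tevap = 5 C.
    cop_max = ideal_sorption_cop(80.0, 25.0, 5.0)
    ```

    At these temperatures the bound is roughly 2.2; single-effect LiBr-water chillers typically achieve COPs well under 1, so any modeled COP should fall between those figures.
    
    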

  17. Van der Waals density functional applied to adsorption systems

    NASA Astrophysics Data System (ADS)

    Hamada, Ikutaro

    2013-03-01

    The van der Waals density functional (vdW-DF) is a promising density functional to describe the van der Waals forces within density functional theory. However, despite the recent efforts, there is still room for further improvement, especially for describing molecular adsorption on metal surfaces. I will show that by choosing appropriate exchange and nonlocal correlation functionals, it is possible to calculate geometries and electronic structures for adsorption systems accurately within the framework of vdW-DF. Applicability of the present approach will be illustrated with its applications to graphene/metal, fullerene/metal, and water/graphene interfaces. This work is partly supported by a Grant-in-Aid for Scientific Research on Innovative Area (No. 23104501). AIMR was established by the World Premier International Research Center Initiative (WPI), MEXT, Japan.

  18. Launch Vehicle Systems Analysis

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1999-01-01

    This report summarizes the key accomplishments of Georgia Tech's Space Systems Design Laboratory (SSDL) under NASA Grant NAG8-1302 from NASA - Marshall Space Flight Center. The report consists of this summary white paper, copies of technical papers written under this grant, and several viewgraph-style presentations. During the course of this grant four main tasks were completed: (1) Simulated Combined-Cycle Rocket Engine Analysis Module (SCCREAM), a computer analysis tool for predicting the performance of various RBCC engine configurations; (2) Hyperion, a single stage to orbit vehicle capable of delivering 25,000 pound payloads to the International Space Station orbit; (3) Bantam-X Support, a small payload mission; and (4) International Trajectory Support for interplanetary human Mars missions.

  19. Versatile microanalytical system with porous polypropylene capillary membrane for calibration gas generation and trace gaseous pollutants sampling applied to the analysis of formaldehyde, formic acid, acetic acid and ammonia in outdoor air.

    PubMed

    Coelho, Lúcia H G; Melchert, Wanessa R; Rocha, Flavio R; Rocha, Fábio R P; Gutz, Ivano G R

    2010-11-15

    The analytical determination of atmospheric pollutants still presents challenges due to the low-level concentrations (frequently in the μg m(-3) range) and their variations with sampling site and time. In this work, a capillary membrane diffusion scrubber (CMDS) was scaled down to match capillary electrophoresis (CE), a quick separation technique that requires nothing more than some nanoliters of sample and, when combined with capacitively coupled contactless conductometric detection (C(4)D), is particularly favorable for ionic species that do not absorb in the UV-vis region, like the target analytes formaldehyde, formic acid, acetic acid and ammonium. The CMDS was coaxially assembled inside a PTFE tube and fed with acceptor phase (deionized water for species with a high Henry's constant such as formaldehyde and carboxylic acids, or acidic solution for ammonia sampling with equilibrium displacement to the non-volatile ammonium ion) at a low flow rate (8.3 nL s(-1)), while the sample was aspirated through the annular gap of the concentric tubes at 2.5 mL s(-1). A second unit, similar in all respects to the CMDS, was operated as a capillary membrane diffusion emitter (CMDE), generating a gas flow with known concentrations of ammonia for the evaluation of the CMDS. The fluids of the system were driven with inexpensive aquarium air pumps, and the collected samples were stored in vials cooled by a Peltier element. Complete protocols were developed for the analysis, in air, of NH(3), CH(3)COOH, HCOOH and, with a derivatization setup, CH(2)O, by associating the CMDS collection with determination by CE-C(4)D. The ammonia concentrations obtained by electrophoresis were checked against the reference spectrophotometric method based on Berthelot's reaction. Sensitivity enhancements of this reference method were achieved by using a modified Berthelot reaction, solenoid micro-pumps for liquid propulsion and a long optical path cell based on a liquid core waveguide (LCW). All

  20. The schism between experimental and applied behavior analysis: Is it real and who cares? 1

    PubMed Central

    Poling, Alan; Picker, Mitchell; Grossett, Deborah; Hall-Johnson, Earl; Holbrook, Maurice

    1981-01-01

    This paper addresses the relationship between the experimental analysis of behavior and applied behavior analysis. Citation data indicate that across time the Journal of the Experimental Analysis of Behavior, and other experimental sources, have been referenced increasingly infrequently in the Journal of Applied Behavior Analysis, Behavior Therapy, and Behavior Research and Therapy. Such sources are now rarely cited in these journals, and never have been regularly referenced in Behavior Modification. Although their proper interpretation is far from certain, these data partially support recent suggestions that the experimental analysis of behavior and applied behavior analysis are largely separate, insular fields. A questionnaire, mailed to the editorial staffs of the Journal of the Experimental Analysis of Behavior and the Journal of Applied Behavior Analysis, was intended to gather further information about the alleged schism between the fields. Few respondents regularly read both journals, publish in both journals, or find both journals useful in their current research efforts. The majority of editors of both journals indicated that the fields were growing apart, although there was no consensus that this is harmful for behavior analysis. Most editors of the Journal of Applied Behavior Analysis reported that research published in the Journal of the Experimental Analysis of Behavior has decreased in value to applied researchers across time; most editors of the Journal of the Experimental Analysis of Behavior indicated that research published there has not changed in applied value. Several respondents commented at length concerning the relationship of experimental and applied behavior analysis. These comments, many of which appear in the article, reveal a marked plurality of views. PMID:22478543

  1. Applying machine learning techniques to DNA sequence analysis. Progress report, February 14, 1991--February 13, 1992

    SciTech Connect

    Shavlik, J.W.

    1992-04-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
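
    The rule-to-network mapping can be sketched as follows. This is an illustration of the KBANN-style idea with an invented consensus rule, not the project's code: a symbolic rule "fire when at least k antecedents hold" becomes a neuron whose weights and bias reproduce the rule, after which backpropagation refines those weights on training sequences.

    ```python
    import numpy as np

    def kbann_unit(n_antecedents, k_required, omega=4.0):
        """Map a rule into a sigmoid unit: each matching antecedent gets
        weight omega, and the bias is chosen so the unit fires only when
        at least k_required antecedents are true."""
        weights = np.full(n_antecedents, omega)
        bias = -(k_required - 0.5) * omega
        return weights, bias

    def fires(weights, bias, inputs):
        """Sigmoid activation; 'fires' means output above 0.5."""
        out = 1.0 / (1.0 + np.exp(-(weights @ inputs + bias)))
        return out > 0.5

    # Hypothetical consensus rule: "site if >= 4 of 6 positions match".
    w, b = kbann_unit(6, 4)
    match_5 = np.array([1, 1, 1, 1, 1, 0])   # 5 matches: rule satisfied
    match_3 = np.array([1, 1, 1, 0, 0, 0])   # 3 matches: rule not satisfied
    ```

    Because the initial network encodes the domain theory exactly, training only has to correct the theory rather than learn the motif from scratch.
    
    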

  2. Neutron activation analysis system

    DOEpatents

    Taylor, M.C.; Rhodes, J.R.

    1973-12-25

    A neutron activation analysis system for monitoring generally fluid media, such as slurries, solutions, and fluidized powders, including two separate conduit loops for circulating fluid samples within the range of radiation sources and detectors is described. Associated with the first loop is a neutron source that emits a high flux of slow and thermal neutrons. The second loop employs a fast neutron source, the flux from which is substantially free of thermal neutrons. Adjacent to both loops are gamma counters for spectrographic determination of the fluid constituents. Other gamma sources and detectors are arranged across a portion of each loop for determining the fluid density. (Official Gazette)

  3. Beta systems error analysis

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The atmospheric backscatter coefficient, beta, measured with an airborne CO Laser Doppler Velocimeter (LDV) system operating in a continuous wave, focused mode is discussed. The Single Particle Mode (SPM) algorithm was developed from concept through analysis of an extensive amount of data obtained with the system on board a NASA aircraft. The SPM algorithm is intended to be employed in situations where one particle at a time appears in the sensitive volume of the LDV. In addition to giving the backscatter coefficient, the SPM algorithm also produces as intermediate results the aerosol density and the aerosol backscatter cross section distribution. A second method, which measures only the atmospheric backscatter coefficient, is called the Volume Mode (VM) and was employed simultaneously. The results of the two methods differed by slightly less than an order of magnitude. The measurement uncertainties or other errors in the results of the two methods are examined.

  4. Applying axiomatic design to a medication distribution system

    NASA Astrophysics Data System (ADS)

    Raguini, Pepito B.

    As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method, it is important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer's needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs; AD thus offers a strategy for the effective integration of people, design methods, design tools, and design data. We therefore propose applying the AD methodology to medical applications with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored conveniently near patients, as well as a mobile apparatus commonly used by hospitals that can also store medications, the medication cart. Moreover, a robust methodology, the focused store methodology, will be introduced and developed for both uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.

  5. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  6. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. PMID:24762830

  7. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Rives, T. B.

    1987-01-01

    An analysis of the HOSC Generic Peripheral processing system was conducted. The results indicate that the maximum delay in performing screen change requests should be less than 2.5 sec, occurring for a slow VAX host-to-video-screen I/O rate of 50 KBps. This delay is due to the average I/O rate from the video terminals to their host computer. The software structure of the main computers and the host computers will have a greater impact on screen change or refresh response times. The HOSC data system model was updated by a newly coded PASCAL-based simulation program installed on the HOSC VAX system; this model is described and documented. Suggestions are offered for fine-tuning the performance of the Ethernet interconnection network. Suggestions for using the Nutcracker by Excelan to trace itinerant packets which appear on the network from time to time were offered in discussions with the HOSC personnel. Several visits were made to the HOSC facility to install and demonstrate the simulation model.

  8. System of systems modeling and analysis.

    SciTech Connect

    Campbell, James E.; Anderson, Dennis James; Longsine, Dennis E.; Shirah, Donald N.

    2005-01-01

    This report documents the results of an LDRD program entitled 'System of Systems Modeling and Analysis' that was conducted during FY 2003 and FY 2004. Systems that themselves consist of multiple systems (referred to here as System of Systems or SoS) introduce a level of complexity to systems performance analysis and optimization that is not readily addressable by existing capabilities. The objective of the 'System of Systems Modeling and Analysis' project was to develop an integrated modeling and simulation environment that addresses the complex SoS modeling and analysis needs. The approach to meeting this objective involved two key efforts. First, a static analysis approach, called state modeling, has been developed that is useful for analyzing the average performance of systems over defined use conditions. The state modeling capability supports analysis and optimization of multiple systems and multiple performance measures or measures of effectiveness. The second effort involves time simulation which represents every system in the simulation using an encapsulated state model (State Model Object or SMO). The time simulation can analyze any number of systems including cross-platform dependencies and a detailed treatment of the logistics required to support the systems in a defined mission.

  9. Distinguishing Pattern Formation Phenotypes: Applying Minkowski Functionals to Cell Biology Systems

    NASA Astrophysics Data System (ADS)

    Rericha, Erin; Guven, Can; Parent, Carole; Losert, Wolfgang

    2011-03-01

    Spatial clustering of proteins within cells, or of cells themselves, frequently occurs in cell biology systems. However, quantifying the underlying order and determining the regulators of these cluster patterns has proved difficult due to the inherently high noise levels in these systems. For instance, the patterns formed by wild-type and cyclic-AMP regulatory mutant Dictyostelium cells are visually distinctive, yet the large error bars in measurements of the fractal number, area, Euler number, eccentricity, and wavelength make it difficult to quantitatively distinguish between the patterns. We apply a spatial analysis technique based on Minkowski functionals and develop metrics which clearly separate wild-type and mutant cell lines into distinct categories. Having such a metric facilitated the development of a computational model for cellular aggregation and its regulators. Supported by NIH-NGHS Nanotechnology (R01GM085574) and the Burroughs Wellcome Fund.
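
    For a 2D binary image the three Minkowski functionals are the area, the perimeter, and the Euler characteristic. A minimal sketch (a textbook pixel-counting implementation under 4-connectivity, not the authors' analysis code) computes all three by inclusion-exclusion over pixels, shared edges, and fully occupied 2x2 blocks:

    ```python
    import numpy as np

    def minkowski_2d(img):
        """Area (pixel count), perimeter (exposed pixel edges) and Euler
        characteristic of a binary image under 4-connectivity."""
        img = img.astype(bool)
        area = img.sum()
        a_h = (img[:, :-1] & img[:, 1:]).sum()   # horizontally adjacent pairs
        a_v = (img[:-1, :] & img[1:, :]).sum()   # vertically adjacent pairs
        quads = (img[:-1, :-1] & img[:-1, 1:]
                 & img[1:, :-1] & img[1:, 1:]).sum()   # filled 2x2 blocks
        perimeter = int(4 * area - 2 * (a_h + a_v))
        euler = int(area - a_h - a_v + quads)
        return int(area), perimeter, euler

    # A filled 3x3 square vs. the same square with its centre removed:
    square = np.ones((3, 3), dtype=int)
    ring = square.copy()
    ring[1, 1] = 0
    ```

    The square has Euler characteristic 1 while the ring has 0 (one hole), illustrating how these functionals capture topology that area alone misses.
    
    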

  10. Integrated fluorescence analysis system

    DOEpatents

    Buican, Tudor N.; Yoshida, Thomas M.

    1992-01-01

    An integrated fluorescence analysis system enables a component part of a sample to be virtually sorted within a sample volume after a spectrum of the component part has been identified from a fluorescence spectrum of the entire sample in a flow cytometer. Birefringent optics enables the entire spectrum to be resolved into a set of numbers representing the intensity of spectral components of the spectrum. One or more spectral components are selected to program a scanning laser microscope, preferably a confocal microscope, whereby the spectrum from individual pixels or voxels in the sample can be compared. Individual pixels or voxels containing the selected spectral components are identified and an image may be formed to show the morphology of the sample with respect to only those components having the selected spectral components. There is no need for any physical sorting of the sample components to obtain the morphological information.
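
    The virtual sorting described above amounts to comparing each pixel's resolved spectrum against the selected spectral components. A minimal least-squares sketch with invented endmember spectra (not the patent's optics or data):

    ```python
    import numpy as np

    # Hypothetical endmember spectra (columns): two fluorophores sampled at
    # five wavelength bands, standing in for the "set of numbers" the
    # birefringent optics produce for each spectral component.
    E = np.array([
        [0.9, 0.1],
        [0.7, 0.2],
        [0.4, 0.5],
        [0.2, 0.8],
        [0.1, 0.9],
    ])

    def unmix(pixel_spectrum):
        """Least-squares estimate of how much of each component is present
        in one pixel's measured spectrum; pixels are then kept or discarded
        by their estimated component content (the 'virtual sort')."""
        coeffs, *_ = np.linalg.lstsq(E, pixel_spectrum, rcond=None)
        return coeffs

    # A pixel containing 2.0 units of component 1 and 0.5 of component 2:
    mixed = E @ np.array([2.0, 0.5])
    coeffs = unmix(mixed)
    ```

    Thresholding the recovered coefficients per pixel or voxel yields an image showing only the morphology associated with the selected components, with no physical sorting of the sample.
    
    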

  11. Corrosion potential analysis system

    NASA Astrophysics Data System (ADS)

    Kiefer, Karl F.

    1998-03-01

    Many cities in the northeastern U.S. transport electrical power from place to place via underground cables, which utilize voltages from 68 kV to 348 kV. These cables are placed in seamless steel pipe to protect the conductors. These buried pipe-type cables (PTCs) are carefully designed and constantly pressurized with transformer oil to prevent any possible contamination. A protective coating placed on the outside diameter of the pipe during manufacture protects the steel pipe from the soil environment. Notwithstanding the protection mechanisms available, the pipes remain vulnerable to electrochemical corrosion processes. If undetected, corrosion can cause the pipes to leak transformer oil into the environment. These leaks can assume serious proportions due to the constant pressure on the inside of the pipe. A need exists for a detection system that can dynamically monitor the corrosive potential along the length of the pipe and dynamically adjust cathodic protection to counter local and global changes in the cathodic environment surrounding the pipes. The northeastern United States contains approximately 1000 miles of this pipe. This mileage is so critical to the transportation and distribution of power that each of the pipe runs has a redundant double running parallel to it. Invocon, Inc. proposed and tested a technically unique and cost-effective solution to detect critical corrosion potential and to communicate that information to a central data collection and analysis location. Invocon's solution utilizes the steel of the casing pipe as a communication medium. Each data gathering station on the pipe can act as a relay for information gathered elsewhere on the pipe. These stations must have 'smart' network configuration algorithms that constantly test various communication paths and determine the best and most power-efficient route through which information should flow.
Each network station also performs data acquisition and analysis tasks that ultimately

  12. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  13. Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning

    EPA Science Inventory

    This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...

  14. Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems

    NASA Technical Reports Server (NTRS)

    Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.

    2016-01-01

    margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time domain derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag. Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.
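
    The core of the linear-analysis step is checking whether an incrementally scaled loop gain still yields a stable linearized system. The sketch below uses an invented single-axis rigid-body model with PD feedback (not the flight vehicle's dynamics) to show the gain sweep around the expected neutral-stability point:

    ```python
    import numpy as np

    def stable(A):
        """A continuous-time linear system x' = A x is stable when every
        eigenvalue of A has negative real part."""
        return bool(np.all(np.linalg.eigvals(A).real < 0))

    def closed_loop(gain):
        """Hypothetical unstable rigid body theta'' = mu*theta + gain*u with
        PD control u = -(kp*theta + kd*theta'); 'gain' scales the control
        authority the way the margin sweep scales the loop gain."""
        mu, kp, kd = 1.0, 4.0, 2.0
        return np.array([[0.0,            1.0],
                         [mu - gain * kp, -gain * kd]])

    # Sweep the applied loop gain; stability should be lost at and below the
    # analytically expected neutral point (here gain = mu/kp = 0.25).
    results = {g: stable(closed_loop(g)) for g in (0.1, 0.25, 1.0)}
    ```

    In the time-domain check described above, the same scaling is applied to the nonlinear simulation: if the linear margins are correct, instability appears at the predicted gain and phase offsets.
    
    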

  15. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... 40 Protection of Environment 11 2013-07-01 2013-07-01 false Does this subpart apply to my...

  16. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Does this subpart apply to my...

  17. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... 40 Protection of Environment 11 2012-07-01 2012-07-01 false Does this subpart apply to my...

  18. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... 40 Protection of Environment 10 2011-07-01 2011-07-01 false Does this subpart apply to my...

  19. 40 CFR 63.1083 - Does this subpart apply to my heat exchange system?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Does this subpart apply to my...

  20. The Research Mission of Universities of Applied Sciences and the Future Configuration of Higher Education Systems in Europe

    ERIC Educational Resources Information Center

    Lepori, Benedetto; Kyvik, Svein

    2010-01-01

    This article presents a comparative analysis of the development of research in universities of applied sciences (UAS) in eight European countries and its implications for the configuration of the higher education system. The enhancement of research has mostly been seen as a case of academic drift where UAS attempt to become more similar to…

  1. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  2. Comparison of transverse dental changes induced by the palatally applied Frog appliance and buccally applied Karad's integrated distalizing system

    PubMed Central

    Kaygisiz, Emine; Unver, Fatih; Tortop, Tuba

    2016-01-01

    Objective To compare the transverse dental changes induced by the palatally applied Frog appliance and buccally applied Karad's integrated distalizing system (KIDS). Methods We evaluated the pre- and post-distalization orthodontic models of 39 patients, including 19 treated using the Frog appliance, which is palatally positioned (Frog group), and 20 treated using KIDS, which is buccally positioned (KIDS group). Changes in intermolar and interpremolar distances and the amount of maxillary premolar and molar rotation were evaluated on model photocopies. Wilcoxon and Mann-Whitney U tests were used for statistical evaluations. A p-value of < 0.05 was considered statistically significant. Results Significant distopalatal rotation of premolars and distobuccal rotation of molars were observed in the Frog group (p < 0.05), while significant distopalatal rotation of molars (p < 0.05), with no significant changes in premolars, was observed in the KIDS group. The amount of second premolar and first molar rotation was significantly different between the two groups (p < 0.05 and p < 0.001, respectively). Furthermore, expansion in the region of the first molars and second premolars was significantly greater in the KIDS group than in the Frog group (p < 0.001 for both). Conclusions Our results suggest that the type and amount of first molar rotation and expansion vary with the design of the distalization appliance used. PMID:27019824
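
    The nonparametric comparisons reported above (Wilcoxon signed-rank for paired pre/post changes, Mann-Whitney U for between-group differences) can be reproduced with standard tools; all measurement values below are invented for illustration:

```python
from scipy.stats import mannwhitneyu, wilcoxon

# Hypothetical molar-rotation changes (degrees) in two appliance groups
frog = [2.1, 3.4, 1.8, 2.9, 3.0, 2.5, 3.7, 2.2]
kids = [5.2, 6.1, 4.8, 5.9, 6.4, 5.5, 4.9, 6.0]

# Between-group comparison (independent samples): Mann-Whitney U
u_stat, p_between = mannwhitneyu(frog, kids, alternative="two-sided")

# Within-group pre/post intermolar distances (mm): Wilcoxon signed-rank
pre  = [30.1, 31.2, 29.8, 30.5, 31.0, 30.7]
post = [31.4, 32.0, 30.9, 31.8, 32.2, 31.5]
w_stat, p_within = wilcoxon(pre, post)
```

    With p < 0.05 taken as significant, both tests reject the null hypothesis on these fabricated samples.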

  3. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350
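
    Tool D's questionnaire-plus-risk-matrix stage can be sketched as a severity × likelihood lookup. The category labels, scores, and action thresholds below are illustrative assumptions, not the tool's actual scales:

```python
# Severity and likelihood scales and thresholds are illustrative assumptions.
SEVERITY = {"minor": 1, "serious": 2, "major": 3, "fatal": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_level(severity: str, likelihood: str) -> str:
    """Score a hazard as severity x likelihood and bin it into an action band."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 12:
        return "intolerable"
    if score >= 6:
        return "control required"
    return "tolerable"
```

    A confined-space atmosphere hazard rated "fatal" and "likely" would land in the intolerable band, forcing risk reduction before entry.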

  4. Constraints to applying systems thinking concepts in health systems: A regional perspective from surveying stakeholders in Eastern Mediterranean countries

    PubMed Central

    El-Jardali, Fadi; Adam, Taghreed; Ataya, Nour; Jamal, Diana; Jaafar, Maha

    2014-01-01

    Background: Systems Thinking (ST) has recently been promoted as an important approach to health systems strengthening. However, ST is not common practice, particularly in Low- and Middle-Income Countries (LMICs). This paper seeks to explore the barriers that may hinder its application in the Eastern Mediterranean Region (EMR) and possible strategies to mitigate them. Methods: A survey consisting of open-ended questions was conducted with a purposive sample of health policy-makers such as senior officials from the Ministry of Health (MoH), researchers, and other stakeholders such as civil society groups and professional associations from ten countries in the region. A total of 62 respondents participated in the study. Thematic analysis was conducted. Results: There was strong recognition of the relevance and usefulness of ST to health systems policy-making and research, although misconceptions about what ST means were also identified. Experience with applying ST was very limited. Approaches to designing health policies in the EMR were perceived as reactive and fragmented (66%). Commonly perceived constraints to application of ST were: a perceived notion of its costliness combined with lack of the necessary funding to operationalize it (53%), competing political interests and lack of government accountability (50%), lack of awareness about relevance and value (47%), limited capacity to apply it (45%), and difficulty in coordinating and managing stakeholders (39%). Conclusion: While several strategies have been proposed to mitigate most of these constraints, they emphasized the importance of political endorsement and adoption of ST at the leadership level, together with building the necessary capacity to apply it and apply the learning in research and practice. PMID:25489598

  5. An introductory review of parallel independent component analysis (p-ICA) and a guide to applying p-ICA to genetic data and imaging phenotypes to identify disease-associated biological pathways and systems in common complex disorders.

    PubMed

    Pearlson, Godfrey D; Liu, Jingyu; Calhoun, Vince D

    2015-01-01

    Complex inherited phenotypes, including those for many common medical and psychiatric diseases, are most likely underpinned by multiple genes contributing to interlocking molecular biological processes, along with environmental factors (Owen et al., 2010). Despite this, genotyping strategies for complex, inherited, disease-related phenotypes mostly employ univariate analyses, e.g., genome wide association. Such procedures most often identify isolated risk-related SNPs or loci, not the underlying biological pathways necessary to help guide the development of novel treatment approaches. This article focuses on the multivariate analysis strategy of parallel (i.e., simultaneous combination of SNP and neuroimage information) independent component analysis (p-ICA), which typically yields large clusters of functionally related SNPs statistically correlated with phenotype components, whose overall molecular biologic relevance is inferred subsequently using annotation software suites. Because this is a novel approach whose details are relatively new to the field, we summarize its underlying principles, address conceptual questions regarding interpretation of the resulting data, and provide practical illustrations of the method. PMID:26442095
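
    ICA itself (the core of p-ICA, before the parallel SNP/imaging coupling) is easy to demonstrate on synthetic signals. A minimal sketch with scikit-learn's FastICA, recovering two known sources from their linear mixtures; the signals and mixing matrix are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two known sources: a sine wave and a square wave (invented for illustration)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)
s2 = np.sign(np.sin(3 * np.pi * t))
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5],      # assumed mixing matrix
              [0.5, 1.0]])
X = S @ A.T                    # observed mixtures

ica = FastICA(n_components=2, random_state=0, max_iter=1000)
S_hat = ica.fit_transform(X)   # recovered sources (up to order/sign/scale)
```

    Recovery is only up to permutation, sign, and scale, which is why p-ICA interpretation leans on correlating components with phenotypes rather than on absolute loadings.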

  6. An introductory review of parallel independent component analysis (p-ICA) and a guide to applying p-ICA to genetic data and imaging phenotypes to identify disease-associated biological pathways and systems in common complex disorders

    PubMed Central

    Pearlson, Godfrey D.; Liu, Jingyu; Calhoun, Vince D.

    2015-01-01

    Complex inherited phenotypes, including those for many common medical and psychiatric diseases, are most likely underpinned by multiple genes contributing to interlocking molecular biological processes, along with environmental factors (Owen et al., 2010). Despite this, genotyping strategies for complex, inherited, disease-related phenotypes mostly employ univariate analyses, e.g., genome wide association. Such procedures most often identify isolated risk-related SNPs or loci, not the underlying biological pathways necessary to help guide the development of novel treatment approaches. This article focuses on the multivariate analysis strategy of parallel (i.e., simultaneous combination of SNP and neuroimage information) independent component analysis (p-ICA), which typically yields large clusters of functionally related SNPs statistically correlated with phenotype components, whose overall molecular biologic relevance is inferred subsequently using annotation software suites. Because this is a novel approach whose details are relatively new to the field, we summarize its underlying principles, address conceptual questions regarding interpretation of the resulting data, and provide practical illustrations of the method. PMID:26442095

  7. Microfabricated Genomic Analysis System

    NASA Technical Reports Server (NTRS)

    Gonda, Steve; Elms, Rene

    2005-01-01

    Genetic sequencing and many genetic tests and assays require electrophoretic separation of DNA. In this technique, DNA fragments are separated by size as they migrate through a sieving gel under the influence of an applied electric field. In order to conduct these analyses on-orbit, it is essential to acquire the capability to efficiently perform electrophoresis in a microgravity environment. Conventional bench-top electrophoresis equipment is large and cumbersome and does not lend itself to on-orbit utilization. Much of the previous research regarding on-orbit electrophoresis involved altering conventional electrophoresis equipment for bioprocessing, purification, and/or separation technology applications. A new and more efficient approach to on-orbit electrophoresis is the use of a microfabricated electrophoresis platform. These platforms are much smaller, less expensive to produce and operate, use less power, require smaller sample sizes (nanoliters), and achieve separation in a much shorter distance (a few centimeters instead of tens or hundreds of centimeters). In contrast to previous applications, this platform would be utilized as an analytical tool for life science/medical research, environmental monitoring, and medical diagnoses. Identification of infectious agents as well as radiation-related damage is significant to NASA's efforts to maintain, study, and monitor crew health during and in support of near-Earth and interplanetary missions. The capability to perform genetic assays on-orbit is imperative to conduct relevant and insightful biological and medical research, as well as to continue NASA's search for life elsewhere. This technology would provide an essential analytical tool for research conducted in a microgravity environment (Shuttle, ISS, long duration/interplanetary missions). In addition, this technology could serve as a critical and invaluable component of a biosentinel system to monitor space environment genotoxic insults, including radiation.

  8. KRON's Method Applied to the Study of Electromagnetic Interference Occurring in Aerospace Systems

    NASA Astrophysics Data System (ADS)

    Leman, S.; Reineix, A.; Hoeppe, F.; Poiré, Y.; Mahoudi, M.; Démoulin, B.; Üstüner, F.; Rodriquez, V. P.

    2012-05-01

    In this paper, spacecraft and aircraft mock-ups are used to demonstrate the performance of KRON-based tools applied to the simulation of large EMC systems. These tools aim to assist engineers in the design phase of complex systems by efficiently evaluating the EM disturbances between antennas, electronic equipment, and Portable Electronic Devices (PEDs) found in large systems. We use a topological analysis of the system to model independent sub-volumes such as antennas, cables, equipment, PEDs, and cavity walls. Each of these sub-volumes is modelled by an appropriate method, which can be based on, for example, analytical expressions, transmission line theory, or other numerical tools such as the full-wave FDFD method. This representation, associated with the electrical tensorial method of G. Kron, leads to reasonable simulation times (typically a few minutes) and accurate results. Because equivalent sub-models are built separately, the main originality of this method is that each sub-volume can be easily replaced by another one without rebuilding the entire system. Comparisons between measurements and simulations will also be presented.

  9. Air pollution simulation and geographical information systems (GIS) applied to Athens International Airport.

    PubMed

    Theophanides, Mike; Anastassopoulou, Jane

    2009-07-01

    This study presents an improved methodology for analysing atmospheric pollution around airports using Gaussian-plume numerical simulation integrated with Geographical Information Systems (GIS). The new methodology focuses on streamlining the lengthy analysis process for Airport Environmental Impact Assessments by integrating the definition of emission sources, simulating and displaying the results in a GIS environment. One of the objectives of the research is to validate the methodology applied to the Athens International Airport, "Eleftherios Venizelos", to produce a realistic estimate of emission inventories, dispersion simulations and comparison to measured data. The methodology used a combination of the Emission Dispersion and Modelling System (EDMS) and the Atmospheric Dispersion and Modelling system (ADMS) to improve the analysis process. The second objective is to conduct numerical simulations under various adverse conditions (e.g. scenarios) and assess the dispersion in the surrounding areas. The study concludes that the use of GIS in environmental assessments provides a valuable advantage for organizing data and entering accurate geographical/topological information for the simulation engine. Emissions simulation produced estimates within 10% of published values. Dispersion simulations indicate that airport pollution will affect neighbouring cities such as Rafina and Loutsa. Presently, there are no measured controls in these areas. In some cases, airport pollution can contribute to as much as 40% of permissible EU levels in VOCs. PMID:19731833
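
    The Gaussian-plume dispersion underlying tools like ADMS has a simple closed form. A minimal sketch with an assumed linear growth of the dispersion coefficients σy, σz with downwind distance (real models use stability-class-dependent fits):

```python
import numpy as np

def plume_conc(Q, u, x, y, z, H, ay=0.08, az=0.06):
    """Gaussian plume with ground reflection. Q: source rate (g/s), u: wind
    speed (m/s), (x, y, z): receptor coordinates (m), H: effective stack
    height (m). sigma_y = ay*x, sigma_z = az*x is a simplifying assumption."""
    sy, sz = ay * x, az * x
    return (Q / (2.0 * np.pi * u * sy * sz)
            * np.exp(-y**2 / (2.0 * sy**2))
            * (np.exp(-(z - H)**2 / (2.0 * sz**2))
               + np.exp(-(z + H)**2 / (2.0 * sz**2))))
```

    Ground-level concentration is largest on the plume centerline (y = 0) and falls off symmetrically in the crosswind direction, which is what the neighbouring-city impact assessments above map out spatially.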

  10. Applying Dynamical Systems Theory to Optimize Libration Point Orbit Stationkeeping Maneuvers for WIND

    NASA Technical Reports Server (NTRS)

    Brown, Jonathan M.; Petersen, Jeremy D.

    2014-01-01

    NASA's WIND mission has been operating in a large amplitude Lissajous orbit in the vicinity of the interior libration point of the Sun-Earth/Moon system since 2004. Regular stationkeeping maneuvers are required to maintain the orbit due to the instability around the collinear libration points. Historically these stationkeeping maneuvers have been performed by applying an incremental change in velocity, or (delta)v along the spacecraft-Sun vector as projected into the ecliptic plane. Previous studies have shown that the magnitude of libration point stationkeeping maneuvers can be minimized by applying the (delta)v in the direction of the local stable manifold found using dynamical systems theory. This paper presents the analysis of this new maneuver strategy which shows that the magnitude of stationkeeping maneuvers can be decreased by 5 to 25 percent, depending on the location in the orbit where the maneuver is performed. The implementation of the optimized maneuver method into operations is discussed and results are presented for the first two optimized stationkeeping maneuvers executed by WIND.
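
    The key step, finding the local stable manifold direction from the linearized dynamics, amounts to taking the eigenvector associated with the stable eigenvalue. A toy 2D saddle stands in for the 6D libration-point dynamics here; the matrix is an illustrative assumption:

```python
import numpy as np

# Linearized dynamics about a collinear libration point (toy 2D saddle; the
# real problem is 6D). Stable direction = eigenvector with Re(lambda) < 0.
A = np.array([[0.0, 1.0],
              [4.0, 0.0]])
evals, evecs = np.linalg.eig(A)
stable = evecs[:, np.argmin(evals.real)]
stable = stable / np.linalg.norm(stable)

def along_stable(dv_mag):
    """Project a stationkeeping delta-v of the given magnitude onto the local
    stable direction instead of a fixed Sun-line direction."""
    return dv_mag * stable
```

    Applying the burn along this direction pushes the state onto the trajectory that naturally decays back toward the nominal orbit, which is why the paper reports 5 to 25 percent smaller maneuvers than the Sun-line strategy.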

  11. Quantitative Systems Pharmacology Approaches Applied to Microphysiological Systems (MPS): Data Interpretation and Multi-MPS Integration

    PubMed Central

    Yu, J; Cilfone, NA; Large, EM; Sarkar, U; Wishnok, JS; Tannenbaum, SR; Hughes, DJ; Lauffenburger, DA; Griffith, LG; Stokes, CL; Cirit, M

    2015-01-01

    Our goal in developing Microphysiological Systems (MPS) technology is to provide an improved approach for more predictive preclinical drug discovery via a highly integrated experimental/computational paradigm. Success will require quantitative characterization of MPSs and mechanistic analysis of experimental findings sufficient to translate resulting insights from in vitro to in vivo. We describe herein a systems pharmacology approach to MPS development and utilization that incorporates more mechanistic detail than traditional pharmacokinetic/pharmacodynamic (PK/PD) models. A series of studies illustrates diverse facets of our approach. First, we demonstrate two case studies, a PK data analysis and an inflammation response, each focused on a single MPS, the liver/immune MPS. Building on the single-MPS modeling, a theoretical investigation of a four-MPS interactome then provides a quantitative way to consider several pharmacological concepts such as absorption, distribution, metabolism, and excretion in the design of multi-MPS interactome operation and experiments. PMID:26535159
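
    A mechanistic flavor of the PK side can be shown with the classic one-compartment oral-dose (Bateman) model; all parameter values below are illustrative, not fitted MPS data:

```python
import numpy as np

def conc_oral(t, dose=100.0, F=0.9, V=40.0, ka=1.2, ke=0.2):
    """One-compartment oral-dose concentration (Bateman equation).
    dose (mg), F: bioavailability, V: volume (L), ka/ke: rate constants (1/h)."""
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 97)     # 15-minute grid over 24 h
c = conc_oral(t)
t_peak = t[np.argmax(c)]           # numeric Tmax; analytic: ln(ka/ke)/(ka-ke)
```

    The systems-pharmacology models in the paper replace lumped constants like ka and ke with mechanistic terms for each MPS compartment, but the fitting workflow starts from exactly this kind of structural model.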

  12. Method of error analysis for phase-measuring algorithms applied to photoelasticity.

    PubMed

    Quiroga, J A; González-Cano, A

    1998-07-10

    We present a method of error analysis for phase-measuring algorithms applied to photoelasticity. We calculate the contributions to the measurement error of the different elements of a circular polariscope as perturbations of the Jones matrices associated with each element. The Jones matrix of the real polariscope can then be calculated as the sum of the nominal matrix and a series of contributions that depend on the errors associated with each element separately. We apply this method to the analysis of phase-measuring algorithms for the determination of isoclinics and isochromatics, including comparisons with real measurements. PMID:18285900
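
    The perturbation idea, writing a real element's Jones matrix as the nominal matrix plus an error-dependent contribution, can be sketched for a single linear polarizer with a small angular misalignment ε (first-order expansion; the element and angle are illustrative choices, not the paper's polariscope):

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

def polarizer_perturbed(theta, eps):
    """First-order expansion M(theta + eps) ~ M(theta) + eps * dM/dtheta,
    mirroring the 'nominal matrix + error contribution' decomposition."""
    c, s = np.cos(theta), np.sin(theta)
    dM = np.array([[-2 * c * s, c * c - s * s],
                   [c * c - s * s, 2 * c * s]])
    return polarizer(theta) + eps * dM
```

    The residual between the exact misaligned matrix and the first-order model is O(ε²), so summing one such contribution per optical element gives the total error budget of the polariscope.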

  13. A review of the technology and process on integrated circuits failure analysis applied in communications products

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper introduces the failure analysis technology and process for integrated circuits used in communications products. Many techniques are available for failure analysis, including optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, microprobe analysis, electrical measurement, chemical etching, and ion etching. Integrated circuit failure analysis depends on accurate confirmation and analysis of the chip failure mode, the search for the root failure cause, the summary of the failure mechanism, and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the product yield can be improved.

  14. Applying new data-entropy and data-scatter methods for optical digital signal analysis

    NASA Astrophysics Data System (ADS)

    McMillan, N. D.; Egan, J.; Denieffe, D.; Riedel, S.; Tiernan, K.; McGowan, G.; Farrell, G.

    2005-06-01

    This paper introduces for the first time a numerical example of the data-entropy 'quality-budget' method. The paper builds on an earlier theoretical investigation into the application of this information-theory approach to opto-electronic system engineering. Currently the most widely used way of analysing such a system is the power budget. This established method cannot, however, integrate noise of different generic types, so the traditional power-budget approach cannot analyse a system with mixed noise types or provide a measure of signal quality. The data-entropy budget, first introduced by McMillan and Riedel, is able to handle diverse forms of noise. This is achieved by applying the dimensionless 'bit' measure in a quality-budget to integrate the analysis of all types of losses. This new approach therefore facilitates the assessment of both signal quality and power issues in a unified way. The software implementation of data-entropy has been utilised for testing on a fiber optic network. The results of various new quantitative data-entropy measures on the digital system are given and their utility discussed. A new data-mining technique known as data-scatter, also introduced by McMillan and Riedel, provides a useful visualisation of the relationships between data sets and is discussed. The paper ends by giving some perspective on future work in which the data-entropy technique, providing an objective difference measure on the signals, and the data-scatter technique, providing qualitative information on the signals, are integrated together for optical communication applications.
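
    The dimensionless 'bit' measure at the heart of the data-entropy budget is the Shannon entropy. A minimal sketch (the distributions are illustrative, not the paper's fiber-network data):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # 0 * log 0 -> 0 by convention
    return float(-np.sum(p * np.log2(p)))
```

    A noiseless binary symbol carries entropy_bits([0.5, 0.5]) = 1 bit; noise that biases the received distribution toward one symbol reduces the entropy, and a quality-budget can account such losses in the same bit units regardless of the noise's physical origin.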

  15. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  16. System Identification Applied to Dynamic CFD Simulation and Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.; Vicroy, Dan D.

    2011-01-01

    Demanding aerodynamic modeling requirements for military and civilian aircraft have provided impetus for researchers to improve computational and experimental techniques. Model validation is a key component for these research endeavors so this study is an initial effort to extend conventional time history comparisons by comparing model parameter estimates and their standard errors using system identification methods. An aerodynamic model of an aircraft performing one-degree-of-freedom roll oscillatory motion about its body axes is developed. The model includes linear aerodynamics and deficiency function parameters characterizing an unsteady effect. For estimation of unknown parameters two techniques, harmonic analysis and two-step linear regression, were applied to roll-oscillatory wind tunnel data and to computational fluid dynamics (CFD) simulated data. The model used for this study is a highly swept wing unmanned aerial combat vehicle. Differences in response prediction, parameter estimates, and standard errors are compared and discussed.
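
    Harmonic analysis of oscillatory-motion data reduces to linear regression on sine/cosine regressors, which yields the in-phase and out-of-phase model parameters. A minimal sketch on simulated roll-oscillation data (the signal, frequency, and noise level are invented, not the paper's wind tunnel or CFD data):

```python
import numpy as np

# Simulated roll-oscillation response: y(t) = A sin(wt) + B cos(wt) + noise
rng = np.random.default_rng(0)
w = 2.0                                   # oscillation frequency (rad/s)
t = np.linspace(0, 10, 500)
A_true, B_true = 0.8, -0.3                # in-phase / out-of-phase components
y = A_true * np.sin(w * t) + B_true * np.cos(w * t) \
    + 0.01 * rng.standard_normal(t.size)

# Harmonic analysis as least-squares regression on sin/cos regressors
X = np.column_stack([np.sin(w * t), np.cos(w * t)])
(A_hat, B_hat), *_ = np.linalg.lstsq(X, y, rcond=None)
```

    The covariance of the least-squares estimate gives the standard errors that the study compares between wind tunnel and CFD-derived models.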

  17. Applied Behavior Analysis: Its Impact on the Treatment of Mentally Retarded Emotionally Disturbed People.

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Coe, David A.

    1992-01-01

    This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)

  18. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  19. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  20. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  1. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  2. A systems analysis of longwall mining systems

    SciTech Connect

    Ramani, R.V.; Kovach, T.S.

    1983-03-01

    In many instances, the actual production from a longwall face does not equal the planned capacity. General problem areas which have been identified include face length and panel depth, face area and gate entry roof support, face conveyor and shearer wear, outby haulage system failures, number of entries in longwall gate development and longwall move time. The performance of a longwall face is totally dependent on the interaction of these and many other factors. To explore fully these interactions, a systems analysis of longwalls has been performed. In this paper, the systems approach, the factors considered in the analysis, and the results of the analysis are presented.

  3. Optical Image Analysis Applied to Pore Network Quantification of Sandstones Under Experimental CO2 Injection

    NASA Astrophysics Data System (ADS)

    Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.

    2015-12-01

    This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after the experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which we observed thin sections. Later, the methodological and experimental processes of the investigation focused on i) adjustment and calibration of OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers), using 7 images of the same mineral scene (6 in crossed polarizer and 1 in parallel polarizer); and iii) automated identification and segmentation of pores in 2D mineral images, generating applications by executable macros. Finally, once the procedure protocols had been established, the compiled data were interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) corroborates the descriptions obtained by SEM and microscopic techniques, namely an increase in porosity when CO2 treatment occurs. Automated image identification and quantification of minerals, pores and textures together with petrographic analysis can be applied to improve pore system characterization in sedimentary rocks. This research offers numerical
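
    The automated pore-segmentation step reduces, at its simplest, to thresholding and counting pixels. A minimal sketch on a synthetic grayscale image (the threshold value is an illustrative assumption; the paper's OIA protocol combines 7 polarization images per scene):

```python
import numpy as np

def porosity(gray, pore_threshold=60):
    """Fraction of pixels classified as pore (darker than the threshold).
    The threshold is an assumption; real OIA calibrates it per image set."""
    return float((np.asarray(gray) < pore_threshold).mean())
```

    Comparing the porosity of pre- and post-injection images of the same scene gives the kind of porosity-change estimate (≈ 2.5% here) that the study cross-checks against SEM observations.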

  4. A covariance analysis algorithm for interconnected systems

    NASA Technical Reports Server (NTRS)

    Cheng, Victor H. L.; Curley, Robert D.; Lin, Ching-An

    1987-01-01

    A covariance analysis algorithm for propagation of signal statistics in arbitrarily interconnected nonlinear systems is presented and applied to six-degree-of-freedom systems. The algorithm uses statistical linearization theory to linearize the nonlinear subsystems, and the resulting linearized subsystems are considered in the original interconnection framework for propagation of the signal statistics. Some nonlinearities commonly encountered in six-degree-of-freedom space-vehicle models are discussed to illustrate the limitations of the method, along with problems not encountered in standard deterministic simulation analysis. The performance of the algorithm is demonstrated numerically by comparing its results with Monte Carlo analysis results, both applied to a simple two-dimensional space-intercept problem.
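    As a sketch of the covariance-propagation idea underlying such algorithms (the matrices below are hypothetical, not the paper's six-degree-of-freedom model): a linearized subsystem x_{k+1} = A x_k + w_k propagates its covariance as P_{k+1} = A P_k Aᵀ + Q, which a Monte Carlo ensemble should reproduce.

```python
import numpy as np

# Hypothetical 2-state linear system x_{k+1} = A x_k + w_k, w_k ~ N(0, Q).
A = np.array([[1.0, 0.10],
              [0.0, 0.95]])
Q = np.diag([1e-4, 1e-3])
P = np.eye(2) * 0.01
steps = 50

# Analytical covariance propagation: P <- A P A^T + Q at each step.
P_an = P.copy()
for _ in range(steps):
    P_an = A @ P_an @ A.T + Q

# Monte Carlo check: propagate an ensemble of sampled states the same way.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(2), P, size=20000)
for _ in range(steps):
    X = X @ A.T + rng.multivariate_normal(np.zeros(2), Q, size=X.shape[0])
P_mc = np.cov(X.T)

print(np.round(P_an, 4))
print(np.round(P_mc, 4))
```

The two matrices agree to within sampling error, which is the comparison the paper carries out for the nonlinear case via statistical linearization.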

  5. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting to EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ² statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
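    Stern's rule mentioned above can be made concrete. The fit ranges and parameter count below are illustrative, not values from the paper:

```python
import math

# Hypothetical EXAFS fit window (illustrative values only).
k_min, k_max = 3.0, 12.0   # photoelectron wavenumber range, Angstrom^-1
r_min, r_max = 1.0, 3.0    # r-space fit range, Angstrom
n_params = 7               # number of floated fit parameters

# Stern's rule for the number of independent data points in the fit window.
n_idp = 2.0 * (k_max - k_min) * (r_max - r_min) / math.pi + 2.0

# Degrees of freedom available to the chi-square statistic.
dof = n_idp - n_params

# A statistically good fit has reduced chi-square near 1.
chi2 = 8.2                 # hypothetical chi-square from an r-space fit
chi2_reduced = chi2 / dof
print(round(n_idp, 2), round(dof, 2), round(chi2_reduced, 2))
```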

  6. Analysis of large power systems

    NASA Technical Reports Server (NTRS)

    Dommel, H. W.

    1975-01-01

    Computer-oriented power systems analysis procedures in the electric utilities are surveyed. The growth of electric power systems is discussed along with the solution of sparse network equations, power flow, and stability studies.

  7. Applying Systems Engineering to Implement an Evidence-based Intervention at a Community Health Center

    PubMed Central

    Tu, Shin-Ping; Feng, Sherry; Storch, Richard; Yip, Mei-Po; Sohng, HeeYon; Fu, Mingang; Chun, Alan

    2013-01-01

    Summary Impressive results in patient care and cost reduction have increased the demand for systems-engineering methodologies in large health care systems. This Report from the Field describes the feasibility of applying systems-engineering techniques at a community health center currently lacking the dedicated expertise and resources to perform these activities. PMID:23698657

  8. Using Microcomputers To Apply Statewide Standards for Schools and School Systems: Technological Changes over Five Years.

    ERIC Educational Resources Information Center

    Wu, Yi-Cheng; Hebbler, Stephen W.

    The Evaluation and Assessment Laboratory at the University of Alabama (Tuscaloosa) has contracted with the Georgia Department of Education (GDOE) to develop a microcomputer-based data management system for use in applying evaluation standards to schools and school systems. The Comprehensive Evaluation System (CES) was implemented statewide and has…

  9. Applied Systemic Theory and Educational Psychology: Can the Twain Ever Meet?

    ERIC Educational Resources Information Center

    Pellegrini, Dario W.

    2009-01-01

    This article reflects on the potential benefits of applying systemic theory to the work of educational psychologists (EPs). It reviews developments in systemic thinking over time, and discusses the differences between more directive "first order" versus collaborative "second order" approaches. It considers systemic theories and illustrates their…

  10. Quantum analysis applied to thermo field dynamics on dissipative systems

    SciTech Connect

    Hashizume, Yoichiro; Okamura, Soichiro; Suzuki, Masuo

    2015-03-10

    Thermo field dynamics is one of the formulations useful for treating statistical mechanics in the scheme of field theory. In the present study, we discuss dissipative thermo field dynamics of quantum damped harmonic oscillators. To treat the effective renormalization of quantum dissipation, we use the Suzuki-Takano approximation. Finally, we derive a dissipative von Neumann equation in the Lindblad form. In the present treatment, we can easily obtain the initial damping shown previously by Kubo.

  11. The Integrated Space Weather Analysis System

    NASA Astrophysics Data System (ADS)

    Maddox, M. M.; Mullinix, R. E.; Jain, P.; Berrios, D.; Hesse, M.; Rastaetter, L.; MacNeice, P. J.; Kuznetsova, M. M.; Taktakishvili, A.; Garneau, J. W.; Conti-Vock, J.

    2009-12-01

    The Integrated Space Weather Analysis System is a joint development project at NASA GSFC between the Space Weather Laboratory, Community Coordinated Modeling Center, Applied Engineering & Technology Directorate, and NASA HQ Office Of Chief Engineer. The iSWA system is a turnkey, web-based dissemination system for NASA-relevant space weather information that combines forecasts based on the most advanced space weather models with concurrent space environment information. A key design driver for the iSWA system is to generate and present vast amounts of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. This presentation will highlight several technical aspects of the iSWA system implementation including data collection methods, database design, customizable user interfaces, interactive system components, and innovative displays of quantitative information.

  12. Quasi-dynamic Material Flow Analysis applied to the Austrian Phosphorus cycle

    NASA Astrophysics Data System (ADS)

    Zoboli, Ottavia; Rechberger, Helmut

    2013-04-01

    Phosphorus (P) is one of the key elements that sustain life on earth and that allow achieving the current high levels of food production worldwide. It is a non-renewable resource, without any existing substitute. Because of its current dissipative use by mankind and its very slow geochemical cycle, this resource is rapidly depleting, and it is strongly connected to the problem of ensuring food security. Moreover, P is also associated with important environmental problems. Its extraction often generates hazardous wastes, while its accumulation in water bodies can lead to eutrophication, with consequent severe ecological damages. It is therefore necessary to analyze and understand in detail the system of P, in regard to its use and management, to identify the processes that should be targeted in order to reduce the overall consumption of this resource. This work aims at establishing a generic quasi-dynamic model, which describes the Austrian P-budget and which allows investigating trends of P use in the past as well as selected future scenarios. Given the importance of P throughout the whole anthropogenic metabolism, the model is based on a comprehensive system that encompasses several economic sectors, from agriculture and animal husbandry to industry, consumption and waste and wastewater treatment. Furthermore it includes the hydrosphere, to assess the losses of P into water bodies, due to the importance of eutrophication problems. The methodology applied is Material Flow Analysis (MFA), which is a systemic approach to assess and balance the stocks and flows of a material within a system defined in space and time. Moreover, the model is implemented in the software STAN, freeware tailor-made for MFA. Particular attention is paid to the characteristics and the quality of the data, in order to include data uncertainty and error propagation in the dynamic balance.
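    The core MFA operations, balancing flows against stock changes and propagating data uncertainty, can be sketched as follows. The flow names and figures are hypothetical placeholders, not the Austrian budget:

```python
import math

# Minimal material-flow balance for a single process (figures in kt P/yr are
# hypothetical; the actual model links many such processes across sectors in STAN).
inflows = {"mineral_fertilizer": 16.0, "feed_imports": 5.0, "detergents": 1.5}
outflows = {"crop_harvest": 12.0, "runoff_to_hydrosphere": 2.5, "erosion": 1.0}

def stock_change(inf, outf):
    """Mass balance: net accumulation in the process stock."""
    return sum(inf.values()) - sum(outf.values())

delta_stock = stock_change(inflows, outflows)

# Gaussian error propagation for the balance, assuming independent flow errors.
sigma_in = {"mineral_fertilizer": 1.2, "feed_imports": 0.6, "detergents": 0.2}
sigma_out = {"crop_harvest": 1.0, "runoff_to_hydrosphere": 0.5, "erosion": 0.3}
sigma_delta = math.sqrt(sum(s ** 2 for s in sigma_in.values()) +
                        sum(s ** 2 for s in sigma_out.values()))
print(delta_stock, round(sigma_delta, 3))
```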

  13. Linear digital imaging system fidelity analysis

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.

    1989-01-01

    The combined effects of image gathering, sampling, and reconstruction are analyzed in terms of image fidelity. The analysis is based upon a standard end-to-end linear system model which is sufficiently general so that the results apply to most line-scan and sensor-array imaging systems. Shift-variant sampling effects are accounted for with an expected value analysis based upon the use of a fixed deterministic input scene which is randomly shifted (mathematically) relative to the sampling grid. This random sample-scene phase approach has been used successfully by the author and associates in several previous related papers.

  14. Surface Analysis System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A variety of surface analysis techniques are used for the qualitative and quantitative evaluation of material surfaces at an atomic level. This work is accomplished in a pristine ultrahigh vacuum environment to eliminate interference.

  15. Analysis of some meteorological variables time series relevant in urban environments by applying the multifractal analysis

    NASA Astrophysics Data System (ADS)

    Pavon-Dominguez, Pablo; Ariza-Villaverde, Ana B.; Jimenez-Hornero, Francisco J.; Gutierrez de Rave, Eduardo

    2010-05-01

    surface wind direction and mean temperature according to the accumulation of points in the extremes of the spectra right tails. This confirms that knowledge of the relationships between the multifractal parameters helps to complete the information regarding the influence of rare values in the time series. With respect to the relationships between the parameters of the multifractal spectra and those calculated from the descriptive statistics for the meteorological variables considered here, a strong correlation was detected between the rare high values, represented by the extreme points in the spectra left tails, and the leptokurtic shape of the frequency distributions. In addition, for the same rare high values a significant negative correlation with the coefficients of variation could be verified. The spectra left tails, corresponding to high values in the time series, exhibited greater amplitudes for those variables whose distributions showed higher dispersion and positive coefficients of skewness. Multifractal analysis has shown itself to be a suitable and efficient approach to characterizing the most important meteorological variables affecting city environments, providing information that can be applied to increase knowledge of urban climate dynamics.

  16. Applied Koopmanism

    NASA Astrophysics Data System (ADS)

    Budišić, Marko; Mohr, Ryan; Mezić, Igor

    2012-12-01

    A majority of methods from dynamical system analysis, especially those in applied settings, rely on Poincaré's geometric picture that focuses on "dynamics of states." While this picture has fueled our field for a century, it has shown difficulties in handling high-dimensional, ill-described, and uncertain systems, which are more and more common in engineered systems design and analysis of "big data" measurements. This overview article presents an alternative framework for dynamical systems, based on the "dynamics of observables" picture. The central object is the Koopman operator: an infinite-dimensional, linear operator that is nonetheless capable of capturing the full nonlinear dynamics. The first goal of this paper is to make it clear how methods that appeared in different papers and contexts all relate to each other through spectral properties of the Koopman operator. The second goal is to present these methods in a concise manner in an effort to make the framework accessible to researchers who would like to apply them, but also, expand and improve them. Finally, we aim to provide a road map through the literature where each of the topics was described in detail. We describe three main concepts: Koopman mode analysis, Koopman eigenquotients, and continuous indicators of ergodicity. For each concept, we provide a summary of theoretical concepts required to define and study them, numerical methods that have been developed for their analysis, and, when possible, applications that made use of them. The Koopman framework is showing potential for crossing over from academic and theoretical use to industrial practice. Therefore, the paper highlights its strengths in applied and numerical contexts. Additionally, we point out areas where an additional research push is needed before the approach is adopted as an off-the-shelf framework for analysis and design.

  17. Applied Koopmanism.

    PubMed

    Budisić, Marko; Mohr, Ryan; Mezić, Igor

    2012-12-01

    A majority of methods from dynamical system analysis, especially those in applied settings, rely on Poincaré's geometric picture that focuses on "dynamics of states." While this picture has fueled our field for a century, it has shown difficulties in handling high-dimensional, ill-described, and uncertain systems, which are more and more common in engineered systems design and analysis of "big data" measurements. This overview article presents an alternative framework for dynamical systems, based on the "dynamics of observables" picture. The central object is the Koopman operator: an infinite-dimensional, linear operator that is nonetheless capable of capturing the full nonlinear dynamics. The first goal of this paper is to make it clear how methods that appeared in different papers and contexts all relate to each other through spectral properties of the Koopman operator. The second goal is to present these methods in a concise manner in an effort to make the framework accessible to researchers who would like to apply them, but also, expand and improve them. Finally, we aim to provide a road map through the literature where each of the topics was described in detail. We describe three main concepts: Koopman mode analysis, Koopman eigenquotients, and continuous indicators of ergodicity. For each concept, we provide a summary of theoretical concepts required to define and study them, numerical methods that have been developed for their analysis, and, when possible, applications that made use of them. The Koopman framework is showing potential for crossing over from academic and theoretical use to industrial practice. Therefore, the paper highlights its strengths in applied and numerical contexts. Additionally, we point out areas where an additional research push is needed before the approach is adopted as an off-the-shelf framework for analysis and design. PMID:23278096
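    The Koopman mode analysis surveyed above is commonly approximated numerically by Dynamic Mode Decomposition (DMD). A minimal sketch on synthetic snapshots of a known linear map, so the recovered Koopman eigenvalues can be checked directly (the matrix and data are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical dynamics with eigenvalues 0.9 +/- 0.2i.
A_true = np.array([[0.9, -0.2],
                   [0.2,  0.9]])

# Generate a trajectory of snapshots from the known linear map.
rng = np.random.default_rng(1)
x = rng.standard_normal(2)
snapshots = [x]
for _ in range(30):
    x = A_true @ x
    snapshots.append(x)
X = np.column_stack(snapshots[:-1])   # states at time k
Y = np.column_stack(snapshots[1:])    # states at time k+1

# Exact DMD: least-squares operator Y ~ A X via the pseudoinverse.
A_dmd = Y @ np.linalg.pinv(X)
eigvals = np.linalg.eigvals(A_dmd)    # approximate Koopman eigenvalues
print(np.sort_complex(eigvals).round(6))
```

Because the data here are exactly linear, DMD recovers the spectrum of A_true to machine precision; for nonlinear systems the same computation, applied to a richer set of observables, approximates the Koopman spectrum.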

  18. Risk-informed criticality analysis as applied to waste packages subject to a subsurface igneous intrusion

    NASA Astrophysics Data System (ADS)

    Kimball, Darby Suzan

    Practitioners of many branches of nuclear facility safety use probabilistic risk assessment (PRA) methodology, which evaluates the reliability of a system along with the consequences of various failure states. One important exception is nuclear criticality safety, which traditionally produces binary results (critical or subcritical, based upon the value of the effective multiplication factor, keff). For complex systems, criticality safety can benefit from application of the more flexible PRA techniques. A new risk-based technique in criticality safety analysis is detailed. In addition to identifying the most reactive configuration(s) and determining subcriticality, it yields more information about the relative reactivity contributions of various factors. By analyzing a more complete system, confidence that the system will remain subcritical is increased and areas where additional safety features would be most effective are indicated. The first step in the method is to create a criticality event tree (a specialized form of event tree where multiple outcomes stemming from a single event are acceptable). The tree lists events that impact reactivity by changing a system parameter. Next, the value of keff is calculated for the end states using traditional methods like the MCNP code. As calculations progress, the criticality event tree is modified; event branches demonstrated to have little effect on reactivity may be collapsed (thus reducing the total number of criticality runs), and branches may be added if more information is needed to characterize the system. When the criticality event tree is mature, critical limits are determined according to traditional validation techniques. Finally, results are evaluated. Criticality for the system is determined by comparing the value of keff for each end state to the critical limit derived for those cases. The relative contributions of various events to criticality are identified by comparing end states resulting from different
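    The event-tree bookkeeping described above can be sketched as follows. The events, reactivity increments, and critical limit are hypothetical placeholders for values that a code such as MCNP and a validation study would supply:

```python
from itertools import product

# Each event branches the system configuration; every end state gets a k_eff
# and is compared against a validated critical limit. All numbers hypothetical.
events = {
    "moderator_intrusion": ["dry", "flooded"],
    "geometry": ["intact", "degraded"],
    "absorber": ["present", "lost"],
}

# Hypothetical per-branch reactivity increments added to a base k_eff
# (a real analysis would compute each end state with a transport code).
delta_k = {"dry": 0.00, "flooded": 0.15, "intact": 0.00,
           "degraded": 0.04, "present": 0.00, "lost": 0.06}
k_base = 0.70
critical_limit = 0.94

# Enumerate every end state of the criticality event tree.
end_states = {}
for combo in product(*events.values()):
    end_states[combo] = k_base + sum(delta_k[b] for b in combo)

worst = max(end_states, key=end_states.get)
subcritical = all(k < critical_limit for k in end_states.values())
print(worst, round(end_states[worst], 3), subcritical)
```

Beyond the binary answer, the per-branch increments show which events dominate reactivity, which is the extra information the risk-informed method is after.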

  19. Appliance of Independent Component Analysis to System Intrusion Analysis

    NASA Astrophysics Data System (ADS)

    Ishii, Yoshikazu; Takagi, Tarou; Nakai, Kouji

    In order to analyze the output of the intrusion detection system and the firewall, we evaluated the applicability of ICA (independent component analysis). We developed a simulator for evaluation of intrusion analysis methods. The simulator consists of the network model of an information system, the service model and the vulnerability model of each server, and the action models performed by clients and the intruder. We applied ICA to analyze the audit trail of the simulated information system. We report the evaluation results of ICA on intrusion analysis. In the simulated case, ICA separated two attacks correctly, and related an attack to the anomalies of a normal application produced under the influence of that attack.
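    A minimal illustration of the separation idea, not the authors' simulator: whiten two mixed signals, then search for the rotation that maximizes non-Gaussianity. This is a simplified stand-in for a full ICA implementation, with synthetic signals standing in for audit-trail features:

```python
import numpy as np

# Two hypothetical source signals: a square wave (an "attack" pattern) and
# uniform noise ("normal" activity), observed only as linear mixtures.
rng = np.random.default_rng(2)
n = 5000
s1 = np.sign(np.sin(np.linspace(0, 40 * np.pi, n)))
s2 = rng.uniform(-1, 1, n)
A_mix = np.array([[1.0, 0.6], [0.4, 1.0]])   # hypothetical mixing matrix
X = A_mix @ np.vstack([s1, s2])

# Whitening via eigendecomposition of the sample covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

def kurt(y):
    """Excess kurtosis of a unit-variance signal (non-Gaussianity measure)."""
    return np.mean(y ** 4) - 3.0

# In 2D, whitened sources differ from Z only by a rotation: grid-search it.
angles = np.linspace(0.0, np.pi, 400)
best = max(angles, key=lambda t: abs(kurt(np.cos(t) * Z[0] + np.sin(t) * Z[1])))
y1 = np.cos(best) * Z[0] + np.sin(best) * Z[1]

# The recovered component should match one source up to sign and scale.
corr = max(abs(np.corrcoef(y1, s1)[0, 1]), abs(np.corrcoef(y1, s2)[0, 1]))
print(round(corr, 3))
```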

  20. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  1. Performance Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2005-01-01

    In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. In this paper, an FTC analysis framework is provided to calculate the upper bound of an induced-L(sub 2) norm of an FTC system in the presence of false identification and detection time delay. The upper bound is written as a function of a fault detection time and exponential decay rates and has been used to determine which FTC law produces less performance degradation (tracking error) due to false identification. The analysis framework is applied to an FTC system of a HiMAT (Highly Maneuverable Aircraft Technology) vehicle. Index Terms: fault tolerant control system; linear parameter varying system; HiMAT vehicle.

  2. Laser rocket system analysis

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The laser rocket systems investigated in this study were for orbital transportation using space-based, ground-based and airborne laser transmitters. The propulsion unit of these systems utilizes a continuous wave (CW) laser beam focused into a thrust chamber which initiates a plasma in the hydrogen propellant, thus heating the propellant and providing thrust through a suitably designed nozzle and expansion skirt. The specific impulse is limited only by the ability to adequately cool the thruster and the amount of laser energy entering the engine. The results of the study showed that, with advanced technology, laser rocket systems with either a space- or ground-based laser transmitter could reduce the national budget allocated to space transportation by 10 to 345 billion dollars over a 10-year life cycle when compared to advanced chemical propulsion systems (LO2-LH2) of equal capability. The variation in savings depends upon the projected mission model.

  3. Applying the Earth System Grid Security System in a Heterogeneous Environment of Data Access Services

    NASA Astrophysics Data System (ADS)

    Kershaw, Philip; Lawrence, Bryan; Lowe, Dominic; Norton, Peter; Pascoe, Stephen

    2010-05-01

    CEDA (Centre for Environmental Data Archival) based at STFC Rutherford Appleton Laboratory is host to the BADC (British Atmospheric Data Centre) and NEODC (NERC Earth Observation Data Centre) with data holdings of over half a Petabyte. In the coming months this figure is set to increase by over one Petabyte through the BADC's role as one of three data centres to host the CMIP5 (Coupled Model Intercomparison Project Phase 5) core archive of climate model data. Quite apart from the problem of managing the storage of such large volumes there is the challenge of collating the data together from the modelling centres around the world and enabling access to these data for the user community. An infrastructure to support this is being developed under the US Earth System Grid (ESG) and related projects, bringing participating organisations together in a federation. The ESG architecture defines Gateways, the web interfaces that enable users to access data, and data serving applications organised into Data Nodes. The BADC has been working in collaboration with the US Earth System Grid team and other partners to develop a security system to restrict access to data. This provides single sign-on via both OpenID and PKI based means and uses role based authorisation facilitated by SAML and OpenID based interfaces for attribute retrieval. This presentation will provide an overview of the access control architecture and look at how this has been implemented for CEDA. CEDA has developed an expertise in data access and information services over several years through a number of projects to develop and enhance these capabilities. Participation in CMIP5 comes at a time when a number of other software development activities are coming to fruition. New services are in the process of being deployed alongside services making up the system for ESG. The security system must apply access control across this heterogeneous environment of different data services and technologies.
One strand

  4. Power Plant Systems Analysis

    NASA Technical Reports Server (NTRS)

    Williams, J. R.; Yang, Y. Y.

    1973-01-01

    Three basic thermodynamic cycles of advanced nuclear MHD power plant systems are studied. The effect of reactor exit temperature and space radiator temperature on the overall thermal efficiency of a regenerative turbine compressor power plant system is shown. The effect of MHD pressure ratio on plant efficiency is also described, along with the dependence of MHD power output, compressor power requirement, turbine power output, mass flow rate of H2, and overall plant efficiency on the reactor exit temperature for a specific configuration.

  5. Thorough approach to measurement uncertainty analysis applied to immersed heat exchanger testing

    SciTech Connect

    Farrington, R B; Wells, C V

    1986-04-01

    This paper discusses the value of an uncertainty analysis, explains how to determine measurement uncertainty, and then details the sources of error in instrument calibration, data acquisition, and data reduction for a particular experiment. Methods are discussed to determine both the systematic (or bias) error and the random (or precision) error in an experiment. The detailed analysis is applied to two sets of conditions in measuring the effectiveness of an immersed coil heat exchanger. It shows the value of such analysis as well as an approach to reduce overall measurement uncertainty and to improve the experiment. This paper outlines how to perform an uncertainty analysis and then provides a detailed example of how to apply the methods discussed in the paper. The authors hope this paper will encourage researchers and others to become more concerned with their measurement processes and to report measurement uncertainty with all of their test results.
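    The bias/precision split described above is conventionally combined into a total uncertainty by root-sum-square. A small sketch with hypothetical numbers (the coverage factor of 2.0 and the percentages are illustrative, not values from the heat-exchanger test):

```python
import math

def total_uncertainty(bias, precision, t_stat=2.0):
    """Root-sum-square combination: U = sqrt(B^2 + (t*S)^2)
    for bias limit B and precision index S (t_stat is the coverage factor)."""
    return math.sqrt(bias ** 2 + (t_stat * precision) ** 2)

# Example: a flow-rate measurement with a 0.5% bias limit and a
# 0.3% precision index (both hypothetical).
U = total_uncertainty(bias=0.5, precision=0.3)
print(round(U, 3))
```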

  6. Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Pease, P. B.

    1989-01-01

    Version 4.1 of LAS provides flexible framework for algorithm development and processing and analysis of image data. Over 500,000 lines of code enable image repair, clustering, classification, film processing, geometric registration, radiometric correction, and manipulation of image statistics.

  7. Linear covariance analysis for gimbaled pointing systems

    NASA Astrophysics Data System (ADS)

    Christensen, Randall S.

    Linear covariance analysis has been utilized in a wide variety of applications. Historically, the theory has made significant contributions to navigation system design and analysis. More recently, the theory has been extended to capture the combined effect of navigation errors and closed-loop control on the performance of the system. These advancements have made possible rapid analysis and comprehensive trade studies of complicated systems ranging from autonomous rendezvous to vehicle ascent trajectory analysis. Comprehensive trade studies are also needed in the area of gimbaled pointing systems where the information needs are different from previous applications. It is therefore the objective of this research to extend the capabilities of linear covariance theory to analyze the closed-loop navigation and control of a gimbaled pointing system. The extensions developed in this research include modifying the linear covariance equations to accommodate a wider variety of controllers. This enables the analysis of controllers common to gimbaled pointing systems, with internal states and associated dynamics as well as actuator command filtering and auxiliary controller measurements. The second extension is the extraction of power spectral density estimates from information available in linear covariance analysis. This information is especially important to gimbaled pointing systems where not just the variance but also the spectrum of the pointing error impacts the performance. The extended theory is applied to a model of a gimbaled pointing system which includes both flexible and rigid body elements as well as input disturbances, sensor errors, and actuator errors. The results of the analysis are validated by direct comparison to a Monte Carlo-based analysis approach. Once the developed linear covariance theory is validated, analysis techniques that are often prohibitive with Monte Carlo analysis are used to gain further insight into the system.
These include the creation

  8. Exergy Analysis of Rocket Systems

    NASA Technical Reports Server (NTRS)

    Gilbert, Andrew; Mesmer, Bryan; Watson, Michael D.

    2015-01-01

    Exergy is defined as the useful work available from a system in a specified environment. Exergy analysis allows for comparison between different system designs, and allows for comparison of subsystem efficiencies within system designs. The proposed paper explores the relationship between the fundamental rocket equation and an exergy balance equation. A previously derived exergy equation related to rocket systems is investigated, and a higher fidelity analysis will be derived. The exergy assessments will enable informed, value-based decision making when comparing alternative rocket system designs, and will allow the most efficient configuration among candidate configurations to be determined.
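    As a sketch of how the rocket equation can enter such an exergy assessment (the masses, specific impulse, and specific exergy below are hypothetical, and the efficiency ratio is an illustrative definition rather than the paper's derivation):

```python
import math

def delta_v(isp, m0, mf, g0=9.80665):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp * g0 * math.log(m0 / mf)

# Hypothetical single-stage figures.
m0, mf = 500e3, 120e3   # initial / burnout mass, kg
isp = 450.0             # vacuum specific impulse, s (LO2/LH2-class)
dv = delta_v(isp, m0, mf)

# Crude "exergy efficiency": kinetic energy delivered to the burnout mass
# divided by the chemical exergy expended in propellant (illustrative metric).
e_prop = 13.4e6         # hypothetical specific chemical exergy, J/kg
eta = (0.5 * mf * dv ** 2) / ((m0 - mf) * e_prop)
print(round(dv, 1), round(eta, 3))
```

Ratios of this kind are what allow candidate configurations to be ranked on a common useful-work basis.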

  9. The Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Lu, Yun-Chi; Irani, Fred M.

    1991-01-01

    The Land Analysis System (LAS) is an interactive software system, available in the public domain, for the analysis, display, and management of multispectral and other digital image data. The system was developed to support earth sciences research and development activities. LAS provides over 240 applications functions and utilities, a flexible user interface, complete on-line and hardcopy documentation, extensive image data file management, reformatting, and conversion utilities, and high level device independent access to image display hardware. The capabilities are summarized of the latest release of the system (version 5). Emphasis is given to the system portability and the isolation of hardware and software dependencies in this release.

  10. Miniaturized flow injection analysis system

    DOEpatents

    Folta, J.A.

    1997-07-01

    A chemical analysis technique known as flow injection analysis is described, wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38×25×3 mm, but can be designed for gas analysis and be substantially smaller in construction. 9 figs.

  11. Miniaturized flow injection analysis system

    DOEpatents

    Folta, James A.

    1997-01-01

A chemical analysis technique known as flow injection analysis is described, wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38×25×3 mm, but can be designed for gas analysis and be substantially smaller in construction.

  12. 30 CFR 260.111 - What conditions apply to the bidding systems that MMS uses?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What conditions apply to the bidding systems that MMS uses? 260.111 Section 260.111 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OUTER CONTINENTAL SHELF OIL AND GAS LEASING Bidding Systems General Provisions § 260.111...

  13. Design of multivariable feedback control systems via spectral assignment. [as applied to aircraft flight control

    NASA Technical Reports Server (NTRS)

    Liberty, S. R.; Mielke, R. R.; Tung, L. J.

    1981-01-01

    Applied research in the area of spectral assignment in multivariable systems is reported. A frequency domain technique for determining the set of all stabilizing controllers for a single feedback loop multivariable system is described. It is shown that decoupling and tracking are achievable using this procedure. The technique is illustrated with a simple example.

  14. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  15. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  16. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  17. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  18. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context. PMID:22478522

  19. Evolution of Applied Behavior Analysis in the Treatment of Individuals With Autism

    ERIC Educational Resources Information Center

    Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.

    2005-01-01

    Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…

  20. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  1. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    ERIC Educational Resources Information Center

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  2. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    ERIC Educational Resources Information Center

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  3. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  4. The pyramid system for multiscale raster analysis

    USGS Publications Warehouse

    De Cola, L.; Montagne, N.

    1993-01-01

Geographical research requires the management and analysis of spatial data at multiple scales. As part of the U.S. Geological Survey's global change research program a software system has been developed that reads raster data (such as an image or digital elevation model) and produces a pyramid of aggregated lattices as well as various measurements of spatial complexity. For a given raster dataset the system uses the pyramid to report: (1) mean, (2) variance, (3) a spatial autocorrelation parameter based on multiscale analysis of variance, and (4) a monofractal scaling parameter based on the analysis of isoline lengths. The system is applied to 1-km digital elevation model (DEM) data for a 256-km² region of central California, as well as to 64 partitions of the region. PYRAMID, which offers robust descriptions of data complexity, is also used to describe the behavior of topographic aspect with scale. © 1993.
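As a sketch of the aggregation idea described above, the following Python fragment builds a pyramid by 2×2 block averaging and reports the mean and variance at each level. The function name and interface are illustrative assumptions, not taken from the USGS PYRAMID software:

```python
import numpy as np

def pyramid_stats(raster, levels=4):
    """Aggregate a raster by 2x2 block averaging and report mean and
    variance at each pyramid level (illustrative sketch only)."""
    stats = []
    grid = np.asarray(raster, dtype=float)
    for level in range(levels):
        stats.append({"level": level,
                      "shape": grid.shape,
                      "mean": grid.mean(),
                      "var": grid.var()})
        if min(grid.shape) < 2:
            break  # cannot aggregate a 1-cell lattice further
        h, w = (grid.shape[0] // 2) * 2, (grid.shape[1] // 2) * 2
        # Average each 2x2 block into one cell of the next coarser lattice.
        grid = grid[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return stats
```

Note that the mean is invariant under block averaging while the variance shrinks at coarser levels; the rate of that shrinkage is what multiscale analysis of variance exploits.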

  5. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
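One topic the TP raises, how many runs are needed to verify a requirement with consumer risk included, can be illustrated with the standard zero-failure (success-run) formula: the smallest N with (1 - p)^N ≤ β demonstrates failure probability below p at consumer risk β. This is a generic sketch of that textbook result, not the TP's own derivation:

```python
import math
import random

def runs_for_zero_failures(p_req, consumer_risk):
    """Smallest N such that zero failures in N independent runs demonstrates
    failure probability < p_req at the given consumer risk (success-run rule)."""
    return math.ceil(math.log(consumer_risk) / math.log(1.0 - p_req))

def monte_carlo_failure_rate(trial, n_runs, seed=0):
    """Run a trial function n_runs times and return the observed failure
    fraction; 'trial' takes an RNG and returns True on failure."""
    rng = random.Random(seed)
    failures = sum(trial(rng) for _ in range(n_runs))
    return failures / n_runs
```

For example, demonstrating a failure probability below 1% at 10% consumer risk requires 230 consecutive successful runs.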

  6. Canister Model, Systems Analysis

    Energy Science and Technology Software Center (ESTSC)

    1993-09-29

This package provides a computer simulation of a systems model for packaging nuclear waste and spent nuclear fuel in canisters. The canister model calculates overall programmatic cost, number of canisters, and fuel and waste inventories for the Idaho Chemical Processing Plant (other initial conditions can be entered).

  7. WASTE COMBUSTION SYSTEM ANALYSIS

    EPA Science Inventory

    The report gives results of a study of biomass combustion alternatives. The objective was to evaluate the thermal performance and costs of available and developing biomass systems. The characteristics of available biomass fuels were reviewed, and the performance parameters of alt...

  8. Systems analysis-independent analysis and verification

    SciTech Connect

    Badin, J.S.; DiPietro, J.P.

    1995-09-01

The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  9. Operationalizing sustainability in urban coastal systems: a system dynamics analysis.

    PubMed

    Mavrommati, Georgia; Bithas, Kostas; Panayiotidis, Panayiotis

    2013-12-15

We propose a system dynamics approach for Ecologically Sustainable Development (ESD) in urban coastal systems. A systematic analysis based on theoretical considerations, policy analysis and experts' knowledge is followed in order to define the concept of ESD. The principles underlying ESD feed the development of a System Dynamics Model (SDM) that connects the pollutant loads produced by urban systems' socioeconomic activities with the ecological condition of the coastal ecosystem, which is delineated in operational terms through key biological elements defined by the EU Water Framework Directive. The receiving waters of the Athens Metropolitan area, which bear the elements of a typical high-population-density Mediterranean coastal city but currently also exhibit new dynamics induced by the ongoing financial crisis, are used as an experimental system for testing a system dynamics approach to apply the concept of ESD. Systems thinking is employed to represent the complex relationships among the components of the system. Interconnections and dependencies that determine the potentials for achieving ESD are revealed. The proposed system dynamics analysis can help decision makers define paths of development that comply with the principles of ESD. PMID:24200010
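A minimal stock-flow sketch conveys the kind of system dynamics model described above: socioeconomic activity generates a pollutant load, the coastal waters assimilate part of it, and ecological status degrades with the accumulated stock. All names and coefficients here are illustrative assumptions, not the authors' SDM:

```python
def simulate(steps=50, dt=1.0, activity=1.0, treatment=0.3, decay=0.1):
    """Euler-integrate a toy pollutant stock and a derived ecological
    status in [0, 1]; purely illustrative of the stock-flow idea."""
    load = 0.0                                   # pollutant stock in the coastal system
    history = []
    for _ in range(steps):
        inflow = activity * (1.0 - treatment)    # load reaching the coast after treatment
        load += dt * (inflow - decay * load)     # stock integrates net flow
        condition = max(0.0, 1.0 - 0.05 * load)  # status degrades with the stock
        history.append((load, condition))
    return history
```

The stock approaches the equilibrium inflow/decay (here 0.7/0.1 = 7), so the sketch also shows why cutting the inflow, rather than waiting on assimilation, is the lever for ecological status.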

  10. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    SciTech Connect

    Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
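The core idea behind ICA as a BSS method, whiten the data and then rotate to maximize non-Gaussianity, can be sketched for two channels as follows. This toy kurtosis-based angle search is illustrative only; an actual analysis would use a full ICA implementation such as FastICA:

```python
import numpy as np

def unmix_two(x):
    """Toy blind source separation for two mixed channels: PCA whitening
    followed by a rotation search maximizing total |excess kurtosis|."""
    x = x - x.mean(axis=1, keepdims=True)
    vals, vecs = np.linalg.eigh(np.cov(x))
    white = np.diag(vals ** -0.5) @ vecs.T @ x       # whitened (unit-covariance) data
    best, best_score = None, -np.inf
    for theta in np.linspace(0.0, np.pi / 2, 181):   # 0.5-degree rotation grid
        c, s = np.cos(theta), np.sin(theta)
        y = np.array([[c, -s], [s, c]]) @ white
        kurt = np.mean(y ** 4, axis=1) - 3.0         # excess kurtosis per channel
        score = np.sum(np.abs(kurt))                 # non-Gaussianity contrast
        if score > best_score:
            best, best_score = y, score
    return best
```

Recovered components match the true sources only up to permutation, sign, and scale, which is the usual ICA ambiguity.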

  11. Energy-Systems Economic Analysis

    NASA Technical Reports Server (NTRS)

    Doane, J.; Slonski, M. L.; Borden, C. S.

    1982-01-01

    Energy Systems Economic Analysis (ESEA) program is flexible analytical tool for rank ordering of alternative energy systems. Basic ESEA approach derives an estimate of those costs incurred as result of purchasing, installing and operating an energy system. These costs, suitably aggregated into yearly costs over lifetime of system, are divided by expected yearly energy output to determine busbar energy costs. ESEA, developed in 1979, is written in FORTRAN IV for batch execution.
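The busbar-cost calculation described, yearly costs over the system lifetime divided by expected yearly energy output, can be sketched as follows. The capital-recovery-factor formulation and all parameter names are generic assumptions, not the ESEA FORTRAN code:

```python
def busbar_cost(capital, annual_om, lifetime_years, discount_rate, annual_kwh):
    """Levelized busbar energy cost ($/kWh): annualize purchase and
    installation costs with a capital recovery factor, add yearly O&M,
    divide by expected yearly energy output. Generic sketch only."""
    r, n = discount_rate, lifetime_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)  # capital recovery factor
    yearly_cost = capital * crf + annual_om       # aggregated yearly cost
    return yearly_cost / annual_kwh               # cost per kWh at the busbar
```

Because every alternative reduces to a single $/kWh figure, systems with different cost structures can be rank ordered directly, which is the point of the ESEA approach.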

  12. Integrated systems analysis of the PIUS reactor

    SciTech Connect

    Fullwood, F.; Kroeger, P.; Higgins, J.

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  13. Quantitative tools for comparing animal communication systems: information theory applied to bottlenose dolphin whistle repertoires.

    PubMed

McCowan; Hanser; Doyle

    1999-02-01

Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has been generally hampered by both a lack of sufficiently extensive data sets and appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory to the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin (Tursiops truncatus) whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf 1949, Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it potentially can be used to compare the organizational complexity of vocal repertoires across a diversity of species. Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary
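The first-order statistics the paper applies, Shannon entropy and the slope of a Zipf-type log frequency versus log rank diagram, can be computed for any categorized repertoire along these lines (a generic sketch, not the authors' code):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """First-order Shannon entropy (bits) of a categorized repertoire."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_slope(symbols):
    """Least-squares slope of log(frequency) vs. log(rank), the Zipf-type
    statistic used as a crude indicator of repertoire structure."""
    freqs = sorted(Counter(symbols).values(), reverse=True)
    xs = [math.log(rank + 1) for rank in range(len(freqs))]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A slope near -1 is the classic Zipf signature; the paper's higher-order entropies extend the same counting idea to symbol sequences rather than single symbols.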

  14. Preliminary Centaur Systems Analysis

    NASA Technical Reports Server (NTRS)

    Maronde, R. G.; Holmes, J. K.; Iwasaki, R. S.

    1981-01-01

    The Centaur is stored in the Orbiter payload bay on the Centaur Integrated Support System (CISS). The CISS not only cradles the Centaur prior to deployment but also provides any signal conditioning required to make the Centaur/Orbiter hardwire interfaces compatible. In addition, the CISS provides other Centaur functions such as controlling all the avionics safety features and providing all the helium supplies for tank pressurizations. Problems associated with a Centaur design concept using a transponder and two switchable antennas are defined. Solutions to these problems are presented.

  15. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS)

    EPA Science Inventory

    The Exposure Analysis Modeling System (EXAMS), first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals--pesti...

  16. Controlled ecological life support system: Transportation analysis

    NASA Technical Reports Server (NTRS)

    Gustan, E.; Vinopal, T.

    1982-01-01

    This report discusses a study utilizing a systems analysis approach to determine which NASA missions would benefit from controlled ecological life support system (CELSS) technology. The study focuses on manned missions selected from NASA planning forecasts covering the next half century. Comparison of various life support scenarios for the selected missions and characteristics of projected transportation systems provided data for cost evaluations. This approach identified missions that derived benefits from a CELSS, showed the magnitude of the potential cost savings, and indicated which system or combination of systems would apply. This report outlines the analytical approach used in the evaluation, describes the missions and systems considered, and sets forth the benefits derived from CELSS when applicable.

  17. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  18. Applying Systems Theory to Systemic Change: A Generic Model for Educational Reform.

    ERIC Educational Resources Information Center

    Hansen, Joe B.

    Although educational reformers frequently use the words "system,""systemic change," and "systemic approach," many lack a fundamental understanding of the systems concept. This paper describes the application of systems theory to the problems of educational reform and educational assessment. It introduces basic concepts and principles and describes…

  19. The Art World's Concept of Negative Space Applied to System Safety Management

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald (Ronnie)

    2005-01-01

    Tools from several different disciplines can improve system safety management. This paper relates the Art World with our system safety world, showing useful art schools of thought applied to system safety management, developing an art theory-system safety bridge. This bridge is then used to demonstrate relations with risk management, the legal system, personnel management and basic management (establishing priorities). One goal of this presentation/paper is simply to be a fun diversion from the many technical topics presented during the conference.

  20. Systems analysis for DSN microwave antenna holography

    NASA Technical Reports Server (NTRS)

    Rochblatt, D. J.

    1989-01-01

    Proposed systems for Deep Space Network (DSN) microwave antenna holography are analyzed. Microwave holography, as applied to antennas, is a technique which utilizes the Fourier Transform relation between the complex far-field radiation pattern of an antenna and the complex aperture field distribution to provide a methodology for the analysis and evaluation of antenna performance. Resulting aperture phase and amplitude distribution data are used to precisely characterize various crucial performance parameters, including panel alignment, subreflector position, antenna aperture illumination, directivity at various frequencies, and gravity deformation. Microwave holographic analysis provides diagnostic capacity as well as being a powerful tool for evaluating antenna design specifications and their corresponding theoretical models.
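The Fourier transform relation the technique exploits can be demonstrated numerically: forward-transform a model aperture to a far-field pattern, then invert to recover the aperture amplitude and phase. The grid handling below is deliberately simplified, with no windowing, sampling, or pointing corrections, unlike a real DSN holography reduction:

```python
import numpy as np

def aperture_from_farfield(farfield):
    """Recover the complex aperture field from a sampled complex far-field
    pattern via the inverse Fourier transform (simplified illustration)."""
    return np.fft.ifft2(np.fft.ifftshift(farfield))

# Round trip: a circular aperture with a small phase step standing in
# for a misaligned panel.
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
aperture = ((x ** 2 + y ** 2) < (n // 4) ** 2).astype(complex)
aperture *= np.exp(1j * 0.2 * (x > 10))  # simulated panel misalignment
farfield = np.fft.fftshift(np.fft.fft2(aperture))
recovered = aperture_from_farfield(farfield)
```

The phase map of `recovered` exposes the simulated panel error, which is exactly how holography-derived aperture phase is read as a panel-alignment diagnostic.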

  1. Thermal Analysis System

    NASA Technical Reports Server (NTRS)

    DiStefano, III, Frank James (Inventor); Wobick, Craig A. (Inventor); Chapman, Kirt Auldwin (Inventor); McCloud, Peter L. (Inventor)

    2014-01-01

    A thermal fluid system modeler including a plurality of individual components. A solution vector is configured and ordered as a function of one or more inlet dependencies of the plurality of individual components. A fluid flow simulator simulates thermal energy being communicated with the flowing fluid and between first and second components of the plurality of individual components. The simulation extends from an initial time to a later time step and bounds heat transfer to be substantially between the flowing fluid, walls of tubes formed in each of the individual components of the plurality, and between adjacent tubes. Component parameters of the solution vector are updated with simulation results for each of the plurality of individual components of the simulation.
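Configuring a solution vector "as a function of inlet dependencies" suggests an ordering in which each component appears after everything feeding its inlets; a plain topological sort illustrates the idea. Whether the patented modeler actually uses this scheme is an assumption, and the component names are invented:

```python
from collections import deque

def order_components(inlets):
    """Kahn's-algorithm topological sort: 'inlets' maps each component to
    the list of components feeding it; returns a dependency-respecting order."""
    indeg = {c: len(deps) for c, deps in inlets.items()}
    downstream = {c: [] for c in inlets}
    for c, deps in inlets.items():
        for d in deps:
            downstream[d].append(c)
    queue = deque(c for c, d in indeg.items() if d == 0)  # no upstream inlets
    order = []
    while queue:
        c = queue.popleft()
        order.append(c)
        for nxt in downstream[c]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                queue.append(nxt)
    return order
```

With such an ordering, a single forward sweep per time step can update each component using already-updated upstream results.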

  2. Design of Astrometric Mission (JASMINE) by Applying Model Driven System Engineering

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Miyashita, H.; Nakamura, H.; Suenaga, K.; Kamiyoshi, S.; Tsuiki, A.

    2010-12-01

We are planning a space astrometric satellite mission named JASMINE. The target accuracy of parallaxes in JASMINE observations is 10 micro arc seconds, which corresponds to a 1 nm scale on the focal plane. It is very hard to measure deformation of the focal plane at the 1 nm scale. Eventually, we need to add the deformation to the observation equations when estimating stellar astrometric parameters, which requires considering many factors such as instrument models and observation data analysis. In this situation, because the observation equations become more complex, we may relax the stability requirements on the hardware; nevertheless, we then require more samplings to compensate for the reduced rigidity of each estimation. This mission imposes a number of trade-offs in the engineering choices, from which the optimal design must be selected among many candidates. In order to efficiently support such decisions, we apply Model Driven Systems Engineering (MDSE), which improves the efficiency of the engineering by revealing and formalizing requirements, specifications, and designs to find a good balance among the various trade-offs.

  3. Applying cognitive load theory to the redesign of a conventional database systems course

    NASA Astrophysics Data System (ADS)

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional structure for a database course, covering database design first, then database development. Analysis showed the conventional course content was appropriate but the instructional materials used were too complex, especially for novice students. The redesign of instructional materials applied CLT to remove split attention and redundancy effects, to provide suitable worked examples and sub-goals, and included an extensive re-sequencing of content. The approach was primarily directed towards mid- to lower performing students and results showed a significant improvement for this cohort with the exam failure rate reducing by 34% after the redesign on identical final exams. Student satisfaction also increased and feedback from subsequent study was very positive. The application of CLT to the design of instructional materials is discussed for delivery of technical courses.

  4. Analysis of Combustion Systems

    NASA Technical Reports Server (NTRS)

    Bain, Daniel B.; Smith, Clifford E.; Holderman, James D. (Technical Monitor)

    2003-01-01

As part of the NASA High-Speed Research Program, low-emission combustors are being studied and demonstrated. One combustor concept that is currently being studied and evaluated is the Rich burn-Quick mix-Lean burn (RQL) combustor. The quick-mix zone of the RQL combustor is extremely important in reducing NO(x) emissions; rapid mixing of the bypass airflow with rich-burn effluent is essential. The basic challenge can be described as rapid jet-in-crossflow mixing. Although jet-in-crossflow mixing is not new, this RQL application is unique in that the jet-to-mainstream mass-flow ratios are higher than studied previously (approx. 3 in RQL applications versus approx. 0.5 in dilution zone studies), plus the emphasis is on reducing NO(x) emissions (i.e., good mixing might not necessarily produce low emissions). This five-year project focused on identifying quick-mix methods that would reduce NO(x) emissions in RQL combustors. The work included the study of mixing concepts and the development of design methodology. Three-dimensional CFD analysis was the primary tool used in assessing concepts and developing design methodology for low emissions. Isothermal and reacting CFD calculations were performed on cylindrical, rectangular, and annular generic geometries. Systematic parameter studies were performed to isolate key design parameters and their influence on mixing and emissions.

  5. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using the incorporation of grey-scale binning acquired by the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
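The grey-scale binning step described above can be sketched as splitting a backscattered-electron image into intensity ranges and reporting the area fraction in each range. This is a minimal illustration of the idea, far short of real SEM phase analysis:

```python
import numpy as np

def grey_scale_bins(image, edges):
    """Return the area fraction of an image falling in each half-open
    grey-scale range [edges[i], edges[i+1]); illustrative sketch only."""
    image = np.asarray(image)
    fractions = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (image >= lo) & (image < hi)
        fractions.append(mask.sum() / image.size)
    return fractions
```

Each grey-scale range can then be analyzed separately, which is the subdivision the abstract describes.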

  6. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  7. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  8. Parametrized mode decomposition for bifurcation analysis applied to a thermo-acoustically oscillating flame

    NASA Astrophysics Data System (ADS)

    Sayadi, Taraneh; Schmid, Peter; Richecoeur, Franck; Durox, Daniel

    2014-11-01

    Thermo-acoustic systems belong to a class of dynamical systems that are governed by multiple parameters. Changing these parameters alters the response of the dynamical system and causes it to bifurcate. Due to their many applications and potential impact on a variety of combustion systems, there is great interest in devising control strategies to weaken or suppress thermo-acoustic instabilities. However, the system dynamics have to be available in reduced-order form to allow the design of such controllers and their operation in real-time. As the dominant modes and their respective frequencies change as the system parameters are varied, the dynamical system needs to be analyzed separately for a set of fixed parameter values, before the dynamics can be linked in parameter-space. This two-step process is not only cumbersome, but also ambiguous when applied to systems operating close to a bifurcation point. Here we propose a parametrized decomposition algorithm which is capable of analyzing dynamical systems as they go through a bifurcation, extracting the dominant modes of the pre- and post-bifurcation regime. The algorithm is applied to a thermo-acoustically oscillating flame and to pressure signals from experiments. A few selected modes are capable of reproducing the dynamics.
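    For readers unfamiliar with mode decomposition, a minimal sketch of one standard technique (dynamic mode decomposition, shown here on a synthetic two-dimensional oscillation; this is a generic illustration, not the authors' parametrized algorithm):

    ```python
    import numpy as np

    def dmd_eigs(X, r=2):
        """Leading DMD eigenvalues of snapshot matrix X (columns = snapshots in time)."""
        X1, X2 = X[:, :-1], X[:, 1:]
        U, s, Vh = np.linalg.svd(X1, full_matrices=False)
        U, s, Vh = U[:, :r], s[:r], Vh[:r]          # rank-r truncation
        Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
        return np.linalg.eigvals(Atilde)

    # Synthetic oscillation at angular frequency w, sampled every dt:
    # the DMD eigenvalues should sit on the unit circle at angles +/- w*dt.
    dt, w = 0.1, 2.0
    t = np.arange(0, 10, dt)
    X = np.vstack([np.cos(w * t), np.sin(w * t)])
    eigs = dmd_eigs(X, r=2)
    ```

    The dominant frequency is recovered from the eigenvalue angles, and eigenvalue magnitudes away from one indicate growth or decay of the corresponding mode.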

  9. Critical Education for Systemic Change: A World-Systems Analysis Perspective

    ERIC Educational Resources Information Center

    Griffiths, Tom G.

    2015-01-01

    This paper both draws on, and seeks to apply, world-systems analysis to a broad, critical education project that builds mass schooling's potential contribution to the process of world-systemic change. In short, this is done by first setting out the world-systems analysis account of the current state, and period of transition, of the capitalist…

  10. Fundamental Study on Saving Energy for Electrified Railway System Applying High Temperature Superconductor Motor and Energy Storage System

    NASA Astrophysics Data System (ADS)

    Konishi, Takeshi; Nakamura, Taketsune; Amemiya, Naoyuki

    Induction motors have largely replaced dc motors in dc electric rolling stock because of their utility and efficiency. However, further improvement of motor characteristics will be required to realize an environment-friendly dc railway system in the future, so it is important to study more efficient machines for next-generation high-performance rolling stock. Methods to reuse the regenerative energy produced by the motors effectively are also important. We therefore carried out a fundamental study on saving energy in an electrified railway system. As a first step, we introduced an energy storage system based on electric double-layer capacitors (EDLC), together with its control system. We then derived the specification of a high temperature superconductor induction/synchronous motor (HTS-ISM) whose performance is similar to that of conventional induction motors. Furthermore, we evaluated, by simulation, an electrified railway system applying both the energy storage system and the HTS-ISM. The results show the effectiveness of introducing the energy storage system and the HTS-ISM in a dc electrified railway system.

  11. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  12. Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation

    NASA Astrophysics Data System (ADS)

    Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.

    2012-12-01

    The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to the incomplete understanding of the various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory-generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation presents a challenge when applied to the atmosphere, which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single-particle analytical methods to relate particle physical and chemical properties to observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single-particle characterization is provided by computer-controlled scanning electron microscopy with energy-dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor-controlled cooling stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation, including immersion freezing and deposition ice nucleation, as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. on the presence of insoluble particles. For T < 230 K the particles initiate deposition ice nucleation well below the homogeneous freezing limit. Particles collected throughout one day for similar meteorological conditions show very similar

  13. Analysis of imaging system performance capabilities

    NASA Astrophysics Data System (ADS)

    Haim, Harel; Marom, Emanuel

    2013-06-01

    Performance analyses of optical imaging systems based on results obtained with classic one-dimensional (1D) resolution targets (such as the USAF resolution chart) differ significantly from those obtained with a newly proposed 2D target [1]. We prove this claim and show how the novel 2D target should be used to correctly characterize optical imaging systems in terms of resolution and contrast. We then apply the consequences of these observations to the optimal design of some two-dimensional barcode structures.

  14. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included hand searches of sources (e.g., journals, reference lists), search terms (i.e., early, applied, behavioral, research, literature), inclusion criteria (e.g., the field's applied dimension), and challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  15. System safety engineering analysis handbook

    NASA Technical Reports Server (NTRS)

    Ijams, T. E.

    1972-01-01

    The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A text-book description of each of the methods is included.

  16. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, a Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use and presents results as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementing the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models are simple to implement on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets, making discrete event systems a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot's timing: from the transport and transmission times obtained by spot measurement, graphs are produced showing the average time for the transport activity for individual sets of finished products.
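    The basic Petri net mechanics underlying such discrete event models can be sketched in a few lines (a toy marking and transition invented for illustration, not the paper's robotic system):

    ```python
    class PetriNet:
        """Minimal place/transition net: a marking plus a firing rule."""

        def __init__(self, marking):
            self.marking = dict(marking)      # place name -> token count

        def enabled(self, pre):
            """A transition is enabled if every input place holds enough tokens."""
            return all(self.marking.get(p, 0) >= n for p, n in pre.items())

        def fire(self, pre, post):
            """Fire a transition: consume tokens from pre, produce tokens in post."""
            if not self.enabled(pre):
                raise ValueError("transition not enabled")
            for p, n in pre.items():
                self.marking[p] -= n
            for p, n in post.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Toy example: an idle robot picks up a part (part + robot_idle -> robot_busy)
    net = PetriNet({"part": 2, "robot_idle": 1})
    net.fire({"part": 1, "robot_idle": 1}, {"robot_busy": 1})
    ```

    Timed and hierarchical extensions, as in the paper, attach delays to transitions and nest nets inside transitions of a higher-level net.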

  17. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  18. Weighted correlation network analysis (WGCNA) applied to the tomato fruit metabolome

    Technology Transfer Automated Retrieval System (TEKTRAN)

    One of the challenges for systems biology approaches is that hundreds to thousands of variables are often measured for treatments with low replication, thus creating a multiple testing problem. Principal component analysis (PCA) and weighted correlation network analysis (WGCNA) are two complementary...
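    A minimal sketch of the PCA step mentioned above, with synthetic data standing in for a samples-by-metabolites matrix (WGCNA itself involves considerably more machinery, so this illustrates only the dimension-reduction idea):

    ```python
    import numpy as np

    def pca_scores(X, k=1):
        """Project mean-centred rows of X onto the top-k principal components."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T                 # scores: one column per component

    # Synthetic data: 20 samples x 50 variables driven by one hidden factor,
    # plus a little noise. The first PC should recover the hidden factor.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(20, 1))
    X = latent @ rng.normal(size=(1, 50)) + 0.01 * rng.normal(size=(20, 50))
    scores = pca_scores(X, k=1)
    ```

    Collapsing many correlated variables onto a few components in this way is one common response to the multiple-testing problem the abstract describes.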

  19. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    PubMed

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    The present article discusses clustering analysis used in the near-infrared (NIR) spectroscopic analysis of Chinese traditional medicines, which provides a new method for their classification. The samples selected in the authors' research were safrole, eucalypt oil, laurel oil, turpentine, clove oil and three samples of costmary oil from different suppliers; their absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' lab. The spectra in the range of 0.70-1.7 µm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out with different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicated that cluster analysis of this group of samples is practicable. The calculated classification of the 8 samples also accorded well with their characteristics; in particular, the three samples of costmary oil fell into the closest classification of the clustering analysis. PMID:18306778
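    A hedged sketch of correlation-based clustering of spectra of this kind (synthetic Gaussian absorption bands stand in for real NIR measurements, and a simple greedy single-link rule stands in for the authors' optimized methods):

    ```python
    import numpy as np

    def correlation_clusters(spectra, threshold=0.95):
        """Greedy single-link grouping: a sample joins a cluster if its spectrum
        correlates above `threshold` with any existing member."""
        clusters = []
        for i, s in enumerate(spectra):
            for cl in clusters:
                if any(np.corrcoef(s, spectra[j])[0, 1] > threshold for j in cl):
                    cl.append(i)
                    break
            else:
                clusters.append([i])
        return clusters

    wl = np.linspace(0.7, 1.7, 200)          # wavelength axis, micrometres

    def band(center, width):
        """Synthetic Gaussian absorption band."""
        return np.exp(-((wl - center) / width) ** 2)

    spectra = [band(1.0, 0.10),              # two near-identical samples...
               band(1.0, 0.10) * 1.1,
               band(1.4, 0.05)]              # ...and one chemically distinct one
    groups = correlation_clusters(spectra)
    ```

    The first two spectra differ only by scale, so their correlation is 1 and they cluster together, while the distinct band forms its own group.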

  20. The ALICE analysis train system

    NASA Astrophysics Data System (ADS)

    Zimmermann, Markus; ALICE Collaboration

    2015-05-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.

  1. Intelligent System for Radiogram Analysis

    NASA Astrophysics Data System (ADS)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Łopato, P.; Napierała, L.; Pietrusewicz, T.; Psuj, G.; Piekarczyk, B.

    2011-06-01

    In this paper we present a concept for an Intelligent System for Radiogram Analysis (ISAR) for weld quality inspection. Both hardware and software solutions are introduced in the system. The software operates with a variety of scanner standards. It contains preliminary image processing (linear and nonlinear filtering algorithms) and specialized functions such as Sauvola's thresholding or IQI detection. The aim of the ISAR system is to support a radiologist in his work.

  2. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  4. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  5. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.

  6. Applying the least restrictive alternative principle to treatment decisions: A legal and behavioral analysis

    PubMed Central

    Johnston, J. M.; Sherman, Robert A.

    1993-01-01

    The least restrictive alternative concept is widely used in mental health law. This paper addresses how the concept has been applied to treatment decisions. The paper offers both a legal and a behavioral analysis to some problems that have emerged in recent years concerning the selection of behavioral procedures used to change client behavior. The paper also offers ways of improving the application of the concept, which involve developing a more behaviorally functional perspective toward restrictiveness. PMID:22478138

  7. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking technique: one a philosophical and the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  8. Applied behavior analysis: its impact on the treatment of mentally retarded emotionally disturbed people.

    PubMed

    Matson, J L; Coe, D A

    1992-01-01

    Laboratory research on behavior analysis proved to be useful in establishing principles of learning with many relevant applications for people. Early efforts in the applied behavior analysis area proved to be particularly successful with mentally retarded persons. Self-help skills received much of the earliest attention, but another area that became quite fruitful for study was dual diagnosis--mental health problems of mentally retarded individuals. This paper reviews some of the early works of Skinner and his colleagues and the implications of this work on the rapidly developing subdiscipline of dual diagnosis. Current status and future trends are discussed. PMID:1574625

  9. Residual energy applications program systems analysis report

    SciTech Connect

    Yngve, P.W.

    1980-10-01

    Current DOE plans call for building an Energy Applied Systems Test (EAST) Facility at the Savannah River Plant in close proximity to the 140 to 150°F waste heat from one of several operating nuclear reactors. The waste water flow from each reactor, approximately 165,000 gpm, provides a unique opportunity to test the performance and operating characteristics of large-scale waste heat power generation and heat pump system concepts. This report provides a preliminary description of the potential end-use market, parametric data on heat pump and power generation system technology, a preliminary listing of EAST Facility requirements, and an example of an integrated industrial park utilizing the technology to maximize economic payback. The parametric heat pump analysis concluded that dual-fluid Rankine cycle heat pumps with capacities as high as 400 × 10^6 Btu/h can utilize large sources of low temperature residual heat to provide 300°F saturated steam for an industrial park. The before-tax return on investment for this concept is 36.2%. The analysis also concluded that smaller modular heat pumps could fulfill the same objective while sacrificing only a moderate rate of return. The parametric power generation analysis concluded that multi-pressure Rankine cycle systems not only are superior to single-pressure systems, but can also be developed for large systems (approximately 17 MWe). This same technology is applicable to smaller systems at the sacrifice of higher investment per unit output.

  10. Sequential analysis applied to clinical trials in dentistry: a systematic review.

    PubMed

    Bogowicz, P; Flores-Mir, C; Major, P W; Heo, G

    2008-01-01

    Clinical trials employ sequential analysis for the ethical and economic benefits it brings. In dentistry, as in other fields, resources are scarce and efforts are made to ensure that patients are treated ethically. The objective of this systematic review was to characterise the use of sequential analysis for clinical trials in dentistry. We searched various databases from 1900 through to January 2008. Articles were selected for review if they were clinical trials in the field of dentistry that had applied some form of sequential analysis. Selection was carried out independently by two of the authors. We included 18 trials from various specialties, which involved many different interventions. We conclude that sequential analysis seems to be underused in this field but that there are sufficient methodological resources in place for future applications. Evidence-Based Dentistry (2008) 9, 55-62. doi:10.1038/sj.ebd.6400587. PMID:18584009
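    One classic form of sequential analysis is Wald's sequential probability ratio test, which lets a trial stop early once the accumulated evidence crosses a decision boundary. A minimal sketch for binary outcomes (illustrative hypotheses and error rates, not drawn from any of the reviewed trials):

    ```python
    import math

    def sprt(outcomes, p0=0.5, p1=0.8, alpha=0.05, beta=0.2):
        """Wald's SPRT for Bernoulli data: test H1 (success rate p1) against
        H0 (rate p0). Returns (decision, number of observations used)."""
        upper = math.log((1 - beta) / alpha)     # cross above: accept H1
        lower = math.log(beta / (1 - alpha))     # cross below: accept H0
        llr = 0.0                                # cumulative log-likelihood ratio
        for n, x in enumerate(outcomes, 1):
            llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
            if llr >= upper:
                return "accept H1", n
            if llr <= lower:
                return "accept H0", n
        return "continue", len(outcomes)

    # Twelve consecutive treatment successes: the test stops well before
    # all twelve patients are needed.
    decision, n = sprt([1] * 12)
    ```

    The early-stopping behaviour is exactly the ethical and economic benefit the review describes: fewer patients are exposed once the answer is clear.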

  11. Applied Nuclear Accountability Systems: A Case Study in the System Architecture and Development of NuMAC

    SciTech Connect

    Campbell, Andrea Beth

    2004-07-01

    This is a case study of the NuMAC nuclear accountability system developed at a private fuel fabrication facility. This paper investigates nuclear material accountability and safeguards by researching expert knowledge applied in the system design and development. Presented is a system developed to detect and deter the theft of weapons grade nuclear material. Examined is the system architecture that includes: issues for the design and development of the system; stakeholder issues; how the system was built and evolved; software design, database design, and development tool considerations; security and computing ethics. (author)

  12. Expert Meeting Report: Recommendations for Applying Water Heaters in Combination Space and Domestic Water Heating Systems

    SciTech Connect

    Rudd, A.; Ueno, K.; Bergey, D.; Osser, R.

    2012-07-01

    The topic of this meeting was 'Recommendations For Applying Water Heaters In Combination Space And Domestic Water Heating Systems.' Presentations and discussions centered on the design, performance, and maintenance of these combination systems, with the goal of developing foundational information toward the development of a Building America Measure Guideline on this topic. The meeting was held at the Westford Regency Hotel, in Westford, Massachusetts on 7/31/2011.

  13. Accuracy analysis of distributed simulation systems

    NASA Astrophysics Data System (ADS)

    Lin, Qi; Guo, Jing

    2010-08-01

    Existing simulation work emphasizes procedural verification, putting too much focus on the simulation models instead of the simulation itself. As a result, research on improving simulation accuracy has been limited to individual aspects. Because accuracy is the key in simulation credibility assessment and fidelity studies, it is important to give an all-round discussion of the accuracy of distributed simulation systems themselves. First, the major elements of distributed simulation systems are summarized, providing a basis for the definition, classification and description of their accuracy. In Part 2, the framework of accuracy of distributed simulation systems is presented in a comprehensive way, which makes it easier to analyze and assess the uncertainty of distributed simulation systems. In Part 3, the concept of accuracy of distributed simulation systems is divided into four further factors, each analyzed in turn. In Part 4, based on the formalized description of this framework, a practical approach is put forward that can be applied to study unexpected or inaccurate simulation results. A real distributed simulation system based on HLA is then taken as an example to verify the usefulness of the proposed approach. The results show that the method works well and is applicable to accuracy analysis of distributed simulation systems.

  14. Teaching Applied Genetics and Molecular Biology to Agriculture Engineers. Application of the European Credit Transfer System

    ERIC Educational Resources Information Center

    Weiss, J.; Egea-Cortines, M.

    2008-01-01

    We have been teaching applied molecular genetics to engineers and adapted the teaching methodology to the European Credit Transfer System. We teach core principles of genetics that are universal and form the conceptual basis of most molecular technologies. The course then teaches widely used techniques and finally shows how different techniques…

  15. Reply to "Comment on 'Shadow model for sub-barrier fusion applied to light systems'"

    SciTech Connect

    Scalia, A.

    1994-05-01

    This is a reply to the Comment on "Shadow model for sub-barrier fusion applied to light systems." We confirm the results of our paper. The claimed demonstration of disagreement between the cross section derived from the "shadow" model and the low-energy laboratory data is meaningless because it is based on an incorrect comparison.

  16. Embracing Connectedness and Change: A Complex Dynamic Systems Perspective for Applied Linguistic Research

    ERIC Educational Resources Information Center

    Cameron, Lynne

    2015-01-01

    Complex dynamic systems (CDS) theory offers a powerful metaphorical model of applied linguistic processes, allowing holistic descriptions of situated phenomena, and addressing the connectedness and change that often characterise issues in our field. A recent study of Kenyan conflict transformation illustrates application of a CDS perspective. Key…

  17. Toward a Blended Ontology: Applying Knowledge Systems to Compare Therapeutic and Toxicological Nanoscale Domains

    EPA Science Inventory

    Bionanomedicine and environmental research share a need for common terms and ontologies. This study applied knowledge systems, data mining, and bibliometrics to nano-scale ADME research from 1991 to 2011. The prominence of nano-ADME in environmental research began to exceed the pu...

  18. 40 CFR 63.8030 - What requirements apply to my heat exchange systems?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... apply to my heat exchange systems? (a) You must comply with the requirements specified in Table 6 to... § 63.10(b)(1). (e) The reference to the periodic report required by § 63.152(c) of subpart G of...

  19. Function Analysis and Decomposition using Function Analysis System Technique

    SciTech Connect

    J. R. Wixson

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high-demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles' function analysis concepts and introduced the methodology called Function Analysis System Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high-level, or objective, function into secondary and lower-level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  1. Simulation and signal analysis of Akiyama probe applied to atomic force microscope

    NASA Astrophysics Data System (ADS)

    Wang, Longlong; Lu, Mingzhen; Guo, Tong; Gao, Sitian; Zhang, Huakun

    2013-10-01

    The atomic force microscope is one of the indispensable measurement tools in nano/micro precision manufacturing and critical-dimension measurement. To expand its industrial application, a novel head and system were designed around the Nanosensors corporation's patented Akiyama probe, which is a self-sensing probe. The modal behavior and resonance frequency are obtained by finite element (FE) simulations. Using a lock-in amplifier, an effective and usable signal can be obtained. Experimental analysis shows that the retract and extend curves reflect the tip-sample interaction. Furthermore, measurements on a calibrated positioning system demonstrate that the resolution of the whole system reaches the nanometer scale.
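
The lock-in detection this head relies on can be illustrated with a short sketch: multiply the noisy probe signal by reference sine/cosine waves at the drive frequency and average to recover the oscillation amplitude. The sampling rate, drive frequency, and amplitude below are hypothetical values, not those of the Akiyama probe.

```python
import numpy as np

# Hypothetical parameters, for illustration only.
rng = np.random.default_rng(5)
fs, f0, amp = 100_000.0, 45_000.0, 0.1
t = np.arange(0, 0.05, 1 / fs)
signal = amp * np.sin(2 * np.pi * f0 * t + 0.3) + 0.1 * rng.standard_normal(t.size)

# In-phase and quadrature demodulation, then magnitude.
i = 2 * np.mean(signal * np.sin(2 * np.pi * f0 * t))
q = 2 * np.mean(signal * np.cos(2 * np.pi * f0 * t))
recovered = np.hypot(i, q)
print(recovered)  # close to the 0.1 drive amplitude despite the noise
```

Averaging over many cycles suppresses the broadband noise, which is why a lock-in can pull a small resonance signal out of a much noisier background.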

  2. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high-quality care, and also the most challenging of them in terms of performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role-playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge on SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. It would also be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  3. Weld analysis and control system

    NASA Technical Reports Server (NTRS)

    Kennedy, Larry Z. (Inventor); Rodgers, Michael H. (Inventor); Powell, Bradley W. (Inventor); Burroughs, Ivan A. (Inventor); Goode, K. Wayne (Inventor)

    1994-01-01

    The invention is a Weld Analysis and Control System developed for active weld system control through real time weld data acquisition. Closed-loop control is based on analysis of weld system parameters and weld geometry. The system is adapted for use with automated welding apparatus having a weld controller which is capable of active electronic control of all aspects of a welding operation. Enhanced graphics and data displays are provided for post-weld analysis. The system provides parameter acquisition, including seam location which is acquired for active torch cross-seam positioning. Torch stand-off is also monitored for control. Weld bead and parent surface geometrical parameters are acquired as an indication of weld quality. These parameters include mismatch, peaking, undercut, underfill, crown height, weld width, puddle diameter, and other measurable information about the weld puddle regions, such as puddle symmetry, etc. These parameters provide a basis for active control as well as post-weld quality analysis and verification. Weld system parameters, such as voltage, current and wire feed rate, are also monitored and archived for correlation with quality parameters.

  4. EPALIT: A DATA MANAGEMENT SYSTEM APPLIED TO THE CONTROL AND RETRIEVAL OF TECHNICAL REPORTS

    EPA Science Inventory

    The EPALIT data management system is used by the EPA Environmental Research Laboratory, Gulf Breeze, FL, to preserve and manipulate information in text databases. EPALIT provides the logical resources for data organization, analysis, and retrieval. It is completely interactive an...

  5. Real Time Pattern Recognition And Feature Analysis From Video Signals Applied To Eye Movement And Pupillary Reflex Monitoring

    NASA Astrophysics Data System (ADS)

    Charlier, Jacques R.; Bariseau, Jean-Luc; Chuffart, Vincent; Marsy, Françoise; Hache, Jean-Claude

    1984-06-01

    Original techniques for real-time pattern recognition and feature analysis from standard video signals have been developed. These techniques have been applied to the monitoring of eye movements and pupillary size during visual field and electrophysiological examinations in routine ophthalmological practice. The basic features of the resulting instrument are: 1- the use of low-cost hardware, i.e. standard video equipment and LSI circuitry; 2- the measurement of eye orientation from the position of the bright pupil relative to the corneal reflection; 3- real-time processing and a high data throughput of 50 samples per second, allowing pupillary and oculomotor reflex analysis; 4- specialized hardware and software permitting adjustment-free feature identification and analysis directly from video signals. Severe perturbations of the ocular video images can be handled by the system, including partial occlusions of the pupil by eyelids or eyelashes, fluctuations of amplitude levels, and parasitic light reflections.

  6. Applying FSL to the FIAC Data: Model-Based and Model-Free Analysis of Voice and Sentence Repetition Priming

    PubMed Central

    Beckmann, Christian F.; Jenkinson, Mark; Woolrich, Mark W.; Behrens, Timothy E.J.; Flitney, David E.; Devlin, Joseph T.; Smith, Stephen M.

    2009-01-01

    This article presents results obtained from applying various tools from FSL (FMRIB Software Library) to data from the repetition priming experiment used for the HBM’05 Functional Image Analysis Contest. We present analyses from the model-based General Linear Model (GLM) tool (FEAT) and from the model-free independent component analysis tool (MELODIC). We also discuss the application of tools for the correction of image distortions prior to the statistical analysis and the utility of recent advances in functional magnetic resonance imaging (FMRI) time series modeling and inference such as the use of optimal constrained HRF basis function modeling and mixture modeling inference. The combination of hemodynamic response function (HRF) and mixture modeling, in particular, revealed that both sentence content and speaker voice priming effects occurred bilaterally along the length of the superior temporal sulcus (STS). These results suggest that both are processed in a single underlying system without any significant asymmetries for content vs. voice processing. PMID:16565953
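
The model-based GLM analysis that FEAT performs can be illustrated at the single-voxel level with a minimal sketch; the boxcar regressor and time series below are synthetic, and this is in no way FSL's actual implementation.

```python
import numpy as np

# Single-voxel GLM: regress a synthetic fMRI-like time series on a boxcar
# task regressor plus an intercept; the fitted beta estimates the effect.
rng = np.random.default_rng(0)
n_scans = 100
task = np.tile(np.r_[np.ones(10), np.zeros(10)], 5)   # 10 scans on / 10 off
X = np.column_stack([task, np.ones(n_scans)])          # design matrix

y = 2.0 * task + 5.0 + 0.1 * rng.standard_normal(n_scans)  # true effect 2.0

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta[0])  # close to the true effect size of 2.0
```

In practice the task regressor is first convolved with a hemodynamic response function, which is exactly the HRF basis-function modeling the abstract discusses.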

  7. Promising new baseflow separation and recession analysis methods applied to streamflow at Glendhu Catchment, New Zealand

    NASA Astrophysics Data System (ADS)

    Stewart, M. K.

    2015-06-01

    Understanding and modelling the relationship between rainfall and runoff has been a driving force in hydrology for many years. Baseflow separation and recession analysis have been two of the main tools for understanding runoff generation in catchments, but there are many different methods for each. The new baseflow separation method presented here (the bump and rise method or BRM) aims to accurately simulate the shape of tracer-determined baseflow or pre-event water. Application of the method by calibrating its parameters, using (a) tracer data or (b) an optimising method, is demonstrated for the Glendhu Catchment, New Zealand. The calibrated BRM algorithm is then applied to the Glendhu streamflow record. The new recession approach advances the thesis that recession analysis of streamflow alone gives misleading information on catchment storage reservoirs because streamflow is a varying mixture of components of very different origins and characteristics (at the simplest level, quickflow and baseflow as identified by the BRM method). Recession analyses of quickflow, baseflow and streamflow show that the steep power-law slopes often observed for streamflow at intermediate flows are artefacts due to mixing and are not representative of catchment reservoirs. Applying baseflow separation before recession analysis could therefore shed new light on water storage reservoirs in catchments and possibly resolve some current problems with recession analysis. Among other things it shows that both quickflow and baseflow reservoirs in the studied catchment have (non-linear) quadratic characteristics.
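
The BRM algorithm itself is not specified in the abstract; as a generic illustration of recursive-filter baseflow separation, here is a sketch of the widely used Lyne-Hollick filter (the filter parameter alpha = 0.925 and the hydrograph are assumptions for the example).

```python
import numpy as np

# Lyne-Hollick one-parameter recursive filter: quickflow tracks sharp rises
# in streamflow, and baseflow is the remainder, constrained to 0 <= bf <= Q.
def lyne_hollick(q, alpha=0.925):
    qf = np.zeros_like(q)                    # quickflow component
    for t in range(1, len(q)):
        qf[t] = alpha * qf[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        qf[t] = min(max(qf[t], 0.0), q[t])   # clamp quickflow into [0, Q]
    return q - qf                            # baseflow

# Synthetic hydrograph: an exponential recession with one storm peak.
q = 5.0 * np.exp(-0.05 * np.arange(50.0))
q[10:15] += [4.0, 8.0, 6.0, 3.0, 1.0]
bf = lyne_hollick(q)
print(bf[11] < q[11])  # quickflow is separated out during the storm: True
```

Separating the hydrograph this way before recession analysis is precisely what lets the quickflow and baseflow reservoirs be characterized individually, as the abstract advocates.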

  8. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams, (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  9. Leaching of Particulate and Dissolved Organic Carbon from Compost Applied to Bioretention Systems

    NASA Astrophysics Data System (ADS)

    Iqbal, Hamid; Flury, Markus; Mullane, Jessica; Baig, Muhammad

    2015-04-01

    Compost is used in bioretention systems to improve soil quality, to promote plant growth, and to remove metal contaminants from stormwater. However, compost itself, particularly when applied freshly, can be a source of contamination of the stormwater. To test the potential contamination caused by compost applied to bioretention systems, we continuously leached a compost column with water under unsaturated conditions and characterized the dissolved and particulate organic matter in the leachate. Freshly applied, mature compost leached up to 400 mg/L of dissolved organic carbon and 2,000 mg/L of suspended particulate organic carbon. A cumulative water flux of 4,000 mm was required before concentrations of dissolved and particulate organic carbon declined to levels typical of surface waters. Although dissolved and particulate organic carbon are not contaminants per se, they can facilitate the movement of metals, thereby enhancing the mobility of toxic metals present in stormwater. We therefore recommend that compost be washed before it is applied to bioretention systems. Keywords: compost; leachate; alkali extract; dissolved organic carbon; flux

  10. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
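
The idea of wavelet-derived texture features can be sketched with a one-level Haar transform; the paper uses wavelet frames and a maximum-likelihood classifier, so the following is only a simplified illustration on synthetic "textures."

```python
import numpy as np

# One-level 2-D Haar transform via 2x2 block sums/differences, and the
# fraction of signal energy in the detail subbands as a texture feature.
def haar_detail_fraction(img):
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4                       # approximation subband
    details = [(a + b - c - d) / 4,                # horizontal detail
               (a - b + c - d) / 4,                # vertical detail
               (a - b - c + d) / 4]                # diagonal detail
    e_detail = sum((s ** 2).sum() for s in details)
    return e_detail / ((ll ** 2).sum() + e_detail)

rng = np.random.default_rng(1)
coarse = rng.standard_normal((16, 16)).repeat(4, axis=0).repeat(4, axis=1)
fine = rng.standard_normal((64, 64))
# A fine-grained texture puts far more of its energy into the detail
# subbands than a smooth, blocky one.
print(haar_detail_fraction(coarse), haar_detail_fraction(fine))
```

A classifier then only has to compare such subband-energy features against those of labeled soil samples, which is the role the maximum-likelihood criterion plays in the paper.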

  11. Intelligent user interface for expert systems applied to power plant maintenance and troubleshooting

    SciTech Connect

    Kock, C.G.; Isle, B.A.; Butler, A.W.

    1988-03-01

    A research and development project is under way to specify, design, construct, and evaluate a user interface system to meet the unique requirements of a delivery vehicle for a knowledge-based system applied to gas turbine electronics equipment maintenance and troubleshooting. The user interface is a portable device with text display, video and overlay graphics display, voice recognition and speech production, special-function keypad, and printer. A modular software structure based on a serial communications protocol between user interface device and expert system host computer provides flexibility, expandability, and a simple, effective user interface dialogue.

  12. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted to quantify the applied dose for quarantine control of irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by the DNA Comet Assay. The observed comets were evaluated by image analysis; the tail length, tail moment, and tail DNA% of the comets were used for their interpretation. Irradiated citrus fruits showed tails separated from the head of the comet with increasing applied dose from 0.1 to 1.5 kGy. The mean tail length and mean tail moment levels of irradiated citrus fruits at all doses are significantly different (p < 0.01) from the control, even at the lowest dose of 0.1 kGy. Thus, the DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it is possible to estimate applied doses as low as 0.1 kGy when the assay is combined with image analysis. PMID:26304361
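
The comet metrics named above (tail length, tail DNA%, tail moment) can be sketched for a 1-D intensity profile; real comet-scoring software segments 2-D images, and the profile and head/tail split here are synthetic, for illustration only.

```python
import numpy as np

# Tail metrics from a 1-D comet intensity profile: the head is taken as the
# first `head_end` pixels, the tail as everything after it.
def tail_metrics(profile, head_end):
    tail = profile[head_end:]
    total = profile.sum()
    tail_len = len(tail)                           # tail length (pixels)
    tail_dna = 100.0 * tail.sum() / total          # tail DNA %
    tail_moment = (tail.sum() / total) * tail_len  # one common definition
    return tail_len, tail_dna, tail_moment

# Synthetic "irradiated" comet: a bright 20-pixel head and a fading tail.
profile = np.r_[3.5 * np.ones(20), np.linspace(1.0, 0.0, 30)]
tl, td, tm = tail_metrics(profile, 20)
print(tl, td, tm)  # 30 pixels, about 17.6% tail DNA, tail moment about 5.3
```

Higher doses fragment more DNA into the tail, so all three metrics grow with dose, which is what makes them usable as a dose estimate.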

  13. A Computer-Mediated Instruction System, Applied to Its Own Operating System and Peripheral Equipment.

    ERIC Educational Resources Information Center

    Winiecki, Roger D.

    Each semester students in the School of Health Sciences of Hunter College learn how to use a computer, how a computer system operates, and how peripheral equipment can be used. To overcome inadequate computer center services and equipment, programed subject matter and accompanying reference material were developed. The instructional system has a…

  14. Automated Loads Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for payload math models on a specific Space Transportation System (STS) flight, using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  15. Galvanic Liquid Applied Coating System for Protection of Embedded Steel Surfaces from Corrosion

    NASA Technical Reports Server (NTRS)

    Curran, Joseph; MacDowell, Louis; Voska, N. (Technical Monitor)

    2002-01-01

    The corrosion of reinforcing steel in concrete is an insidious problem for the Kennedy Space Center, government agencies, and the general public. Existing corrosion protection systems on the market are costly, complex, and time-consuming to install; require continuous maintenance and monitoring; and require specialized skills for installation. NASA's galvanic liquid-applied coating offers companies the ability to conveniently protect embedded steel rebar surfaces from corrosion. The liquid-applied inorganic galvanic coating contains one or more of the following metallic particles: magnesium, zinc, or indium, and may contain moisture-attracting compounds that facilitate the protection process. The coating is applied to the outer surface of reinforced concrete so that an electrical current is established between the metallic particles and the surfaces of the embedded steel rebar; this electric (ionic) current provides the necessary cathodic protection for embedded rebar surfaces.

  16. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure-component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure-component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%.
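
The calibration/prediction scheme described (classical least squares with nonzero intercepts) can be sketched on synthetic spectra; the pure-component bands, mixture design, and baseline below are invented for illustration and are not the paper's data.

```python
import numpy as np

# CLS sketch: calibration regresses mixture spectra on known concentrations
# (with an intercept column absorbing the baseline); prediction inverts the
# fitted model for an unknown sample.
rng = np.random.default_rng(2)
wavelengths = np.linspace(0, 1, 50)
# Two hypothetical pure-component spectra (Gaussian bands).
K_true = np.vstack([np.exp(-((wavelengths - 0.3) / 0.05) ** 2),
                    np.exp(-((wavelengths - 0.6) / 0.08) ** 2)])

C_cal = rng.uniform(0.1, 1.0, (8, 2))        # known standard mixtures
A_cal = C_cal @ K_true + 0.05                # spectra with a flat baseline

# Calibration: estimate pure spectra and baseline from the standards.
C_aug = np.column_stack([C_cal, np.ones(8)]) # append intercept column
K_est, *_ = np.linalg.lstsq(C_aug, A_cal, rcond=None)

# Prediction: recover concentrations of an "unknown" sample.
a_unknown = np.array([0.4, 0.7]) @ K_true + 0.05
coef, *_ = np.linalg.lstsq(K_est.T, a_unknown, rcond=None)
print(np.round(coef[:2], 3))  # recovers the true [0.4, 0.7]
```

With noisy spectra the same two regressions still apply; the weighting discussed in the abstract then enters through how each spectral point is scaled before the fit.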

  17. Applying latent semantic analysis to large-scale medical image databases.

    PubMed

    Stathopoulos, Spyridon; Kalamboukis, Theodore

    2015-01-01

    Although Latent Semantic Analysis (LSA) has been used successfully in text retrieval, applying it to CBIR induces scalability issues with large image collections. The method has so far been used only with small collections, due to the high cost of storage and of the computational time for solving the SVD problem for a large, dense feature matrix. Here we present an effective and efficient approach to applying LSA that skips the SVD solution of the feature matrix, overcoming in this way the deficiencies of the method on large-scale datasets. Early- and late-fusion techniques are tested and their performance is calculated. The study demonstrates that early fusion of several composite descriptors with visual words increases retrieval effectiveness. It also combines well in late fusion for mixed (textual and visual) ad hoc retrieval and modality classification. The results reported are comparable to state-of-the-art algorithms without including additional knowledge from the medical domain. PMID:24934416
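
For background, the standard truncated-SVD form of LSA (the very step the paper seeks to avoid on large collections) can be sketched on a tiny synthetic feature matrix; the matrix sizes and the choice k = 3 are arbitrary.

```python
import numpy as np

# Truncated-SVD LSA: documents are compared by cosine similarity in a
# k-dimensional latent space obtained from the feature/document matrix.
rng = np.random.default_rng(3)
A = rng.random((20, 8))                      # 20 features x 8 "documents"

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 3                                        # latent dimensions kept
docs_latent = (np.diag(s[:k]) @ Vt[:k]).T    # one row per document

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Project a query feature vector into the latent space via U_k.
q_latent = U[:, :k].T @ A[:, 0]
sims = [cosine(q_latent, d) for d in docs_latent]
print(int(np.argmax(sims)))  # document 0 matches itself best: prints 0
```

The storage and time cost of this SVD on a large, dense feature matrix is exactly the bottleneck the paper's SVD-free approach is designed to remove.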

  18. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    PubMed

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically. PMID:27295000

  19. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

    The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts, supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  20. Systems analysis - independent analysis and verification

    SciTech Connect

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S.

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  1. Jacobi stability analysis of the Lorenz system

    NASA Astrophysics Data System (ADS)

    Harko, Tiberiu; Ho, Chor Yin; Leung, Chun Sing; Yip, Stan

    2015-06-01

    We perform the study of the stability of the Lorenz system by using the Jacobi stability analysis, or the Kosambi-Cartan-Chern (KCC) theory. The Lorenz model plays an important role for understanding hydrodynamic instabilities and the nature of the turbulence, also representing a nontrivial testing object for studying nonlinear effects. The KCC theory represents a powerful mathematical method for the analysis of dynamical systems. In this approach, we describe the evolution of the Lorenz system in geometric terms, by considering it as a geodesic in a Finsler space. By associating a nonlinear connection and a Berwald type connection, five geometrical invariants are obtained, with the second invariant giving the Jacobi stability of the system. The Jacobi (in)stability is a natural generalization of the (in)stability of the geodesic flow on a differentiable manifold endowed with a metric (Riemannian or Finslerian) to the non-metric setting. In order to apply the KCC theory, we reformulate the Lorenz system as a set of two second-order nonlinear differential equations. The geometric invariants associated to this system (nonlinear and Berwald connections), and the deviation curvature tensor, as well as its eigenvalues, are explicitly obtained. The Jacobi stability of the equilibrium points of the Lorenz system is studied, and the condition of the stability of the equilibrium points is obtained. Finally, we consider the time evolution of the components of the deviation vector near the equilibrium points.
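
The Lorenz system whose equilibrium points the analysis studies can be written down directly; the classical parameter values sigma = 10, rho = 28, beta = 8/3 below are an assumption (the abstract does not fix them), and this sketch only verifies the equilibria, not the KCC invariants.

```python
import numpy as np

# The Lorenz vector field and its nontrivial equilibrium points.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz(state):
    x, y, z = state
    return np.array([sigma * (y - x),
                     x * (rho - z) - y,
                     x * y - beta * z])

# Nontrivial equilibria: x = y = +/- sqrt(beta*(rho - 1)), z = rho - 1.
r = np.sqrt(beta * (rho - 1))
for eq in (np.array([r, r, rho - 1]), np.array([-r, -r, rho - 1])):
    assert np.allclose(lorenz(eq), 0.0)
print("equilibria verified")
```

The KCC approach then rewrites these first-order equations as two coupled second-order ODEs and examines the deviation curvature tensor at exactly these points.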

  2. Scramjet nozzle design and analysis as applied to a highly integrated hypersonic research airplane

    NASA Technical Reports Server (NTRS)

    Small, W. J.; Weidner, J. P.; Johnston, P. J.

    1976-01-01

    Engine-nozzle airframe integration at hypersonic speeds was conducted by using a high-speed research aircraft concept as a focus. Recently developed techniques for analysis of scramjet-nozzle exhaust flows provide a realistic analysis of complex forces resulting from the engine-nozzle airframe coupling. By properly integrating the engine-nozzle propulsive system with the airframe, efficient, controlled and stable flight results over a wide speed range.

  3. [System approach and system analysis in dietology].

    PubMed

    Samsonov, M A

    2004-01-01

    This article analyzes the use of two variants of automated diet-planning programs in dietotherapy: the numeric system and the basic-diet system. Both are built in the same way but differ in their operating principle. The numeric system is built on a nosological principle, taking into consideration the clinicopathogenetic features of the disease. The basic diets are built on a metabolic principle: the chemical content and the alimentary and nutritional value of the diet are adapted to the concrete mechanism of the metabolic disturbance. At the same time, the metabolic conveyor is considered a system organization of separate functional systems that are in permanent dynamic interaction with one another. This organization operates on the principle of autoregulation and is aimed at correcting and restoring the disturbed homeostasis as a whole. The choice between these principles in practice is the prerogative of the specialist dietitian; automated diet planning should assist that choice and simplify the organization of dietotherapy. PMID:15049149

  4. Robustness of fuzzy logic power system stabilizers applied to multimachine power system

    SciTech Connect

    Hiyama, Takashi (Dept. of Electrical Engineering and Computer Science)

    1994-09-01

    This paper investigates the robustness of fuzzy logic stabilizers that use the speed and acceleration states of a study unit. The input signals are the real power output and/or the speed of the study unit. Non-linear simulations demonstrate the robustness of the fuzzy logic power system stabilizers. Experiments are also performed using a micro-machine system. The results show the feasibility of the proposed fuzzy logic stabilizer.
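    As an illustration of the rule-based structure such stabilizers share (not the controller from this paper: the membership functions, rule base, and scaling below are invented for the sketch), a toy two-input fuzzy stabilizer:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_pss(speed_dev, accel):
    """Toy fuzzy stabilizer: weighted average (centroid) of rule outputs.

    Inputs are normalized speed deviation and acceleration; the output
    is a normalized stabilizing signal opposing the dominant deviation.
    """
    labels = {'N': (-2.0, -1.0, 0.0),   # negative
              'Z': (-1.0, 0.0, 1.0),    # zero
              'P': (0.0, 1.0, 2.0)}     # positive
    out = {'N': -1.0, 'Z': 0.0, 'P': 1.0}
    num = den = 0.0
    for ls, ps in labels.items():
        for la, pa in labels.items():
            w = tri(speed_dev, *ps) * tri(accel, *pa)  # rule firing strength
            u = -(out[ls] + out[la]) / 2.0             # toy consequent
            num += w * u
            den += w
    return num / den if den else 0.0
```

    A positive speed deviation with zero acceleration yields a negative (damping) control signal, and vice versa.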

  5. X-ray microfluorescence with synchrotron radiation applied in the analysis of pigments from ancient Egypt

    NASA Astrophysics Data System (ADS)

    Calza, C.; Anjos, M. J.; Mendonça de Souza, S. M. F.; Brancaglion, A., Jr.; Lopes, R. T.

    2008-01-01

    In this work, X-ray microfluorescence with synchrotron radiation was applied to the analysis of pigments found in decorative paintings on the sarcophagus of an Egyptian mummy. This female mummy, from the Roman Period, which was embalmed with the arms and legs swathed separately, is considered one of the most important pieces of the Egyptian Collection of the National Museum (Rio de Janeiro, Brazil). The measurements were performed at the XRF beamline D09B of the Brazilian Synchrotron Light Laboratory (LNLS), using the white beam and a Si(Li) detector with a resolution of 165 eV at 5.9 keV. The possible pigments found in the samples were: Egyptian blue, Egyptian green frit, green earth, verdigris, malachite, ochre, realgar, chalk, gypsum, bone white, ivory black and magnetite. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) were applied to the results in order to verify whether the samples belong to the same period as a linen wrapping fragment whose provenance was well established.
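    The PCA step can be reproduced with a few lines of linear algebra. A sketch, assuming each row is one pigment sample and each column one element's fluorescence peak intensity; the data in the usage example are illustrative, not the LNLS measurements:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project the rows of X onto the leading principal components.

    Uses the eigendecomposition of the covariance matrix of the
    mean-centered data; columns of the result are PC scores.
    """
    Xc = X - X.mean(axis=0)                    # center each variable
    cov = np.cov(Xc, rowvar=False)             # covariance of columns
    vals, vecs = np.linalg.eigh(cov)           # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    return Xc @ vecs[:, order]
```

    Samples that cluster together in the score plot have similar elemental profiles, which is the basis for comparing the sarcophagus pigments with the linen fragment.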

  6. Analysis of asphalt-based roof systems using thermal analysis

    SciTech Connect

    Paroli, R.M.; Delgado, A.H.

    1996-10-01

    Asphalt is used extensively in roofing applications. Traditionally, it is used in a built-up roof system, where four or five plies are applied in conjunction with asphalt. This is labour intensive and requires good quality assurance on the roof top. Alternatively, asphalt can be used in a polymer-modified sheet, where styrene-butadiene-styrene (SBS) or atactic polypropylene (APP) is added to the asphalt, shipped in a roll to which reinforcement (e.g., glass fibre mat) has been added. Regardless of the system used, the roof must be able to withstand environmental loads such as UV, heat, etc. Thermoanalytical techniques such as DSC, DMA, TMA and TG/DTA are ideally suited to monitoring the weathering of asphalt. This paper presents data obtained using these techniques and shows how the performance of asphalt-based roof systems can be followed by thermal analysis.

  7. Probabilistic analysis of mechanical systems

    SciTech Connect

    Priddy, T.G.; Paez, T.L.; Veers, P.S.

    1993-09-01

    This paper proposes a framework for the comprehensive analysis of complex problems in probabilistic structural mechanics. Tools that can be used to accurately estimate the probabilistic behavior of mechanical systems are discussed, and some of the techniques proposed in the paper are developed and used in the solution of a problem in nonlinear structural dynamics.
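    One of the simplest tools in that toolbox is direct Monte Carlo sampling of a limit-state function. A sketch with an invented capacity/demand limit state g = R - S (the distributions and moments are illustrative, not taken from the paper):

```python
import numpy as np

def failure_probability(n=100_000, seed=0):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S,
    with hypothetical normally distributed capacity R and demand S."""
    rng = np.random.default_rng(seed)
    R = rng.normal(10.0, 1.0, n)   # structural capacity (resistance)
    S = rng.normal(6.0, 1.5, n)    # applied load (demand)
    return np.mean(R - S < 0.0)    # fraction of sampled failures
```

    For these moments the exact answer is Phi(-4/sqrt(1 + 1.5**2)) ~ 0.013; the estimator converges to it at the usual 1/sqrt(n) Monte Carlo rate.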

  8. Analysis of hybrid solar systems

    NASA Astrophysics Data System (ADS)

    Swisher, J.

    1980-10-01

    The TRNSYS simulation program was used to evaluate the performance of active charge/passive discharge solar systems with water as the working fluid. TRNSYS simulations are used to evaluate the heating performance and cooling augmentation provided by systems in several climates. The results of the simulations are used to develop a simplified analysis tool similar to the F-chart and Phi-bar procedures used for active systems. This tool, currently in a preliminary stage, should provide the designer with quantitative performance estimates for comparison with other passive, active, and nonsolar heating and cooling designs.

  9. System Safety Common Cause Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-03-10

    The COMCAN fault tree analysis codes are designed to analyze complex systems such as nuclear plants for common causes of failure. A common cause event, or common mode failure, is a secondary cause that could contribute to the failure of more than one component and violates the assumption of independence. Analysis of such events is an integral part of system reliability and safety analysis. A significant common cause event is a secondary cause common to all basic events in one or more minimal cut sets. Minimal cut sets containing events from components sharing a common location or a common link are called common cause candidates. Components share a common location if no barrier insulates any one of them from the secondary cause. A common link is a dependency among components which cannot be removed by a physical barrier (e.g., a common energy source or common maintenance instructions).
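    The screening step described above is easy to sketch: given minimal cut sets and a map from each component to its shared attributes (locations, energy sources, maintenance procedures), flag the cut sets whose components all share a value. A minimal illustration with invented component names, not COMCAN's actual algorithm:

```python
def common_cause_candidates(cut_sets, attributes):
    """Return (cut set, shared attribute values) pairs for every minimal
    cut set whose components all share at least one attribute value."""
    candidates = []
    for cs in cut_sets:
        values = [attributes.get(c, set()) for c in cs]
        shared = set.intersection(*values) if values else set()
        if shared:
            candidates.append((sorted(cs), sorted(shared)))
    return candidates
```

    A cut set whose components sit in the same room (no insulating barrier) or hang off the same bus would be reported as a common cause candidate.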

  10. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    NASA Astrophysics Data System (ADS)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.
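    At its core, a TOA estimate is a cross-correlation between a reference waveform and the received signal; the lag of the correlation peak gives the delay, which a propagation model then maps to source altitude. A generic sketch (the signals and sampling rate in the usage example are synthetic, not HAARP data):

```python
import numpy as np

def estimate_delay(tx, rx, fs):
    """Estimate the arrival delay (seconds) of reference waveform tx
    within received signal rx, sampled at fs Hz, from the peak of
    their cross-correlation."""
    corr = np.correlate(rx, tx, mode='full')
    lag = np.argmax(corr) - (len(tx) - 1)  # index of zero lag is len(tx)-1
    return lag / fs
```

    For a noise-free copy of the reference delayed by 37 samples at 1 kHz, the estimate is exactly 37 ms; with noise, the peak lag remains an unbiased estimate as long as the reference is broadband.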

  11. UV-Visible First-Derivative Spectrophotometry Applied to an Analysis of a Vitamin Mixture

    NASA Astrophysics Data System (ADS)

    Aberásturi, F.; Jiménez, A. I.; Jiménez, F.; Arias, J. J.

    2001-06-01

    A simple new experiment that uses UV-vis spectrophotometry to introduce undergraduate chemistry students to multicomponent analysis is presented and a method for the simultaneous determination of three vitamins using derivative spectrophotometry (zero-crossing method) is described. The methodology is simple and easy to apply and allows the determination of folic acid, pyridoxine, and thiamine over the concentration ranges 1.02-14.28, 1.00-16.00, and 6.00-20.00 mg mL-1, respectively. The resulting errors were nearly always less than 5%.
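    The zero-crossing method hinges on two numerical steps: differentiating the spectrum with respect to wavelength and locating sign changes of the derivative. A sketch, assuming a wavelength grid and an absorbance array (the single Gaussian band in the usage example is synthetic, not the vitamin data):

```python
import numpy as np

def first_derivative(wavelengths, absorbance):
    """First-derivative spectrum dA/d(lambda) via central differences."""
    return np.gradient(absorbance, wavelengths)

def zero_crossings(wavelengths, deriv):
    """Wavelengths where the derivative changes sign, located by
    linear interpolation between adjacent grid points."""
    idx = np.where(np.sign(deriv[:-1]) * np.sign(deriv[1:]) < 0)[0]
    out = []
    for i in idx:
        w0, w1 = wavelengths[i], wavelengths[i + 1]
        d0, d1 = deriv[i], deriv[i + 1]
        out.append(w0 - d0 * (w1 - w0) / (d1 - d0))  # root of the chord
    return np.array(out)
```

    In the zero-crossing method, each analyte is quantified at a wavelength where the derivative signals of the other components cross zero, so their contributions vanish.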

  12. Analysis of Preconditioning and Relaxation Operators for the Discontinuous Galerkin Method Applied to Diffusion

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.; Shu, Chi-Wang

    2001-01-01

    The explicit stability constraint of the discontinuous Galerkin method applied to the diffusion operator decreases dramatically as the order of the method is increased. Block Jacobi and block Gauss-Seidel preconditioner operators are examined for their effectiveness at accelerating convergence. A Fourier analysis for methods of order 2 through 6 reveals that both preconditioner operators bound the eigenvalues of the discrete spatial operator. Additionally, in one dimension, the eigenvalues are grouped into two or three regions that are invariant with order of the method. Local relaxation methods are constructed that rapidly damp high frequencies for arbitrarily large time step.
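    The flavor of such an eigenvalue analysis can be reproduced in a few lines. The sketch below does not reproduce the DG discretization; it uses a standard second-difference diffusion matrix as a simplified stand-in for the spatial operator and examines the spectrum of the block-Jacobi preconditioned operator:

```python
import numpy as np

def diffusion_matrix(n, h=1.0):
    """1-D second-difference diffusion operator with Dirichlet boundaries."""
    return (np.diag(-2.0 * np.ones(n))
            + np.diag(np.ones(n - 1), 1)
            + np.diag(np.ones(n - 1), -1)) / h**2

def block_jacobi_preconditioned_eigs(A, block):
    """Eigenvalues of M^{-1} A, where M is the block-diagonal part of A."""
    n = A.shape[0]
    M = np.zeros_like(A)
    for i in range(0, n, block):
        j = min(i + block, n)
        M[i:j, i:j] = A[i:j, i:j]          # copy one diagonal block
    return np.linalg.eigvals(np.linalg.solve(M, A))
```

    For this model operator the preconditioned eigenvalues are real and lie in (0, 2), mirroring the bounded-spectrum behavior the paper establishes for the DG operator by Fourier analysis.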

  13. Stakeholder analysis for industrial waste management systems.

    PubMed

    Heidrich, Oliver; Harvey, Joan; Tollin, Nicola

    2009-02-01

    Stakeholder approaches have been applied to the management of companies with a view to improving all areas of performance, including economics, health and safety, waste reduction, future policies, etc. However, no agreement exists regarding stakeholders, their interests, and their levels of importance. This paper considers stakeholder analysis with particular reference to environmental and waste management systems. It proposes a template and matrix model for identifying stakeholder roles and influences by rating the stakeholders. A case study demonstrates their use, and their transferability to other circumstances and organizations is illustrated using a large educational institution. PMID:18790624

  14. Applying the generic errors modeling system to obstetric hemorrhage quality improvement efforts.

    PubMed

    Bingham, Debra

    2012-01-01

    Obstetric hemorrhage is an emergency situation in which clinicians can make errors that cause women to suffer preventable maternal morbidity and mortality. Scrutinizing commonly occurring obstetric hemorrhage-related practice errors by applying the generic errors modeling system, a research-based framework, to quality improvement efforts facilitates the identification of error specific reduction strategies. The common types of errors are skill-based, rule-based, and knowledge-based active and latent errors. PMID:22548710

  15. Applying generalized stochastic Petri nets to manufacturing systems containing nonexponential transition functions

    NASA Technical Reports Server (NTRS)

    Watson, James F., III; Desrochers, Alan A.

    1991-01-01

    Generalized stochastic Petri nets (GSPNs) are applied to flexible manufacturing systems (FMSs). Throughput subnets and s-transitions are presented. Two FMS examples containing nonexponential distributions, previously analyzed by queuing theory and probability theory, respectively, are treated using GSPNs built from throughput subnets and s-transitions. The GSPN results agree with the previous results, and developing and analyzing the GSPN models is straightforward and relatively easy compared to other methodologies.

  16. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health, applicable to mechanical, electrical, chemical and optical systems, and methods of operating the system are described herein.

  17. Logic production systems: Analysis and synthesis

    SciTech Connect

    Donskoi, V.I.

    1995-03-01

    Many applied systems can be described in the following terms: given a certain number of objects and a set of rules, new objects are constructed from the original objects and from previously constructed objects. Mathematicians call such systems deductive systems, or calculi. Artificial intelligence researchers subsequently refined and elaborated the notion of production, retaining the Post operator A → B as a basic element or core. Production models are generally regarded as lacking a rigorous theory and as governed by heuristics. Maslov noted: "We may assume that the language of calculi will become in the near future as natural and as widespread in new applications of discrete mathematics as, for instance, the language of graph theory is today." The studies surveyed below were triggered by the development of applications of production systems in dual expert systems and focus on the following topics: (1) formalization of logic production systems (Pospelov has noted that results in the theory of production systems can be obtained by restricting the notion of productions and production systems); (2) analysis of the completeness of logic production systems as a tool for the realization of Boolean functions; (3) construction of a universal algorithmic model based on a logic production system; (4) construction of algorithms that synthesize the domain of deductive derivability of a given goal fact, and analysis of the algorithmic complexity of the corresponding problem. It is important to note that the results obtained so far relate to a strictly defined subclass - the subclass of logic production systems and machines. They do not pretend to cover the wider domain of applicability of the apparatus of deductive systems. Classical concepts and propositions of discrete mathematics used in this paper without further explanation are defined in the existing literature.
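    The core of any production system of this kind - fire a production A → B whenever all premises in A are established - is a few lines of forward chaining. A minimal sketch (the facts and rules in the usage example are invented):

```python
def forward_chain(facts, rules):
    """Forward chaining over productions (premises, conclusion):
    repeatedly fire any rule whose premises all hold, until no rule
    adds a new fact. Returns the closure of the initial fact set."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and set(premises) <= facts:
                facts.add(conclusion)
                changed = True
    return facts
```

    The returned set is exactly the domain of deductive derivability of the initial facts under the given productions, the object studied in topic (4) above.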

  18. A traffic situation analysis system

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin

    2011-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. For example, embedded vision systems built into vehicles can be used as early warning systems, or stationary camera systems can modify the switching frequency of signals at intersections. Today the automated analysis of traffic situations is still in its infancy - the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully understood by a vision system. We present steps towards such a traffic monitoring system, which is designed to detect potentially dangerous traffic situations, especially incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system is field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in an outdoor-capable housing. Two cameras run vehicle detection software including license plate detection and recognition; one camera runs a complex pedestrian detection and tracking module based on the HOG detection principle. As a supplement, all 3 cameras use additional optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. This work describes the foundation for all 3 object detection modalities (pedestrians, vehicles, license plates) and explains the system setup and its design.

  19. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    NASA Astrophysics Data System (ADS)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
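    Among the conventional parameters of surface metrology, the simplest are the areal height parameters Sa and Sq computed from a measured height map. A sketch, assuming heights sampled on a regular grid (the array in the usage example is synthetic, not confocal microscopy data):

```python
import numpy as np

def roughness_parameters(z):
    """Conventional areal texture parameters from a height map z:
    Sa, the mean absolute deviation, and Sq, the RMS deviation,
    both measured about the mean plane."""
    dev = z - z.mean()                    # heights relative to mean plane
    return {'Sa': np.abs(dev).mean(),
            'Sq': np.sqrt((dev ** 2).mean())}
```

    By the power-mean inequality Sq >= Sa always holds; polished and abraded wear surfaces differ in how these parameters scale with the evaluation area, which motivates the multi-scale characterizations used in the study.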

  20. Analysis of sonic well logs applied to erosion estimates in the Bighorn Basin, Wyoming

    SciTech Connect

    Heasler, H.P.; Kharitonova, N.A.

    1996-05-01

    An improved exponential model of sonic transit time data as a function of depth takes into account the physical range of rock sonic velocities. In this way, the model is more geologically realistic for predicting compaction trends when compared to linear or simple exponential functions that fail at large depth intervals. The improved model is applied to the Bighorn basin of northwestern Wyoming for calculation of erosion amounts. This basin was chosen because of extensive geomorphic research that constrains erosion models and because of the importance of quantifying erosion amounts for basin analysis and hydrocarbon maturation prediction. Thirty-six wells were analyzed using the improved exponential model. Seven of these wells, due to limited data from the Tertiary section, were excluded from the basin erosion analysis. Erosion amounts from the remaining 29 wells ranged from 0 to 5600 ft (1700 m), with an average of 2500 ft (800 m).
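    A bounded-exponential compaction model of this kind can be inverted directly for the eroded thickness: if transit time decays from a surface value dt0 toward the matrix value dt_m as dt(z) = dt_m + (dt0 - dt_m)·exp(-c·z), the removed section E is the depth at which the model predicts the transit time now observed at the surface. A sketch with invented parameter values, not the paper's Bighorn basin calibration:

```python
import math

def erosion_estimate(dt_surface, dt0=180.0, dt_matrix=55.0, c=0.0004):
    """Eroded thickness (ft) from a bounded-exponential compaction model.

    dt(z) = dt_matrix + (dt0 - dt_matrix) * exp(-c * z), with transit
    times in microseconds/ft and the decay constant c in 1/ft; solves
    dt(E) = dt_surface for E, the thickness of removed section.
    """
    return math.log((dt0 - dt_matrix) / (dt_surface - dt_matrix)) / c
```

    Because the model is bounded below by the matrix transit time, the inversion stays physical at depth, which is the advantage over the simple exponential claimed in the abstract.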