Selecting the process variables for filament winding
NASA Technical Reports Server (NTRS)
Calius, E.; Springer, G. S.
1986-01-01
A model is described which can be used to determine the appropriate values of the process variables for filament winding cylinders. The process variables which can be selected by the model include the winding speed, fiber tension, initial resin degree of cure, and the temperatures applied during winding, curing, and post-curing. The effects of these process variables on the properties of the cylinder during and after manufacture are illustrated by a numerical example.
Process Variability and Capability in Candy Production and Packaging
ERIC Educational Resources Information Center
Lembke, Ronald S.
2016-01-01
In this short, in-class activity, students use fun size packages of M&Ms to study process variability, including a real-world application of C_pk. How process variability and legal requirements force the company to put "Not Labeled for Individual Retail Sale" on each fun size package is discussed, as is the economics of…
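The C_pk statistic mentioned above compares the spread of a process to its specification limits. A minimal sketch, using hypothetical package weights and invented specification limits rather than any real M&M data:

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearer specification limit, in units of three standard deviations."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical fun-size package weights in grams (illustrative only)
weights = [15.1, 14.8, 15.3, 15.0, 14.9, 15.2, 14.7, 15.1]
print(round(cpk(weights, lsl=14.0, usl=16.0), 2))
```

A C_pk above roughly 1.33 is conventionally read as a capable process, while values near or below 1 suggest the process spread is consuming most of the tolerance.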
NASA Technical Reports Server (NTRS)
1980-01-01
A quality assurance program was developed which included specifications for Celion/LARC-160 polyimide materials and quality control of materials and processes. The effects of monomer and/or polymer variables and prepreg variables on the processibility of Celion/LARC-160 prepreg were included. Processes for fabricating laminates, honeycomb core panels, and chopped fiber moldings were developed. Specimens were fabricated and tests were conducted to qualify the processes for fabrication of demonstration components.
Mangwandi, Chirangano; Adams, Michael J; Hounslow, Michael J; Salman, Agba D
2012-05-10
Being able to predict the properties of granules from knowledge of the process and formulation variables is what most industries are striving for. This research uses experimental design to investigate the effects of process and formulation variables on the mechanical properties of pharmaceutical granules manufactured from a classical blend of lactose and starch using hydroxypropyl cellulose (HPC) as the binder. The process parameters investigated were granulation time and impeller speed, whilst the formulation variables were starch-to-lactose ratio and HPC concentration. The granule properties investigated include granule packing coefficient and granule strength. The effect of some components of the formulation on mechanical properties was also found to depend on the process variables used in the granulation process. This implies that subjecting the same formulation to different process conditions results in products with different properties.
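The interaction the authors describe, formulation effects that depend on the process conditions, is exactly what a two-level factorial design quantifies. A sketch with invented response values, not the paper's data: coded levels for two factors (A could stand for granulation time, B for impeller speed) and the classical main-effect and interaction contrasts:

```python
# Coded 2x2 factorial runs: (factor A, factor B, response).
# Hypothetical granule-strength values, illustrative only.
runs = [
    (-1, -1, 3.1),
    (+1, -1, 3.8),
    (-1, +1, 3.4),
    (+1, +1, 5.0),
]

def effect(runs, col):
    """Main effect: mean response at the +1 level minus at the -1 level."""
    hi = [r[2] for r in runs if r[col] == +1]
    lo = [r[2] for r in runs if r[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

def interaction(runs):
    """AB interaction: the same contrast on the sign of the level product."""
    hi = [r[2] for r in runs if r[0] * r[1] == +1]
    lo = [r[2] for r in runs if r[0] * r[1] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print(effect(runs, 0), effect(runs, 1), interaction(runs))
```

A nonzero interaction contrast means the effect of one factor changes with the level of the other, i.e. the same formulation behaves differently under different process conditions.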
The HPT Value Proposition in the Larger Improvement Arena.
ERIC Educational Resources Information Center
Wallace, Guy W.
2003-01-01
Discussion of human performance technology (HPT) emphasizes the key variable, which is the human variable. Highlights include the Ishikawa Diagram; human performance as one variable of process performance; collaborating with other improvement approaches; value propositions; and benefits to stakeholders, including real return on investments. (LRW)
Advanced multivariable control of a turboexpander plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altena, D.; Howard, M.; Bullin, K.
1998-12-31
This paper describes an application of advanced multivariable control on a natural gas plant and compares its performance to the previous conventional feedback control. This control algorithm utilizes simple models from existing plant data and/or plant tests to hold the process at the desired operating point in the presence of disturbances and changes in operating conditions. The control software is able to accomplish this due to effective handling of process variable interaction, constraint avoidance and feed-forward of measured disturbances. The economic benefit of improved control lies in operating closer to the process constraints while avoiding significant violations. The South Texas facility where this controller was implemented experienced reduced variability in process conditions, which increased liquids recovery because the plant was able to operate much closer to the customer-specified impurity constraint. An additional benefit of this implementation of multivariable control is the ability to set performance criteria beyond simple setpoints, including process variable constraints, relative variable merit and optimizing use of manipulated variables. The paper also details the control scheme applied to the complex turboexpander process and some of the safety features included to improve reliability.
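The feed-forward and constraint-handling ideas can be sketched with a toy steady-state model. All gains and limits below are invented for illustration and have nothing to do with the actual plant: the controller inverts a linear model y = gain_u*u + gain_d*d to choose a manipulated-variable move, feeding forward the measured disturbance d, then clamps the move to actuator constraints:

```python
def control_move(setpoint, disturbance, gain_u=2.0, gain_d=-0.5,
                 u_min=0.0, u_max=10.0):
    """One steady-state move: solve y = gain_u*u + gain_d*d for u so
    that y hits the setpoint given the measured disturbance, then
    clamp u to the actuator constraints."""
    u = (setpoint - gain_d * disturbance) / gain_u
    return max(u_min, min(u_max, u))

print(control_move(8.0, 2.0))   # unconstrained move
print(control_move(30.0, 0.0))  # move clamped at the upper limit
```

The clamp is the crudest form of constraint handling; real multivariable controllers instead optimize all moves jointly subject to the constraints, but the division of labor (model inversion plus feed-forward plus limits) is the same.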
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
The variable polarity plasma arc welding process: Its application to the Space Shuttle external tank
NASA Technical Reports Server (NTRS)
Nunes, A. C., Jr.; Bayless, O. E., Jr.; Jones, C. S., III; Munafo, A. P.; Wilson, W. A.
1983-01-01
The technical history of the variable polarity plasma arc (VPPA) welding process being introduced as a partial replacement for the gas shielded tungsten arc process in assembly welding of the space shuttle external tank is described. Interim results of the weld strength qualification studies, and plans for further work on the implementation of the VPPA process are included.
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.
1972-01-01
The effort and results of a program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells are reported. During the period, the impregnation/polarization process variable study was brought to a close with the completion of a series of related experiments. The results of the experiments are summarized. During this period, a general characterization of cell separator materials was initiated. The major conclusions resulting from the characterization of materials are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Ming; Deng, Yi
2015-02-06
El Niño-Southern Oscillation (ENSO) and Annular Modes (AMs) represent respectively the most important modes of low frequency variability in the tropical and extratropical circulations. The future projection of the ENSO and AM variability, however, remains highly uncertain with the state-of-the-art coupled general circulation models. A comprehensive understanding of the factors responsible for the inter-model discrepancies in projecting future changes in the ENSO and AM variability, in terms of multiple feedback processes involved, has yet to be achieved. The proposed research aims to identify sources of such uncertainty and establish a set of process-resolving quantitative evaluations of the existing predictions of the future ENSO and AM variability. The proposed process-resolving evaluations are based on a feedback analysis method formulated in Lu and Cai (2009), which is capable of partitioning 3D temperature anomalies/perturbations into components linked to 1) radiation-related thermodynamic processes such as cloud and water vapor feedbacks, 2) local dynamical processes including convection and turbulent/diffusive energy transfer and 3) non-local dynamical processes such as the horizontal energy transport in the oceans and atmosphere. Taking advantage of the high-resolution, multi-model ensemble products from the Coupled Model Intercomparison Project Phase 5 (CMIP5) soon to be available at the Lawrence Livermore National Lab, we will conduct a process-resolving decomposition of the global three-dimensional (3D) temperature (including SST) response to the ENSO and AM variability in the preindustrial, historical and future climate simulated by these models.
Specific research tasks include 1) identifying the model-observation discrepancies in the global temperature response to ENSO and AM variability and attributing such discrepancies to specific feedback processes, 2) delineating the influence of anthropogenic radiative forcing on the key feedback processes operating on ENSO and AM variability and quantifying their relative contributions to the changes in the temperature anomalies associated with different phases of ENSO and AMs, and 3) investigating the linkages between model feedback processes that lead to inter-model differences in time-mean temperature projection and model feedback processes that cause inter-model differences in the simulated ENSO and AM temperature response. Through a thorough model-observation and inter-model comparison of the multiple energetic processes associated with ENSO and AM variability, the proposed research serves to identify key uncertainties in model representation of ENSO and AM variability, and investigate how the model uncertainty in predicting time-mean response is related to the uncertainty in predicting response of the low-frequency modes. The proposal is thus a direct response to the first topical area of the solicitation: Interaction of Climate Change and Low Frequency Modes of Natural Climate Variability. It ultimately supports the accomplishment of the BER climate science activity Long Term Measure (LTM): "Deliver improved scientific data and models about the potential response of the Earth's climate and terrestrial biosphere to increased greenhouse gas levels for policy makers to determine safe levels of greenhouse gases in the atmosphere."
Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing
2003-01-01
SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.
Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S
2002-02-01
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. The Classical model was used to weight the experts' assessments in order to construct a single distribution per variable; in this model, an expert's quality is typically based on performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
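The performance-based weighting at the heart of this aggregation step can be illustrated with a linear opinion pool over discretized densities. The experts, scores, and probabilities below are entirely made up; in Cooke's Classical model the score would come from calibration and information measures on the seed variables:

```python
# Hypothetical example: three experts give probabilities over the same
# three bins for one variable of interest; each expert also carries a
# performance score derived from seed variables (higher = better).
experts = {
    "A": {"score": 0.8, "probs": [0.2, 0.5, 0.3]},
    "B": {"score": 0.5, "probs": [0.1, 0.6, 0.3]},
    "C": {"score": 0.2, "probs": [0.4, 0.4, 0.2]},
}

# Normalize the scores into weights that sum to one
total = sum(e["score"] for e in experts.values())
weights = {name: e["score"] / total for name, e in experts.items()}

# Weighted (linear opinion pool) combination, bin by bin
pooled = [
    sum(weights[name] * e["probs"][i] for name, e in experts.items())
    for i in range(3)
]
print([round(p, 3) for p in pooled])
```

Because the weights are normalized and each expert's distribution sums to one, the pooled distribution is itself a valid probability distribution.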
NASA Astrophysics Data System (ADS)
Verrelst, Jochem; Malenovský, Zbyněk; Van der Tol, Christiaan; Camps-Valls, Gustau; Gastellu-Etchegorry, Jean-Philippe; Lewis, Philip; North, Peter; Moreno, Jose
2018-06-01
An unprecedented spectroscopic data stream will soon become available with forthcoming Earth-observing satellite missions equipped with imaging spectroradiometers. This data stream will open up a vast array of opportunities to quantify a diversity of biochemical and structural vegetation properties. The processing requirements for such large data streams require reliable retrieval techniques enabling the spatiotemporally explicit quantification of biophysical variables. With the aim of preparing for this new era of Earth observation, this review summarizes the state-of-the-art retrieval methods that have been applied in experimental imaging spectroscopy studies inferring all kinds of vegetation biophysical variables. Identified retrieval methods are categorized into: (1) parametric regression, including vegetation indices, shape indices and spectral transformations; (2) nonparametric regression, including linear and nonlinear machine learning regression algorithms; (3) physically based, including inversion of radiative transfer models (RTMs) using numerical optimization and look-up table approaches; and (4) hybrid regression methods, which combine RTM simulations with machine learning regression methods. For each of these categories, an overview of widely applied methods with application to mapping vegetation properties is given. In view of processing imaging spectroscopy data, a critical aspect involves the challenge of dealing with spectral multicollinearity. The ability to provide robust estimates, retrieval uncertainties and acceptable retrieval processing speed are other important aspects in view of operational processing. Recommendations towards new-generation spectroscopy-based processing chains for operational production of biophysical variables are given.
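Of the retrieval categories above, look-up-table (LUT) inversion of a radiative transfer model is the simplest to sketch: simulate spectra over a grid of variable values, then pick the entry whose spectrum is closest to the observation. The table below is invented (two bands and a handful of leaf-area-index values), not the output of any real RTM:

```python
import math

# Hypothetical look-up table: leaf-area-index value -> simulated
# two-band reflectance (illustrative numbers, not a real RTM)
lut = {
    0.5: [0.30, 0.25],
    1.0: [0.26, 0.30],
    2.0: [0.20, 0.38],
    4.0: [0.12, 0.45],
}

def invert(observed):
    """Return the LUT parameter whose simulated spectrum has the
    smallest RMSE against the observed spectrum."""
    def rmse(sim):
        return math.sqrt(
            sum((s - o) ** 2 for s, o in zip(sim, observed)) / len(sim))
    return min(lut, key=lambda lai: rmse(lut[lai]))

print(invert([0.21, 0.37]))
```

Real LUT inversions use dense grids, multiple cost functions, and regularization against ill-posedness, but the nearest-spectrum search is the core operation.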
NASA Technical Reports Server (NTRS)
Entekhabi, D.; Eagleson, P. S.
1989-01-01
Parameterizations are developed for the representation of subgrid hydrologic processes in atmospheric general circulation models. Reasonable a priori probability density functions of the spatial variability of soil moisture and of precipitation are introduced. These are used in conjunction with the deterministic equations describing basic soil moisture physics to derive expressions for the hydrologic processes that include subgrid scale variation in parameters. The major model sensitivities to soil type and to climatic forcing are explored.
Compensation for Lithography Induced Process Variations during Physical Design
NASA Astrophysics Data System (ADS)
Chin, Eric Yiow-Bing
This dissertation addresses the challenge of designing robust integrated circuits in the deep sub-micron regime in the presence of lithography process variability. By extending and combining existing process and circuit analysis techniques, flexible software frameworks are developed to provide detailed studies of circuit performance in the presence of lithography variations such as focus and exposure. Applications of these software frameworks to select circuits demonstrate the electrical impact of these variations and provide insight into variability aware compact models that capture the process dependent circuit behavior. These variability aware timing models abstract lithography variability from the process level to the circuit level and are used to estimate path level circuit performance with high accuracy with very little overhead in runtime. The Interconnect Variability Characterization (IVC) framework maps lithography induced geometrical variations at the interconnect level to electrical delay variations. This framework is applied to one dimensional repeater circuits patterned with both 90nm single patterning and 32nm double patterning technologies, under the presence of focus, exposure, and overlay variability. Studies indicate that single and double patterning layouts generally exhibit small variations in delay (between 1-3%) due to self-compensating RC effects associated with dense layouts and overlay errors for layouts without self-compensating RC effects. The delay response of each double patterned interconnect structure is fit with a second order polynomial model with focus, exposure, and misalignment parameters with 12 coefficients and residuals of less than 0.1 ps. The IVC framework is also applied to a repeater circuit with cascaded interconnect structures to emulate more complex layout scenarios, and it is observed that the variations on each segment average out to reduce the overall delay variation.
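The self-compensating RC effect noted above can be seen in a toy Elmore-style delay model: if a lithography variation widens a wire, its resistance falls while its capacitance rises, so the RC product moves much less than either term alone. The constants below are arbitrary and not from any real process:

```python
def wire_delay(width, rho=0.1, cap_per_area=0.2, cap_fringe=0.05,
               length=100.0):
    """Lumped-RC delay of a wire: R = rho*L/W falls with width while
    C = (cap_per_area*W + cap_fringe)*L rises with width, so the RC
    product partly self-compensates (illustrative constants only)."""
    r = rho * length / width
    c = (cap_per_area * width + cap_fringe) * length
    return r * c

nominal = wire_delay(1.0)
wide = wire_delay(1.1)  # +10% width from a focus/exposure variation
print(round((wide - nominal) / nominal * 100, 1))
```

With these numbers a 10% geometry change moves the delay by under 2%, the same order as the 1-3% delay variations the IVC study reports for dense layouts.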
The Standard Cell Variability Characterization (SCVC) framework advances existing layout-level lithography aware circuit analysis by extending it to cell-level applications utilizing a physically accurate approach that integrates process simulation, compact transistor models, and circuit simulation to characterize electrical cell behavior. This framework is applied to combinational and sequential cells in the Nangate 45nm Open Cell Library, and the timing response of these cells to lithography focus and exposure variations demonstrate Bossung like behavior. This behavior permits the process parameter dependent response to be captured in a nine term variability aware compact model based on Bossung fitting equations. For a two input NAND gate, the variability aware compact model captures the simulated response to an accuracy of 0.3%. The SCVC framework is also applied to investigate advanced process effects including misalignment and layout proximity. The abstraction of process variability from the layout level to the cell level opens up an entire new realm of circuit analysis and optimization and provides a foundation for path level variability analysis without the computationally expensive costs associated with joint process and circuit simulation. The SCVC framework is used with slight modification to illustrate the speedup and accuracy tradeoffs of using compact models. With variability aware compact models, the process dependent performance of a three stage logic circuit can be estimated to an accuracy of 0.7% with a speedup of over 50,000. Path level variability analysis also provides an accurate estimate (within 1%) of ring oscillator period in well under a second. Another significant advantage of variability aware compact models is that they can be easily incorporated into existing design methodologies for design optimization. This is demonstrated by applying cell swapping on a logic circuit to reduce the overall delay variability along a circuit path. 
By including these variability aware compact models in cell characterization libraries, design metrics such as circuit timing, power, area, and delay variability can be quickly assessed to optimize for the correct balance of all design metrics, including delay variability. Deterministic lithography variations can be easily captured using the variability aware compact models described in this dissertation. However, another prominent source of variability is random dopant fluctuations, which affect transistor threshold voltage and in turn circuit performance. The SCVC framework is utilized to investigate the interactions between deterministic lithography variations and random dopant fluctuations. Monte Carlo studies show that the output delay distribution in the presence of random dopant fluctuations is dependent on lithography focus and exposure conditions, with a 3.6 ps change in standard deviation across the focus exposure process window. This indicates that the electrical impact of random variations is dependent on systematic lithography variations, and this dependency should be included for precise analysis.
Formulation characteristics and in vitro release testing of cyclosporine ophthalmic ointments.
Dong, Yixuan; Qu, Haiou; Pavurala, Naresh; Wang, Jiang; Sekar, Vasanthakumar; Martinez, Marilyn N; Fahmy, Raafat; Ashraf, Muhammad; Cruz, Celia N; Xu, Xiaoming
2018-06-10
The aim of the present study was to investigate the relationship between formulation/process variables and the critical quality attributes (CQAs) of cyclosporine ophthalmic ointments and to explore the feasibility of using an in vitro approach to assess product sameness. A definitive screening design (DSD) was used to evaluate the impact of formulation and process variables. The formulation variables included drug percentage, percentage of corn oil and lanolin alcohol. The process variables studied were mixing temperature, mixing time and the method of mixing. The quality and performance attributes examined included drug assay, content uniformity, image analysis, rheology (storage modulus, shear viscosity) and in vitro drug release. Of the formulation variables evaluated, the percentage of the drug substance and the percentage of corn oil in the matrix were the most influential factors with respect to in vitro drug release. Conversely, the process parameters tested were observed to have minimal impact. An evaluation of the release mechanism of cyclosporine from the ointment revealed an interplay between formulation (e.g. physicochemical properties of the drug and ointment matrix type) and the release medium. These data provide a scientific basis to guide method development for in vitro drug release testing of ointment dosage forms. These results demonstrate that the in vitro methods used in this investigation were fit-for-purpose for detecting formulation and process changes and therefore amenable to assessment of product sameness.
UAH mathematical model of the variable polarity plasma ARC welding system calculation
NASA Technical Reports Server (NTRS)
Hung, R. J.
1994-01-01
Significant advantages of Variable Polarity Plasma Arc (VPPA) welding process include faster welding, fewer repairs, less joint preparation, reduced weldment distortion, and absence of porosity. A mathematical model is presented to analyze the VPPA welding process. Results of the mathematical model were compared with the experimental observation accomplished by the GDI team.
NASA Astrophysics Data System (ADS)
Das, Siddhartha; Siopsis, George; Weedbrook, Christian
2018-02-01
With the significant advancement in quantum computation during the past couple of decades, the exploration of machine-learning subroutines using quantum strategies has become increasingly popular. Gaussian process regression is a widely used technique in supervised classical machine learning. Here we introduce an algorithm for Gaussian process regression using continuous-variable quantum systems that can be realized with technology based on photonic quantum computers, under certain assumptions regarding the distribution of data and the availability of efficient quantum access. Our algorithm shows that by using a continuous-variable quantum computer a dramatic speedup in computing Gaussian process regression can be achieved, i.e., the possibility of exponentially reducing the computation time. Furthermore, our results also include a continuous-variable quantum-assisted singular value decomposition method for nonsparse low-rank matrices, which forms an important subroutine in our Gaussian process regression algorithm.
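For context, the classical computation that such quantum algorithms aim to accelerate is the Gaussian process posterior mean, k_*^T (K + sigma^2 I)^{-1} y, whose dense linear solve costs O(n^3). A self-contained classical sketch on toy 1-D data with an RBF kernel and a naive Gaussian-elimination solve; the kernel, data, and noise level are all illustrative:

```python
import math

def rbf(x, y, length=1.0):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-(x - y) ** 2 / (2 * length ** 2))

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (Ax = b)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j]
                              for j in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    """GP posterior mean: k_*^T (K + noise*I)^{-1} y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xs[i]) * alpha[i] for i in range(n))

# Toy data: noiseless samples of sin(x); predict between training points
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.sin(x) for x in xs]
print(gp_predict(xs, ys, 1.5))
```

The O(n^3) elimination step is the bottleneck this sketch makes explicit; the paper's claim is that quantum access to the data can make an analogous solve dramatically cheaper.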
The underlying processes of a soil mite metacommunity on a small scale.
Dong, Chengxu; Gao, Meixiang; Guo, Chuanwei; Lin, Lin; Wu, Donghui; Zhang, Limin
2017-01-01
Metacommunity theory provides an understanding of how ecological processes regulate local community assemblies. However, few field studies have evaluated the underlying mechanisms of a metacommunity on a small scale through revealing the relative roles of spatial and environmental filtering in structuring local community composition. Based on a spatially explicit sampling design in 2012 and 2013, this study aims to evaluate the underlying processes of a soil mite metacommunity on a small spatial scale (50 m) in a temperate deciduous forest located at the Maoershan Ecosystem Research Station, Northeast China. Moran's eigenvector maps (MEMs) were used to model independent spatial variables. The relative importance of spatial (including trend variables, i.e., geographical coordinates, and broad- and fine-scale spatial variables) and environmental factors in driving the soil mite metacommunity was determined by variation partitioning. Mantel and partial Mantel tests and a redundancy analysis (RDA) were also used to identify the relative contributions of spatial and environmental variables. The results of variation partitioning suggested that a relatively large and significant share of the variance was explained by spatial variables (including broad- and fine-scale spatial variables and trend), indicating the importance of dispersal limitation and autocorrelation processes. A significant contribution of environmental variables was detected in 2012 based on a partial Mantel test, and soil moisture and soil organic matter were especially important for soil mite metacommunity composition in both years. The study suggested that the soil mite metacommunity was primarily regulated by dispersal limitation at a broad scale and by neutral biotic processes at a fine scale, and that environmental filtering might be of subordinate importance.
In conclusion, a combination of metacommunity perspectives between neutral and species sorting theories was suggested to be important in the observed structure of the soil mite metacommunity at the studied small scale.
Pacholewicz, Ewa; Swart, Arno; Wagenaar, Jaap A; Lipman, Len J A; Havelaar, Arie H
2016-12-01
This study aimed at identifying explanatory variables that were associated with Campylobacter and Escherichia coli concentrations throughout processing in two commercial broiler slaughterhouses. Quantitative data on Campylobacter and E. coli along the processing line were collected. Moreover, information on batch characteristics, slaughterhouse practices, process performance, and environmental variables was collected through questionnaires, observations, and measurements, resulting in data on 19 potential explanatory variables. Analysis was conducted separately in each slaughterhouse to identify which variables were related to changes in concentrations of Campylobacter and E. coli during the processing steps: scalding, defeathering, evisceration, and chilling. Associations with explanatory variables were different in the slaughterhouses studied. In the first slaughterhouse, there was only one significant association: poorer uniformity of the weight of carcasses within a batch with less decrease in E. coli concentrations after defeathering. In the second slaughterhouse, significant statistical associations were found with variables including age, uniformity, average weight of carcasses, Campylobacter concentrations in excreta and ceca, and E. coli concentrations in excreta. Bacterial concentrations in excreta and ceca were found to be the most prominent variables, because they were associated with concentrations on carcasses at various processing points. Although the slaughterhouses produced specific products and had different batch characteristics and processing parameters, the effect of the significant variables was not always the same for each slaughterhouse. Therefore, each slaughterhouse needs to determine its particular relevant measures for hygiene control and process management. This identification could be supported by monitoring changes in bacterial concentrations during processing in individual slaughterhouses.
In addition, the possibility that management and food handling practices in slaughterhouses contribute to the differences in bacterial contamination between slaughterhouses needs further investigation.
Phase 1 of the automated array assembly task of the low cost silicon solar array project
NASA Technical Reports Server (NTRS)
Pryor, R. A.; Grenon, L. A.; Coleman, M. G.
1978-01-01
The results of a study of process variables and solar cell variables are presented. Interactions between variables and their effects upon control ranges of the variables are identified. The results of a cost analysis for manufacturing solar cells are discussed. The cost analysis includes a sensitivity analysis of a number of cost factors.
A gentle introduction to quantile regression for ecologists
Cade, B.S.; Noon, B.R.
2003-01-01
Quantile regression is a way to estimate the conditional quantiles of a response variable distribution in the linear model that provides a more complete view of possible causal relationships between variables in ecological processes. Typically, all the factors that affect ecological processes are not measured and included in the statistical models used to investigate relationships between variables associated with those processes. As a consequence, there may be a weak or no predictive relationship between the mean of the response variable (y) distribution and the measured predictive factors (X). Yet there may be stronger, useful predictive relationships with other parts of the response variable distribution. This primer relates quantile regression estimates to prediction intervals in parametric error distribution regression models (e.g., least squares), and discusses the ordering characteristics, interval nature, sampling variation, weighting, and interpretation of the estimates for homogeneous and heterogeneous regression models.
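Where least squares minimizes squared error, quantile regression minimizes an asymmetric "pinball" loss. A minimal illustration with made-up data: for an intercept-only model, brute-force minimization of the pinball loss at level tau recovers an empirical tau-quantile of the sample, which is the mechanism that lets quantile regression target any part of the response distribution:

```python
def pinball(q, tau, y):
    """Total pinball loss of the constant predictor q at level tau:
    under-predictions are charged tau, over-predictions (1 - tau)."""
    return sum(tau * (v - q) if v >= q else (1 - tau) * (q - v) for v in y)

def fit_const_quantile(y, tau, candidates):
    """Intercept-only quantile regression by brute-force search."""
    return min(candidates, key=lambda q: pinball(q, tau, y))

y = [2.0, 3.0, 5.0, 7.0, 11.0, 13.0, 17.0]
grid = [i / 10 for i in range(201)]  # candidate constants 0.0 .. 20.0
print(fit_const_quantile(y, 0.5, grid), fit_const_quantile(y, 0.9, grid))
```

Full quantile regression replaces the constant with a linear predictor Xb and minimizes the same loss over the coefficients, typically by linear programming.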
BOREAS RSS-8 BIOME-BGC Model Simulations at Tower Flux Sites in 1994
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Kimball, John
2000-01-01
BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales (Running and Hunt, 1993). In this investigation, BIOME-BGC was used to estimate daily water and carbon budgets for the BOREAS tower flux sites for 1994. Carbon variables estimated by the model include gross primary production (i.e., net photosynthesis), maintenance and heterotrophic respiration, net primary production, and net ecosystem carbon exchange. Hydrologic variables estimated by the model include snowcover, evaporation, transpiration, evapotranspiration, soil moisture, and outflow. The information provided by the investigation includes input initialization and model output files for various sites in tabular ASCII format.
Mathematical Model Of Variable-Polarity Plasma Arc Welding
NASA Technical Reports Server (NTRS)
Hung, R. J.
1996-01-01
A mathematical model of the variable-polarity plasma arc (VPPA) welding process was developed for use in predicting the characteristics of welds, and thus serves as a guide for the selection of process parameters. The parameters include the welding electric currents in, and durations of, the straight and reverse polarities; the rates of flow of the plasma and shielding gases; and the sizes and relative positions of the welding electrode, welding orifice, and workpiece.
ERIC Educational Resources Information Center
Simon, Katherine; Barakat, Lamia P.; Patterson, Chavis A.; Dampier, Carlton
2009-01-01
Sickle cell disease (SCD) complications place patients at risk for poor psychosocial adaptation, including depression and anxiety symptoms. This study aimed to test a mediator model based on the Risk and Resistance model to explore the role of intrapersonal characteristics and stress processing variables in psychosocial functioning. Participants…
NASA Astrophysics Data System (ADS)
Moritz, R. E.
2005-12-01
The properties, distribution and temporal variation of sea-ice are reviewed for application to problems of ice-atmosphere chemical processes. Typical vertical structure of sea-ice is presented for different ice types, including young ice, first-year ice and multi-year ice, emphasizing factors relevant to surface chemistry and gas exchange. Time average annual cycles of large scale variables are presented, including ice concentration, ice extent, ice thickness and ice age. Spatial and temporal variability of these large scale quantities is considered on time scales of 1-50 years, emphasizing recent and projected changes in the Arctic pack ice. The amount and time evolution of open water and thin ice are important factors that influence ocean-ice-atmosphere chemical processes. Observations and modeling of the sea-ice thickness distribution function are presented to characterize the range of variability in open water and thin ice.
Milewski, John O.; Sklar, Edward
1998-01-01
A laser welding process including: (a) using optical ray tracing to make a model of a laser beam and the geometry of a joint to be welded; (b) adjusting variables in the model to choose variables for use in making a laser weld; and (c) laser welding the joint to be welded using the chosen variables.
Milewski, J.O.; Sklar, E.
1998-06-02
A laser welding process including: (a) using optical ray tracing to make a model of a laser beam and the geometry of a joint to be welded; (b) adjusting variables in the model to choose variables for use in making a laser weld; and (c) laser welding the joint to be welded using the chosen variables. 34 figs.
Ahlfeld, David P.; Barlow, Paul M.; Baker, Kristine M.
2011-01-01
Many groundwater-management problems are concerned with the control of one or more variables that reflect the state of a groundwater-flow system or a coupled groundwater/surface-water system. These system state variables include the distribution of heads within an aquifer, streamflow rates within a hydraulically connected stream, and flow rates into or out of aquifer storage. This report documents the new State Variables Package for the Groundwater-Management Process of MODFLOW-2005 (GWM-2005). The new package provides a means to explicitly represent heads, streamflows, and changes in aquifer storage as state variables in a GWM-2005 simulation. The availability of these state variables makes it possible to include system state in the objective function and enhances existing capabilities for constructing constraint sets for a groundwater-management formulation. The new package can be used to address groundwater-management problems such as the determination of withdrawal strategies that meet water-supply demands while simultaneously maximizing heads or streamflows, or minimizing changes in aquifer storage. Four sample problems are provided to demonstrate use of the new package for typical groundwater-management applications.
A method for developing outcome measures in the clinical laboratory.
Jones, J
1996-01-01
Measuring and reporting outcomes in health care is becoming more important for quality assessment, utilization assessment, accreditation standards, and negotiating contracts in managed care. How does one develop an outcome measure for the laboratory to assess the value of the services? A method is described which outlines seven steps in developing outcome measures for a laboratory service or process. These steps include the following: 1. Identify the process or service to be monitored for performance and outcome assessment. 2. If necessary, form a multidisciplinary team of laboratory staff, other department staff, physicians, and pathologists. 3. State the purpose of the test or service, including a review of published data for the clinical pathological correlation. 4. Prepare a process cause and effect diagram including steps critical to the outcome. 5. Identify key process variables that contribute to positive or negative outcomes. 6. Identify outcome measures that are not process measures. 7. Develop an operational definition, identify data sources, and collect data. Examples, including a process cause and effect diagram, process variables, and outcome measures, are given using the Therapeutic Drug Monitoring (TDM) service. A summary of conclusions and precautions for outcome measurement is then provided.
Static and Dynamic Aeroelastic Tailoring With Variable Camber Control
NASA Technical Reports Server (NTRS)
Stanford, Bret K.
2016-01-01
This paper examines the use of a Variable Camber Continuous Trailing Edge Flap (VCCTEF) system for aeroservoelastic optimization of a transport wingbox. The quasi-steady and unsteady motions of the flap system are utilized as design variables, along with patch-level structural variables, towards minimizing wingbox weight via maneuver load alleviation and active flutter suppression. The resulting system is, in general, very successful at removing structural weight in a feasible manner. Limitations to this success are imposed by including load cases where the VCCTEF system is not active (open-loop) in the optimization process, and also by including actuator operating cost constraints.
Jackson, B Scott
2004-10-01
Many different types of integrate-and-fire models have been designed in order to explain how it is possible for a cortical neuron to integrate over many independent inputs while still producing highly variable spike trains. Within this context, the variability of spike trains has been almost exclusively measured using the coefficient of variation of interspike intervals. However, another important statistical property that has been found in cortical spike trains and is closely associated with their high firing variability is long-range dependence. We investigate the conditions, if any, under which such models produce output spike trains with both interspike-interval variability and long-range dependence similar to those that have previously been measured from actual cortical neurons. We first show analytically that a large class of high-variability integrate-and-fire models is incapable of producing such outputs based on the fact that their output spike trains are always mathematically equivalent to renewal processes. This class of models subsumes a majority of previously published models, including those that use excitation-inhibition balance, correlated inputs, partial reset, or nonlinear leakage to produce outputs with high variability. Next, we study integrate-and-fire models that have (non-Poissonian) renewal point process inputs instead of the Poisson point process inputs used in the preceding class of models. The confluence of our analytical and simulation results implies that the renewal-input model is capable of producing high variability and long-range dependence comparable to that seen in spike trains recorded from cortical neurons, but only if the interspike intervals of the inputs have infinite variance, a physiologically unrealistic condition. Finally, we suggest a new integrate-and-fire model that does not suffer any of the previously mentioned shortcomings.
By analyzing simulation results for this model, we show that it is capable of producing output spike trains with interspike-interval variability and long-range dependence that match empirical data from cortical spike trains. This model is similar to the other models in this study, except that its inputs are fractional-Gaussian-noise-driven Poisson processes rather than renewal point processes. In addition to this model's success in producing realistic output spike trains, its inputs have long-range dependence similar to that found in most subcortical neurons in sensory pathways, including the inputs to cortex. Analysis of output spike trains from simulations of this model also shows that a tight balance between the amounts of excitation and inhibition at the inputs to cortical neurons is not necessary for high interspike-interval variability at their outputs. Furthermore, in our analysis of this model, we show that the superposition of many fractional-Gaussian-noise-driven Poisson processes does not approximate a Poisson process, which challenges the common assumption that the total effect of a large number of inputs on a neuron is well represented by a Poisson process.
Due-Window Assignment Scheduling with Variable Job Processing Times
Wu, Yu-Bin
2015-01-01
We consider a common due-window assignment scheduling problem for jobs with variable processing times on a single machine, where the processing time of a job is a function of its position in the sequence (i.e., a learning effect) or of its starting time (i.e., a deteriorating effect). The problem is to determine the optimal due-window and the processing sequence simultaneously so as to minimize a cost function that includes earliness, tardiness, window location, window size, and the weighted number of tardy jobs. We prove that the problem can be solved in polynomial time. PMID:25918745
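A minimal sketch of the position-based learning effect described above (values are invented for illustration; the paper's polynomial-time algorithm also handles the due-window costs, which this brute-force check omits): with actual processing time p·r^a for the job in position r, exhaustive search over a small instance confirms that the shortest-processing-time (SPT) order minimizes total completion time.

```python
import itertools

def total_completion_time(seq, a=-0.3):
    # The job in position r (1-indexed) runs for p * r**a:
    # later positions are faster, modeling a learning effect.
    t, total = 0.0, 0.0
    for r, p in enumerate(seq, start=1):
        t += p * r ** a
        total += t
    return total

bases = [7.0, 2.0, 5.0, 9.0, 3.0]  # hypothetical normal processing times
best = min(itertools.permutations(bases), key=total_completion_time)
print(best)  # the SPT (ascending) order
```

A pairwise-exchange argument shows why: swapping two adjacent jobs so the shorter one comes first reduces both jobs' contribution and every later completion time, since r^a is decreasing in r for a < 0.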
NASA Astrophysics Data System (ADS)
Collins, P. C.; Koduri, S.; Dixit, V.; Fraser, H. L.
2018-03-01
The fracture toughness of a material depends upon the material's composition and microstructure, as well as other material properties operating at the continuum level. The interrelationships between these variables are complex, and thus difficult to interpret, especially in multi-component, multi-phase ductile engineering alloys such as α/β-processed Ti-6Al-4V (nominal composition, wt pct). Neural networks have been used to elucidate how variables such as composition and microstructure influence the fracture toughness directly (i.e., via a crack initiation or propagation mechanism), independent of the influence of those same variables on the yield strength and plasticity of the material. The variables included in the models and analysis are (i) alloy composition, specifically Al, V, O, and Fe; (ii) the material's microstructure, including phase fractions and average sizes of key microstructural features; (iii) the yield strength and reduction in area obtained from uniaxial tensile tests; and (iv) an assessment of the degree to which plane strain conditions were satisfied, by including a factor related to the plane strain thickness. Once trained, virtual experiments have been conducted which permit the determination of each variable's functional dependency on the resulting fracture toughness. Given that the database includes both K_IC and K_Q values, as well as the in-plane component of the stress state of the crack tip, it is possible to quantitatively assess the effect of sample thickness on K_Q and the degree to which the K_Q and K_IC values may vary. These interpretations, drawn by comparing multiple neural networks, have a significant impact on the general understanding of how the microstructure influences the fracture toughness in ductile materials, as well as the ability to predict the fracture toughness of α/β-processed Ti-6Al-4V.
Meta-analysis of correlates of provider behavior in medical encounters.
Hall, J A; Roter, D L; Katz, N R
1988-07-01
This article summarizes the results of 41 independent studies containing correlates of objectively measured provider behaviors in medical encounters. Provider behaviors were grouped a priori into the process categories of information giving, questions, competence, partnership building, and socioemotional behavior. Total amount of communication was also included. All correlations between variables within these categories and external variables (patient outcome variables or patient and provider background variables) were extracted. The most frequently occurring outcome variables were satisfaction, recall, and compliance, and the most frequently occurring background variables were the patient's gender, age, and social class. Average correlations and combined significance levels were calculated for each combination of process category and external variable. Results showed significant relations of small to moderate average magnitude between these external variables and almost all of the provider behavior categories. A theory of provider-patient reciprocation is proposed to account for the pattern of results.
Puglia, Meghan H.; Lillard, Travis S.; Morris, James P.; Connelly, Jessica J.
2015-01-01
In humans, the neuropeptide oxytocin plays a critical role in social and emotional behavior. The actions of this molecule are dependent on a protein that acts as its receptor, which is encoded by the oxytocin receptor gene (OXTR). DNA methylation of OXTR, an epigenetic modification, directly influences gene transcription and is variable in humans. However, the impact of this variability on specific social behaviors is unknown. We hypothesized that variability in OXTR methylation impacts social perceptual processes often linked with oxytocin, such as perception of facial emotions. Using an imaging epigenetic approach, we established a relationship between OXTR methylation and neural activity in response to emotional face processing. Specifically, high levels of OXTR methylation were associated with greater amounts of activity in regions associated with face and emotion processing including amygdala, fusiform, and insula. Importantly, we found that these higher levels of OXTR methylation were also associated with decreased functional coupling of amygdala with regions involved in affect appraisal and emotion regulation. These data indicate that the human endogenous oxytocin system is involved in attenuation of the fear response, corroborating research implicating intranasal oxytocin in the same processes. Our findings highlight the importance of including epigenetic mechanisms in the description of the endogenous oxytocin system and further support a central role for oxytocin in social cognition. This approach linking epigenetic variability with neural endophenotypes may broadly explain individual differences in phenotype including susceptibility or resilience to disease. PMID:25675509
The Importance of Environment in Educational and Psychological Research.
ERIC Educational Resources Information Center
Tedesco, Lisa A.
Research treatments of home environment in studies of intellectual development and school performance are reviewed. The conceptualizations of home environment include variables specifying family status and deprivational conditions, child-rearing techniques, educational and intellectual process variables, and transactional experience. Psychological…
Processes Understanding of Decadal Climate Variability
NASA Astrophysics Data System (ADS)
Prömmel, Kerstin; Cubasch, Ulrich
2016-04-01
The realistic representation of decadal climate variability in the models is essential for the quality of decadal climate predictions. Therefore, the understanding of those processes leading to decadal climate variability needs to be improved. Several of these processes are already included in climate models, but their importance has not yet been completely clarified. Simulating other processes sometimes requires a higher model resolution or an extension with additional subsystems. This is addressed within one module of the German research program "MiKlip II - Decadal Climate Predictions" (http://www.fona-miklip.de/en/) with a focus on the following processes. Stratospheric processes and their impact on the troposphere are analysed regarding the climate response to aerosol perturbations caused by volcanic eruptions and the stratospheric decadal variability due to solar forcing, climate change and ozone recovery. To account for the interaction between changing ozone concentrations and climate, a computationally efficient ozone chemistry module is developed and implemented in the MiKlip prediction system. The ocean variability and air-sea interaction are analysed with a special focus on the reduction of the North Atlantic cold bias. In addition, the predictability of the oceanic carbon uptake, with a special emphasis on the underlying mechanism, is investigated. This addresses a combination of physical, biological and chemical processes.
Heins, Marianne J; Knoop, Hans; Burk, William J; Bleijenberg, Gijs
2013-09-01
Cognitive behaviour therapy (CBT) can significantly reduce fatigue in chronic fatigue syndrome (CFS), but little is known about the process of change taking place during CBT. Based on a recent treatment model (Wiborg et al. J Psych Res 2012), we examined how (changes in) cognitions and behaviour are related to the decrease in fatigue. We included 183 patients meeting the US Centers for Disease Control criteria for CFS, aged 18 to 65 years, starting CBT. We measured fatigue and possible process variables before treatment; after 6, 12 and 18 weeks; and after treatment. Possible process variables were sense of control over fatigue, focusing on symptoms, self-reported physical functioning, perceived physical activity and objective (actigraphic) physical activity. We built multiple regression models, explaining levels of fatigue during therapy by (changes in) proposed process variables. We observed large individual variation in the patterns of change in fatigue and process variables during CBT for CFS. Increases in the sense of control over fatigue, perceived activity and self-reported physical functioning, and decreases in focusing on symptoms explained 20 to 46% of the variance in fatigue. An increase in objective activity was not a process variable. A change in cognitive factors seems to be related to the decrease in fatigue during CBT for CFS. The pattern of change varies considerably between patients, but changes in process variables and fatigue occur mostly in the same period. © 2013.
The Climate Variability & Predictability (CVP) Program at NOAA - Recent Program Advancements
NASA Astrophysics Data System (ADS)
Lucas, S. E.; Todd, J. F.
2015-12-01
The Climate Variability & Predictability (CVP) Program supports research aimed at providing process-level understanding of the climate system through observation, modeling, analysis, and field studies. This vital knowledge is needed to improve climate models and predictions so that scientists can better anticipate the impacts of future climate variability and change. To achieve its mission, the CVP Program supports research carried out at NOAA and other federal laboratories, NOAA Cooperative Institutes, and academic institutions. The Program also coordinates its sponsored projects with major national and international scientific bodies including the World Climate Research Programme (WCRP), the International and U.S. Climate Variability and Predictability (CLIVAR/US CLIVAR) Program, and the U.S. Global Change Research Program (USGCRP). The CVP program sits within NOAA's Climate Program Office (http://cpo.noaa.gov/CVP). The CVP Program currently supports multiple projects in areas that are aimed at improved representation of physical processes in global models. Some of the topics that are currently funded include: i) Improved Understanding of Intraseasonal Tropical Variability - DYNAMO field campaign and post-field projects, and the new climate model improvement teams focused on MJO processes; ii) Climate Process Teams (CPTs, co-funded with NSF) with projects focused on Cloud macrophysical parameterization and its application to aerosol indirect effects, and Internal-Wave Driven Mixing in Global Ocean Models; iii) Improved Understanding of Tropical Pacific Processes, Biases, and Climatology; iv) Understanding Arctic Sea Ice Mechanism and Predictability; v) AMOC Mechanisms and Decadal Predictability. Recent results from CVP-funded projects will be summarized. Additional information can be found at http://cpo.noaa.gov/CVP.
Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie
2018-04-01
A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheritance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency study. Then, we proposed a new concept of hydrological genes, originating from the concept of biological genes, to describe inconsistent hydrological processes. The hydrological genes were constructed using moment methods, such as general moments, weight function moments, probability weighted moments, and L-moments. Meanwhile, the five components of a stochastic hydrological process, including the jump, trend, periodic, dependence, and pure random components, were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were synthetically considered, and the inheritance, variability, and evolution principles were fully described. Our study can help reveal the inheritance, variability, and evolution principles in the probability distribution of hydrological elements.
Decomposing ADHD-Related Effects in Response Speed and Variability
Karalunas, Sarah L.; Huang-Pollock, Cynthia L.; Nigg, Joel T.
2012-01-01
Objective: Slow and variable reaction times (RTs) on fast tasks are such a prominent feature of Attention Deficit Hyperactivity Disorder (ADHD) that any theory must account for them. However, this has proven difficult because the cognitive mechanisms responsible for this effect remain unexplained. Although speed and variability are typically correlated, it is unclear whether single or multiple mechanisms are responsible for group differences in each. RTs are the result of several semi-independent processes, including stimulus encoding, rate of information processing, speed-accuracy trade-offs, and motor response, which have not been previously well characterized. Method: A diffusion model was applied to RTs from a forced-choice RT paradigm in two large, independent case-control samples (N_Cohort1 = 214 and N_Cohort2 = 172). The decomposition measured three validated parameters that account for the full RT distribution, and assessed reproducibility of ADHD effects. Results: In both samples, group differences in traditional RT variables were explained by slow information processing speed, and were unrelated to speed-accuracy trade-offs or non-decisional processes (e.g., encoding, motor response). Conclusions: RT speed and variability in ADHD may be explained by a single information-processing parameter, potentially simplifying explanations that assume different mechanisms are required to account for group differences in the mean and variability of RTs. PMID:23106115
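The paper's central point, that a single parameter (drift rate, i.e., information processing speed) can move both the mean and the variability of RTs, can be sketched with a toy two-boundary diffusion simulation (all parameter values below are illustrative, not fitted to the study's data):

```python
import random
import statistics

def ddm_rt(drift, threshold=1.0, dt=0.001, t0=0.3):
    # First-passage time of a noisy evidence accumulator between
    # boundaries at +/- threshold; t0 is non-decision time
    # (encoding plus motor response).
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + random.gauss(0.0, dt ** 0.5)
        t += dt
    return t0 + t

random.seed(1)
fast = [ddm_rt(drift=2.0) for _ in range(300)]  # efficient processing
slow = [ddm_rt(drift=1.0) for _ in range(300)]  # reduced drift rate

# Lowering only the drift rate raises both the mean and the spread
# of the simulated RT distribution.
print(statistics.mean(slow) > statistics.mean(fast),
      statistics.stdev(slow) > statistics.stdev(fast))
```

This is why a single-mechanism account is parsimonious: no separate "variability" parameter is needed to reproduce slower and more variable RTs.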
Using weather data to improve decision-making
USDA-ARS?s Scientific Manuscript database
Weather in the western United States is relatively dry and highly variable. The consequences of this variability can be effectively dealt with through the process of adaptive management which includes contingency planning for partial restoration success or restoration failure in any given year. Pr...
Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A
2010-12-15
The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, multi-dimensional combinations of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.
Referential processing: reciprocity and correlates of naming and imaging.
Paivio, A; Clark, J M; Digdon, N; Bons, T
1989-03-01
To shed light on the referential processes that underlie mental translation between representations of objects and words, we studied the reciprocity and determinants of naming and imaging reaction times (RT). Ninety-six subjects pressed a key when they had covertly named 248 pictures or imaged to their names. Mean naming and imagery RTs for each item were correlated with one another, and with properties of names, images, and their interconnections suggested by prior research and dual coding theory. Imagery RTs correlated .56 (df = 246) with manual naming RTs and .58 with voicekey naming RTs from prior studies. A factor analysis of the RTs and of 31 item characteristics revealed 7 dimensions. Imagery and naming RTs loaded on a common referential factor that included variables related to both directions of processing (e.g., missing names and missing images). Naming RTs also loaded on a nonverbal-to-verbal factor that included such variables as number of different names, whereas imagery RTs loaded on a verbal-to-nonverbal factor that included such variables as rated consistency of imagery. The other factors were verbal familiarity, verbal complexity, nonverbal familiarity, and nonverbal complexity. The findings confirm the reciprocity of imaging and naming, and their relation to constructs associated with distinct phases of referential processing.
Gaia DR1 documentation Chapter 6: Variability
NASA Astrophysics Data System (ADS)
Eyer, L.; Rimoldini, L.; Guy, L.; Holl, B.; Clementini, G.; Cuypers, J.; Mowlavi, N.; Lecoeur-Taïbi, I.; De Ridder, J.; Charnas, J.; Nienartowicz, K.
2017-12-01
This chapter describes the photometric variability processing of the Gaia DR1 data. Coordination Unit 7 is responsible for the variability analysis of over a billion celestial sources, in particular for the definition, design, development, validation, and provision of a software package for the data processing of photometrically variable objects. Data Processing Centre Geneva (DPCG) responsibilities cover all issues related to the computational part of the CU7 analysis. These span hardware provisioning, including selection, deployment, and optimisation of suitable hardware; choosing and developing the software architecture; defining data and scientific workflows; and operational activities such as configuration management, data import, time series reconstruction, storage and processing handling, visualisation, and data export. CU7/DPCG is also responsible for interaction with other DPCs and CUs, software and programming training for the CU7 members, scientific software quality control, and management of the software and data lifecycle. Details about the specific data treatment steps of the Gaia DR1 data products are found in Eyer et al. (2017) and are not repeated here. The variability content of the Gaia DR1 focusses on a subsample of Cepheids and RR Lyrae stars around the South ecliptic pole, showcasing the performance of the Gaia photometry with respect to variable objects.
Expert system for testing industrial processes and determining sensor status
Gross, K.C.; Singer, R.M.
1998-06-02
A method and system are disclosed for monitoring both an industrial process and a sensor. The method and system include determining a minimum number of sensor pairs needed to test the industrial process as well as the sensor for evaluating the state of operation of both. The technique further includes generating a first and second signal characteristic of an industrial process variable. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the pair of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 24 figs.
Expert system for testing industrial processes and determining sensor status
Gross, Kenneth C.; Singer, Ralph M.
1998-01-01
A method and system for monitoring both an industrial process and a sensor. The method and system include determining a minimum number of sensor pairs needed to test the industrial process as well as the sensor for evaluating the state of operation of both. The technique further includes generating a first and second signal characteristic of an industrial process variable. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the pair of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.
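The pairing-and-residual idea can be sketched as follows (a simplified illustration, not the patented system: the Fourier-mode whitening step is skipped by assuming the paired-sensor difference is already white, the shift is one-sided, and all signal values are synthetic):

```python
import math
import random

def sprt(residuals, mu1=0.5, sigma=1.0, alpha=0.001, beta=0.001):
    # Wald sequential probability ratio test: decide between
    # H0 (residual mean 0, healthy) and H1 (mean mu1, degraded)
    # for Gaussian residuals, stopping as early as possible.
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for n, r in enumerate(residuals, start=1):
        llr += (mu1 / sigma ** 2) * (r - mu1 / 2)
        if llr >= upper:
            return "degraded", n
        if llr <= lower:
            return "healthy", n
    return "undecided", len(residuals)

random.seed(2)
process = [20.0 + math.sin(i / 50.0) for i in range(500)]  # shared true value
sensor_a = [v + random.gauss(0, 0.7) for v in process]
sensor_b_ok = [v + random.gauss(0, 0.7) for v in process]
sensor_b_bad = [v + 0.5 + random.gauss(0, 0.7) for v in process]  # drifted

# The pairwise difference cancels the process signal itself, leaving a
# residual whose mean shifts only if one sensor of the pair degrades.
diff_ok = [b - a for a, b in zip(sensor_a, sensor_b_ok)]
diff_bad = [b - a for a, b in zip(sensor_a, sensor_b_bad)]
print(sprt(diff_ok)[0], sprt(diff_bad)[0])
```

The sequential test typically reaches a decision after a few dozen residual samples, which is why this style of monitoring can flag a drifting sensor long before a fixed-threshold alarm would.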
Cider fermentation process monitoring by Vis-NIR sensor system and chemometrics.
Villar, Alberto; Vadillo, Julen; Santos, Jose I; Gorritxategi, Eneko; Mabe, Jon; Arnaiz, Aitor; Fernández, Luis A
2017-04-15
Optimization of a multivariate calibration process has been undertaken for a Visible-Near Infrared (400-1100 nm) sensor system, applied in the monitoring of the fermentation process of the cider produced in the Basque Country (Spain). The main parameters that were monitored included alcoholic proof, l-lactic acid content, glucose+fructose and acetic acid content. The multivariate calibration was carried out using a combination of different variable selection techniques, and the most suitable pre-processing strategies were selected based on the spectra characteristics obtained by the sensor system. The variable selection techniques studied in this work include Martens' uncertainty test, interval Partial Least Squares regression (iPLS) and a Genetic Algorithm (GA). This procedure arises from the need to improve the calibration models' prediction ability for cider monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.
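The interval-selection idea behind iPLS can be sketched on synthetic spectra (the channel count, peak position, and noise levels below are invented for illustration; real chemometric practice would score intervals with PLS regression and cross-validation rather than this single-channel correlation score):

```python
import math
import random

random.seed(3)
N_SAMPLES, N_CHANNELS, PEAK = 40, 100, 62  # hypothetical analyte band

concs = [random.uniform(0.0, 1.0) for _ in range(N_SAMPLES)]
spectra = []
for c in concs:
    baseline = random.uniform(0.1, 0.3)  # sample-to-sample offset
    spectra.append([baseline + c * math.exp(-((w - PEAK) ** 2) / 20)
                    + random.gauss(0, 0.01) for w in range(N_CHANNELS)])

def r2(xs, ys):
    # squared Pearson correlation, computed by hand for portability
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def interval_score(lo, hi):
    # mean per-channel correlation with the analyte concentration
    return sum(r2([s[w] for s in spectra], concs)
               for w in range(lo, hi)) / (hi - lo)

intervals = [(i, i + 10) for i in range(0, N_CHANNELS, 10)]
best = max(intervals, key=lambda iv: interval_score(*iv))
print(best)  # the interval containing the analyte's absorption band
```

Restricting the calibration to the best-scoring interval discards uninformative channels whose baseline variation would otherwise degrade the model's prediction ability.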
[Effects of situational and individual variables on critical thinking expression].
Tanaka, Yuko; Kusumi, Takashi
2016-04-01
The present study examined when people decide to use an expression based on critical thinking, and how situational and individual variables affect that decision process. Given a conversation scenario with two friends that included an overgeneralization, participants decided whether or not to continue the conversation with a critical-thinking expression. We controlled purpose and topic as situational variables, and measured critical-thinking ability, critical-thinking disposition, and self-monitoring as individual variables. The situational variables were counterbalanced in a within-subject design with 60 university students. Logistic regression analysis showed within-individual differences in the decision to use a critical-thinking expression, and that some situational factors and some subscales of the individual measures were related to these differences.
Networks for image acquisition, processing and display
NASA Technical Reports Server (NTRS)
Ahumada, Albert J., Jr.
1990-01-01
The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.
Quality of narrative operative reports in pancreatic surgery
Wiebe, Meagan E.; Sandhu, Lakhbir; Takata, Julie L.; Kennedy, Erin D.; Baxter, Nancy N.; Gagliardi, Anna R.; Urbach, David R.; Wei, Alice C.
2013-01-01
Background Quality in health care can be evaluated using quality indicators (QIs). Elements contained in the surgical operative report are potential sources for QI data, but little is known about the completeness of the narrative operative report (NR). We evaluated the completeness of the NR for patients undergoing a pancreaticoduodenectomy. Methods We reviewed NRs for patients undergoing a pancreaticoduodenectomy over a 1-year period. We extracted 79 variables related to patient and narrator characteristics, process of care measures, surgical technique and oncology-related outcomes by document analysis. Data were coded and evaluated for completeness. Results We analyzed 74 NRs. The median number of variables reported was 43.5 (range 13–54). Variables related to surgical technique were most complete. Process of care and oncology-related variables were often omitted. Completeness of the NR was associated with longer operative duration. Conclusion The NRs were often incomplete and of poor quality. Important elements, including process of care and oncology-related data, were frequently missing. Thus, the NR is an inadequate data source for QI. Development and use of alternative reporting methods, including standardized synoptic operative reports, should be encouraged to improve documentation of care and serve as a measure of quality of surgical care. PMID:24067527
A descriptivist approach to trait conceptualization and inference.
Jonas, Katherine G; Markon, Kristian E
2016-01-01
In their recent article, "How Functionalist and Process Approaches to Behavior Can Explain Trait Covariation," Wood, Gardner, and Harms (2015) underscore the need for more process-based understandings of individual differences. At the same time, the article illustrates a common error in the use and interpretation of latent variable models: namely, the misuse of models to arbitrate issues of causation and the nature of latent variables. Here, we explain how latent variables can be understood simply as parsimonious summaries of data, and how statistical inference can be based on choosing those summaries that minimize the information required to represent the data using the model. Although Wood, Gardner, and Harms acknowledge this perspective, they underestimate its significance, including its importance to modeling and the conceptualization of psychological measurement. We believe this perspective has important implications for understanding individual differences in a number of domains, including current debates surrounding the role of formative versus reflective latent variables. (c) 2015 APA, all rights reserved.
Group interaction and flight crew performance
NASA Technical Reports Server (NTRS)
Foushee, H. Clayton; Helmreich, Robert L.
1988-01-01
The application of human-factors analysis to the performance of aircraft-operation tasks by the crew as a group is discussed in an introductory review and illustrated with anecdotal material. Topics addressed include the function of a group in the operational environment, the classification of group performance factors (input, process, and output parameters), input variables and the flight crew process, and the effect of process variables on performance. Consideration is given to aviation safety issues, techniques for altering group norms, ways of increasing crew effort and coordination, and the optimization of group composition.
Oh, Ching Mien; Guo, Qiyun; Wan Sia Heng, Paul; Chan, Lai Wah
2014-07-01
In any manufacturing process, the success of producing an end product with the desired properties and yield depends on a range of factors that include the equipment, process and formulation variables. It is in the interest of manufacturers and researchers to understand each manufacturing process better and ascertain the effects of various manufacturing-associated factors on the properties of the end product. Unless the manufacturing process is well understood, it would be difficult to set realistic limits for the process variables and raw material specifications to ensure consistently high-quality and reproducible end products. Over the years, spray congealing has been used by the food and pharmaceutical industries to produce particulates. The latter has used this technology to develop specialized drug delivery systems. In this review, the basic principles as well as advantages and disadvantages of the spray congealing process will be covered. Recent developments in spray congealing equipment, process variables and formulation variables such as the matrix material, encapsulated material and additives will also be discussed. Innovative equipment designs and formulations for spray congealing have emerged. Judicious choice of atomizers, polymers and additives is the key to achieving the desired properties of the microparticles for drug delivery.
Data Processing Aspects of MEDLARS
Austin, Charles J.
1964-01-01
The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287
NASA Astrophysics Data System (ADS)
Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita
2017-05-01
Yogurt is a milk-based product with beneficial effects on health. The process for the production of yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail, determine how to handle them, and prevent them, so that they are minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed six risks arising from equipment variables, raw material variables, and process variables. These include the critical risk of a non-aseptic process, specifically damage to the yogurt starter through contamination by fungi or other bacteria, and inadequate equipment sanitation. The quantitative FTA showed that the highest probability, 3.902%, is that of a non-aseptic process. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, improving production planning, and sanitizing equipment using hot water immersion.
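A minimal sketch of the two methods named above, under assumed severity/occurrence/detection scores and basic-event probabilities (the paper's actual values are not given in the abstract): classical FMEA risk priority numbers rank the risks, and an FTA OR-gate combines independent basic-event probabilities into a top-event probability.

```python
# Hypothetical risk register for the yogurt line: severity, occurrence,
# detectability scored 1-10 (assumed values, not the paper's data).
risks = {
    "non-aseptic process":        (9, 6, 5),
    "contaminated starter":       (8, 5, 6),
    "poor equipment sanitation":  (7, 6, 4),
    "raw milk quality variation": (6, 4, 3),
}

# Classical FMEA risk priority number S*O*D; a fuzzy FMEA replaces this
# crisp product with membership functions and rule-based aggregation.
rpn = {name: s * o * d for name, (s, o, d) in risks.items()}
ranking = sorted(rpn, key=rpn.get, reverse=True)

def or_gate(probs):
    """FTA OR-gate: top-event probability from independent basic events."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

top = or_gate([0.02, 0.015, 0.01])  # assumed basic-event probabilities
print(ranking[0], top)
```

With these assumed scores the aseptic-process risk ranks first, mirroring the paper's finding; the OR-gate shows how several small basic-event probabilities combine into a non-negligible top-event probability.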
BIOREMEDIATION OF PETROLEUM HYDROCARBONS: A FLEXIBLE VARIABLE SPEED TECHNOLOGY
The bioremediation of petroleum hydrocarbons has evolved into a number of different processes. These processes include in-situ aquifer bioremediation, bioventing, biosparging, passive bioremediation with oxygen release compounds, and intrinsic bioremediation. Although often viewe...
Pérez-Hoyos, S; Sáez Zafra, M; Barceló, M A; Cambra, C; Figueiras Guzmán, A; Ordóñez, J M; Guillén Grima, F; Ocaña, R; Bellido, J; Cirera Suárez, L; López, A A; Rodríguez, V; Alcalá Nalvaiz, T; Ballester Díez, F
1999-01-01
The aim of this study is to present the analysis protocol set out as part of the EMECAM Project, illustrating its application to the effect of pollution on mortality in the city of Valencia. The response variables considered are the daily numbers of deaths from all causes except external ones. The explanatory variables are the daily series of different pollutants (black smoke, SO2, NO2, CO, O3). As possible confounding variables, weather factors, structural factors and weekly cases of flu are taken into account. A Poisson regression model is built for each of the four death series in two stages. In the first stage, a baseline model is fitted using the possible confounding variables. In the second stage, the pollution variables or their time lags are included, controlling the residual autocorrelation by including mortality time lags. The process of fitting the baseline model is as follows: 1) include the significant sinusoidal terms up to the sixth order; 2) include the significant temperature or temperature-squared terms with their time lags up to the 7th order; 3) repeat this process with the relative humidity; 4) add the significant terms for calendar year, daily trend and trend squared; 5) always include the days of the week as dummy variables; 6) include holidays and the significant time lags of up to two weeks of flu. Following reassessment of the model, each of the pollutants and their time lags up to the fifth order are tested. The impact is analyzed by six-month periods, including interaction terms.
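The core of this protocol, a Poisson regression with sinusoidal seasonal terms, can be sketched as follows. The daily counts are simulated and the covariate set is reduced to an intercept plus one annual harmonic pair (the full protocol adds temperature lags, humidity, trends, and day-of-week dummies); the fit uses a hand-rolled Newton-Raphson (IRLS) so the example is self-contained.

```python
import math
import random

random.seed(2)

def rpoisson(lam):
    """Knuth's Poisson sampler (adequate for small means)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Baseline covariates: intercept plus one annual sinusoidal pair.
days = range(730)
X = [[1.0, math.sin(2 * math.pi * t / 365.25),
      math.cos(2 * math.pi * t / 365.25)] for t in days]
# Simulated daily death counts with a true seasonal effect of 0.3.
y = [rpoisson(math.exp(3.0 + 0.3 * x[1])) for x in X]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c]
                              for c in range(r + 1, 3))) / M[r][r]
    return x

# Newton-Raphson (IRLS) for the Poisson log-likelihood.
beta = [math.log(sum(y) / len(y)), 0.0, 0.0]
for _ in range(25):
    lam = [math.exp(sum(b * xi for b, xi in zip(beta, x))) for x in X]
    grad = [sum((yi - li) * x[j] for yi, li, x in zip(y, lam, X))
            for j in range(3)]
    hess = [[sum(li * x[j] * x[k] for li, x in zip(lam, X))
             for k in range(3)] for j in range(3)]
    beta = [b + s for b, s in zip(beta, solve3(hess, grad))]

print([round(b, 2) for b in beta])  # should recover roughly [3.0, 0.3, 0.0]
```

In practice one would use a GLM library and add covariates stepwise exactly as the protocol prescribes, keeping only the significant harmonic and lag terms.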
The added value of time-variable microgravimetry to the understanding of how volcanoes work
Carbone, Daniele; Poland, Michael; Greco, Filippo; Diament, Michel
2017-01-01
During the past few decades, time-variable volcano gravimetry has shown great potential for imaging subsurface processes at active volcanoes (including some processes that might otherwise remain “hidden”), especially when combined with other methods (e.g., ground deformation, seismicity, and gas emissions). By supplying information on changes in the distribution of bulk mass over time, gravimetry can provide information regarding processes such as magma accumulation in void space, gas segregation at shallow depths, and mechanisms driving volcanic uplift and subsidence. Despite its potential, time-variable volcano gravimetry is an underexploited method, not widely adopted by volcano researchers or observatories. The cost of instrumentation and the difficulty in using it under harsh environmental conditions is a significant impediment to the exploitation of gravimetry at many volcanoes. In addition, retrieving useful information from gravity changes in noisy volcanic environments is a major challenge. While these difficulties are not trivial, neither are they insurmountable; indeed, creative efforts in a variety of volcanic settings highlight the value of time-variable gravimetry for understanding hazards as well as revealing fundamental insights into how volcanoes work. Building on previous work, we provide a comprehensive review of time-variable volcano gravimetry, including discussions of instrumentation, modeling and analysis techniques, and case studies that emphasize what can be learned from campaign, continuous, and hybrid gravity observations. We are hopeful that this exploration of time-variable volcano gravimetry will excite more scientists about the potential of the method, spurring further application, development, and innovation.
Brennan, Gerard P; Fritz, Julie M; Houck, L T C Kevin M; Hunter, Stephen J
2015-05-01
Research examining care process variables and their relationship to clinical outcomes after total knee arthroplasty has focused primarily on inpatient variables. Care process factors related to outpatient rehabilitation have not been adequately examined. We conducted a retrospective review of 321 patients evaluating outpatient care process variables including use of continuous passive motion, home health physical therapy, number of days from inpatient discharge to beginning outpatient physical therapy, and aspects of outpatient physical therapy (number of visits, length of stay) as possible predictors of pain and disability outcomes of outpatient physical therapy. Only the number of days between inpatient discharge and outpatient physical therapy predicted better outcomes, suggesting that this may be a target for improving outcomes after total knee arthroplasty for patients discharged directly home. Copyright © 2014 Elsevier Inc. All rights reserved.
Lucarini, Valerio; Fraedrich, Klaus
2009-08-01
Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number, Ec) is different from zero, an additional time scale of O(Ec^(-1)) is introduced in the system, as shown with standard multiscale analysis and made clear by several numerical results. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, it leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and causes the model to lose the ability to describe intrinsically multiscale processes.
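The decoupled limit mentioned above contains a generalized Lorenz system. The classical three-variable Lorenz system (used here as a stand-in, since the full 10-ODE system is not reproduced in the abstract) already exhibits the bounded chaotic dynamics and sensitivity to initial conditions at issue, as a short RK4 integration shows.

```python
def lorenz(s, sigma=10.0, r=28.0, b=8.0 / 3.0):
    """Classical Lorenz system at standard chaotic parameter values."""
    x, y, z = s
    return (sigma * (y - x), x * (r - z) - y, x * y - b * z)

def rk4_step(f, s, h):
    """One fourth-order Runge-Kutta step."""
    k1 = f(s)
    k2 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + h * ki for si, ki in zip(s, k3)))
    return tuple(si + h / 6.0 * (a + 2 * p + 2 * q + d)
                 for si, a, p, q, d in zip(s, k1, k2, k3, k4))

h = 0.01
s1 = (1.0, 1.0, 1.0)
s2 = (1.0 + 1e-8, 1.0, 1.0)   # perturbed initial condition
for _ in range(2000):         # integrate 20 time units
    s1 = rk4_step(lorenz, s1, h)
    s2 = rk4_step(lorenz, s2, h)

sep = sum((a - b) ** 2 for a, b in zip(s1, s2)) ** 0.5
bounded = all(abs(v) < 100.0 for v in s1)
print(bounded, sep > 1e-6)  # trajectory stays on the attractor; the
                            # 1e-8 perturbation has grown by orders of
                            # magnitude (positive Lyapunov exponent)
```

The thermal-viscous coupling studied in the paper adds slow variables and an O(Ec^(-1)) time scale on top of fast dynamics of this kind, which is precisely why a scale-analysis-based decoupling can discard essential behavior.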
Ahlfeld, David P.; Barlow, Paul M.; Mulligan, Anne E.
2005-01-01
GWM is a Ground-Water Management Process for the U.S. Geological Survey modular three-dimensional ground-water model, MODFLOW-2000. GWM uses a response-matrix approach to solve several types of linear, nonlinear, and mixed-binary linear ground-water management formulations. Each management formulation consists of a set of decision variables, an objective function, and a set of constraints. Three types of decision variables are supported by GWM: flow-rate decision variables, which are withdrawal or injection rates at well sites; external decision variables, which are sources or sinks of water that are external to the flow model and do not directly affect the state variables of the simulated ground-water system (heads, streamflows, and so forth); and binary variables, which have values of 0 or 1 and are used to define the status of flow-rate or external decision variables. Flow-rate decision variables can represent wells that extend over one or more model cells and be active during one or more model stress periods; external variables also can be active during one or more stress periods. A single objective function is supported by GWM, which can be specified to either minimize or maximize the weighted sum of the three types of decision variables. Four types of constraints can be specified in a GWM formulation: upper and lower bounds on the flow-rate and external decision variables; linear summations of the three types of decision variables; hydraulic-head based constraints, including drawdowns, head differences, and head gradients; and streamflow and streamflow-depletion constraints. The Response Matrix Solution (RMS) Package of GWM uses the Ground-Water Flow Process of MODFLOW to calculate the change in head at each constraint location that results from a perturbation of a flow-rate variable; these changes are used to calculate the response coefficients.
For linear management formulations, the resulting matrix of response coefficients is then combined with other components of the linear management formulation to form a complete linear formulation; the formulation is then solved by use of the simplex algorithm, which is incorporated into the RMS Package. Nonlinear formulations arise for simulated conditions that include water-table (unconfined) aquifers or head-dependent boundary conditions (such as streams, drains, or evapotranspiration from the water table). Nonlinear formulations are solved by sequential linear programming; that is, repeated linearization of the nonlinear features of the management problem. In this approach, response coefficients are recalculated for each iteration of the solution process. Mixed-binary linear (or mildly nonlinear) formulations are solved by use of the branch and bound algorithm, which is also incorporated into the RMS Package. Three sample problems are provided to demonstrate the use of GWM for typical ground-water flow management problems. These sample problems provide examples of how GWM input files are constructed to specify the decision variables, objective function, constraints, and solution process for a GWM run. The GWM Process runs with the MODFLOW-2000 Global and Ground-Water Flow Processes, but in its current form GWM cannot be used with the Observation, Sensitivity, Parameter-Estimation, or Ground-Water Transport Processes. The GWM Process is written with a modular structure so that new objective functions, constraint types, and solution algorithms can be added.
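The response-matrix approach can be sketched for a toy two-well problem. The `simulate` function below is a hypothetical stand-in for a MODFLOW run (linear in the rates, as for a confined aquifer); response coefficients are obtained by unit perturbations, and the resulting two-variable linear program is solved by brute-force vertex enumeration rather than the simplex algorithm of the RMS Package.

```python
from itertools import combinations

def simulate(q1, q2):
    """Hypothetical stand-in for a MODFLOW run: drawdowns (m) at two
    control points for withdrawal rates q1, q2 (linear by construction)."""
    return (0.02 * q1 + 0.05 * q2, 0.04 * q1 + 0.01 * q2)

# Response-matrix step: perturb each flow-rate variable by one unit.
base = simulate(0.0, 0.0)
resp = []
for j in range(2):
    q = [0.0, 0.0]
    q[j] = 1.0
    d = simulate(*q)
    resp.append([d[i] - base[i] for i in range(2)])
# resp[j][i] = drawdown response at control point i per unit rate of well j

# Formulation: maximize q1 + q2, drawdown <= 1.0 m at both points,
# 0 <= q1, q2 <= 30.  Inequalities stored as (a, b, rhs): a*q1 + b*q2 <= rhs.
ineqs = [
    (resp[0][0], resp[1][0], 1.0),       # head (drawdown) constraint, point A
    (resp[0][1], resp[1][1], 1.0),       # head (drawdown) constraint, point B
    (1.0, 0.0, 30.0), (0.0, 1.0, 30.0),  # upper bounds on rates
    (-1.0, 0.0, 0.0), (0.0, -1.0, 0.0),  # non-negativity
]

def intersect(c1, c2):
    """Corner point where two constraint boundaries meet (Cramer's rule)."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

# Brute-force vertex enumeration in place of the simplex algorithm.
best, best_val = None, float("-inf")
for c1, c2 in combinations(ineqs, 2):
    v = intersect(c1, c2)
    if v and all(a * v[0] + b * v[1] <= r + 1e-9 for a, b, r in ineqs):
        if v[0] + v[1] > best_val:
            best, best_val = v, v[0] + v[1]

print(best, best_val)  # the optimum makes both drawdown constraints binding
```

Nonlinear (water-table) conditions would make `simulate` nonlinear in the rates, which is why GWM relinearizes and recomputes the response coefficients at each sequential-linear-programming iteration.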
Spatial pattern analysis of Cu, Zn and Ni and their interpretation in the Campania region (Italy)
NASA Astrophysics Data System (ADS)
Petrik, Attila; Albanese, Stefano; Jordan, Gyozo; Rolandi, Roberto; De Vivo, Benedetto
2017-04-01
The uniquely abundant Campanian topsoil dataset enabled us to perform a spatial pattern analysis of three potentially toxic elements: Cu, Zn and Ni. This study focuses on revealing the spatial texture and distribution of these elements by spatial point pattern and image processing analysis, such as lineament density and spatial variability index calculation. The application of these methods to geochemical data provides a new and efficient tool for understanding the spatial variation of concentrations and their background/baseline values. Determining and quantifying spatial variability is crucial to understanding how fast concentrations change in a certain area and what processes might govern the variation. The spatial variability index calculation and image processing analysis, including lineament density, enable us to delineate homogeneous areas and analyse them with respect to lithology and land use. Spatial outliers and their patterns were also investigated by local spatial autocorrelation and image processing analysis, including the determination of local minima and maxima and singularity index analysis. The spatial variability of Cu and Zn reveals the highest zone (Cu: 0.5 MAD, Zn: 0.8-0.9 MAD, Median Deviation Index) along the coast between Campi Flegrei and the Sorrento Peninsula, with the vast majority of statistically identified outliers and high-high spatially clustered points. The background/baseline maps of Cu and Zn reveal a moderate- to high-variability (Cu: 0.3 MAD, Zn: 0.4-0.5 MAD) NW-SE oriented zone, including disrupted patches, from Bisaccia to Mignano, following the alluvial plains of the Apennine rivers. This zone has a high abundance of anomalous concentrations identified using singularity analysis, and it also has a high density of lineaments. The spatial variability of Ni shows the highest variability zone (0.6-0.7 MAD) around Campi Flegrei, where the majority of low outliers are concentrated.
The variability of the background/baseline map of Ni shows the highest-variability zones shifted to the east, coinciding with limestone outcrops. The highly segmented area between Mignano and Bisaccia partially follows the alluvial plains of the Apennine rivers, which seem to play a crucial role in the distribution and redistribution pattern of Cu, Zn and Ni in Campania. The high spatial variability zones of these elements are located in topsoils on volcanoclastic rocks and are mostly related to cultivated and urbanised areas.
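A moving-window variability index of the kind used above can be sketched on a synthetic grid. The window statistic here is the standard deviation rather than the paper's median-deviation (MAD) index, and the grid is an assumed two-lithology pattern; the point is that the index peaks along the boundary where concentrations change fastest.

```python
import random

random.seed(4)
# Synthetic 8x8 topsoil grid (mg/kg): one lithology on the left,
# another on the right, plus sampling noise.
grid = [[(5.0 if c < 4 else 40.0) + random.gauss(0.0, 1.0)
         for c in range(8)] for r in range(8)]

def window_std(g, r, c):
    """Variability in a 3x3 moving window (standard deviation here;
    the paper uses a median-deviation index instead)."""
    vals = [g[i][j] for i in range(r - 1, r + 2)
                    for j in range(c - 1, c + 2)]
    m = sum(vals) / 9.0
    return (sum((v - m) ** 2 for v in vals) / 9.0) ** 0.5

variability = {(r, c): window_std(grid, r, c)
               for r in range(1, 7) for c in range(1, 7)}
hot_r, hot_c = max(variability, key=variability.get)
print((hot_r, hot_c))  # the peak sits on the lithology boundary (c = 3 or 4)
```

Mapping such an index over a real dataset delineates homogeneous areas (low index) and transition zones (high index), which can then be compared against lithology and land-use maps as done in the paper.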
Del Valle Del Valle, Gema; Carrió, Carmen; Belloch, Amparo
2017-10-09
Help-seeking for mental disorders is a complex process, which includes different temporal stages, and in which motivational variables play an especially relevant role. However, there is a lack of instruments to evaluate in depth both the temporal and motivational variables involved in the help-seeking process. This study aims to analyse these two sets of variables in detail, using a specific instrument designed for the purpose, to gain a better understanding of the treatment-seeking process. A total of 152 patients seeking treatment in mental health outpatient clinics of the NHS were individually interviewed: 71 had Obsessive-Compulsive Disorder, 21 had Agoraphobia, 18 had Major Depressive Disorder, 20 had Anorexia Nervosa, and 22 had Cocaine Dependence. The patients completed a structured interview assessing the help-seeking process. Disorder severity and quality of life were also assessed. The patients with agoraphobia and with major depression took significantly less time to recognise their mental health symptoms. Similarly, patients with major depression were faster in seeking professional help. Motivational variables were grouped into 3 sets: motivators for seeking treatment, related to the negative impact of symptoms on mood and to loss of control over symptoms; motivators for delaying treatment, related to minimisation of the disorder; and stigma-associated variables. The results support the importance of considering the different motivational variables involved in the several stages of the help-seeking process. The interview designed to that end has shown its usefulness in this endeavour. Copyright © 2017 SEP y SEPB. Published by Elsevier España, S.L.U. All rights reserved.
No Small Feat! Taking Time for Change.
ERIC Educational Resources Information Center
Solomon, Pearl Gold
This book provides practical information about the complexity of school change, with an emphasis on the role of time and its impact, along with other variables, on the change process. The other interacting variables in school change include vision, history, leadership and power, the use of support and pressure, capacity building, consensual…
Approximate techniques of structural reanalysis
NASA Technical Reports Server (NTRS)
Noor, A. K.; Lowder, H. E.
1974-01-01
A study is made of two approximate techniques for structural reanalysis: Taylor series expansions of the response variables in terms of the design variables, and the reduced-basis method. In addition, modifications to these techniques are proposed to overcome some of their major drawbacks. The modifications include a rational approach to the selection of the reduced-basis vectors and the use of the Taylor series approximation in an iterative process. For the reduced basis, a normalized set of vectors is chosen which consists of the original analyzed design and the first-order sensitivity analysis vectors. The use of the Taylor series approximation as a first (initial) estimate in an iterative process can lead to significant improvements in accuracy, even with one iteration cycle. Therefore, the range of applicability of the reanalysis technique can be extended. Numerical examples are presented which demonstrate the gain in accuracy obtained by using the proposed modification techniques for a wide range of variations in the design variables.
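The two ideas, first-order Taylor reanalysis and its use as the initial estimate of an iteration, can be illustrated on a hypothetical one-variable example (a single axial bar; this is an assumption for illustration, not one of the paper's structures):

```python
# Single-bar example: axial displacement u = F / k(A), with stiffness
# k(A) = (E/L) * A and cross-section area A as the design variable.
E_over_L = 100.0
F = 10.0

def k(A):
    return E_over_L * A

A0, dA = 1.0, 0.3                   # original design, 30% modification
u0 = F / k(A0)                      # exact response of the analyzed design
du_dA = -F / (E_over_L * A0 ** 2)   # analytic first-order sensitivity at A0

# First-order Taylor reanalysis for the modified design.
u_taylor = u0 + du_dA * dA

# Using the Taylor estimate as the initial guess of an iteration
# preconditioned by the ORIGINAL stiffness: u <- u + (F - k(A)*u) / k(A0).
u = u_taylor
for _ in range(20):
    u = u + (F - k(A0 + dA) * u) / k(A0)

u_exact = F / k(A0 + dA)
print(abs(u_taylor - u_exact), abs(u - u_exact))  # iteration error shrinks
```

Each sweep multiplies the error by (1 - k(A)/k(A0)), so for moderate design changes the iteration converges quickly; this is the mechanism by which starting from the Taylor estimate extends the usable range of the reanalysis.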
Xu, Xiaoming; Al-Ghabeish, Manar; Rahman, Ziyaur; Krishnaiah, Yellela S R; Yerlikaya, Firat; Yang, Yang; Manda, Prashanth; Hunt, Robert L; Khan, Mansoor A
2015-09-30
Owing to its unique anatomical and physiological functions, the ocular surface presents special challenges for both the design and the performance evaluation of ophthalmic ointment drug products formulated with a variety of bases. The current investigation was carried out to understand and identify the appropriate in vitro methods suitable for quality and performance evaluation of ophthalmic ointments, and to study the effect of formulation and process variables on their critical quality attributes (CQA). The critical formulation variables evaluated include initial API size, drug percentage, and mineral oil percentage, while the critical process parameters include mixing rate, temperature, time and cooling rate. The quality and performance attributes investigated include drug assay, content uniformity, API particle size in ointment, rheological characteristics, in vitro drug release and in vitro transcorneal drug permeation. Using design of experiments (DoE) as well as a novel principal component analysis approach, five of the quality and performance attributes (API particle size, storage modulus of ointment, high-shear viscosity of ointment, in vitro drug release constant and in vitro transcorneal drug permeation rate constant) were found to be highly influenced by the formulation, in particular the strength of the API, and to a lesser degree by processing variables. Correlating the ocular physiology with the physicochemical characteristics of acyclovir ophthalmic ointment suggested that in vitro quality metrics could be a valuable predictor of its in vivo performance. Published by Elsevier B.V.
Variable Density Effects in Stochastic Lagrangian Models for Turbulent Combustion
2016-07-20
The advantages of PDF methods in dealing with chemical reaction and convection are preserved irrespective of density variation. Since the density variation in a typical combustion process may be as large as a factor of seven, including variable-density effects in PDF methods is of significance. Conventionally, the strategy for modelling variable-density flows in PDF methods is similar to that used for second-moment closure models (SMCM): models are developed based on
Brooks, Robin; Thorpe, Richard; Wilson, John
2004-11-11
A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms, and hence in alarm annunciation rates, in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework, so that operations improvement, and hence economic benefit, is obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral to the method is a new-format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for investments about to be made, or already made, in process historian systems. Field trials of the new geometric process control (GPC) method, which improves the quality of both process operations and product by providing process alarms and alerts of much higher quality than ever before, have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice.
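The Best Operating Zone idea can be illustrated with a much simpler stand-in: fit an ellipsoidal zone to historical operating data and alarm on points that fall outside it. This sketch is not the paper's GPC method; the Mahalanobis-distance zone, the two-variable example data, and the threshold are all illustrative assumptions.

```python
# Simplified "Best Operating Zone" alarm sketch (NOT the paper's GPC
# method): fit a Mahalanobis-distance ellipse to historical 2-variable
# data and alarm when a new operating point falls outside it.

def fit_zone(history):
    """Fit mean and covariance of two-variable historical operating data."""
    n = len(history)
    mx = sum(x for x, _ in history) / n
    my = sum(y for _, y in history) / n
    sxx = sum((x - mx) ** 2 for x, _ in history) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in history) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in history) / (n - 1)
    return (mx, my), (sxx, sxy, syy)

def mahalanobis2(point, mean, cov):
    """Squared Mahalanobis distance of a point from the zone centre."""
    sxx, sxy, syy = cov
    det = sxx * syy - sxy ** 2
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    # inverse of the 2x2 covariance applied to the deviation vector
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

def in_zone(point, mean, cov, threshold=9.0):
    """Alarm decision: inside the zone iff squared distance <= threshold
    (9.0 is roughly a 3-sigma ellipse for two variables)."""
    return mahalanobis2(point, mean, cov) <= threshold

history = [(100 + i % 5, 50 + (i % 3)) for i in range(50)]
mean, cov = fit_zone(history)
print(in_zone((102, 51), mean, cov))   # → True: near the data centre
print(in_zone((130, 20), mean, cov))   # → False: far outside history
```

Note that a point can lie inside each variable's one-dimensional historical range yet still fall outside the joint zone, which is the basic reason multi-variable limits can produce fewer false alarms than independent per-variable limits.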
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Yi
2014-11-24
DOE-GTRC-05596 11/24/2014 Collaborative Research: Process-Resolving Decomposition of the Global Temperature Response to Modes of Low Frequency Variability in a Changing Climate PI: Dr. Yi Deng, School of Earth and Atmospheric Sciences, Georgia Institute of Technology, 404-385-1821, yi.deng@eas.gatech.edu El Niño-Southern Oscillation (ENSO) and Annular Modes (AMs) represent, respectively, the most important modes of low frequency variability in the tropical and extratropical circulations. The projection of future changes in ENSO and AM variability, however, remains highly uncertain with the state-of-the-science climate models. This project conducted a process-resolving, quantitative evaluation of ENSO and AM variability in modern reanalysis observations and in climate model simulations. The goal is to identify and understand the sources of uncertainty and biases in models' representation of ENSO and AM variability. Using a feedback analysis method originally formulated by one of the collaborative PIs, we partitioned the 3D atmospheric temperature anomalies and surface temperature anomalies associated with ENSO and AM variability into components linked to 1) radiation-related thermodynamic processes such as cloud and water vapor feedbacks, 2) local dynamical processes including convection and turbulent/diffusive energy transfer, and 3) non-local dynamical processes such as the horizontal energy transport in the oceans and atmosphere. In the past 4 years, the research conducted at Georgia Tech under the support of this project has led to 15 peer-reviewed publications and 9 conference/workshop presentations. Two graduate students and one postdoctoral fellow also received research training through participating in the project activities. This final technical report summarizes the key scientific discoveries we made and also provides a list of all publications and conference presentations resulting from research activities at Georgia Tech.
The main findings include: 1) the distinctly different roles played by atmospheric dynamical processes in establishing the surface temperature response to ENSO in the tropics and extratropics (i.e., atmospheric dynamics disperses energy out of the tropics during ENSO warm events and modulates surface temperature at mid- and high-latitudes through controlling downward longwave radiation); 2) the representations of the ENSO-related temperature response in climate models fail to converge at the process level, particularly over the extratropics (i.e., models produce the right temperature responses to ENSO but for the wrong reasons); 3) water vapor feedback contributes substantially to the temperature anomalies found over the U.S. during different phases of the Northern Annular Mode (NAM), which adds new insight to the traditional picture that cold/warm advective processes are the main drivers of local temperature responses to the NAM; 4) the overall land surface temperature biases in the latest NCAR model (CESM1) are caused by biases in surface albedo, while the surface temperature biases over the ocean are related to multiple factors including biases in model albedo, cloud and oceanic dynamics, and the temperature biases over different ocean basins are induced by different process biases. These results provide detailed guidance for process-level model tuning and improvement, and thus contribute directly to the overall goal of reducing model uncertainty in projecting future changes in the Earth's climate system, especially in ENSO and AM variability.
Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant
Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa
2013-09-17
System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
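The predict/constrain/correct loop described above can be sketched for a scalar system. The toy process model, noise levels, and nonnegativity constraint below are invented for illustration and are not the patent's IGCC plant dynamics.

```python
# Minimal scalar EKF with a "preemptive constraining" step, in the spirit
# of the patented approach. The model x' = 0.9x + 1, the measurement
# y = x**2, and the lower bound are illustrative assumptions only.

def ekf_step(x, P, y, q=0.01, r=0.04, lo=0.0):
    """One predict/constrain/correct cycle for the toy system
    x' = 0.9*x + 1 (process), y = x**2 (measurement)."""
    # predict through the (linear) process model
    x_pred = 0.9 * x + 1.0
    P_pred = 0.81 * P + q
    # preemptively constrain the prediction (e.g. a flow cannot go below lo)
    x_pred = max(x_pred, lo)
    # correct with the nonlinear measurement; Jacobian of x**2 is H = 2x
    H = 2.0 * x_pred
    K = P_pred * H / (H * P_pred * H + r)        # Kalman gain
    x_new = x_pred + K * (y - x_pred ** 2)
    P_new = (1.0 - K * H) * P_pred
    return max(x_new, lo), P_new                 # constrain the update too

x, P = -2.0, 1.0                     # deliberately infeasible initial guess
for _ in range(30):
    x, P = ekf_step(x, P, y=100.0)   # noiseless measurements of x_true = 10
print(round(x, 2))  # → 10.0
```

The constraint fires on the first step (the predicted state would be negative) and the filter still converges to the process model's fixed point, which is the behavior the preemptive-constraining processor is meant to guarantee.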
DOE Office of Scientific and Technical Information (OSTI.GOV)
Law, Beverly E.
Investigate the effects of disturbance and climate variables on the processes controlling carbon and water dynamics at AmeriFlux cluster sites in semi-arid and mesic forests in Oregon. The observations were made at three existing and productive AmeriFlux research sites that represent climate and disturbance gradients, as a natural experiment on the influence of climatic and hydrologic variability on carbon sequestration and the resulting atmospheric CO2 feedback, including anomalies during the warm/dry phase of the Pacific Decadal Oscillation.
VS2DRTI: Simulating Heat and Reactive Solute Transport in Variably Saturated Porous Media.
Healy, Richard W; Haile, Sosina S; Parkhurst, David L; Charlton, Scott R
2018-01-29
Variably saturated groundwater flow, heat transport, and solute transport are important processes in environmental phenomena, such as the natural evolution of water chemistry of aquifers and streams, the storage of radioactive waste in a geologic repository, the contamination of water resources from acid-rock drainage, and the geologic sequestration of carbon dioxide. Up to now, our ability to simulate these processes simultaneously with fully coupled reactive transport models has been limited to complex and often difficult-to-use models. To address the need for a simple and easy-to-use model, the VS2DRTI software package has been developed for simulating water flow, heat transport, and reactive solute transport through variably saturated porous media. The underlying numerical model, VS2DRT, was created by coupling the flow and transport capabilities of the VS2DT and VS2DH models with the equilibrium and kinetic reaction capabilities of PhreeqcRM. Flow capabilities include two-dimensional, constant-density, variably saturated flow; transport capabilities include both heat and multicomponent solute transport; and the reaction capabilities are a complete implementation of geochemical reactions of PHREEQC. The graphical user interface includes a preprocessor for building simulations and a postprocessor for visual display of simulation results. To demonstrate the simulation of multiple processes, the model is applied to a hypothetical example of injection of heated waste water to an aquifer with temperature-dependent cation exchange. VS2DRTI is freely available public domain software. © 2018, National Ground Water Association.
Polymer performance in cooling water: The influence of process variables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amjad, Z.; Pugh, J.; Zibrida, J.
1997-01-01
The key to the efficacy of phosphate and phosphonates in stabilized phosphate and all-organic cooling water treatment (CWT) programs is the presence and performance of polymeric inhibitors/dispersants. The performance of polymeric additives used in CWT programs can be adversely impacted by the presence of iron, phosphonate, or cationic polymer and influenced by a variety of process variables including system pH and temperature. In this article, the performance of several polymeric additives is evaluated under a variety of stressed conditions.
Polymer performance in cooling water: The influence of process variables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amjad, Z.; Pugh, J.; Zibrida, J.
1996-12-01
The key to the efficacy of phosphate and phosphonates in stabilized phosphate and all-organic cooling water treatment (CWT) programs is the presence and performance of polymeric inhibitors/dispersants. The performance of polymeric additives used in CWT programs can be adversely impacted by the presence of iron, phosphonate, or cationic polymer and influenced by a variety of process variables including system pH and temperature. In this paper, the performance of several polymeric additives is evaluated under a variety of stressed conditions.
Seasonally adjusted birth frequencies follow the Poisson distribution.
Barra, Mathias; Lindstrøm, Jonas C; Adams, Samantha S; Augestad, Liv A
2015-12-15
Variations in birth frequencies have an impact on activity planning in maternity wards. Previous studies of this phenomenon have commonly included elective births. A Danish study of spontaneous births found that birth frequencies were well modelled by a Poisson process. Somewhat unexpectedly, there were also weekly variations in the frequency of spontaneous births. Another study claimed that birth frequencies follow the Benford distribution. Our objective was to test these results. We analysed 50,017 spontaneous births at Akershus University Hospital in the period 1999-2014. To investigate the Poisson distribution of these births, we plotted their variance over a sliding average. We specified various Poisson regression models, with the number of births on a given day as the outcome variable. The explanatory variables included various combinations of year, month, day of the week and the digit sum of the date. The relationship between the variance and the average fits well with an underlying Poisson process. A Benford distribution was disproved by a goodness-of-fit test (p < 0.01). The fundamental model with year and month as explanatory variables is significantly improved (p < 0.001) by adding day of the week as an explanatory variable. Altogether 7.5% more children are born on Tuesdays than on Sundays. The digit sum of the date is non-significant as an explanatory variable (p = 0.23), nor does it increase the explained variance. INTERPRETATION: Spontaneous births are well modelled by a time-dependent Poisson process when monthly and day-of-the-week variation is included. The frequency is highest in summer, towards June and July; Friday and Tuesday stand out as particularly busy days, and the activity level is at its lowest during weekends.
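The variance-over-average diagnostic used above rests on the Poisson property that the variance of daily counts equals their mean. A quick self-contained check of that property, with an invented daily birth rate rather than the hospital's data:

```python
# Check the Poisson "variance equals mean" property that underlies the
# paper's diagnostic plot. The rate of ~9 births/day is an assumption.
import math
import random

def poisson(lam, rng):
    """Draw one Poisson(lam) variate (Knuth's method; fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(1)
counts = [poisson(9.0, rng) for _ in range(5000)]   # simulated daily counts
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
print(abs(var / mean - 1.0) < 0.1)  # → True: dispersion ratio near 1
```

A dispersion ratio well above 1 in the real data would have indicated clustering (over-dispersion) and argued against the simple Poisson model the authors found adequate.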
Cerri, Karin H; Knapp, Martin; Fernandez, Jose-Luis
2014-09-01
The College voor Zorgverzekeringen (CVZ) provides guidance to the Dutch healthcare system on the funding and use of new pharmaceutical technologies. This study examined the impact of evidence, process and context factors on CVZ decisions in 2004-2009. A data set of CVZ decisions pertaining to pharmaceutical technologies was created, including 29 variables extracted from published information. A three-category outcome variable was used, defined as the decision to 'recommend', 'restrict' or 'not recommend' a technology. Technologies included in list 1A/1B or on the expensive drug list were considered recommended; those included in list 2 or for which patient co-payment is required were considered restricted; technologies not included on any reimbursement list were classified as 'not recommended'. Using multinomial logistic regression, the relative contribution of explanatory variables to CVZ decisions was assessed. In all, 244 technology appraisals (256 technologies) were analysed, with 51% of technologies recommended, 33% restricted and 16% not recommended by CVZ for funding. The multinomial model showed significant associations (p ≤ 0.10) between CVZ outcome and several variables, including: (1) use of an active comparator and demonstration of statistical superiority of the primary endpoint in clinical trials, (2) pharmaceutical budget impact associated with introduction of the technology, (3) therapeutic indication and (4) prevalence of the target population. Results confirm the value of a comprehensive and multivariate approach to understanding CVZ decision-making.
Conjoint Analysis: A Study of the Effects of Using Person Variables.
ERIC Educational Resources Information Center
Fraas, John W.; Newman, Isadore
Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…
Highways and Byways: The Career Paths of Senior Student Affairs Officers
ERIC Educational Resources Information Center
Tull, Ashley; Miller, Michael T.
2009-01-01
The highways and byways, or career paths, to the Senior Student Affairs Officer (SSAO) position differ based on a variety of variables. This study examined several variables including the induction or graduate preparation process, professional pathways, and professional and academic involvement of more than half of current land grant SSAOs. Data…
Inside Track to the Future: Strategies, Structures, and Leadership for Change.
ERIC Educational Resources Information Center
Alfred, Richard; Carter, Patricia
1996-01-01
Describes the importance for community colleges of looking toward the future to compete effectively. Suggests that change is a variable process that progresses slowly or quickly depending on the interaction of three variables: competitors, customers, and organizational cultures. Includes a checklist for college leaders to determine their level of…
Meteorological Contribution to Variability in Particulate Matter Concentrations
NASA Astrophysics Data System (ADS)
Woods, H. L.; Spak, S. N.; Holloway, T.
2006-12-01
Local concentrations of fine particulate matter (PM) are driven by a number of processes, including emissions of aerosols and gaseous precursors, atmospheric chemistry, and meteorology at local, regional, and global scales. We apply statistical downscaling methods, typically used for regional climate analysis, to estimate the contribution of regional-scale meteorology to PM mass concentration variability at a range of sites in the Upper Midwestern U.S. Multiple years of daily PM10 and PM2.5 data, reported by the U.S. Environmental Protection Agency (EPA), are correlated with large-scale meteorology over the region from the National Centers for Environmental Prediction (NCEP) reanalysis data. We use two statistical downscaling methods (multiple linear regression, MLR, and analog) to identify which processes have the greatest impact on aerosol concentration variability. Empirical Orthogonal Functions of the NCEP meteorological data are correlated with PM timeseries at measurement sites. We examine which meteorological variables exert the greatest influence on PM variability, and which sites exhibit the greatest response to regional meteorology. To evaluate model performance, measurement data are withheld for limited periods and compared with model results. Preliminary results suggest that regional meteorological processes account for over 50% of aerosol concentration variability at study sites.
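The MLR downscaling step can be sketched in miniature: regress a site's daily PM series on a few large-scale predictors (stand-ins for the EOF amplitudes of the NCEP fields). All data and coefficients below are synthetic; the real study fits multiple years of EPA observations.

```python
# Toy version of the MLR downscaling step: a 3-parameter least-squares
# fit of "PM" on two invented meteorological predictors, solved via the
# normal equations with Gaussian elimination (no external libraries).

def ols3(rows, y):
    """Least-squares fit y ~ b0 + b1*x1 + b2*x2; returns [b0, b1, b2]."""
    # build the 3x3 normal matrix A and right-hand side v
    A = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for (x1, x2), yi in zip(rows, y):
        phi = (1.0, x1, x2)
        for i in range(3):
            v[i] += phi[i] * yi
            for j in range(3):
                A[i][j] += phi[i] * phi[j]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            v[r] -= f * v[col]
            for j in range(3):
                A[r][j] -= f * A[col][j]
    b = [0.0] * 3
    for i in (2, 1, 0):  # back substitution
        b[i] = (v[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b

# synthetic daily predictors; "observed" PM generated as pm = 20 + 3t - 2w
days = [((d % 7) - 3.0, (d % 5) - 2.0) for d in range(365)]
pm = [20.0 + 3.0 * t - 2.0 * w for t, w in days]
b0, b1, b2 = ols3(days, pm)
print(round(b0, 2), round(b1, 2), round(b2, 2))  # → 20.0 3.0 -2.0
```

With noiseless synthetic data the fit recovers the generating coefficients exactly; on real observations the fitted variance fraction plays the role of the "over 50% explained" figure quoted above.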
A Protective Factors Model for Alcohol Abuse and Suicide Prevention among Alaska Native Youth
Allen, James; Mohatt, Gerald V.; Fok, Carlotta Ching Ting; Henry, David; Burkett, Rebekah
2014-01-01
This study provides an empirical test of a culturally grounded theoretical model for prevention of alcohol abuse and suicide risk with Alaska Native youth, using a promising set of culturally appropriate measures for the study of the process of change and outcome. This model is derived from qualitative work that generated a heuristic model of protective factors from alcohol (Allen et al., 2006; Mohatt, Hazel et al., 2004; Mohatt, Rasmus et al., 2004). Participants included 413 rural Alaska Native youth ages 12-18 who assisted in testing a predictive model of Reasons for Life and Reflective Processes about alcohol abuse consequences as co-occurring outcomes. Specific individual, family, peer, and community level protective factor variables predicted these outcomes. Results suggest prominent roles for these predictor variables as intermediate prevention strategy target variables in a theoretical model for a multilevel intervention. The model guides understanding of underlying change processes in an intervention to increase the ultimate outcome variables of Reasons for Life and Reflective Processes regarding the consequences of alcohol abuse. PMID:24952249
Virtue, Shannon Myers; Manne, Sharon; Mee, Laura; Bartell, Abraham; Sands, Stephen; Ohman-Strickland, Pamela; Gajda, Tina Marie
2014-09-01
The current study examined whether cognitive and social processing variables mediated the relationship between fear network and depression among parents of children undergoing hematopoietic stem cell transplant (HSCT). Parents whose children were initiating HSCT (N = 179) completed survey measures including fear network, Beck Depression Inventory, cognitive processing variables (positive reappraisal and self-blame) and social processing variables (emotional support and holding back from sharing concerns). Fear network was positively correlated with depression (p < .001). Self-blame and holding back emerged as individual partial mediators in the relationship between fear network and depression. Together they accounted for 34.3% of the variance in the relationship between fear network and depression. Positive reappraisal and emotional support did not have significant mediating effects. Social and cognitive processes, specifically self-blame and holding back from sharing concerns, play a negative role in parents' psychological adaptation to fears surrounding a child's HSCT.
Virtue, Shannon Myers; Manne, Sharon; Mee, Laura; Bartell, Abraham; Sands, Stephen; Ohman-Strickland, Pamela; Gajda, Tina Marie
2014-01-01
The current study examined whether cognitive and social processing variables mediated the relationship between fear network and depression among parents of children undergoing hematopoietic stem cell transplant (HSCT). Parents whose children were initiating HSCT (N = 179) completed survey measures including fear network, Beck Depression Inventory (BDI), cognitive processing variables (positive reappraisal and self-blame) and social processing variables (emotional support and holding back from sharing concerns). Fear network was positively correlated with depression (p < .001). Self-blame and holding back emerged as individual partial mediators in the relationship between fear network and depression. Together they accounted for 34.3% of the variance in the relationship between fear network and depression. Positive reappraisal and emotional support did not have significant mediating effects. Social and cognitive processes, specifically self-blame and holding back from sharing concerns, play a negative role in parents’ psychological adaptation to fears surrounding a child’s HSCT. PMID:25081956
Stegemöller, Elizabeth L; Wilson, Jonathan P; Hazamy, Audrey; Shelley, Mack C; Okun, Michael S; Altmann, Lori J P; Hass, Chris J
2014-06-01
Cognitive impairments in Parkinson disease (PD) manifest as deficits in speed of processing, working memory, and executive function and attention abilities. The gait impairment in PD is well documented to include reduced speed, shortened step lengths, and increased step-to-step variability. However, there is a paucity of research examining the relationship between overground walking and cognitive performance in people with PD. This study sought to examine the relationship between both the mean and variability of gait spatiotemporal parameters and cognitive performance across a broad range of cognitive domains. A cross-sectional design was used. Thirty-five participants with no dementia and diagnosed with idiopathic PD completed a battery of 12 cognitive tests that yielded 3 orthogonal factors: processing speed, working memory, and executive function and attention. Participants completed 10 trials of overground walking (single-task walking) and 5 trials of overground walking while counting backward by 3's (dual-task walking). All gait measures were impaired by the dual task. Cognitive processing speed correlated with stride length and walking speed. Executive function correlated with step width variability. There were no significant associations with working memory. Regression models relating speed of processing to gait spatiotemporal variables revealed that including dual-task costs in the model significantly improved the fit of the model. Participants with PD were tested only in the on-medication state. Different characteristics of gait are related to distinct types of cognitive processing, which may be differentially affected by dual-task walking due to the pathology of PD. © 2014 American Physical Therapy Association.
Exposure and response prevention process predicts treatment outcome in youth with OCD.
Kircanski, Katharina; Peris, Tara S
2015-04-01
Recent research on the treatment of adults with anxiety disorders suggests that aspects of the in-session exposure therapy process are relevant to clinical outcomes. However, few comprehensive studies have been conducted with children and adolescents. In the present study, 35 youth diagnosed with primary obsessive-compulsive disorder (OCD; M age = 12.9 years, 49% male, 63% Caucasian) completed 12 sessions of exposure and response prevention (ERP) in one of two treatment conditions as part of a pilot randomized controlled trial of a family-focused intervention for OCD. Key exposure process variables, including youth self-reported distress during ERP and the quantity and quality of ERP completed, were computed. These variables were examined as predictors of treatment outcomes assessed at mid-treatment, post-treatment, and three-month follow-up, partialing out treatment condition. In general, greater variability of distress during ERP and completing a greater proportion of combined exposures (i.e., exposures targeting more than one OC symptom at once) were predictive of better outcomes. Conversely, greater distress at the end of treatment was generally predictive of poorer outcomes. Finally, several variables, including within- and between-session decreases in distress during ERP, were not consistently predictive of outcomes. Findings signal potentially important facets of exposure for youth with OCD and have implications for treatment. A number of results also parallel recent findings in the adult literature, suggesting that there may be some continuity in exposure processes from child to adult development. Future work should examine additional measures of exposure process, such as psychophysiological arousal during exposure, in youth.
Saraf-Sinik, Inbar; Assa, Eldad; Ahissar, Ehud
2015-06-10
Tactile perception is obtained by coordinated motor-sensory processes. We studied the processes underlying the perception of object location in freely moving rats. We trained rats to identify the relative location of two vertical poles placed in front of them and measured at high resolution the motor and sensory variables (19 and 2 variables, respectively) associated with this whiskers-based perceptual process. We found that the rats developed stereotypic head and whisker movements to solve this task, in a manner that can be described by several distinct behavioral phases. During two of these phases, the rats' whiskers coded object position by first temporal and then angular coding schemes. We then introduced wind (in two opposite directions) and remeasured their perceptual performance and motor-sensory variables. Our rats continued to perceive object location in a consistent manner under wind perturbations while maintaining all behavioral phases and relatively constant sensory coding. Constant sensory coding was achieved by keeping one group of motor variables (the "controlled variables") constant, despite the perturbing wind, at the cost of strongly modulating another group of motor variables (the "modulated variables"). The controlled variables included coding-relevant variables, such as head azimuth and whisker velocity. These results indicate that consistent perception of location in the rat is obtained actively, via a selective control of perception-relevant motor variables. Copyright © 2015 the authors.
Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M
2017-05-01
microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.
Social Workers' Orientation toward the Evidence-Based Practice Process: A Dutch Survey
ERIC Educational Resources Information Center
van der Zwet, Renske J. M.; Kolmer, Deirdre M. Beneken genaamd; Schalk, René
2016-01-01
Objectives: This study assesses social workers' orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the EBP Process Assessment Scale (EBPPAS), along with…
NASA Astrophysics Data System (ADS)
Petropavlovskikh, I. V.; Manney, G. L.; Hoor, P. M.; Bourassa, A. E.; Braathen, G.; Chang, K. L.; Hegglin, M. I.; Kramarova, N. A.; Kunkel, D.; Lawrence, Z. D.; Leblanc, T.; Livesey, N. J.; Millan Valle, L. F.; Stiller, G. P.; Tegtmeier, S.; Thouret, V.; Voigt, C.; Walker, K. A.
2017-12-01
The distribution of tracers in the upper troposphere and lower stratosphere (UTLS) shows large spatial and temporal variability because of interactions of transport, chemical, and mixing processes near the tropopause, as well as variations in the location of the tropopause itself. This strongly affects quantitative estimates of the impact of radiatively active substances, including ozone and water vapour, on surface temperatures, and complicates diagnosis of dynamical processes such as stratosphere troposphere exchange (STE). The Stratosphere-troposphere Processes And their Role in Climate (SPARC) emerging activity OCTAV-UTLS (Observed Composition Trends and Variability in the UTLS) aims to reduce the uncertainties in trend estimates by accounting for these dynamically induced sources of variability. Achieving these goals by using existing UTLS trace gas observations from aircraft, ground-based, balloon and satellite platforms requires a consistent analysis of these different data with respect to the tropopause or the jets. As a central task for OCTAV-UTLS, we are developing and applying common metrics, calculated using the same reanalysis datasets, to compare UTLS data using geophysically-based coordinate systems including tropopause and upper tropospheric jet relative coordinates. In addition to assessing present day measurement capabilities, OCTAV-UTLS will assess gaps in current geographical / temporal sampling of the UTLS that limit our ability to determine atmospheric composition variability and trends. This talk will provide an overview of the OCTAV-UTLS activity and some examples of initial calculations of geophysically-based coordinates and comparisons of remapped data.
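The tropopause-relative coordinate idea at the core of OCTAV-UTLS is simple to state: shift each profile vertically so the tropopause sits at zero before comparing or averaging. A minimal sketch with invented profiles and tropopause heights:

```python
# Minimal illustration of a tropopause-relative vertical coordinate:
# shift each profile so its tropopause sits at 0 km before averaging.
# Profile values and tropopause heights below are synthetic.

def remap(profile, z_tp, dz=1.0):
    """Return {relative_height_km: value}, heights taken relative to z_tp."""
    return {round(k * dz - z_tp, 1): v for k, v in enumerate(profile)}

# two ozone-like profiles with a sharp gradient at different tropopauses
p1 = remap([10, 10, 10, 100, 300], z_tp=3.0)
p2 = remap([10, 10, 100, 300, 500], z_tp=2.0)
# in tropopause-relative coordinates the gradients now align at 0 km
print(p1[0.0], p2[0.0])  # → 100 100
```

Averaging the two profiles in plain geometric altitude would smear the sharp gradient across two levels; in the relative coordinate they agree at every shared level, which is why remapping reduces spurious variability in composited UTLS data.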
Phillips, K A; Morrison, K R; Andersen, R; Aday, L A
1998-01-01
OBJECTIVE: The behavioral model of utilization, developed by Andersen, Aday, and others, is one of the most frequently used frameworks for analyzing the factors that are associated with patient utilization of healthcare services. However, the use of the model for examining the context within which utilization occurs (the role of the environment and provider-related factors) has been largely neglected. OBJECTIVE: To conduct a systematic review and analysis to determine whether studies of medical care utilization that have used the behavioral model during the last 20 years have included environmental and provider-related variables, and to examine the methods used to analyze these variables. We discuss barriers to the use of these contextual variables and potential solutions. DATA SOURCES: The Social Science Citation Index and Science Citation Index. We included all articles from 1975-1995 that cited any of three key articles on the behavioral model, were empirical analyses of formal medical care utilization, and specifically stated their use of the behavioral model (n = 139). STUDY DESIGN: The design was a systematic literature review. DATA ANALYSIS: We used a structured review process to code articles on whether they included contextual variables: (1) environmental variables (characteristics of the healthcare delivery system, external environment, and community-level enabling factors); and (2) provider-related variables (patient factors that may be influenced by providers and provider characteristics that interact with patient characteristics to influence utilization). We also examined the methods used in studies that included contextual variables. PRINCIPAL FINDINGS: Forty-five percent of the studies included environmental variables and 51 percent included provider-related variables.
Few studies examined specific measures of the healthcare system or provider characteristics or used methods other than simple regression analysis with hierarchical entry of variables. Only 14 percent of studies analyzed the context of healthcare by including both environmental and provider-related variables as well as using relevant methods. CONCLUSIONS: By assessing whether and how contextual variables are used, we are able to highlight the contributions made by studies using these approaches, to identify variables and methods that have been relatively underused, and to suggest solutions to barriers in using contextual variables. PMID:9685123
Acuña, Gonzalo; Ramirez, Cristian; Curilem, Millaray
2014-01-01
The lack of sensors for some relevant state variables in fermentation processes can be addressed by developing appropriate software sensors. In this work, NARX-ANN, NARMAX-ANN, NARX-SVM and NARMAX-SVM models are compared when acting as software sensors of biomass concentration for a solid substrate cultivation (SSC) process. Results show that NARMAX-SVM outperforms the other models, with an SMAPE index under 9 for 20% amplitude noise. In addition, NARMAX models perform better than NARX models under the same noise conditions because of their better predictive capabilities, as they include prediction errors as inputs. In the case of perturbation of the initial conditions of the autoregressive variable, NARX models exhibited better convergence capabilities. This work also confirms that a difficult-to-measure variable, like biomass concentration, can be estimated on-line from easy-to-measure variables like CO₂ and O₂ using an adequate software sensor based on computational intelligence techniques.
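For readers unfamiliar with the SMAPE index used above to score the software sensors, a minimal sketch of one common definition (expressed in percent; the authors' exact formulation may differ) is:

```python
def smape(actual, predicted):
    """Symmetric mean absolute percentage error (one common definition).

    Returns a value in percent; 0 means a perfect fit.
    """
    terms = [
        abs(p - a) / ((abs(a) + abs(p)) / 2)
        for a, p in zip(actual, predicted)
        if (abs(a) + abs(p)) > 0  # skip points where both values are zero
    ]
    return 100 * sum(terms) / len(terms)

# A software sensor whose estimates track biomass closely yields a low SMAPE:
biomass = [1.0, 2.0, 4.0, 8.0]    # hypothetical measured concentrations
estimate = [1.1, 1.9, 4.2, 7.6]   # hypothetical sensor output
print(round(smape(biomass, estimate), 1))  # → 6.2
```

Because each term is normalized by the mean of the two values, the index is bounded (0-200 here) and symmetric in over- and under-prediction, which makes it convenient for comparing estimators across noise levels.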
Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor
2015-01-01
Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.
1992-01-01
The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
An overview of AmeriFlux data products and methods for data acquisition, processing, and publication
NASA Astrophysics Data System (ADS)
Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.
2014-12-01
The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of fluxes data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of global standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for a more flexible and comprehensible coverage of field sites and the data collection methods and results. Another extensive revision is in the data formats, levels, and versions for fluxes and micrometeorological data. A new selection and revision of data variables and an integrated new definition for data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of spatial representativeness of data points, e.g., individual sensors or the tower footprint. 
Additionally, a new definition for data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks applied, and differentiation between measured data and data from models that use process knowledge). We also present an expanded approach to versions of data and data processing software, with stable and immutable data releases, but also pre-release versions to allow evaluation and feedback prior to a stable release.
Ormes, James D; Zhang, Dan; Chen, Alex M; Hou, Shirley; Krueger, Davida; Nelson, Todd; Templeton, Allen
2013-02-01
There has been growing interest in amorphous solid dispersions for bioavailability enhancement in drug discovery. Spray drying, as shown in this study, is well suited to producing prototype amorphous dispersions in the Candidate Selection stage, where drug supply is limited. This investigation mapped the processing window of a micro-spray dryer to achieve desired particle characteristics and optimize throughput/yield. Effects of processing variables on the properties of hypromellose acetate succinate were evaluated by a fractional factorial design of experiments. Parameters studied include solid loading, atomization, nozzle size, and spray rate. Response variables include particle size, morphology and yield. Unlike most other commercial small-scale spray dryers, the ProCepT was capable of producing particles over a relatively wide range of mean particle sizes, ca. 2-35 µm, allowing material properties to be tailored to support various applications. In addition, an optimized throughput of 35 g/hour with a yield of 75-95% was achieved, which is sufficient to support studies from lead identification/lead optimization to early safety studies. A regression model was constructed to quantify the relationship between processing parameters and the response variables. The response surface curves provide a useful tool for designing processing conditions, leading to a reduction in development time and drug usage to support drug discovery.
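As a rough illustration of how a fractional factorial design relates process parameters to a response, the sketch below builds a half-fraction two-level design for four coded factors (labels borrowed from the study; all effect sizes and yields are invented) and fits a main-effects regression to simulated data:

```python
import itertools
import numpy as np

# Half-fraction 2^(4-1) design for four factors coded -1/+1
# (solid loading, atomization, nozzle size, spray rate),
# with defining relation D = ABC: 8 runs instead of 16.
base = np.array(list(itertools.product([-1, 1], repeat=3)))  # factors A, B, C
design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])  # D = ABC

# Simulate a yield response and fit y = b0 + b1*A + ... by least squares.
rng = np.random.default_rng(0)
true_effects = np.array([2.0, -1.0, 0.5, 3.0])           # hypothetical effects
y = 20 + design @ true_effects + rng.normal(0, 0.1, 8)   # simulated yield (%)
X = np.column_stack([np.ones(8), design])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coeffs, 1))  # intercept followed by the four main effects
```

Because the design columns are orthogonal, each main effect is estimated independently; the trade-off of the half-fraction is that main effects are aliased with two-factor interactions, which is acceptable when screening for dominant factors.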
Deciphering Sources of Variability in Clinical Pathology.
Tripathi, Niraj K; Everds, Nancy E; Schultze, A Eric; Irizarry, Armando R; Hall, Robert L; Provencher, Anne; Aulbach, Adam
2017-01-01
The objectives of this session were to explore causes of variability in clinical pathology data due to preanalytical and analytical variables as well as study design and other procedures that occur in toxicity testing studies. The presenters highlighted challenges associated with such variability in differentiating test article-related effects from the effects of experimental procedures, and its impact on overall data interpretation. These presentations focused on preanalytical and analytical variables and study design-related factors and their influence on clinical pathology data, and on the importance of various factors that influence data interpretation, including statistical analysis and reference intervals. Overall, these presentations touched upon the potential effects of many variables on clinical pathology parameters, including animal physiology, the sample collection process, specimen handling and analysis, and study design, and offered discussion points on how to manage those variables to ensure accurate interpretation of clinical pathology data in toxicity studies. This article is a brief synopsis of presentations given in a session entitled "Deciphering Sources of Variability in Clinical Pathology-It's Not Just about the Numbers" that occurred at the 35th Annual Symposium of the Society of Toxicologic Pathology in San Diego, California.
Van Schuerbeek, Peter; Baeken, Chris; De Mey, Johan
2016-01-01
Concerns are being raised about the large variability in reported correlations between gray matter morphology and affective personality traits such as ‘Harm Avoidance’ (HA). A recent review study (Mincic 2015) stipulated that this variability could stem from methodological differences between studies. In order to achieve more robust results by standardizing the data processing procedure, as a first step, we repeatedly analyzed data from healthy females while changing the processing settings (voxel-based morphometry (VBM) or region-of-interest (ROI) labeling, smoothing filter width, nuisance parameters included in the regression model, brain atlas, and multiple comparisons correction method). The heterogeneity in the obtained results clearly illustrates the dependence of the study outcome on the chosen analysis settings. Based on our results and the existing literature, we recommend the use of VBM over ROI labeling for whole brain analyses, with a small or intermediate smoothing filter (5-8 mm) and a model variable selection step included in the processing procedure. Additionally, it is recommended that ROI labeling be used only in combination with a clear hypothesis, and authors are encouraged to report their results uncorrected for multiple comparisons as supplementary material to aid review studies. PMID:27096608
All-Sky Census of Variable Stars from the ATLAS Survey
NASA Astrophysics Data System (ADS)
Heinze, Aren Nathaniel; Tonry, John; Denneau, Larry; Stalder, Brian
2018-01-01
The Asteroid Terrestrial-Impact Last Alert Survey uses two custom-built 0.5 meter telescopes to scan the whole accessible sky down to magnitude 19.5 every two nights, with a cadence optimized to detect small asteroids on their 'final plunge' toward impact with Earth. This cadence is also well suited to the detection of variable stars with a huge range of periods and properties, while ATLAS' use of two filters provides additional scientific depth. From the first two years of ATLAS data we have constructed a catalog of several hundred thousand variable objects with periods from one hour to hundreds of days. These include RR Lyrae stars, Cepheids, eclipsing binaries, spotted stars, ellipsoidal variables, Miras, and other objects both regular and irregular. We describe the construction of this catalog, including our multi-step confirmation process for genuine variables; some big-picture scientific conclusions; and prospects for more detailed results.
Juhasz, Barbara J
2016-11-14
Recording eye movements provides information on the time-course of word recognition during reading. Juhasz and Rayner [Juhasz, B. J., & Rayner, K. (2003). Investigating the effects of a set of intercorrelated variables on eye fixation durations in reading. Journal of Experimental Psychology: Learning, Memory and Cognition, 29, 1312-1318] examined the impact of five word recognition variables, including familiarity and age-of-acquisition (AoA), on fixation durations. All variables impacted fixation durations, but the time-course differed. However, the study focused on relatively short, morphologically simple words. Eye movements are also informative for examining the processing of morphologically complex words such as compound words. The present study further examined the time-course of lexical and semantic variables during morphological processing. A total of 120 English compound words that varied in familiarity, AoA, semantic transparency, lexeme meaning dominance, sensory experience rating (SER), and imageability were selected. The impact of these variables on fixation durations was examined when length, word frequency, and lexeme frequencies were controlled in a regression model. The most robust effects were found for familiarity and AoA, indicating that a reader's experience with compound words significantly impacts compound recognition. These results provide insight into semantic processing of morphologically complex words during reading.
Intradaily variability of water quality in a shallow tidal lagoon: Mechanisms and implications
Lucas, L.V.; Sereno, D.M.; Burau, J.R.; Schraga, T.S.; Lopez, C.B.; Stacey, M.T.; Parchevsky, K.V.; Parchevsky, V.P.
2006-01-01
Although surface water quality and its underlying processes vary over time scales ranging from seconds to decades, they have historically been studied at the lower (weekly to interannual) frequencies. The aim of this study was to investigate intradaily variability of three water quality parameters in a small freshwater tidal lagoon (Mildred Island, California). High frequency time series of specific conductivity, water temperature, and chlorophyll a at two locations within the habitat were analyzed in conjunction with supporting hydrodynamic, meteorological, biological, and spatial mapping data. All three constituents exhibited large amplitude intradaily (e.g., semidiurnal tidal and diurnal) oscillations, and periodicity varied across constituents, space, and time. Like other tidal embayments, this habitat is influenced by several processes with distinct periodicities including physical controls, such as tides, solar radiation, and wind, and biological controls, such as photosynthesis, growth, and grazing. A scaling approach was developed to estimate individual process contributions to the observed variability. Scaling results were generally consistent with observations and together with detailed examination of time series and time derivatives, revealed specific mechanisms underlying the observed periodicities, including interactions between the tidal variability, heating, wind, and biology. The implications for monitoring were illustrated through subsampling of the data set. This exercise demonstrated how quantities needed by scientists and managers (e.g., mean or extreme concentrations) may be misrepresented by low frequency data and how short-duration high frequency measurements can aid in the design and interpretation of temporally coarser sampling programs. 
The dispersive export of chlorophyll a from the habitat exhibited a fortnightly variability corresponding to the modulation of semidiurnal tidal currents with the diurnal cycle of phytoplankton variability, demonstrating how high frequency interactions can govern long-term trends. Process identification, as through the scaling analysis here, can help us anticipate changes in system behavior and adapt our own interactions with the system. © 2006 Estuarine Research Federation.
Testing Components of a Self-Management Theory in Adolescents With Type 1 Diabetes Mellitus.
Verchota, Gwen; Sawin, Kathleen J
The role of self-management in adolescents with type 1 diabetes mellitus is not well understood. The purpose of the research was to examine the relationship of key individual and family self-management theory context and process variables to proximal (self-management behaviors) and distal (hemoglobin A1c and diabetes-specific health-related quality of life) outcomes in adolescents with type 1 diabetes. A correlational, cross-sectional study was conducted to identify factors contributing to outcomes in adolescents with type 1 diabetes and to examine potential relationships between context, process, and outcome variables delineated in individual and family self-management theory. Participants were 103 adolescent-parent dyads (adolescents ages 12-17) with type 1 diabetes from a Midwest, outpatient, diabetes clinic. The dyads completed a self-report survey including instruments intended to measure context, process, and outcome variables from individual and family self-management theory. Using hierarchical multiple regression, context (depressive symptoms) and process (communication) variables explained 37% of the variance in self-management behaviors. Regimen complexity, the only significant predictor, explained 11% of the variance in hemoglobin A1c. Neither process variables nor self-management behaviors were significant. For the diabetes-specific health-related quality of life outcome, context (regimen complexity and depressive symptoms) explained 26% of the variance at Step 1; an additional 9% of the variance was explained when process (self-efficacy and communication) variables were added at Step 2; and 52% of the variance was explained when self-management behaviors were added at Step 3. In the final model, three variables were significant predictors: depressive symptoms, self-efficacy, and self-management behaviors. 
The individual and family self-management theory can serve as a cogent theory for understanding key concepts, processes, and outcomes essential to self-management in adolescents and families dealing with Type 1 diabetes mellitus.
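The stepwise entry of variable blocks described above can be sketched as a hierarchical regression on simulated data (all variable names, sample values, and effect sizes below are hypothetical, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Simulated stand-ins for the study's measures.
rng = np.random.default_rng(1)
n = 103
context = rng.normal(size=(n, 2))    # e.g. regimen complexity, depressive symptoms
process = rng.normal(size=(n, 2))    # e.g. self-efficacy, communication
behaviors = rng.normal(size=(n, 1))  # self-management behaviors
y = (context @ [0.5, -0.4] + process @ [0.3, 0.3]
     + 0.6 * behaviors[:, 0] + rng.normal(size=n))  # distal outcome

# Enter blocks successively, reporting the increment in explained variance.
blocks = [("context", context), ("process", process), ("behaviors", behaviors)]
X, prev = np.empty((n, 0)), 0.0
for label, block in blocks:
    X = np.column_stack([X, block])
    r2 = r_squared(X, y)
    print(f"after {label}: R^2 = {r2:.2f} (step gain {r2 - prev:.2f})")
    prev = r2
```

Each step's gain in R² is the additional variance attributed to the newly entered block after accounting for the blocks already in the model, which is how figures like "26%, then +9%, then 52% total" arise.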
Buys, Gerhard M; du Plessis, Lissinda H; Marais, Andries F; Kotze, Awie F; Hamman, Josias H
2013-06-01
Chitosan is a polymer derived from chitin that is widely available at relatively low cost, but due to compression challenges it has limited application for the production of direct compression tablets. The aim of this study was to use certain process and formulation variables to improve manufacturing of tablets containing chitosan as bulking agent. Chitosan particle size and flow properties were determined, including bulk density, tapped density, compressibility and moisture uptake. The effect of process variables (i.e. compression force, punch depth, percentage compaction in a novel double fill compression process) and formulation variables (i.e. type of glidant, citric acid, pectin, coating with Eudragit S®) on chitosan tablet performance (i.e. mass variation, tensile strength, dissolution) was investigated. Moisture content of the chitosan powder, particle size and the inclusion of glidants had a pronounced effect on its flowability. Varying the percentage compaction during the first cycle of a double fill compression process produced chitosan tablets with more acceptable tensile strength and dissolution rate properties. The inclusion of citric acid and pectin in the formulation significantly decreased the dissolution rate of isoniazid from the tablets due to gel formation. Direct compression of chitosan powder into tablets can be significantly improved by the investigated process and formulation variables as well as by applying a double fill compression process.
Practical small-scale explosive seam welding
NASA Technical Reports Server (NTRS)
Bement, L. J.
1983-01-01
Joining principles and variables, types of joints, capabilities, and current and potential applications are described for an explosive seam welding process developed at NASA Langley Research Center. Very small quantities of RDX explosive in a ribbon configuration are used to create narrow (less than 0.5 inch), long length, uniform, hermetically sealed joints that exhibit parent metal properties in a wide variety of metals, alloys, and combinations. The first major application of the process is the repair of four nuclear reactors in Canada. Potential applications include pipelines, sealing of vessels, and assembly of large space structures.
A Random Variable Transformation Process.
ERIC Educational Resources Information Center
Scheuermann, Larry
1989-01-01
Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
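Inverse-transform sampling is the standard technique such a program uses to turn uniform random numbers into other variates; a minimal Python sketch for the exponential case (illustrative only, not the RANVAR listing) is:

```python
import math
import random

def exponential_variate(rate, rng=random):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution."""
    u = rng.random()
    return -math.log(1.0 - u) / rate

random.seed(42)
sample = [exponential_variate(rate=2.0) for _ in range(100_000)]
print(round(sum(sample) / len(sample), 2))  # sample mean ~ 1/rate = 0.5
```

The same idea, applying the inverse cumulative distribution function to a uniform draw, covers several of the seven distributions listed (e.g. uniform and exponential directly), while others such as the normal, binomial, and Poisson are usually generated by specialized transformations or counting schemes.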
Systematic review of the neural basis of social cognition in patients with mood disorders.
Cusi, Andrée M; Nazarov, Anthony; Holshausen, Katherine; Macqueen, Glenda M; McKinnon, Margaret C
2012-05-01
This review integrates neuroimaging studies of 2 domains of social cognition--emotion comprehension and theory of mind (ToM)--in patients with major depressive disorder and bipolar disorder. The influence of key clinical and methodological variables on patterns of neural activation during social cognitive processing is also examined. Studies were identified using PsycINFO and PubMed (January 1967 to May 2011). The search terms were "fMRI," "emotion comprehension," "emotion perception," "affect comprehension," "affect perception," "facial expression," "prosody," "theory of mind," "mentalizing" and "empathy" in combination with "major depressive disorder," "bipolar disorder," "major depression," "unipolar depression," "clinical depression" and "mania." Taken together, neuroimaging studies of social cognition in patients with mood disorders reveal enhanced activation in limbic and emotion-related structures and attenuated activity within frontal regions associated with emotion regulation and higher cognitive functions. These results reveal an overall lack of inhibition by higher-order cognitive structures on limbic and emotion-related structures during social cognitive processing in patients with mood disorders. Critically, key variables, including illness burden, symptom severity, comorbidity, medication status and cognitive load, may moderate this pattern of neural activation. Studies that did not include control tasks or a comparator group were included in this review. Further work is needed to examine the contribution of key moderator variables and to further elucidate the neural networks underlying altered social cognition in patients with mood disorders. The neural networks underlying higher-order social cognitive processes, including empathy, remain unexplored in patients with mood disorders.
Nishiyama, Chika; Brown, Siobhan P; May, Susanne J; Iwami, Taku; Koster, Rudolph W.; Beesems, Stefanie G.; Kuisma, Markku; Salo, Ari; Jacobs, Ian; Finn, Judith; Sterz, Fritz; Nürnberger, Alexander; Smith, Karen; Morrison, Laurie; Olasveengen, Theresa M.; Callaway, Clifton W.; Shin, Sang Do; Gräsner, Jan-Thorsten; Daya, Mohamud; Ma, Matthew Huei-Ming; Herlitz, Johan; Strömsöe, Anneli; Aufderheide, Tom P.; Masterson, Siobhán; Wang, Henry; Christenson, Jim; Stiell, Ian; Davis, Dan; Huszti, Ella; Nichol, Graham
2014-01-01
Objectives Survival after out-of-hospital cardiac arrest (OHCA) varies between communities, due in part to variation in the methods of measurement. The Utstein template was disseminated to standardize comparisons of risk factors, quality of care and outcomes in patients with OHCA. We sought to assess whether OHCA registries are able to collate common data using the Utstein template. A subsequent study will assess whether the Utstein factors explain differences in survival between emergency medical services (EMS) systems. Study design Retrospective study. Setting This retrospective analysis of prospective cohorts included adults treated for OHCA, regardless of the etiology of arrest. Data describing the baseline characteristics of patients, and the process and outcome of their care were grouped by EMS system, de-identified then collated. Included were core Utstein variables and timed event data from each participating registry. This study was classified as exempt from human subjects’ research by a research ethics committee. Measurements and Main Results Twelve registries with 265 first-responding EMS agencies in 14 countries contributed data describing 125,840 cases of OHCA. Variation in inclusion criteria, definition, coding, and process of care variables were observed. Contributing registries collected 61.9% of recommended core variables and 42.9% of timed event variables. Among core variables, the proportion of missingness was mean 1.9 ± 2.2%. The proportion of unknown was mean 4.8 ± 6.4%. Among time variables, missingness was mean 9.0 ± 6.3%. Conclusions International differences in measurement of care after OHCA persist. Greater consistency would facilitate improved resuscitation care and comparison within and between communities. PMID:25010784
Fleury, Marie-Josée; Grenier, Guy; Bamvita, Jean-Marie
2017-01-01
This study identified multiple socio-professional and team effectiveness variables, based on the Input-Mediator-Output-Input (IMOI) model, and tested their associations with job satisfaction for three categories of mental health professionals (nurses, psychologists/psychotherapists, and social workers). Job satisfaction was assessed with the Job Satisfaction Survey. Independent variables were classified into four categories: 1) Socio-professional Characteristics; 2) Team Attributes; 3) Team Processes; and 4) Team Emergent States. Variables were entered successively, by category, into a hierarchical regression model. Team Processes contributed the greatest number of variables to job satisfaction among all professional groups, including team support which was the only significant variable common to all three types of professionals. Greater involvement in the decision-making process, and lower levels of team conflict (Team Processes) were associated with job satisfaction among nurses and social workers. Lower seniority on team (Socio-professional Characteristics), and team collaboration (Team Processes) were associated with job satisfaction among nurses, as was belief in the advantages of interdisciplinary collaboration (Team Emergent States) among psychologists. Knowledge sharing (Team Processes) and affective commitment to the team (Team Emergent States) were associated with job satisfaction among social workers. Results suggest the need for mental health decision-makers and team managers to offer adequate support to mental health professionals, to involve nurses and social workers in the decision-making process, and implement procedures and mechanisms favourable to the prevention or resolution of team conflict with a view toward increasing job satisfaction among mental health professionals.
A Longitudinal Study of Consumer Socialization.
ERIC Educational Resources Information Center
Moschis, George P.; Moore, Roy L.
A study examined the effects of factors (including television, family, peers, age, and socioeconomic status) on consumer socialization, the process by which individuals develop consumption-related cognitions and behaviors. The specific criterion variables studied included consumer affairs knowledge, puffery filtering, consumer finance management,…
Coastal vulnerability assessment with the use of environmental and socio-economic indicators
NASA Astrophysics Data System (ADS)
Alexandrakis, George; Petrakis, Stelios; Vousdoukas, Mixalis; Ghionis, George; Hatziyanni, Eleni; Kampanis, Nikolaos
2014-05-01
Climate change has significant repercussions on the natural environment, triggering obvious changes in the natural processes that have a severe socio-economic impact on the coastal zone, where a great number of human activities are concentrated. So far, the estimation of coastal vulnerability has been based primarily on natural processes and less on socio-economic variables, which would assist in the identification of vulnerable areas. The present investigation proposes a methodology to examine the vulnerability of a highly touristic area on the island of Crete to an expected sea level rise of up to ~40 cm by the year 2100, according to the A1B scenario of IPCC 2007. The methodology includes the combination of socio-economic indicators into a GIS-based coastal vulnerability index for wave-induced erosion. This approach includes three sub-indices that contribute equally to the overall index. The sub-indices refer to coastal forcing, socio-economic and coastal characteristics. All variables are ranked on a 1-5 scale, with 5 indicating higher vulnerability. The socio-economic sub-index includes, as indicators, the population of the study area, cultural heritage sites, transport networks, land use and protection measures. The coastal forcing sub-index includes the frequency of extreme events, while the coastal characteristics sub-index includes the geological variables (coastal geomorphology, historical coastline changes, and regional coastal slope) and the variables representing the marine processes (relative sea level rise, mean significant wave height, and tidal range). The main difficulty in the estimation of the index lies in assessing and ranking the socio-economic indicators. The whole approach was tested and validated through field and desktop studies, using as a case study Elouda bay, Crete Isl., an area of high cultural and economic value, which combines monuments from ancient and medieval times with very high touristic development since the 1970s.
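One widely used way to aggregate 1-5 ranked variables into a coastal vulnerability index is the Gornitz-style square root of the product of the rankings divided by their number; the sketch below is illustrative, and the study's exact aggregation of its three sub-indices may differ:

```python
import math

def cvi(ranked):
    """Classic coastal vulnerability index: sqrt(product of the 1-5
    ranked variables / number of variables). Higher means more vulnerable."""
    if not all(1 <= r <= 5 for r in ranked):
        raise ValueError("each variable must be ranked on a 1-5 scale")
    return math.sqrt(math.prod(ranked) / len(ranked))

# Six illustrative rankings: geomorphology, historical coastline change,
# regional slope, relative sea level rise, wave height, tidal range.
print(round(cvi([4, 3, 5, 4, 2, 3]), 2))  # → 15.49
```

Using the product rather than the mean makes the index sensitive to any single highly vulnerable variable, which is one reason the ranking of soft indicators such as the socio-economic ones matters so much for the final score.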
Nonlinear dynamics of global atmospheric and Earth system processes
NASA Technical Reports Server (NTRS)
Saltzman, Barry
1993-01-01
During the past eight years, we have been engaged in a NASA-supported program of research aimed at establishing the connection between satellite signatures of the earth's environmental state and the nonlinear dynamics of the global weather and climate system. Thirty-five publications and four theses have resulted from this work, which included contributions in five main areas of study: (1) cloud and latent heat processes in finite-amplitude baroclinic waves; (2) application of satellite radiation data in global weather analysis; (3) studies of planetary waves and low-frequency weather variability; (4) GCM studies of the atmospheric response to variable boundary conditions measurable from satellites; and (5) dynamics of long-term earth system changes. Significant accomplishments from the three main lines of investigation pursued during the past year are presented and include the following: (1) planetary atmospheric waves and low frequency variability; (2) GCM studies of the atmospheric response to changed boundary conditions; and (3) dynamics of long-term changes in the global earth system.
Apparatus and method for microwave processing of materials
Johnson, A.C.; Lauf, R.J.; Bible, D.W.; Markunas, R.J.
1996-05-28
Disclosed is a variable frequency microwave heating apparatus designed to allow modulation of the frequency of the microwaves introduced into a furnace cavity for testing or other selected applications. The variable frequency heating apparatus is used in the method of the present invention to monitor the resonant processing frequency within the furnace cavity depending upon the material, including the state thereof, from which the workpiece is fabricated. The variable frequency microwave heating apparatus includes a microwave signal generator and a high-power microwave amplifier or a microwave voltage-controlled oscillator. A power supply is provided for operation of the high-power microwave oscillator or microwave amplifier. A directional coupler is provided for detecting the direction and amplitude of signals incident upon and reflected from the microwave cavity. A first power meter is provided for measuring the power delivered to the microwave furnace. A second power meter detects the magnitude of reflected power. Reflected power is dissipated in the reflected power load. 10 figs.
Landau, Sabine; Emsley, Richard; Dunn, Graham
2018-06-01
Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. 
Estimates of causal mediation effects derived by approach (A) will be biased if one of the three processes involving baseline measures of intermediate or clinical outcomes is operating. Necessary assumptions for the change score approach (B) to provide unbiased estimates under either process include the independence of baseline measures and change scores of the intermediate variable. Finally, estimates provided by the analysis of covariance approach (C) were found to be unbiased under all the three processes considered here. When applied to the example, there was evidence of mediation under all methods but the estimate of the indirect effect depended on the approach used with the proportion mediated varying from 57% to 86%. Trialists planning mediation analyses should measure baseline values of putative mediators as well as of continuous clinical outcomes. An analysis of covariance approach is recommended to avoid potential biases due to confounding processes involving baseline measures of intermediate or clinical outcomes, and not simply for increased precision.
Comparative Effects of Antihistamines on Aircrew Mission Effectiveness under Sustained Operations
1992-06-01
measures consist mainly of process measures. Process measures are measures of activities used to accomplish the mission and produce the final results... They include task completion times and response variability, and information processing rates as they relate to unique task assignment. Performance... contains process measures that assess the individual contributions of hardware/software and human components to overall system performance. Measures
NASA Technical Reports Server (NTRS)
Taminger, Karen M.; Hafley, Robert A.; Domack, Marcia S.
2006-01-01
The layer-additive nature of the electron beam freeform fabrication (EBF3) process results in a tortuous thermal path, producing complex microstructures including: small homogeneous equiaxed grains; dendritic growth contained within larger grains; and/or pervasive dendritic formation in the interpass regions of the deposits. Several process control variables contribute to the formation of these different microstructures, including translation speed, wire feed rate, beam current and accelerating voltage. In electron beam processing, higher accelerating voltages embed the energy deeper below the surface of the substrate. Two EBF3 systems have been established at NASA Langley, one with a low-voltage (10-30 kV) and the other a high-voltage (30-60 kV) electron beam gun. Aluminum alloy 2219 was processed over a range of different variables to explore the design space and correlate the resultant microstructures with the processing parameters. This report specifically explores the impact of accelerating voltage. Of particular interest is correlating energy input to the resultant material characteristics to determine the potential for achieving microstructural control through precise management of the heat flux and cooling rates during deposition.
Coupling of snow and permafrost processes using the Basic Modeling Interface (BMI)
NASA Astrophysics Data System (ADS)
Wang, K.; Overeem, I.; Jafarov, E. E.; Piper, M.; Stewart, S.; Clow, G. D.; Schaefer, K. M.
2017-12-01
We developed a permafrost modeling tool by implementing the Kudryavtsev empirical permafrost active layer depth model (the so-called "Ku" component). The model is specifically set up to have a Basic Model Interface (BMI), which enhances the potential for coupling to other earth surface process model components. This model is accessible through the Web Modeling Tool in the Community Surface Dynamics Modeling System (CSDMS). The Kudryavtsev model has been applied across the entire state of Alaska to model permafrost distribution at high spatial resolution, and model predictions have been verified against Circumpolar Active Layer Monitoring (CALM) in-situ observations. The Ku component uses monthly meteorological forcing, including air temperature, snow depth, and snow density, and predicts active layer thickness (ALT) and the temperature on top of permafrost (TTOP), which are important factors in snow-hydrological processes. BMI provides an easy approach to coupling models with each other. Here, we present a case of coupling the Ku component to snow process components, including the Snow-Degree-Day (SDD) method and Snow-Energy-Balance (SEB) method, which are existing components in the hydrological model TOPOFLOW. The workflow is: (1) get variables from the meteorology component, set the values in the snow process component, and advance the snow process component; (2) get variables from the meteorology and snow components, provide these to the Ku component and advance it; (3) get variables from the snow process component, set the values in the meteorology component, and advance the meteorology component. The next phase is to couple the permafrost component with the fully BMI-compliant TOPOFLOW hydrological model, which could provide a useful tool to investigate the permafrost hydrological effect.
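The three-step get/set/advance workflow described in this abstract can be sketched with a minimal BMI-like interface. The component class, variable names and "physics" below are hypothetical toy stand-ins, not the real CSDMS Meteorology, SDD-snow or Ku components; only the coupling pattern is the point.

```python
class ToyComponent:
    """A tiny BMI-like component holding named scalar variables."""
    def __init__(self, **state):
        self.state = dict(state)
        self.time = 0.0

    def get_value(self, name):
        return self.state[name]

    def set_value(self, name, value):
        self.state[name] = value

    def update(self):
        self.time += 1.0  # advance one (monthly) step


# Hypothetical components with illustrative variable names.
meteo = ToyComponent(air_temperature=-12.0, snow_depth=0.0)
snow = ToyComponent(air_temperature=0.0, snow_depth=0.0)
ku = ToyComponent(air_temperature=0.0, snow_depth=0.0,
                  active_layer_thickness=0.0)

for month in range(12):
    # (1) meteorology -> snow component, then advance the snow component
    snow.set_value("air_temperature", meteo.get_value("air_temperature"))
    snow.update()
    if snow.get_value("air_temperature") < 0.0:  # toy accumulation rule
        snow.set_value("snow_depth", snow.get_value("snow_depth") + 0.05)

    # (2) meteorology + snow -> Ku component, then advance it
    ku.set_value("air_temperature", meteo.get_value("air_temperature"))
    ku.set_value("snow_depth", snow.get_value("snow_depth"))
    ku.update()
    # toy insulation rule: thicker snow -> deeper active layer
    ku.set_value("active_layer_thickness",
                 0.5 + 0.4 * ku.get_value("snow_depth"))

    # (3) snow -> meteorology, then advance the meteorology component
    meteo.set_value("snow_depth", snow.get_value("snow_depth"))
    meteo.set_value("air_temperature",
                    meteo.get_value("air_temperature") + 2.0)  # toy warming
    meteo.update()
```

Because every component exposes the same four calls, the driver loop never needs to know any component's internals, which is exactly what makes BMI-style coupling extensible.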
Effects of in-sewer processes: a stochastic model approach.
Vollertsen, J; Nielsen, A H; Yang, W; Hvitved-Jacobsen, T
2005-01-01
Transformations of organic matter, nitrogen and sulfur in sewers can be simulated taking into account the relevant transformation and transport processes. One objective of such simulation is the assessment and management of hydrogen sulfide formation and corrosion. Sulfide is formed in the biofilms and sediments of the water phase, but corrosion occurs on the moist surfaces of the sewer gas phase. Consequently, both phases and the transport of volatile substances between these phases must be included. Furthermore, wastewater composition and transformations in sewers are complex and subject to high, natural variability. This paper presents the latest developments of the WATS model concept, allowing integrated aerobic, anoxic and anaerobic simulation of the water phase and of gas phase processes. The resulting model is complex and with high parameter variability. An example applying stochastic modeling shows how this complexity and variability can be taken into account.
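A stochastic treatment of parameter variability of the kind this abstract describes can be sketched as follows. The first-order sulfide-formation model, the rate constant and its spread are invented placeholders, not calibrated WATS parameters; the sketch only shows how Monte Carlo sampling turns parameter variability into an output distribution.

```python
import math
import random

random.seed(42)

def sulfide_after(hours, k):
    """Sulfide built up by an assumed first-order process:
    S(t) = S_max * (1 - exp(-k * t))."""
    s_max = 5.0  # assumed asymptotic sulfide level, mg/L
    return s_max * (1.0 - math.exp(-k * hours))

# Sample the uncertain rate constant; a log-normal keeps k positive.
samples = [sulfide_after(4.0, random.lognormvariate(math.log(0.2), 0.3))
           for _ in range(10_000)]

mean = sum(samples) / len(samples)
samples.sort()
p05, p95 = samples[500], samples[9500]  # empirical 5th/95th percentiles
```

The (p05, p95) band is the kind of uncertainty envelope a stochastic sewer-process simulation reports instead of a single deterministic sulfide value.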
Variability of the institutional review board process within a national research network.
Khan, Muhammad A; Barratt, Michelle S; Krugman, Scott D; Serwint, Janet R; Dumont-Driscoll, Marilyn
2014-06-01
To determine the variability of the institutional review board (IRB) process for a minimal-risk multicenter study. Participants included 24 Continuity Research Network (CORNET) sites of the Academic Pediatric Association that participated in a cross-sectional study. Each site obtained individual institutional IRB approval. An anonymous questionnaire was sent to site investigators about the IRB process at their institution. Twenty-two of 24 sites (92%) responded. Preparation time ranged from 1 to 20 hours, with a mean of 7.1 hours. Individuals submitting ≤3 IRB applications/year required more time for completion than those submitting >3/year (P < .05). Thirteen of 22 (59%) study sites received approval with "exempt" status, and 6 (27%) were approved as "expedited" studies. IRB experiences were highly variable across study sites. These findings indicate that multicenter research projects should anticipate barriers to timely study implementation. Improved IRB standardization or centralization for multicenter clinical studies would facilitate this type of practice-based clinical research.
Assessment of reservoir system variable forecasts
NASA Astrophysics Data System (ADS)
Kistenmacher, Martin; Georgakakos, Aris P.
2015-05-01
Forecast ensembles are a convenient means to model water resources uncertainties and to inform planning and management processes. For multipurpose reservoir systems, forecast types include (i) forecasts of upcoming inflows and (ii) forecasts of system variables and outputs such as reservoir levels, releases, flood damage risks, hydropower production, water supply withdrawals, water quality conditions, navigation opportunities, and environmental flows, among others. Forecasts of system variables and outputs are conditional on forecasted inflows as well as on specific management policies and can provide useful information for decision-making processes. Unlike inflow forecasts (in ensemble or other forms), which have been the subject of many previous studies, reservoir system variable and output forecasts are not formally assessed in water resources management theory or practice. This article addresses this gap and develops methods to rectify potential reservoir system forecast inconsistencies and improve the quality of management-relevant information provided to stakeholders and managers. The overarching conclusion is that system variable and output forecast consistency is critical for robust reservoir management and needs to be routinely assessed for any management model used to inform planning and management processes. The above are demonstrated through an application from the Sacramento-American-San Joaquin reservoir system in northern California.
A protective factors model for alcohol abuse and suicide prevention among Alaska Native youth.
Allen, James; Mohatt, Gerald V; Fok, Carlotta Ching Ting; Henry, David; Burkett, Rebekah
2014-09-01
This study provides an empirical test of a culturally grounded theoretical model for prevention of alcohol abuse and suicide risk with Alaska Native youth, using a promising set of culturally appropriate measures for the study of the process of change and outcome. This model is derived from qualitative work that generated a heuristic model of protective factors from alcohol (Allen et al. in J Prev Interv Commun 32:41-59, 2006; Mohatt et al. in Am J Commun Psychol 33:263-273, 2004a; Harm Reduct 1, 2004b). Participants included 413 rural Alaska Native youth ages 12-18 who assisted in testing a predictive model of Reasons for Life and Reflective Processes about alcohol abuse consequences as co-occurring outcomes. Specific individual, family, peer, and community level protective factor variables predicted these outcomes. Results suggest prominent roles for these predictor variables as intermediate prevention strategy target variables in a theoretical model for a multilevel intervention. The model guides understanding of underlying change processes in an intervention to increase the ultimate outcome variables of Reasons for Life and Reflective Processes regarding the consequences of alcohol abuse.
Transport induced by mean-eddy interaction: II. Analysis of transport processes
NASA Astrophysics Data System (ADS)
Ide, Kayo; Wiggins, Stephen
2015-03-01
We present a framework for the analysis of transport processes resulting from the mean-eddy interaction in a flow. The framework is based on the Transport Induced by the Mean-Eddy Interaction (TIME) method presented in a companion paper (Ide and Wiggins, 2014) [1]. The TIME method estimates the (Lagrangian) transport across stationary (Eulerian) boundaries defined by chosen streamlines of the mean flow. Our framework proceeds after first carrying out a sequence of preparatory steps that link the flow dynamics to the transport processes. This includes the construction of the so-called "instantaneous flux" as the Hovmöller diagram. Transport processes are studied by linking the signals of the instantaneous flux field to the dynamical variability of the flow. This linkage also reveals how the variability of the flow contributes to the transport. The spatio-temporal analysis of the flux diagram can be used to assess the efficiency of the variability in transport processes. We apply the method to the double-gyre ocean circulation model in the situation where the Rossby-wave mode dominates the dynamic variability. The spatio-temporal analysis shows that the inter-gyre transport is controlled by the circulating eddy vortices in the fast eastward jet region, whereas the basin-scale Rossby waves have very little impact.
NASA Astrophysics Data System (ADS)
Sosik, Heidi M.; Green, Rebecca E.; Pegau, W. Scott; Roesler, Collin S.
2001-05-01
Relationships between optical and physical properties were examined on the basis of intensive sampling at a site on the New England continental shelf during late summer 1996 and spring 1997. During both seasons, particles were found to be the primary source of temporal and vertical variability in optical properties since light absorption by dissolved material, though significant in magnitude, was relatively constant. Within the particle pool, changes in phytoplankton were responsible for much of the observed optical variability. Physical processes associated with characteristic seasonal patterns in stratification and mixing contributed to optical variability mostly through effects on phytoplankton. An exception to this generalization occurred during summer as the passage of a hurricane led to a breakdown in stratification and substantial resuspension of nonphytoplankton particulate material. Prior to the hurricane, conditions in summer were highly stratified with subsurface maxima in absorption and scattering coefficients. In spring, stratification was much weaker but increased over the sampling period, and a modest phytoplankton bloom caused surface layer maxima in absorption and scattering coefficients. These seasonal differences in the vertical distribution of inherent optical properties were evident in surface reflectance spectra, which were elevated and shifted toward blue wavelengths in the summer. Some seasonal differences in optical properties, including reflectance spectra, suggest that a significant shift toward a smaller particle size distribution occurred in summer. Shorter timescale optical variability was consistent with a variety of influences including episodic events such as the hurricane, physical processes associated with shelfbreak frontal dynamics, biological processes such as phytoplankton growth, and horizontal patchiness combined with water mass advection.
Use and perception of the environment: cultural and developmental processes
Martin M. Chemers; Irwin Altman
1977-01-01
This paper presents a "social systems" orientation for integrating the diverse aspects of environment, culture, and individual behavior. It suggests that a wide range of variables, including the physical environment, cultural and social processes, environmental perceptions and cognitions, behavior, and products of behavior, are connected in a complex,...
NASA Astrophysics Data System (ADS)
Mackey, Audrey Leroy
The impact of demographic, cognitive, and non-cognitive variables on academic success among community college science students was studied. Demographic variables included gender, employment status, and ethnicity. Cognitive variables included college grade point average, assessment status, course prerequisites, college course success ratios, final course grade, withdrawal patterns, and curriculum format. Non-cognitive variables included enrollment status, educational objectives, academic expectations, and career goals. The sample population included students enrolled in human anatomy courses (N = 191) at a large metropolitan community college located in central Texas. Variables that potentially influence attrition and achievement in college level science courses were examined. Final course grade and withdrawal phenomena were treated as dependent variables, while all other variables were treated as independent variables. No significant differences were found to exist between any of the demographic variables studied and the numbers of students who withdrew passing or failing. A difference was shown to be associated with the ethnicity variable and achievement levels. Educational objectives and career goals were shown to have an impact on the number of students who withdrew failing. The career goals variable and the academic expectations variable were shown to have an impact on achievement among daytime and evening students. College grade point average and course success ratios were shown to make a difference among students who withdrew passing. None of the other cognitive variables studied were shown to influence the numbers of students who withdrew passing or failing. College grade point average and course prerequisites, however, were shown to make a difference in achievement. 
The collaborative learning instructional format was found to have no impact on attrition or achievement; however, mean scores earned by students experiencing the collaborative learning format were higher than mean scores among other students. These results are valuable when developing advising strategies and instructional methodologies for community college science students.
Commercially sterilized mussel meats (Mytilus chilensis): a study on process yield.
Almonacid, S; Bustamante, J; Simpson, R; Urtubia, A; Pinto, M; Teixeira, A
2012-06-01
The processing steps most responsible for yield loss in the manufacture of canned mussel meats are the thermal treatments of precooking to remove meats from shells, and thermal processing (retorting) to render the final canned product commercially sterile for long-term shelf stability. The objective of this study was to investigate and evaluate the impact of different combinations of process variables on the ultimate drained weight of the final mussel product (Mytilus chilensis), while verifying that any differences found were statistically and economically significant. The process variables selected for this study were precooking time, brine salt concentration, and retort temperature. Results indicated 2 combinations of process variables producing the widest difference in final drained weight, designated the best combination and worst combination, with 35% and 29% yield, respectively. The significance of this difference was determined by employing a bootstrap methodology, which assumes an empirical distribution of statistical error. A difference of nearly 6 percentage points in total yield was found, representing a 20% increase in annual sales from the same quantity of raw material. In addition to the increase in yield, the conditions for the best process included a retort process time 65% shorter than that for the worst process. This difference in yield could have significant economic impact, important to the mussel canning industry. © 2012 Institute of Food Technologists®
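A bootstrap significance test of the kind this abstract describes can be sketched as follows, on synthetic drained-weight yields centred on the reported 35% and 29% figures. The sample sizes and spreads are assumptions for illustration, not the study's actual data.

```python
import random

random.seed(1)

# Synthetic per-batch yields (%) for the two process-variable combinations.
best = [random.gauss(35.0, 1.5) for _ in range(30)]
worst = [random.gauss(29.0, 1.5) for _ in range(30)]

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(best) - mean(worst)

# Bootstrap under the null hypothesis of no difference: pool the samples,
# resample with replacement, and see how often a difference at least as
# large as the observed one arises by chance.
pooled = best + worst
n_boot = 5000
count = 0
for _ in range(n_boot):
    a = [random.choice(pooled) for _ in range(len(best))]
    b = [random.choice(pooled) for _ in range(len(worst))]
    if abs(mean(a) - mean(b)) >= abs(observed):
        count += 1

p_value = count / n_boot
```

Because resampling is done from the pooled empirical distribution, no normality assumption is needed, which is the appeal of the bootstrap for yield data of unknown distribution.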
Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh
2004-01-01
Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as an adaptive communication device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool; this restriction is a major hindrance. Therefore, a switch-adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable degree variable step size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate in comparison to alternative methods in the literature.
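The adaptive step-size idea behind such a system can be illustrated with a generic normalized-LMS rule, where the effective step shrinks as input power grows. This is a hedged sketch of step-size adaptation only, not the authors' exact variable degree variable step size algorithm or their counter-propagation stage.

```python
import random

random.seed(0)

def lms_identify(xs, ds, n_taps=4, mu=0.5, eps=1e-6):
    """Adapt filter weights w so that w . x[k] tracks the desired d[k].
    The step is normalized by input power, i.e. a variable step size."""
    w = [0.0] * n_taps
    errors = []
    for k in range(n_taps, len(xs)):
        x = xs[k - n_taps:k]
        y = sum(wi * xi for wi, xi in zip(w, x))
        e = ds[k] - y
        power = sum(xi * xi for xi in x) + eps
        step = mu / power  # step shrinks when the input is strong
        w = [wi + step * e * xi for wi, xi in zip(w, x)]
        errors.append(e * e)
    return w, errors

# Identify a known 4-tap system from its input/output data.
true_w = [0.4, -0.2, 0.1, 0.3]
xs = [random.gauss(0, 1) for _ in range(2000)]
ds = [0.0] * len(xs)
for k in range(4, len(xs)):
    ds[k] = sum(wi * xi for wi, xi in zip(true_w, xs[k - 4:k]))

w, errors = lms_identify(xs, ds)
```

The squared-error trace shrinks as the weights converge to the true taps; in a keying-rate application the same mechanism lets the recognizer track a user's drifting dot and dash durations.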
Salari-Moghaddam, Asma; Milajerdi, Alireza; Larijani, Bagher; Esmaillzadeh, Ahmad
2018-06-01
No earlier study has summarized findings from previous publications on processed red meat intake and risk of Chronic Obstructive Pulmonary Disease (COPD). This systematic review and meta-analysis was conducted to examine the association between processed red meat intake and COPD risk. We searched PubMed/Medline, ISI Web of Knowledge, Scopus, EMBASE and Google Scholar up to April 2018 to identify relevant studies. Prospective cohort studies that considered processed red meat as the exposure variable and COPD as the main outcome variable or as one of the outcomes were included in the systematic review. Publications in which hazard ratios (HRs) were reported as the effect size were included in the meta-analysis. Finally, five cohort studies were considered in this systematic review and meta-analysis. In total, 289,952 participants, including 8338 subjects with COPD, aged ≥27 years were included in the meta-analysis. These studies were from Sweden and the US. Linear dose-response meta-analysis revealed that each 50 g/week increase in processed red meat intake was associated with an 8% higher risk of COPD (HR: 1.08; 95% CI: 1.03, 1.13). There was evidence of a non-linear association between processed red meat intake and risk of COPD (P < 0.001). In this systematic review and meta-analysis, we found a significant positive association between processed red meat intake and risk of COPD. CRD42017077971. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
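The pooling step behind such a meta-analysis can be sketched with inverse-variance weighting of log hazard ratios, recovering each study's standard error from its confidence-interval width. The five (HR, 95% CI) pairs below are invented placeholders, not the review's actual study estimates, and a fixed-effect pooling is shown for simplicity.

```python
import math

studies = [  # (HR, lower 95% CI, upper 95% CI) -- hypothetical values
    (1.10, 1.02, 1.19),
    (1.05, 0.97, 1.14),
    (1.12, 1.01, 1.24),
    (1.04, 0.95, 1.14),
    (1.09, 1.00, 1.19),
]

log_hrs, weights = [], []
for hr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    log_hrs.append(math.log(hr))
    weights.append(1.0 / se**2)  # inverse-variance weight

pooled_log = sum(w * b for w, b in zip(weights, log_hrs)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
pooled_hr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
```

Pooling on the log scale keeps the ratio estimates symmetric before exponentiating back to a hazard ratio; a random-effects model would additionally widen the interval for between-study heterogeneity.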
Measuring the wetting angle and perimeter of single wood pulp fibers : a modified method
John H. Klungness
1981-01-01
In pulp processing development it is often necessary to measure the effect of a process variable on individual pulp fiber wettability. Such processes would include drying of market pulps, recycling of secondary fibers, and surface modification of fibers as in sizing. However, if wettability is measured on a fiber sheet surface, the results are confounded by...
ERIC Educational Resources Information Center
Mitrani, Victoria B.; Lewis, John E.; Feaster, Daniel J.; Czaja, Sara J.; Eisdorfer, Carl; Schulz, Richard; Szapocznik, Jose
2006-01-01
Purpose: The purpose of the study was to evaluate the role of family functioning in the stress process in a sample of caregivers of dementia patients by using a structural family framework. The stress-process model of caregiver distress included family functioning as an intervening variable in the relationship between objective burden and…
J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech
2000-01-01
Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...
ERIC Educational Resources Information Center
Tike Bafra, Leyla; Kargin, Tevhide
2009-01-01
This study aims to analyze the attitudes of elementary school teachers, school psychologists and guidance research center personnel regarding developing an individualized educational program (IEP) process as well as challenges faced during the related process, according to several variables. The study included 201 participants who were working in…
The need for spatially explicit quantification of benefits in invasive-species management.
Januchowski-Hartley, Stephanie R; Adams, Vanessa M; Hermoso, Virgilio
2018-04-01
Worldwide, invasive species are a leading driver of environmental change across terrestrial, marine, and freshwater environments and cost billions of dollars annually in ecological damages and economic losses. Resources limit invasive-species control, and planning processes are needed to identify cost-effective solutions. Thus, studies are increasingly considering spatially variable natural and socioeconomic assets (e.g., species persistence, recreational fishing) when planning the allocation of actions for invasive-species management. There is a need to improve understanding of how such assets are considered in invasive-species management. We reviewed over 1600 studies focused on management of invasive species, including flora and fauna. Eighty-four of these studies were included in our final analysis because they focused on the prioritization of actions for invasive species management. Forty-five percent (n = 38) of these studies were based on spatial optimization methods, and 35% (n = 13) accounted for spatially variable assets. Across all 84 optimization studies considered, 27% (n = 23) explicitly accounted for spatially variable assets. Based on our findings, we further explored the potential costs and benefits to invasive species management when spatially variable assets are explicitly considered or not. To include spatially variable assets in decision-making processes that guide invasive-species management there is a need to quantify environmental responses to invasive species and to enhance understanding of potential impacts of invasive species on different natural or socioeconomic assets. We suggest these gaps could be filled by systematic reviews, quantifying invasive species impacts on native species at different periods, and broadening sources and enhancing sharing of knowledge. © 2017 Society for Conservation Biology.
Integrated controls design optimization
Lou, Xinsheng; Neuschaefer, Carl H.
2015-09-01
A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225) and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, while others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
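The income/cost/optimizer split this abstract describes can be sketched as below. The process model, prices and the grid search are invented illustrations of the structure, not the patented implementation.

```python
def process_model(feed_rate):
    """Toy chemical-looping model: predicts output power and sorbent
    makeup from one input variable. Both relations are made up."""
    power_mw = 100.0 * feed_rate - 8.0 * feed_rate**2  # diminishing returns
    makeup_tph = 0.5 * feed_rate**2                    # losses grow quickly
    return power_mw, makeup_tph

def income(power_mw, price_per_mwh=40.0):
    """Income algorithm: revenue from the income-related output."""
    return price_per_mwh * power_mw

def cost(makeup_tph, sorbent_price=300.0):
    """Cost algorithm: expense from the cost-related output."""
    return sorbent_price * makeup_tph

def optimize(lo=0.0, hi=10.0, steps=1000):
    """Optimizer: search the operating range for maximum net benefit."""
    best_x, best_net = lo, float("-inf")
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        power, makeup = process_model(x)
        net = income(power) - cost(makeup)
        if net > best_net:
            best_x, best_net = x, net
    return best_x, best_net

best_feed, best_net = optimize()
```

Here the net benefit is 4000x - 470x², so the grid search should land near the analytic optimum x = 4000/940 ≈ 4.26; a real plant optimizer would replace the grid search with a constrained nonlinear solver.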
NASA Astrophysics Data System (ADS)
Gentry, Jeffery D.
2000-05-01
A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part type or individual serial numbers. Relationships between manufacturing processes and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining and web-based client/server architectures are discussed in the context of composite material manufacturing.
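A minimal sketch of such a layout follows, using a hypothetical two-table schema keyed on lot number; the table and column names are invented, and sqlite3 from the Python standard library stands in for a production database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE process_runs (
        lot_number    TEXT PRIMARY KEY,
        part_type     TEXT,
        cure_temp_c   REAL,  -- manufacturing process variable
        cure_time_min REAL   -- manufacturing process variable
    );
    CREATE TABLE qa_measurements (
        lot_number   TEXT REFERENCES process_runs(lot_number),
        void_content REAL,   -- quality assurance measurement, %
        tensile_mpa  REAL    -- quality assurance measurement
    );
""")
cur.executemany("INSERT INTO process_runs VALUES (?,?,?,?)", [
    ("L001", "panel", 177.0, 120.0),
    ("L002", "panel", 185.0, 90.0),
])
cur.executemany("INSERT INTO qa_measurements VALUES (?,?,?)", [
    ("L001", 0.8, 620.0),
    ("L002", 1.9, 540.0),
])

# Join process variables to product quality through the shared lot number.
rows = cur.execute("""
    SELECT p.lot_number, p.cure_temp_c, q.void_content, q.tensile_mpa
    FROM process_runs p JOIN qa_measurements q USING (lot_number)
    ORDER BY p.lot_number
""").fetchall()
```

The join is the point: once both tables share the lot-number key, process/quality correlations can be pulled across any number of lots with one query.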
Uncertainty modelling of real-time observation of a moving object: photogrammetric measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2015-04-01
Photogrammetric systems are widely used in the field of industrial metrology to measure kinematic tasks such as tracking robot movements. In order to assess spatiotemporal deviations of a kinematic movement, it is crucial to have a reliable uncertainty for the kinematic measurements. Common methods to evaluate the uncertainty in kinematic measurements include approximations specified by the manufacturers, various analytical adjustment methods and Kalman filters. Here a hybrid system estimator in conjunction with a kinematic measurement model is applied. This method can be applied to processes which include various types of kinematic behaviour: constant velocity, variable acceleration or variable turn rates. Additionally, it has been shown that the approach is in accordance with the GUM (Guide to the Expression of Uncertainty in Measurement). The approach is compared to the Kalman filter using simulated data to achieve an overall error calculation. Furthermore, the new approach is used for the analysis of a rotating system, as this system has both a constant and a variable turn rate. As the new approach reduces overshoots, it is more appropriate for analysing kinematic processes than the Kalman filter. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour, with an improved description of the real measurement process. Therefore, this approach is well-suited to the analysis of kinematic processes with unknown changes in kinematic behaviour.
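For reference, the constant-velocity Kalman filter used as the comparison baseline can be sketched in one dimension. The noise levels and time step below are illustrative assumptions, not values from the paper; the state is [position, velocity] and only position is measured.

```python
dt, q, r = 0.1, 0.01, 0.05**2  # time step, process noise, measurement var
x = [0.0, 0.0]                 # state estimate [position, velocity]
P = [[1.0, 0.0], [0.0, 1.0]]   # estimate covariance

def kf_step(x, P, z):
    # Predict with the constant-velocity model x_k = F x_{k-1}, F = [[1,dt],[0,1]].
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # Update with the position measurement z (H = [1, 0]).
    s = Pp[0][0] + r
    k = [Pp[0][0] / s, Pp[1][0] / s]
    innov = z - xp[0]
    x_new = [xp[0] + k[0] * innov, xp[1] + k[1] * innov]
    P_new = [[(1 - k[0]) * Pp[0][0], (1 - k[0]) * Pp[0][1]],
             [Pp[1][0] - k[1] * Pp[0][0], Pp[1][1] - k[1] * Pp[0][1]]]
    return x_new, P_new

# Track a target moving at a constant 1 m/s (noise-free measurements).
for step in range(1, 101):
    z = 1.0 * step * dt
    x, P = kf_step(x, P, z)
```

As long as the motion really is constant-velocity the filter locks on; the overshoot the paper criticizes appears when the true turn rate or acceleration changes and this fixed model lags the manoeuvre.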
Harwell, Mark A.; Gentile, John H.; Cummins, Kenneth W.; Highsmith, Raymond C.; Hilborn, Ray; McRoy, C. Peter; Parrish, Julia; Weingartner, Thomas
2010-01-01
Prince William Sound (PWS) is a semi-enclosed fjord estuary on the coast of Alaska adjoining the northern Gulf of Alaska (GOA). PWS is highly productive and diverse, with primary productivity strongly coupled to nutrient dynamics driven by variability in the climate and oceanography of the GOA and North Pacific Ocean. The pelagic and nearshore primary productivity supports a complex and diverse trophic structure, including large populations of forage and large fish that support many species of marine birds and mammals. High intra-annual, inter-annual, and interdecadal variability in climatic and oceanographic processes drives high variability in the biological populations. A risk-based conceptual ecosystem model (CEM) is presented describing the natural processes, anthropogenic drivers, and resultant stressors that affect PWS, including stressors caused by the Great Alaska Earthquake of 1964 and the Exxon Valdez oil spill of 1989. A trophodynamic model incorporating PWS valued ecosystem components is integrated into the CEM. By representing the relative strengths of drivers/stressors/effects, the CEM graphically demonstrates the fundamental dynamics of the PWS ecosystem, the natural forces that control the ecological condition of the Sound, and the relative contribution of natural processes and human activities to the health of the ecosystem. The CEM illustrates the dominance of natural processes in shaping the structure and functioning of the GOA and PWS ecosystems. PMID:20862192
Preparation of Effective Operating Manuals to Support Waste Management Plant Operator Training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S. R.
2003-02-25
Effective plant operating manuals used in a formal training program can make the difference between a successful operation and a failure. Once the plant process design and control strategies have been fixed, equipment has been ordered, and the plant is constructed, the only major variable affecting success is the capability of plant operating personnel. It is essential that the myriad details concerning plant operation are documented in comprehensive operating manuals suitable for training the non-technical personnel who will operate the plant. These manuals must cover the fundamental principles of each unit operation, including how each operates, what process variables are important, and the impact of each variable on the overall process. In addition, operators must know the process control strategies, process interlocks, how to respond to alarms, each of the detailed procedures required to start up and optimize the plant, and every control loop, including when it is appropriate to take manual control. More than anything else, operating mistakes during the start-up phase can lead to substantial delays in achieving design processing rates as well as to problems with government authorities if environmental permit limits are exceeded. The only way to assure return on plant investment is to ensure plant operators have the knowledge to properly run the plant from the outset. A comprehensive set of operating manuals specifically targeted toward plant operators and supervisors, written by experienced operating personnel, is the only effective way to provide the necessary information for formal start-up training.
Study of oxygen gas production phenomenon during stand and discharge in silver-zinc batteries
NASA Technical Reports Server (NTRS)
1974-01-01
Standard production procedures for manufacturing silver zinc batteries are evaluated and modified to reduce oxygen generation during open circuit stand and discharge. Production predictions of several variable combinations using analysis models are listed for minimum gassing, with emphasis on the concentration of potassium hydroxide in plate formation. A recommendation for work optimizing the variables involved in plate processing is included.
ERIC Educational Resources Information Center
FATTU, N.A.
The report, the second of a series (accession numbers ED 003 239 through ED 003 241), includes seven appendixes to an earlier study. The appendixes are titled: (1) Problems of Meaning and Reference in Bloom's Taxonomy, (2) Scale for Appraising a Model, (3) Exploration of Interactions among Instructional Content and Aptitude Variables, (4) An…
The Impact of ARM on Climate Modeling. Chapter 26
NASA Technical Reports Server (NTRS)
Randall, David A.; Del Genio, Anthony D.; Donner, Leo J.; Collins, William D.; Klein, Stephen A.
2016-01-01
Climate models are among humanity's most ambitious and elaborate creations. They are designed to simulate the interactions of the atmosphere, ocean, land surface, and cryosphere on time scales far beyond the limits of deterministic predictability, and including the effects of time-dependent external forcings. The processes involved include radiative transfer, fluid dynamics, microphysics, and some aspects of geochemistry, biology, and ecology. The models explicitly simulate processes on spatial scales ranging from the circumference of the Earth down to one hundred kilometers or smaller, and implicitly include the effects of processes on even smaller scales, down to a micron or so. The atmospheric component of a climate model can be called an atmospheric general circulation model (AGCM). In an AGCM, calculations are done on a three-dimensional grid, which in some of today's climate models consists of several million grid cells. For each grid cell, about a dozen variables are time-stepped as the model integrates forward from its initial conditions. These so-called prognostic variables have special importance because they are the only things that a model remembers from one time step to the next; everything else is recreated on each time step, starting from the prognostic variables and the boundary conditions. The prognostic variables typically include information about the mass of dry air, the temperature, the wind components, water vapor, various condensed-water species, and at least a few chemical species such as ozone. A good way to understand how climate models work is to consider the lengthy and complex process used to develop one. Let's imagine that a new AGCM is to be created, starting from a blank piece of paper. The model may be intended for a particular class of applications, e.g., high-resolution simulations on time scales of a few decades.
Before a single line of code is written, the conceptual foundation of the model must be designed through a creative envisioning that starts from the intended application and is based on current understanding of how the atmosphere works and the inventory of mathematical methods available.
Quantitative approaches in climate change ecology
Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J
2011-01-01
Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
Design and Development of a Real-Time Model Attitude Measurement System for Hypersonic Facilities
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2005-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and applies photogrammetric principles for point tracking to compute model position including pitch, roll and yaw variables. A discussion of the constraints encountered during the design, development, and testing process, including lighting, vibration, operational range and optical access is included. Initial measurement results from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
Racial and Cultural Factors and Learning Transfer
ERIC Educational Resources Information Center
Closson, Rosemary
2013-01-01
Baldwin and Ford (1988) specifically include learner characteristics as one of three key inputs into the learning transfer process but infrequently (actually almost never) has race, ethnicity, or culture been included as a variable when describing trainee characteristics. For the most part one is left to speculate as to the potential influence…
Friedel, Michael J.
2011-01-01
Few studies attempt to model the range of possible post-fire hydrologic and geomorphic hazards because of the sparseness of data and the coupled, nonlinear, spatial, and temporal relationships among landscape variables. In this study, a type of unsupervised artificial neural network, called a self-organizing map (SOM), is trained using data from 540 burned basins in the western United States. The sparsely populated data set includes variables from independent numerical landscape categories (climate, land surface form, geologic texture, and post-fire condition), independent landscape classes (bedrock geology and state), and dependent initiation processes (runoff, landslide, and runoff-landslide combination) and responses (debris flows, floods, and no events). Pattern analysis of the SOM-based component planes is used to identify and interpret relations among the variables. Application of the Davies-Bouldin criterion following k-means clustering of the SOM neurons identified eight conceptual regional models for focusing future research and empirical model development. A split-sample validation on 60 independent basins (not included in the training) indicates that simultaneous predictions of initiation process and response types are at least 78% accurate. As climate shifts from wet to dry conditions, forecasts across the burned landscape reveal a decreasing trend in the total number of debris flow, flood, and runoff events, with considerable variability among individual basins. These findings suggest the SOM may be useful in forecasting real-time post-fire hazards, long-term post-recovery processes, and the effects of climate change scenarios.
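The SOM training loop itself is compact. A toy one-dimensional map over two numeric variables might look like the following sketch; the unit count, learning-rate schedule, and neighborhood width are illustrative choices, not the study's settings:

```python
import numpy as np

# Toy 1-D self-organizing map: for each sample, the best-matching unit (BMU)
# and its neighbors on the map move toward the sample; learning rate and
# neighborhood width shrink over training.
def train_som(X, n_units=10, epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.uniform(X.min(0), X.max(0), size=(n_units, X.shape[1]))
    idx = np.arange(n_units)
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        for x in rng.permutation(X):
            bmu = np.argmin(((W - x) ** 2).sum(1))             # winner
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * sigma**2)) # neighborhood
            W += lr * h[:, None] * (x - W)
    return W
```

After training, the unit weight vectors (component planes) can be inspected for patterns, and the units themselves clustered, e.g. with k-means scored by the Davies-Bouldin index, as in the study.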
A Process Dynamics and Control Experiment for the Undergraduate Laboratory
ERIC Educational Resources Information Center
Spencer, Jordan L.
2009-01-01
This paper describes a process control experiment. The apparatus includes a three-vessel glass flow system with a variable flow configuration, means for feeding dye solution controlled by a stepper-motor driven valve, and a flow spectrophotometer. Students use impulse response data and nonlinear regression to estimate three parameters of a model…
ERIC Educational Resources Information Center
Jackson, J. Kasi; Latimer, Melissa; Stoiko, Rachel
2017-01-01
This study sought to understand predictors of faculty satisfaction with promotion and tenure processes and reasonableness of expectations in the context of a striving institution. The factors we investigated included discipline (high-consensus [science and math] vs. low-consensus [humanities and social sciences]); demographic variables; and…
Relapse Model among Iranian Drug Users: A Qualitative Study.
Jalali, Amir; Seyedfatemi, Naiemeh; Peyrovi, Hamid
2015-01-01
Relapse is a common problem in drug users' rehabilitation programs and is reported throughout the country. An in-depth study of patients' experiences can be used to explore the relapse process among drug users. Therefore, this study proposes a model of the relapse process among Iranian drug users. In this qualitative study with a grounded theory approach, 22 participants with rich information about the phenomenon under study were selected using purposive, snowball, and theoretical sampling methods. After obtaining informed consent, data were collected through face-to-face, in-depth, semi-structured interviews. All interviews were analyzed in three stages of open, axial, and selective coding. Nine main categories emerged, including avoiding drugs, concerns about being accepted, family atmosphere, social conditions, mental challenge, self-management, self-deception, and use and remorse, together with a core category, feeling of loss, as the core variable. Mental challenge has two subcategories: evoking pleasure and craving. The relapse model is a dynamic and systematic process, spanning cycles from drug avoidance to remorse, with feeling of loss as the core variable; this process needs effective control. Defining the relapse model as a clear process could be helpful in clinical sessions. The results of this research depict the relapse process among Iranian drug users in a conceptual model.
Viscoelasticity, postseismic slip, fault interactions, and the recurrence of large earthquakes
Michael, A.J.
2005-01-01
The Brownian Passage Time (BPT) model for earthquake recurrence is modified to include transient deformation due to either viscoelasticity or deep postseismic slip. Both of these processes act to increase the rate of loading on the seismogenic fault for some time after a large event. To approximate these effects, a decaying exponential term is added to the BPT model's uniform loading term. The resulting interevent time distributions remain approximately lognormal, but the balance between the level of noise (e.g., unknown fault interactions) and the coefficient of variability of the interevent time distribution changes depending on the shape of the loading function. For a given level of noise in the loading process, transient deformation has the effect of increasing the coefficient of variability of earthquake interevent times. Conversely, the level of noise needed to achieve a given level of variability is reduced when transient deformation is included. Using less noise would then increase the effect of known fault interactions modeled as stress or strain steps, because they would be larger with respect to the noise. If we only seek to estimate the shape of the interevent time distribution from observed earthquake occurrences, then the use of a transient deformation model will not dramatically change the results of a probability study, because a similarly shaped distribution can be achieved with either uniform or transient loading functions. However, if the goal is to estimate earthquake probabilities based on our increasing understanding of the seismogenic process, including earthquake interactions, then including transient deformation is important to obtain accurate results. For example, a loading curve based on the 1906 earthquake, paleoseismic observations of prior events, and observations of recent deformation in the San Francisco Bay region produces a 40% greater variability in earthquake recurrence than a uniform loading model with the same noise level.
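The qualitative claim that a decaying post-event loading transient raises the coefficient of variation (CV) of interevent times can be checked with a small Monte Carlo sketch. The loading rate, transient amplitude, noise level, and threshold below are illustrative values, not calibrated to the San Francisco Bay region:

```python
import numpy as np

# Time-to-threshold simulation: uniform loading plus an optional decaying
# exponential transient, perturbed by Brownian noise. Returns the CV of
# simulated "interevent" times. All parameter values are illustrative.
def interevent_cv(transient_amp, rate=1.0, tau=10.0, noise=0.3,
                  threshold=50.0, n=1000, dt=0.5, seed=0):
    rng = np.random.default_rng(seed)
    times = []
    for _ in range(n):
        load, t = 0.0, 0.0
        while load < threshold:
            t += dt
            # uniform rate + post-event transient + Brownian perturbation
            step = rate * dt + transient_amp * np.exp(-t / tau) * dt
            load += step + noise * np.sqrt(dt) * rng.normal()
        times.append(t)
    times = np.asarray(times)
    return float(times.std() / times.mean())
```

The transient shortens the mean interevent time while the noise-driven scatter near the threshold is largely unchanged, so the ratio (the CV) increases, which is the paper's point about fixed noise levels.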
NASA Astrophysics Data System (ADS)
Toscano, Joseph Christopher
Several fundamental questions about speech perception concern how listeners understand spoken language despite considerable variability in speech sounds across different contexts (the problem of lack of invariance in speech). This contextual variability is caused by several factors, including differences between individual talkers' voices, variation in speaking rate, and effects of coarticulatory context. A number of models have been proposed to describe how the speech system handles differences across contexts. Critically, these models make different predictions about (1) whether contextual variability is handled at the level of acoustic cue encoding or categorization, (2) whether it is driven by feedback from category-level processes or interactions between cues, and (3) whether listeners discard fine-grained acoustic information to compensate for contextual variability. Separating the effects of cue- and category-level processing has been difficult because behavioral measures tap processes that occur well after initial cue encoding and are influenced by task demands and linguistic information. Recently, we have used the event-related brain potential (ERP) technique to examine cue encoding and online categorization. Specifically, we have looked at differences in the auditory N1 as a measure of acoustic cue encoding and the P3 as a measure of categorization. This allows us to examine multiple levels of processing during speech perception and can provide a useful tool for studying effects of contextual variability. Here, I apply this approach to determine the point in processing at which context has an effect on speech perception and to examine whether acoustic cues are encoded continuously. Several types of contextual variability (talker gender, speaking rate, and coarticulation), as well as several acoustic cues (voice onset time, formant frequencies, and bandwidths), are examined in a series of experiments. 
The results suggest that (1) at early stages of speech processing, listeners encode continuous differences in acoustic cues, independent of phonological categories; (2) at post-perceptual stages, fine-grained acoustic information is preserved; and (3) there is preliminary evidence that listeners encode cues relative to context via feedback from categories. These results are discussed in relation to proposed models of speech perception and sources of contextual variability.
A social-cognitive framework of multidisciplinary team innovation.
Paletz, Susannah B F; Schunn, Christian D
2010-01-01
The psychology of science typically lacks integration between cognitive and social variables. We present a new framework of team innovation in multidisciplinary science and engineering groups that ties factors from both literatures together. We focus on the effects of a particularly challenging social factor, knowledge diversity, which has a history of mixed effects on creativity, most likely because those effects are mediated and moderated by cognitive and additional social variables. In addition, we highlight the distinction between team innovative processes that are primarily divergent versus convergent; we propose that the social and cognitive implications are different for each, providing a possible explanation for knowledge diversity's mixed results on team outcomes. Social variables mapped out include formal roles, communication norms, sufficient participation and information sharing, and task conflict; cognitive variables include analogy, information search, and evaluation. This framework provides a roadmap for research that aims to harness the power of multidisciplinary teams. Copyright © 2009 Cognitive Science Society, Inc.
Mediators of weight loss in a family-based intervention presented over the internet.
White, Marney A; Martin, Pamela D; Newton, Robert L; Walden, Heather M; York-Crowe, Emily E; Gordon, Stewart T; Ryan, Donna H; Williamson, Donald A
2004-07-01
To assess the process variables involved in a weight loss program for African-American adolescent girls. Several process variables have been identified as affecting success in in vivo weight loss programs for adults and children, including program adherence, self-efficacy, and social support. The current study sought to broaden the understanding of these process variables as they pertain to an intervention program that is presented using the Internet. It was hypothesized that variables such as program adherence, dietary self-efficacy, psychological factors, and family environment factors would mediate the effect of the experimental condition on weight loss. Participants were 57 adolescent African-American girls who joined the program with one obese parent; family pairs were randomized to either a behavioral or control condition in an Internet-based weight loss program. Outcome data (weight loss) are reported for the first 6 months of the intervention. Results partially supported the hypotheses. For weight loss among adolescents, parent variables pertaining to life and family satisfaction were the strongest mediating variables. For parental weight loss, changes in dietary practices over the course of 6 months were the strongest mediators. The identification of factors that enhance or impede weight loss for adolescents is an important step in improving weight loss programs for this group. The current findings suggest that family/parental variables exert a strong influence on weight loss efforts for adolescents and should be considered in developing future programs. Copyright 2004 NAASO
Sources of biomass feedstock variability and the potential impact on biofuels production
Williams, C. Luke; Westover, Tyler L.; Emerson, Rachel M.; ...
2015-11-23
Terrestrial lignocellulosic biomass has the potential to be a carbon-neutral and domestic source of fuels and chemicals. However, the innate variability of biomass resources, such as herbaceous and woody materials, and the inconsistency within a single resource due to disparate growth and harvesting conditions, present challenges for downstream processes, which often require materials that are physically and chemically consistent. Intrinsic biomass characteristics, including moisture content, carbohydrate and ash compositions, bulk density, and particle size/shape distributions, are highly variable and can impact the economics of transforming biomass into value-added products. For instance, ash content increases by an order of magnitude between woody and herbaceous feedstocks (from ~0.5 to 5%, respectively) while lignin content drops by a factor of two (from ~30 to 15%, respectively). This increase in ash and reduction in lignin leads to biofuel conversion consequences, such as reduced pyrolysis oil yields for herbaceous products as compared to woody material. In this review, the sources of variability for key biomass characteristics are presented for multiple types of biomass. Additionally, this review investigates the major impacts of the variability in biomass composition on four conversion processes: fermentation, hydrothermal liquefaction, pyrolysis, and direct combustion. Finally, future research directions aimed at reducing the detrimental impacts of biomass variability on conversion to fuels and chemicals are proposed.
Cronin, Thomas M.
2016-01-01
Climate change (including climate variability) refers to regional or global changes in mean climate state or in patterns of climate variability over decades to millions of years often identified using statistical methods and sometimes referred to as changes in long-term weather conditions (IPCC, 2012). Climate is influenced by changes in continent-ocean configurations due to plate tectonic processes, variations in Earth’s orbit, axial tilt and precession, atmospheric greenhouse gas (GHG) concentrations, solar variability, volcanism, internal variability resulting from interactions between the atmosphere, oceans and ice (glaciers, small ice caps, ice sheets, and sea ice), and anthropogenic activities such as greenhouse gas emissions and land use and their effects on carbon cycling.
Qiao, Yuanhua; West, Harry H; Mannan, M Sam; Johnson, David W; Cornwell, John B
2006-03-17
Liquefied natural gas (LNG) release, spread, evaporation, and dispersion processes are illustrated using the Federal Energy Regulatory Commission models in this paper. The spillage consequences are dependent upon the tank conditions, release scenarios, and the environmental conditions. The effects of the contributing variables, including the tank configuration, breach hole size, ullage pressure, wind speed and stability class, and surface roughness, on the consequence of LNG spillage onto water are evaluated using the models. The sensitivities of the consequences to those variables are discussed.
Digital Archiving: Where the Past Lives Again
NASA Astrophysics Data System (ADS)
Paxson, K. B.
2012-06-01
The process of digital archiving for variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools including a Step Magnitude Calculator and a Julian Date Calculator for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition software conversion is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
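A Julian Date calculator of the kind described can be written directly from the standard astronomical conversion formula (Gregorian calendar assumed). This is a generic implementation of the well-known algorithm, not the author's Excel tool:

```python
def julian_date(year, month, day, ut_hours=0.0):
    """Convert a Gregorian calendar date and UT time to a Julian Date."""
    if month <= 2:                 # treat Jan/Feb as months 13/14 of prior year
        year -= 1
        month += 12
    a = year // 100
    b = 2 - a + a // 4             # Gregorian calendar correction
    jd = (int(365.25 * (year + 4716)) + int(30.6001 * (month + 1))
          + day + b - 1524.5)      # JD begins at noon, hence the 0.5
    return jd + ut_hours / 24.0
```

For example, the J2000.0 epoch (2000 January 1, 12:00) evaluates to JD 2451545.0, and observations from the pre-1911 era discussed above reduce the same way.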
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Gordon M.; Robertson, Amy; Jonkman, Jason
A database of meteorological and ocean conditions is presented for use in offshore wind energy research and design. The original data are from 23 ocean sites around the USA and were obtained from the National Data Buoy Center run by the National Oceanic and Atmospheric Administration. The data are presented in a processed form that includes the variables of interest for offshore wind energy design: wind speed, significant wave height, wave peak-spectral period, wind direction, and wave direction. For each site, a binning process is conducted to create conditional probability functions for each of these variables. The sites are then grouped according to geographic location and combined to create three representative sites: a West Coast site, an East Coast site, and a Gulf of Mexico site. Both the processed data and the probability distribution parameters for the individual and representative sites are being hosted on a publicly available domain by the National Renewable Energy Laboratory, with the intent of providing a standard basis of comparison for meteorological and ocean conditions for offshore wind energy research worldwide.
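The binning step reduces to a conditional histogram. A sketch of the idea, with illustrative variable names and bin edges rather than the database's actual schema:

```python
import numpy as np

# Bin one variable (here wind speed), then form the conditional distribution
# of a second variable (significant wave height) within each bin.
def conditional_hist(wind, hs, wind_edges, hs_edges):
    probs = {}
    bin_idx = np.digitize(wind, wind_edges) - 1
    for i in range(len(wind_edges) - 1):
        in_bin = hs[bin_idx == i]
        if in_bin.size == 0:
            continue
        counts, _ = np.histogram(in_bin, bins=hs_edges)
        probs[i] = counts / counts.sum()   # P(Hs bin | wind-speed bin i)
    return probs
```

Repeating this for each variable pair of interest (wave period, wind and wave direction, etc.) yields the conditional probability functions the database distributes.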
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.
2009-08-01
This work presents a methodological framework in the form of a short algorithmic procedure (including 11 activity steps and 3 decision nodes) designed for determining the optimal subsidy for a materials-saving investment through recycle/recovery (RR) at the industrial level. Two case examples are presented, covering both cases, without and with recycling. The expected Relative Cost Decrease (RCD) due to recycling, which forms a critical index for decision making on subsidizing, is estimated. The developed procedure can be extended outside the industrial unit to include collection/transportation/processing of recyclable wasted products. Since, in such a case, transportation cost and processing cost are conflicting, interdependent variables (when the quantity collected/processed Q is the independent/explanatory variable), the determination of Qopt is examined under energy-crisis conditions, when corresponding subsidies might be granted to restore the original equilibrium and avoid putting the recycling enterprise in jeopardy due to a dangerous lowering of the first break-even point.
Zhang, Xingyu; Kim, Joyce; Patzer, Rachel E; Pitts, Stephen R; Patzer, Aaron; Schrager, Justin D
2017-10-26
To describe and compare logistic regression and neural network modeling strategies for predicting hospital admission or transfer following initial presentation to Emergency Department (ED) triage, with and without the addition of natural language processing elements. Using data from the National Hospital Ambulatory Medical Care Survey (NHAMCS), a cross-sectional probability sample of United States EDs from the 2012 and 2013 survey years, we developed several predictive models with the outcome being admission to the hospital or transfer vs. discharge home. We included patient characteristics immediately available after the patient has presented to the ED and undergone a triage process. We used this information to construct logistic regression (LR) and multilayer neural network (MLNN) models, which included natural language processing (NLP) and principal component analysis applied to the patient's reason for visit. Ten-fold cross validation was used to test the predictive capacity of each model, and the area under the receiver operating characteristic curve (AUC) was then calculated for each model. Of the 47,200 ED visits from 642 hospitals, 6,335 (13.42%) resulted in hospital admission (or transfer). A total of 48 principal components were extracted by NLP from the reason for visit fields, which explained 75% of the overall variance for hospitalization. In the model including only structured variables, the AUC was 0.824 (95% CI 0.818-0.830) for logistic regression and 0.823 (95% CI 0.817-0.829) for MLNN. Models including only free-text information generated an AUC of 0.742 (95% CI 0.731-0.753) for logistic regression and 0.753 (95% CI 0.742-0.764) for MLNN. When both structured variables and free-text variables were included, the AUC reached 0.846 (95% CI 0.839-0.853) for logistic regression and 0.844 (95% CI 0.836-0.852) for MLNN.
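The logistic-regression half of this comparison can be illustrated with a minimal gradient-descent implementation. The data here are synthetic, and the feature matrix stands in for the paper's structured triage variables concatenated with NLP/PCA text features:

```python
import numpy as np

# Minimal logistic regression by batch gradient descent; X rows are feature
# vectors (structured + text-derived), y is 0/1 admission vs. discharge.
def fit_logistic(X, y, lr=0.1, epochs=500):
    X = np.hstack([np.ones((len(X), 1)), X])     # intercept column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))         # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)         # gradient of log-loss
    return w

def predict_proba(w, X):
    X = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X @ w))
```

Adding text-derived columns to X is exactly the "structured plus free text" variant; the paper's finding is that this widening of the feature matrix, not the choice of LR vs. MLNN, is what moves the AUC.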
The predictive accuracy of hospital admission or transfer for patients who presented to ED triage overall was good, and was improved with the inclusion of free text data from a patient's reason for visit regardless of modeling approach. Natural language processing and neural networks that incorporate patient-reported outcome free text may increase predictive accuracy for hospital admission.
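The pipeline this record describes (free-text reason for visit reduced to latent components, concatenated with structured triage variables, fed to a logistic regression evaluated by ten-fold cross-validation) can be sketched roughly as follows. This is a hedged illustration only: the data, field names, and component count below are synthetic placeholders, not NHAMCS values or the authors' implementation.

```python
# Sketch of the described approach: vectorize free text, reduce to latent
# components (analogous to the study's 48 PCs), combine with structured
# variables, and score a logistic regression by cross-validated AUC.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
reasons = ["chest pain", "ankle sprain", "shortness of breath",
           "headache", "abdominal pain", "cough and fever"] * 50
admitted = rng.integers(0, 2, size=len(reasons))   # outcome: admit/transfer vs. discharge (synthetic)
structured = rng.normal(size=(len(reasons), 3))    # e.g., age, vital signs (synthetic)

# NLP step: bag-of-words -> latent components of the reason-for-visit text
text_features = TfidfVectorizer().fit_transform(reasons)
components = TruncatedSVD(n_components=5, random_state=0).fit_transform(text_features)

X = np.hstack([structured, components])            # structured + free-text features
model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, X, admitted, cv=10, scoring="roc_auc").mean()
print(round(float(auc), 3))
```

With real triage data, comparing the AUC of the structured-only, text-only, and combined feature sets reproduces the kind of comparison reported above.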
The relationship between Urbanisation and changes in flood regimes: the British case.
NASA Astrophysics Data System (ADS)
Prosdocimi, Ilaria; Miller, James; Kjeldsen, Thomas
2013-04-01
This pilot study investigates whether long-term changes in observed series of extreme flood events can be attributed to changes in climate and land-use drivers. We investigate, in particular, changes in winter and summer peaks extracted from gauged instantaneous flow records in selected British catchments. Using a Poisson process framework, the frequency and magnitude of extreme events above a threshold can be modelled simultaneously under the standard stationarity assumptions of constant location and scale. In the case of a non-stationary process, the framework was extended to include covariates to account for changes in the process parameters. By including covariates related to the physical process, such as increased urbanisation or North Atlantic Oscillation (NAO) Index levels, rather than just time, an enhanced understanding of the changes in high flows is obtainable. Indeed, some variability is expected in any natural process and can be partially explained by large-scale measures like the NAO Index. The focus of this study is to understand, once natural variability is taken into account, how much of the remaining variability can be explained by increased urbanisation levels. For this study, catchments are selected that have experienced significant growth in urbanisation in past decades, typically the 1960s to present, and for which concurrent good-quality high-flow data are available. Temporal change in the urban extent within catchments is obtained using novel processing of historical mapping sources, whereby the urban, suburban and rural fractions are obtained for decadal periods. Suitable flow data from localised rural catchments are also included as control cases against which to compare observed changes in the flood regime of urbanised catchments, and to provide evidence of changes in regional climate. 
Initial results suggest that the effect of urbanisation can be detected in the rate of occurrence of flood events, especially in summer, whereas the impact on flood magnitude is less pronounced. Further tests across a greater number of catchments are necessary to validate these results.
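The covariate-dependent occurrence rate in this framework can be illustrated with a small sketch: annual counts of peaks-over-threshold events are modeled as Poisson with a log-linear dependence on covariates such as urban fraction and NAO index. The data and coefficients below are synthetic, not the study's catchment records.

```python
# Nonstationary Poisson occurrence-rate sketch with synthetic covariates.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
years = np.arange(1960, 2011)
urban = np.linspace(0.05, 0.30, years.size)       # growing urban fraction (synthetic)
nao = rng.normal(size=years.size)                 # winter NAO index (synthetic)
rate = np.exp(0.2 + 2.0 * urban + 0.3 * nao)      # assumed "true" event rate
counts = rng.poisson(rate)                        # threshold exceedances per year

X = np.column_stack([urban, nao])
model = PoissonRegressor(alpha=1e-3, max_iter=1000).fit(X, counts)
print(model.coef_)  # estimated log-rate effects of urbanisation and NAO
```

A positive fitted urbanisation coefficient, after the NAO covariate absorbs part of the natural variability, corresponds to the detectable effect on occurrence rate reported above.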
Model-free adaptive control of supercritical circulating fluidized-bed boilers
Cheng, George Shu-Xing; Mulkey, Steven L
2014-12-16
A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-input-7-output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. These boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.
NASA Astrophysics Data System (ADS)
Dafflon, B.; Barrash, W.; Cardiff, M.; Johnson, T. C.
2011-12-01
Reliable predictions of groundwater flow and solute transport require an estimation of the detailed distribution of the parameters (e.g., hydraulic conductivity, effective porosity) controlling these processes. However, such parameters are difficult to estimate because of the inaccessibility and complexity of the subsurface. In this regard, developments in parameter estimation techniques and investigations of field experiments remain challenging and necessary to improve our understanding and prediction of hydrological processes. Here we analyze a conservative tracer test conducted at the Boise Hydrogeophysical Research Site in 2001 in a heterogeneous unconfined fluvial aquifer. Relevant characteristics of this test include: variable-density (sinking) effects because of the injection concentration of the bromide tracer, the relatively small size of the experiment, and the availability of various sources of geophysical and hydrological information. The information contained in this experiment is evaluated through several parameter estimation approaches, including a grid-search-based strategy, stochastic simulation of hydrological property distributions, and deterministic inversion using regularization and pilot-point techniques. This allows us to investigate hydraulic conductivity and effective porosity distributions and to compare the effects of assumptions from several methods and parameterizations. Our results provide new insights into the understanding of variable-density transport processes and the hydrological relevance of incorporating various sources of information in parameter estimation approaches. Among other findings, the variable-density effect and the effective porosity distribution, as well as their coupling with the hydraulic conductivity structure, are seen to be significant in the transport process. The results also show that assumed prior information can strongly influence the estimated distributions of hydrological properties.
The Hubble Catalog of Variables
NASA Astrophysics Data System (ADS)
Gavras, P.; Bonanos, A. Z.; Bellas-Velidis, I.; Charmandaris, V.; Georgantopoulos, I.; Hatzidimitriou, D.; Kakaletris, G.; Karampelas, A.; Laskaris, N.; Lennon, D. J.; Moretti, M. I.; Pouliasis, E.; Sokolovsky, K.; Spetsieri, Z. T.; Tsinganos, K.; Whitmore, B. C.; Yang, M.
2017-06-01
The Hubble Catalog of Variables (HCV) is a 3-year, ESA-funded project that aims to develop a set of algorithms to identify variables among the sources included in the Hubble Source Catalog (HSC) and produce the HCV. We will process all HSC sources with more than a predefined number of measurements in a single filter/instrument combination and compute a range of lightcurve features to determine the variability status of each source. At the end of the project, the first release of the Hubble Catalog of Variables will be made available at the Mikulski Archive for Space Telescopes (MAST) and the ESA Science Archives. The variability detection pipeline will be implemented at the Space Telescope Science Institute (STScI) so that updated versions of the HCV may be created following future releases of the HSC.
Goode, C; LeRoy, J; Allen, D G
2007-01-01
This study reports on a multivariate analysis of the moving bed biofilm reactor (MBBR) wastewater treatment system at a Canadian pulp mill. The modelling approach involved a data overview by principal component analysis (PCA) followed by partial least squares (PLS) modelling with the objective of explaining and predicting changes in the BOD output of the reactor. Over two years of data with 87 process measurements were used to build the models. Variables were collected from the MBBR control scheme as well as upstream in the bleach plant and in digestion. To account for process dynamics, a variable lagging approach was used for variables with significant temporal correlations. It was found that wood type pulped at the mill was a significant variable governing reactor performance. Other important variables included flow parameters, faults in the temperature or pH control of the reactor, and some potential indirect indicators of biomass activity (residual nitrogen and pH out). The most predictive model was found to have an RMSEP value of 606 kgBOD/d, representing a 14.5% average error. This was a good fit, given the measurement error of the BOD test. Overall, the statistical approach was effective in describing and predicting MBBR treatment performance.
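The "variable lagging" step can be pictured with a small sketch: each upstream measurement is shifted by its estimated transport/response delay so that every row of the predictor matrix is aligned with the BOD measurement it influences. The column names and lag values below are hypothetical, not the mill's actual tags or delays.

```python
# Align upstream process variables with downstream BOD via assumed lags.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "bleach_plant_cod": rng.normal(size=100),  # upstream measurement (synthetic)
    "digester_kappa": rng.normal(size=100),    # upstream measurement (synthetic)
    "bod_out": rng.normal(size=100),           # reactor output to be predicted
})

lags = {"bleach_plant_cod": 3, "digester_kappa": 5}   # assumed delays, in samples
for col, k in lags.items():
    df[f"{col}_lag{k}"] = df[col].shift(k)            # shift forward in time

aligned = df.dropna()          # keep rows where all lagged predictors exist
print(aligned.shape)           # (95, 5)
```

The lagged columns, rather than the raw ones, would then feed the PCA/PLS models described above.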
System for monitoring an industrial process and determining sensor status
Gross, K.C.; Hoyer, K.K.; Humenik, K.E.
1995-10-17
A method and system for monitoring an industrial process and a sensor are disclosed. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.
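A rough numerical sketch of the scheme described above, under assumed signal forms and thresholds: the difference of two redundant signals for one process variable is cleaned of its dominant (serially correlated) Fourier modes, and the near-white residual is fed to a sequential probability ratio test. The mode count, test mean, and error rates are illustrative choices, not values from the patent.

```python
# Difference -> Fourier composite -> residual -> probability ratio test.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1024)
# Difference function: deterministic structure plus measurement noise (synthetic)
diff = 0.5 * np.sin(2 * np.pi * t / 64) + rng.normal(scale=0.1, size=t.size)

# Composite function: reconstruction from the largest-amplitude Fourier modes
spectrum = np.fft.rfft(diff)
keep = np.argsort(np.abs(spectrum))[-8:]             # 8 dominant modes (assumed)
composite = np.zeros_like(spectrum)
composite[keep] = spectrum[keep]
residual = diff - np.fft.irfft(composite, n=t.size)  # approximately white noise

# Sequential test on the residual: mean 0 vs. mean m, alpha = beta = 0.01
m, var = 0.2, residual.var()
llr = np.cumsum((m * residual - m ** 2 / 2) / var)   # cumulative log-likelihood ratio
alarm = bool(np.any(llr > np.log(0.99 / 0.01)))      # upper (fault) boundary crossed?
print(alarm)
```

With a healthy sensor the log-likelihood ratio drifts toward the lower boundary; a mean shift in the residual drives it across the upper boundary and raises the alarm.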
System for monitoring an industrial process and determining sensor status
Gross, K.C.; Hoyer, K.K.; Humenik, K.E.
1997-05-13
A method and system are disclosed for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.
System for monitoring an industrial process and determining sensor status
Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.
1995-01-01
A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.
System for monitoring an industrial process and determining sensor status
Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.
1997-01-01
A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.
Apparatus and method for microwave processing of materials
Johnson, Arvid C.; Lauf, Robert J.; Bible, Don W.; Markunas, Robert J.
1996-01-01
A variable frequency microwave heating apparatus (10) designed to allow modulation of the frequency of the microwaves introduced into a furnace cavity (34) for testing or other selected applications. The variable frequency heating apparatus (10) is used in the method of the present invention to monitor the resonant processing frequency within the furnace cavity (34) depending upon the material, including the state thereof, from which the workpiece (36) is fabricated. The variable frequency microwave heating apparatus (10) includes a microwave signal generator (12) and a high-power microwave amplifier (20) or a microwave voltage-controlled oscillator (14). A power supply (22) is provided for operation of the high-power microwave oscillator (14) or microwave amplifier (20). A directional coupler (24) is provided for detecting the direction and amplitude of signals incident upon and reflected from the microwave cavity (34). A first power meter (30) is provided for measuring the power delivered to the microwave furnace (32). A second power meter (26) detects the magnitude of reflected power. Reflected power is dissipated in the reflected power load (28).
Processing data base information having nonwhite noise
Gross, Kenneth C.; Morreale, Patricia
1995-01-01
A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
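As a toy illustration of the metamodel-based robust strategy (not the paper's FE-based implementation), the sketch below fits a cheap quadratic surrogate to an assumed expensive response, propagates a noise variable through it by Monte Carlo, and picks the design minimizing mean + 3·sigma. The response function and distributions are invented for illustration only.

```python
# Metamodel-based robust optimization sketch: surrogate + Monte Carlo noise.
import numpy as np

def simulation(x, z):
    """Stand-in for an expensive FE response: design variable x, noise variable z."""
    return (x - 1.5) ** 2 + 0.8 * x * z

rng = np.random.default_rng(4)
# "DOE": sample the design/noise space and fit a quadratic metamodel by least squares
X = rng.uniform(0.0, 3.0, 200)
Z = rng.normal(0.0, 0.2, 200)
A = np.column_stack([np.ones_like(X), X, X ** 2, Z, X * Z])
coef, *_ = np.linalg.lstsq(A, simulation(X, Z), rcond=None)

def metamodel(x, z):
    z = np.asarray(z)
    return coef[0] + coef[1] * x + coef[2] * x * x + coef[3] * z + coef[4] * x * z

# Robust objective: mean + 3*sigma over the noise distribution, per candidate design
z_mc = rng.normal(0.0, 0.2, 2000)
designs = np.linspace(0.0, 3.0, 61)
robust = [vals.mean() + 3.0 * vals.std()
          for vals in (metamodel(x, z_mc) for x in designs)]
best = designs[int(np.argmin(robust))]
# The robust optimum sits below the deterministic optimum x = 1.5 because the
# noise sensitivity 0.8*x*z grows with x and is penalized by the sigma term.
```

In the paper's sequential strategy the surrogate is then refined with new FE evaluations near the candidate robust optimum; here a single fit suffices for the toy response.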
Huff, Mark J.; Bodner, Glen E.
2014-01-01
Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583
The role of primary auditory and visual cortices in temporal processing: A tDCS approach.
Mioni, G; Grondin, S; Forgione, M; Fracasso, V; Mapelli, D; Stablum, F
2016-10-15
Many studies have shown that visual stimuli are frequently experienced as shorter than equivalent auditory stimuli. These findings suggest that timing is distributed across many brain areas and that "different clocks" might be involved in temporal processing. The aim of this study was to investigate, through the application of tDCS over V1 and A1, the specific role of primary sensory cortices (visual or auditory) in temporal processing. Forty-eight university students were included in the study. Twenty-four participants were stimulated over A1 and 24 participants were stimulated over V1. Participants performed time bisection tasks, in the visual and the auditory modalities, involving standard durations lasting 300 ms (short) and 900 ms (long). When tDCS was delivered over A1, no effect of stimulation was observed on perceived duration, but we observed higher temporal variability under anodal stimulation compared to sham and higher variability in the visual compared to the auditory modality. When tDCS was delivered over V1, an under-estimation of perceived duration and higher variability were observed in the visual compared to the auditory modality. Our results showed more variability of visual temporal processing under tDCS stimulation. These results suggest a modality-independent role of A1 in temporal processing and a modality-specific role of V1 in the processing of temporal intervals in the visual modality. Copyright © 2016 Elsevier B.V. All rights reserved.
FLUXNET2015 Dataset: Batteries included
NASA Astrophysics Data System (ADS)
Pastorello, G.; Papale, D.; Agarwal, D.; Trotta, C.; Chu, H.; Canfora, E.; Torn, M. S.; Baldocchi, D. D.
2016-12-01
The synthesis datasets have become one of the signature products of the FLUXNET global network. They are composed from contributions of individual site teams to regional networks, which are then compiled into uniform data products, now used in a wide variety of research efforts: from plant-scale microbiology to global-scale climate change. The FLUXNET Marconi Dataset in 2000 was the first in the series, followed by the FLUXNET LaThuile Dataset in 2007, with significant additions of data products and coverage, solidifying the adoption of the datasets as a research tool. The FLUXNET2015 Dataset comes with another round of substantial improvements, including extended quality control processes and checks, use of downscaled reanalysis data for filling long gaps in micrometeorological variables, multiple methods for USTAR threshold estimation and flux partitioning, and uncertainty estimates, all accompanied by auxiliary flags. This "batteries included" approach provides a wealth of information for anyone who wants to explore the data (and the processing methods) in detail. It inevitably leads to a large number of data variables. Although dealing with all these variables might seem overwhelming at first, especially to someone looking at eddy covariance data for the first time, there is method to our madness. In this work we describe the data products and variables that are part of the FLUXNET2015 Dataset, and the rationale behind the organization of the dataset, covering the simplified version (labeled SUBSET), the complete version (labeled FULLSET), and the auxiliary products in the dataset.
On the Past, Present, and Future of Eastern Boundary Upwelling Systems
NASA Astrophysics Data System (ADS)
Bograd, S. J.; Black, B.; Garcia-Reyes, M.; Rykaczewski, R. R.; Thompson, S. A.; Turley, B. D.; van der Sleen, P.; Sydeman, W. J.
2016-12-01
Coastal upwelling in Eastern Boundary Upwelling Systems (EBUS) drives high productivity and marine biodiversity and supports lucrative commercial fishing operations. There is thus significant interest in understanding the mechanisms underlying variations in the upwelling process, its drivers, and potential changes under global warming. Here we review recent results from a combination of regional and global observations, reanalysis products, and climate model projections that describe variability in coastal upwelling in EBUS. Key findings include: (1) interannual variability in California Current upwelling occurs in two orthogonal seasonal modes: a winter/early spring mode dominated by interannual variability and a summer mode dominated by a long-term increasing trend; (2) there is substantial coherence in year-to-year variability between this winter/spring upwelling mode and upper trophic level demographic processes, including fish growth rates (rockfish and salmon) and seabird phenology, breeding success and survival; (3) a meta-analysis of existing literature suggests consistency with the Bakun (1990) hypothesis that rising global greenhouse-gas concentrations would result in upwelling-favorable wind intensification; however, (4) an ensemble of coupled, global ocean-atmosphere models finds limited evidence for intensification of upwelling-favorable winds over the 21st century, although summertime winds near the poleward boundaries of climatological upwelling zones are projected to intensify. We will also review a new comparative research program between the California and Benguela Upwelling Systems, including efforts to understand patterns of change and variation between climate, upwelling, fish, and seabirds.
Salgado, Diana; Torres, J Antonio; Welti-Chanes, Jorge; Velazquez, Gonzalo
2011-08-01
Consumer demand for food safety and quality improvements, combined with new regulations, requires determining the processor's confidence level that processes lowering safety risks while retaining quality will meet consumer expectations and regulatory requirements. Monte Carlo calculation procedures incorporate input data variability to obtain the statistical distribution of the output of prediction models. This advantage was used to analyze the survival risk of Mycobacterium avium subspecies paratuberculosis (M. paratuberculosis) and Clostridium botulinum spores in high-temperature short-time (HTST) milk and canned mushrooms, respectively. The results showed an estimated 68.4% probability that the 15 sec HTST process would not achieve at least 5 decimal reductions in M. paratuberculosis counts. Although estimates of the raw milk load of this pathogen are not available to estimate the probability of finding it in pasteurized milk, the wide range of the estimated decimal reductions, reflecting the variability of the experimental data available, should be a concern to dairy processors. Knowledge of the variability in C. botulinum initial load and decimal thermal time was used to estimate an 8.5 min thermal process time at 110 °C for canned mushrooms, reducing the risk to 10⁻⁹ spores/container with 95% confidence. This value was substantially higher than the one estimated using average values (6.0 min), which carried an unacceptable 68.6% probability of missing the desired processing objective. Finally, the benefit of reducing the variability in initial load and decimal thermal time was confirmed, achieving a 26.3% reduction in processing time when standard deviation values were lowered by 90%. In spite of novel technologies, commercialized or under development, thermal processing continues to be the most reliable and cost-effective alternative to deliver safe foods. 
However, the severity of the process should be assessed to avoid under- and over-processing and determine opportunities for improvement. This should include a systematic approach to consider variability in the parameters for the models used by food process engineers when designing a thermal process. The Monte Carlo procedure here presented is a tool to facilitate this task for the determination of process time at a constant lethal temperature. © 2011 Institute of Food Technologists®
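The Monte Carlo procedure for process time at a constant lethal temperature can be sketched minimally as follows, with invented distribution parameters (not the paper's data): the required time is computed for each sampled initial spore load and decimal reduction time (D-value), and the 95th percentile gives the design time meeting the objective with 95% confidence, versus the naive estimate from average inputs.

```python
# Monte Carlo design of a constant-temperature thermal process time.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
log_n0 = rng.normal(loc=-1.0, scale=0.5, size=n)   # log10 initial spores/container (assumed)
d110 = rng.normal(loc=1.2, scale=0.15, size=n)     # D-value at 110 °C, minutes (assumed)
d110 = np.clip(d110, 0.5, None)                    # keep D physically positive

target = -9.0                                      # objective: 1e-9 spores/container
t_required = d110 * (log_n0 - target)              # first-order thermal death kinetics
t_mean = np.mean(d110) * (np.mean(log_n0) - target)  # naive average-input estimate
t_95 = np.quantile(t_required, 0.95)               # design value, 95% confidence
print(round(t_mean, 1), round(t_95, 1))
```

As in the study, the 95th-percentile design time exceeds the average-input estimate, and shrinking the input standard deviations pulls the two values together.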
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.
1974-01-01
A two-year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with a manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls for the critical process variables to improve results and uniformity. A critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.
The market for airline aircraft: A study of process and performance
NASA Technical Reports Server (NTRS)
1976-01-01
The key variables accounting for the nature, timing and magnitude of the equipment and re-equipment cycle are identified and discussed. Forecasts of aircraft purchases by U.S. trunk airlines over the next 10 years are included to examine the anatomy of equipment forecasts in a way that serves to illustrate how certain of these variables or determinants of aircraft demand can be considered in specific terms.
NASA Astrophysics Data System (ADS)
McGuire, K. J.; Bailey, S. W.; Ross, D. S.
2017-12-01
Heterogeneity in biophysical properties within catchments challenges how we quantify and characterize biogeochemical processes and interpret catchment outputs. Interactions between the spatiotemporal variability of hydrological states and fluxes and soil development can spatially structure catchments, leading to a framework for understanding patterns in biogeochemical processes. In an upland, glaciated landscape at the Hubbard Brook Experimental Forest (HBEF) in New Hampshire, USA, we are embracing the structure and organization of soils to understand the spatial relations between runoff production zones, distinct soil-biogeochemical environments, and solute retention and release. This presentation will use observations from the HBEF to demonstrate that a soil-landscape framework is essential in understanding the spatial and temporal variability of biogeochemical processes in this catchment. Specific examples will include how laterally developed soils reveal the location of active runoff production zones and lead to gradients in primary mineral dissolution and the distribution of weathering products along hillslopes. Soil development patterns also highlight potential carbon and nitrogen cycling hotspots, differentiate acidic conditions, and affect the regulation of surface water quality. Overall, this work demonstrates the importance of understanding the landscape-level structural organization of soils in characterizing the variation and extent of biogeochemical processes that occur in catchments.
Saha, Felix J; Brüning, Alexander; Barcelona, Cyrus; Büssing, Arndt; Langhorst, Jost; Dobos, Gustav; Lauche, Romy; Cramer, Holger
2016-07-01
Integrative medicine inpatient treatment has been shown to improve physical and mental health in patients with internal medicine conditions. The aim of this study was to investigate the effectiveness of a 2-week integrative medicine inpatient treatment in patients with chronic pain syndromes and the association of treatment success with patient-related process variables. Inpatients with chronic pain syndromes participating in a 2-week integrative medicine inpatient program were included. Patients' pain intensity, pain disability, pain perception, quality of life, depression, and perceived stress were measured on admission, at discharge, and 6 months after discharge. Likewise, process variables including ability and will to change, emotional/rational disease acceptance, mindfulness, life and health satisfaction, and easiness of life were assessed. A total of 310 inpatients (91% female, mean age 50.7 ± 12.4 years, 26.5% low back pain, and 22.9% fibromyalgia) were included. Using mixed linear models, significant improvements in pain intensity, pain disability, pain perception, quality of life, depression, and perceived stress were found (all P < 0.05). Ability to change and implementation, disease acceptance, mindfulness, life and health satisfaction, and light heartedness/easiness likewise improved (all P < 0.05). Improved outcomes were associated with increases in process variables, mainly ability to change and implementation, disease acceptance, life and health satisfaction, and light heartedness/easiness (R = 0.03-0.40). Results of this study suggest that a 2-week integrative medicine inpatient treatment can benefit patients with chronic pain conditions. Functional improvements are associated with improved ability to change and implementation, disease acceptance, and satisfaction.
NASA Astrophysics Data System (ADS)
Brown, S. M.; Behn, M. D.; Grove, T. L.
2017-12-01
We present results of a combined petrologic - geochemical (major and trace element) - geodynamical forward model for mantle melting and subsequent melt modification. The model advances Behn & Grove (2015), and is calibrated using experimental petrology. Our model allows for melting in the plagioclase, spinel, and garnet fields with a flexible retained melt fraction (from pure batch to pure fractional), tracks residual mantle composition, and includes melting with water, variable melt productivity, and mantle mode calculations. This approach is valuable for understanding oceanic crustal accretion, which involves mantle melting and melt modification by migration and aggregation. These igneous processes result in mid-ocean ridge basalts that vary in composition at the local (segment) and global scale. The important variables are geophysical and geochemical and include mantle composition, potential temperature, mantle flow, and spreading rate. Accordingly, our model allows us to systematically quantify the importance of each of these external variables. In addition to discriminating melt generation effects, we are able to discriminate the effects of different melt modification processes (inefficient pooling, melt-rock reaction, and fractional crystallization) in generating both local, segment-scale and global-scale compositional variability. We quantify the influence of a specific igneous process on the generation of oceanic crust as a function of variations in the external variables. We also find that it is unlikely that garnet lherzolite melting produces a signature in either major or trace element compositions formed from aggregated melts, because when melting does occur in the garnet field at high mantle temperature, it contributes a relatively small, uniform fraction (< 10%) of the pooled melt compositions at all spreading rates. 
Additionally, while increasing water content and/or temperature promote garnet melting, they also increase melt extent, pushing the pooled composition to lower Sm/Yb and higher Lu/Hf.
Rao, Neena K.; Motes, Michael A.; Rypma, Bart
2014-01-01
Several fMRI studies have examined brain regions mediating inter-subject variability in cognitive efficiency, but none have examined regions mediating intra-subject variability in efficiency. Thus, the present study was designed to identify brain regions involved in intra-subject variability in cognitive efficiency via participant-level correlations between trial-level reaction time (RT) and trial-level fMRI BOLD percent signal change on a processing speed task. On each trial, participants indicated whether a digit-symbol probe-pair was present or absent in an array of nine digit-symbol probe-pairs while fMRI data were collected. Deconvolution analyses, using RT time-series models (derived from the proportional scaling of an event-related hemodynamic response function model by trial-level RT), were used to evaluate relationships between trial-level RTs and BOLD percent signal change. Although task-related patterns of activation and deactivation were observed in regions including bilateral occipital, bilateral parietal, portions of the medial wall such as the precuneus, default mode network regions including anterior cingulate, posterior cingulate, bilateral temporal, right cerebellum, and right cuneus, RT-BOLD correlations were observed in a more circumscribed set of regions. Positive RT-BOLD correlations, where fast RTs were associated with lower BOLD percent signal change, were observed in regions including bilateral occipital, bilateral parietal, and the precuneus. RT-BOLD correlations were not observed in the default mode network indicating a smaller set of regions associated with intra-subject variability in cognitive efficiency. The results are discussed in terms of a distributed area of regions that mediate variability in the cognitive efficiency that might underlie processing speed differences between individuals. PMID:25374527
1992-09-01
abilities is fit along with the autoregressive process. Initially, the influences on search performance of within-group age and sex were included as control... Results: Performance/Ability Structure Measurement Model: Ability Structure. The correlations between all the ability measures, age, and sex are... subsequent analyses for young adults. Age and sex were included as control variables. There was an age range of 15 years; this range is sufficiently large that
[Development and validation of quality standards for colonoscopy].
Sánchez Del Río, Antonio; Baudet, Juan Salvador; Naranjo Rodríguez, Antonio; Campo Fernández de Los Ríos, Rafael; Salces Franco, Inmaculada; Aparicio Tormo, Jose Ramón; Sánchez Muñoz, Diego; Llach, Joseph; Hervás Molina, Antonio; Parra-Blanco, Adolfo; Díaz Acosta, Juan Antonio
2010-01-30
Before starting programs for colorectal cancer screening it is necessary to evaluate the quality of colonoscopy. Our objectives were to develop a group of easily applicable quality indicators for colonoscopy and to determine the variability of their achievement. After reviewing the bibliography we prepared 21 potential quality indicators, which were submitted to a selection process in which we measured their face validity, content validity, reliability, and viability of measurement. We estimated the variability of their achievement by means of the coefficient of variation (CV) and the variability of the achievement of the standards by means of the chi-squared test. Six indicators passed the selection process: informed consent, medication administered, completed colonoscopy, complications, every polyp removed and recovered, and adenoma detection rate in patients older than 50 years. A total of 1,928 colonoscopies from eight endoscopy units were included; every unit contributed the same number of colonoscopies, selected by simple random sampling with replacement. There was important variability in the achievement of some indicators and standards: medication administered (CV 43%, p<0.01), complications registered (CV 37%, p<0.01), every polyp removed and recovered (CV 12%, p<0.01), and adenoma detection rate in patients older than 50 years (CV 2%, p<0.01). We have validated six easily measurable quality indicators for colonoscopy. Important variability exists in the achievement of some indicators and standards. Our data highlight the importance of developing continuous quality improvement programmes for colonoscopy before starting colorectal cancer screening. Copyright (c) 2009 Elsevier España, S.L. All rights reserved.
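The coefficient of variation used to quantify between-unit variability is simply the standard deviation expressed as a percentage of the mean. A minimal sketch; the per-unit rates below are hypothetical illustrations, not the study's data:

```python
import statistics

def coefficient_of_variation(rates):
    """CV (%) of an indicator's achievement rate across endoscopy units."""
    mean = statistics.mean(rates)
    if mean == 0:
        raise ValueError("mean achievement is zero; CV undefined")
    return 100.0 * statistics.stdev(rates) / mean

# Hypothetical per-unit achievement rates for one indicator
# (eight units, as in the study; the numbers are invented).
sedation_rates = [0.95, 0.40, 0.88, 0.72, 0.55, 0.91, 0.30, 0.64]
print(round(coefficient_of_variation(sedation_rates), 1))
```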
Doing It Right: 366 answers to computing questions you didn't know you had
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herring, Stuart Davis
Slides include information on history: version control, version control: branches, version control: Git, releases, requirements, readability, readability: control flow, global variables, architecture, architecture: redundancy, processes, input/output, Unix, etcetera.
A Production Function Approach to Regional Environmental-Economic Assessments
Numerous difficulties await those creating regional-scale environmental assessments, from data having inconsistent spatial or temporal scales to poorly understood environmental processes and indicators. Including socioeconomic variables further complicates the situation. In place...
ERIC Educational Resources Information Center
Malgady, Robert G.; Zayas, Luis H.
2001-01-01
Nearly half of the people who seek mental health services in the century ahead will be ethnic minorities. Evidence suggests that Hispanics, the fastest growing minority group, present higher levels of psychiatric symptomatology and higher prevalence rates of disorders. A discussion is included on process variables that can inform social…
ERIC Educational Resources Information Center
Bavin, Edith L.; Prendergast, Luke A.; Kidd, Evan; Baker, Emma; Dissanayake, Cheryl
2016-01-01
Background: There is variability in the language of children with autism, even those who are high functioning. However, little is known about how they process language structures in real time, including how they handle potential ambiguity, and whether they follow referential constraints. Previous research with older autism spectrum disorder (ASD)…
ERIC Educational Resources Information Center
Bean, John P.
A theoretical model of turnover in work organizations was applied to the college student dropout process at a major midwestern land-grant university. The 854 freshman women subjects completed a questionnaire that included measures for 14 independent variables: grades, practical value, development, routinization, instrumental communication,…
Post-processing method for wind speed ensemble forecast using wind speed and direction
NASA Astrophysics Data System (ADS)
Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin
2017-04-01
Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, like wind speed forecasting, most of the predictive information is contained in one variable in the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85% of the sites the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared to only using wind speed. On average the improvements were about 5%, but mainly for moderate to strong wind situations. For weak wind speeds adding wind direction had a more or less neutral impact.
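The idea of folding a second predictor into BMA can be sketched as follows: each ensemble member's conditional mean depends on both forecast speed and direction (direction entering through sine/cosine terms), and the predictive density is a weighted mixture of kernels centred on those means. All coefficients, weights, and the Gaussian kernel choice below are illustrative assumptions, not the paper's fitted model (which uses a flexible regression and calibrated member weights):

```python
import math

def conditional_mean(speed, direction_deg, coef=(0.3, 0.9, 0.4, -0.2)):
    """Member-specific conditional mean: linear in forecast speed, with
    wind direction entering via sine/cosine terms (a stand-in for the
    flexible regression in the paper; coefficients here are invented)."""
    a, b, c, d = coef
    th = math.radians(direction_deg)
    return a + b * speed + c * math.sin(th) + d * math.cos(th)

def bma_pdf(y, member_forecasts, weights, sigma=1.2):
    """BMA predictive density: weighted mixture of normal kernels, one
    centred on each member's bias-corrected conditional mean."""
    def normal_pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return sum(w * normal_pdf(y, conditional_mean(sp, dr), sigma)
               for w, (sp, dr) in zip(weights, member_forecasts))

members = [(8.0, 270.0), (9.5, 300.0), (7.2, 250.0)]  # (speed m/s, direction deg)
weights = [0.5, 0.3, 0.2]
print(round(bma_pdf(8.0, members, weights), 4))
```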
Obradović, Jelena
2012-05-01
The focus of this article is to present current progress in understanding the interplay among adversity, physiological sensitivity to context, and adaptive functioning, with an emphasis on implications and future directions for resilience researchers. It includes a review of current literature that demonstrates (a) links between various levels of adversity exposure and variability in physiological reactivity, (b) how the interplay between children's physiological reactivity and different sources of risk and adversity relates to variability in adaptive functioning, and (c) various approaches for capturing a more dynamic nature of physiological reactivity and related processes. Throughout, important conceptual and empirical issues are highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicklaus, Dennis J.
2013-10-13
We have developed an Erlang language implementation of the Channel Access protocol. Included are low-level functions for encoding and decoding Channel Access protocol network packets as well as higher-level functions for monitoring or setting EPICS process variables. This provides access to EPICS process variables for the Fermilab Acnet control system via our Erlang-based front-end architecture without having to interface to C/C++ programs and libraries. Erlang is a functional programming language originally developed for real-time telecommunications applications. Its network programming features and list management functions make it particularly well-suited for the task of managing multiple Channel Access circuits and PV monitors.
Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Evaluation of hydrological processes is temporally dependent, and hydrological time series that include dependence components do not meet the consistency assumption underlying hydrological computation. Both factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for significance evaluation of hydrological dependence based on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component and selecting reasonable thresholds of the correlation coefficient, this method divides the significance of dependence into five degrees: no variability, weak variability, mid variability, strong variability, and drastic variability. By deducing the relationship between the correlation coefficient and the auto-correlation coefficients of the series, we found that the correlation coefficient is mainly determined by the magnitudes of the auto-correlation coefficients from order 1 to order p, which clarifies the theoretical basis of the method. With the first-order and second-order auto-regression models as examples, the reasonableness of the deduced formula was verified through Monte Carlo experiments classifying the relationship between the correlation coefficient and the auto-correlation coefficient. The method was used to analyze three observed hydrological time series. The results indicated the coexistence of stochastic and dependence characteristics in hydrological processes.
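For an AR(1) model the dependence component is phi·x[t-1], so the correlation between the series and that component reduces to the lag-1 autocorrelation, which is why the criterion is governed by the autocorrelation coefficients. A minimal sketch; the grading thresholds below are illustrative, not the paper's:

```python
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation, used here as the AR(1) coefficient estimate."""
    mu = statistics.mean(x)
    num = sum((a - mu) * (b - mu) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - mu) ** 2 for a in x)
    return num / den

def dependence_significance(x, thresholds=(0.1, 0.3, 0.5, 0.8)):
    """Grade dependence from the correlation between the series and its
    AR(1) dependence component (phi * x[t-1]); that correlation equals
    |phi|, the lag-1 autocorrelation.  Threshold values are invented."""
    r = abs(lag1_autocorr(x))
    grades = ("no variability", "weak", "mid", "strong", "drastic")
    return r, grades[sum(r >= t for t in thresholds)]

print(dependence_significance([1.0, -1.0] * 10))       # strongly dependent
print(dependence_significance([0.0, 1.0, 0.0, -1.0] * 5))  # no lag-1 dependence
```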
Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S
2018-06-21
The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
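The burst-estimate-project loop described above can be caricatured with a one-species birth-death process (the rates and step sizes below are invented for illustration; the paper's projected variables are populations of building blocks of species, not a raw count):

```python
import random

random.seed(2)
K_PROD, K_DEG = 10.0, 0.1          # toy birth-death rates (not from the paper)

def gillespie_burst(n, t_burst):
    """Short burst of exact stochastic simulation (Gillespie SSA)."""
    t = 0.0
    while True:
        total = K_PROD + K_DEG * n
        dt = random.expovariate(total)
        if t + dt > t_burst:
            return n
        t += dt
        n += 1 if random.random() < K_PROD / total else -1

def equation_free(n0, t_end, burst=0.5, leap=4.0):
    """Alternate short exact bursts with projective Euler steps on the
    coarse variable, bypassing the firing of many individual reactions."""
    n, t = float(n0), 0.0
    while t < t_end:
        n1 = gillespie_burst(round(n), burst)   # burst of exact simulation
        dndt = (n1 - n) / burst                 # finite-difference derivative
        n = max(0.0, n1 + leap * dndt)          # project forward in time
        t += burst + leap
    return n

# The toy model's steady state is K_PROD / K_DEG = 100; the projected
# trajectory relaxes toward it while simulating only 1/9 of the time exactly.
print(round(equation_free(0, 60.0)))
```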
Systematic review of the neural basis of social cognition in patients with mood disorders
Cusi, Andrée M.; Nazarov, Anthony; Holshausen, Katherine; MacQueen, Glenda M.; McKinnon, Margaret C.
2012-01-01
Background This review integrates neuroimaging studies of 2 domains of social cognition — emotion comprehension and theory of mind (ToM) — in patients with major depressive disorder and bipolar disorder. The influence of key clinical and method variables on patterns of neural activation during social cognitive processing is also examined. Methods Studies were identified using PsycINFO and PubMed (January 1967 to May 2011). The search terms were “fMRI,” “emotion comprehension,” “emotion perception,” “affect comprehension,” “affect perception,” “facial expression,” “prosody,” “theory of mind,” “mentalizing” and “empathy” in combination with “major depressive disorder,” “bipolar disorder,” “major depression,” “unipolar depression,” “clinical depression” and “mania.” Results Taken together, neuroimaging studies of social cognition in patients with mood disorders reveal enhanced activation in limbic and emotion-related structures and attenuated activity within frontal regions associated with emotion regulation and higher cognitive functions. These results reveal an overall lack of inhibition by higher-order cognitive structures on limbic and emotion-related structures during social cognitive processing in patients with mood disorders. Critically, key variables, including illness burden, symptom severity, comorbidity, medication status and cognitive load may moderate this pattern of neural activation. Limitations Studies that did not include control tasks or a comparator group were included in this review. Conclusion Further work is needed to examine the contribution of key moderator variables and to further elucidate the neural networks underlying altered social cognition in patients with mood disorders. The neural networks underlying higher-order social cognitive processes, including empathy, remain unexplored in patients with mood disorders. PMID:22297065
NASA Astrophysics Data System (ADS)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
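The random phasor sums mentioned above lend themselves to a quick numerical check: summing many unit phasors with independent uniform phases gives a Rayleigh-distributed resultant amplitude, so the mean intensity equals the number of phasors. A small Monte Carlo sketch (illustrative, not from the book):

```python
import cmath, math, random, statistics

random.seed(7)

def random_phasor_sum(n_phasors):
    """Resultant amplitude of n unit phasors with independent uniform
    phases -- the classic model behind fully developed speckle."""
    total = sum(cmath.exp(1j * random.uniform(0.0, 2.0 * math.pi))
                for _ in range(n_phasors))
    return abs(total)

# For large n the amplitude is Rayleigh distributed and the intensity
# |A|^2 is exponential with mean n.
amps = [random_phasor_sum(100) for _ in range(2000)]
mean_intensity = statistics.mean(a * a for a in amps)
print(round(mean_intensity, 1))   # close to 100
```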
Cho, Jinmyoung; Martin, Peter; Poon, Leonard W; MacDonald, M; Jazwinski, S M; Green, R C; Gearing, M; Johnson, M A; Markesbery, W R; Woodard, J L; Tenover, J S; Siegler, L C; Rott, C; Rodgers, W L; Hausman, D; Arnold, J; Davey, A
2013-01-01
The developmental adaptation model (Martin & Martin, 2002) provides insights into how current experiences and resources (proximal variables) and past experiences (distal variables) are correlated with outcomes (e.g., well-being) in later life. Applying this model, the current study examined proximal and distal variables associated with positive and negative affect in oldest-old adults, investigating age differences. Data from 306 octogenarians and centenarians who participated in Phase III of the Georgia Centenarian Study were used. Proximal variables included physical functioning, cognitive functioning, self-rated health, number of chronic conditions, social resources, and perceived economic status; distal variables included education, social productive activities, management of personal assets, and other learning experiences. Analysis of variance and block-wise regression analyses were conducted. Octogenarians showed significantly higher levels of positive emotion than centenarians. Cognitive functioning was significantly associated with positive affect, and number of health problems was significantly associated with negative affect after controlling for gender, ethnicity, residence, and marital status. Furthermore, four significant interaction effects suggested that positive affect significantly depended on the levels of cognitive and physical functioning among centenarians, whereas positive affect was dependent on the levels of physical health problems and learning experiences among octogenarians. Findings of this study addressed the importance of current and past experiences and resources in subjective well-being among oldest-old adults as a life-long process. Mechanisms connecting aging processes at the end of a long life to subjective well-being should be explored in future studies.
Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.
Landin, Mariana
2017-01-01
The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales from 25L to 600L. The combination of neurofuzzy logic and gene expression programing technologies allowed the modeling of the impeller power as a function of operation conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R 2 > 86.78%) for all size equipment. Gene expression programing allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by implementing additional characteristics of the process, as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control manufacturing process, using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.
2009-01-01
Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with an aim to providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. 
Utility thresholds are included in the objectives component, and ecological thresholds may be embedded in models projecting consequences of management actions. Decision thresholds are determined by the above-listed components of a structured decision process. These components may themselves vary over time, inducing variation in the decision thresholds inherited from them. These dynamic decision thresholds can then be determined using adaptive management. We provide numerical examples (based on patch occupancy models) of structured decision processes that include all three kinds of thresholds. © 2009 by the Ecological Society of America.
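The three kinds of thresholds can be made concrete with a one-step toy occupancy model (every number below is invented for illustration, not from the paper): management boosts colonization at a cost, the utility threshold is a target occupancy, and the decision threshold emerges as the occupancy at which the optimal action switches.

```python
def next_occupancy(p, manage):
    """One-step patch-occupancy projection (toy parameters)."""
    col = 0.35 if manage else 0.15   # colonization rate, boosted by management
    ext = 0.20                        # extinction rate
    return p + col * (1.0 - p) - ext * p

def utility(p, manage, target=0.5, cost=0.05):
    """Utility threshold: full value only once occupancy meets the target;
    management incurs a fixed cost."""
    return (1.0 if p >= target else p / target) - (cost if manage else 0.0)

def optimal_action(p):
    """Choose the action maximizing next-step expected utility."""
    return max((False, True), key=lambda a: utility(next_occupancy(p, a), a))

# The decision threshold is the occupancy below which "manage" is optimal,
# found here by scanning states from high to low.
threshold = next(p / 100.0 for p in range(100, -1, -1)
                 if optimal_action(p / 100.0))
print(threshold)   # → 0.49
```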
Functional-anatomic correlates of individual differences in memory.
Kirchhoff, Brenda A; Buckner, Randy L
2006-07-20
Memory abilities differ greatly across individuals. To explore a source of these differences, we characterized the varied strategies people adopt during unconstrained encoding. Participants intentionally encoded object pairs during functional MRI. Principal components analysis applied to a strategy questionnaire revealed that participants variably used four main strategies to aid learning. Individuals' use of verbal elaboration and visual inspection strategies independently correlated with their memory performance. Verbal elaboration correlated with activity in a network of regions that included prefrontal regions associated with controlled verbal processing, while visual inspection correlated with activity in a network of regions that included an extrastriate region associated with object processing. Activity in regions associated with use of these strategies was also correlated with memory performance. This study reveals functional-anatomic correlates of verbal and perceptual strategies that are variably used by individuals during encoding. These strategies engage distinct brain regions and may separately influence memory performance.
A Motivational Physical Activity Intervention for Improving Mobility in Older Korean Americans.
Yeom, Hye-A; Fleury, Julie
2014-07-01
There has been limited empirical support for interventions designed to promote physical activity targeting mobility in racially diverse older adults. This study aims to examine the effects of a Motivational Physical Activity Intervention (MPAI) on social resource, behavioral change process, physical activity, and mobility variables in sedentary older Korean Americans. A quasi-experimental, repeated-measures, pre- and post-test design was used. Sixty-four community-dwelling, sedentary older Korean Americans (n = 33 for the MPAI group, n = 31 for the attention control group) participated in the study. There were significant improvements in social resources, including social support from family and friends, and in behavioral change process variables, including self-efficacy, motivational appraisal, and self-regulation for physical activity. There were significant intervention effects on physical activity, walking endurance, and flexibility. The MPAI is supported as improving mobility and physical activity, as well as increasing motivation for physical activity, in older Korean Americans. © The Author(s) 2013.
Hierarchical Synthesis of Coastal Ecosystem Health Indicators at Karimunjawa National Marine Park
NASA Astrophysics Data System (ADS)
Danu Prasetya, Johan; Ambariyanto; Supriharyono; Purwanti, Frida
2018-02-01
The coastal ecosystem of Karimunjawa National Marine Park (KNMP) faces various pressures, including from human activity. Periodic monitoring of the health of coastal ecosystems is needed to evaluate their condition, and such monitoring requires systematic and consistent indicators. This paper presents a hierarchical synthesis of coastal ecosystem health indicators using the Analytic Hierarchy Process (AHP) method. The hierarchical synthesis is obtained by weighting paired comparisons based on expert judgments. The indicators comprise three levels of variables: main variables, sub-variables, and operational variables. The assessment yields three main variables: State of Ecosystem, Pressure, and Management. State of Ecosystem and Management obtained the same weight, 0.400, while Pressure was weighted 0.200. Each main variable consists of several sub-variables: coral reef, reef fish, mangrove, and seagrass for State of Ecosystem; fisheries and marine tourism activity for Pressure; and planning and regulation, institutional, and infrastructure and financing for Management. The highest-weighted sub-variables of State of Ecosystem, Pressure, and Management were coral reef (0.186), marine tourism pressure (0.133), and institutional (0.171), respectively. The highest-weighted operational variables were percent of coral cover (0.058), marine tourism pressure (0.133), and presence of a zonation plan, regulation, and socialization of the monitoring program (0.53), respectively. Potential pressure from marine tourism activity is the variable that most affects ecosystem health.
The results of this research suggest a need to develop stronger conservation strategies to cope with pressures from marine tourism activities.
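The AHP weighting step reduces to finding the principal eigenvector of a pairwise-comparison matrix. A minimal sketch using power iteration; the 3×3 matrix below encodes judgements consistent with the published main-variable weights (State of Ecosystem and Management each judged twice as important as Pressure), and is an illustrative reconstruction rather than the experts' actual matrix:

```python
def ahp_weights(M, iters=50):
    """Priority vector of a pairwise-comparison matrix via power iteration
    (the principal right eigenvector, normalised to sum to 1)."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Pairwise judgements consistent with the published main-variable weights:
# rows/cols are State of Ecosystem, Management, Pressure.
M = [[1.0, 1.0, 2.0],
     [1.0, 1.0, 2.0],
     [0.5, 0.5, 1.0]]
print([round(x, 2) for x in ahp_weights(M)])   # → [0.4, 0.4, 0.2]
```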
Integrating Decision Making and Mental Health Interventions Research: Research Directions
Wills, Celia E.; Holmes-Rovner, Margaret
2006-01-01
The importance of incorporating patient and provider decision-making processes is in the forefront of the National Institute of Mental Health (NIMH) agenda for improving mental health interventions and services. Key concepts in patient decision making are highlighted within a simplified model of patient decision making that links patient-level/“micro” variables to services-level/“macro” variables via the decision-making process that is a target for interventions. The prospective agenda for incorporating decision-making concepts in mental health research includes (a) improved measures for characterizing decision-making processes that are matched to study populations, complexity, and types of decision making; (b) testing decision aids in effectiveness research for diverse populations and clinical settings; and (c) improving the understanding and incorporation of preference concepts in enhanced intervention designs. PMID:16724158
Method of and apparatus for thermomagnetically processing a workpiece
Kisner, Roger A.; Rios, Orlando; Wilgen, John B.; Ludtka, Gerard M.; Ludtka, Gail M.
2014-08-05
A method of thermomagnetically processing a material includes disposing a workpiece within a bore of a magnet; exposing the workpiece to a magnetic field of at least about 1 Tesla generated by the magnet; and, while exposing the workpiece to the magnetic field, applying heat energy to the workpiece at a plurality of frequencies to achieve spatially-controlled heating of the workpiece. An apparatus for thermomagnetically processing a material comprises: a high field strength magnet having a bore extending therethrough for insertion of a workpiece therein; and an energy source disposed adjacent to an entrance to the bore. The energy source is an emitter of variable frequency heat energy, and the bore comprises a waveguide for propagation of the variable frequency heat energy from the energy source to the workpiece.
Conservation and Variability of Meiosis Across the Eukaryotes.
Loidl, Josef
2016-11-23
Comparisons among a variety of eukaryotes have revealed considerable variability in the structures and processes involved in their meiosis. Nevertheless, conventional forms of meiosis occur in all major groups of eukaryotes, including early-branching protists. This finding confirms that meiosis originated in the common ancestor of all eukaryotes and suggests that primordial meiosis may have had many characteristics in common with conventional extant meiosis. However, it is possible that the synaptonemal complex and the delicate crossover control related to its presence were later acquisitions. Later still, modifications to meiotic processes occurred within different groups of eukaryotes. Better knowledge of the spectrum of derived and uncommon forms of meiosis will improve our understanding of many still-mysterious aspects of the meiotic process and help to explain the evolutionary basis of functional adaptations to the meiotic program.
NASA Astrophysics Data System (ADS)
James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.; Niethammer, U.
2017-03-01
Structure-from-motion (SfM) algorithms greatly facilitate the production of detailed topographic models from photographs collected using unmanned aerial vehicles (UAVs). However, the survey quality achieved in published geomorphological studies is highly variable, and sufficient processing details are never provided to understand fully the causes of variability. To address this, we show how survey quality and consistency can be improved through a deeper consideration of the underlying photogrammetric methods. We demonstrate the sensitivity of digital elevation models (DEMs) to processing settings that have not been discussed in the geomorphological literature, yet are a critical part of survey georeferencing, and are responsible for balancing the contributions of tie and control points. We provide a Monte Carlo approach to enable geomorphologists to (1) carefully consider sources of survey error and hence increase the accuracy of SfM-based DEMs and (2) minimise the associated field effort by robust determination of suitable lower-density deployments of ground control. By identifying appropriate processing settings and highlighting photogrammetric issues such as over-parameterisation during camera self-calibration, processing artefacts are reduced and the spatial variability of error minimised. We demonstrate such DEM improvements with a commonly-used SfM-based software (PhotoScan), which we augment with semi-automated and automated identification of ground control points (GCPs) in images, and apply to two contrasting case studies - an erosion gully survey (Taroudant, Morocco) and an active landslide survey (Super-Sauze, France). In the gully survey, refined processing settings eliminated step-like artefacts of up to 50 mm in amplitude, and overall DEM variability with GCP selection improved from 37 to 16 mm. 
In the much more challenging landslide case study, our processing halved planimetric error to 0.1 m, effectively doubling the frequency at which changes in landslide velocity could be detected. In both case studies, the Monte Carlo approach provided a robust demonstration that field effort could be substantially reduced by deploying only approximately half the number of GCPs, with minimal effect on survey quality. To reduce processing artefacts and promote confidence in SfM-based geomorphological surveys, published results should include processing details, including the image residuals for both tie points and GCPs, and ensure that these are considered appropriately within the workflow.
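The Monte Carlo ground-control analysis described above amounts to repeatedly "deploying" a random subset of the GCPs and using the withheld points as independent checks on DEM error. The following is a minimal stdlib sketch of that idea, not the authors' PhotoScan workflow: the function name is invented, and a real implementation would re-run the bundle adjustment and DEM generation for each trial rather than reuse fixed residuals.

```python
import random
import statistics

def monte_carlo_gcp_error(points, n_control, n_trials=200, seed=0):
    """Estimate DEM vertical error for a given number of deployed GCPs.

    points: list of (observed_z, modelled_z) pairs for all candidate GCPs.
    In each trial, a random subset of size n_control is treated as the
    deployed control, and the remaining points act as independent check
    points. Returns the mean check-point RMSE across trials.
    """
    rng = random.Random(seed)
    rmses = []
    for _ in range(n_trials):
        subset = set(rng.sample(range(len(points)), n_control))
        checks = [p for i, p in enumerate(points) if i not in subset]
        residuals = [obs - mod for obs, mod in checks]
        rmses.append((sum(r * r for r in residuals) / len(residuals)) ** 0.5)
    return statistics.mean(rmses)
```

Sweeping `n_control` downward and watching when the mean RMSE degrades is the kind of evidence the study uses to argue that roughly half the GCPs could be omitted.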
Human Language Technology: Opportunities and Challenges
2005-01-01
because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ...to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using...maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with
Ramsdal, Gro; Bergvik, Svein; Wynn, Rolf
2015-01-01
Poor academic performance is a strong predictor of school dropout. Researchers have tried to disentangle variables influencing academic performance. However, preschool and early care variables are seldom examined when explaining the school dropout process. We reviewed the literature on the relationship between caregiver-child attachment and academic performance, including attachment studies from the preschool years, seeking out potential contributions to academic performance and the dropout process. The review was organized according to a model of four main mediating hypotheses: the attachment-teaching hypothesis, the social network hypothesis, the attachment-cooperation hypothesis, and the attachment self-regulation hypothesis. The results of the review are summed up in a model. There is some support for all four hypotheses. The review indicates that attachment and early care contribute substantially to dropout and graduation processes. Mediation effects should be given far more attention in future research.
The Effects of Age, Years of Experience, and Type of Experience in the Teacher Selection Process
ERIC Educational Resources Information Center
Vail, David Scott
2010-01-01
Paper screening in the pre-selection process of hiring teachers has been the focus in an ongoing series of similar studies starting with Allison in 1981. There have been many independent variables, including, but not limited to, age, gender, ethnic background, years of experience, type of experience, and grade point average, introduced into the…
Results from the VALUE perfect predictor experiment: process-based evaluation
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Soares, Pedro; Hertig, Elke; Brands, Swen; Huth, Radan; Cardoso, Rita; Kotlarski, Sven; Casado, Maria; Pongracz, Rita; Bartholy, Judit
2016-04-01
Until recently, the evaluation of downscaled climate model simulations has typically been limited to surface climatologies, including long term means, spatial variability and extremes. But these aspects are often, at least partly, tuned in regional climate models to match observed climate. The tuning issue is of course particularly relevant for bias corrected regional climate models. In general, a good performance of a model for these aspects in present climate does therefore not imply a good performance in simulating climate change. It is now widely accepted that, to increase our confidence in climate change simulations, it is necessary to evaluate how climate models simulate relevant underlying processes. In other words, it is important to assess whether downscaling does the right thing for the right reason. Therefore, VALUE has carried out a broad process-based evaluation study based on its perfect predictor experiment simulations: the downscaling methods are driven by ERA-Interim data over the period 1979-2008, and reference observations are given by a network of 85 meteorological stations covering all European climates. More than 30 methods participated in the evaluation. In order to compare statistical and dynamical methods, only variables provided by both types of approaches could be considered. This limited the analysis to conditioning local surface variables on variables from driving processes that are simulated by ERA-Interim. We considered the following types of processes: at the continental scale, we evaluated the performance of downscaling methods for positive and negative North Atlantic Oscillation, Atlantic ridge and blocking situations. At synoptic scales, we considered Lamb weather types for selected European regions such as Scandinavia, the United Kingdom, the Iberian Peninsula or the Alps. At regional scales we considered phenomena such as the Mistral, the Bora or the Iberian coastal jet. 
Such process-based evaluation helps to attribute biases in surface variables to underlying processes and ultimately to improve climate models.
Integrative medicine for chronic pain
Saha, Felix J.; Brüning, Alexander; Barcelona, Cyrus; Büssing, Arndt; Langhorst, Jost; Dobos, Gustav; Lauche, Romy; Cramer, Holger
2016-01-01
Abstract Introduction: Integrative medicine inpatient treatment has been shown to improve physical and mental health in patients with internal medicine conditions. The aim of this study was to investigate the effectiveness of a 2-week integrative medicine inpatient treatment in patients with chronic pain syndromes and the association of treatment success with patient-related process variables. Methods: Inpatients with chronic pain syndromes participating in a 2-week integrative medicine inpatient program were included. Patients' pain intensity, pain disability, pain perception, quality of life, depression, and perceived stress were measured on admission, discharge, and 6 months after discharge. Likewise, process variables including ability and will to change, emotional/rational disease acceptance, mindfulness, life and health satisfaction, and easiness of life were assessed. Results: A total of 310 inpatients (91% female, mean age 50.7 ± 12.4 years, 26.5% low back pain, and 22.9% fibromyalgia) were included. Using mixed linear models, significant improvements in pain intensity, pain disability, pain perception, quality of life, depression, and perceived stress were found (all P < 0.05). Ability to change and implementation, disease acceptance, mindfulness, life and health satisfaction, and light heartedness/easiness likewise improved (all P < 0.05). Improved outcomes were associated with increases in process variables, mainly ability to change and implementation, disease acceptance, life and health satisfaction, and light heartedness/easiness (R2 = 0.03–0.40). Conclusions: Results of this study suggest that a 2-week integrative medicine inpatient treatment can benefit patients with chronic pain conditions. Functional improvements are associated with improved ability to change and implementation, disease acceptance, and satisfaction. PMID:27399133
Functional variability of habitats within the Sacramento-San Joaquin Delta: Restoration implications
Lucas, L.V.; Cloern, J.E.; Thompson, J.K.; Monsen, N.E.
2002-01-01
We have now entered an era of large-scale attempts to restore ecological functions and biological communities in impaired ecosystems. Our knowledge base of complex ecosystems and interrelated functions is limited, so the outcomes of specific restoration actions are highly uncertain. One approach for exploring that uncertainty and anticipating the range of possible restoration outcomes is comparative study of existing habitats similar to future habitats slated for construction. Here we compare two examples of one habitat type targeted for restoration in the Sacramento-San Joaquin River Delta. We compare one critical ecological function provided by these shallow tidal habitats - production and distribution of phytoplankton biomass as the food supply to pelagic consumers. We measured spatial and short-term temporal variability of phytoplankton biomass and growth rate and quantified the hydrodynamic and biological processes governing that variability. Results show that the production and distribution of phytoplankton biomass can be highly variable within and between nearby habitats of the same type, due to variations in phytoplankton sources, sinks, and transport. Therefore, superficially similar, geographically proximate habitats can function very differently, and that functional variability introduces large uncertainties into the restoration process. Comparative study of existing habitats is one way ecosystem science can elucidate and potentially minimize restoration uncertainties, by identifying processes shaping habitat functionality, including those that can be controlled in the restoration design.
Nursing Homes Appeals of Deficiency Citations: The Informal Dispute Resolution Process
Mukamel, Dana B.; Weimer, David L.; Li, Yue; Bailey, Lauren; Spector, William D.; Harrington, Charlene
2012-01-01
Objective Nursing homes found to be not meeting quality standards are cited for deficiencies. Before 1995, their only recourse was a formal appeal process, which is lengthy and costly. In 1995, the Centers for Medicare & Medicaid Services (CMS) instituted the Informal Dispute Resolution (IDR) process. This study presents for the first time national statistics about the IDR process and an analysis of the factors that influence nursing homes’ decisions to request an IDR. Design Retrospective study including descriptive statistics and multivariate logistic hierarchical models. Setting U.S. nursing homes in 2005 to 2008. Participant 15,916 Medicaid and Medicare certified nursing homes nationally, with 94,188 surveys and 9,388 IDRs. Measures The unit of observation was an annual survey or a complaint survey that generated at least one deficiency. The dependent variable was dichotomous and indicated whether the annual or a complaint survey triggered an IDR request. Independent variables included characteristics of the nursing home, the deficiency, the market, and the state regulatory environment. Results Ten percent of all annual surveys and complaint surveys resulted in IDRs. There was substantial variation across states, which persisted over time. Multivariate results suggest that nursing homes’ decisions to request an IDR depend on their assessment of the probability of success and assessment of the benefits of the submission. Conclusions Nursing homes avail themselves of the IDR process. Their propensity to do so depends on a number of factors, including the state regulatory system and the market environment in which they operate. PMID:22402171
McGrath, Lauren M; Pennington, Bruce F; Shanahan, Michelle A; Santerre-Lemmon, Laura E; Barnard, Holly D; Willcutt, Erik G; Defries, John C; Olson, Richard K
2011-05-01
This study tests a multiple cognitive deficit model of reading disability (RD), attention-deficit/hyperactivity disorder (ADHD), and their comorbidity. A structural equation model (SEM) of multiple cognitive risk factors and symptom outcome variables was constructed. The model included phonological awareness as a unique predictor of RD and response inhibition as a unique predictor of ADHD. Processing speed, naming speed, and verbal working memory were modeled as potential shared cognitive deficits. Model fit indices from the SEM indicated satisfactory fit. Closer inspection of the path weights revealed that processing speed was the only cognitive variable with significant unique relationships to RD and ADHD dimensions, particularly inattention. Moreover, the significant correlation between reading and inattention was reduced to non-significance when processing speed was included in the model, suggesting that processing speed primarily accounted for the phenotypic correlation (or comorbidity) between reading and inattention. This study illustrates the power of a multiple deficit approach to complex developmental disorders and psychopathologies, particularly for exploring comorbidities. The theoretical role of processing speed in the developmental pathways of RD and ADHD and directions for future research are discussed. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.
Code of Federal Regulations, 2012 CFR
2012-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
Code of Federal Regulations, 2010 CFR
2010-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
Code of Federal Regulations, 2014 CFR
2014-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
Code of Federal Regulations, 2013 CFR
2013-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
Code of Federal Regulations, 2011 CFR
2011-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
ORES - Objective Referenced Evaluation in Science.
ERIC Educational Resources Information Center
Shaw, Terry
Science process skills considered important in making decisions and solving problems include: observing, classifying, measuring, using numbers, using space/time relationships, communicating, predicting, inferring, manipulating variables, making operational definitions, forming hypotheses, interpreting data, and experimenting. This 60-item test,…
Impact of climate variability on tropospheric ozone.
Grewe, Volker
2007-03-01
A simulation with the climate-chemistry model (CCM) E39/C is presented, which covers both troposphere and stratosphere dynamics and chemistry during the period 1960 to 1999. Although the CCM, by its nature, does not exactly represent observed day-by-day meteorology, the model shows an overall tendency to correctly reproduce the variability pattern, owing to the inclusion of realistic external forcings such as observed sea surface temperatures (e.g. El Niño), major volcanic eruptions, the solar cycle, concentrations of greenhouse gases, and the Quasi-Biennial Oscillation. Additionally, climate-chemistry interactions are included, such as the impact of ozone, methane, and other species on radiation and dynamics, and the impact of dynamics on emissions (lightning). However, a number of important feedbacks are not yet included (e.g. feedbacks related to biogenic emissions and emissions due to biomass burning). The results show a good representation of the evolution of the stratospheric ozone layer, including the ozone hole, which plays an important role in the simulation of natural variability of tropospheric ozone. Anthropogenic NO(x) emissions are included with a step-wise linear trend for each sector, but no interannual variability is included. The application of a number of diagnostics (e.g. marked ozone tracers) allows the separation of the impact of various processes/emissions on tropospheric ozone and shows that the simulated Northern Hemisphere tropospheric ozone budget is dominated not only by nitrogen oxide emissions and other ozone precursors, but also by changes of the stratospheric ozone budget and its flux into the troposphere, which tends to reduce the simulated positive trend in tropospheric ozone due to emissions from industry and traffic during the late 80s and early 90s. 
For tropical regions the variability in ozone is dominated by variability in lightning (related to ENSO) and stratosphere-troposphere exchange (related to Northern Hemisphere stratospheric dynamics and solar activity). Since only tropospheric background chemistry is considered, the results are quantitatively limited with respect to derived trends. However, the main results are regarded as robust. Although the horizontal resolution is rather coarse in comparison to regional models, such simulations provide useful and necessary information on the impact of large-scale processes and inter-annual/decadal variations on regional air quality.
NASA Technical Reports Server (NTRS)
Alkire, K.
1984-01-01
A nonlinear analysis, which is necessary to adequately model elastic helicopter rotor blades experiencing moderately large deformations, was examined. The analysis must be based on an appropriate description of the blade's deformation geometry, including elastic bending and twist. Built-in pretwist angles complicate the deformation process and its definition. Relationships between the twist variables associated with different rotation sequences and corresponding forms of the transformation matrix are listed. Relationships between the twist variables are given for two cases: where the pretwist is applied first, and where the pretwist is combined with the deformation twist. Many of the corresponding forms of the transformation matrix for the two cases are listed. It is shown that twist variables connected with the combined twist treatment are related to those where the pretwist is applied initially. A method to determine these relationships and some results are outlined. A procedure to evaluate the transformation matrix that eliminates the Euler-like sequence altogether is demonstrated. The resulting form of the transformation matrix is unaffected by rotation sequence or pretwist treatment.
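The dependence of the transformation matrix on rotation sequence discussed above stems from the non-commutativity of finite rotations. A small illustrative sketch follows; the choice of axes (twist about x, bending about z) and the angle values are hypothetical, not taken from the report.

```python
import math

def rot_x(a):
    """Rotation matrix for angle a (rad) about the x axis."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(a):
    """Rotation matrix for angle a (rad) about the z axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

pretwist = math.radians(10.0)  # built-in pretwist angle (illustrative)
bending = math.radians(5.0)    # bending rotation (illustrative)

# Applying the pretwist before versus after the bending rotation gives
# different transformation matrices: finite rotations do not commute.
T1 = matmul(rot_x(pretwist), rot_z(bending))
T2 = matmul(rot_z(bending), rot_x(pretwist))
```

Both T1 and T2 are proper orthonormal matrices, but they differ term by term, which is why the twist variables of different Euler-like sequences must be explicitly related to one another.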
Development of a working Hovercraft model
NASA Astrophysics Data System (ADS)
Noor, S. H. Mohamed; Syam, K.; Jaafar, A. A.; Mohamad Sharif, M. F.; Ghazali, M. R.; Ibrahim, W. I.; Atan, M. F.
2016-02-01
This paper presents the development process to fabricate a working hovercraft model. The purpose of this study is to design and investigate a fully functional hovercraft, based on the studies that had been done. Different designs of the hovercraft model were made and tested, but only one of the models is presented in this paper. In this thesis, the weight, the thrust, the lift, and the drag force of the model were measured, and the electrical and mechanical parts are also presented. The processing unit of this model is an Arduino Uno, using a PSP2 (PlayStation 2) gamepad as the controller. Since the prototype should function on all kinds of terrain, the model was also tested in different floor conditions, including water, grass, cement, and tile. The speed of the model was measured in every case as the response variable, current (I) as the manipulated variable, and voltage (V) as the constant variable.
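A basic feasibility check behind the weight and lift measurements mentioned above is that, at hover, the air cushion must support the craft's weight, so the required cushion pressure is weight divided by cushion area. A minimal sketch of that arithmetic; the numbers are illustrative, not measurements from the paper.

```python
def cushion_pressure(weight_n, cushion_area_m2):
    """Required cushion gauge pressure (Pa) for hover: p = W / A."""
    return weight_n / cushion_area_m2

# Illustrative model: 50 N craft weight over a 0.25 m^2 cushion
p_required = cushion_pressure(50.0, 0.25)  # Pa
```

If the lift fan cannot sustain this pressure at the required airflow, the skirt will not inflate fully and the model will drag on the surface, which is why testing on water, grass, cement, and tile stresses the design differently.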
Inter-individual cognitive variability in children with Asperger's syndrome
Gonzalez-Gadea, Maria Luz; Tripicchio, Paula; Rattazzi, Alexia; Baez, Sandra; Marino, Julian; Roca, Maria; Manes, Facundo; Ibanez, Agustin
2014-01-01
Multiple studies have tried to establish the distinctive profile of individuals with Asperger's syndrome (AS). However, recent reports suggest that adults with AS feature heterogeneous cognitive profiles. The present study explores inter-individual variability in children with AS through group comparison and multiple case series analysis. All participants completed an extended battery including measures of fluid and crystallized intelligence, executive functions, theory of mind, and classical neuropsychological tests. Significant group differences were found in theory of mind and other domains related to global information processing. However, the AS group showed high inter-individual variability (both sub- and supra-normal performance) on most cognitive tasks. Furthermore, high fluid intelligence correlated with less general cognitive impairment, high cognitive flexibility, and speed of motor processing. In light of these findings, we propose that children with AS are characterized by a distinct, uneven pattern of cognitive strengths and weaknesses. PMID:25132817
Examining the sources of variability in cell culture media used for biopharmaceutical production.
McGillicuddy, Nicola; Floris, Patrick; Albrecht, Simone; Bones, Jonathan
2018-01-01
Raw materials, in particular cell culture media, represent a significant source of variability to biopharmaceutical manufacturing processes that can detrimentally affect cellular growth, viability and specific productivity or alter the quality profile of the expressed therapeutic protein. The continual expansion of the biopharmaceutical industry is creating an increasing demand on the production and supply chain consistency for cell culture media, especially as companies embrace intensive continuous processing. Here, we provide a historical perspective regarding the transition from serum containing to serum-free media, the development of chemically-defined cell culture media for biopharmaceutical production using industrial scale bioprocesses and review production mechanisms for liquid and powder culture media. An overview and critique of analytical approaches used for the characterisation of cell culture media and the identification of root causes of variability are also provided, including in-depth liquid phase separations, mass spectrometry and spectroscopic methods.
Snedden, Gregg
2014-01-01
Understanding how circulation and mixing processes in coastal navigation canals influence the exchange of salt between marshes and the coastal ocean, and how those processes are modulated by external physical processes, is critical to anticipating effects of future actions and circumstances. Examples of such circumstances include deepening the channel, placement of locks in the channel, changes in freshwater discharge down the channel, changes in outer continental shelf (OCS) vessel traffic volume, and sea level rise. The study builds on previous BOEM-funded studies by investigating salt flux variability through the Houma Navigation Canal (HNC). It examines how external physical factors, such as buoyancy forcing and mixing from tidal stirring and OCS vessel wakes, influence dispersive and advective fluxes through the HNC and the impact of this salt flux on salinity in nearby marshes. This study quantifies salt transport processes and salinity variability in the HNC and surrounding Terrebonne marshes. Data collected for this study include time-series data of salinity and velocity in the HNC, monthly salinity-depth profiles along the length of the channel, hourly vertical profiles of velocity and salinity over multiple tidal cycles, and salinity time series data at three locations in the surrounding marshes along a transect of increasing distance from the HNC. Two modes of vertical current structure were identified. The first mode, making up 90% of the total flow field variability, strongly resembled a barotropic current structure and was coherent with alongshelf wind stress over the coastal Gulf of Mexico. The second mode was indicative of gravitational circulation and was linked to variability in tidal stirring and the longitudinal salinity gradients along the channel's length. Diffusive processes were the dominant drivers of up-estuary salt transport, except during periods of minimal tidal stirring when gravitational circulation became more important. 
Salinity in the surrounding marshes was much more responsive to salinity variations in the HNC than it was to variations in the lower Terrebonne marshes, suggesting that the HNC is the primary conduit for saltwater intrusion to the middle Terrebonne marshes. Finally, salt transport to the middle Terrebonne marshes directly associated with vessel wakes was negligible.
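The split between advective and dispersive salt fluxes described above is conventionally computed by decomposing the averaged flux into a mean part and a fluctuating (primed) part. A minimal sketch of that standard decomposition follows; it is not the study's actual code, and the variable names are illustrative.

```python
def salt_flux_decomposition(discharge, salinity):
    """Decompose the record-averaged salt flux F = <Q s> into an
    advective part <Q><s> and a dispersive part <Q' s'>, where primes
    denote deviations from the record means (a standard estuarine
    flux decomposition)."""
    n = len(discharge)
    q_mean = sum(discharge) / n
    s_mean = sum(salinity) / n
    total = sum(q * s for q, s in zip(discharge, salinity)) / n
    advective = q_mean * s_mean
    dispersive = total - advective  # covariance of Q and s
    return advective, dispersive
```

With tidally resolved Q and s records, the dispersive (covariance) term captures tidal-stirring-driven transport, while the advective term reflects the mean throughflow, matching the study's finding that diffusive processes usually dominate up-estuary transport.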
Rule, Michael E.; Vargas-Irwin, Carlos; Donoghue, John P.; Truccolo, Wilson
2015-01-01
Understanding the sources of variability in single-neuron spiking responses is an important open problem for the theory of neural coding. This variability is thought to result primarily from spontaneous collective dynamics in neuronal networks. Here, we investigate how well collective dynamics reflected in motor cortex local field potentials (LFPs) can account for spiking variability during motor behavior. Neural activity was recorded via microelectrode arrays implanted in ventral and dorsal premotor and primary motor cortices of non-human primates performing naturalistic 3-D reaching and grasping actions. Point process models were used to quantify how well LFP features accounted for spiking variability not explained by the measured 3-D reach and grasp kinematics. LFP features included the instantaneous magnitude, phase and analytic-signal components of narrow band-pass filtered (δ,θ,α,β) LFPs, and analytic signal and amplitude envelope features in higher-frequency bands. Multiband LFP features predicted single-neuron spiking (1ms resolution) with substantial accuracy as assessed via ROC analysis. Notably, however, models including both LFP and kinematics features displayed marginal improvement over kinematics-only models. Furthermore, the small predictive information added by LFP features to kinematic models was redundant to information available in fast-timescale (<100 ms) spiking history. Overall, information in multiband LFP features, although predictive of single-neuron spiking during movement execution, was redundant to information available in movement parameters and spiking history. Our findings suggest that, during movement execution, collective dynamics reflected in motor cortex LFPs primarily relate to sensorimotor processes directly controlling movement output, adding little explanatory power to variability not accounted by movement parameters. PMID:26157365
NASA Astrophysics Data System (ADS)
Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Subramanian, Aneesh; Weisheimer, Antje; Christensen, Hannah; Juricke, Stephan; Palmer, Tim
2016-04-01
The PRACE Climate SPHINX project investigates the sensitivity of climate simulations to model resolution and stochastic parameterization. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in 30-years climate integrations as a function of model resolution (from 80km up to 16km for the atmosphere). The experiments include more than 70 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), using RCP8.5 CMIP5 forcing. A total amount of 20 million core hours will be used at end of the project (March 2016) and about 150 TBytes of post-processed data will be available to the climate community. Preliminary results show a clear improvement in the representation of climate variability over the Euro-Atlantic following resolution increase. More specifically, the well-known atmospheric blocking negative bias over Europe is definitely resolved. High resolution runs also show improved fidelity in representation of tropical variability - such as the MJO and its propagation - over the low resolution simulations. It is shown that including stochastic parameterization in the low resolution runs help to improve some of the aspects of the MJO propagation further. These findings show the importance of representing the impact of small scale processes on the large scale climate variability either explicitly (with high resolution simulations) or stochastically (in low resolution simulations).
Assessing groundwater vulnerability to agrichemical contamination in the Midwest US
Burkart, M.R.; Kolpin, D.W.; James, D.E.
1999-01-01
Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
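The overlay-and-index method described above reduces, for each map cell, to intersecting the factor layers and computing a weighted average of their ratings. A minimal sketch, loosely in the style of DRASTIC-type indices; the factor names, ratings, and weights are hypothetical.

```python
def vulnerability_index(ratings, weights):
    """Weighted-overlay vulnerability score for one map cell.

    ratings: dict of factor name -> rating (e.g. 1-10, higher means
    more vulnerable); weights: dict of factor name -> relative weight.
    Returns the weighted average rating for the cell.
    """
    total_weight = sum(weights[k] for k in ratings)
    return sum(ratings[k] * weights[k] for k in ratings) / total_weight
```

In a GIS workflow this function would be applied cell by cell to the intersected intrinsic (soil, climate, hydrogeology) and specific (chemical use) layers to produce the regional vulnerability map.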
Carbothermic Synthesis of ~820-μm UN Kernels. Investigation of Process Variables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindemer, Terrence; Silva, Chinthaka M; Henry, Jr, John James
2015-06-01
This report details the continued investigation of process variables involved in converting sol-gel-derived urania-carbon microspheres to ~820-μm-dia. UN fuel kernels in flow-through, vertical refractory-metal crucibles at temperatures up to 2123 K. Experiments included calcining of air-dried UO3-H2O-C microspheres in Ar and H2-containing gases, conversion of the resulting UO2-C kernels to dense UO2:2UC in the same gases and vacuum, and its conversion in N2 to UC1-xNx. The thermodynamics of the relevant reactions were applied extensively to interpret and control the process variables. Producing the precursor UO2:2UC kernel of ~96% theoretical density was required, but its subsequent conversion to UC1-xNx at 2123 K was not accompanied by sintering and resulted in ~83-86% of theoretical density. Decreasing the UC1-xNx kernel carbide component via HCN evolution was shown to be quantitatively consistent with present and past experiments and the only useful application of H2 in the entire process.
Orlandini, S; Pasquini, B; Caprini, C; Del Bubba, M; Squarcialupi, L; Colotta, V; Furlanetto, S
2016-09-30
A comprehensive strategy involving the use of mixture-process variable (MPV) approach and Quality by Design principles has been applied in the development of a capillary electrophoresis method for the simultaneous determination of the anti-inflammatory drug diclofenac and its five related substances. The selected operative mode consisted in microemulsion electrokinetic chromatography with the addition of methyl-β-cyclodextrin. The critical process parameters included both the mixture components (MCs) of the microemulsion and the process variables (PVs). The MPV approach allowed the simultaneous investigation of the effects of MCs and PVs on the critical resolution between diclofenac and its 2-deschloro-2-bromo analogue and on analysis time. MPV experiments were used both in the screening phase and in the Response Surface Methodology, making it possible to draw MCs and PVs contour plots and to find important interactions between MCs and PVs. Robustness testing was carried out by MPV experiments and validation was performed following International Conference on Harmonisation guidelines. The method was applied to a real sample of diclofenac gastro-resistant tablets. Copyright © 2016 Elsevier B.V. All rights reserved.
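A mixture-process variable design like the one above crosses candidate mixture blends (whose components sum to one) with a factorial in the process variables, so that MC-by-PV interactions can be estimated. A minimal sketch; the blend compositions and PV levels below are invented for illustration, not the published experimental design.

```python
from itertools import product

def mpv_design(mixture_points, pv_levels):
    """Cross a set of mixture blends with a full factorial in the
    process variables to obtain a mixture-process variable (MPV)
    design: one run per (blend, PV-combination) pair."""
    return [(blend, pvs) for blend, pvs in product(mixture_points, pv_levels)]

# Illustrative: 3 microemulsion blends (components sum to 1) crossed
# with a 2^2 factorial in two hypothetical PVs (voltage kV, temp deg C)
blends = [(0.8, 0.1, 0.1), (0.7, 0.2, 0.1), (0.7, 0.1, 0.2)]
pv_levels = list(product([20, 30], [15, 25]))
design = mpv_design(blends, pv_levels)  # 3 blends x 4 PV combos = 12 runs
```

Fitting a response model to such a design is what lets the contour plots show how resolution and analysis time respond jointly to MCs and PVs.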
Hinojosa, José A.; Rincón-Pérez, Irene; Romero-Ferreiro, Mª Verónica; Martínez-García, Natalia; Villalba-García, Cristina; Montoro, Pedro R.; Pozo, Miguel A.
2016-01-01
The current study presents ratings by 540 Spanish native speakers for dominance, familiarity, subjective age of acquisition (AoA), and sensory experience (SER) for the 875 Spanish words included in the Madrid Affective Database for Spanish (MADS). The norms can be downloaded as supplementary materials for this manuscript from https://figshare.com/s/8e7b445b729527262c88. These ratings may be of potential relevance to researchers who are interested in characterizing the interplay between language and emotion. Additionally, with the aim of investigating how the affective features interact with the lexicosemantic properties of words, we performed correlational analyses between norms for familiarity, subjective AoA, and SER, and scores for those affective variables which are currently included in the MADS. A distinct pattern of significant correlations with affective features was found for different lexicosemantic variables. These results show that familiarity, subjective AoA, and SER may have independent effects on the processing of emotional words. They also suggest that these psycholinguistic variables should be fully considered when formulating theoretical approaches to the processing of affective language. PMID:27227521
Singh, Sushil K; Muthukumarappan, Kasiviswanathan
2016-04-01
Soy white flakes (SWF) are an intermediate product of soybean processing. They are an untoasted, inexpensive material containing around 51% crude protein and are therefore a potential protein source to replace fish meal in aquafeed. The extrusion process is versatile and is used for the development of aquafeed. Our objective was to study the effects of inclusion of SWF (up to 50%) and other extrusion processing parameters, such as barrel temperature and screw speed, on the properties of aquafeed extrudates using a single-screw extruder. Extrudate properties, including pellet durability index, bulk density, water absorption and solubility indices, and mass flow rate, were significantly (P < 0.05) affected by the process variables. SWF was the most significant variable, with quadratic effects on most of the properties. Increasing temperature and screw speed increased the durability and mass flow rate of extrudates. Response surface regression models were established to correlate the properties of extrudates to the process variables. SWF was used as an alternative protein source to fish meal. Our study shows that aquafeed with high durability, lower bulk density, lower water absorption, and higher solubility indices can be obtained by adding SWF up to 40%. © 2015 Society of Chemical Industry.
Sacha, Gregory A; Schmitt, William J; Nail, Steven L
2006-01-01
The critical processing parameters affecting average particle size, particle size distribution, yield, and level of residual carrier solvent using the supercritical anti-solvent method (SAS) were identified. Carbon dioxide was used as the supercritical fluid. Methylprednisolone acetate was used as the model solute in tetrahydrofuran. Parameters examined included pressure of the supercritical fluid, agitation rate, feed solution flow rate, impeller diameter, and nozzle design. Pressure was identified as the most important process parameter affecting average particle size, either through the effect of pressure on dispersion of the feed solution into the precipitation vessel or through the effect of pressure on solubility of drug in the CO2/organic solvent mixture. Agitation rate, impeller diameter, feed solution flow rate, and nozzle design had significant effects on particle size, which suggests that dispersion of the feed solution is important. Crimped HPLC tubing was the most effective method of introducing feed solution into the precipitation vessel, largely because it resulted in the least amount of clogging during the precipitation. Yields of 82% or greater were consistently produced and were not affected by the processing variables. Similarly, the level of residual solvent was independent of the processing variables and was present at 0.0002% wt/wt THF or less.
Brooks, Kriston P; Holladay, Jamelyn D; Simmons, Kevin L; Herling, Darrell R
2014-11-18
An on-board hydride storage system and process are described. The system includes a slurry storage system that includes a slurry reactor and a variable-concentration slurry. In one preferred configuration, the storage system stores a slurry containing a hydride storage material in a carrier fluid at a first concentration of hydride solids. The slurry reactor receives the slurry containing a second concentration of the hydride storage material and releases hydrogen as a fuel to hydrogen-powered devices and vehicles.
Maulidiani; Rudiyanto; Abas, Faridah; Ismail, Intan Safinar; Lajis, Nordin H
2018-06-01
Optimization is an important aspect of natural product extraction. Herein, an alternative approach is proposed for optimizing extractions, namely Generalized Likelihood Uncertainty Estimation (GLUE). The approach combines Latin hypercube sampling, the feasible ranges of the independent variables, Monte Carlo simulation, and threshold criteria on the response variables. The GLUE method is tested on three different techniques, including ultrasound-, microwave-, and supercritical-CO2-assisted extractions, using data from previously published reports. The study found that this method can: provide more information on the combined effects of the independent variables on the response variables through dotty plots; deal with an unlimited number of independent and response variables; consider multiple combined threshold criteria, which are subjective and depend on the target of the investigation; and provide a range of optimal values together with their distribution. Copyright © 2018 Elsevier Ltd. All rights reserved.
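As a rough illustration of the GLUE workflow described above, the sketch below draws a Latin hypercube sample over assumed feasible ranges of two extraction variables, evaluates an invented quadratic yield model in a Monte Carlo pass, and keeps only the parameter sets that satisfy a threshold criterion on the response. The variable names, ranges, and model coefficients are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, bounds):
    """Latin hypercube sample of n points within per-variable bounds."""
    d = len(bounds)
    # one stratified uniform draw per variable, shuffled independently
    u = (rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T + rng.random((n, d))) / n
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

def yield_model(x):
    # hypothetical response surface: extraction yield vs. temperature and time
    t, m = x[:, 0], x[:, 1]
    return 40 + 0.5 * t - 0.004 * t**2 + 2.0 * m - 0.05 * m**2

bounds = [(30.0, 70.0), (5.0, 30.0)]   # assumed temperature (degC) and time (min) ranges
X = latin_hypercube(5000, bounds)
y = yield_model(X)

# threshold criterion on the response: keep the top 5% ("behavioural") sets
behavioural = X[y >= np.quantile(y, 0.95)]
print(behavioural.min(axis=0), behavioural.max(axis=0))
```

The spread of the retained points (rather than a single optimum) is what distinguishes GLUE-style output from a classical point optimization.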
Foroughi, Nasim; Smith, Richard; Vanwanseele, Benedicte
2009-10-01
Osteoarthritis (OA) is a musculoskeletal disorder primarily affecting the older population and resulting in chronic pain and disability. Biomechanical variables associated with OA severity, such as the external knee adduction moment (KAM) and joint malalignment, may affect the disease process by altering the bone-on-bone forces during gait. To investigate the association between biomechanical variables and KAM in knee OA, a systematic search of published studies' titles and abstracts was performed on Ovid Medline, Cumulative Index to Nursing and Allied Health, PREMEDLINE, EBM Reviews and SPORTDiscus. Fourteen studies met the inclusion criteria and were considered for the review. The magnitude and time course of KAM during gait appeared to be consistent across laboratories and computational methods. Only two of the included studies that compared patients with OA to a control group reported a higher peak KAM for the OA group. Knee adduction moment increased with OA severity and was directly proportional to varus malalignment. Classifying the patients on the basis of disease severity decreased the group variability, permitting the differences to be more detectable. Biomechanical variables such as varus malalignment are associated with KAM and therefore may affect the disease process. These variables should be taken into consideration when developing therapeutic interventions for individuals suffering from knee OA.
Modeling heart rate variability including the effect of sleep stages
NASA Astrophysics Data System (ADS)
Soliński, Mateusz; Gierałtowski, Jan; Żebrowski, Jan
2016-02-01
We propose a model for heart rate variability (HRV) of a healthy individual during sleep with the assumption that heart rate variability is predominantly a random process. Autonomic nervous system activity has different properties during different sleep stages, and this affects many physiological systems including the cardiovascular system. Different properties of HRV can be observed during each particular sleep stage. We believe that taking into account the sleep architecture is crucial for modeling human nighttime HRV. The stochastic model of HRV introduced by Kantelhardt et al. was used as the initial starting point. We studied the statistical properties of sleep in healthy adults, analyzing 30 polysomnographic recordings, which provided realistic information about sleep architecture. Next, we generated synthetic hypnograms and included them in the modeling of nighttime RR interval series. The results of standard HRV linear analysis and of nonlinear analysis (Shannon entropy, Poincaré plots, and multiscale multifractal analysis) show that—in comparison with real data—the HRV signals obtained from our model have very similar properties, in particular including the multifractal characteristics at different time scales. The model described in this paper is discussed in the context of normal sleep. However, its construction is such that it should allow modeling of heart rate variability in sleep disorders. This possibility is briefly discussed.
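The central idea above, driving a stochastic RR-interval generator with a hypnogram so that each sleep stage imposes its own statistics, can be sketched in toy form as below. The stage parameters and the hypnogram are invented for illustration and are not the fitted values or the Kantelhardt et al. model from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative stage-dependent RR statistics: (mean RR in s, RR std in s).
# Real parameters would be estimated from polysomnographic recordings.
stage_params = {
    "wake":  (0.80, 0.05),
    "light": (0.95, 0.04),
    "deep":  (1.05, 0.02),
    "rem":   (0.85, 0.06),
}

# A synthetic hypnogram: (stage, duration in minutes) epochs
hypnogram = [("wake", 10), ("light", 40), ("deep", 30),
             ("light", 20), ("rem", 25), ("light", 30)]

rr = []
for stage, minutes in hypnogram:
    mu, sd = stage_params[stage]
    t = 0.0
    while t < minutes * 60.0:          # fill the epoch beat by beat
        beat = max(0.4, rng.normal(mu, sd))  # clip implausibly short beats
        rr.append(beat)
        t += beat
rr = np.array(rr)
print(len(rr), rr.mean())
```

Concatenating stage-wise generators in this way is what produces the non-stationary, sleep-architecture-dependent statistics the abstract emphasizes.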
Ahun, Marilyn N; Côté, Sylvana M
2018-06-06
Despite the abundance of research investigating the associations between maternal depressive symptoms (MDS) and children's cognitive development, little is known about the putative mechanisms through which depressive symptoms are associated with children's cognitive development. The aim of this review was to summarize the literature on family mediators (i.e., maternal parenting behaviors, mother-child interactions, and family stress) involved in this association in early childhood. The review includes seven studies, five longitudinal and two cross-sectional, which tested putative mediators of the association between MDS and children's cognitive development. Studies were selected from online databases (PubMed, PsycNet) and manual searches. Only studies which quantitatively assessed associations between MDS in the postnatal period and child cognitive development in early childhood (i.e., 0-5 years) and which tested mediator variables were included in the review. Six out of seven studies identified mediating variables. The mediators included maternal responsiveness, parenting style, family dysfunction, the quality of the home environment, and maternal caregiving practices. Different mediators were identified across the reviewed studies. Maternal depressive symptoms are partly associated with child cognitive development via family processes and parenting practices. Various mediating processes are at play. Further research is needed on the role of maternal and paternal mental health and gene-environment correlations in this association. A better understanding of the mediating pathways is needed for the design of preventive interventions targeting specific family processes.
NASA Technical Reports Server (NTRS)
Branscome, Lee E.; Bleck, Rainer; Obrien, Enda
1990-01-01
The project objectives are to develop process models to investigate the interaction of planetary and synoptic-scale waves, including the effects of latent heat release (precipitation), nonlinear dynamics, physical and boundary-layer processes, and large-scale topography; to determine the importance of latent heat release for the temporal variability and time-mean behavior of planetary and synoptic-scale waves; to compare the model results with available observations of planetary and synoptic wave variability; and to assess the implications of the results for monitoring precipitation in oceanic storm tracks by satellite observing systems. Researchers have utilized two different models for this project: a two-level quasi-geostrophic (Q-G) model to study intraseasonal variability, anomalous circulations, and the seasonal cycle, and a 10-level, multi-wave primitive equation model to validate the two-level Q-G model and examine the effects of convection, surface processes, and spherical geometry. The primitive equation model explicitly resolves several planetary and synoptic waves and includes specific humidity (as a predicted variable), moist convection, and large-scale precipitation. In the past year researchers have concentrated on experiments with the multi-level primitive equation model. The dynamical part of that model is similar to the spectral model used by the National Meteorological Center for medium-range forecasts. The model includes parameterizations of large-scale condensation and moist convection. To test the validity of results regarding the influence of convective precipitation, researchers can use either of two different convective schemes in the model: a Kuo convective scheme or a modified Arakawa-Schubert scheme which includes downdrafts. By choosing one or the other scheme, they can evaluate the impact of the convective parameterization on the circulation. In the past year researchers performed a variety of initial-value experiments with the primitive-equation model.
Using initial conditions typical of climatological winter conditions, they examined the behavior of synoptic and planetary waves growing in moist and dry environments. Surface conditions were representative of a zonally averaged ocean. They found that moist convection associated with baroclinic wave development was confined to the subtropics.
NASA Astrophysics Data System (ADS)
Stone, H. B.; Banas, N. S.; Hickey, B. M.; MacCready, P.
2016-02-01
The Pacific Northwest coast is an unusually productive area with a strong river influence and highly variable upwelling-favorable and downwelling-favorable winds, but recent trends in hypoxia and ocean acidification in this region are troubling to both scientists and the general public. A new ROMS hindcast model of this region makes possible a study of interannual variability. This study of interannual temperature and salinity variability on the Pacific Northwest coast is conducted using a coastal hindcast model (43°N - 50°N) spanning 2002-2009 from the University of Washington Coastal Modeling Group, with a resolution of 1.5 km over the shelf and slope. Analysis of hindcast model results was used to assess the relative importance of source water variability, including the poleward California Undercurrent, local and remote wind forcing, winter wind-driven mixing, and river influence in explaining the interannual variations in the shelf bottom layer (40 - 80 m depth, 10 m thick) and over the slope (150 - 250 m depth, <100 km from shelf break) at each latitude within the model domain. Slope water properties, characterized by tracking the fraction of Pacific Equatorial Water (PEW) relative to Pacific Subarctic Upper Water (PSUW) present on the slope, varied little at all latitudes throughout the time series, with the largest variability due to patterns of large north-south advection of water masses over the slope. Over the time series, the standard deviation of slope temperature was 0.09 °C, while slope salinity standard deviation was 0.02 psu. Results suggest that shelf bottom water interannual variability is not driven primarily by interannual variability in slope water, as shelf bottom water temperature and salinity vary nearly 10 times more than those over the slope. Instead, interannual variability in shelf bottom water properties is likely driven by other processes, such as local and remote wind forcing, and winter wind-driven mixing.
The relative contributions of these processes to interannual variability in shelf bottom water properties will be addressed. Overall, these results highlight the importance of shelf processes relative to large-scale influences on the interannual timescale in particular. Implications for variability in hypoxia and ocean acidification impacts will be discussed.
Modified Exponential Weighted Moving Average (EWMA) Control Chart on Autocorrelation Data
NASA Astrophysics Data System (ADS)
Herdiani, Erna Tri; Fandrilla, Geysa; Sunusi, Nurtiti
2018-03-01
In general, observations in statistical process control are assumed to be mutually independent. However, this assumption is often violated in practice. Consequently, control charts have been developed for autocorrelated processes, including Shewhart, Cumulative Sum (CUSUM), and exponentially weighted moving average (EWMA) charts. One researcher stated that these charts are not suitable if the same control limits are used as in the independent case. For this reason, it is necessary to apply a time series model in building the control chart. A classical control chart for independent variables is usually applied to the process residuals; this procedure is permitted provided that the residuals are independent. In 1978, a Shewhart modification for the autoregressive process was introduced, using the distance between the sample mean and the target value compared to the standard deviation of the autocorrelated process. In this paper we examine the mean of the EWMA for an autocorrelated process, derived from Montgomery and Patel. Performance was investigated by examining the Average Run Length (ARL) based on the Markov chain method.
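A minimal sketch of the residual-based charting idea discussed above: given an AR(1) series (here the coefficient is assumed known rather than fitted), chart the one-step residuals with an EWMA statistic and the standard time-varying control limits. The AR coefficient and chart parameters (λ = 0.2, L = 3) are illustrative assumptions.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0):
    """EWMA statistic and time-varying control limits for a series x."""
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.empty(len(x))
    z[0] = lam * x[0] + (1 - lam) * mu
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    k = np.arange(1, len(x) + 1)
    half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
    return z, mu - half, mu + half

# Simulate AR(1) data, then chart the one-step residuals instead of the raw series.
rng = np.random.default_rng(1)
phi, n = 0.7, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

resid = x[1:] - phi * x[:-1]          # residuals are ~independent if phi is right
z, lcl, ucl = ewma_chart(resid)
print(np.mean((z < lcl) | (z > ucl)))  # in-control false-alarm fraction
```

Charting the residuals rather than the raw autocorrelated series is what restores the independence assumption the classical limits rely on.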
Optimization of a GO2/GH2 Swirl Coaxial Injector Element
NASA Technical Reports Server (NTRS)
Tucker, P. Kevin; Shyy, Wei; Vaidyanathan, Rajkumar
1999-01-01
An injector optimization methodology, method i, is used to investigate optimal design points for a gaseous oxygen/gaseous hydrogen (GO2/GH2) swirl coaxial injector element. The element is optimized in terms of design variables such as fuel pressure drop, ΔP_f, oxidizer pressure drop, ΔP_o, combustor length, L_comb, and full cone swirl angle, θ, for a given mixture ratio and chamber pressure. Dependent variables such as energy release efficiency, ERE, wall heat flux, Q_w, injector heat flux, Q_inj, relative combustor weight, W_rel, and relative injector cost, C_rel, are calculated and then correlated with the design variables. An empirical design methodology is used to generate these responses for 180 combinations of input variables. Method i is then used to generate response surfaces for each dependent variable. Desirability functions based on dependent variable constraints are created and used to facilitate development of composite response surfaces representing some, or all, of the five dependent variables in terms of the input variables. Two examples illustrating the utility and flexibility of method i are discussed in detail. First, joint response surfaces are constructed by sequentially adding dependent variables. Optimum designs are identified after addition of each variable and the effect each variable has on the design is shown. This stepwise demonstration also highlights the importance of including variables such as weight and cost early in the design process. Secondly, using the composite response surface that includes all five dependent variables, unequal weights are assigned to emphasize certain variables relative to others. Here, method i is used to enable objective trade studies on design issues such as component life and thrust-to-weight ratio.
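The desirability-function machinery mentioned above (commonly attributed to Derringer and Suich) can be sketched as follows: each response is mapped onto a [0, 1] desirability, and the composite objective is the weighted geometric mean of the individual desirabilities. The response values, bounds, and weights below are invented for illustration and are not from the injector study.

```python
import numpy as np

def desirability_max(y, lo, hi, s=1.0):
    """Larger-is-better desirability: 0 below lo, 1 above hi."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** s)

def desirability_min(y, lo, hi, s=1.0):
    """Smaller-is-better desirability: 1 below lo, 0 above hi."""
    return float(np.clip((hi - y) / (hi - lo), 0.0, 1.0) ** s)

def overall(ds, weights=None):
    """Weighted geometric mean of individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    w = np.ones(len(ds)) if weights is None else np.asarray(weights, dtype=float)
    if np.any(ds == 0):          # any fully undesirable response vetoes the design
        return 0.0
    return float(np.exp(np.sum(w * np.log(ds)) / np.sum(w)))

# Hypothetical responses for one candidate design point
ere  = desirability_max(0.97, lo=0.90, hi=1.00)   # energy release efficiency
q_w  = desirability_min(12.0, lo=5.0,  hi=20.0)   # wall heat flux
cost = desirability_min(0.6,  lo=0.2,  hi=1.0)    # relative cost
print(overall([ere, q_w, cost], weights=[1, 1, 2]))  # cost weighted twice
```

Unequal weights, as in the abstract's second example, simply skew the geometric mean toward the emphasized responses.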
Controlling for confounding variables in MS-omics protocol: why modularity matters.
Smith, Rob; Ventura, Dan; Prince, John T
2014-09-01
As the field of bioinformatics research continues to grow, more and more novel techniques are proposed to meet new challenges and improvements upon solutions to long-standing problems. These include data processing techniques and wet lab protocol techniques. Although the literature is consistently thorough in experimental detail and variable-controlling rigor for wet lab protocol techniques, bioinformatics techniques tend to be less described and less controlled. As the validation or rejection of hypotheses rests on the experiment's ability to isolate and measure a variable of interest, we urge the importance of reducing confounding variables in bioinformatics techniques during mass spectrometry experimentation. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Assessment of mid-latitude atmospheric variability in CMIP5 models using a process oriented-metric
NASA Astrophysics Data System (ADS)
Di Biagio, Valeria; Calmanti, Sandro; Dell'Aquila, Alessandro; Ruti, Paolo
2013-04-01
We compare, for the period 1962-2000, an estimate of the Northern Hemisphere mid-latitude winter atmospheric variability according to several global climate models included in the fifth phase of the Climate Model Intercomparison Project (CMIP5) with the results of the models belonging to the previous CMIP3 and with the NCEP-NCAR reanalysis. We use the space-time Hayashi spectra of the 500 hPa geopotential height fields to characterize the variability of atmospheric circulation regimes, and we introduce an ad hoc integral measure of the variability observed in the Northern Hemisphere on different spectral sub-domains. The overall performance of each model is evaluated by considering the total wave variability as a global scalar measure of the statistical properties of different types of atmospheric disturbances. The variability associated with eastward-propagating baroclinic waves and with planetary waves is instead used to describe the performance of each model in terms of specific physical processes. We find that the two model ensembles (CMIP3 and CMIP5) do not show substantial differences in the description of Northern Hemisphere winter mid-latitude atmospheric variability, although some CMIP5 models display performance superior to their previous versions implemented in CMIP3. Preliminary results for the 21st century RCP 4.5 scenario will also be discussed for the CMIP5 models.
CUHK Papers in Linguistics, Number 4.
ERIC Educational Resources Information Center
Tang, Gladys, Ed.
1993-01-01
Papers in this issue include the following: "Code-Mixing in Hongkong Cantonese-English Bilinguals: Constraints and Processes" (Brian Chan Hok-shing); "Information on Quantifiers and Argument Structure in English Learner's Dictionaries" (Thomas Hun-tak Lee); "Systematic Variability: In Search of a Linguistic…
ERIC Educational Resources Information Center
McGrath, Joseph E.
1978-01-01
Summarizes research on small group processes by giving a comprehensive account of the types of variables primarily studied in the laboratory. These include group structure, group composition, group size, and group relations. Considers effects of power, leadership, conformity to social norms, and role relationships. (Author/AV)
Fermentation: From Sensory Experience to Conceptual Understanding
ERIC Educational Resources Information Center
Moore, Eugene B.
1977-01-01
Presented is a laboratory exercise that utilizes the natural yeast carbonation method of making homemade root beer to study fermentation and the effect of variables upon the fermentation process. There are photographs, a sample data sheet, and procedural hints included. (Author/MA)
Roush, W B; Boykin, D; Branton, S L
2004-08-01
A mixture experiment, a variant of response surface methodology, was designed to determine the proportion of time to feed broiler starter (23% protein), grower (20% protein), and finisher (18% protein) diets to optimize production and processing variables based on a total production time of 48 d. Mixture designs are useful for proportion problems where the components of the experiment (i.e., the lengths of time the diets were fed) add up to unity (48 d). The experiment was conducted with day-old male Ross × Ross broiler chicks. The birds were placed 50 per pen in each of 60 pens. The experimental design was a 10-point augmented simplex-centroid (ASC) design with 6 replicates of each point. Each design point represented the portion(s) of the 48 d that each of the diets was fed. Formulation of the diets was based on NRC standards. At 49 d, each pen of birds was evaluated for production data including BW, feed conversion, and cost of feed consumed. Then, 6 birds were randomly selected from each pen for processing data. Processing variables included live weight, hot carcass weight, dressing percentage, fat pad percentage, and breast yield (pectoralis major and pectoralis minor weights). Production and processing data were fit to simplex regression models. Model terms determined not to be significant (P > 0.05) were removed. The models were found to be statistically adequate for analysis of the response surfaces. A compromise solution was calculated based on optimal constraints designated for the production and processing data. The results indicated that broilers fed a starter and finisher diet for 30 and 18 d, respectively, would meet the production and processing constraints. Trace plots showed that the production and processing variables were not very sensitive to the grower diet.
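A sketch of the simplex regression idea: a Scheffé quadratic mixture model (no intercept; linear blending terms plus pairwise interaction terms) fit over a 10-point augmented simplex-centroid layout, then evaluated on a grid of feasible blends. The design points mirror an ASC design, but the response values below are invented, not the broiler data.

```python
import numpy as np

# 10-point augmented simplex-centroid design over x1 + x2 + x3 = 1
# (vertices, edge midpoints, centroid, and three axial points).
X = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [.5, .5, 0], [.5, 0, .5], [0, .5, .5],
    [1/3, 1/3, 1/3], [2/3, 1/6, 1/6], [1/6, 2/3, 1/6], [1/6, 1/6, 2/3],
])
# Invented response values (e.g., a quality score per blend)
y = np.array([2.60, 2.55, 2.50, 2.62, 2.66, 2.58, 2.63, 2.64, 2.60, 2.59])

def design(X):
    """Scheffe quadratic model matrix: y = sum bi*xi + sum bij*xi*xj."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

b, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Predict over a grid of feasible blends and pick the best proportion mix
grid = [(a / 20, c / 20, 1 - a / 20 - c / 20)
        for a in range(21) for c in range(21 - a)]
preds = design(np.array(grid)) @ b
best = grid[int(np.argmax(preds))]
print(best, preds.max())
```

Trace plots and compromise solutions, as used in the study, would be built on top of exactly this kind of fitted simplex model.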
Variability in Rheumatology day care hospitals in Spain: VALORA study.
Hernández Miguel, María Victoria; Martín Martínez, María Auxiliadora; Corominas, Héctor; Sanchez-Piedra, Carlos; Sanmartí, Raimon; Fernandez Martinez, Carmen; García-Vicuña, Rosario
To describe the variability of Rheumatology day care hospital units (DCHUs) in Spain, in terms of structural resources and operating processes. Multicenter descriptive study with data from a self-completed DCHU self-assessment questionnaire based on the DCHU quality standards of the Spanish Society of Rheumatology. Structural resources and operating processes were analyzed and stratified by hospital complexity (regional, general, major and complex). Variability was determined using the coefficient of variation (CV) of the clinically relevant variable that presented statistically significant differences when compared across centers. A total of 89 hospitals (16 autonomous regions and Melilla) were included in the analysis. 11.2% of hospitals were regional, 22.5% general, 27% major, and 39.3% complex. A total of 92% of DCHUs were polyvalent. The number of treatments applied, the coordination between DCHUs and hospital pharmacy, and the postgraduate training process were the variables that showed statistically significant differences depending on hospital complexity. The highest rate of rheumatologic treatments was found in complex hospitals (2.97 per 1,000 population), and the lowest in general hospitals (2.01 per 1,000 population). The CV was 0.88 in major hospitals; 0.86 in regional; 0.76 in general; and 0.72 in complex. There was variability in the number of treatments delivered in DCHUs, being greater in major hospitals and then in regional centers. Nonetheless, the variability in terms of structure and function does not seem to be due to differences in center complexity. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Remote Sensing and Problems of the Hydrosphere
NASA Technical Reports Server (NTRS)
Goldberg, E. D. (Editor)
1979-01-01
A discussion of freshwater and marine systems is presented including areas of the classification of lakes, identification and quantification of major functional groups of phytoplankton, sources and sinks of biochemical factors, and temporal and regional variability of surface features. Atmospheric processes linked to hydrospheric process through the transfer of matter via aerosols and gases are discussed. Particle fluxes to the aquatic environment and global geochemical problems are examined.
Software sensors for bioprocesses.
Bogaerts, Ph; Vande Wouwer, A
2003-10-01
State estimation is a significant problem in biotechnological processes, due to the general lack of hardware sensor measurements of the variables describing the process dynamics. The objective of this paper is to review a number of software sensor design methods, including extended Kalman filters, receding-horizon observers, asymptotic observers, and hybrid observers, which can be efficiently applied to bioprocesses. These methods are illustrated with simulation and real-life case studies.
We present the development of a process to perform greyscale photolithography on a 2.55-μm-thick photoresist in order to transfer tiered and sloped...platinum or iridium oxide (IrO2) electrodes above and below each layer. Process variables including resist rehydration, focus of the exposure, and UV cure...bake temperature were optimized to produce the best greyscale profile through the thickness of the resist.
Mo Zhou; Joseph Buongiorno
2011-01-01
Most economic studies of forest decision making under risk assume a fixed interest rate. This paper investigated some implications of the stochastic nature of interest rates. Markov decision process (MDP) models, used previously to integrate stochastic stand growth and prices, can be extended to include variable interest rates as well. This method was applied to...
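A toy sketch of the extension described above: a Markov decision process whose state combines the stand state with an interest-rate state, so the discount factor varies with the rate. All transition probabilities, rates, and the harvest value below are invented for illustration and are not from the study.

```python
import numpy as np

# Toy MDP: stand states {young, mature}, interest-rate states {low, high}.
# Action: wait, or harvest (only meaningful when mature; stand resets to young).
rates = np.array([0.03, 0.07])                  # assumed low/high annual rates
P_rate = np.array([[0.9, 0.1], [0.2, 0.8]])     # rate-transition probabilities
P_grow = 0.3                                    # P(young -> mature) when waiting
harvest_value = 100.0

V = np.zeros((2, 2))                            # V[stand, rate]
for _ in range(500):                            # value iteration to convergence
    V_new = np.empty_like(V)
    for r in range(2):
        beta = 1.0 / (1.0 + rates[r])           # state-dependent discount factor
        EV = P_rate[r] @ V.T                    # expected next value per stand state
        # young: can only wait; stand may mature
        V_new[0, r] = beta * ((1 - P_grow) * EV[0] + P_grow * EV[1])
        # mature: max of waiting vs. harvesting now
        V_new[1, r] = max(beta * EV[1], harvest_value + beta * EV[0])
    V = V_new
print(V)
```

Because the discount factor is itself part of the stochastic state, the optimal harvest policy can differ between low-rate and high-rate regimes, which is the point of treating interest rates as variable.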
Geometrical accuracy improvement in flexible roll forming lines
NASA Astrophysics Data System (ADS)
Larrañaga, J.; Berner, S.; Galdos, L.; Groche, P.
2011-01-01
Interest in producing profiles with variable cross sections in a cost-effective way has increased in the last few years. The flexible roll forming process makes it possible to produce profiles with lengthwise-variable cross sections in a continuous way. Until now, only a few flexible roll forming lines have been developed and built. Apart from flange wrinkling along the transition zone of U-profiles with variable cross section, the process limits have not been investigated and solutions for shape deviations are unknown. During the PROFOM project, a flexible roll forming machine was developed with the objective of producing highly technological components for automotive body structures. In order to investigate the limits of the process, different profile geometries and steel grades, including high-strength steels, were applied. During the first experimental tests, several defects were identified as a result of the complex stress states generated during the forming process. In order to improve the accuracy of the target profiles and to meet the tolerance demands of the automotive industry, a thermo-mechanical solution has been proposed. Additional mechanical devices supporting the flexible roll forming process have been implemented in the roll forming line, together with local heating techniques. The combination of both methods shows a significant increase in accuracy. In the present investigation, the experimental results of the validation process are presented.
Which Measures of Online Control Are Least Sensitive to Offline Processes?
de Grosbois, John; Tremblay, Luc
2018-02-28
A major challenge to the measurement of online control is contamination by offline, planning-based processes. The current study examined the sensitivity of four measures of online control to offline changes in reaching performance induced by prism adaptation and terminal feedback. These measures included the squared Z scores (Z²) of correlations of limb position at 75% of movement time versus movement end, variable error, time after peak velocity, and a frequency-domain analysis (pPower). The results indicated that variable error and time after peak velocity were sensitive to the prism adaptation. Furthermore, only the Z² values were biased by the terminal feedback. Ultimately, the current study has demonstrated the sensitivity of limb kinematic measures to offline control processes and that pPower analyses may yield the most suitable measure of online control.
The variable polarity plasma arc welding process: Characteristics and performance
NASA Technical Reports Server (NTRS)
Hung, R. J.; Zhu, G. J.
1991-01-01
Significant advantages of the Variable Polarity Plasma Arc (VPPA) welding process include faster welding, fewer repairs, less joint preparation, reduced weldment distortion, and absence of porosity. The power distribution was analyzed for an argon plasma gas flow constituting the fluid in the VPPA welding process. The major heat loss is by convective heat transfer at the torch nozzle, by radiative heat transfer in the space between the nozzle outlet and the workpiece, and by convective heat transfer in the keyhole in the workpiece. The power absorbed at the workpiece produces the molten puddle that solidifies into the weld bead. Crown and root widths and crown and root heights of the weld bead are predicted. The basis is provided for an algorithm for automatic control of VPPA welding machine parameters to obtain desired weld bead dimensions.
Hurricane Ike: Observations and Analysis of Coastal Change
Doran, Kara S.; Plant, Nathaniel G.; Stockdon, Hilary F.; Sallenger, Asbury H.; Serafin, Katherine A.
2009-01-01
Understanding storm-induced coastal change and forecasting these changes require knowledge of the physical processes associated with the storm and the geomorphology of the impacted coastline. The primary physical processes of interest are the wind field, storm surge, and wave climate. Not only does wind cause direct damage to structures along the coast, but it is ultimately responsible for much of the energy that is transferred to the ocean and expressed as storm surge, mean currents, and large waves. Waves and currents are the processes most responsible for moving sediments in the coastal zone during extreme storm events. Storm surge, the rise in water level due to the wind, barometric pressure, and other factors, allows both waves and currents to attack parts of the coast not normally exposed to those processes. Coastal geomorphology, including shapes of the shoreline, beaches, and dunes, is equally important to the coastal change observed during extreme storm events. Relevant geomorphic variables include sand dune elevation, beach width, shoreline position, sediment grain size, and foreshore beach slope. These variables, in addition to hydrodynamic processes, can be used to predict coastal vulnerability to storms. The U.S. Geological Survey's (USGS) National Assessment of Coastal Change Hazards Project (http://coastal.er.usgs.gov/hurricanes) strives to provide hazard information to those interested in the Nation's coastlines, including residents of coastal areas, government agencies responsible for coastal management, and coastal researchers. As part of the National Assessment, observations were collected to measure coastal changes associated with Hurricane Ike, which made landfall near Galveston, Texas, on September 13, 2008. Methods of observation included aerial photography and airborne topographic surveys.
This report documents these data-collection efforts and presents qualitative and quantitative descriptions of hurricane-induced changes to the shoreline, beaches, dunes, and infrastructure in the region that was heavily impacted by Hurricane Ike.
Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen
2018-01-01
This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.
Patterned wafer geometry grouping for improved overlay control
NASA Astrophysics Data System (ADS)
Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Park, Junbeom; Song, Changrock; Anis, Fatima; Vukkadala, Pradeep; Jeon, Sanghuck; Choi, DongSub; Huang, Kevin; Heo, Hoyoung; Smith, Mark D.; Robinson, John C.
2017-03-01
Process-induced overlay errors from outside the litho cell have become a significant contributor to the overlay error budget including non-uniform wafer stress. Previous studies have shown the correlation between process-induced stress and overlay and the opportunity for improvement in process control, including the use of patterned wafer geometry (PWG) metrology to reduce stress-induced overlay signatures. Key challenges of volume semiconductor manufacturing are how to improve not only the magnitude of these signatures, but also the wafer to wafer variability. This work involves a novel technique of using PWG metrology to provide improved litho-control by wafer-level grouping based on incoming process induced overlay, relevant for both 3D NAND and DRAM. Examples shown in this study are from 19 nm DRAM manufacturing.
Abdollahi, Yadollah; Sairi, Nor Asrina; Said, Suhana Binti Mohd; Abouzari-lotf, Ebrahim; Zakaria, Azmi; Sabri, Mohd Faizul Bin Mohd; Islam, Aminul; Alias, Yatimah
2015-11-05
It is believed that 80% of industrial carbon dioxide emissions can be controlled by separation and storage technologies that use blended ionic-liquid absorbers. Among the blended absorbers, the mixture of water, N-methyldiethanolamine (MDEA), and guanidinium trifluoromethane sulfonate (gua) has shown superior stripping qualities. However, the blended solution exhibits high viscosity, which increases the cost of the separation process. In this work, the fabrication of the blend was scheduled, that is, the process was arranged, controlled, and optimized. The blend's components and the operating temperature were modeled and optimized as effective input variables, with the viscosity as the output to be minimized, using a back-propagation artificial neural network (ANN). The modeling was carried out with four mathematical algorithms, each with its own experimental design, and the optimum topology was selected using the root mean squared error (RMSE), R-squared (R(2)), and absolute average deviation (AAD). The final model (QP-4-8-1), with the minimum RMSE and AAD and the highest R(2), was selected to guide the fabrication of the blended solution. The model was applied to obtain the optimum initial levels of the input variables: temperature 303-323 K, x[gua] 0-0.033, x[MDEA] 0.3-0.4, and x[H2O] 0.7-1.0. Moreover, the model ranked the relative importance of the variables as x[gua] > temperature > x[MDEA] > x[H2O]; none of the variables was negligible in the fabrication. Furthermore, the model predicted the optimum points of the variables for minimizing the viscosity, and these were validated by further experiments. The validated results confirmed the model's schedulability. Accordingly, the ANN succeeded in modeling the initial components of blended solutions used as absorbers for CO2 capture in separation technologies that can be scaled up to industrial levels. Copyright © 2015 Elsevier B.V. All rights reserved.
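The three model-selection criteria named in the abstract are simple to compute. The sketch below uses invented measured and predicted values, and assumes the mean-relative-error form of AAD (definitions of AAD vary in the literature).

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def r_squared(y, yhat):
    """Coefficient of determination."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def aad(y, yhat):
    """Absolute average deviation, here taken as a mean relative error."""
    return float(np.mean(np.abs((y - yhat) / y)))

y = np.array([2.0, 3.5, 5.0, 8.0])     # e.g. measured viscosities (invented)
yhat = np.array([2.1, 3.4, 5.2, 7.8])  # model predictions (invented)
```

Selecting the topology with minimum RMSE and AAD and maximum R² then reduces to evaluating these three functions for each candidate network.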
Development of uniform and predictable battery materials for nickel-cadmium aerospace cells
NASA Technical Reports Server (NTRS)
1971-01-01
Battery materials and manufacturing methods were analyzed with the aim of developing uniform and predictable battery plates for nickel cadmium aerospace cells. A study is presented for the high temperature electrochemical impregnation process for the preparation of nickel cadmium battery plates. This comparative study is set up as a factorially designed experiment to examine both manufacturing and operational variables and any interaction that might exist between them. The manufacturing variables in the factorial design include plaque preparative method, plaque porosity and thickness, impregnation method, and loading. The operational variables are type of duty cycle, charge and discharge rate, extent of overcharge, and depth of discharge.
Investigation of JP-8 Autoignition Under Vitiated Combustion Conditions
2011-01-01
The flow path and apparatus for the steam supply maintained mixtures at no less than 1.5 times the dew-point temperature for all test cases that involved H2O. Vitiated conditions, often the result of flue or exhaust gas recirculation (EGR) into a fresh air stream, include combustion products such as CO2, CO, H2O, and NOX.
Real-time parameter optimization based on neural network for smart injection molding
NASA Astrophysics Data System (ADS)
Lee, H.; Liau, Y.; Ryu, K.
2018-03-01
The manufacturing industry has been facing several challenges, including sustainability, performance, and quality of production. Manufacturers attempt to enhance the competitiveness of companies by implementing CPS (Cyber-Physical Systems) through the convergence of IoT (Internet of Things) and ICT (Information & Communication Technology) at the manufacturing-process level. The injection molding process has a short cycle time and high productivity; these features make it suitable for mass production. In addition, this process is used to produce precise parts in various industry fields such as automobiles, optics, and medical devices. The injection molding process involves a mixture of discrete and continuous variables, and to optimize quality, the variables generated in the process must be considered. Furthermore, optimal parameter setting is time-consuming work for predicting the optimum quality of the product, since process parameters cannot easily be corrected during process execution. In this research, we propose a neural-network-based real-time process parameter optimization methodology that sets optimal process parameters by using mold data, molding machine data, and response data. This paper is expected to make an academic contribution as a novel study of parameter optimization during production, compared with the pre-production parameter optimization of typical studies.
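The core loop of such a methodology (fit a model from process and response data, then search it for the best parameter setting) can be sketched with a tiny numpy network. The quality function, parameter names, network size, and training schedule below are all invented for illustration and are not the paper's actual system.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy surrogate: product quality as an (unknown) function of two process
# parameters, e.g. melt temperature and injection pressure, scaled to [0, 1].
X = rng.uniform(0, 1, (200, 2))
y = 1.0 - (X[:, 0] - 0.6) ** 2 - (X[:, 1] - 0.4) ** 2  # peak quality at (0.6, 0.4)

# One-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    gW2 = h.T @ err[:, None] / len(X); gb2 = err.mean(keepdims=True)
    gh = err[:, None] @ W2.T * (1 - h ** 2)
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# Search the trained model for the parameter setting with best predicted quality.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 51), np.linspace(0, 1, 51)), -1).reshape(-1, 2)
pred_grid = (np.tanh(grid @ W1 + b1) @ W2 + b2).ravel()
best = grid[np.argmax(pred_grid)]
```

In a real-time setting the fit would be updated as new mold and response data arrive, and the searched optimum would become the next cycle's set-point.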
Process for estimating likelihood and confidence in post detonation nuclear forensics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darby, John L.; Craft, Charles M.
2014-07-01
Technical nuclear forensics (TNF) must provide answers to questions of concern to the broader community, including an estimate of uncertainty. There is significant uncertainty associated with post-detonation TNF. The uncertainty consists of a great deal of epistemic (state of knowledge) as well as aleatory (random) uncertainty, and many of the variables of interest are linguistic (words) and not numeric. We provide a process by which TNF experts can structure their process for answering questions and provide an estimate of uncertainty. The process uses belief and plausibility, fuzzy sets, and approximate reasoning.
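The belief and plausibility measures the authors draw on come from Dempster-Shafer theory; a minimal sketch follows, with the frame of discernment, hypothesis, and mass values all invented for illustration.

```python
# Frame of discernment: candidate answers to a forensic question (labels invented).
frame = {"A", "B", "C"}

# Basic mass assignment from hypothetical expert judgments; masses sum to 1.
mass = {
    frozenset({"A"}): 0.4,
    frozenset({"A", "B"}): 0.3,
    frozenset(frame): 0.3,  # mass on the whole frame encodes total ignorance
}

def belief(hypothesis):
    # Bel(H): total mass committed to subsets of H (evidence that implies H).
    return sum(m for s, m in mass.items() if s <= frozenset(hypothesis))

def plausibility(hypothesis):
    # Pl(H): total mass not contradicting H (evidence consistent with H).
    return sum(m for s, m in mass.items() if s & frozenset(hypothesis))

bel_A, pl_A = belief({"A"}), plausibility({"A"})
```

The interval [Bel, Pl] brackets the unknown probability of the hypothesis, which is how the epistemic (state-of-knowledge) uncertainty is kept separate from a single point estimate.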
NASA Astrophysics Data System (ADS)
Hwang, Jai-Chan; Noh, Hyerim
2005-03-01
We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.
DOT National Transportation Integrated Search
2013-02-15
The technical tasks in this study included activities to characterize the impact of selected metallurgical processing and fabrication variables on ethanol stress corrosion cracking (ethanol SCC) of new pipeline steels, develop a better understand...
HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL - USER'S GUIDE FOR VERSION 3
This report documents the solution methods and process descriptions used in Version 3 of the HELP model. Program documentation, including program options, system and operating requirements, file structures, program structure, and variable descriptions, is provided in a separat...
Methods and Techniques of Revenue Forecasting.
ERIC Educational Resources Information Center
Caruthers, J. Kent; Wentworth, Cathi L.
1997-01-01
Revenue forecasting is the critical first step in most college and university budget-planning processes. While it seems a straightforward exercise, effective forecasting requires consideration of a number of interacting internal and external variables, including demographic trends, economic conditions, and broad social priorities. The challenge…
Effects of Embedded Processing Tasks on Learning Outcomes.
ERIC Educational Resources Information Center
Hobbs, D. J.
1987-01-01
Describes a British study with undergraduate accountancy students which compared the quantitative and qualitative effects of three types of embedded tasks or questions--relational-semantic, transpose-semantic, and non-semantic--on learning outcomes. Variables investigated included mathematical background, recall, and comprehension. Relevance of…
Computing and Office Automation: Changing Variables.
ERIC Educational Resources Information Center
Staman, E. Michael
1981-01-01
Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…
Reward Processing, Neuroeconomics, and Psychopathology.
Zald, David H; Treadway, Michael T
2017-05-08
Abnormal reward processing is a prominent transdiagnostic feature of psychopathology. The present review provides a framework for considering the different aspects of reward processing and their assessment, and highlights recent insights from the field of neuroeconomics that may aid in understanding these processes. Although altered reward processing in psychopathology has often been treated as a general hypo- or hyperresponsivity to reward, increasing data indicate that a comprehensive understanding of reward dysfunction requires characterization within more specific reward-processing domains, including subjective valuation, discounting, hedonics, reward anticipation and facilitation, and reinforcement learning. As such, more nuanced models of the nature of these abnormalities are needed. We describe several processing abnormalities capable of producing the types of selective alterations in reward-related behavior observed in different forms of psychopathology, including (mal)adaptive scaling and anchoring, dysfunctional weighting of reward and cost variables, competition between valuation systems, and reward prediction error signaling.
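The reward prediction error signaling named at the end of the abstract refers to the classic delta-rule update used in reinforcement learning models. A minimal sketch, with the learning rate and reward schedule invented:

```python
# Rescorla-Wagner / TD(0)-style value update driven by reward prediction errors.
alpha = 0.1           # learning rate (invented)
value = 0.0           # learned value of a cue
rewards = [1.0] * 50  # cue consistently followed by reward (invented schedule)

errors = []
for r in rewards:
    delta = r - value       # reward prediction error (RPE)
    value += alpha * delta  # value moves toward the delivered reward
    errors.append(delta)
```

Early in learning the RPE is large and positive; as the value estimate converges on the delivered reward, the error shrinks toward zero, which is the signature that distinguishes prediction-error signaling from a simple response to reward delivery.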
Reward Processing, Neuroeconomics, and Psychopathology
Zald, David H.; Treadway, Michael
2018-01-01
Abnormal reward processing is a prominent transdiagnostic feature of psychopathology. The present review provides a framework for considering the different aspects of reward processing and their assessment and highlights recent insights from the field of neuroeconomics that may aid in understanding these processes. Although altered reward processing in psychopathology has often been treated as a general hypo- or hyper-responsivity to reward, increasing data indicate that a comprehensive understanding of reward dysfunction requires characterization within more specific reward-processing domains, including subjective valuation, discounting, hedonics, reward anticipation and facilitation, and reinforcement learning. As such, more nuanced models of the nature of these abnormalities are needed. We describe several processing abnormalities capable of producing the types of selective alterations in reward-related behavior observed in different forms of psychopathology, including (mal)adaptive scaling and anchoring, dysfunctional weighting of reward and cost variables, competition between valuation systems, and reward prediction error signaling. PMID:28301764
NASA Astrophysics Data System (ADS)
Siedlecki, Samantha A.; Pilcher, Darren J.; Hermann, Albert J.; Coyle, Ken; Mathis, Jeremy
2017-11-01
High-latitude and subpolar regions like the Gulf of Alaska (GOA) are more vulnerable than equatorial regions to rising carbon dioxide (CO2) levels, in part due to local processes that amplify the global signal. Recent field observations have shown that the shelf of the GOA is currently experiencing seasonal corrosive events (carbonate mineral saturation states Ω, Ω < 1), including suppressed Ω in response to ocean acidification as well as local processes like increased low-alkalinity glacial meltwater discharge. While the glacial discharge mainly influences the inner shelf, on the outer shelf, upwelling brings corrosive waters from the deep GOA. In this work, we develop a high-resolution model for carbon dynamics in the GOA, identify regions of high variability of Ω, and test the sensitivity of those regions to changes in the chemistry of glacial meltwater discharge. Results indicate the importance of this climatically sensitive and relatively unconstrained regional freshwater forcing for Ω variability in the nearshore. The increase was nearly linear at 0.002 Ω per 100 µmol/kg increase in alkalinity in the freshwater runoff. We find that the local winds, biological processes, and freshwater forcing all contribute to the spatial distribution of Ω and identify which of these three is highly correlated to the variability in Ω. Given that the timing and magnitude of these processes will likely change during the next few decades, it is critical to elucidate the effect of local processes on the background ocean acidification signal using robust models, such as the one described here.
Hydrologic Remote Sensing and Land Surface Data Assimilation.
Moradkhani, Hamid
2008-05-06
Accurate, reliable, and skillful forecasting of key environmental variables such as soil moisture and snow is of paramount importance due to their strong influence on many water resources applications, including flood control, agricultural production, and effective water resources management, which collectively control the behavior of the climate system. Soil moisture is a key state variable in land surface-atmosphere interactions, affecting surface energy fluxes, runoff, and the radiation balance. Snow processes also have a large influence on land-atmosphere energy exchanges due to snow's high albedo, low thermal conductivity, and considerable spatial and temporal variability, resulting in dramatic changes in surface and ground temperature. Measurement of these two variables is possible through a variety of ground-based and remote sensing procedures. Remote sensing holds great promise for soil moisture and snow measurements, which have considerable spatial and temporal variability. Merging these measurements with hydrologic model outputs in a systematic and effective way improves land surface model prediction. Data assimilation provides a mechanism to combine these two sources of estimation. Much success has been attained in recent years in using data from passive microwave sensors and assimilating them into models. This paper provides an overview of remote sensing measurement techniques for soil moisture and snow data and describes advances in data assimilation techniques through ensemble filtering, mainly the Ensemble Kalman filter (EnKF) and the particle filter (PF), for improving model prediction and reducing the uncertainties involved in the prediction process.
It is believed that the PF provides a complete representation of the probability distribution of the state variables of interest (according to the sequential Bayes law) and could be a strong alternative to the EnKF, which is subject to limitations including its linear updating rule and its assumption of jointly normal distributions of the errors in state variables and observations.
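A minimal bootstrap particle filter illustrates the sequential-Bayes weighting and resampling that distinguish the PF from the EnKF's linear update. The state, model, noise levels, and observations below are synthetic and not taken from any assimilation system discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal bootstrap particle filter for a scalar state (e.g. soil moisture).
n_particles = 1000
particles = rng.normal(0.3, 0.1, n_particles)  # initial ensemble (invented prior)
obs = [0.35, 0.40, 0.38]                       # synthetic observations
obs_std = 0.05                                 # observation error SD (invented)
proc_std = 0.02                                # model (process) noise SD (invented)

for z in obs:
    # Forecast: propagate each particle through the (here, identity) model + noise.
    particles = particles + rng.normal(0.0, proc_std, n_particles)
    # Update: weight particles by the Gaussian observation likelihood.
    w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
    w /= w.sum()
    # Resample: draw a new equally weighted ensemble (sequential Bayes step).
    particles = rng.choice(particles, size=n_particles, p=w)

estimate = particles.mean()
```

Because the update works through weights rather than a linear gain, the ensemble can represent non-Gaussian posteriors, which is the PF advantage the paragraph above describes.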
Process for applying control variables having fractal structures
Bullock, IV, Jonathan S.; Lawson, Roger L.
1996-01-01
A process and apparatus for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform.
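One simple way to give a pulsed-current waveform a fractal (self-similar) structure is a Cantor-set style on/off recursion. This is an illustrative construction, not the patent's specific waveform; the current amplitude is invented.

```python
def cantor_pulse(level, on=1.0, off=0.0):
    """Cantor-set style on/off pulse pattern: each recursion replaces an
    'on' segment with on-off-on, so the waveform is self-similar across scales."""
    if level == 0:
        return [on]
    prev = cantor_pulse(level - 1, on, off)
    gap = [off] * len(prev)
    return prev + gap + prev

# Current set-points (A) for a pulsed-current supply; the 50 A amplitude is invented.
waveform = [50.0 * v for v in cantor_pulse(3)]
```

At recursion depth k the pattern has 3^k time slots, of which 2^k carry current, and each on-burst repeats the structure of the whole at a smaller scale.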
Process for applying control variables having fractal structures
Bullock, J.S. IV; Lawson, R.L.
1996-01-23
A process and apparatus are disclosed for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform. 3 figs.
Pharmaceutical quality by design: product and process development, understanding, and control.
Yu, Lawrence X
2008-04-01
The purpose of this paper is to discuss the pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. The QbD was described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. The QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: Defining target product quality profile; Designing product and manufacturing processes; Identifying critical quality attributes, process parameters, and sources of variability; Controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.
Multiplicative processes in visual cognition
NASA Astrophysics Data System (ADS)
Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.
2014-03-01
The Central Limit Theorem (CLT) is certainly one of the most important results in the field of statistics. The simple fact that the addition of many random variables can generate the same probability curve elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT can be applied to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Specifically, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow a power-law behavior.
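The multiplicative analogue of the CLT is easy to demonstrate numerically: a product of many positive random factors is approximately log-normal, because the sum of the log-factors is approximately normal. The factor distribution and sample sizes below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each sample is the product of many positive random factors; by the CLT
# applied to the sum of their logs, the product tends to a log-normal.
n_samples, n_factors = 10000, 50
factors = rng.uniform(0.5, 1.5, (n_samples, n_factors))
products = factors.prod(axis=1)

# The log of a log-normal variable is normal, so its skewness should be near 0.
logs = np.log(products)
skew = np.mean((logs - logs.mean()) ** 3) / logs.std() ** 3
```

This is the mechanism behind the log-normal fixation-time distribution reported in the abstract: if each fixation reflects the product of many independent multiplicative influences, log-normality follows without any task-specific tuning.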
Estimation of Particulate Mass and Manganese Exposure Levels among Welders
Hobson, Angela; Seixas, Noah; Sterling, David; Racette, Brad A.
2011-01-01
Background: Welders are frequently exposed to Manganese (Mn), which may increase the risk of neurological impairment. Historical exposure estimates for welding-exposed workers are needed for epidemiological studies evaluating the relationship between welding and neurological or other health outcomes. The objective of this study was to develop and validate a multivariate model to estimate quantitative levels of welding fume exposures based on welding particulate mass and Mn concentrations reported in the published literature. Methods: Articles that described welding particulate and Mn exposures during field welding activities were identified through a comprehensive literature search. Summary measures of exposure and related determinants such as year of sampling, welding process performed, type of ventilation used, degree of enclosure, base metal, and location of sampling filter were extracted from each article. The natural log of the reported arithmetic mean exposure level was used as the dependent variable in model building, while the independent variables included the exposure determinants. Cross-validation was performed to aid in model selection and to evaluate the generalizability of the models. Results: A total of 33 particulate and 27 Mn means were included in the regression analysis. The final model explained 76% of the variability in the mean exposures and included welding process and degree of enclosure as predictors. There was very little change in the explained variability and root mean squared error between the final model and its cross-validation model indicating the final model is robust given the available data. 
Conclusions: This model may be improved with more detailed exposure determinants; however, the relatively large amount of variance explained by the final model along with the positive generalizability results of the cross-validation increases the confidence that the estimates derived from this model can be used for estimating welder exposures in absence of individual measurement data. PMID:20870928
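The modeling strategy described above (regress the natural log of the mean exposure on categorical determinants, then check generalizability by cross-validation) can be sketched as follows. The data, dummy codings, and determinant levels are invented and far smaller than the study's 33 particulate and 27 Mn summary means.

```python
import numpy as np

# Columns: intercept, welding process (0/1), degree of enclosure (0/1).
# Rows are invented summary means, not data from the study.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1],
              [1, 0, 0], [1, 1, 1], [1, 0, 1], [1, 1, 0]], float)
ln_y = np.array([0.2, 1.1, 0.8, 1.9, 0.3, 2.0, 1.0, 0.9])  # ln(mean exposure)

# Ordinary least squares fit of the log-linear model.
beta, *_ = np.linalg.lstsq(X, ln_y, rcond=None)

# Leave-one-out cross-validation: refit without each point, predict it, score.
errs = []
for i in range(len(ln_y)):
    keep = np.arange(len(ln_y)) != i
    b, *_ = np.linalg.lstsq(X[keep], ln_y[keep], rcond=None)
    errs.append(ln_y[i] - X[i] @ b)
cv_rmse = float(np.sqrt(np.mean(np.square(errs))))
```

Comparing `cv_rmse` against the in-sample RMSE is the generalizability check the authors describe: a small gap suggests the fitted determinants are robust rather than overfit to the available means.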
ERIC Educational Resources Information Center
Bermani, Michelle Ines
2017-01-01
In this quantitative and qualitative mixed study, the researcher focused on a range of factors that drive principals' decision making and examined the variables that affect principals' decision-making. The study assessed the extent to which principals' leadership and decision-making processes exert influence on the operations of inclusion…
Analysis And Control System For Automated Welding
NASA Technical Reports Server (NTRS)
Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne
1994-01-01
Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.
Dynamic control of remelting processes
Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.
2000-01-01
An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
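The estimate-then-control structure described in the patent (a process model supplies estimated variable values; feedforward and feedback devices adjust the inputs) can be sketched for a single hypothetical variable. The first-order dynamics, gains, noise level, and the melt-rate interpretation are all invented for illustration.

```python
import random

random.seed(4)
target = 1.0      # desired melt rate (arbitrary units)
state = 0.0       # true (unobserved) melt rate
estimate = 0.0    # model-based estimate fed to the controller
k_obs, k_fb = 0.5, 0.8  # observer and feedback gains (invented)
feedforward = target     # nominal input from the process model

for _ in range(100):
    # Control input: feedforward set-point plus feedback on the *estimated* state.
    u = feedforward + k_fb * (target - estimate)
    # First-order process response to the input.
    state += 0.2 * (u - state)
    # Noisy measurement of the process variable.
    measurement = state + random.gauss(0.0, 0.02)
    # Observer: blend the model estimate with the measurement.
    estimate += k_obs * (measurement - estimate)
```

Driving the feedback from the filtered estimate rather than the raw measurement is what keeps sensor noise from being amplified into the remelting inputs.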
NASA Technical Reports Server (NTRS)
Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake
2010-01-01
The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.
Use of multi-node wells in the Groundwater-Management Process of MODFLOW-2005 (GWM-2005)
Ahlfeld, David P.; Barlow, Paul M.
2013-01-01
Many groundwater wells are open to multiple aquifers or to multiple intervals within a single aquifer. These types of wells can be represented in numerical simulations of groundwater flow by use of the Multi-Node Well (MNW) Packages developed for the U.S. Geological Survey’s MODFLOW model. However, previous versions of the Groundwater-Management (GWM) Process for MODFLOW did not allow the use of multi-node wells in groundwater-management formulations. This report describes modifications to the MODFLOW–2005 version of the GWM Process (GWM–2005) to provide for such use with the MNW2 Package. Multi-node wells can be incorporated into a management formulation as flow-rate decision variables for which optimal withdrawal or injection rates will be determined as part of the GWM–2005 solution process. In addition, the heads within multi-node wells can be used as head-type state variables, and, in that capacity, be included in the objective function or constraint set of a management formulation. Simple head bounds also can be defined to constrain water levels at multi-node wells. The report provides instructions for including multi-node wells in the GWM–2005 data-input files and a sample problem that demonstrates use of multi-node wells in a typical groundwater-management problem.
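In the spirit of a GWM formulation, flow-rate decision variables and head-type constraints form a linear program. The toy example below invents the well count, response coefficients, and bounds, and assumes a linear (superposable) relation between pumping rates and drawdown; it is not GWM-2005's actual solution procedure.

```python
from scipy.optimize import linprog

# Drawdown response (m per unit pumping rate) at two control locations
# for two wells; coefficients are invented, assuming linear superposition.
response = [[0.004, 0.002],   # drawdown at control location 1
            [0.001, 0.005]]   # drawdown at control location 2
max_drawdown = [2.0, 2.0]     # head-constraint bounds (m)

# Maximize total withdrawal q1 + q2; linprog minimizes, so negate the objective.
res = linprog(c=[-1.0, -1.0], A_ub=response, b_ub=max_drawdown,
              bounds=[(0, 400), (0, 400)])
optimal_rates = res.x
```

Head-type state variables enter as the rows of `A_ub`: each row caps the simulated drawdown (or, equivalently, bounds the head) at one control location, which mirrors the head bounds GWM-2005 allows at multi-node wells.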
Quantum information processing with a travelling wave of light
NASA Astrophysics Data System (ADS)
Serikawa, Takahiro; Shiozawa, Yu; Ogawa, Hisashi; Takanashi, Naoto; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira
2018-02-01
We exploit quantum information processing on a traveling wave of light, expecting emancipation from thermal noise, easy coupling to fiber communication, and potentially high operation speed. Although optical memories are technically challenging, we have an alternative approach to applying multi-step operations on traveling light, namely continuous-variable one-way computation. So far our achievements include generation of a one-million-mode entangled chain in the time domain, mode engineering of nonlinear resource states, and real-time nonlinear feedforward. Although these are implemented with free-space optics, we are also investigating photonic integration and have performed quantum teleportation with a passive linear waveguide chip as a demonstration of entangling, measurement, and feedforward. We also suggest a loop-based architecture as another model of continuous-variable computing.
Riddell, Michaela A; Edwards, Nancy; Thompson, Simon R; Bernabe-Ortiz, Antonio; Praveen, Devarsetty; Johnson, Claire; Kengne, Andre P; Liu, Peter; McCready, Tara; Ng, Eleanor; Nieuwlaat, Robby; Ovbiagele, Bruce; Owolabi, Mayowa; Peiris, David; Thrift, Amanda G; Tobe, Sheldon; Yusoff, Khalid
2017-03-15
The imperative to improve global health has prompted transnational research partnerships to investigate common health issues on a larger scale. The Global Alliance for Chronic Diseases (GACD) is an alliance of national research funding agencies. To enhance research funded by GACD members, this study aimed to standardise data collection methods across the 15 GACD hypertension research teams and evaluate the uptake of these standardised measurements. Furthermore, we describe concerns and difficulties associated with the data harmonisation process highlighted and debated during annual meetings of the GACD-funded investigators. With these concerns and issues in mind, a working group comprising representatives from the 15 studies iteratively identified and proposed a set of common measures for inclusion in each of the teams' data collection plans. One year later all teams were asked which consensus measures had been implemented. Important issues were identified during the data harmonisation process relating to data ownership, sharing methodologies and ethical concerns. Measures were assessed across eight domains: demographic; dietary; clinical and anthropometric; medical history; hypertension knowledge; physical activity; behavioural (smoking and alcohol); and biochemical. Identifying validated measures relevant across a variety of settings presented some difficulties. The resulting GACD hypertension data dictionary comprises 67 consensus measures. Of the 14 responding teams, only two teams were including more than 50 consensus variables, five teams were including between 25 and 50 consensus variables and four teams were including between 6 and 24 consensus variables; one team did not provide details of the variables collected and two teams did not include any of the consensus variables, as the project had already commenced or the measures were not relevant to their study. Deriving consensus measures across diverse research projects and contexts was challenging.
The major barrier to their implementation was related to the time taken to develop and present these measures. Inclusion of consensus measures into future funding announcements would facilitate researchers integrating these measures within application protocols. We suggest that adoption of consensus measures developed here, across the field of hypertension, would help advance the science in this area, allowing for more comparable data sets and generalizable inferences.
Water Rockets and Indirect Measurement.
ERIC Educational Resources Information Center
Inman, Duane
1997-01-01
Describes an activity that teaches a number of scientific concepts including indirect measurement, Newton's third law of motion, manipulating and controlling variables, and the scientific method of inquiry. Uses process skills such as observation, inference, prediction, mensuration, and communication as well as problem solving and higher-order…
Gilerson, Alexander; Carrizo, Carlos; Foster, Robert; Harmel, Tristan
2018-04-16
The value and spectral dependence of the reflectance coefficient (ρ) of skylight from wind-roughened ocean surfaces are critical for determining accurate water-leaving radiance and remote sensing reflectances from shipborne, AERONET-Ocean Color and satellite observations. Using a vector radiative transfer code, spectra of the reflectance coefficient and corresponding radiances near the ocean surface and at the top of the atmosphere (TOA) are simulated for a broad range of parameters including flat and windy ocean surfaces with wind speeds up to 15 m/s, aerosol optical thicknesses of 0-1 at 440 nm, wavelengths of 400-900 nm, and variable Sun and viewing zenith angles. Results revealed a profound impact of the aerosol load and type on the spectral values of ρ. Such impacts, not yet included in standard processing, may produce significant inaccuracies in the reflectance spectra retrieved from above-water radiometry and satellite observations. Implications for satellite cal/val activities as well as potential changes in measurement and data processing schemes are discussed.
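The role of ρ can be seen in the standard above-water retrieval, where the sky radiance reflected at the surface is subtracted from the total surface radiance. A minimal sketch (the 0.028 default is the commonly used flat-sea value; all radiance and irradiance numbers are illustrative):

```python
# Standard above-water scheme: the water-leaving signal is the total
# surface radiance minus the surface-reflected sky radiance, scaled by
# downwelling irradiance.  Units: radiances in W m^-2 sr^-1 nm^-1,
# irradiance in W m^-2 nm^-1; result in sr^-1.
def remote_sensing_reflectance(L_t, L_sky, E_d, rho=0.028):
    """Rrs = (L_t - rho * L_sky) / E_d."""
    return (L_t - rho * L_sky) / E_d

# A ~10% aerosol-driven change in rho visibly shifts the retrieved Rrs,
# which is why its spectral dependence matters for cal/val.
rrs_base = remote_sensing_reflectance(L_t=0.05, L_sky=0.5, E_d=1.4)
rrs_high = remote_sensing_reflectance(L_t=0.05, L_sky=0.5, E_d=1.4, rho=0.0308)
print(rrs_base, rrs_high)
```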
Stability of mycotoxins during food processing.
Bullerman, Lloyd B; Bianchini, Andreia
2007-10-20
The mycotoxins that commonly occur in cereal grains and other products are not completely destroyed during food processing operations and can contaminate finished processed foods. The mycotoxins most commonly associated with cereal grains are aflatoxins, ochratoxin A, fumonisins, deoxynivalenol and zearalenone. The various food processes that may have effects on mycotoxins include sorting, trimming, cleaning, milling, brewing, cooking, baking, frying, roasting, canning, flaking, alkaline cooking, nixtamalization, and extrusion. Most of the food processes have variable effects on mycotoxins, with those that utilize the highest temperatures having the greatest effects. In general the processes reduce mycotoxin concentrations significantly, but do not eliminate them completely. However, roasting and extrusion processing show promise for lowering mycotoxin concentrations, though very high temperatures are needed to bring about much of a reduction in mycotoxin concentrations. Extrusion processing at temperatures greater than 150 degrees C is needed to give good reduction of zearalenone, moderate reduction of aflatoxins, variable to low reduction of deoxynivalenol and good reduction of fumonisins. The greatest reductions of fumonisins occur at extrusion temperatures of 160 degrees C or higher and in the presence of glucose. Extrusion of fumonisin-contaminated corn grits with 10% added glucose resulted in 75-85% reduction in fumonisin B(1) levels. Some fumonisin degradation products are formed during extrusion, including small amounts of hydrolyzed fumonisin B(1) and N-(carboxymethyl)-fumonisin B(1), and somewhat higher amounts of N-(1-deoxy-D-fructos-1-yl)-fumonisin B(1) in extruded grits containing added glucose. Feeding-trial toxicity tests in rats with extruded fumonisin-contaminated corn grits show some reduction in toxicity of grits extruded with glucose.
Cahyadi, Christine; Heng, Paul Wan Sia; Chan, Lai Wah
2011-03-01
The aim of this study was to identify and optimize the critical process parameters of the newly developed Supercell quasi-continuous coater for optimal tablet coat quality. Design of experiments, aided by multivariate analysis techniques, was used to quantify the effects of various coating process conditions and their interactions on the quality of film-coated tablets. The process parameters varied included batch size, inlet temperature, atomizing pressure, plenum pressure, spray rate and coating level. An initial screening stage was carried out using a 2^(6-1) resolution IV fractional factorial design. Following these preliminary experiments, an optimization study was carried out using the Box-Behnken design. Main response variables measured included drug-loading efficiency, coat thickness variation, and the extent of tablet damage. Apparent optimum conditions were determined by using response surface plots. The process parameters exerted various effects on the different response variables. Hence, trade-offs between individual optima were necessary to obtain the best compromised set of conditions. The adequacy of the optimized process conditions in meeting the combined goals for all responses was indicated by the composite desirability value. By using response surface methodology and optimization, coating conditions which produced coated tablets of high drug-loading efficiency, low incidences of tablet damage and low coat thickness variation were defined. Optimal conditions were found to vary over a large spectrum when different responses were considered. Changes in processing parameters across the design space did not result in drastic changes to coat quality, thereby demonstrating robustness in the Supercell coating process. © 2010 American Association of Pharmaceutical Scientists
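The response-surface step above amounts to fitting a quadratic model to designed runs and locating its optimum. A minimal two-factor sketch (the study used more factors and a Box-Behnken design; here a face-centered two-factor layout with synthetic responses stands in, and the factor labels are assumptions):

```python
import numpy as np

# Fit a two-factor quadratic response surface
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to coded process settings.  Responses are synthetic, not study data.
x1 = np.array([-1, 1, -1, 1, 0, 0, 0, -1, 1])    # e.g. coded spray rate
x2 = np.array([-1, -1, 1, 1, 0, -1, 1, 0, 0])    # e.g. coded inlet temp
y = np.array([80, 84, 82, 90, 95, 88, 91, 87, 93.0])  # drug-loading eff. (%)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(a, b):
    return coef @ np.array([1.0, a, b, a * a, b * b, a * b])

# Crude grid search for the apparent optimum inside the coded region,
# mimicking what a response-surface plot shows visually.
grid = np.linspace(-1, 1, 201)
best = max((predict(a, b), a, b) for a in grid for b in grid)
print(best)
```

In practice each response gets its own surface, and a composite desirability function trades them off, as the abstract describes.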
Accelerated design of bioconversion processes using automated microscale processing techniques.
Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M
2003-01-01
Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.
Unsteady Aerodynamic Testing Using the Dynamic Plunge Pitch and Roll Model Mount
NASA Technical Reports Server (NTRS)
Lutze, Frederick H.; Fan, Yigang
1999-01-01
A final report on the DyPPiR tests that were run is presented. Essentially it consists of two parts: a description of the data reduction techniques, and the results. Three data reduction methods were considered: 1) signal processing of wind-on minus wind-off data; 2) using wind-on data in conjunction with accelerometer measurements; and 3) using a dynamic model of the sting to predict the sting oscillations and determining the aerodynamic inputs using an optimization process. After trying all three, we settled on method 1, mainly because of its simplicity and our confidence in its accuracy. The results section consists of time history plots of the input variables (angle of attack, roll angle, and/or plunge position) and the corresponding time histories of the output variables C(sub L), C(sub D), C(sub Y), C(sub l), C(sub m), C(sub n). Also included are some phase plots of one or more of the output variables vs. an input variable. Of typical interest is pitching-moment coefficient vs. angle of attack for an oscillatory motion, where the hysteresis loops can be observed. These plots are useful for determining the "more interesting" cases. Samples of the data as they appear on the disk are presented at the end of the report. The last maneuver, a rolling pull-up, is indicative of the unique capabilities of the DyPPiR, allowing combinations of motions to be exercised at the same time.
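The wind-on/wind-off method can be sketched in a few lines: the wind-off run of the same commanded motion captures the inertial and sting tares, and subtracting it leaves the aerodynamic contribution. All signals below are synthetic stand-ins, not DyPPiR data:

```python
import numpy as np

# Method 1 in miniature: phase-locked subtraction of a wind-off tare
# run from a wind-on run for the same forced oscillation.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 2001)                     # time (s)
inertial = 0.50 * np.sin(2 * np.pi * 3 * t)         # tare, present in both runs
aero_true = 0.20 * np.sin(2 * np.pi * 3 * t - 0.4)  # aerodynamic part we want

wind_off = inertial + 0.01 * rng.standard_normal(t.size)
wind_on = inertial + aero_true + 0.01 * rng.standard_normal(t.size)

aero_est = wind_on - wind_off   # tare cancels; aerodynamic signal remains
print(np.max(np.abs(aero_est - aero_true)))
```

The residual is just the (small) difference of the two runs' sensor noise, which is why the report could rely on this method's simplicity.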
Mechanisms driving variability in the ocean forcing of Pine Island Glacier
Webber, Benjamin G. M.; Heywood, Karen J.; Stevens, David P.; Dutrieux, Pierre; Abrahamsen, E. Povl; Jenkins, Adrian; Jacobs, Stanley S.; Ha, Ho Kyung; Lee, Sang Hoon; Kim, Tae Wan
2017-01-01
Pine Island Glacier (PIG) terminates in a rapidly melting ice shelf, and ocean circulation and temperature are implicated in the retreat and growing contribution to sea level rise of PIG and nearby glaciers. However, the variability of the ocean forcing of PIG has been poorly constrained due to a lack of multi-year observations. Here we show, using a unique record close to the Pine Island Ice Shelf (PIIS), that there is considerable oceanic variability at seasonal and interannual timescales, including a pronounced cold period from October 2011 to May 2013. This variability can be largely explained by two processes: cumulative ocean surface heat fluxes and sea ice formation close to PIIS; and interannual reversals in ocean currents and associated heat transport within Pine Island Bay, driven by a combination of local and remote forcing. Local atmospheric forcing therefore plays an important role in driving oceanic variability close to PIIS. PMID:28211473
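The cumulative surface-heat-flux mechanism above can be quantified with the usual mixed-layer heat budget, dT = Q dt / (rho cp h). A back-of-envelope sketch with illustrative values (the layer thickness and flux are assumptions, not the paper's numbers):

```python
# Temperature change of an ocean layer driven by a sustained net
# surface heat flux: dT = Q * dt / (rho * cp * h).  Values illustrative.
rho, cp = 1027.0, 3986.0   # seawater density (kg/m^3), heat capacity (J/kg/K)
h = 400.0                  # affected layer thickness (m, assumed)
q_net = -20.0              # mean net surface heat flux (W/m^2), cooling
seconds = 365 * 86400      # one year of forcing

dT = q_net * seconds / (rho * cp * h)
print(round(dT, 3))        # cooling of a few tenths of a degree per year
```

Sustained over roughly a year and a half, cooling of this order is consistent in magnitude with a pronounced cold anomaly like the 2011-2013 period described.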
Okruszek, Łukasz; Dolan, Kirsty; Lawrence, Megan; Cella, Matteo
2017-10-01
There is a long-standing debate on the influence of physiological signals on social behavior. Recent studies suggested that heart rate variability (HRV) may be a marker of social cognitive processes. However, this evidence is preliminary and limited to laboratory studies. In this study, 25 participants were assessed with a social cognition battery and asked to wear a wearable device measuring HRV for 6 consecutive days. The results showed that reduced HRV correlated with higher hostility attribution bias. However, no relationship was found between HRV and other social cognitive measures including facial emotion recognition, theory of mind or emotional intelligence. These results suggest that HRV may be linked to specific social cognitive processes requiring online emotional processing, in particular those related to social threat. These findings are discussed in the context of the neurovisceral integration model.
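HRV in studies like this is typically summarized by a time-domain index such as RMSSD over the beat-to-beat (RR) intervals. A minimal sketch (the RR values are illustrative, not study data, and the wearable's own index may differ):

```python
import math

# RMSSD: root mean square of successive differences of RR intervals (ms),
# a common vagally mediated time-domain HRV index.
def rmssd(rr_ms):
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 818, 842, 809]   # illustrative RR intervals (ms)
print(round(rmssd(rr), 1))
```

Lower values of such an index correspond to the "reduced HRV" that correlated with hostility attribution bias in the study.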
Rodriguez, Hayley; Kissell, Kellie; Lucas, Lloyd; Fisak, Brian
2017-11-01
Although negative beliefs have been found to be associated with worry symptoms and depressive rumination, negative beliefs have yet to be examined in relation to post-event processing and social anxiety symptoms. The purpose of the current study was to examine the psychometric properties of the Negative Beliefs about Post-Event Processing Questionnaire (NB-PEPQ). A large, non-referred undergraduate sample completed the NB-PEPQ along with validation measures, including a measure of post-event processing and social anxiety symptoms. Based on factor analysis, a single-factor model was obtained, and the NB-PEPQ was found to exhibit good validity, including positive associations with measures of post-event processing and social anxiety symptoms. These findings add to the literature on the metacognitive variables that may lead to the development and maintenance of post-event processing and social anxiety symptoms, and have relevant clinical applications.
Space Shuttle ET Friction Stir Weld Machines
NASA Technical Reports Server (NTRS)
Thompson, Jack M.
2003-01-01
NASA and Lockheed-Martin approached the FSW machine vendor community with a specification for longitudinal barrel production FSW weld machines and a shorter-travel process development machine in June of 2000. This specification was based on three years of FSW process development on the Space Shuttle External Tank alloys, Al 2195-T8M4 and Al 2219-T87. The primary motivations for changing the ET longitudinal welds from the existing variable polarity plasma arc weld process included: (1) Significantly reduced weld defect rates and related reduction in cycle time and uncertainty; (2) Many fewer process variables to control (5 vs. 17); (3) Fewer manufacturing steps; (4) Lower residual stresses and distortion; (5) Improved weld strengths, particularly at cryogenic temperatures; (6) Fewer hazards to production personnel. General Tool was the successful bidder. As of this writing, the equipment is installed and welding flight hardware. This paper is a means of sharing with the rest of the FSW community the unique features developed to assure NASA/L-M of successful production welds.
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
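Two of the proposed domains are easy to illustrate on a single time-series: a statistical measure (coefficient of variation) and an informational one (Shannon entropy of the binned signal). A minimal sketch on a synthetic series (the series and bin count are arbitrary choices, not from the review):

```python
import numpy as np

# One synthetic series, two variability domains.
rng = np.random.default_rng(1)
x = 800 + 40 * rng.standard_normal(500)   # e.g. RR intervals (ms)

# Statistical domain: coefficient of variation (dimensionless).
cv = np.std(x) / np.mean(x)

# Informational domain: Shannon entropy of the binned signal (bits).
counts, _ = np.histogram(x, bins=16)
p = counts[counts > 0] / counts.sum()
entropy = -(p * np.log2(p)).sum()

print(cv, entropy)
```

Geometric, energetic, and invariant measures follow the same pattern: a transform of the series followed by a scalar summary, which is the calculation process the review discusses.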
Goodman, Geoff; Chung, Hyewon; Fischel, Leah; Athey-Lloyd, Laura
2017-07-01
This study examined the sequential relations among three pertinent variables in child psychotherapy: therapeutic alliance (TA) (including ruptures and repairs), autism symptoms, and adherence to child-centered play therapy (CCPT) process. A 2-year CCPT of a 6-year-old Caucasian boy diagnosed with autism spectrum disorder was conducted weekly with two doctoral-student therapists, working consecutively for 1 year each, in a university-based community mental-health clinic. Sessions were video-recorded and coded using the Child Psychotherapy Process Q-Set (CPQ), a measure of the TA, and an autism symptom measure. Sequential relations among these variables were examined using simulation modeling analysis (SMA). In Therapist 1's treatment, unexpectedly, autism symptoms decreased three sessions after a rupture occurred in the therapeutic dyad. In Therapist 2's treatment, adherence to CCPT process increased 2 weeks after a repair occurred in the therapeutic dyad. The TA decreased 1 week after autism symptoms increased. Finally, adherence to CCPT process decreased 1 week after autism symptoms increased. The authors concluded that (1) sequential relations differ by therapist even though the child remains constant, (2) therapeutic ruptures can have an unexpected effect on autism symptoms, and (3) changes in autism symptoms can precede as well as follow changes in process variables.
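The core of a simulation-modeling-analysis style test is a lagged cross-correlation between two session-by-session series, compared against a shuffle distribution. A minimal sketch with synthetic stand-ins (the series names, lag, and link strength are illustrative, not the study's data or the full SMA procedure):

```python
import numpy as np

# Does series A "lead" series B by k sessions?  Correlate A with B
# shifted by k, then compare to a permutation (shuffle) null.
rng = np.random.default_rng(42)
n = 80
ruptures = rng.standard_normal(n)                                  # series A
symptoms = np.roll(ruptures, 3) * 0.6 + rng.standard_normal(n) * 0.4  # lag-3 link

def lagged_r(x, y, lag):
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

obs = lagged_r(ruptures, symptoms, 3)
null = [lagged_r(rng.permutation(ruptures), symptoms, 3) for _ in range(2000)]
p = np.mean(np.abs(null) >= abs(obs))
print(obs, p)
```

A small p here means the lagged association is unlikely under shuffling, the same logic behind statements like "symptoms decreased three sessions after a rupture."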
Process and representation in graphical displays
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne
1990-01-01
How people comprehend graphics is examined. Graphical comprehension involves the cognitive representation of information from a graphic display and the processing strategies that people apply to answer questions about graphics. Research on representation has examined both the features present in a graphic display and the cognitive representation of the graphic. The key features include the physical components of a graph, the relation between the figure and its axes, and the information in the graph. Tests of people's memory for graphs indicate that both the physical and informational aspect of a graph are important in the cognitive representation of a graph. However, the physical (or perceptual) features overshadow the information to a large degree. Processing strategies also involve a perception-information distinction. In order to answer simple questions (e.g., determining the value of a variable, comparing several variables, and determining the mean of a set of variables), people switch between two information processing strategies: (1) an arithmetic, look-up strategy in which they use a graph much like a table, looking up values and performing arithmetic calculations; and (2) a perceptual strategy in which they use the spatial characteristics of the graph to make comparisons and estimations. The user's choice of strategies depends on the task and the characteristics of the graph. A theory of graphic comprehension is presented.
NASA Astrophysics Data System (ADS)
Nugraha, M. G.; Utari, S.; Saepuzaman, D.; Nugraha, F.
2018-05-01
Scientific process skills (SPS) are intellectual skills used to build knowledge, solve problems scientifically, and train thinking skills; they are a very important part of the inquiry process and contribute to scientific literacy. Therefore, SPS is very important to develop. This study aims to develop Student Worksheets (SW) that can trace SPS through basic physics experiments (BPE) on Melde's law. This research uses the R&D method, involving 18 physics education department students who take the BPE course as a sample. The research instrument uses an SW designed with an SPS approach, reviewed and judged by experts, which includes observing, communicating, classifying, measuring, inferring, predicting, identifying variables, constructing hypotheses, defining variables operationally, designing experiments, and acquiring and processing data to reach conclusions. The results show that the students' SPS have not been trained optimally: the students' answers were derived not from the observations and experiments conducted but from their initial knowledge, particularly in determining experimental variables, inferring, and constructing hypotheses. This result is also supported by a low increase of conceptual content on Melde's law, with an n-gain of 0.40. The research findings are used as the basis for the redesign of the SW.
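The reported n-gain is the Hake normalized gain, which scales the pre-to-post improvement by the maximum possible improvement. A minimal sketch (the 40%/64% scores are invented to reproduce g = 0.40, not the study's data):

```python
# Hake normalized gain from percent-correct scores:
#   g = (post - pre) / (100 - pre)
def normalized_gain(pre, post):
    return (post - pre) / (100.0 - pre)

# e.g. averaging 40% before and 64% after gives g = 0.40, which falls
# in the conventional "medium gain" band (0.3 <= g < 0.7).
print(normalized_gain(40.0, 64.0))
```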
Quantification of process variables for carbothermic synthesis of UC 1-xN x fuel microspheres
Lindemer, Terrance B.; Silva, Chinthaka M.; Henry, Jr, John James; ...
2016-11-05
This report details the continued investigation of process variables involved in converting sol-gel-derived, urania-carbon microspheres to ~820-μm-dia. UC1-xNx fuel kernels in flow-through, vertical Mo and W crucibles at temperatures up to 2123 K. Experiments included calcining of air-dried UO3-H2O-C microspheres in Ar and H2-containing gases, conversion of the resulting UO2-C kernels to dense UO2:2UC in the same gases and vacuum, and its conversion in N2 to UC1-xNx (x = ~0.85). The thermodynamics of the relevant reactions were applied extensively to interpret and control the process variables. Producing the precursor UO2:2UC kernel of ~96% theoretical density was required, but its subsequent conversion to UC1-xNx at 2123 K was not accompanied by sintering and resulted in ~83-86% of theoretical density. Increasing the UC1-xNx kernel nitride component to ~0.98 in flowing N2-H2 mixtures to evolve HCN was shown to be quantitatively consistent with present and past experiments and the only useful application of H2 in the entire process.
Quantification of process variables for carbothermic synthesis of UC1-xNx fuel microspheres
NASA Astrophysics Data System (ADS)
Lindemer, T. B.; Silva, C. M.; Henry, J. J.; McMurray, J. W.; Voit, S. L.; Collins, J. L.; Hunt, R. D.
2017-01-01
This report details the continued investigation of process variables involved in converting sol-gel-derived, urania-carbon microspheres to ∼820-μm-dia. UC1-xNx fuel kernels in flow-through, vertical Mo and W crucibles at temperatures up to 2123 K. Experiments included calcining of air-dried UO3-H2O-C microspheres in Ar and H2-containing gases, conversion of the resulting UO2-C kernels to dense UO2:2UC in the same gases and vacuum, and its conversion in N2 to UC1-xNx (x = ∼0.85). The thermodynamics of the relevant reactions were applied extensively to interpret and control the process variables. Producing the precursor UO2:2UC kernel of ∼96% theoretical density was required, but its subsequent conversion to UC1-xNx at 2123 K was not accompanied by sintering and resulted in ∼83-86% of theoretical density. Increasing the UC1-xNx kernel nitride component to ∼0.98 in flowing N2-H2 mixtures to evolve HCN was shown to be quantitatively consistent with present and past experiments and the only useful application of H2 in the entire process.
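The conversion steps described above can be summarized by schematic overall reactions (idealized stoichiometry; these are the standard carbothermic-reduction and nitriding reactions, stated here as an interpretation rather than taken verbatim from the report):

```latex
\begin{align}
  \mathrm{UO_2} + 3\,\mathrm{C} &\rightarrow \mathrm{UC} + 2\,\mathrm{CO(g)}
    && \text{carbothermic reduction} \\
  \mathrm{UC} + \tfrac{1}{2}\,\mathrm{N_2} &\rightarrow \mathrm{UN} + \mathrm{C}
    && \text{nitriding in } \mathrm{N_2} \\
  \mathrm{C} + \tfrac{1}{2}\,\mathrm{N_2} + \tfrac{1}{2}\,\mathrm{H_2}
    &\rightarrow \mathrm{HCN(g)}
    && \text{residual carbon removal in } \mathrm{N_2\text{-}H_2}
\end{align}
```

The third reaction is the route by which H2 in the N2-H2 mixture evolves HCN and drives the nitride component toward ~0.98, as the abstract describes.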
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
A Framework to Design and Optimize Chemical Flooding Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2006-08-31
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2004-11-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
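The economic module described in item (3) reduces, at its core, to discounting a simulated incremental production stream against chemical costs. A minimal sketch (all prices, costs, volumes, and the discount rate are illustrative placeholders, not values from the project):

```python
# Net present value of an enhanced-oil-recovery design: discount the
# simulator's incremental production revenue against the chemical cost.
def npv(cash_flows, discount_rate):
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

oil_price = 60.0        # $/bbl (assumed)
chemical_cost = 2.0e6   # up-front slug cost, $ (assumed)
incremental_bbl = [0, 40e3, 90e3, 70e3, 40e3, 20e3]  # bbl/yr from simulator

cash = [bbl * oil_price for bbl in incremental_bbl]
cash[0] -= chemical_cost
print(round(npv(cash, 0.10)))
```

Within the proposed framework, each point on a response surface would feed such a calculation, letting the design be screened on profitability rather than recovery alone.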
Individual Differences in Pain: Understanding the Mosaic that Makes Pain Personal
Fillingim, Roger B.
2016-01-01
The experience of pain is characterized by tremendous inter-individual variability. Multiple biological and psychosocial variables contribute to these individual differences in pain, including demographic variables, genetic factors, and psychosocial processes. For example, sex, age and ethnic group differences in the prevalence of chronic pain conditions have been widely reported. Moreover, these demographic factors have been associated with responses to experimentally-induced pain. Similarly, both genetic and psychosocial factors contribute to clinical and experimental pain responses. Importantly, these different biopsychosocial influences interact with each other in complex ways to sculpt the experience of pain. Some genetic associations with pain have been found to vary across sex and ethnic group. Moreover, genetic factors also interact with psychosocial factors, including stress and pain catastrophizing, to influence pain. The individual and combined influences of these biological and psychosocial variables result in a unique mosaic of factors that contributes to pain in each individual. Understanding these mosaics is critically important in order to provide optimal pain treatment, and future research to further elucidate the nature of these biopsychosocial interactions is needed in order to provide more informed and personalized pain care. PMID:27902569
From the Last Interglacial to the Anthropocene: Modelling a Complete Glacial Cycle (PalMod)
NASA Astrophysics Data System (ADS)
Brücher, Tim; Latif, Mojib
2017-04-01
We will give a short overview and update on the current status of the national climate modelling initiative PalMod (Paleo Modelling, www.palmod.de). PalMod focuses on understanding climate system dynamics and variability during the last glacial cycle. The initiative is funded by the German Federal Ministry of Education and Research (BMBF), and its specific topics are: (i) to identify and quantify the relative contributions of the fundamental processes which determined the Earth's climate trajectory and variability during the last glacial cycle, (ii) to simulate with comprehensive Earth System Models (ESMs) the climate from the peak of the last interglacial - the Eemian warm period - up to the present, including the changes in the spectrum of variability, and (iii) to assess possible future climate trajectories beyond this century during the next millennia with sophisticated ESMs tested in this way. The research is intended to be conducted over a period of 10 years, but with shorter funding cycles. PalMod kicked off in February 2016. The first phase focuses on the last deglaciation (approximately the last 23,000 years). From the ESM perspective, PalMod pushes forward model development by coupling ESMs with dynamical ice sheet models. Computer scientists work on speeding up climate models using different concepts (like parallelisation in time), and one working group is dedicated to performing a comprehensive data synthesis to validate model performance. The envisioned approach is innovative in three respects. First, the consortium aims at simulating a full glacial cycle in transient mode and with comprehensive ESMs which allow full interactions between the physical and biogeochemical components of the Earth system, including ice sheets.
Second, we shall address climate variability during the last glacial cycle on a large range of time scales, from interannual to multi-millennial, and attempt to quantify the relative contributions of external forcing and processes internal to the Earth system to climate variability at different time scales. Third, in order to achieve a higher level of understanding of natural climate variability at time scales of millennia, its governing processes and implications for the future climate, we bring together three different research communities: the Earth system modeling community, the proxy data community and the computational science community. The consortium consists of 18 partners including all major modelling centers within Germany. The funding comprises approximately 65 PostDoc positions and more than 120 scientists are involved. PalMod is coordinated at the Helmholtz Centre for Ocean Research Kiel (GEOMAR).
Liang, Shih-Hsiung; Walther, Bruno Andreas; Shieh, Bao-Sen
2017-01-01
Biological invasions have become a major threat to biodiversity, and identifying determinants underlying success at different stages of the invasion process is essential for both prevention management and testing ecological theories. To investigate variables associated with different stages of the invasion process in a local region such as Taiwan, potential problems using traditional parametric analyses include too many variables of different data types (nominal, ordinal, and interval) and a relatively small data set with too many missing values. We therefore used five decision tree models instead and compared their performance. Our dataset contains 283 exotic bird species which were transported to Taiwan; of these 283 species, 95 species escaped to the field successfully (introduction success); of these 95 introduced species, 36 species reproduced in the field of Taiwan successfully (establishment success). For each species, we collected 22 variables associated with human selectivity and species traits which may determine success during the introduction stage and establishment stage. For each decision tree model, we performed three variable treatments: (I) including all 22 variables, (II) excluding nominal variables, and (III) excluding nominal variables and replacing ordinal values with binary ones. Five performance measures were used to compare models, namely, area under the receiver operating characteristic curve (AUROC), specificity, precision, recall, and accuracy. The gradient boosting models performed best overall among the five decision tree models for both introduction and establishment success and across variable treatments. The most important variables for predicting introduction success were the bird family, the number of invaded countries, and variables associated with environmental adaptation, whereas the most important variables for predicting establishment success were the number of invaded countries and variables associated with reproduction. 
Our final optimal models achieved relatively high performance values, and we discuss differences in performance with regard to sample size and variable treatments. Our results showed that, for both the establishment model and introduction model, the number of invaded countries was the most important or second most important determinant, respectively. Therefore, we suggest that future success for introduction and establishment of exotic birds may be gauged by simply looking at previous success in invading other countries. Finally, we found that species traits related to reproduction were more important in establishment models than in introduction models; importantly, these determinants were not averaged but either minimum or maximum values of species traits. Therefore, we suggest that in addition to averaged values, reproductive potential represented by minimum and maximum values of species traits should be considered in invasion studies.
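The five performance measures named in this abstract (AUROC, specificity, precision, recall, and accuracy) can all be computed from predicted scores and true labels. A minimal pure-Python sketch with invented labels and scores, not the study's 283-species dataset:

```python
# Compute the five measures used to compare the decision tree models.
# AUROC is computed via the rank-sum (Mann-Whitney) formulation; the
# other four come from the confusion matrix at a 0.5 score threshold.
def metrics(y_true, y_pred, y_score):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
    tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)
    pos = [s for t, s in zip(y_true, y_score) if t]
    neg = [s for t, s in zip(y_true, y_score) if not t]
    auroc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
    return {"AUROC": auroc,
            "specificity": tn / (tn + fp),
            "precision": tp / (tp + fp),
            "recall": tp / (tp + fn),
            "accuracy": (tp + tn) / len(y_true)}

# hypothetical labels (1 = established) and model scores for 8 species
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_score = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.1, 0.7]
y_pred = [s >= 0.5 for s in y_score]
print(metrics(y_true, y_pred, y_score))
```

Reporting several complementary measures, as the study does, guards against a single metric masking poor performance on the rarer class.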
Liang, Shih-Hsiung; Walther, Bruno Andreas
2017-01-01
Background Biological invasions have become a major threat to biodiversity, and identifying determinants underlying success at different stages of the invasion process is essential for both prevention management and testing ecological theories. To investigate variables associated with different stages of the invasion process in a local region such as Taiwan, potential problems using traditional parametric analyses include too many variables of different data types (nominal, ordinal, and interval) and a relatively small data set with too many missing values. Methods We therefore used five decision tree models instead and compared their performance. Our dataset contains 283 exotic bird species which were transported to Taiwan; of these 283 species, 95 species escaped to the field successfully (introduction success); of these 95 introduced species, 36 species reproduced in the field of Taiwan successfully (establishment success). For each species, we collected 22 variables associated with human selectivity and species traits which may determine success during the introduction stage and establishment stage. For each decision tree model, we performed three variable treatments: (I) including all 22 variables, (II) excluding nominal variables, and (III) excluding nominal variables and replacing ordinal values with binary ones. Five performance measures were used to compare models, namely, area under the receiver operating characteristic curve (AUROC), specificity, precision, recall, and accuracy. Results The gradient boosting models performed best overall among the five decision tree models for both introduction and establishment success and across variable treatments. 
The most important variables for predicting introduction success were the bird family, the number of invaded countries, and variables associated with environmental adaptation, whereas the most important variables for predicting establishment success were the number of invaded countries and variables associated with reproduction. Discussion Our final optimal models achieved relatively high performance values, and we discuss differences in performance with regard to sample size and variable treatments. Our results showed that, for both the establishment model and introduction model, the number of invaded countries was the most important or second most important determinant, respectively. Therefore, we suggest that future success for introduction and establishment of exotic birds may be gauged by simply looking at previous success in invading other countries. Finally, we found that species traits related to reproduction were more important in establishment models than in introduction models; importantly, these determinants were not averaged but either minimum or maximum values of species traits. Therefore, we suggest that in addition to averaged values, reproductive potential represented by minimum and maximum values of species traits should be considered in invasion studies. PMID:28316893
den Besten, Heidy M W; Wells-Bennik, Marjon H J; Zwietering, Marcel H
2018-03-25
Heat treatments are widely used in food processing often with the aim of reducing or eliminating spoilage microorganisms and pathogens in food products. The efficacy of applying heat to control microorganisms is challenged by the natural diversity of microorganisms with respect to their heat robustness. This review gives an overview of the variations in heat resistances of various species and strains, describes modeling approaches to quantify heat robustness, and addresses the relevance and impact of the natural diversity of microorganisms when assessing heat inactivation. This comparison of heat resistances of microorganisms facilitates the evaluation of which (groups of) organisms might be troublesome in a production process in which heat treatment is critical to reducing the microbial contaminants, and also allows fine-tuning of the process parameters. Various sources of microbiological variability are discussed and compared for a range of species, including spore-forming and non-spore-forming pathogens and spoilage organisms. This benchmarking of variability factors gives crucial information about the most important factors that should be included in risk assessments to realistically predict heat inactivation of bacteria and spores as part of the measures for controlling shelf life and safety of food products.
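Modeling approaches for quantifying heat robustness, as discussed in this review, commonly start from the classical log-linear (first-order) inactivation model with D- and z-values. A minimal sketch; the reference values below are illustrative, not taken from the review:

```python
# Classical log-linear heat-inactivation model: D(T) is the decimal
# reduction time (minutes for a 10-fold reduction) at temperature T,
# and z is the temperature increase that changes D tenfold.
import math  # imported for completeness; not strictly needed below

def log_reduction(t_min, temp_c, d_ref=1.0, t_ref=121.1, z=10.0):
    """Log10 reduction achieved after t_min minutes at temp_c (deg C)."""
    d_t = d_ref * 10 ** ((t_ref - temp_c) / z)  # D-value at temp_c
    return t_min / d_t

# e.g. 3 minutes at 121.1 C with D_ref = 1 min gives a 3-log reduction
print(log_reduction(3, 121.1))  # → 3.0
```

Strain-to-strain variability of the kind benchmarked in the review enters through the spread of D_ref and z values across isolates of a species.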
NASA Astrophysics Data System (ADS)
Urban, F. E.; Clow, G. D.; Meares, D. C.
2004-12-01
Observations of long-term climate and surficial geological processes are sparse in most of the Arctic, despite the fact that this region is highly sensitive to climate change. Instrumental networks that monitor the interplay of climatic variability and geological/cryospheric processes are a necessity for documenting and understanding climate change, and improvements to the spatial coverage and temporal scale of Arctic climate data are in progress. The USGS, in collaboration with the Bureau of Land Management (BLM) and the Fish and Wildlife Service (FWS), currently maintains two types of monitoring networks in northern Alaska: (1) a 15-site network of continuously operating active-layer and climate monitoring stations, and (2) a 21-element array of deep boreholes in which the thermal state of deep permafrost is monitored. Here, we focus on the USGS Alaska Active Layer and Climate Monitoring Network (AK-CLIM). These 15 stations are deployed in longitudinal transects that span Alaska north of the Brooks Range (11 in the National Petroleum Reserve-Alaska (NPRA) and 4 in the Arctic National Wildlife Refuge (ANWR)). An overview and update of the USGS AK-CLIM network is presented, including insight into current data, processing and analysis software, and plans for data telemetry. Data collection began in 1998, and parameters currently measured include air temperature, soil temperatures (5-120 cm), snow depth, incoming and reflected short-wave radiation, soil moisture (15 cm), and wind speed and direction. Custom processing and analysis software has been written that calculates additional parameters such as active-layer thaw depth, thawing degree days, albedo, cloudiness, and duration of seasonal snow cover. Data from selected AK-CLIM stations are now temporally sufficient to begin identifying trends, anomalies, and inter-annual variability in the climate of northern Alaska.
Effect of electron-beam deposition process variables on the film characteristics of the CrOx films
NASA Astrophysics Data System (ADS)
Chiu, Po-kai; Liao, Yi-Ting; Tsai, Hung-Yin; Chiang, Donyau
2018-02-01
The film characteristics and optical properties of chromium oxide films prepared on glass substrates by electron-beam deposition with different process variables were investigated. The process variables included various oxygen flow rates, different applied substrate temperatures, and preparation in an Ar or O2 environment with and without ion-assisted deposition. The optical constants of the deposited films were determined from reflectance and transmittance measurements obtained using a spectrophotometer at wavelengths ranging from 350 nm to 2000 nm. The microstructures of the films were examined by XRD, SEM, and XPS, and the electrical conductivity was measured with a four-point probe instrument. All the prepared films are amorphous, and the films are dense and uniform, with no pillar structure observed. The refractive index of the deposited films decreases with increasing oxygen flow rate across the studied wavelengths, and the extinction coefficients show the same trend in the UV/Vis range. Increasing the substrate temperature to 200 °C increases both the refractive index and the extinction coefficient, whereas substrate temperatures below 150 °C have a negligible effect on the optical constants. The optical and electrical properties of the prepared CrOx films are explained by the XPS results, which decompose the envelope of the chromium electron energy spectrum into contributions from metallic Cr and the oxides CrO2 and Cr2O3. The ratio of the area contributed by metallic Cr to that contributed by the oxides expresses the concentration ratio of free electrons to covalent bonds in the deposited films, and this ratio is used to explain the film characteristics, including the optical constants and sheet resistance.
Policy Making for American Education.
ERIC Educational Resources Information Center
Campbell, Roald F.; Layton, Donald H.
This monograph studies the policy-making process of public schools at the elementary and secondary levels. Variables include legislative bodies and courts at the local, State, and national levels and special interest groups (e.g., labor unions and religious bodies, professional educators, and qualified voters). Public expectations have contributed…
Processes Affecting the Annual Surface Energy Budget at High-Latitude Terrestrial Sites
NASA Astrophysics Data System (ADS)
Persson, P. O. G.; Stone, R. S.; Grachev, A.; Matrosova, L.
2012-04-01
Instrumentation at four Study of Environmental Arctic Change (SEARCH) sites (Barrow, Eureka, Alert, and Tiksi) has been enhanced in the past 6 years, including during the 2007-2008 IPY. Data from these sites are used to investigate the annual cycle of the surface energy budget (SEB), its coupling to atmospheric processes, and, for Alert, its interannual variability. The comprehensive data sets are useful for showing interactions between the atmosphere, surface, and soil at high temporal resolution throughout the annual cycle. Processes that govern the SEB variability at each site are identified, and their impacts on the SEB are quantified. For example, mesoscale modulation of the SEB caused by forcing from the local terrain (downslope wind events) and coastlines (sea and land breezes) is significant at Alert and Eureka, with these processes affecting the radiative, turbulent, and ground heat flux terms in the SEB. Sub-seasonal and interannual variations in atmospheric processes and the SEB impact soil thermal structures, such as the depth and timing of the summer active layer. These analyses provide an improved understanding of the processes producing changes in surface and soil temperature, linking them through the SEB as affected by atmospheric processes.
Origins of extrinsic variability in eukaryotic gene expression
NASA Astrophysics Data System (ADS)
Volfson, Dmitri; Marciniak, Jennifer; Blake, William J.; Ostroff, Natalie; Tsimring, Lev S.; Hasty, Jeff
2006-02-01
Variable gene expression within a clonal population of cells has been implicated in a number of important processes including mutation and evolution, determination of cell fates and the development of genetic disease. Recent studies have demonstrated that a significant component of expression variability arises from extrinsic factors thought to influence multiple genes simultaneously, yet the biological origins of this extrinsic variability have received little attention. Here we combine computational modelling with fluorescence data generated from multiple promoter-gene inserts in Saccharomyces cerevisiae to identify two major sources of extrinsic variability. One unavoidable source arising from the coupling of gene expression with population dynamics leads to a ubiquitous lower limit for expression variability. A second source, which is modelled as originating from a common upstream transcription factor, exemplifies how regulatory networks can convert noise in upstream regulator expression into extrinsic noise at the output of a target gene. Our results highlight the importance of the interplay of gene regulatory networks with population heterogeneity for understanding the origins of cellular diversity.
Origins of extrinsic variability in eukaryotic gene expression
NASA Astrophysics Data System (ADS)
Volfson, Dmitri; Marciniak, Jennifer; Blake, William J.; Ostroff, Natalie; Tsimring, Lev S.; Hasty, Jeff
2006-03-01
Variable gene expression within a clonal population of cells has been implicated in a number of important processes including mutation and evolution, determination of cell fates and the development of genetic disease. Recent studies have demonstrated that a significant component of expression variability arises from extrinsic factors thought to influence multiple genes in concert, yet the biological origins of this extrinsic variability have received little attention. Here we combine computational modeling with fluorescence data generated from multiple promoter-gene inserts in Saccharomyces cerevisiae to identify two major sources of extrinsic variability. One unavoidable source arising from the coupling of gene expression with population dynamics leads to a ubiquitous noise floor in expression variability. A second source which is modeled as originating from a common upstream transcription factor exemplifies how regulatory networks can convert noise in upstream regulator expression into extrinsic noise at the output of a target gene. Our results highlight the importance of the interplay of gene regulatory networks with population heterogeneity for understanding the origins of cellular diversity.
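The separation of expression variability into intrinsic and extrinsic components referred to in these abstracts is conventionally estimated with dual-reporter statistics, in which an extrinsic factor affects both reporters in concert. An illustrative simulation on synthetic data, not the S. cerevisiae measurements:

```python
# Dual-reporter (Elowitz-style) noise decomposition on synthetic data:
# a shared lognormal "upstream factor" drives two reporters in concert
# (extrinsic noise), each with its own smaller intrinsic fluctuation.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                   # cells in the clonal population
extrinsic = rng.lognormal(0.0, 0.3, n)        # shared upstream factor per cell
g1 = extrinsic * rng.lognormal(0.0, 0.1, n)   # reporter 1, intrinsic sigma 0.1
g2 = extrinsic * rng.lognormal(0.0, 0.1, n)   # reporter 2, same promoter

# squared intrinsic noise: uncorrelated reporter differences
eta_int2 = np.mean((g1 - g2) ** 2) / (2 * g1.mean() * g2.mean())
# squared extrinsic noise: covariance between the two reporters
eta_ext2 = (np.mean(g1 * g2) - g1.mean() * g2.mean()) / (g1.mean() * g2.mean())
print(round(float(eta_int2), 3), round(float(eta_ext2), 3))
```

By construction the extrinsic component dominates here, mirroring the finding that a significant share of expression variability is extrinsic.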
Optimization of a GO2/GH2 Impinging Injector Element
NASA Technical Reports Server (NTRS)
Tucker, P. Kevin; Shyy, Wei; Vaidyanathan, Rajkumar
2001-01-01
An injector optimization methodology, method i, is used to investigate optimal design points for a gaseous oxygen/gaseous hydrogen (GO2/GH2) impinging injector element. The unlike impinging element, a fuel-oxidizer-fuel (F-O-F) triplet, is optimized in terms of design variables such as fuel pressure drop, (Delta)P(sub f), oxidizer pressure drop, (Delta)P(sub o), combustor length, L(sub comb), and impingement half-angle, alpha, for a given mixture ratio and chamber pressure. Dependent variables such as energy release efficiency, ERE, wall heat flux, Q(sub w), injector heat flux, Q(sub inj), relative combustor weight, W(sub rel), and relative injector cost, C(sub rel), are calculated and then correlated with the design variables. An empirical design methodology is used to generate these responses for 163 combinations of input variables. Method i is then used to generate response surfaces for each dependent variable. Desirability functions based on dependent variable constraints are created and used to facilitate development of composite response surfaces representing some, or all, of the five dependent variables in terms of the input variables. Three examples illustrating the utility and flexibility of method i are discussed in detail. First, joint response surfaces are constructed by sequentially adding dependent variables. Optimum designs are identified after the addition of each variable, and the effect each variable has on the design is shown. This stepwise demonstration also highlights the importance of including variables such as weight and cost early in the design process. Second, using the composite response surface that includes all five dependent variables, unequal weights are assigned to emphasize certain variables relative to others. Here, method i is used to enable objective trade studies on design issues such as component life and thrust-to-weight ratio.
Finally, specific variable weights are further increased to illustrate the high marginal cost of realizing the last increment of injector performance and thruster weight.
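The desirability-function machinery described in this abstract can be sketched as individual desirabilities in [0, 1] combined by a weighted geometric mean, with unequal weights emphasizing some dependent variables over others. The variable ranges and weights below are illustrative, not the study's actual response-surface models:

```python
# Derringer-style desirability functions and a weighted geometric-mean
# composite, the standard way to merge several responses into one objective.
import math

def desirability(value, low, high, maximize=True):
    """Linear desirability on [low, high], clamped to [0, 1]."""
    d = (value - low) / (high - low)
    if not maximize:
        d = 1.0 - d
    return min(1.0, max(0.0, d))

def composite(desirabilities, weights):
    """Weighted geometric mean of individual desirabilities."""
    total = sum(weights)
    return math.prod(d ** (w / total) for d, w in zip(desirabilities, weights))

# hypothetical design point: emphasize performance (ERE) over weight and cost
ds = [desirability(0.97, 0.90, 1.00, maximize=True),   # ERE
      desirability(1.2, 1.0, 2.0, maximize=False),     # relative weight
      desirability(1.5, 1.0, 3.0, maximize=False)]     # relative cost
print(round(composite(ds, weights=[3, 1, 1]), 3))
```

Because the geometric mean is zero whenever any single desirability is zero, a design that violates any one constraint is rejected outright, which is what makes the composite useful for trade studies.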
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten
2017-12-01
Vegetation fires affect human infrastructures, ecosystems, global vegetation distribution, and atmospheric composition. However, the climatic, environmental, and socioeconomic factors that control global fire activity in vegetation are only poorly understood, and they are represented with various complexities and formulations in global process-oriented vegetation-fire models. Data-driven approaches such as machine learning algorithms have successfully been used to identify and better understand the factors controlling fire activity. However, such machine learning models cannot be easily adapted or even implemented within process-oriented global vegetation-fire models. To overcome this gap between machine learning-based approaches and process-oriented global fire models, we introduce here a new flexible data-driven fire modelling approach (Satellite Observations to predict FIre Activity, SOFIA approach version 1). SOFIA models use several predictor variables and functional relationships to estimate burned area and can be easily adapted within more complex process-oriented vegetation-fire models. We created an ensemble of SOFIA models to test the importance of several predictor variables. SOFIA models achieve the highest performance in predicting burned area if they account for a direct restriction of fire activity under wet conditions and if they include a land cover-dependent restriction or allowance of fire activity by vegetation density and biomass. The use of vegetation optical depth data from microwave satellite observations, a proxy for vegetation biomass and water content, yields higher model performance than commonly used vegetation variables from optical sensors. We further analyse spatial patterns of the sensitivity between anthropogenic, climate, and vegetation predictor variables and burned area.
We finally discuss how multiple observational datasets on climate, hydrological, vegetation, and socioeconomic variables together with data-driven modelling and model-data integration approaches can guide the future development of global process-oriented vegetation-fire models.
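The restriction/allowance structure described for SOFIA-style models can be caricatured as a product of logistic functions of predictor variables: fire is suppressed under wet conditions and allowed only where vegetation biomass is sufficient. This is a hypothetical sketch with arbitrary parameters, not the published SOFIA v1 code:

```python
# Burned-area fraction as a product of logistic restriction/allowance
# terms; all parameter values (x0, k, max_frac) are invented for illustration.
import math

def logistic(x, x0, k):
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

def burned_area_fraction(wetness, vegetation_optical_depth, max_frac=0.05):
    dryness_allow = 1.0 - logistic(wetness, x0=0.5, k=10)          # wet -> restricted
    fuel_allow = logistic(vegetation_optical_depth, x0=0.3, k=8)   # biomass needed
    return max_frac * dryness_allow * fuel_allow

# dry + vegetated, wet + vegetated, dry + sparse
for wet, vod in [(0.2, 0.6), (0.8, 0.6), (0.2, 0.1)]:
    print(wet, vod, round(burned_area_fraction(wet, vod), 4))
```

The multiplicative form is what makes such models easy to graft onto a process-oriented host model: each factor can be replaced by a prognostic variable the host already computes.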
NASA Astrophysics Data System (ADS)
Eveleth, R.; Cassar, N.; Doney, S. C.; Munro, D. R.; Sweeney, C.
2017-05-01
Using simultaneous sub-kilometer resolution underway measurements of surface O2/Ar, total O2, and pCO2 from annual austral summer surveys in 2012, 2013, and 2014, we explore the impacts of biological and physical processes on the spatial and interannual variability of the O2 and pCO2 systems at the Western Antarctic Peninsula (WAP). In the WAP, mean O2/Ar supersaturation was (7.6±9.1)% and mean pCO2 supersaturation was (-28±22)%. We see substantial spatial variability in O2 and pCO2, including sub-mesoscale/mesoscale variability with decorrelation length scales of 4.5 km, consistent with the regional Rossby radius. This variability is embedded within onshore-offshore gradients. O2 in the LTER grid region is driven primarily by biological processes, as seen by the median ratio of the magnitude of biological oxygen (O2/Ar) to physical oxygen (Ar) supersaturation anomalies (%) relative to atmospheric equilibrium (2.6); however, physical processes have a more pronounced influence in the southern onshore region of the grid, where we see active sea-ice melting. Total O2 measurements should be interpreted with caution in regions of significant sea-ice formation and melt and glacial meltwater input. pCO2 undersaturation predominantly reflects biological processes in the LTER grid. We compare these results to the Drake Passage, where gas supersaturations vary by smaller magnitudes and decorrelate at length scales of 12 km, in line with latitudinal changes in the regional Rossby radius. There, biological processes induce smaller O2/Ar supersaturations (mean (0.14±1.3)%) and pCO2 undersaturations (mean (-2.8±3.9)%) than in the WAP, and pressure changes, bubble fluxes, and gas exchange drive stable Ar supersaturations.
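The biological oxygen supersaturation anomaly quoted in such surveys, Δ(O2/Ar), is conventionally the percent deviation of the measured O2/Ar ratio from its value at saturation. A one-line sketch with illustrative numbers, not the survey data:

```python
# Percent supersaturation anomaly: Ar is biologically inert, so normalizing
# O2 by Ar removes the physical (solubility/bubble) component of O2 changes.
def delta_o2ar_percent(o2ar_measured, o2ar_saturation):
    """Biological O2 supersaturation anomaly in percent."""
    return (o2ar_measured / o2ar_saturation - 1.0) * 100.0

# e.g. a measured O2/Ar ratio 7.6% above its saturation value
print(round(delta_o2ar_percent(1.076, 1.0), 1))  # → 7.6
```

A positive Δ(O2/Ar) indicates net biological oxygen production in the mixed layer, which is why it can be contrasted with the purely physical Ar supersaturation.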
Synchronous parallel system for emulation and discrete event simulation
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S. (Inventor)
1992-01-01
A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
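The event-horizon determination described in this patent abstract can be sketched directly: each node's local event horizon is the earliest time stamp among its newly generated event objects, and the global event horizon is the earliest local event horizon across nodes. The list-of-lists data layout here is a hypothetical simplification:

```python
# Local and global event horizons for a synchronous parallel simulation.
def local_event_horizon(new_event_timestamps):
    """Earliest time stamp among a node's newly generated event objects."""
    return min(new_event_timestamps)

def global_event_horizon(per_node_timestamps):
    """Earliest local event horizon over all nodes with pending events."""
    return min(local_event_horizon(ts) for ts in per_node_timestamps if ts)

# three nodes, each with the time stamps of its newly generated events
nodes = [[12, 9, 30], [7, 22], [15]]
print(global_event_horizon(nodes))  # → 7
```

The horizon gives each node a safe synchronization point: no pending message can carry an earlier time stamp, so state changes before it cannot be superseded.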
Barnett, Tony; Fournié, Guillaume; Gupta, Sunetra; Seeley, Janet
2015-01-01
Incorporation of 'social' variables into epidemiological models remains a challenge. Too much detail and models cease to be useful; too little and the very notion of infection - a highly social process in human populations - may be considered with little reference to the social. The French sociologist Émile Durkheim proposed that the scientific study of society required identification and study of 'social currents'. Such 'currents' are what we might today describe as 'emergent properties', specifiable variables appertaining to individuals and groups, which represent the perspectives of social actors as they experience the environment in which they live their lives. Here we review the ways in which one particular emergent property, hope, relevant to a range of epidemiological situations, might be used in epidemiological modelling of infectious diseases in human populations. We also indicate how such an approach might be extended to include a range of other potential emergent properties to represent complex social and economic processes bearing on infectious disease transmission.
Synchronous Parallel System for Emulation and Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S. (Inventor)
2001-01-01
A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
Technical variables in high-throughput miRNA expression profiling: much work remains to be done.
Nelson, Peter T; Wang, Wang-Xia; Wilfred, Bernard R; Tang, Guiliang
2008-11-01
MicroRNA (miRNA) gene expression profiling has provided important insights into plant and animal biology. However, there has not been ample published work about pitfalls associated with technical parameters in miRNA gene expression profiling. One source of pertinent information about technical variables in gene expression profiling is the separate and more well-established literature regarding mRNA expression profiling. However, many aspects of miRNA biochemistry are unique. For example, the cellular processing and compartmentation of miRNAs, the differential stability of specific miRNAs, and aspects of global miRNA expression regulation require specific consideration. Additional possible sources of systematic bias in miRNA expression studies include the differential impact of pre-analytical variables, substrate specificity of nucleic acid processing enzymes used in labeling and amplification, and issues regarding new miRNA discovery and annotation. We conclude that greater focus on technical parameters is required to bolster the validity, reliability, and cultural credibility of miRNA gene expression profiling studies.
Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy
2018-05-10
A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiograph and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The unique results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early life challenges, such as prematurity or viral infection.
NASA Astrophysics Data System (ADS)
Krtičková, I.; Krtička, J.
2018-06-01
Stars that exhibit a B[e] phenomenon comprise a very diverse group of objects in a different evolutionary status. These objects show common spectral characteristics, including the presence of Balmer lines in emission, forbidden lines and strong infrared excess due to dust. Observations of emission lines indicate illumination by an ultraviolet ionizing source, which is key to understanding the elusive nature of these objects. We study the ultraviolet variability of many B[e] stars to specify the geometry of the circumstellar environment and its variability. We analyse massive hot B[e] stars from our Galaxy and from the Magellanic Clouds. We study the ultraviolet broad-band variability derived from the flux-calibrated data. We determine variations of individual lines and the correlation with the total flux variability. We detected variability of the spectral energy distribution and of the line profiles. The variability has several sources of origin, including light absorption by the disc, pulsations, luminous blue variable type variations, and eclipses in the case of binaries. The stellar radiation of most of B[e] stars is heavily obscured by circumstellar material. This suggests that the circumstellar material is present not only in the disc but also above its plane. The flux and line variability is consistent with a two-component model of a circumstellar environment composed of a dense disc and an ionized envelope. Observations of B[e] supergiants show that many of these stars have nearly the same luminosity, about 1.9 × 10⁵ L⊙, and similar effective temperatures.
Aspects of porosity prediction using multivariate linear regression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrnes, A.P.; Wilson, M.D.
1991-03-01
Highly accurate multiple linear regression models have been developed for sandstones of diverse compositions. Porosity reduction or enhancement processes are controlled by the fundamental variables pressure (P), temperature (T), time (t), and composition (X), where composition includes mineralogy, size, sorting, fluid composition, etc. The multiple linear regression equation, of which all linear porosity prediction models are subsets, takes the generalized form: Porosity = C0 + C1(P) + C2(T) + C3(X) + C4(t) + C5(PT) + C6(PX) + C7(Pt) + C8(TX) + C9(Tt) + C10(Xt) + C11(PTX) + C12(PXt) + C13(PTt) + C14(TXt) + C15(PTXt). The four primary variables are often interactive, thus requiring terms involving two or more primary variables (the form shown implies interaction and not necessarily multiplication). The final terms used may also involve simple mathematical transforms such as log X, e^T, or X^2, or more complex transformations such as the Time-Temperature Index (TTI). The X term in the equation above represents a suite of compositional variables, and a fully expanded equation may therefore include a series of terms incorporating these variables. Numerous published bivariate porosity prediction models involving P (or depth) or Tt (TTI) are effective to a degree, largely because of the high degree of collinearity between P and TTI. However, all such bivariate models ignore the unique contributions of P and Tt, as well as various X terms. These simpler models become poor predictors in regions where collinear relations change, where important variables have been ignored, or where the database does not include a sufficient range or weight distribution for the critical variables.
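A minimal numerical sketch of fitting the generalized form above by ordinary least squares, with one compositional variable and a few interaction terms. All data, variable ranges, and coefficients below are synthetic and purely illustrative:

```python
# Sketch of the abstract's regression form:
# Porosity = C0 + C1*P + C2*T + C3*X + C4*t + interaction terms.
# Synthetic data; the "true" coefficients are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(0)
n = 200
P = rng.uniform(10, 50, n)      # pressure (arbitrary units)
T = rng.uniform(40, 120, n)     # temperature
X = rng.uniform(0, 1, n)        # a single compositional variable (e.g. sorting)
t = rng.uniform(1, 100, n)      # time

# Synthetic "true" porosity with one interaction term plus noise
phi = 30.0 - 0.2 * P - 0.05 * T + 5.0 * X - 0.002 * T * t + rng.normal(0, 0.5, n)

# Design matrix: intercept, primary variables, and selected interactions
A = np.column_stack([np.ones(n), P, T, X, t, P * T, T * t, X * t])
coef, *_ = np.linalg.lstsq(A, phi, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((phi - pred) ** 2) / np.sum((phi - phi.mean()) ** 2)
print(round(r2, 3))
```

In practice the candidate terms (including transforms such as log X or TTI) would be chosen and pruned against the data, exactly the model-selection problem the abstract describes.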
NASA Astrophysics Data System (ADS)
Lucas, S. E.
2017-12-01
The Climate Variability & Predictability (CVP) Program supports research aimed at providing process-level understanding of the climate system through observation, modeling, analysis, and field studies. This vital knowledge is needed to improve climate models and predictions so that scientists can better anticipate the impacts of future climate variability and change. To achieve its mission, the CVP Program supports research carried out at NOAA and other federal laboratories, NOAA Cooperative Institutes, and academic institutions. The Program also coordinates its sponsored projects with major national and international scientific bodies including the World Climate Research Programme (WCRP), the International and U.S. Climate Variability and Predictability (CLIVAR/US CLIVAR) Program, and the U.S. Global Change Research Program (USGCRP). The CVP program sits within NOAA's Climate Program Office (http://cpo.noaa.gov/CVP). In 2017, the CVP Program had a call for proposals focused on observing and understanding processes affecting the propagation of intraseasonal oscillations in the Maritime Continent region. This poster will present the recently funded CVP projects, the expected scientific outcomes, the geographic areas of their work in the Maritime Continent region, and the collaborations with the Office of Naval Research, Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG), Japan Agency for Marine-Earth Science and Technology (JAMSTEC) and other partners.
Complexity in relational processing predicts changes in functional brain network dynamics.
Cocchi, Luca; Halford, Graeme S; Zalesky, Andrew; Harding, Ian H; Ramm, Brentyn J; Cutmore, Tim; Shum, David H K; Mattingley, Jason B
2014-09-01
The ability to link variables is critical to many high-order cognitive functions, including reasoning. It has been proposed that limits in relating variables depend critically on relational complexity, defined formally as the number of variables to be related in solving a problem. In humans, the prefrontal cortex is known to be important for reasoning, but recent studies have suggested that such processes are likely to involve widespread functional brain networks. To test this hypothesis, we used functional magnetic resonance imaging and a classic measure of deductive reasoning to examine changes in brain networks as a function of relational complexity. As expected, behavioral performance declined as the number of variables to be related increased. Likewise, increments in relational complexity were associated with proportional enhancements in brain activity and task-based connectivity within and between two cognitive control networks: a cingulo-opercular network for maintaining task set, and a fronto-parietal network for implementing trial-by-trial control. Changes in effective connectivity as a function of increased relational complexity suggested a key role for the left dorsolateral prefrontal cortex in integrating and implementing task set in a trial-by-trial manner. Our findings show that limits in relational processing are manifested in the brain as complexity-dependent modulations of large-scale networks.
Development of a novel wet oxidation process for hazardous and mixed wastes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhooge, P.M.
1994-11-01
This article describes and evaluates the DETOX(SM) process for treating mixed wastes. Many DOE waste streams and remediation wastes contain complex and variable mixtures of organic compounds, toxic metals, and radionuclides, often dispersed in organic or inorganic matrices, such as personal protective equipment, various sludges, soils, and water. The DETOX(SM) process, patented by Delphi Research, uses a unique combination of metal catalysts to increase the rate of oxidation of organic materials. Included are the following subject areas: project description (phases I-IV); results of all phases; and future work. 5 figs., 1 tab.
Variational Data Assimilation for the Global Ocean
2013-01-01
…the ocean includes the Geoid (a fixed gravity equipotential surface) as well as the MDT, which is not known accurately enough relative to the centimeter… scales, including processes that control the surface mixed layer and the formation of ocean eddies… variables. Examples of this in the ocean are integral quantities, such as acoustic travel time and altimeter measures of sea surface height, and direct…
Ecological Drivers of Biogeographic Patterns of Soil Archaeal Community
Zheng, Yuan-Ming; Cao, Peng; Fu, Bojie; Hughes, Jane M.; He, Ji-Zheng
2013-01-01
Knowledge about the biogeography of organisms has long been a focus in ecological research, including the mechanisms that generate and maintain diversity. In this study, we targeted a microbial group relatively underrepresented in the microbial biogeographic literature, the soil Archaea. We surveyed the archaeal abundance and community composition using real-time quantitative PCR and T-RFLP approaches for 105 soil samples from 2 habitat types to identify the archaeal distribution patterns and the factors driving these patterns. Results showed that the soil archaeal community was affected by spatial and environmental variables, and 79% and 51% of the community variation was explained in the non-flooded soil (NS) and flooded soil (FS) habitats, respectively, showing a possible biogeographic distribution. The diversity patterns of soil Archaea across the landscape were influenced by a combination of stochastic and deterministic processes. The contribution from neutral processes was higher than that from deterministic processes associated with environmental variables. The variables pH, sampling depth and longitude played key roles in determining the archaeal distribution in the NS habitat, while sampling depth, longitude and NH4+-N were most important in the FS habitat. Overall, there might be similar ecological drivers in the soil archaeal community as in macroorganism communities. PMID:23717418
Revealing structure within the coronae of Seyfert galaxies
NASA Astrophysics Data System (ADS)
Wilkins, D.
2017-10-01
Detailed analysis of the reflection and reverberation of X-rays from the innermost regions of AGN accretion discs reveals the structure and processes that produce the intense continuum emission and the extreme variability we see, right down to the innermost stable orbit and event horizon of the black hole. Observations of Seyfert galaxies spanning more than a decade have enabled measurement of the geometry of the corona and how it evolves, leading to orders of magnitude of variability. They reveal processes the corona undergoes during transient events, notably the collimation and ejection of the corona during X-ray flares, reminiscent of the aborted launching of a jet. Recent reverberation studies, including those of the Seyfert galaxy I Zwicky 1 with XMM-Newton, are revealing structures within the corona for the first time. A persistent collimated core is found, akin to the base of a jet embedded in the innermost regions. The evolution of both the collimated and extended portions point to the mechanisms powering the X-ray emission and variability. This gives us important constraints on the processes by which energy is liberated from black hole accretion flows and by which jets are launched, allowing us to understand how these extreme objects are powered.
Plasma Arc Welding: How it Works
NASA Technical Reports Server (NTRS)
Nunes, Arthur
2004-01-01
The physical principles of PAW from basic arcs to keyholing to variable polarity are outlined. A very brief account of the physics of PAW with an eye to the needs of a welder is presented. Understanding is usually (but not always) superior to handbooks and is required (unless dumb luck intervenes) for innovation. And, in any case, all welders by nature desire to know. A bit of history of the rise and fall of the Variable Polarity (VP) PA process in fabrication of the Space Shuttle External Tank is included.
ERIC Educational Resources Information Center
Malmberg, Lars-Erik; Lim, Wee H. T.; Tolvanen, Asko; Nurmi, Jari-Erik
2016-01-01
In order to advance our understanding of educational processes, we present a tutorial of intraindividual variability. An adaptive educational process is characterised by stable (less variability), and a maladaptive process is characterised by instable (more variability) learning experiences from one learning situation to the next. We outline step…
Huebner, Angela J; Howell, Laurie W
2003-08-01
To examine the relationship between adolescent sexual risk-taking and perception of parental monitoring, frequency of parent-adolescent communication, and parenting style. The influences of gender, age, and ethnicity are also of interest. Data were collected from 7th-12th grade students in six rural, ethnically diverse schools located in adjacent counties in a Southeastern state. A 174-item instrument assessed adolescent perceptions, behaviors and attitudes. Youth who had engaged in sexual intercourse (n = 1160) were included in the analyses. Logistic regression analyses were conducted to identify parenting practices that predicted high- versus low-risk sex (defined by number of partners and use of condoms). Variables included parental monitoring, parent-adolescent communication, parenting style, parenting process interaction effects, and interaction effects among these three parenting processes and gender, age and ethnicity. Analyses included frequencies, cross-tabulations and logistic regression. Parental monitoring, parental monitoring by parent-adolescent communication, and parenting style by ethnicity were significant predictors of sexual risk-taking. No gender or age interactions were noted. Parental monitoring, parent-adolescent communication and parenting style are all important variables to consider when examining sexual risk-taking among adolescents.
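The analysis described, logistic regression of a binary risk outcome on parenting variables plus an interaction term, can be sketched as follows. The data, variable names, and effect sizes are entirely synthetic assumptions, and the model is fitted by plain gradient ascent rather than any particular statistics package:

```python
# Sketch: logistic regression predicting high- vs low-risk status from two
# parenting variables and their interaction. All data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 1000
monitoring = rng.normal(size=n)            # standardized parental monitoring
communication = rng.normal(size=n)         # parent-adolescent communication
interaction = monitoring * communication   # monitoring x communication

X = np.column_stack([np.ones(n), monitoring, communication, interaction])
true_beta = np.array([-0.5, -1.0, -0.3, -0.4])   # protective main effects (assumed)
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                     # 1 = high-risk outcome (synthetic)

beta = np.zeros(4)
for _ in range(2000):                      # gradient ascent on the log-likelihood
    mu = 1 / (1 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (y - mu) / n

print(np.round(beta, 2))
```

The interaction column is what lets the model express effects such as "monitoring by communication", the kind of moderated relationship the abstract reports.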
Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai
2015-01-01
The use of wavelength variable selection before partial least squares discriminant analysis (PLS-DA) for qualitative identification of solid state fermentation degree by the FT-NIR spectroscopy technique was investigated in this study. Two wavelength variable selection methods, competitive adaptive reweighted sampling (CARS) and stability competitive adaptive reweighted sampling (SCARS), were employed to select the important wavelengths. PLS-DA was applied to calibrate the identification model using the wavelength variables selected by CARS and SCARS. Experimental results showed that the numbers of wavelength variables selected by CARS and SCARS were 58 and 47, respectively, out of the 1557 original wavelength variables. Compared with the results of full-spectrum PLS-DA, both wavelength variable selection methods enhanced the performance of the identification models. Meanwhile, compared with the CARS-PLS-DA model, the SCARS-PLS-DA model achieved better results, with an identification rate of 91.43% in the validation process. The overall results demonstrate that a PLS-DA model constructed using wavelength variables selected by a proper method can identify solid state fermentation degree more accurately.
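A greatly simplified sketch of the idea, PLS-based ranking of wavelength variables, elimination, then PLS-DA classification on the survivors, is shown below. Real CARS uses Monte Carlo sampling with an exponentially decreasing retention function and adaptive reweighted sampling; this version just keeps the top-ranked variables by absolute PLS coefficient each round, and all spectra and labels are synthetic:

```python
# Simplified CARS-flavoured variable selection + PLS-DA on synthetic "spectra".
import numpy as np

def pls1_coef(X, y, ncomp=2):
    # Minimal PLS1 (NIPALS); returns b with yhat ~ (X - X.mean(0)) @ b + y.mean()
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        tt = t @ t
        p = Xc.T @ t / tt
        qk = (yc @ t) / tt
        Xc -= np.outer(t, p)         # deflate X
        yc = yc - qk * t             # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

rng = np.random.default_rng(1)
n, nvars = 120, 200                  # samples x "wavelengths"
X = rng.normal(size=(n, nvars))
y = (X[:, 10] + X[:, 50] - X[:, 120] > 0).astype(float)  # two "degrees"

keep = np.arange(nvars)
for _ in range(5):                   # five coefficient-ranked elimination rounds
    b = pls1_coef(X[:, keep], y)
    keep = keep[np.argsort(np.abs(b))[::-1][: int(0.6 * keep.size)]]  # top 60%

b = pls1_coef(X[:, keep], y)
yhat = (X[:, keep] - X[:, keep].mean(0)) @ b + y.mean()
accuracy = np.mean((yhat > 0.5) == y)
print(keep.size, accuracy)
```

As in the abstract, the point is that a model built on a small selected subset of wavelengths can classify as well as, or better than, the full-spectrum model.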
RCT of a Psychological Intervention for Patients With Cancer: I. Mechanisms of Change
Andersen, Barbara L.; Shelby, Rebecca A.; Golden-Kreutz, Deanna M.
2008-01-01
Little is known about the therapeutic processes contributing to the efficacy of psychological interventions for patients with cancer. Data from a randomized clinical trial yielding robust biobehavioral and health effects (B. L. Andersen et al., 2004, 2007) were used to examine associations between process variables, treatment utilization, and outcomes. Novel findings emerged. Patients were highly satisfied with the treatment, but their higher levels of felt support (group cohesion) covaried with lower distress and fewer symptoms. Also, specific treatment strategies were associated with specific outcomes, including lower distress, improved dietary habits, reduced symptomatology, and higher chemotherapy dose intensity. These data provide a comprehensive test of multiple therapeutic processes and mechanisms for biobehavioral change with an intervention including both intensive and maintenance phases. PMID:18085909
Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan
2017-09-01
In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis.
Non-Traditional Predictors of Academic Success for Special Action Admissions.
ERIC Educational Resources Information Center
Tom, Alice K.
The use of nontraditional college admission variables in the prediction of academic success was assessed with 444 freshmen entering the University of California, Davis, under the Special Action process (waiver of admission requirements). For fall 1978, 1979, 1980 special entrants, attention was directed to college applications, including high…
Criterion 1: Conservation of biological diversity
Stephen R. Shifley; Francisco X. Aguilar; Nianfu Song; Susan I. Stewart; David J. Nowak; Dale D. Gormanson; W. Keith Moser; Sherri Wormstead; Eric J. Greenfield
2012-01-01
Biological diversity, or biodiversity, is the variety of life. It encompasses the variability among living organisms and includes diversity within species, among species, and among ecosystems. High biodiversity enables a forest ecosystem to respond to external influences, absorb and recover from disturbances, and still maintain essential ecosystem processes such as...
Parental Depressive Symptoms and Children's Sleep: The Role of Family Conflict
ERIC Educational Resources Information Center
El-Sheikh, Mona; Kelly, Ryan J.; Bagley, Erika J.; Wetter, Emily K.
2012-01-01
Background: We used a multi-method and multi-informant design to identify developmental pathways through which parental depressive symptoms contribute to children's sleep problems. Environmental factors including adult inter-partner conflict and parent-child conflict were considered as process variables of this relation. Methods: An ethnically and…
Examining, Documenting, and Modeling the Problem Space of a Variable Domain
2002-06-14
…development of this proposed process include: Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], Family-Oriented… configuration knowledge using generators [2]. Existing Methods of Domain Engineering: FODA is a domain…
Medical Decision Making: A Selective Review for Child Psychiatrists and Psychologists
ERIC Educational Resources Information Center
Galanter, Cathryn A.; Patel, Vimla L.
2005-01-01
Physicians, including child and adolescent psychiatrists, show variability and inaccuracies in diagnosis and treatment of their patients and do not routinely implement evidenced-based medical and psychiatric treatments in the community. We believe that it is necessary to characterize the decision-making processes of child and adolescent…
Content Analysis Schedule for Bilingual Education Programs: Proyecto PAL.
ERIC Educational Resources Information Center
Gonzalez, Castor
This content analysis schedule for "Proyecto PAL" in San Jose, California, presents information on the history, funding, and scope of the project. Included are sociolinguistic process variables such as the native and dominant languages of students and their interaction. Information is provided on staff selection and the linguistic…
16 CFR 1107.21 - Periodic testing.
Code of Federal Regulations, 2012 CFR
2012-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
16 CFR § 1107.21 - Periodic testing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
16 CFR 1107.21 - Periodic testing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
Examining Self Regulated Learning in Relation to Certain Selected Variables
ERIC Educational Resources Information Center
Johnson, N.
2012-01-01
Self-regulation is the controlling of a process or activity by the students who are involved in problem solving in Physics rather than by an external agency (Johnson, 2011). Self-regulated learning consists of three main components: cognition, metacognition, and motivation. Cognition includes skills necessary to encode, memorise, and recall…
The input variables for a numerical model of reactive solute transport in groundwater include both transport parameters, such as hydraulic conductivity and infiltration, and reaction parameters that describe the important chemical and biological processes in the system. These pa...
NASA Astrophysics Data System (ADS)
Christensen, H. M.; Berner, J.; Sardeshmukh, P. D.
2017-12-01
Stochastic parameterizations have been used for more than a decade in atmospheric models. They provide a way to represent model uncertainty through representing the variability of unresolved sub-grid processes, and have been shown to have a beneficial effect on the spread and mean state for medium- and extended-range forecasts. There is increasing evidence that stochastic parameterization of unresolved processes can improve the bias in mean and variability, e.g. by introducing a noise-induced drift (nonlinear rectification), and by changing the residence time and structure of flow regimes. We present results showing the impact of including the Stochastically Perturbed Parameterization Tendencies scheme (SPPT) in coupled runs of the National Center for Atmospheric Research (NCAR) Community Atmosphere Model, version 4 (CAM4) with historical forcing. SPPT results in a significant improvement in the representation of the El Niño-Southern Oscillation (ENSO) in CAM4, improving the power spectrum as well as both the inter- and intra-annual variability of tropical Pacific sea surface temperatures. We use a Linear Inverse Modelling framework to gain insight into the mechanisms by which SPPT has improved ENSO variability.
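The core of an SPPT-style scheme, multiplying the net parameterization tendency by (1 + r) where r is a correlated random field, can be sketched in one dimension as follows. The relaxation "physics", parameter values, and clipping bounds are illustrative assumptions, not CAM4's actual configuration:

```python
# Toy sketch of SPPT-style stochastic physics: the deterministic tendency is
# multiplied by (1 + r), with r following an AR(1) process in time and clipped
# so the perturbed tendency cannot change sign. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
tau, sigma, dt = 6.0, 0.5, 0.25         # decorrelation time, std dev, timestep (hours)
phi = np.exp(-dt / tau)                  # AR(1) autocorrelation over one step
r, nsteps, state = 0.0, 400, 10.0
trajectory = []
for _ in range(nsteps):
    r = phi * r + np.sqrt(1 - phi**2) * sigma * rng.normal()
    r_clipped = np.clip(r, -0.9, 0.9)    # bound the multiplicative perturbation
    tendency = -0.1 * state              # stand-in "physics": relaxation toward 0
    state += dt * tendency * (1.0 + r_clipped)
    trajectory.append(state)
print(round(trajectory[-1], 4))
```

In a full model r would be a smooth spatial pattern applied to the summed physics tendencies at every grid point; the AR(1) construction above is the temporal half of that recipe.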
NASA Astrophysics Data System (ADS)
Juszczyk, Michał; Leśniak, Agnieszka; Zima, Krzysztof
2013-06-01
Conceptual cost estimation is important for construction projects. Either underestimation or overestimation of the cost of erecting a building may lead to the failure of a project. In the paper, the authors present an application of multicriteria comparative analysis (MCA) to select the factors influencing residential building construction cost. The aim of the analysis is to indicate key factors useful for conceptual cost estimation in the early design stage. Key factors are investigated on the basis of elementary information about the function, form, and structure of the building, and the primary assumptions of the technological and organizational solutions applied in the construction process. These factors are considered as variables of a model whose aim is to make fast conceptual cost estimation possible with satisfactory accuracy. The whole analysis included three steps: preliminary research, choice of a set of potential variables, and reduction of this set to select the final set of variables. Multicriteria comparative analysis is applied to solve the problem. The performed analysis allowed the selection of a group of factors, defined well enough at the conceptual stage of the design process, to be used as descriptive variables of the model.
Erdeniz, Burak; Rohe, Tim; Done, John; Seidler, Rachael D
2013-01-01
Conventional neuroimaging techniques provide information about condition-related changes of the BOLD (blood-oxygen-level dependent) signal, indicating only where and when the underlying cognitive processes occur. Recently, with the help of a new approach called "model-based" functional magnetic resonance imaging (fMRI), researchers are able to visualize changes in the internal variables of a time-varying learning process, such as the reward prediction error or the predicted reward value of a conditional stimulus. However, despite being extremely beneficial to the imaging community in understanding the neural correlates of decision variables, a model-based approach to brain imaging data is also methodologically challenging due to the multicollinearity problem in statistical analysis. There are multiple sources of multicollinearity in functional neuroimaging, including investigations of closely related variables and/or experimental designs that do not account for this. The source of multicollinearity discussed in this paper occurs due to correlation between different subjective variables that are calculated very close in time. Here, we review methodological approaches to analyzing such data by discussing the special case of separating the reward prediction error signal from reward outcomes.
Computational study of peptide permeation through membrane: searching for hidden slow variables
NASA Astrophysics Data System (ADS)
Cardenas, Alfredo E.; Elber, Ron
2013-12-01
Atomically detailed molecular dynamics trajectories in conjunction with Milestoning are used to analyse the different contributions of coarse variables to the permeation process of a small peptide (N-acetyl-l-tryptophanamide, NATA) through a 1,2-dioleoyl-sn-glycero-3-phosphocholine membrane. The peptide reverses its overall orientation as it permeates through the biological bilayer. The large change in orientation is investigated explicitly but is shown to impact the free energy landscape and permeation time only moderately. Nevertheless, a significant difference in permeation properties of the two halves of the membrane suggests the presence of other hidden slow variables. We speculate, based on calculation of the potential of mean force, that a conformational transition of NATA makes significant contribution to these differences. Other candidates for hidden slow variables may include water permeation and collective motions of phospholipids.
A Framework for Orbital Performance Evaluation in Distributed Space Missions for Earth Observation
NASA Technical Reports Server (NTRS)
Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Miller, David W.; de Weck, Olivier
2015-01-01
Distributed Space Missions (DSMs) are gaining momentum in their application to earth science missions owing to their unique ability to increase observation sampling in spatial, spectral and temporal dimensions simultaneously. DSM architectures have a large number of design variables and since they are expected to increase mission flexibility, scalability, evolvability and robustness, their design is a complex problem with many variables and objectives affecting performance. There are very few open-access tools available to explore the tradespace of variables which allow performance assessment and are easy to plug into science goals, and therefore select the most optimal design. This paper presents a software tool developed on the MATLAB engine interfacing with STK, for DSM orbit design and selection. It is capable of generating thousands of homogeneous constellation or formation flight architectures based on pre-defined design variable ranges and sizing those architectures in terms of predefined performance metrics. The metrics can be input into observing system simulation experiments, as available from the science teams, allowing dynamic coupling of science and engineering designs. Design variables include but are not restricted to constellation type, formation flight type, FOV of instrument, altitude and inclination of chief orbits, differential orbital elements, leader satellites, latitudes or regions of interest, planes and satellite numbers. Intermediate performance metrics include angular coverage, number of accesses, revisit coverage, access deterioration over time at every point of the Earth's grid. The orbit design process can be streamlined and variables more bounded along the way, owing to the availability of low fidelity and low complexity models such as corrected HCW equations up to high precision STK models with J2 and drag. 
The tool can thus help any scientist or program manager select pre-Phase A, Pareto optimal DSM designs for a variety of science goals without having to delve into the details of the engineering design process.
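The tradespace enumeration such a tool performs can be illustrated in miniature. The design-variable ranges below echo those listed in the abstract, but the coverage score is a crude hypothetical stand-in for the STK-computed performance metrics:

```python
# Toy tradespace enumeration over constellation design variables.
# The coverage_score function is purely illustrative, not a real orbit model.
import itertools, math

altitudes = [500, 700, 900]          # km
inclinations = [45, 60, 97.4]        # deg
planes = [1, 2, 3]
sats_per_plane = [1, 2, 4]

def coverage_score(alt, inc, n_planes, n_sats):
    # Crude proxy: swath grows with altitude, coverage with satellite count,
    # and near-polar inclinations reach more latitudes (via sin of inclination).
    swath = 2 * alt * math.tan(math.radians(30))   # assumed 60-deg instrument FOV
    return swath * n_planes * n_sats * math.sin(math.radians(inc))

designs = [
    (coverage_score(a, i, p, s), a, i, p, s)
    for a, i, p, s in itertools.product(altitudes, inclinations, planes, sats_per_plane)
]
best = max(designs)
print(len(designs), best[1:])
```

The real tool replaces the toy score with revisit, access, and angular-coverage metrics from propagated orbits, but the enumerate-score-rank pattern is the same.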
NASA Astrophysics Data System (ADS)
Polo, María José; Egüen, Marta; Andreu, Ana; Carpintero, Elisabet; Gómez-Giráldez, Pedro; Patrocinio González-Dugo, María
2017-04-01
Water vapour fluxes between the soil surface and the atmosphere constitute one of the most important components of the water cycle in continental areas. Their regime directly affects the availability of water to plants, water storage in surface bodies, air humidity in the boundary layer, and snow persistence, among others, and the list of indirectly affected processes comprises many more components. Water potential and wetness gradients are among the main drivers of water vapour fluxes to the atmosphere; soil humidity is usually monitored as a key variable in many hydrological and environmental studies, and its estimated series are used to calibrate and validate the modelling of certain hydrological processes. However, such results may differ when water fluxes are used instead of water state variables, such as humidity. This work analyses high-resolution water vapour flux series from a dehesa area in southern Spain, where a complete energy and water fluxes/variables monitoring site has been operating for the last four years. The results include both pasture and tree-covered control points. The daily water budget at both types of sites has been calculated from weather, energy flux, and soil moisture measurements, and the results have been aggregated on a weekly, monthly, and seasonal basis. A comparison between observed trends of soil moisture and calculated trends of water vapour fluxes is included to show the differences arising in terms of the regime of the dominant weather variables in this type of ecosystem. The results identify significant thresholds for each weather variable driver and highlight the importance of the wind regime, a variable often overlooked in studies of future climate impacts on hydrology. Further work is being carried out to assess potential trends in the water cycle under future climate conditions and their impacts on the vegetation of dehesa ecosystems.
Hertzog, Christopher; Dixon, Roger A; Hultsch, David F; MacDonald, Stuart W S
2003-12-01
The authors used 6-year longitudinal data from the Victoria Longitudinal Study (VLS) to investigate individual differences in amount of episodic memory change. Latent change models revealed reliable individual differences in cognitive change. Changes in episodic memory were significantly correlated with changes in other cognitive variables, including speed and working memory. A structural equation model for the latent change scores showed that changes in speed and working memory predicted changes in episodic memory, as expected by processing resource theory. However, these effects were best modeled as being mediated by changes in induction and fact retrieval. Dissociations were detected between cross-sectional ability correlations and longitudinal changes. Shuffling the tasks used to define the Working Memory latent variable altered patterns of change correlations.
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
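A minimal sketch of the kind of automated IBI artifact screening the abstract describes: flag beats that deviate from a local median, then bridge them by interpolation. The 25% threshold, window size, and correction rule below are illustrative assumptions, not ARTiiFACT's actual algorithm.

```python
# Flag interbeat intervals (IBIs, in ms) that deviate strongly from a local
# median, then replace them by interpolating from neighbouring clean beats --
# a common first pass before computing HRV statistics such as SDNN or RMSSD.
import statistics

def clean_ibi(ibi, window=5, threshold=0.25):
    """Return (cleaned series, indices flagged as artifacts)."""
    artifacts = []
    for i, x in enumerate(ibi):
        lo, hi = max(0, i - window), min(len(ibi), i + window + 1)
        local = ibi[lo:i] + ibi[i + 1:hi]       # neighbours, excluding x itself
        if abs(x - statistics.median(local)) > threshold * statistics.median(local):
            artifacts.append(i)
    cleaned = list(ibi)
    for i in artifacts:                          # interpolate over flagged beats
        prev = next((cleaned[j] for j in range(i - 1, -1, -1) if j not in artifacts), None)
        nxt = next((cleaned[j] for j in range(i + 1, len(cleaned)) if j not in artifacts), None)
        cleaned[i] = statistics.mean([v for v in (prev, nxt) if v is not None])
    return cleaned, artifacts

ibi = [812, 805, 820, 1590, 808, 815, 410, 811]  # a missed beat and an ectopic beat
cleaned, flagged = clean_ibi(ibi)
print(flagged)   # indices of the two artifacts
```

Even this crude filter shows why the abstract stresses manual review: the correction rule silently changes the series, so a researcher should be able to inspect and override each flagged beat.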
Zhang, Xia; Hu, Changqin
2017-09-08
Penicillins are typical complex ionic samples, likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. Rules governing the structures, retention, and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type, and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the critical quality attributes (CQAs). A Plackett-Burman design was performed to test robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Wu, Huiquan; White, Maury; Khan, Mansoor A
2011-02-28
The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both derived co-precipitation process rates and final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both transition and steady state. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. This work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
Using a Bayesian network to predict barrier island geomorphologic characteristics
Gutierrez, Ben; Plant, Nathaniel G.; Thieler, E. Robert; Turecek, Aaron
2015-01-01
Quantifying geomorphic variability of coastal environments is important for understanding and describing the vulnerability of coastal topography, infrastructure, and ecosystems to future storms and sea level rise. Here we use a Bayesian network (BN) to test the importance of multiple interactions between barrier island geomorphic variables. This approach models complex interactions and handles uncertainty, which is intrinsic to future sea level rise, storminess, or anthropogenic processes (e.g., beach nourishment and other forms of coastal management). The BN was developed and tested at Assateague Island, Maryland/Virginia, USA, a barrier island with sufficient geomorphic and temporal variability to evaluate our approach. We tested the ability to predict dune height, beach width, and beach height variables using inputs that included longer-term, larger-scale, or external variables (historical shoreline change rates, distances to inlets, barrier width, mean barrier elevation, and anthropogenic modification). Data sets from three different years spanning nearly a decade sampled substantial temporal variability and serve as a proxy for analysis of future conditions. We show that distinct geomorphic conditions are associated with different long-term shoreline change rates and that the most skillful predictions of dune height, beach width, and beach height depend on including multiple input variables simultaneously. The predictive relationships are robust to variations in the amount of input data and to variations in model complexity. The resulting model can be used to evaluate scenarios related to coastal management plans and/or future scenarios where shoreline change rates may differ from those observed historically.
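A toy illustration of the discrete Bayesian network inference described above, with invented variables and conditional probabilities (shoreline regime influencing dune height, both influencing beach width); the actual study uses many more variables and data-derived probability tables.

```python
# Toy discrete Bayesian network in the spirit of the barrier-island study:
# shoreline-change regime (S) influences dune height (D), and both influence
# beach width (W).  All probability values are invented for illustration.
P_S = {"erosion": 0.6, "accretion": 0.4}
P_D_given_S = {                       # P(dune height | shoreline regime)
    "erosion":   {"low": 0.7, "high": 0.3},
    "accretion": {"low": 0.3, "high": 0.7},
}
P_W_given_SD = {                      # P(beach width | shoreline regime, dune)
    ("erosion", "low"):    {"narrow": 0.8, "wide": 0.2},
    ("erosion", "high"):   {"narrow": 0.6, "wide": 0.4},
    ("accretion", "low"):  {"narrow": 0.4, "wide": 0.6},
    ("accretion", "high"): {"narrow": 0.2, "wide": 0.8},
}

def posterior_S_given_W(w):
    """P(shoreline regime | observed beach width), by full enumeration."""
    joint = {}
    for s, ps in P_S.items():
        joint[s] = sum(ps * pd * P_W_given_SD[(s, d)][w]
                       for d, pd in P_D_given_S[s].items())
    z = sum(joint.values())
    return {s: p / z for s, p in joint.items()}

post = posterior_S_given_W("narrow")
print(post)   # observing a narrow beach raises the probability of erosion
```

The same enumeration runs in either direction, which is why a BN can both predict geomorphic variables from forcing conditions and diagnose likely conditions from observed morphology.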
A site specific model and analysis of the neutral somatic mutation rate in whole-genome cancer data.
Bertl, Johanna; Guo, Qianyun; Juul, Malene; Besenbacher, Søren; Nielsen, Morten Muhlig; Hornshøj, Henrik; Pedersen, Jakob Skou; Hobolth, Asger
2018-04-19
Detailed modelling of the neutral mutational process in cancer cells is crucial for identifying driver mutations and understanding the mutational mechanisms that act during cancer development. The neutral mutational process is very complex: whole-genome analyses have revealed that the mutation rate differs between cancer types, between patients and along the genome depending on the genetic and epigenetic context. Therefore, methods that predict the number of different types of mutations in regions or specific genomic elements must consider local genomic explanatory variables. A major drawback of most methods is the need to average the explanatory variables across the entire region or genomic element. This procedure is particularly problematic if the explanatory variable varies dramatically in the element under consideration. To take into account the fine scale of the explanatory variables, we model the probabilities of different types of mutations for each position in the genome by multinomial logistic regression. We analyse 505 cancer genomes from 14 different cancer types and compare the performance in predicting mutation rate of region-based and site-specific models. We show that for 1000 randomly selected genomic positions, the site-specific model predicts the mutation rate much better than region-based models. We use a forward selection procedure to identify the most important explanatory variables. The procedure identifies site-specific conservation (phyloP), replication timing, and expression level as the best predictors for the mutation rate. Finally, our model confirms and quantifies certain well-known mutational signatures. We find that our site-specific multinomial regression model outperforms the region-based models. The possibility of including genomic variables on different scales and patient-specific variables makes it a versatile framework for studying different mutational mechanisms.
Our model can serve as the neutral null model for the mutational process; regions that deviate from the null model are candidates for elements that drive cancer development.
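The core statistical machinery, a per-site multinomial (softmax) logistic regression, can be sketched on synthetic data as follows; the covariates stand in for site-specific features such as phyloP, replication timing, and expression, and nothing here reproduces the paper's actual fit.

```python
# Softmax (multinomial logistic) regression fitted by plain gradient descent
# on the negative log-likelihood: P(mutation type k | x) = softmax(W^T x)_k.
# Features and labels are synthetic; this sketches the model family only.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 600, 3, 4                       # sites, covariates, mutation classes
X = rng.normal(size=(n, d))
W_true = rng.normal(size=(d, k))
logits = X @ W_true
probs = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
y = np.array([rng.choice(k, p=p) for p in probs])   # sample class labels

W = np.zeros((d, k))
for _ in range(500):
    Z = X @ W
    P = np.exp(Z - Z.max(1, keepdims=True))          # stable softmax
    P /= P.sum(1, keepdims=True)
    W -= 0.1 * X.T @ (P - np.eye(k)[y]) / n          # gradient of mean NLL

acc = (P.argmax(1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because the model is fitted per position rather than per averaged region, covariates that vary on a fine scale (such as conservation scores) retain their predictive power, which is the paper's central argument.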
NASA Astrophysics Data System (ADS)
Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard
2018-07-01
This work details the analysis of wafer level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chojnowski, S. Drew; Holtzman, Jon A.; Wisniewski, John P.
2017-04-01
We report on the H-band spectral variability of classical Be stars observed over the course of the Apache Point Galactic Evolution Experiment (APOGEE), one of four subsurveys comprising SDSS-III. As described in the first paper of this series, the APOGEE B-type emission-line (ABE) star sample was culled from the large number of blue stars observed as telluric standards during APOGEE observations. In this paper, we explore the multi-epoch ABE sample, consisting of 1100 spectra for 213 stars. These “snapshots” of the circumstellar disk activity have revealed a wealth of temporal variability including, but not limited to, gradual disappearance of the line emission and vice versa over both short and long timescales. Other forms of variability include variation in emission strength, emission peak intensity ratios, and emission peak separations. We also analyze radial velocities (RVs) of the emission lines for a subsample of 162 stars with sufficiently strong features, and we discuss on a case-by-case basis whether the RV variability exhibited by some stars is caused by binary motion versus dynamical processes in the circumstellar disks. Ten systems are identified as convincing candidates for binary Be stars with as yet undetected companions.
Feigin, Rena; Sapir, Yaffa
2005-03-01
The present study deals with personal and psychological characteristics of addicts coping with abstinence from drugs in various stages of recovery. The study focuses primarily on two personal variables: attribution of responsibility for the problem and its solution, and the sense of coherence. Additional factors examined in the study are demographic variables, including those related to drug addiction. The sample included 128 short-term abstinent patients in the early stages of recovery after detoxification, and 40 long-term abstinent former addicts who had abstained from the use of drugs for two to eight years. The results indicate a higher level of sense of coherence, reflecting inner resources, among the long-term abstinent subjects. On the other hand, much similarity was found between the groups in relation to the attribution of responsibility variable: in both groups, the majority reported attributing responsibility for the solution of the problem to themselves. The findings underscore the significant link between personality variables and coping with the processes of recovery, while an analysis of demographic and addiction variables did not show a significant distinction between the long-term and short-term abstinent groups.
NASA Technical Reports Server (NTRS)
Taminger, Karen M.; Hafley, Robert A.; Domack, Marcia S.
2006-01-01
Electron beam freeform fabrication (EBF3) is a new layer-additive process that has been developed for near-net shape fabrication of complex structures. EBF3 uses an electron beam to create a molten pool on the surface of a substrate. Wire is fed into the molten pool and the part translated with respect to the beam to build up a 3-dimensional structure one layer at a time. Unlike many other freeform fabrication processes, the energy coupling of the electron beam is extremely well suited to processing of aluminum alloys. The layer-additive nature of the EBF3 process results in a tortuous thermal path producing complex microstructures including: small homogeneous equiaxed grains; dendritic growth contained within larger grains; and/or pervasive dendritic formation in the interpass regions of the deposits. Several process control variables contribute to the formation of these different microstructures, including translation speed, wire feed rate, beam current and accelerating voltage. In electron beam processing, higher accelerating voltages embed the energy deeper below the surface of the substrate. Two EBF3 systems have been established at NASA Langley, one with a low-voltage (10-30kV) and the other a high-voltage (30-60 kV) electron beam gun. Aluminum alloy 2219 was processed over a range of different variables to explore the design space and correlate the resultant microstructures with the processing parameters. This report is specifically exploring the impact of accelerating voltage. Of particular interest is correlating energy to the resultant material characteristics to determine the potential of achieving microstructural control through precise management of the heat flux and cooling rates during deposition.
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.; Doan, D. J.; Carr, E. S.
1971-01-01
A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
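The linear multiple regression step can be illustrated on a coded two-level factorial design; the factor names (current density, bath temperature, immersion time) and the plaque-loading responses below are invented for illustration, not data from the report.

```python
# Least-squares fit of a plaque property against coded factorial levels,
# the kind of linear multiple regression analysis applied to the fractional
# factorial experiments.  All numbers are illustrative assumptions.
import numpy as np

# 2^3 full factorial in coded units (-1 / +1):
# x1 = current density, x2 = bath temperature, x3 = immersion time
levels = np.array([[x1, x2, x3]
                   for x1 in (-1, 1) for x2 in (-1, 1) for x3 in (-1, 1)])
loading = np.array([1.42, 1.55, 1.38, 1.51, 1.60, 1.76, 1.58, 1.71])  # g/dm^2

A = np.column_stack([np.ones(len(levels)), levels])  # intercept + main effects
coef, *_ = np.linalg.lstsq(A, loading, rcond=None)
print(dict(zip(["b0", "current", "temp", "time"], coef.round(4))))
```

Because the coded columns of a full factorial are mutually orthogonal, each fitted coefficient is simply half the average effect of moving that factor from its low to its high level, which is what makes such designs convenient for selecting preferred levels of the process variables.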
Nguyen, Dinh Duc; Yoon, Yong Soo; Bui, Xuan Thanh; Kim, Sung Su; Chang, Soon Woong; Guo, Wenshan; Ngo, Huu Hao
2017-11-01
Performance of an electrocoagulation (EC) process in batch and continuous operating modes was thoroughly investigated and evaluated for enhancing wastewater phosphorus removal under various operating conditions, considered individually and in combination: initial phosphorus concentration, wastewater conductivity, current density, and electrolysis time. The results revealed excellent phosphorus removal (72.7-100%) for both processes within 3-6 min of electrolysis, with relatively low energy requirements, i.e., less than 0.5 kWh/m³ of treated wastewater. However, the removal efficiency of phosphorus in the continuous EC operation mode was better than that in batch mode within the scope of the study. Additionally, the rate and efficiency of phosphorus removal strongly depended on operational parameters, including wastewater conductivity, initial phosphorus concentration, current density, and electrolysis time. Based on the experimental data, statistical model verification via response surface methodology (RSM) (multiple-factor optimization) was also established to provide further insights and accurately describe the interactive relationships between the process variables, thus optimizing EC process performance. The EC process using iron electrodes is promising for improving wastewater phosphorus removal efficiency, and RSM can be a sustainable tool for predicting the performance of the EC process and explaining the influence of the process variables.
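The RSM step, fitting a second-order polynomial to removal efficiencies over coded process variables and locating the predicted optimum, can be sketched as follows; the design points and response values are invented, not the study's data.

```python
# Fit a second-order response surface in two coded variables (e.g. current
# density x1 and electrolysis time x2) and locate the predicted optimum on a
# grid.  The synthetic response mimics a removal-efficiency surface in %.
import numpy as np

x1, x2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])          # 3x3 face-centred design
x1, x2 = x1.ravel(), x2.ravel()
y = (90 - 5*x1**2 - 8*x2**2 + 3*x1 + 2*x2 + x1*x2
     + np.random.default_rng(1).normal(0, 0.5, 9))    # noisy "measurements"

# Quadratic model: b0 + b1 x1 + b2 x2 + b3 x1 x2 + b4 x1^2 + b5 x2^2
A = np.column_stack([np.ones(9), x1, x2, x1*x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

g = np.linspace(-1, 1, 201)                           # search the coded region
G1, G2 = np.meshgrid(g, g)
pred = b[0] + b[1]*G1 + b[2]*G2 + b[3]*G1*G2 + b[4]*G1**2 + b[5]*G2**2
i = np.unravel_index(pred.argmax(), pred.shape)
print(f"optimum near x1={G1[i]:.2f}, x2={G2[i]:.2f}, "
      f"predicted removal {pred[i]:.1f}%")
```

The negative quadratic coefficients give the surface an interior maximum, which is exactly the situation in which RSM pays off over one-factor-at-a-time experimentation.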
NASA Technical Reports Server (NTRS)
Cecil, R. W.; White, R. A.; Szczur, M. R.
1972-01-01
The IDAMS Processor is a package of task routines and support software that performs convolution filtering, image expansion, fast Fourier transformation, and other operations on a digital image tape. A unique task control card for that program, together with any necessary parameter cards, selects each processing technique to be applied to the input image. A variable number of tasks can be selected for execution by including the proper task and parameter cards in the input deck. An executive maintains control of the run; it initiates execution of each task in turn and handles any necessary error processing.
Children's adjustment to their divorced parents' new relationships.
Isaacs, Ar
2002-08-01
With new relationships common after divorce, researchers have tried to determine the factors that predict how well children adjust to their stepfamily. The many potential factors are often grouped into the categories of family process, individual risk and vulnerability, and ecological variables. Family process concentrates on the impact of disrupted family relationships; positive outcomes are associated with low conflict and authoritative parenting. Individual risk and vulnerability includes attributes of the child and the adults; positive outcomes are associated with children who have an easy temperament. Adolescents and girls may have particular difficulty adjusting. Ecological perspectives include the larger social environment, such as peers and school.
Tannase production by Paecilomyces variotii.
Battestin, Vania; Macedo, Gabriela Alves
2007-07-01
Surface response methodology was applied to the optimization of laboratory-scale production of tannase using a lineage of Paecilomyces variotii. A preliminary study was conducted to evaluate the effects of variables, including temperature (°C), residue composition (%) (coffee husk:wheat bran), tannic acid (%) and salt solutions (%), on the production of tannase during 3, 5 and 7 days of fermentation. Among these variables, temperature, residue and tannic acid had significant effects on tannase production. The variables were optimized using surface response methodology. The best conditions for tannase production were: temperature of 29-34 °C; tannic acid at 8.5-14%; residue of coffee husk:wheat bran 50:50; and an incubation time of 5 days. Supplementation of the optimized medium with external nitrogen and carbon sources at 0.4%, 0.8% and 1.2% concentrations was then studied. Three nitrogen sources (yeast extract, ammonium nitrate and sodium nitrate) and one carbon source (starch) were tested. Only ammonium nitrate showed a significant effect on tannase production. After the optimization process, tannase activity increased 8.6-fold.
A Content Analysis of Television Ads: Does Current Practice Maximize Cognitive Processing?
2008-12-11
ads with arousing content such as sexual imagery and fatty/sweet food imagery have the potential to stress the cognitive processing system. When the... to examine differences in content arousal, this study included variables shown to elicit arousal—loved brands, sexual images, and fatty/sweet food... loved brands as well as ads with sexual and fatty/food images are not all the same—they are not likely to be equally arousing. Initially, brands were
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Clayson, Carol A.
2012-01-01
The Eastern tropical ocean basins are regions of significant atmosphere-ocean interaction and are important to variability across subseasonal to decadal time scales. The numerous physical processes at play in these areas strain the abilities of coupled general circulation models to accurately reproduce observed upper ocean variability. Furthermore, limitations in the observing system of important terms in the surface temperature balance (e.g., turbulent and radiative heat fluxes, advection) introduce uncertainty into the analyses of processes controlling sea surface temperature variability. This study presents recent efforts to close the surface temperature balance through estimation of the terms in the mixed layer temperature budget using state-of-the-art remotely sensed and model-reanalysis derived products. A set of twelve net heat flux estimates constructed using combinations of radiative and turbulent heat flux products - including GEWEX-SRB, ISCCP-SRF, OAFlux, SeaFlux, among several others - are used with estimates of oceanic advection, entrainment, and mixed layer depth variability to investigate the seasonal variability of ocean surface temperatures. Particular emphasis is placed on how well the upper ocean temperature balance is, or is not, closed on these scales using the current generation of observational and model reanalysis products. That is, the magnitudes and spatial variability of residual imbalances are addressed. These residuals are placed into context within the current uncertainties of the surface net heat fluxes and the role of the mixed layer depth variability in scaling the impact of those uncertainties, particularly in the shallow mixed layers of the Eastern tropical ocean basins.
Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2013-08-01
Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods, and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes that include various types of kinematic behaviour: constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate for analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach accounts for kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour, as well as to the fusion of measurements from multiple laser trackers.
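For context, the baseline the paper compares against, a Kalman filter with a constant-velocity motion model tracking a noisy position measurement, can be sketched in one dimension; the noise levels and target trajectory are illustrative, and a real laser-tracker filter works in 3D.

```python
# One-dimensional constant-velocity Kalman filter.  State is [position,
# velocity]; only position is measured.  All tuning values are illustrative.
import numpy as np

dt, q, r = 0.01, 1e-2, 1e-4           # step (s), process noise, measurement noise
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
R = np.array([[r]])

x, P = np.zeros((2, 1)), np.eye(2)
rng = np.random.default_rng(0)
true_pos = lambda t: 0.5 * t          # target moving at a constant 0.5 m/s

est = []
for k in range(200):
    z = true_pos(k * dt) + rng.normal(0, r**0.5)
    x = F @ x                          # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                # update with measurement z
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est.append(float(x[0, 0]))

print(f"final position error: {abs(est[-1] - true_pos(199 * dt)):.4f} m")
```

The filter's weakness, which motivates the hybrid estimator, is visible as soon as the motion model is wrong: an abrupt change in velocity or turn rate produces exactly the overshoots the paper reports.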
De la Fuente, Jesus; Zapata, Lucía; Martínez-Vicente, Jose M.; Sander, Paul; Cardelle-Elawar, María
2014-01-01
The present investigation examines how personal self-regulation (presage variable) and regulatory teaching (process variable of teaching) relate to learning approaches, strategies for coping with stress, and self-regulated learning (process variables of learning) and, finally, how they relate to performance and satisfaction with the learning process (product variables). The objective was to clarify the associative and predictive relations between these variables, as contextualized in two different models that use the presage-process-product paradigm (the Biggs and DEDEPRO models). A total of 1101 university students participated in the study. The design was cross-sectional and retrospective with attributional (or selection) variables, using correlations and structural analysis. The results provide consistent and significant empirical evidence for the relationships hypothesized, incorporating variables that are part of and influence the teaching–learning process in Higher Education. Findings confirm the importance of interactive relationships within the teaching–learning process, where personal self-regulation is assumed to take place in connection with regulatory teaching. Variables that are involved in the relationships validated here reinforce the idea that both personal factors and teaching and learning factors should be taken into consideration when dealing with a formal teaching–learning context at university. PMID:25964764
NASA Technical Reports Server (NTRS)
Collatz, G. James; Kawa, R.
2007-01-01
Progress in better determining CO2 sources and sinks will almost certainly rely on utilization of more extensive and intensive CO2 and related observations including those from satellite remote sensing. Use of advanced data requires improved modeling and analysis capability. Under NASA Carbon Cycle Science support we seek to develop and integrate improved formulations for 1) atmospheric transport, 2) terrestrial uptake and release, 3) biomass and 4) fossil fuel burning, and 5) observational data analysis including inverse calculations. The transport modeling is based on meteorological data assimilation analysis from the Goddard Modeling and Assimilation Office. Use of assimilated met data enables model comparison to CO2 and other observations across a wide range of scales of variability. In this presentation we focus on the short end of the temporal variability spectrum: hourly to synoptic to seasonal. Using CO2 fluxes at varying temporal resolution from the SIB 2 and CASA biosphere models, we examine the model's ability to simulate CO2 variability in comparison to observations at different times, locations, and altitudes. We find that the model can resolve much of the variability in the observations, although there are limits imposed by vertical resolution of boundary layer processes. The influence of key process representations is inferred. The high degree of fidelity in these simulations leads us to anticipate incorporation of realtime, highly resolved observations into a multiscale carbon cycle analysis system that will begin to bridge the gap between top-down and bottom-up flux estimation, which is a primary focus of NACP.
Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations
NASA Technical Reports Server (NTRS)
Christoffersen, Roy; Keller, L. P.
2014-01-01
Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects so far described have, however, included complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within the framework of a constant-deposited-energy model for amorphization. Inputs include experimentally-measured amorphization fluences, a π-steradian variable ion incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.
Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan; Briggs, Martin A.; Day-Lewis, Frederick D.
2015-01-01
Scientifically defensible predictions of field scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results in batch, column and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.
Culture as a variable in neuroscience and clinical neuropsychology: A comprehensive review
Wajman, José Roberto; Bertolucci, Paulo Henrique Ferreira; Mansur, Letícia Lessa; Gauthier, Serge
2015-01-01
Culture is a dynamic system of bidirectional influences among individuals and their environment, including psychological and biological processes, which facilitate adaptation and social interaction. One of the main challenges in clinical neuropsychology involves the cognitive, behavioral and functional assessment of people with different sociocultural backgrounds. In this review essay, which examines culture from a historical perspective through to ethical issues in cross-cultural research, including the latest significant publications, the authors sought to explore the main features related to cultural variables in neuropsychological practice and to debate the challenges found regarding the operational methods currently in use. Literature findings suggest a more comprehensive approach in cognitive and behavioral neuroscience, including an interface between elementary disciplines and applied neuropsychology. Thus, as a basis for discussion of this issue, the authors analyzed key topics related to the study of new trends in sociocultural neuroscience and the application of their concepts from a clinical perspective. PMID:29213964
A robust variable sampling time BLDC motor control design based upon μ-synthesis.
Hung, Chung-Wen; Yen, Jia-Yush
2013-01-01
The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach.
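The key modelling step, folding sampling-period variation into a multiplicative uncertainty relative to a nominal discretization, can be sketched for a first-order plant; the plant, period range, and frequency grid below are assumptions for illustration, not the paper's motor model.

```python
# For a first-order plant G(s) = 1/(s + a), zero-order-hold discretization at
# period T gives G_T(z) = (1 - e^{-aT}) / (a (z - e^{-aT})).  Variation of T
# over a speed-dependent range is then expressed as a frequency-wise
# multiplicative uncertainty |G_T(e^{jwT}) / G_T0(e^{jwT0}) - 1| against a
# nominal period T0 -- the shape of bound a mu-synthesis design consumes.
import numpy as np

a, T0 = 2.0, 0.01                                  # plant pole, nominal period
Tmin, Tmax = 0.005, 0.02                           # speed-dependent period range
w = np.logspace(-1, np.log10(np.pi / Tmax), 200)   # rad/s, up to slowest Nyquist

def G_disc(T, w):
    z = np.exp(1j * w * T)
    return (1 - np.exp(-a * T)) / (a * (z - np.exp(-a * T)))

G0 = G_disc(T0, w)
bound = np.zeros_like(w)
for T in np.linspace(Tmin, Tmax, 16):              # sweep the period range
    bound = np.maximum(bound, np.abs(G_disc(T, w) / G0 - 1))

print(f"worst-case multiplicative uncertainty: {bound.max():.2f}")
```

The bound is negligible at low frequency and grows toward the Nyquist rate of the slowest sampling, which is why the resulting robust controller must trade off bandwidth against the expected range of sampling periods.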
A Robust Variable Sampling Time BLDC Motor Control Design Based upon μ-Synthesis
Yen, Jia-Yush
2013-01-01
The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach. PMID:24327804
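The core idea above, a speed-dependent sampling time that shows up as model uncertainty, can be illustrated numerically. The sketch below discretizes a first-order motor model at a range of sampling times and measures the worst-case relative deviation of the discrete-time parameters from their nominal values, which is the size of multiplicative uncertainty a μ-synthesis design would be asked to tolerate. The plant parameters and sampling range are hypothetical, not taken from the paper.

```python
import math

# Continuous-time first-order motor model dx/dt = -a*x + b*u
# (a, b and the sampling range below are hypothetical, not from the paper).
a, b = 5.0, 2.0

def zoh_discretize(T):
    """Zero-order-hold discretization at sampling time T:
    x[k+1] = Ad*x[k] + Bd*u[k]."""
    Ad = math.exp(-a * T)
    Bd = (1.0 - Ad) * b / a
    return Ad, Bd

T0 = 0.01                                      # nominal sampling time (s)
Ts = [0.005 + 0.001 * k for k in range(11)]    # speed-dependent range: 5..15 ms

Ad0, Bd0 = zoh_discretize(T0)
# Worst-case relative deviation of the discrete-time parameters over the
# sampling range -- the multiplicative-uncertainty size a mu-synthesis
# design would need to cover.
dev = max(max(abs(Ad - Ad0) / Ad0, abs(Bd - Bd0) / Bd0)
          for Ad, Bd in map(zoh_discretize, Ts))
print(f"nominal pole {Ad0:.4f}, worst-case relative deviation {dev:.3f}")
```

For this (invented) plant, the input gain Bd varies far more with sampling time than the pole Ad does, which is exactly the kind of structured variation a robust design must absorb.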
How to decompose arbitrary continuous-variable quantum operations.
Sefi, Seckin; van Loock, Peter
2011-10-21
We present a general, systematic, and efficient method for decomposing any given exponential operator of bosonic mode operators, describing an arbitrary multimode Hamiltonian evolution, into a set of universal unitary gates. Although our approach is mainly oriented towards continuous-variable quantum computation, it may be used more generally whenever quantum states are to be transformed deterministically, e.g., in quantum control, discrete-variable quantum computation, or Hamiltonian simulation. We illustrate our scheme by presenting decompositions for various nonlinear Hamiltonians including quartic Kerr interactions. Finally, we conclude with two potential experiments utilizing offline-prepared optical cubic states and homodyne detections, in which quantum information is processed optically or in an atomic memory using quadratic light-atom interactions. © 2011 American Physical Society
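The paper's decomposition targets continuous-variable gate sets, but the underlying splitting idea can be demonstrated on ordinary finite matrices: the Lie-Trotter product formula approximates exp(A+B) by alternating short evolutions under non-commuting generators A and B. A minimal numpy sketch (2×2 matrices standing in for the bosonic operators, which are infinite-dimensional; this illustrates splitting generally, not the paper's specific construction):

```python
import numpy as np

def expm(M, terms=40):
    """Matrix exponential via truncated Taylor series (adequate for the
    small-norm matrices used here)."""
    out, P = np.eye(len(M)), np.eye(len(M))
    for k in range(1, terms):
        P = P @ M / k
        out = out + P
    return out

theta = 0.8
A = theta * np.array([[0.0, 1.0], [0.0, 0.0]])   # one non-commuting generator
B = theta * np.array([[0.0, 0.0], [1.0, 0.0]])   # the other ([A, B] != 0)
exact = expm(A + B)                              # target evolution

def trotter(n):
    """n-step Lie-Trotter splitting: (exp(A/n) exp(B/n))^n."""
    return np.linalg.matrix_power(expm(A / n) @ expm(B / n), n)

errors = [float(np.linalg.norm(trotter(n) - exact)) for n in (1, 4, 16, 64)]
print(errors)   # first-order splitting: error shrinks roughly like 1/n
```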
Implementation of in-line infrared monitor in full-scale anaerobic digestion process.
Spanjers, H; Bouvier, J C; Steenweg, P; Bisschops, I; van Gils, W; Versprille, B
2006-01-01
During start-up, but also during normal operation, anaerobic reactor systems should be run and monitored carefully to secure trouble-free operation, because the process is vulnerable to disturbances such as temporary overloading, biomass wash-out and influent toxicity. Monitoring is currently done mostly by manual sampling and subsequent laboratory analysis. Data collection, processing and feedback to system operation are manual and ad hoc, and involve high-level operator skills and attention. As a result, systems tend to be designed at relatively conservative loading rates, resulting in significant over-sizing of reactors and thus increased system cost. It is therefore desirable to have on-line and continuous access to performance data on influent and effluent quality. Relevant variables for indicating process performance include VFA, COD, alkalinity, sulphate and, if aerobic post-treatment is considered, total nitrogen, ammonia and nitrate. Recently, mid-IR spectrometry was demonstrated at pilot scale to be suitable for in-line simultaneous measurement of these variables. This paper describes a full-scale application of the technique to test its ability to monitor the above variables continuously, simultaneously and without human intervention in two process streams. For VFA, COD, sulphate, ammonium and TKN, good agreement was obtained between in-line and manual measurements. During a period of six months the in-line measurements had to be interrupted several times because of clogging. It appeared that the sample pre-treatment unit was not always able to cope with high solids concentrations.
NASA Astrophysics Data System (ADS)
Schramm, J. W.; Jin, H.; Keeling, E. G.; Johnson, M.; Shin, H. J.
2017-05-01
This paper reports on our use of a fine-grained learning progression to assess secondary students' reasoning through carbon-transforming processes (photosynthesis, respiration, biosynthesis). Based on previous studies, we developed a learning progression with four progress variables: explaining mass changes, explaining energy transformations, explaining subsystems, and explaining large-scale systems. For this study, we developed a 2-week teaching module integrating these progress variables. Students were assessed before and after instruction, with the learning progression framework driving data analysis. Our work revealed significant overall learning gains for all students, with the mean post-test person proficiency estimates higher by 0.6 logits than the pre-test proficiency estimates. Further, instructional effects were statistically similar across all grades included in the study (7th-12th) with students in the lowest third of initial proficiency evidencing the largest learning gains. Students showed significant gains in explaining the processes of photosynthesis and respiration and in explaining transformations of mass and energy, areas where prior research has shown that student misconceptions are prevalent. Student gains on items about large-scale systems were higher than with other variables (although absolute proficiency was still lower). Gains across each of the biological processes tested were similar, despite the different levels of emphasis each had in the teaching unit. Together, these results indicate that students can benefit from instruction addressing these processes more explicitly. This requires pedagogical design quite different from that usually practiced with students at this level.
NASA Astrophysics Data System (ADS)
Buscheck, T.; Glascoe, L.; Sun, Y.; Gansemer, J.; Lee, K.
2003-12-01
For the proposed Yucca Mountain geologic repository for high-level nuclear waste, the planned method of disposal involves the emplacement of cylindrical packages containing the waste inside horizontal tunnels, called emplacement drifts, bored several hundred meters below the ground surface. The emplacement drifts reside in highly fractured, partially saturated volcanic tuff. An important phenomenological consideration for the licensing of the proposed repository at Yucca Mountain is the generation of decay heat by the emplaced waste and the consequences of this decay heat. Changes in temperature will affect the hydrologic and chemical environment at Yucca Mountain. A thermohydrologic-modeling tool is necessary to support the performance assessment of the Engineered Barrier System (EBS) of the proposed repository. This modeling tool must simultaneously account for processes occurring at a scale of a few tens of centimeters around individual waste packages, for processes occurring around the emplacement drifts themselves, and for processes occurring at the multi-kilometer scale of the mountain. Additionally, many other features must be considered including non-isothermal, multiphase-flow in fractured porous rock of variable liquid-phase saturation and thermal radiation and convection in open cavities. The Multiscale Thermohydrologic Model (MSTHM) calculates the following thermohydrologic (TH) variables: temperature, relative humidity, liquid-phase saturation, evaporation rate, air-mass fraction, gas-phase pressure, capillary pressure, and liquid- and gas-phase fluxes. The TH variables are determined as a function of position along each of the emplacement drifts in the repository and as a function of waste-package (WP) type. 
These variables are determined at various generic locations within the emplacement drifts, including the waste-package and drip-shield surfaces and the invert, as well as at various generic locations in the adjoining host rock, at 20-m intervals along each emplacement drift in the repository. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow and captures the influence of the key engineering-design variables and natural-system factors affecting TH conditions in the emplacement drifts and adjoining host rock. Presented is a synopsis of recent MSTHM calculations conducted to support the Total System Performance Assessment for the License Application (TSPA-LA). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
Hurricane Gustav: Observations and Analysis of Coastal Change
Doran, Kara S.; Stockdon, Hilary F.; Plant, Nathaniel G.; Sallenger, Asbury H.; Guy, Kristy K.; Serafin, Katherine A.
2009-01-01
Understanding storm-induced coastal change and forecasting these changes require knowledge of the physical processes associated with a storm and the geomorphology of the impacted coastline. The primary physical processes of interest are the wind field, storm surge, currents, and wave field. Not only does wind cause direct damage to structures along the coast, but it is ultimately responsible for much of the energy that is transferred to the ocean and expressed as storm surge, mean currents, and surface waves. Waves and currents are the processes most responsible for moving sediments in the coastal zone during extreme storm events. Storm surge, which is the rise in water level due to the wind, barometric pressure, and other factors, allows both waves and currents to attack parts of the coast not normally exposed to these processes. Coastal geomorphology, including shapes of the shoreline, beaches, and dunes, is also a significant aspect of the coastal change observed during extreme storms. Relevant geomorphic variables include sand dune elevation, beach width, shoreline position, sediment grain size, and foreshore beach slope. These variables, in addition to hydrodynamic processes, can be used to predict coastal vulnerability to storms. The U.S. Geological Survey (USGS) National Assessment of Coastal Change Hazards project (http://coastal.er.usgs.gov/hurricanes) strives to provide hazard information to those concerned about the Nation's coastlines, including residents of coastal areas, government agencies responsible for coastal management, and coastal researchers. As part of the National Assessment, observations were collected to measure morphological changes associated with Hurricane Gustav, which made landfall near Cocodrie, Louisiana, on September 1, 2008. Methods of observation included oblique aerial photography, airborne topographic surveys, and ground-based topographic surveys. 
This report documents these data-collection efforts and presents qualitative and quantitative descriptions of hurricane-induced changes to the shoreline, beaches, dunes, and infrastructure in the region that was heavily impacted by Hurricane Gustav.
Bio-inspired online variable recruitment control of fluidic artificial muscles
NASA Astrophysics Data System (ADS)
Jenkins, Tyler E.; Chapman, Edward M.; Bryant, Matthew
2016-12-01
This paper details the creation of a hybrid variable recruitment control scheme for fluidic artificial muscle (FAM) actuators with an emphasis on maximizing system efficiency and switching control performance. Variable recruitment is the process of altering a system’s active number of actuators, allowing operation in distinct force regimes. Previously, FAM variable recruitment was only quantified with offline, manual valve switching; this study addresses the creation and characterization of novel, on-line FAM switching control algorithms. The bio-inspired algorithms are implemented in conjunction with a PID and model-based controller, and applied to a simulated plant model. Variable recruitment transition effects and chatter rejection are explored via a sensitivity analysis, allowing a system designer to weigh tradeoffs in actuator modeling, algorithm choice, and necessary hardware. Variable recruitment is further developed through simulation of a robotic arm tracking a variety of spline position inputs, requiring several levels of actuator recruitment. Switching controller performance is quantified and compared with baseline systems lacking variable recruitment. The work extends current variable recruitment knowledge by creating novel online variable recruitment control schemes, and exploring how online actuator recruitment affects system efficiency and control performance. Key topics associated with implementing a variable recruitment scheme, including the effects of modeling inaccuracies, hardware considerations, and switching transition concerns are also addressed.
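As a concrete illustration of online recruitment with chatter rejection, the sketch below picks the number of active actuators from the instantaneous force demand, with a hysteresis band around each recruitment threshold so that small oscillations in demand do not cause rapid switching. The per-actuator force and hysteresis fraction are invented numbers, not values from the paper.

```python
def recruit(force_demand, n_active, f_unit=100.0, hysteresis=0.1):
    """Pick the number of active fluidic artificial muscles for a force
    demand (N). Each actuator contributes up to f_unit newtons; the
    hysteresis band (fraction of f_unit) rejects chatter near the
    recruitment thresholds. All numbers are illustrative, not the paper's."""
    upper = n_active * f_unit * (1.0 + hysteresis)          # recruit above this
    lower = (n_active - 1) * f_unit * (1.0 - hysteresis)    # derecruit below this
    if force_demand > upper:
        n_active += 1
    elif n_active > 1 and force_demand < lower:
        n_active -= 1
    return n_active

# Demand sweeping up then down: the recruitment level follows, one step at a
# time, with no chatter at the transitions.
level, history = 1, []
for demand in [50, 120, 240, 360, 240, 120, 50]:
    level = recruit(demand, level)
    history.append(level)
print(history)   # -> [1, 2, 3, 4, 3, 2, 1]
```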
PHT3D-UZF: A reactive transport model for variably-saturated porous media
Wu, Ming Zhi; Post, Vincent E. A.; Salmon, S. Ursula; Morway, Eric D.; Prommer, H.
2016-01-01
A modified version of the MODFLOW/MT3DMS-based reactive transport model PHT3D was developed to extend current reactive transport capabilities to the variably-saturated component of the subsurface system and incorporate diffusive reactive transport of gaseous species. Referred to as PHT3D-UZF, this code incorporates flux terms calculated by MODFLOW's unsaturated-zone flow (UZF1) package. A volume-averaged approach similar to the method used in UZF-MT3DMS was adopted. The PHREEQC-based computation of chemical processes within PHT3D-UZF in combination with the analytical solution method of UZF1 allows for comprehensive reactive transport investigations (i.e., biogeochemical transformations) that jointly involve saturated and unsaturated zone processes. Intended for regional-scale applications, UZF1 simulates downward-only flux within the unsaturated zone. The model was tested by comparing simulation results with those of existing numerical models. The comparison was performed for several benchmark problems that cover a range of important hydrological and reactive transport processes. A 2D simulation scenario was defined to illustrate the geochemical evolution following dewatering in a sandy acid sulfate soil environment. Other potential applications include the simulation of biogeochemical processes in variably-saturated systems that track the transport and fate of agricultural pollutants, nutrients, natural and xenobiotic organic compounds and micropollutants such as pharmaceuticals, as well as the evolution of isotope patterns.
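The sequential coupling this family of codes relies on — move the solutes, then react them, each time step — can be sketched in miniature. Below, a toy 1D column combines explicit upwind advection with first-order decay in a split step; the grid, velocity and decay rate are invented for illustration, and the real code solves far richer PHREEQC-based chemistry rather than simple decay.

```python
import math

def step(c, cr, k_dt):
    """One split time step on a 1D column: advect (explicit upwind,
    Courant number cr <= 1, zero-concentration inflow), then react
    (first-order decay with rate*dt = k_dt)."""
    advected = [c[0] * (1 - cr)] + [
        c[i] + cr * (c[i - 1] - c[i]) for i in range(1, len(c))
    ]
    decay = math.exp(-k_dt)
    return [x * decay for x in advected]

# Solute pulse introduced in the first cell of a 10-cell column.
conc = [1.0] + [0.0] * 9
for _ in range(8):
    conc = step(conc, cr=0.5, k_dt=0.1)
print(max(range(10), key=lambda i: conc[i]))   # prints 4: the peak has migrated
```

The split structure is what lets a flow/transport solver and a geochemical solver evolve independently within each step, at the cost of a splitting error that shrinks with the step size.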
Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert
2017-01-01
Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to support nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving the efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
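The simplest instance of the mapping described — process and formulation variables in, a performance attribute out — is a linear least-squares fit. The sketch below uses synthetic data with invented predictors (compression force, binder fraction, porosity) purely to show the mechanics; the paper's actual model combines ultrasonic measurements with machine-learning tools, not this plain regression.

```python
import numpy as np

# Synthetic stand-in for the paper's data: three invented predictors
# (compression force kN, binder mass fraction, porosity) -> breaking force.
rng = np.random.default_rng(0)
X = rng.uniform([5.0, 0.01, 0.05], [25.0, 0.05, 0.25], size=(40, 3))
true_w = np.array([4.0, 300.0, -80.0])
true_b = 10.0
y = X @ true_w + true_b                       # noiseless "measured" response

A = np.hstack([X, np.ones((len(X), 1))])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
print(coef)   # recovers ~[4, 300, -80, 10] on this noiseless data
```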
NASA Astrophysics Data System (ADS)
Bond, B. J.; Peterson, K.; McKane, R.; Lajtha, K.; Quandt, D. J.; Allen, S. T.; Sell, S.; Daly, C.; Harmon, M. E.; Johnson, S. L.; Spies, T.; Sollins, P.; Abdelnour, A. G.; Stieglitz, M.
2010-12-01
We are pursuing the ambitious goal of understanding how complex terrain influences the responses of carbon and water cycle processes to climate variability and climate change. Our studies take place in H.J. Andrews Experimental Forest, an LTER (Long Term Ecological Research) site situated in Oregon’s central-western Cascade Range. Decades of long-term measurements and intensive research have revealed influences of topography on vegetation patterns, disturbance history, and hydrology. More recent research has shown surprising interactions between microclimates and synoptic weather patterns due to cold air drainage and pooling in mountain valleys. Using these data and insights, in addition to a recent LiDAR (Light Detection and Ranging) reconnaissance and a small sensor network, we are employing process-based models, including “SPA” (Soil-Plant-Atmosphere, developed by Mathew Williams of the University of Edinburgh), and “VELMA” (Visualizing Ecosystems for Land Management Alternatives, developed by Marc Stieglitz and colleagues of the Georgia Institute of Technology) to focus on two important features of mountainous landscapes: heterogeneity (both spatial and temporal) and connectivity (atmosphere-canopy-hillslope-stream). Our research questions include: 1) Do fine-scale spatial and temporal heterogeneity result in emergent properties at the basin scale, and if so, what are they? 2) How does connectivity across ecosystem components affect system responses to climate variability and change? Initial results show that for environmental drivers that elicit non-linear ecosystem responses on the plot scale, such as solar radiation, soil depth and soil water content, fine-scale spatial heterogeneity may produce unexpected emergent properties at larger scales. The results from such modeling experiments are necessarily a function of the supporting algorithms. However, comparisons based on models such as SPA and VELMA that operate at much different spatial scales (plots vs. hillslopes) and levels of biophysical organization (individual plants vs. aggregate plant biomass) can help us to understand how and why mountainous ecosystems may have distinctive responses to climate variability and climate change.
Effect of process variables on the quality attributes of briquettes from wheat, oat, canola and barley straw
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tumuluru, Jaya Shankar; Tabil, L. G.; Song, Y.; Iroba, K. L.; Meda, V.
2011-08-01
Biomass is a renewable energy source and an environmentally friendly substitute for fossil fuels such as coal and petroleum products. A major limitation of biomass for successful energy applications is its low bulk density, which makes it very difficult and costly to transport and handle. To overcome this limitation, biomass has to be densified. The commonly used technologies for densification of biomass are pelletization and briquetting. Briquetting offers many advantages, as it can densify larger biomass particle sizes at higher moisture contents. Briquetting is influenced by a number of feedstock and process variables, such as moisture content and particle size distribution, and by operating variables such as temperature and densification pressure. In the present study, experiments were designed and conducted based on a Box-Behnken design to produce briquettes from wheat, oat, canola and barley straws. A laboratory-scale hydraulic briquette press was used. The process variables and their levels were pressure (7.5, 10, 12.5 MPa), temperature (90, 110, 130 °C), moisture content (9, 12, 15% w.b.), and particle size (19.1, 25.04, 31.75 mm). The quality variables studied include moisture content, initial density and final briquette density after two weeks of storage, size distribution index, and durability. The raw biomass was initially chopped and size-reduced using a hammer mill. The ground biomass was conditioned to different moisture contents and then densified in the laboratory hydraulic press. For each treatment combination, ten briquettes were manufactured with a residence time of about 30 s after the compression-pressure set point was reached. After compression, the initial dimensions and the final dimensions after 2 weeks of storage in a controlled environment were measured for all samples. Durability, dimensional stability, and moisture content tests were conducted after two weeks of storage of the briquettes produced. Initial results indicated that moisture content played a significant role in briquette durability, stability, and density. Low straw moisture content (7-12%) gave more durable briquettes. Briquette density increased with increasing pressure, depending on the moisture content. Axial expansion was more significant than lateral expansion, which in some cases tended to be nil depending on the material and operating variables. Further data analysis is in progress to assess the significance of the process variables based on ANOVA. Regression models were developed to predict the changes in briquette quality with respect to the process variables under study. Keywords: herbaceous biomass, densification, briquettes, density, durability, dimensional stability, ANOVA, regression equations
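The Box-Behnken design class used in studies like this one is easy to reproduce in coded units: every pair of factors is run at its four (±1, ±1) corners while all remaining factors sit at their midpoints, plus a few center runs. A small generator follows; the four-factor case corresponds to a pressure/temperature/moisture/particle-size layout, and the center-run count is a typical choice, not a value taken from the abstract.

```python
from itertools import combinations, product

def box_behnken(k, center_runs=3):
    """Box-Behnken design in coded units for k factors: each factor pair is
    run at its four (+/-1, +/-1) corners with the remaining factors held at
    their midpoints (0), plus center runs. center_runs=3 is a common choice,
    not taken from the study."""
    runs = []
    for i, j in combinations(range(k), 2):
        for lo_hi in product((-1, 1), repeat=2):
            pt = [0] * k
            pt[i], pt[j] = lo_hi
            runs.append(pt)
    runs += [[0] * k for _ in range(center_runs)]
    return runs

# Four factors, e.g. pressure, temperature, moisture content, particle size.
design = box_behnken(4)
print(len(design))   # 6 pairs x 4 corners + 3 center runs = 27
```

Because no run ever sets more than two factors to their extremes, the design avoids the corner combinations that are often physically awkward (e.g. maximum pressure at maximum moisture) while still supporting a second-order response-surface fit.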
Correlates of compliance with national comprehensive smoke-free laws.
Peruga, Armando; Hayes, Luminita S; Aguilera, Ximena; Prasad, Vinayak; Bettcher, Douglas W
2017-12-05
To explore correlates of high compliance with smoking bans in a cross-sectional data set from the 41 countries with national comprehensive smoke-free laws in 2014 and complete data on compliance and enforcement. Outcome variable: compliance with the national comprehensive smoke-free law in each country, obtained for 2014 from the WHO report on the global tobacco epidemic. Explanatory variables: legal enforcement requirements, penalties, infrastructure and strategy, obtained through a separate survey of governments; country socioeconomic and demographic characteristics, including the level of corruption control, were also included. An initial bivariate analysis determined the significance of each potentially relevant explanatory variable of high compliance. Differences in compliance were tested using exact logistic regression. High compliance with the national comprehensive smoke-free law was associated with the involvement of local jurisdictions in providing training and/or guidance for inspections (OR=10.3, 95% CI 1.7 to 117.7) and a perception of high corruption-control efforts in the country (OR=7.2, 95% CI 1.1 to 85.8). The results show the importance of the depth of the enforcement infrastructure and effort, represented by the degree to which local government is involved in enforcement. They also show the significance of fighting corruption in the enforcement process, including the attempts of the tobacco industry to undermine the process, for achieving high levels of compliance with the law. The results point to the need to invest minimal but essential enforcement resources, given that national comprehensive smoke-free laws are self-enforcing in many, but not all, countries and sectors.
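The odds ratios above come from exact logistic regression, which suits the small sample of 41 countries. For intuition, the plain large-sample (Wald) odds ratio from a 2×2 table is easy to compute by hand; the sketch below is this simpler method with made-up counts, not the paper's analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table: a/b count high-compliance
    units with/without the factor, c/d the low-compliance ones. (The paper
    itself used exact logistic regression, better suited to n=41; this is
    the plain large-sample version, with invented counts below.)"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)
print(f"OR={or_:.1f}, 95% CI {lo:.2f} to {hi:.2f}")   # OR=4.0; CI excludes 1
```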
Complementary Roles for Amygdala and Periaqueductal Gray in Temporal-Difference Fear Learning
ERIC Educational Resources Information Center
Cole, Sindy; McNally, Gavan P.
2009-01-01
Pavlovian fear conditioning is not a unitary process. At the neurobiological level multiple brain regions and neurotransmitters contribute to fear learning. At the behavioral level many variables contribute to fear learning including the physical salience of the events being learned about, the direction and magnitude of predictive error, and the…
Content Analysis Schedule for Bilingual Education Programs: BICEP Intercambio de la Cultura.
ERIC Educational Resources Information Center
Shore, Marietta Saravia; Nafus, Charles
This content analysis schedule for BICEP Intercambio de la Cultura (San Bernardino, California), presents information on the history, funding, and scope of the project. Included are sociolinguistic process variables such as the native and dominant languages of students and their interaction. Information is provided on staff selection and the…
The Measurement and Cost of Removing Unexplained Gender Differences in Faculty Salaries.
ERIC Educational Resources Information Center
Becker, William E.; Toutkoushian, Robert K.
1995-01-01
In assessing sex-discrimination suit damages, debate rages over the type and number of variables included in a single-equation model of the salary-determination process. This article considers single- and multiple-equation models, providing 36 different damage calculations. For University of Minnesota data, equalization cost hinges on the…
Research in Reading in English as a Second Language.
ERIC Educational Resources Information Center
Devine, Joanne, Ed.; And Others
This collection of essays, most followed by comments, reflects some aspect of the general theme: reading is a multifaceted, complex, interactive process that involves many subskills and many types of reader, as well as text, variables. Papers include: "The Eclectic Synergy of Methods of Reading Research" (Ulla Connor); "A View of…
Learning and Optimization of Cognitive Capabilities. Final Project Report.
ERIC Educational Resources Information Center
Lumsdaine, A.A.; And Others
The work of a three-year series of experimental studies of human cognition is summarized in this report. Problem solving and learning in man-machine interaction were investigated, as well as relevant variables and processes. The work included four separate projects: (1) computer-aided problem solving, (2) computer-aided instruction techniques, (3)…
Content Analysis Schedule for Bilingual Education Programs: Bilingual Project Forward-Adelante.
ERIC Educational Resources Information Center
Figueroa, Ramon
This content analysis schedule for the Bilingual Project of Rochester, New York presents information on the history, funding, and scope of the project. Included are sociolinguistic process variables such as the native and dominant languages of students and their interaction. Information is provided on staff selection and the linguistic background…
Teaching and Learning in Higher Education.
ERIC Educational Resources Information Center
Dart, Barry; Boulton-Lewis, Gillian
The 11 chapters in this book, each contributed by a different author, are organized around the "3P model" of learning at the college level developed by John Biggs, which allows teachers to monitor and modify their teaching in light of students' learning. The 3P model includes presage (student and situational variables), process (how…
ERIC Educational Resources Information Center
Hess, Richard T.; And Others
This content analysis schedule for the Albuquerque (New Mexico) Public School Bicultural-Bilingual Program presents information on the history, funding, and scope of the project. Included are sociolinguistic process variables such as the native and dominant languages of students and their interaction. Information is provided on staff selection and…
A Pilot-Scale Heat Recovery System for Computer Process Control Teaching and Research.
ERIC Educational Resources Information Center
Callaghan, P. J.; And Others
1988-01-01
Describes the experimental system and equipment including an interface box for displaying variables. Discusses features which make the circuit suitable for teaching and research in computing. Feedforward, decoupling, and adaptive control, examination of digital filtering, and a cascade loop are teaching experiments utilizing this rig. Diagrams and…
Using the Terms "Hypothesis" and "Variable" for Qualitative Work: A Critical Reflection
ERIC Educational Resources Information Center
Lareau, Annette
2012-01-01
Ralph LaRossa's (2012) thoughtful piece suggested that qualitative researchers' self-awareness (and clear articulation) of their conceptual and empirical goals can help their manuscripts in many ways, including during the review process. If authors self-consciously embrace particular orientations, then it will be easier for reviewers to evaluate…
ERIC Educational Resources Information Center
Chao, Ruth; Kanatsu, Akira
2008-01-01
This study examined both socioeconomic and cultural factors in explaining ethnic differences in monitoring, behavioral control, and warmth--part of a series of coordinated studies presented in this special issue. Socioeconomic variables included mother's and father's educational levels, employment status, home ownership, number of siblings in the…
Interplay Between Reading Tasks, Reader Variables, and Unknown Word Processing.
ERIC Educational Resources Information Center
Levine, Adina; Reves, Thea
1998-01-01
Examined to what extent readers' word-treatment strategies are task dependent, and to what extent word-treatment strategies are dependent on the reader's reading profile. Subjects were 42 students in an advanced English-for-academic-purposes reading-comprehension course. Instruments used in the study included a word-treatment experiment, an open…
Gestural communication in young gorillas (Gorilla gorilla): gestural repertoire, learning, and use.
Pika, Simone; Liebal, Katja; Tomasello, Michael
2003-07-01
In the present study we investigated the gestural communication of gorillas (Gorilla gorilla). The subjects were 13 gorillas (1-6 years old) living in two different groups in captivity. Our goal was to compile the gestural repertoire of subadult gorillas, with a special focus on processes of social cognition, including attention to individual and developmental variability, group variability, and flexibility of use. Thirty-three different gestures (six auditory, 11 tactile, and 16 visual gestures) were recorded. We found idiosyncratic gestures, individual differences, and similar degrees of concordance between and within groups, as well as some group-specific gestures. These results provide evidence that ontogenetic ritualization is the main learning process involved, but some form of social learning may also be responsible for the acquisition of special gestures. The present study establishes that gorillas have a multifaceted gestural repertoire, characterized by a great deal of flexibility with accommodations to various communicative circumstances, including the attentional state of the recipient. The possibility of applying Seyfarth and Cheney's [1997] model of nonhuman primate vocal development to the development of nonhuman primate gestural communication is discussed. Copyright 2003 Wiley-Liss, Inc.
Analysis of AVHRR, CZCS and historical in situ data off the Oregon Coast
NASA Technical Reports Server (NTRS)
Strub, P. Ted; Chelton, Dudley B.
1990-01-01
The original scientific objectives of this grant were to: (1) characterize the seasonal cycles and interannual variability for phytoplankton concentrations and sea surface temperature (SST) in the California Current using satellite data; and (2) to explore the spatial and temporal relationship between these variables and surface wind forcing. An additional methodological objective was to develop statistical methods for forming mean fields, which minimize the effects of random data gaps and errors in the irregularly sampled CZCS (Coastal Zone Color Scanner) and AVHRR (Advanced Very High Resolution Radiometer) satellite data. A final task was to evaluate the level of uncertainty in the wind fields used for the statistical analysis. Funding in the first year included part of the cost of an image processing system to enable this and other projects to process and analyze satellite data. This report consists of summaries of the major projects carried out with all or partial support from this grant. The appendices include a list of papers and professional presentations supported by the grant, as well as reprints of the major papers and reports.
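In their simplest modern form, the gap-tolerant mean fields described here are a per-pixel average over whatever samples survive cloud and orbit gaps. A minimal numpy sketch with three invented 4×4 "scenes" follows; the grant's actual estimator was more elaborate, designed to minimize the effects of random data gaps and errors, so this only illustrates the basic idea.

```python
import numpy as np

# Three hypothetical 4x4 satellite "scenes" (e.g. SST composites) with NaN
# marking cloud/orbit data gaps; values are invented for illustration.
scenes = np.full((3, 4, 4), np.nan)
scenes[0] = 1.0                  # full coverage
scenes[1] = 2.0                  # full coverage
scenes[2, :2, :] = 3.0           # partial coverage: bottom half is a gap

mean_field = np.nanmean(scenes, axis=0)       # per-pixel mean over valid samples
n_valid = np.sum(~np.isnan(scenes), axis=0)   # sample count behind each pixel
print(mean_field[0, 0], mean_field[3, 3])     # 2.0 (three samples), 1.5 (two)
```

Carrying the per-pixel sample count alongside the mean field is what lets downstream analysis weight or mask pixels whose averages rest on too few observations.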
Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh
2017-08-01
One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000-2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL.
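The overall mean effect size reported above comes from standard meta-analytic aggregation. As a minimal sketch of that idea only, and not the study's actual data or weighting scheme, an inverse-variance weighted mean of per-study effect sizes can be computed as follows; all numbers are invented.

```python
# Sketch: inverse-variance weighted (fixed-effect) pooled effect size,
# the basic aggregation step behind meta-analyses like the one above.
# All per-study values below are hypothetical illustrations.

def weighted_mean_effect(effects, variances):
    """Fixed-effect pooled estimate: weight each study by 1/variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled

effects = [0.3, 0.6, 0.5]       # hypothetical standardized mean differences
variances = [0.04, 0.09, 0.02]  # hypothetical sampling variances
print(round(weighted_mean_effect(effects, variances), 3))
```

Real meta-analyses such as this one typically also fit random-effects models and moderator analyses; this sketch shows only the core weighted average.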
Thresholds for conservation and management: structured decision making as a conceptual framework
Nichols, James D.; Eaton, Mitchell J.; Martin, Julien; Edited by Guntenspergen, Glenn R.
2014-01-01
changes in system dynamics. They are frequently incorporated into ecological models used to project system responses to management actions. Utility thresholds are components of management objectives and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. Decision thresholds are derived from the other components of the decision process. We advocate a structured decision making (SDM) approach within which the following components are identified: objectives (possibly including utility thresholds), potential actions, models (possibly including ecological thresholds), monitoring program, and a solution algorithm (which produces decision thresholds). Adaptive resource management (ARM) is described as a special case of SDM developed for recurrent decision problems that are characterized by uncertainty. We believe that SDM, in general, and ARM, in particular, provide good approaches to conservation and management. Use of SDM and ARM also clarifies the distinct roles of ecological thresholds, utility thresholds, and decision thresholds in informed decision processes.
Heralded processes on continuous-variable spaces as quantum maps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi
2014-12-04
Heralding processes, which succeed only when a measurement on part of the system gives the desired result, are particularly interesting for continuous variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However, while maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable spaces, they are much rarer in the continuous-variable domain. Moreover, no convenient tool for representing maps in a way better adapted to the particularities of continuous variables has yet been explored. In this paper we try to fill this gap by presenting such a tool.
Vuolo, Janet; Goffman, Lisa
2017-01-01
This exploratory treatment study used phonetic transcription and speech kinematics to examine changes in segmental and articulatory variability. Nine children, ages 4 to 8 years, served as participants, including two with childhood apraxia of speech (CAS), five with speech sound disorder (SSD), and two who were typically developing (TD). Children practised producing agent + action phrases in an imitation task (low linguistic load) and a retrieval task (high linguistic load) over five sessions. In the imitation task in session one, both participants with CAS showed high degrees of segmental and articulatory variability. After five sessions, imitation practice resulted in increased articulatory variability for five participants. Retrieval practice resulted in decreased articulatory variability in three participants with SSD. These results suggest that short-term speech production practice in rote imitation disrupts articulatory control in children with and without CAS. In contrast, tasks that require linguistic processing may scaffold learning for children with SSD but not CAS. PMID:27960554
Jha, Pankaj; Das, Arup Jyoti; Deka, Sankar Chandra
2017-11-01
Phenolic compounds were extracted from the husk of milled black rice (cv. Poireton) by using a combination of ultrasound assisted extraction and microwave assisted extraction. Extraction parameters were optimized by response surface methodology (RSM) according to a three-level, five-variable Box-Behnken design. The appropriate process variables (extraction temperature and extraction time) to maximize the ethanolic extraction of total phenolic compounds, flavonoids, anthocyanins and antioxidant activity of the extracts were obtained. Extraction of functional components with varying ethanol concentration and microwave time was significantly affected by the process variables. The best possible conditions obtained by RSM for all the factors included 10.02 min sonication time, 49.46 °C sonication temperature, 1:40.79 (w/v) solute-solvent ratio, 67.34% ethanol concentration, and 31.11 s microwave time. Under these conditions, the maximum extraction of phenolics (1.65 mg/g GAE), flavonoids (3.04 mg/100 g), anthocyanins (3.39 mg/100 g) and antioxidants (100%) was predicted, while the experimental values included 1.72 mg/g GAE of total phenolics, 3.01 mg/100 g of flavonoids, 3.36 mg/100 g of anthocyanins and 100% antioxidant activity. The overall results indicated a positive impact of the co-application of microwave and ultrasound assisted extractions of phenolic compounds from black rice husk.
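The RSM step described above fits a low-order polynomial to the measured responses and locates its stationary point. A minimal one-factor sketch of that idea, using synthetic data rather than the paper's measurements (the peak location and response values are invented for illustration):

```python
import numpy as np

# Sketch: fitting a one-factor quadratic response surface by least squares,
# the core of the RSM optimization step (synthetic data, not the study's).
def fit_quadratic(x, y):
    # Design matrix [1, x, x^2] for y ≈ b0 + b1*x + b2*x^2
    X = np.column_stack([np.ones_like(x), x, x * x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic "yield vs. sonication temperature" with a peak near 50 °C
temps = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
yields = -0.01 * (temps - 50.0) ** 2 + 1.65  # noiseless, for illustration
b0, b1, b2 = fit_quadratic(temps, yields)
optimum = -b1 / (2.0 * b2)  # stationary point of the fitted parabola
print(round(optimum, 1))    # expected ≈ 50.0
```

A full Box-Behnken analysis fits the same kind of model in several factors at three coded levels, with interaction terms, and solves for the joint stationary point.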
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 /s) on the surface of target materials. Although LSP processes have been extensively studied by experiments, little effort has been devoted to elucidating the underlying process mechanisms through developing a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
PVD thermal barrier coating applications and process development for aircraft engines
NASA Astrophysics Data System (ADS)
Rigney, D. V.; Viguie, R.; Wortman, D. J.; Skelly, D. W.
1997-06-01
Thermal barrier coatings (TBCs) have been developed for application to aircraft engine components to improve service life in an increasingly hostile thermal environment. The choice of TBC type is related to the component, intended use, and economics. Selection of electron beam physical vapor deposition processing for turbine blades is due in part to part size, surface finish requirements, thickness control needs, and hole closure issues. Process development of PVD TBCs has been carried out at several different sites, including GE Aircraft Engines (GEAE). The influence of processing variables on microstructure is discussed, along with the GEAE development coater and initial experiences of pilot line operation.
The emerging conceptualization of groups as information processors.
Hinsz, V B; Tindale, R S; Vollrath, D A
1997-01-01
A selective review of research highlights the emerging view of groups as information processors. In this review, the authors include research on processing objectives, attention, encoding, storage, retrieval, processing, response, feedback, and learning in small interacting task groups. The groups as information processors perspective underscores several characteristic dimensions of variability in group performance of cognitive tasks, namely, commonality-uniqueness of information, convergence-diversity of ideas, accentuation-attenuation of cognitive processes, and belongingness-distinctiveness of members. A combination of contributions framework provides an additional conceptualization of information processing in groups. The authors also address implications, caveats, and questions for future research and theory regarding groups as information processors.
Wang, Pei; Zhang, Hui; Yang, Hailong; Nie, Lei; Zang, Hengchang
2015-02-25
Near-infrared (NIR) spectroscopy has been developed into an indispensable tool for both academic research and industrial quality control in a wide field of applications. The feasibility of NIR spectroscopy to monitor the concentration of puerarin, daidzin, daidzein and total isoflavonoid (TIF) during the extraction process of kudzu (Pueraria lobata) was verified in this work. NIR spectra were collected in transmission mode and pretreated with smoothing and derivative methods. Partial least squares regression (PLSR) was used to establish calibration models. Three different variable selection methods, including the correlation coefficient method, interval partial least squares (iPLS), and the successive projections algorithm (SPA), were performed and compared with models based on all of the variables. The results showed that the approach was very efficient and environmentally friendly for rapid determination of the four quality indices (QIs) in the kudzu extraction process. The method established here may have the potential to be used as a process analytical technology (PAT) tool in the future. Copyright © 2014 Elsevier B.V. All rights reserved.
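The calibration workflow above combines variable selection with PLS regression. A minimal sketch of one such pipeline, assuming synthetic stand-in data for the NIR spectra (the "informative wavelengths" at columns 10 and 50 are invented), with the correlation coefficient method followed by a one-component NIPALS-style PLS fit:

```python
import numpy as np

# Sketch: correlation-coefficient variable selection + one-component PLS.
# X plays the role of NIR spectra (samples x wavelengths); y an analyte
# concentration. All data are synthetic; columns 10 and 50 carry the signal.
rng = np.random.default_rng(0)
n, p = 100, 100
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 10] - 1.5 * X[:, 50] + 0.1 * rng.normal(size=n)

# 1) Correlation coefficient method: rank variables by |r| with y
r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
selected = np.argsort(-np.abs(r))[:20]
Xs = X[:, selected] - X[:, selected].mean(axis=0)
yc = y - y.mean()

# 2) One-component PLS: weight vector, score, then regress y on the score
w = Xs.T @ yc
w /= np.linalg.norm(w)
t = Xs @ w
b = (t @ yc) / (t @ t)
resid = yc - b * t
r2 = 1.0 - resid @ resid / (yc @ yc)
print(sorted(selected[:2].tolist()), round(r2, 2))
```

A production calibration would use several latent variables chosen by cross-validation, as in the iPLS and SPA comparisons the abstract mentions; this sketch shows only the shared core.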
Residual stress evaluation of components produced via direct metal laser sintering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kemerling, Brandon; Lippold, John C.; Fancher, Christopher M.
2018-03-22
Direct metal laser sintering is an additive manufacturing process which is capable of fabricating three-dimensional components using a laser energy source and metal powder particles. Despite the numerous benefits offered by this technology, the process maturity is low with respect to traditional subtractive manufacturing methods. Relationships between key processing parameters and final part properties are generally lacking and require further development. In this study, residual stresses were evaluated as a function of key process variables. The variables evaluated included laser scan strategy and build plate preheat temperature. Residual stresses were measured experimentally via neutron diffraction and computationally via finite element analysis. Good agreement was shown between the experimental and computational results. Results showed variations in the residual stress profile as a function of laser scan strategy. Compressive stresses were dominant along the build height (z) direction, and tensile stresses were dominant in the x and y directions. Build plate preheating was shown to be an effective method for alleviating residual stress due to the reduction in thermal gradient.
An Agenda for Research on the Sustainability of Public Health Programs
Dearing, James W.
2011-01-01
Funders of programs in public health and community health are increasingly concerned about the sustainability of changes they initiate. Despite a recent increase in sustainability research and evaluation, this literature has not developed a widely used paradigm for conducting research that can accumulate into generalizable findings. We provide guidance for research and evaluation of health program sustainability, including definitions and types of sustainability, specifications and measurements of dependent variables, definitions of independent variables or factors that influence sustainability, and suggestions for designs for research and data collection. We suggest viewing sustainability research as a further stage in the translation or dissemination of research-based interventions into practice. This perspective emphasizes ongoing relationships with earlier stages of a broader diffusion framework, including adoption and implementation processes. PMID:21940916
Cheng, Sen; Sabes, Philip N
2007-04-01
The sensorimotor calibration of visually guided reaching changes on a trial-to-trial basis in response to random shifts in the visual feedback of the hand. We show that a simple linear dynamical system is sufficient to model the dynamics of this adaptive process. In this model, an internal variable represents the current state of sensorimotor calibration. Changes in this state are driven by error feedback signals, which consist of the visually perceived reach error, the artificial shift in visual feedback, or both. Subjects correct for ≥20% of the error observed on each movement, despite being unaware of the visual shift. The state of adaptation is also driven by internal dynamics, consisting of a decay back to a baseline state and a "state noise" process. State noise includes any source of variability that directly affects the state of adaptation, such as variability in sensory feedback processing, the computations that drive learning, or the maintenance of the state. This noise is accumulated in the state across trials, creating temporal correlations in the sequence of reach errors. These correlations allow us to distinguish state noise from sensorimotor performance noise, which arises independently on each trial from random fluctuations in the sensorimotor pathway. We show that these two noise sources contribute comparably to the overall magnitude of movement variability. Finally, the dynamics of adaptation measured with random feedback shifts generalizes to the case of constant feedback shifts, allowing for a direct comparison of our results with more traditional blocked-exposure experiments.
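The model class described above, a scalar linear dynamical system with error-driven learning, decay, state noise, and performance noise, can be simulated in a few lines. This is a sketch with illustrative parameter values, not the values fitted in the study:

```python
import numpy as np

# Minimal sketch of the trial-by-trial adaptation model described above:
# the state (sensorimotor calibration) decays toward baseline and is
# corrected by ~20% of each observed error. Parameters are illustrative.
rng = np.random.default_rng(1)
a, b = 0.95, 0.2              # retention factor, learning rate
state_sd, perf_sd = 0.3, 0.3  # state noise vs. performance noise (comparable)

n_trials = 500
shift = 5.0                   # constant visual feedback shift
x = 0.0
states = []
for _ in range(n_trials):
    error = shift - x + perf_sd * rng.normal()       # perceived reach error
    x = a * x + b * error + state_sd * rng.normal()  # decay + correction + state noise
    states.append(x)

# With a constant shift the state converges near b*shift/(1 - a + b)
steady = b * shift / (1.0 - a + b)
print(round(steady, 2), round(float(np.mean(states[200:])), 2))
```

Because state noise enters the recursion, the simulated reach errors are temporally correlated across trials, which is exactly the signature the authors use to separate state noise from performance noise.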
A Database Approach for Predicting and Monitoring Baked Anode Properties
NASA Astrophysics Data System (ADS)
Lauzon-Gauthier, Julien; Duchesne, Carl; Tessier, Jayson
2012-11-01
The baked anode quality control strategy currently used by most carbon plants, based on testing anode core samples in the laboratory, is inadequate for coping with increased raw material variability. The low core sampling rate limited by lab capacity, and the common practice of reporting averaged properties for some anode population, mask a significant amount of individual anode variability. In addition, lab results are typically available a few weeks after production, and the anodes are often already set in the reduction cells, preventing early remedial actions when necessary. A database approach is proposed in this work to develop a soft-sensor for predicting individual baked anode properties at the end of the baking cycle. A large historical database including raw material properties, process operating parameters and anode core data was collected from a modern Alcoa plant. A multivariate latent variable PLS regression method was used for analyzing the large database and building the soft-sensor model. It is shown that the general low-frequency trends in most anode physical and mechanical properties driven by raw material changes are very well captured by the model. Improvements in the data infrastructure (instrumentation, sampling frequency and location) will be necessary for predicting higher-frequency variations in individual baked anode properties. This paper also demonstrates how multivariate latent variable models can be interpreted against process knowledge and used for real-time process monitoring of carbon plants, and for detection of faults and abnormal operation.
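Latent-variable models of the kind used here support process monitoring by projecting each new observation into the latent space and checking a Hotelling T^2 statistic. A small sketch of that monitoring idea, with synthetic data and a PCA projection standing in for the plant's PLS model (the fault, the number of components, and all values are invented):

```python
import numpy as np

# Sketch of latent-variable process monitoring: fit PCA on "normal
# operation" data, then flag observations whose Hotelling T^2 in the
# latent space is unusually large. All data and choices are synthetic.
rng = np.random.default_rng(2)
normal = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated vars
mu, sd = normal.mean(axis=0), normal.std(axis=0)
Z = (normal - mu) / sd

U, S, Vt = np.linalg.svd(Z, full_matrices=False)  # PCA via SVD
P = Vt[:2].T                        # loadings; keep 2 latent variables
lam = (S[:2] ** 2) / (len(Z) - 1)   # latent-variable variances

def t2(x):
    """Hotelling T^2 of one observation in the 2-D latent space."""
    scores = ((x - mu) / sd) @ P
    return float(np.sum(scores ** 2 / lam))

in_control = normal[0]
# simulate a fault: push the sample far out along the first loading direction
fault = mu + sd * (((in_control - mu) / sd) + 12.0 * P[:, 0])
print(t2(fault) > t2(in_control))
```

Industrial implementations also track the squared prediction error (residual outside the latent space) and use control limits derived from the reference data; this sketch shows only the T^2 part.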
van Strien, Maarten J; Slager, Cornelis T J; de Vries, Bauke; Grêt-Regamey, Adrienne
2016-06-01
Many studies have assessed the effect of landscape patterns on spatial ecological processes by simulating these processes in computer-generated landscapes with varying composition and configuration. To generate such landscapes, various neutral landscape models have been developed. However, the limited set of landscape-level pattern variables included in these models is often inadequate to generate landscapes that reflect real landscapes. In order to achieve more flexibility and variability in the generated landscape patterns, a more complete set of class- and patch-level pattern variables should be implemented in these models. These enhancements have been implemented in Landscape Generator (LG), a software tool that uses optimization algorithms to generate landscapes that match user-defined target values. Although LG was developed for participatory spatial planning at small scales, we enhanced its usability and demonstrated how it can be used for larger-scale ecological studies. First, we used LG to recreate landscape patterns from a real landscape (i.e., a mountainous region in Switzerland). Second, we generated landscape series with incrementally changing pattern variables, which could be used in ecological simulation studies. We found that LG was able to recreate landscape patterns that approximate those of real landscapes. Furthermore, we successfully generated landscape series that would not have been possible with traditional neutral landscape models. LG is a promising novel approach for generating neutral landscapes and enables testing of new hypotheses regarding the influence of landscape patterns on ecological processes. LG is freely available online.
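The core idea of such an optimization-based generator, iteratively modifying a landscape until its pattern variables match user-defined targets, can be shown in a toy form. This sketch optimizes only a single landscape-level variable (habitat proportion) on a binary grid; LG itself handles many class- and patch-level metrics simultaneously, which this does not attempt:

```python
import numpy as np

# Toy sketch of optimization-based landscape generation: start from a
# random binary grid and greedily flip cells so a pattern variable (here
# the proportion of "habitat" cells) approaches a user-defined target.
rng = np.random.default_rng(4)
grid = rng.integers(0, 2, size=(20, 20))
target = 0.30                        # target habitat proportion

def error(g):
    return abs(g.mean() - target)

for _ in range(2000):
    i, j = rng.integers(0, 20, size=2)
    trial = grid.copy()
    trial[i, j] = 1 - trial[i, j]
    if error(trial) <= error(grid):  # accept flips that don't worsen the fit
        grid = trial

print(round(float(grid.mean()), 2))
```

Multi-metric generators replace the scalar error with a weighted sum over several pattern variables and usually use simulated annealing or genetic algorithms rather than pure greedy acceptance.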
Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area
NASA Astrophysics Data System (ADS)
Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.
2015-12-01
Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we try to classify the hyporheic zone based on its geology, geochemistry, and microbiology, and to understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.
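The exploratory analyses named above (PCA, clustering) reduce many correlated sediment variables to a few axes along which facies can be separated. A toy sketch of that step, using invented grain-size fractions for two synthetic facies rather than the Hanford core data:

```python
import numpy as np

# Sketch: PCA on synthetic "grain-size fraction" data, then a simple
# two-group split on the first principal component as a stand-in for
# facies classification. All sample values are invented.
rng = np.random.default_rng(3)
# Two synthetic facies: gravel-rich vs. mud-rich (% gravel, sand, mud)
gravel_rich = rng.normal([70, 25, 5], 3, size=(10, 3))
mud_rich = rng.normal([10, 30, 60], 3, size=(10, 3))
X = np.vstack([gravel_rich, mud_rich])

Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                  # first principal component scores

labels = (pc1 > 0).astype(int)    # crude two-facies split on PC1
# the two synthetic facies should land on opposite sides of PC1
print(labels[:10].sum() in (0, 10), labels[10:].sum() in (0, 10))
```

Hierarchical clustering, as used in the study, generalizes this crude one-axis split to a full dendrogram over all variables.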
Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M.; Daniell, Tim J.
2012-01-01
The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils. PMID:23264770
Data driven model generation based on computational intelligence
NASA Astrophysics Data System (ADS)
Gemmar, Peter; Gronz, Oliver; Faust, Christophe; Casper, Markus
2010-05-01
The simulation of discharges at a local gauge or the modeling of large-scale river catchments are effectively involved in estimation and decision tasks of hydrological research and practical applications like flood prediction or water resource management. However, modeling such processes using analytical or conceptual approaches is made difficult by both the complexity of process relations and the heterogeneity of processes. It has been shown many times that unknown or assumed process relations can in principle be described by computational methods, and that system models can automatically be derived from observed behavior or measured process data. This study describes the development of hydrological process models using computational methods, including Fuzzy logic and artificial neural networks (ANN), in a comprehensive and automated manner. Methods: We consider a closed concept for data-driven development of hydrological models based on measured (experimental) data. The concept is centered on a Fuzzy system using rules of Takagi-Sugeno-Kang type, which formulate the input-output relation in a generic structure like R_i: IF q(t) = low AND ... THEN q(t+Δt) = a_i0 + a_i1*q(t) + a_i2*p(t-Δt_i1) + a_i3*p(t+Δt_i2) + .... The rule's premise part (IF) describes process states involving available process information, e.g. actual outlet q(t) is low, where low is one of several Fuzzy sets defined over the variable q(t). The rule's conclusion (THEN) estimates the expected outlet q(t+Δt) by a linear function over selected system variables, e.g. the actual outlet q(t) and previous and/or forecasted precipitation p(t-Δt_ik). In case of river catchment modeling we use head gauges, tributary and upriver gauges in the conclusion part as well. In addition, we consider temperature and temporal (season) information in the premise part. By creating a set of rules R = {R_i | i = 1,...,N}, the space of process states can be covered as concisely as necessary.
Model adaptation is achieved by finding an optimal set A = (a_ij) of conclusion parameters with respect to a defined rating function and experimental data. To find A, we use for example a linear equation solver and an RMSE function. In practical process models, the number of Fuzzy sets and the corresponding number of rules is fairly low. Nevertheless, creating the optimal model requires some experience. Therefore, we improved this development step by methods for automatic generation of Fuzzy sets, rules, and conclusions. Basically, the model achievement depends to a great extent on the selection of the conclusion variables. The aim is that variables having the most influence on the system reaction are considered and superfluous ones are neglected. At first, we use Kohonen maps, a specialized ANN, to identify relevant input variables from the large set of available system variables. A greedy algorithm selects a comprehensive set of dominant and uncorrelated variables. Next, the premise variables are analyzed with clustering methods (e.g. Fuzzy C-means) and Fuzzy sets are then derived from cluster centers and outlines. The rule base is automatically constructed by permutation of the Fuzzy sets of the premise variables. Finally, the conclusion parameters are calculated and the total coverage of the input space is iteratively tested with experimental data; rarely firing rules are combined, and coarse coverage of sensitive process states results in refined Fuzzy sets and rules. Results: The described methods were implemented and integrated in a development system for process models. A series of models has already been built, e.g. for rainfall-runoff modeling or for flood prediction (up to 72 hours) in river catchments. The models required significantly less development effort and showed better simulation results compared to conventional models. The models can be used operationally, and simulation takes only some minutes on a standard PC, e.g.
for a gauge forecast (up to 72 hours) for the whole Mosel (Germany) river catchment.
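The rule structure described in this abstract can be sketched directly: fuzzy sets over the current discharge q(t), linear conclusions predicting q(t+Δt), and a weighted average over rule firing strengths. The membership functions and coefficients below are invented for illustration, not taken from the study:

```python
# Minimal Takagi-Sugeno-Kang sketch matching the rule structure above:
# premises are fuzzy sets over current discharge q(t); conclusions are
# linear functions of q(t) and precipitation p(t). All values illustrative.

def tri(x, left, peak, right):
    """Triangular membership function."""
    if x <= left or x >= right:
        return 0.0
    return (x - left) / (peak - left) if x <= peak else (right - x) / (right - peak)

def tsk_predict(q, p):
    # Rule premises: q(t) is "low" / "high" (triangular fuzzy sets)
    w_low = tri(q, -50.0, 0.0, 50.0)
    w_high = tri(q, 0.0, 100.0, 200.0)
    # Rule conclusions: q(t+dt) = a0 + a1*q + a2*p  (illustrative coefficients)
    y_low = 1.0 + 0.9 * q + 0.5 * p
    y_high = 5.0 + 0.95 * q + 1.2 * p
    total = w_low + w_high
    return (w_low * y_low + w_high * y_high) / total  # weighted TSK average

print(round(tsk_predict(25.0, 10.0), 2))  # blend of the two rule outputs
```

In the study's concept, the conclusion coefficients are exactly the parameter set A fitted by least squares against gauge data, and the fuzzy sets themselves are derived automatically from clustering.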
Laso, Jara; Margallo, María; Fullana, Pére; Bala, Alba; Gazulla, Cristina; Irabien, Ángel; Aldaco, Rubén
2017-01-01
To be able to fulfil high market expectations for a number of practical applications, Environmental Product Declarations (EPDs) have to meet and comply with specific and strict methodological prerequisites. These expectations include the possibility to add up Life Cycle Assessment (LCA)-based information in the supply chain and to compare different EPDs. To achieve this goal, common and harmonized calculation rules have to be established, the so-called Product Category Rules (PCRs), which set the overall LCA calculation rules used to create EPDs. This document provides PCRs for the assessment of the environmental performance of canned anchovies in the Cantabria Region based on an Environmental Sustainability Assessment (ESA) method. This method uses two main variables: the natural resources sustainability (NRS) and the environmental burdens sustainability (EBS). To reduce the complexity of ESA and facilitate the decision-making process, all variables are normalized and weighted to obtain two global dimensionless indexes: resource consumption (X1) and environmental burdens (X2).
•This paper sets the PCRs adapted to Cantabrian canned anchovies.
•The ESA method facilitates product comparison and the decision-making process.
•This paper establishes all the steps that an EPD should include within the PCRs of Cantabrian canned anchovies.
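The normalize-and-weight step behind such dimensionless indexes can be sketched generically. The variables, bounds, and weights below are invented placeholders, not the NRS/EBS definitions of the actual ESA method:

```python
# Sketch: min-max normalization and weighting of several variables into a
# single dimensionless index, the generic idea behind indexes like X1/X2.
# All values, bounds, and weights are hypothetical.

def dimensionless_index(values, mins, maxs, weights):
    total = 0.0
    for v, lo, hi, w in zip(values, mins, maxs, weights):
        total += w * (v - lo) / (hi - lo)   # scale each variable to [0, 1]
    return total / sum(weights)

# e.g. energy, water, materials consumption for one product system
x1 = dimensionless_index([40.0, 2.0, 5.0], [0, 0, 0], [100.0, 10.0, 20.0],
                         [0.5, 0.25, 0.25])
print(round(x1, 3))
```

Because every variable is scaled to the same dimensionless range before weighting, products can be compared on a single score, which is what makes the index useful in the decision-making process the abstract describes.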
Spahr, N.E.; Boulger, R.W.
1997-01-01
Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that the variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude, and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
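The "lowest rounding unit" comparison can be made concrete with a small helper. This is an illustrative sketch only; the function names and the convention for integers below are assumptions, not the report's actual procedure:

```python
def lowest_rounding_unit(value_str):
    """Magnitude of the least significant reported figure, read from the
    reported string: '0.12' -> 0.01, '3.5' -> 0.1. Values reported without
    a decimal point are treated as precise to 1 (an illustrative choice)."""
    if '.' in value_str:
        return 10.0 ** -len(value_str.split('.')[1])
    return 1.0

def diff_in_lrus(env, qc):
    """Difference between an environmental result and its quality-control
    replicate, expressed in lowest rounding units (LRUs)."""
    lru = min(lowest_rounding_unit(env), lowest_rounding_unit(qc))
    return round((float(env) - float(qc)) / lru)

# A split-replicate pair differing by one lowest rounding unit:
print(diff_in_lrus('0.12', '0.11'))  # 1
```

Expressing the pair difference this way keeps a 0.01 mg/L disagreement at trace level and a 10 mg/L disagreement at high concentration on the same practical scale, which is the point made in the abstract.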
NASA Astrophysics Data System (ADS)
Hashemi Sanatgar, Razieh; Campagne, Christine; Nierstrasz, Vincent
2017-05-01
In this paper, 3D printing was considered as a novel process for depositing polymers on synthetic fabrics, to introduce more flexible, resource-efficient and cost-effective textile functionalization than conventional printing processes like screen and inkjet printing. The aim is to develop an integrated or tailored production process for smart and functional textiles which avoids unnecessary use of water, energy and chemicals and minimizes waste, improving the ecological footprint and productivity. The adhesion of polymer and nanocomposite layers 3D printed directly onto textile fabrics using the fused deposition modeling (FDM) technique was investigated. Different variables which may affect the adhesion properties, including 3D printing process parameters, fabric type and the filler type incorporated in the polymer, were considered. A rectangular shape according to the peeling standard was designed as a 3D computer-aided design (CAD) model to determine the effect of the different variables. The polymers were printed in different series of experimental designs: nylon on polyamide 66 (PA66) fabrics, polylactic acid (PLA) on PA66 fabric, PLA on PLA fabric, and finally nanosize carbon black/PLA (CB/PLA) and multi-wall carbon nanotube/PLA (CNT/PLA) nanocomposites on PLA fabrics. The adhesion forces were quantified using an innovative sample preparation method combined with the standard peeling method. Results showed that variables of the 3D printing process such as extruder temperature, platform temperature and printing speed can have a significant effect on the adhesion force of polymers to fabrics during direct 3D printing. A model was proposed specifically for the deposition of a commercial 3D-printer Nylon filament on PA66 fabrics. Among the printed polymers, PLA and its composites had a high adhesion force to PLA fabrics.
Spindler, A
2014-06-15
Although data reconciliation is intensively applied in process engineering, almost none of its powerful methods are employed for the validation of operational data from wastewater treatment plants. This is partly due to some prerequisites that are difficult to meet, including steady state, known variances of process variables, and the absence of gross errors. However, an algorithm can be derived from the classical approaches to data reconciliation that makes it possible to find a comprehensive set of equations describing redundancy in the data once measured and unmeasured variables (flows and concentrations) are defined. This is a precondition for methods of data validation based on individual mass balances, such as CUSUM charts. The procedure can also be applied to verify whether existing or additional measurements are necessary to improve the data's redundancy. Results are given for a large wastewater treatment plant. The introduction aims at establishing a link between methods known from data reconciliation in process engineering and their application in wastewater treatment. Copyright © 2014 Elsevier Ltd. All rights reserved.
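The classical reconciliation step the abstract builds on can be sketched in a few lines: measurements are adjusted, weighted by their variances, so that the linear mass balances hold exactly. The plant layout and numbers below are invented for illustration:

```python
import numpy as np

def reconcile(x, V, A):
    """Classical linear data reconciliation: adjust measurements x with
    covariance V so that the mass balances A @ x_hat = 0 hold exactly.
    x_hat = x - V A^T (A V A^T)^-1 A x (weighted least-squares projection)."""
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x)
    return x - correction

# Illustrative plant: an influent splits into two streams (x0 = x1 + x2).
A = np.array([[1.0, -1.0, -1.0]])    # balance equation: x0 - x1 - x2 = 0
x = np.array([100.0, 60.0, 45.0])    # raw measurements (imbalance of 5)
V = np.diag([4.0, 1.0, 1.0])         # measurement variances (x0 least trusted)
x_hat = reconcile(x, V, A)
print(x_hat, A @ x_hat)  # balance residual is ~0 after reconciliation
```

Note how the least-trusted measurement (largest variance) absorbs most of the adjustment; gross-error detection methods such as the CUSUM charts mentioned above then monitor the individual balance residuals over time.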
Continuity-based model interfacing for plant-wide simulation: a general approach.
Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A
2006-08-01
In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking conservation principles into account. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues that arise when coupling models in which pH is considered a state variable are pointed out.
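The continuity idea behind such interfaces can be illustrated with composition matrices: each model's state variables carry known amounts of conserved quantities (e.g. COD, nitrogen), and the interface maps source states to target states so that every conserved quantity carries over. This is a deliberately tiny sketch with invented compositions, not the actual CBIM matrices:

```python
import numpy as np

# Composition matrices: rows = conserved quantities (COD, N),
# columns = state variables of each submodel (values are illustrative).
C_source = np.array([[1.00, 0.0],    # COD content of [substrate, ammonium]
                     [0.05, 1.0]])   # N content
C_target = np.array([[1.00, 0.0],
                     [0.07, 1.0]])   # target model's substrate is richer in N

def interface(x_source):
    """Map source states to target states so that every conserved quantity
    carries over: C_target @ x_target = C_source @ x_source."""
    conserved = C_source @ x_source
    return np.linalg.solve(C_target, conserved)

x_src = np.array([100.0, 20.0])      # g COD/m3 substrate, g N/m3 ammonium
x_tgt = interface(x_src)
print(C_target @ x_tgt)  # equals C_source @ x_src: continuity holds
```

In practice the target model usually has more state variables than conserved quantities, so CBIM supplements the continuity equations with additional mapping rules; the square case above just shows the conservation constraint itself.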
Postnatal brain development: Structural imaging of dynamic neurodevelopmental processes
Jernigan, Terry L.; Baaré, William F. C.; Stiles, Joan; Madsen, Kathrine Skak
2013-01-01
After birth, there is striking biological and functional development of the brain’s fiber tracts as well as remodeling of cortical and subcortical structures. Behavioral development in children involves a complex and dynamic set of genetically guided processes by which neural structures interact constantly with the environment. This is a protracted process, beginning in the third week of gestation and continuing into early adulthood. Reviewed here are studies using structural imaging techniques, with a special focus on diffusion weighted imaging, describing age-related brain maturational changes in children and adolescents, as well as studies that link these changes to behavioral differences. Finally, we discuss evidence for effects on the brain of several factors that may play a role in mediating these brain–behavior associations in children, including genetic variation, behavioral interventions, and hormonal variation associated with puberty. At present longitudinal studies are few, and we do not yet know how variability in individual trajectories of biological development in specific neural systems map onto similar variability in behavioral trajectories. PMID:21489384
Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ampomah, William; Balch, Robert; Will, Robert
This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a baseline scenario that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU.
The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
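The sampling-plus-scoring loop described above can be sketched compactly. Everything below is a stand-in: the "reservoir" is an arbitrary smooth function, and the bounds, weights, and Latin hypercube implementation are illustrative, not the study's actual simulator or settings:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Latin hypercube sample in [0, 1)^d: one point per stratum per dimension."""
    return np.column_stack([(rng.permutation(n) + rng.random(n)) / n
                            for _ in range(d)])

def toy_reservoir(bhp, wag_cycle, inj_rate):
    """Stand-in for the compositional reservoir simulator: returns an
    (oil recovery, CO2 stored) pair from arbitrary smooth functions."""
    oil = 0.20 + 0.10 * np.tanh((bhp - 3000.0) / 500.0) + 0.02 * wag_cycle
    co2 = 0.60 + 0.05 * np.log1p(inj_rate) - 0.01 * wag_cycle
    return oil, co2

def objective(oil, co2, w_oil=0.5, w_co2=0.5):
    """Weighted co-optimization objective (weights are illustrative)."""
    return w_oil * oil + w_co2 * co2

rng = np.random.default_rng(7)
lo = np.array([2500.0, 1.0, 10.0])    # bounds: BHP (psi), WAG cycle, group rate
hi = np.array([4000.0, 6.0, 100.0])
X = lo + latin_hypercube(64, 3, rng) * (hi - lo)
scores = np.array([objective(*toy_reservoir(*row)) for row in X])
best = X[scores.argmax()]
print(best)  # sampled control setting with the highest weighted score
```

In the actual workflow each sampled point costs a full reservoir simulation, which is why a neural-network proxy is trained on these samples before optimization.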
Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty
Ampomah, William; Balch, Robert; Will, Robert; ...
2017-07-01
This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a baseline scenario that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU.
The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
Method for using polarization gating to measure a scattering sample
Baba, Justin S.
2015-08-04
Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.
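The degree-of-polarization measurement at the heart of such a gating scheme is simple to state. The sketch below uses the common two-measurement definition (analyzer parallel vs. crossed with the input polarizer); it is a generic illustration, not the patented apparatus:

```python
def degree_of_polarization(i_parallel, i_crossed):
    """Degree of polarization from intensities measured with the output
    polarizer parallel and crossed relative to the input polarization.
    Multiple scattering depolarizes the beam, pushing the value toward 0."""
    return (i_parallel - i_crossed) / (i_parallel + i_crossed)

print(degree_of_polarization(1.0, 0.0))   # 1.0: fully polarized, no scattering
print(degree_of_polarization(0.6, 0.4))   # ~0.2: strongly depolarizing sample
```

Because stronger scattering yields lower values, this single number can be calibrated against known samples to infer scattering properties, which is the principle the abstract describes.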
Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R
2018-01-01
Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances.
We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
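The filtering and artifact-rejection stages of such a pipeline can be illustrated in miniature. This is a toy sketch with invented thresholds and signals; HAPPE itself uses proper digital filters, wavelet/ICA-based artifact correction, and channel-level quality metrics:

```python
import numpy as np

def bandpass_fft(signal, fs, lo=1.0, hi=40.0):
    """Crude band-pass: zero out FFT bins outside [lo, hi] Hz."""
    freqs = np.fft.rfftfreq(signal.shape[-1], d=1.0 / fs)
    spec = np.fft.rfft(signal, axis=-1)
    spec[..., (freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=signal.shape[-1], axis=-1)

def reject_artifacts(epochs, ptp_threshold=150.0):
    """Keep only epochs whose peak-to-peak amplitude stays below a
    threshold (a simple stand-in for automated artifact rejection)."""
    ptp = epochs.max(axis=(1, 2)) - epochs.min(axis=(1, 2))
    return epochs[ptp <= ptp_threshold]

fs = 250
t = np.arange(2 * fs) / fs
alpha = 10.0 * np.sin(2 * np.pi * 10.0 * t)      # 10 Hz oscillation, uV scale
drift = 50.0 * np.sin(2 * np.pi * 0.2 * t)       # slow drift, removed by filter
filtered = bandpass_fft(alpha + drift, fs)

clean_epoch = np.tile(filtered, (4, 1))          # 4 channels x 500 samples
bad_epoch = clean_epoch.copy()
bad_epoch[:, 100] += 500.0                       # simulated voltage jump
epochs = np.stack([clean_epoch, bad_epoch])
kept = reject_artifacts(epochs)
print(kept.shape)  # (1, 4, 500): only the clean epoch survives
```

The data-quality reporting the abstract emphasizes amounts to logging, for each file, quantities like the fraction of epochs rejected here, so results can be compared across labs.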
Gabard-Durnam, Laurel J.; Mendez Leal, Adriana S.; Wilkinson, Carol L.; Levin, April R.
2018-01-01
Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances.
We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe. PMID:29535597
NASA Technical Reports Server (NTRS)
Niiler, Pearn P.
2004-01-01
The scientific objectives of this research program were to utilize drifter and satellite sea level data to determine the time-mean and time-variable surface currents of the global ocean. Accomplishing these tasks required processing drifter data from a wide variety of drifter configurations into a uniform format, and processing the along-track satellite altimeter data to compute the geostrophic current components normal to the track. These tasks were accomplished, resulting in an increase of drifter data by about 40% and in the development of new algorithms for obtaining satellite-derived geostrophic velocity data consistent with the drifter observations of geostrophic time-variable currents. The methodologies and the research results obtained with them are reported in the publications listed in this paper.
Continuous-variable quantum key distribution in uniform fast-fading channels
NASA Astrophysics Data System (ADS)
Papanastasiou, Panagiotis; Weedbrook, Christian; Pirandola, Stefano
2018-03-01
We investigate the performance of several continuous-variable quantum key distribution protocols in the presence of uniform fading channels. These are lossy channels whose transmissivity changes according to a uniform probability distribution. We assume the worst-case scenario where an eavesdropper induces a fast-fading process, where she chooses the instantaneous transmissivity while the remote parties may only detect the mean statistical effect. We analyze coherent-state protocols in various configurations, including the one-way switching protocol in reverse reconciliation, the measurement-device-independent protocol in the symmetric configuration, and its extension to a three-party network. We show that, regardless of the advantage given to the eavesdropper (control of the fading), these protocols can still achieve high rates under realistic attacks, within reasonable values for the variance of the probability distribution associated with the fading process.
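The "mean statistical effect" the remote parties detect can be illustrated numerically. The sketch below assumes one common fast-fading model in which the effective transmissivity is (E[sqrt(tau)])^2 and the spread of sqrt(tau) contributes fading-induced excess noise; the uniform range and this simplified model are assumptions for illustration, not the paper's full security analysis:

```python
import numpy as np

def fading_channel_stats(tau_min, tau_max, n=100_000):
    """Effective transmissivity and sqrt-transmissivity variance for a
    channel whose transmissivity tau is uniform on [tau_min, tau_max].
    Under fast fading the parties see (E[sqrt(tau)])^2 as transmissivity,
    while Var(sqrt(tau)) shows up as extra noise (simplified model)."""
    tau = np.linspace(tau_min, tau_max, n)   # grid approximation of the
    s = np.sqrt(tau)                         # uniform distribution
    return s.mean() ** 2, s.var()

eff_tau, fading_var = fading_channel_stats(0.4, 0.6)
print(round(eff_tau, 4), fading_var < 0.01)  # slightly below mean tau = 0.5
```

By Jensen's inequality the effective transmissivity is always below the mean transmissivity, which captures why fading, even at fixed average loss, degrades the key rate.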
Applications of the generalized information processing system (GIPSY)
Moody, D.W.; Kays, Olaf
1972-01-01
The Generalized Information Processing System (GIPSY) stores and retrieves variable-field, variable-length records consisting of numeric data, textual data, or codes. A particularly noteworthy feature of GIPSY is its ability to search records for words, word stems, prefixes, and suffixes as well as for numeric values. Moreover, retrieved records may be printed in pre-defined formats or formatted as fixed-field, fixed-length records for direct input to other programs, which facilitates the exchange of data with other systems. At present there are some 22 applications of GIPSY in the general areas of bibliography, natural resources information, and management science. This report presents a description of each application, including a sample input form, dictionary, and a typical formatted record. It is hoped that these examples will stimulate others to experiment with innovative uses of computer technology.
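The word, prefix, and suffix searches described above are easy to picture with a small sketch. This is an illustration in the spirit of GIPSY's matching, not its actual query syntax:

```python
def matches(record, term, mode='word'):
    """Search a variable-field record (a dict of field -> value) for a term,
    as a whole word, a prefix, or a suffix (case-insensitive)."""
    words = [w.strip('.,;').lower()
             for field in record.values()
             for w in str(field).split()]
    term = term.lower()
    if mode == 'word':
        return term in words
    if mode == 'prefix':
        return any(w.startswith(term) for w in words)
    if mode == 'suffix':
        return any(w.endswith(term) for w in words)
    raise ValueError(f'unknown mode: {mode}')

record = {'title': 'Groundwater resources of the Colorado basin', 'year': 1972}
print(matches(record, 'ground', 'prefix'),   # True: matches 'Groundwater'
      matches(record, 'water', 'suffix'))    # True: matches 'Groundwater'
```

Because records are variable-field, the search simply walks whatever fields a given record happens to contain, which is what made the system reusable across bibliographies, resource inventories, and management data.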
Early markers of adult obesity: a review
Brisbois, T D; Farmer, A P; McCargar, L J
2012-01-01
Summary The purpose of this review was to evaluate factors in early childhood (≤5 years of age) that are the most significant predictors of the development of obesity in adulthood. Factors of interest included exposures/insults in the prenatal period, infancy and early childhood, as well as other socio-demographic variables such as socioeconomic status (SES) or birth place that could impact all three time periods. An extensive electronic and systematic search initially resulted in 8,880 citations, after duplicates were removed. Specific inclusion and exclusion criteria were set, and following two screening processes, 135 studies were retained for detailed abstraction and analysis. A total of 42 variables were associated with obesity in adulthood; however, of these, only seven variables may be considered as potential early markers of obesity based on the reported associations. Possible early markers of obesity included maternal smoking and maternal weight gain during pregnancy. Probable early markers of obesity included maternal body mass index, childhood growth patterns (early rapid growth and early adiposity rebound), childhood obesity and father's employment (a proxy measure for SES in many studies). Health promotion programmes/agencies should consider these factors as reasonable targets to reduce the risk of adult obesity. PMID:22171945
The IRMIS object model and services API.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saunders, C.; Dohan, D. A.; Arnold, N. D.
2005-01-01
The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object Relational Mapping (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object-oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort to define and develop a relational database and associated applications that comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool.
We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.
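The crawling step mentioned above, harvesting process variable names from EPICS database text, can be sketched with a regular expression. This is a hypothetical illustration (the real IRMIS crawler also expands macros and follows the dbLoadRecords calls in each IOC start-up file):

```python
import re

# EPICS database records look like: record(<type>, "<PV name>") { ... }
RECORD_RE = re.compile(r'record\s*\(\s*(\w+)\s*,\s*"([^"]+)"\s*\)')

def harvest_pvs(db_text):
    """Extract (record type, process variable name) pairs from EPICS
    database text, ready to be inserted into relational tables."""
    return RECORD_RE.findall(db_text)

db = '''
record(ai, "S01:temperature") { field(DESC, "Sector 1 temp") }
record(calc, "S01:alarmSum")  { field(SCAN, "1 second") }
'''
print(harvest_pvs(db))  # [('ai', 'S01:temperature'), ('calc', 'S01:alarmSum')]
```

Triggering such a parse on every detected IOC reboot, as the abstract describes, keeps the documentation database synchronized with what is actually loaded on the machine.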
Spatial heterogeneities and variability of karst hydro-system : insights from geophysics
NASA Astrophysics Data System (ADS)
Champollion, C.; Fores, B.; Lesparre, N.; Frederic, N.
2017-12-01
Heterogeneous systems such as karst or fractured hydro-systems are challenging for both scientists and groundwater resource managers. Karst heterogeneities prevent the comparison, and moreover the combination, of data representative of different scales: borehole water levels, for example, generally cannot be used directly to interpret spring flow dynamics. The spatial heterogeneity also has an impact on the temporal variability of groundwater transfer and storage. Karst hydro-systems have a characteristic non-linear relation between precipitation amount and discharge at the outlets, with threshold effects and a large variability of groundwater transit times. In the presentation, geophysical field experiments conducted in a karst hydro-system in the south of France are used to investigate groundwater transfer and storage variability at a scale of a few hundred meters. We focus on the added value of both time-lapse gravity experiments and 2D ERT imaging of subsurface heterogeneities. Taken alone, both gravity and ERT results can only be interpreted with large ambiguity or strong a priori assumptions: the relation between resistivity and water content is not unique, and almost no information about the processes can be inferred from the groundwater stock variations. The present study demonstrates how the ERT and gravity field experiments can be interpreted together in a coherent scheme with less ambiguity. First, the geological and hydro-meteorological context is presented. Then the ERT field experiment, including the processing and the results, is detailed in the section about geophysical imaging of the heterogeneities. The gravity double-difference (S2D) time-lapse experiment is described in the section about geophysical monitoring of the temporal variability. The final discussion demonstrates the impact of both experiments on the interpretation in terms of processes and heterogeneities.
NASA Astrophysics Data System (ADS)
Blums, Angela
The present study examines instructional approaches and cognitive factors involved in elementary school children's thinking and learning of the Control of Variables Strategy (CVS), a critical aspect of scientific reasoning. Previous research has identified several features of effective CVS instruction, including a guided learning approach, the use of self-reflective questions, and learning in individual and group contexts. The current study examined the roles of procedural and conceptual instruction in learning CVS and investigated the role of executive function in the learning process. Additionally, this study examined how learning to identify variables is part of the CVS process. In two studies (individual and classroom experiments), 139 third-, fourth-, and fifth-grade students participated in hands-on and paper-and-pencil CVS learning activities and, in each study, were assigned to either a procedural-instruction, conceptual-instruction, or control (no instruction) group. Participants also completed a series of executive function tasks. The study was carried out in two parts: Study 1 used an individual context and Study 2 was carried out in a group setting. Results indicated that procedural and conceptual instruction were more effective than no instruction, and the ability to identify variables was identified as a key component of the CVS process. Executive function predicted the ability to identify variables and success on CVS tasks. Developmental differences were present, in that older children outperformed younger children on CVS tasks, and conceptual instruction was slightly more effective for older children. Some differences between individual and group instruction were found, with those in the individual context showing some advantage over those in the group setting in learning CVS concepts. Conceptual implications for scientific thinking and practical implications for science education are discussed.
Chemochromic detector for sensing gas leakage and process for producing the same
NASA Technical Reports Server (NTRS)
Roberson, Luke B. (Inventor); Williams, Martha K. (Inventor); Captain, Janine E. (Inventor); Smith, Trent M. (Inventor); Tate, LaNetra Clayton (Inventor)
2012-01-01
A chemochromic sensor for detecting a combustible gas, such as hydrogen, includes a chemochromic pigment mechanically mixed with a polymer and formed into a rigid or pliable material. In a preferred embodiment, the chemochromic detector includes aerogel material. The detector is robust and easily modifiable for a variety of applications and environmental conditions, such as atmospheres of inert gas, hydrogen gas, or mixtures of gases, or in environments that have variable temperature, including high temperatures such as above 100 °C and low temperatures such as below −196 °C.
Fatichi, Simone; Vivoni, Enrique R.; Ogden, Fred L; Ivanov, Valeriy Y; Mirus, Benjamin B.; Gochis, David; Downer, Charles W; Camporese, Matteo; Davison, Jason H; Ebel, Brian A.; Jones, Norm; Kim, Jongho; Mascaro, Giuseppe; Niswonger, Richard G.; Restrepo, Pedro; Rigon, Riccardo; Shen, Chaopeng; Sulis, Mauro; Tarboton, David
2016-01-01
Process-based hydrological models have a long history dating back to the 1960s. These tools have been criticized by some as over-parameterized, overly complex, and difficult to use; a more nuanced view is that they are necessary in many situations and, for a certain class of problems, are the most appropriate type of hydrological model. This is especially the case where knowledge of flow paths or distributed state variables, and/or the preservation of physical constraints, is important. Examples of this include: spatiotemporal variability of soil moisture, groundwater flow and runoff generation, sediment and contaminant transport, or cases where feedbacks among various Earth system processes or understanding the impacts of climate non-stationarity are of primary concern. These are situations where process-based models excel and other models are unverifiable. This article presents this pragmatic view in the context of the existing literature to justify the approach where applicable and necessary. We review how improvements in data availability, computational resources, and algorithms have made detailed hydrological simulations a reality. Avenues for the future of process-based hydrological models are presented, suggesting their use as virtual laboratories, for design purposes, and with a powerful treatment of uncertainty.
High taxonomic variability despite stable functional structure across microbial communities.
Louca, Stilianos; Jacques, Saulo M S; Pires, Aliny P F; Leal, Juliana S; Srivastava, Diane S; Parfrey, Laura Wegener; Farjalla, Vinicius F; Doebeli, Michael
2016-12-05
Understanding the processes that are driving variation of natural microbial communities across space or time is a major challenge for ecologists. Environmental conditions strongly shape the metabolic function of microbial communities; however, other processes such as biotic interactions, random demographic drift or dispersal limitation may also influence community dynamics. The relative importance of these processes and their effects on community function remain largely unknown. To address this uncertainty, here we examined bacterial and archaeal communities in replicate 'miniature' aquatic ecosystems contained within the foliage of wild bromeliads. We used marker gene sequencing to infer the taxonomic composition within nine metabolic functional groups, and shotgun environmental DNA sequencing to estimate the relative abundances of these groups. We found that all of the bromeliads exhibited remarkably similar functional community structures, but that the taxonomic composition within individual functional groups was highly variable. Furthermore, using statistical analyses, we found that non-neutral processes, including environmental filtering and potentially biotic interactions, at least partly shaped the composition within functional groups and were more important than spatial dispersal limitation and demographic drift. Hence both the functional structure and taxonomic composition within functional groups of natural microbial communities may be shaped by non-neutral and roughly separate processes.
Alonso-Torres, Beatriz; Hernández-Pérez, José Alfredo; Sierra-Espinoza, Fernando; Schenker, Stefan; Yeretzian, Chahan
2013-01-01
Heat and mass transfer in individual coffee beans during roasting were simulated using computational fluid dynamics (CFD). Numerical equations for heat and mass transfer inside the coffee bean were solved using the finite volume technique in the commercial CFD code Fluent; the software was complemented with specific user-defined functions (UDFs). To experimentally validate the numerical model, a single coffee bean was placed in a cylindrical glass tube and roasted by a hot air flow, using the identical geometrical 3D configuration and hot air flow conditions as the ones used for numerical simulations. Temperature and humidity calculations obtained with the model were compared with experimental data. The model predicts the actual process quite accurately and represents a useful approach to monitor the coffee roasting process in real time. It provides valuable information on time-resolved process variables that are otherwise difficult to obtain experimentally, but critical to a better understanding of the coffee roasting process at the individual bean level. This includes variables such as time-resolved 3D profiles of bean temperature and moisture content, and temperature profiles of the roasting air in the vicinity of the coffee bean.
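The finite-volume idea at the core of such a simulation can be illustrated with a deliberately simplified sketch. Everything below is hypothetical: the actual model is three-dimensional, couples heat and moisture transport, and is solved in Fluent with user-defined functions. This sketch reduces it to explicit 1D heat conduction through a "bean" slab with invented geometry, diffusivity, and boundary treatment.

```python
def step_heat_1d(T, alpha, dx, dt, T_air, h_coeff):
    """One explicit finite-volume step over a 1D slab; cell 0 is the
    bean core (symmetry boundary), the last cell is the surface exposed
    to hot roasting air (crude convective relaxation)."""
    n = len(T)
    T_new = T[:]
    for i in range(1, n - 1):
        # Explicit central-difference update of interior cells.
        T_new[i] = T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
    T_new[0] = T_new[1]                                   # insulated core
    T_new[-1] = T[-1] + dt * h_coeff * (T_air - T[-1])    # hot-air surface
    return T_new

# Roast a 10-cell slab, initially at 25 C, in 200 C air for 25 s.
T = [25.0] * 10
for _ in range(500):
    T = step_heat_1d(T, alpha=1e-7, dx=1e-3, dt=0.05, T_air=200.0, h_coeff=0.5)
```

The diffusion number alpha*dt/dx² = 0.005 keeps the explicit update stable; a production solver would instead use the implicit finite-volume machinery of the CFD code.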
Allnutt, Thomas F.; McClanahan, Timothy R.; Andréfouët, Serge; Baker, Merrill; Lagabrielle, Erwann; McClennen, Caleb; Rakotomanjaka, Andry J. M.; Tianarisoa, Tantely F.; Watson, Reg; Kremen, Claire
2012-01-01
The Government of Madagascar plans to increase marine protected area coverage by over one million hectares. To assist this process, we compare four methods for marine spatial planning of Madagascar's west coast. Input data for each method were drawn from the same variables: fishing pressure, exposure to climate change, and biodiversity (habitats, species distributions, biological richness, and biodiversity value). The first method compares visual color classifications of primary variables, the second uses binary combinations of these variables to produce a categorical classification of management actions, the third is a target-based optimization using Marxan, and the fourth is conservation ranking with Zonation. We present results from each method, and compare the latter three approaches for spatial coverage, biodiversity representation, fishing cost and persistence probability. All results included large areas in the north, central, and southern parts of western Madagascar. Achieving 30% representation targets with Marxan required twice the fish catch loss of the categorical method. The categorical classification and Zonation do not consider targets for conservation features. However, when we reduced Marxan targets to 16.3%, matching the representation level of the “strict protection” class of the categorical result, the methods show similar catch losses. The management category portfolio has complete coverage, and presents several management recommendations including strict protection. Zonation produces rapid conservation rankings across large, diverse datasets. Marxan is useful for identifying strict protected areas that meet representation targets and minimize exposure probabilities for conservation features at low economic cost.
We show that methods based on Zonation and a simple combination of variables can produce results comparable to Marxan for species representation and catch losses, demonstrating the value of comparing alternative approaches during initial stages of the planning process. Choosing an appropriate approach ultimately depends on scientific and political factors including representation targets, likelihood of adoption, and persistence goals. PMID:22359534
Yeager, Lauren A; Marchand, Philippe; Gill, David A; Baum, Julia K; McPherson, Jana M
2017-07-01
Biophysical conditions, including climate, environmental stress, and habitat availability, are key drivers of many ecological processes (e.g., community assembly and productivity) and associated ecosystem services (e.g., carbon sequestration and fishery production). Furthermore, anthropogenic impacts such as coastal development and fishing can have drastic effects on the structure and function of marine ecosystems. Scientists need to account for environmental variation and human impacts to accurately model, manage, and conserve marine ecosystems. Although there are many types of environmental data available from global remote sensing and open-source data products, some are inaccessible to potential end-users because they exist as global layers in high temporal and spatial resolutions which require considerable computational power to process. Additionally, coastal locations often suffer from missing data or data quality issues which limit the utility of some global marine products for coastal sites. Herein we present the Marine Socio-Environmental Covariates dataset for the global oceans, which consists of environmental and anthropogenic variables summarized in ecologically relevant ways. The dataset includes four sets of environmental variables related to biophysical conditions (net primary productivity models corrected for shallow-water reflectance, wave energy including sheltered-coastline corrections) and landscape context (coral reef and land cover within varying radii). We also present two sets of anthropogenic variables, human population density (within varying radii) and distance to large population center, which can serve as indicators of local human impacts. We have paired global, summarized layers available for download with an online data querying platform that allows users to extract data for specific point locations with finer control of summary statistics. 
In creating these global layers and online platform, we hope to make the data accessible to a wide array of end-users with the goal of advancing marine ecosystem studies. © 2017 by the Ecological Society of America.
Murphy, David J; Rubinson, Lewis; Blum, James; Isakov, Alexander; Bhagwanjee, Satish; Cairns, Charles B; Cobb, J Perren; Sevransky, Jonathan E
2015-11-01
In developed countries, public health systems have become adept at rapidly identifying the etiology and impact of public health emergencies. However, within the time course of clinical responses, shortfalls in readily analyzable patient-level data limit capabilities to understand clinical course, predict outcomes, ensure resource availability, and evaluate the effectiveness of diagnostic and therapeutic strategies for seriously ill and injured patients. To be useful in the timeline of a public health emergency, multi-institutional clinical investigation systems must be in place to rapidly collect, analyze, and disseminate detailed clinical information regarding patients across prehospital, emergency department, and acute care hospital settings, including ICUs. As an initial step to near real-time clinical learning during public health emergencies, we sought to develop an "all-hazards" core dataset to characterize serious illness and injuries and the resource requirements for acute medical response across the care continuum. The dataset was developed by a multidisciplinary panel of clinicians, public health professionals, and researchers with expertise in public health emergencies, using a group consensus process. The consensus process included regularly scheduled conference calls, electronic communications, and an in-person meeting to generate candidate variables. Candidate variables were then reviewed by the group against the competing criteria of utility and feasibility, resulting in the core dataset. The 40-member panel generated 215 candidate variables for potential dataset inclusion. The final dataset includes 140 patient-level variables in the domains of demographics and anthropometrics (7), prehospital (11), emergency department (13), diagnosis (8), severity of illness (54), medications and interventions (38), and outcomes (9). 
The resulting all-hazard core dataset for seriously ill and injured persons provides a foundation to facilitate rapid collection, analyses, and dissemination of information necessary for clinicians, public health officials, and policymakers to optimize public health emergency response. Further work is needed to validate the effectiveness of the dataset in a variety of emergency settings.
NASA Astrophysics Data System (ADS)
Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe
2016-08-01
Blast furnace operators expect to get sinter with homogeneous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, both to save money and to recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process that will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. Statistical tools were applied systematically to historical data, including linear and partial correlations and fuzzy clustering based on the Sugeno Fuzzy Inference System, to establish relationships among the available variables.
The growth receptors and their role in wound healing.
Rolfe, Kerstin J; Grobbelaar, Adriaan O
2010-11-01
Abnormal wound healing is a major problem in healthcare today, with both scarring and chronic wounds affecting large numbers of individuals worldwide. Wound healing is a complex process involving several variables, including growth factors and their receptors. Chronic wounds fail to complete the wound healing process, while scarring is considered to be an overzealous wound healing process. Growth factor receptors and their ligands are being investigated to assess their potential in the development of therapeutic strategies to improve wound healing. This review discusses potential therapeutics for manipulating growth factors and their corresponding receptors for the treatment of abnormal wound healing.
Non-Gaussian Multi-resolution Modeling of Magnetosphere-Ionosphere Coupling Processes
NASA Astrophysics Data System (ADS)
Fan, M.; Paul, D.; Lee, T. C. M.; Matsuo, T.
2016-12-01
The most dynamic coupling between the magnetosphere and ionosphere occurs in the Earth's polar atmosphere. Our objective is to model scale-dependent stochastic characteristics of high-latitude ionospheric electric fields that originate from solar wind magnetosphere-ionosphere interactions. The Earth's high-latitude ionospheric electric field exhibits considerable variability, with increasing non-Gaussian characteristics at decreasing spatio-temporal scales. Accurately representing the underlying stochastic physical process through random field modeling is crucial not only for scientific understanding of the energy, momentum and mass exchanges between the Earth's magnetosphere and ionosphere, but also for modern technological systems including telecommunication, navigation, positioning and satellite tracking. While considerable effort has been made to characterize the large-scale variability of the electric field in the context of Gaussian processes, no attempt has been made so far to model the small-scale non-Gaussian stochastic process observed in the high-latitude ionosphere. We construct a novel random field model using spherical needlets as building blocks. The double localization of spherical needlets in both spatial and frequency domains enables the model to capture the non-Gaussian and multi-resolutional characteristics of the small-scale variability. The estimation procedure is computationally feasible due to the utilization of an adaptive Gibbs sampler. We apply the proposed methodology to the computational simulation output from the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) magnetosphere model. Our non-Gaussian multi-resolution model results in characterizing significantly more energy associated with the small-scale ionospheric electric field variability in comparison to Gaussian models. 
By accurately representing unaccounted-for additional energy and momentum sources to the Earth's upper atmosphere, our novel random field modeling approach will provide a viable remedy to the current numerical models' systematic biases resulting from the underestimation of high-latitude energy and momentum sources.
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
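The pattern described above can be illustrated with a heavily simplified Python sketch. This is not the actual BMI specification or the CSDMS Standard Names: the class names, the toy physics, and the `couple_and_run` helper are all invented; only the pattern (model control functions plus self-description via standard variable names used for semantic matching) follows the talk's description.

```python
# Hypothetical sketch of a BMI-style interface: components are
# initialized, stepped, and finalized by a caller, and describe their
# variables with standard-style names so a framework can wire them up.

class LeakyBucket:
    """Toy hydrologic component exposing a BMI-like interface."""

    def initialize(self, storage=10.0):
        self.storage = storage
        self.outflow = 0.0

    def update(self):
        # Drain 10% of storage per step.
        self.outflow = 0.1 * self.storage
        self.storage -= self.outflow

    def finalize(self):
        pass

    # Description function: an invented CSDMS-like "object__quantity" name.
    def get_output_var_names(self):
        return ["bucket_water__outflow_volume_flux"]

    def get_value(self, name):
        assert name == "bucket_water__outflow_volume_flux"
        return self.outflow


class Channel:
    """Toy routing component that consumes the bucket's outflow."""

    def initialize(self):
        self.inflow = 0.0
        self.routed = 0.0

    def update(self):
        self.routed += self.inflow

    def finalize(self):
        pass

    def get_input_var_names(self):
        return ["bucket_water__outflow_volume_flux"]

    def set_value(self, name, value):
        self.inflow = value


def couple_and_run(source, sink, n_steps):
    """Minimal framework: match standard names, then co-step the models."""
    source.initialize()
    sink.initialize()
    shared = set(source.get_output_var_names()) & set(sink.get_input_var_names())
    for _ in range(n_steps):
        source.update()
        for name in shared:
            sink.set_value(name, source.get_value(name))
        sink.update()
    source.finalize()
    sink.finalize()
    return sink.routed
```

Because neither component knows about the other, the caller alone decides the coupling; this is the noninvasive, plug-and-play property the abstract emphasizes.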
Early prediction of extreme stratospheric polar vortex states based on causal precursors
NASA Astrophysics Data System (ADS)
Kretschmer, Marlene; Runge, Jakob; Coumou, Dim
2017-08-01
Variability in the stratospheric polar vortex (SPV) can influence the tropospheric circulation and thereby winter weather. Early predictions of extreme SPV states are thus important to improve forecasts of winter weather including cold spells. However, dynamical models are usually restricted in lead time because they poorly capture low-frequency processes. Empirical models often suffer from overfitting problems as the relevant physical processes and time lags are often not well understood. Here we introduce a novel empirical prediction method by uniting a response-guided community detection scheme with a causal discovery algorithm. This way, we objectively identify causal precursors of the SPV at subseasonal lead times and find them to be in good agreement with known physical drivers. A linear regression prediction model based on the causal precursors can explain most SPV variability (r2 = 0.58), and our scheme correctly predicts 58% (46%) of extremely weak SPV states for lead times of 1-15 (16-30) days with false-alarm rates of only approximately 5%. Our method can be applied to any variable relevant for (sub)seasonal weather forecasts and could thus help improving long-lead predictions.
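The final regression step described above can be sketched in a few lines. The precursor values below are synthetic stand-ins, not the paper's causal precursors; the sketch only shows how a least-squares fit and its explained variance r² would be computed.

```python
# Hedged sketch: one-predictor ordinary least squares and r^2, as would
# be applied to a (hypothetical) causal-precursor index vs. SPV strength.

def ols_fit(x, y):
    """Simple one-predictor OLS: returns intercept a and slope b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

def r_squared(x, y):
    a, b = ols_fit(x, y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - sum(y) / len(y)) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Synthetic precursor index vs. SPV strength anomaly (invented values):
precursor = [0.2, -1.1, 0.8, 1.5, -0.4, -2.0, 0.9, 0.1]
spv = [0.3, -0.9, 0.7, 1.2, -0.2, -1.6, 1.1, 0.0]
print(round(r_squared(precursor, spv), 2))  # prints 0.98
```

A multi-precursor model as in the paper would simply extend this to several regressors; the r² = 0.58 quoted above plays the same role as the value computed here.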
Huang, Xiaobi; Elliott, Michael R.; Harlow, Siobán D.
2013-01-01
As women approach menopause, the patterns of their menstrual cycle lengths change. To study these changes, we need to jointly model both the mean and variability of cycle length. Our proposed model incorporates separate mean and variance change points for each woman and a hierarchical model to link them together, along with regression components to include predictors of menopausal onset such as age at menarche and parity. Additional complexity arises from the fact that the calendar data have substantial missingness due to hormone use, surgery, and failure to report. We integrate multiple imputation and time-to-event modeling in a Bayesian estimation framework to deal with the different forms of missingness. Posterior predictive model checks are applied to evaluate the model fit. Our method successfully models patterns of women’s menstrual cycle trajectories throughout their late reproductive life and identifies change points for mean and variability of segment length, providing insight into the menopausal process. More generally, our model points the way toward increasing use of joint mean-variance models to predict health outcomes and better understand disease processes. PMID:24729638
NASA Astrophysics Data System (ADS)
Abdussalam, Auwal; Monaghan, Andrew; Dukic, Vanja; Hayden, Mary; Hopson, Thomas; Leckebusch, Gregor
2013-04-01
Northwest Nigeria is a region with high risk of bacterial meningitis. Since the first documented epidemic of meningitis in Nigeria in 1905, the disease has been endemic in the northern part of the country, with epidemics occurring regularly. In this study we examine the influence of climate on the interannual variability of meningitis incidence and epidemics. Monthly aggregate counts of clinically confirmed hospital-reported cases of meningitis were collected in northwest Nigeria for the 22-year period spanning 1990-2011. Several generalized linear statistical models were fit to the monthly meningitis counts, including generalized additive models. Explanatory variables included monthly records of temperature, humidity, rainfall, wind speed, sunshine and dustiness from the weather stations nearest to the hospitals, and a time series of polysaccharide vaccination efficacy. The effects of other confounding factors (mainly non-climatic factors for which records were not available) were estimated as a smooth, monthly-varying function of time in the generalized additive models. Results reveal that the most important explanatory climatic variables are mean maximum monthly temperature, relative humidity and dustiness. Accounting for confounding factors (e.g., social processes) in the generalized additive models explains more of the year-to-year variation of meningococcal disease than generalized linear models that do not account for such factors. Promising results from several models that included only explanatory variables preceding the meningitis case data by one month suggest there may be potential for prediction of meningitis in northwest Nigeria to aid decision makers on this time scale.
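As a hedged illustration of the modeling approach (not the authors' actual model, which used several climatic covariates and additive terms), the following sketch fits a one-covariate Poisson generalized linear model with a log link by iteratively reweighted least squares. All data values are invented.

```python
# Minimal Poisson GLM (log link) fit by IRLS, sketched for one covariate.
# The "temperature"/case-count data below are synthetic, for illustration.

import math

def fit_poisson_glm(x, y, n_iter=25):
    """Fit log(E[y]) = b0 + b1*x by iteratively reweighted least squares."""
    b0 = math.log(sum(y) / len(y))  # start at the log of the mean count
    b1 = 0.0
    for _ in range(n_iter):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Working response z and weights w for the log link.
        z = [b0 + b1 * xi + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        w = mu
        # Weighted least squares for two coefficients (closed-form 2x2 solve).
        s_w = sum(w)
        s_wx = sum(wi * xi for wi, xi in zip(w, x))
        s_wxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        s_wz = sum(wi * zi for wi, zi in zip(w, z))
        s_wxz = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = s_w * s_wxx - s_wx ** 2
        b0 = (s_wxx * s_wz - s_wx * s_wxz) / det
        b1 = (s_w * s_wxz - s_wx * s_wz) / det
    return b0, b1

# Counts generated (noise-free, then rounded) from rate = exp(1.0 + 0.08*t):
temps = [20, 25, 30, 35, 38, 40]
cases = [round(math.exp(1.0 + 0.08 * t)) for t in temps]
b0, b1 = fit_poisson_glm(temps, cases)
```

A generalized additive model replaces the linear term b1*x with a smooth function of x (and, in the paper, a smooth function of time for the confounders), but the IRLS backbone is the same.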
NASA Astrophysics Data System (ADS)
Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt
2017-02-01
To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality, which requires the largest possible processing window. One such parameter is, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of weld seams within specified limits. Currently, the quality of laser welding processes is ensured by post-process methods, such as ultrasonic inspection, or by special in-process methods. These in-process systems provide only a simple evaluation that shows whether the weld seam is acceptable or not, and they use no feedback for changing control variables such as the speed of the laser or the laser power. In this paper the research group presents current results in the research field of online monitoring, online controlling and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is identified which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for this model and integrated into an NI Real-Time system.
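The control idea can be sketched minimally with an invented scalar model in place of the identified multi-variable state-space system. Here a one-step MPC chooses the laser-power adjustment that trades tracking error against control effort; all model coefficients and the "weld depth" interpretation are hypothetical.

```python
# Illustrative sketch only: scalar state-space model x[k+1] = a*x[k] + b*u[k]
# with a one-step model predictive controller. Real MPC uses the identified
# state-space model and a longer prediction horizon with constraints.

def mpc_step(x, r, a=0.9, b=0.5, lam=0.01):
    """Choose control input u minimizing (x_next - r)^2 + lam*u^2
    for the one-step prediction x_next = a*x + b*u (closed form)."""
    return b * (r - a * x) / (b * b + lam)

def simulate(r=2.0, n=30, a=0.9, b=0.5):
    """Drive the 'weld depth' x from 0 toward the reference r."""
    x = 0.0
    for _ in range(n):
        u = mpc_step(x, r, a, b)
        x = a * x + b * u   # plant assumed to match the model exactly
    return x
```

The control penalty lam leaves a small steady-state offset from r; a real controller would add integral action or a longer horizon, and the plant would of course not match the model exactly.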
Kusurkar, R A; Ten Cate, Th J; van Asperen, M; Croiset, G
2011-01-01
Motivation in learning behaviour and education is well-researched in general education, but less so in medical education. This review addresses two research questions in the light of the Self-Determination Theory (SDT) of motivation: 'How has the literature studied motivation as either an independent or dependent variable?' and 'How is motivation useful in predicting and understanding processes and outcomes in medical education?' A literature search performed using the PubMed, PsycINFO and ERIC databases resulted in 460 articles. The inclusion criteria were empirical research, specific measurement of motivation, and qualitative research studies with well-designed methodology. Only studies related to medical students or medical school were included. Findings of 56 articles were included in the review. Motivation as an independent variable appears to affect learning and study behaviour, academic performance, choice of medicine and specialty within medicine, and intention to continue medical study. Motivation as a dependent variable appears to be affected by age, gender, ethnicity, socioeconomic status, personality, year of medical curriculum, and teacher and peer support, none of which can be manipulated by medical educators. Motivation is also affected by factors that can be influenced, among which are autonomy, competence and relatedness, which have been described as the basic psychological needs important for intrinsic motivation according to SDT. Motivation is thus an independent variable in medical education influencing important outcomes, and also a dependent variable influenced by autonomy, competence and relatedness. This review finds some evidence in support of the validity of SDT in medical education.
A workflow for the 3D visualization of meteorological data
NASA Astrophysics Data System (ADS)
Helbig, Carolin; Rink, Karsten
2014-05-01
In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed, and large data sets are produced as a result of simulations. The combination of various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the weather research and forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and detect areas of interest such as regions of convection or wind turbulence. Then, subsets of data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether the visualizations are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. 
In addition, they have become an essential medium for the evaluation and verification of models. Particularly in interdisciplinary research projects, they support the scientists in discussions and help to set a general level of knowledge.
Brambila-Tapia, Aniel Jessica Leticia; Poot-Hernández, Augusto Cesar; Garcia-Guevara, Jose Fernando; Rodríguez-Vázquez, Katya
2016-06-01
To date, a few works have performed correlations of metabolic variables in bacteria; however, specific correlations with these variables have not been reported. In this work, we included 36 human pathogenic bacteria and 18 non-pathogenic or less-pathogenic related bacteria and obtained all metabolic variables, including enzymes, metabolic pathways, enzymatic steps and specific metabolic pathways, and enzymatic steps of particular metabolic processes, from a reliable metabolic database (KEGG). We then correlated the number of open reading frames (ORFs) with these variables and with the proportions of these variables, and observed a negative correlation with the proportion of enzymes (r = -0.506, p < 0.0001), metabolic pathways (r = -0.871, p < 0.0001), and enzymatic reactions (r = -0.749, p < 0.0001), and with the proportions of central metabolism variables, as well as a positive correlation with the proportions of multistep reactions (r = 0.650, p < 0.0001) and secondary metabolism variables. The proportion of multifunctional reactions (r = -0.114, p = 0.41) and the proportion of enzymatic steps (r = -0.205, p = 0.14) did not present a significant correlation. These correlations indicate that as the size of a genome (measured in the number of ORFs) increases, the proportion of genes that encode enzymes significantly diminishes (especially those related to central metabolism), suggesting that once the essential metabolic pathways are complete, an increase in the number of ORFs does not require a similar increase in metabolic pathways and enzymes; only a slight increase is sufficient to cope with a large genome.
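The underlying computation is a Pearson correlation between genome size and each metabolic proportion. A minimal sketch with invented data points (not the paper's genomes) illustrates the negative relationship reported above:

```python
# Pearson correlation between ORF count and the fraction of ORFs encoding
# enzymes. The six "genomes" below are hypothetical, for illustration only.

import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical genomes: ORF count vs. fraction of ORFs that are enzymes.
orfs = [1500, 2500, 4000, 5500, 7000, 9000]
enzyme_frac = [0.31, 0.28, 0.24, 0.22, 0.20, 0.18]
r = pearson_r(orfs, enzyme_frac)
```

With these invented values r comes out strongly negative, mirroring the paper's finding that larger genomes devote a smaller proportion of ORFs to enzymes.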
Are your covariates under control? How normalization can re-introduce covariate effects.
Pain, Oliver; Dudbridge, Frank; Ronald, Angelica
2018-04-30
Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
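Rank-based INT maps each observation's rank to a quantile of the standard normal distribution. A minimal sketch, assuming the common Blom offset c = 3/8 (the abstract does not state which offset the authors used, and ties are ignored here for simplicity):

```python
from statistics import NormalDist

def rank_based_int(values, c=3/8):
    """Rank-based inverse normal transformation with the Blom offset.

    Each value is replaced by Phi^{-1}((rank - c) / (n - 2c + 1)),
    where Phi^{-1} is the standard normal quantile function.
    Ties are not averaged in this simplified sketch.
    """
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):  # 1-based ranks
        ranks[i] = r
    inv = NormalDist().inv_cdf
    return [inv((r - c) / (n - 2 * c + 1)) for r in ranks]
```

The transformation preserves rank order and produces values symmetric about zero; per the study's recommendation, it would be applied to the raw dependent variable before, not after, regressing out covariates.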
Global Variability and Changes in Ocean Total Alkalinity from Aquarius Satellite
NASA Astrophysics Data System (ADS)
Fine, R. A.; Willey, D. A.; Millero, F. J., Jr.
2016-02-01
To document the effects of ocean acidification it is important to understand the processes and parameters that influence alkalinity. Alkalinity is a gauge of the ability of seawater to neutralize acids. We use Aquarius satellite data, which allow unprecedented global mapping of surface total alkalinity because alkalinity correlates strongly with salinity and, to a lesser extent, with temperature. Spatial variability in total alkalinity and salinity exceeds temporal variability, where the latter includes seasonal variations and differences relative to climatological data. The northern hemisphere has more spatial and monthly variability in total alkalinity and salinity, while the lower variability in Southern Ocean alkalinity is due to less salinity variability and the upwelling of waters enriched in alkalinity. Satellite alkalinity data provide a global baseline that can be used for comparison with future carbon data, and for evaluating spatial and temporal variability and past trends. For the first time it is shown that recent satellite-derived total alkalinity in the subtropics has increased compared with climatological data; this reflects large-scale changes in the global water cycle. Total alkalinity increases imply increased dissolution of calcareous minerals and difficulty for calcifying organisms in making their shells.
Evaluation of a 12-km Satellite-Era Reanalysis of Surface Mass Balance for the Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
Cullather, R. I.; Nowicki, S.; Zhao, B.; Max, S.
2016-12-01
The recent contribution to sea level change from the Greenland Ice Sheet is thought to be strongly driven by surface processes, including melt and runoff. Global reanalyses are a potential means of reconstructing the historical time series of ice sheet surface mass balance (SMB), but they lack the spatial resolution needed to resolve ablation areas along the periphery of the ice sheet. In this work, the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) is used to examine the spatial and temporal variability of surface melt over the Greenland Ice Sheet. MERRA-2 is produced for the period 1980 to the present at a grid spacing of ½° latitude by ⅝° longitude, and includes snow hydrology processes such as compaction, meltwater percolation and refreezing, runoff, and a prognostic surface albedo. The configuration of the MERRA-2 system allows the background model - the Goddard Earth Observing System model, version 5 (GEOS-5) - to be carried in phase space through analyzed states via the computation of analysis increments, a capability referred to as "replay". Here, a MERRA-2 replay integration is conducted in which atmospheric forcing fields are interpolated and adjusted to sub-atmospheric-grid-scale resolution. These adjustments include lapse-rate effects on temperature, humidity, precipitation, and other atmospheric variables that are known to have a strong elevation dependency over ice sheets. The surface coupling is performed such that mass and energy are conserved. The atmospheric forcing influences the surface representation, which operates on land surface tiles with an approximate 12-km spacing. This produces a high-resolution, downscaled SMB that is interactively coupled to the reanalysis model. We compare the downscaled SMB product with other reanalyses, regional climate model values, and a second MERRA-2 replay in which the background model has been replaced with a 12-km, non-hydrostatic version of GEOS-5.
The assessment focuses on regional changes in SMB and SMB components, the identification of changes and temporal variability in the SMB equilibrium line, and the relation between SMB and other climate variables related to general circulation.
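The lapse-rate adjustment described above corrects a coarse-grid temperature to the elevation of a finer surface tile. A minimal sketch of the idea; the constant 6.5 K/km rate and the function name are illustrative assumptions, not MERRA-2's actual downscaling scheme:

```python
def adjust_temperature(t_coarse_k, z_coarse_m, z_tile_m, lapse_rate=0.0065):
    """Adjust a coarse-grid air temperature (K) to a surface tile's
    elevation using a constant lapse rate (K per m of descent).

    Tiles below the coarse-grid elevation come out warmer,
    tiles above it come out colder.
    """
    return t_coarse_k + lapse_rate * (z_coarse_m - z_tile_m)
```

For example, a tile 1000 m above the coarse-grid elevation would be assigned a temperature 6.5 K colder than the grid value under this assumed rate.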
State and Trends of the Global Carbon Budget
NASA Astrophysics Data System (ADS)
Canadell, J.
2017-12-01
Long-term redistribution of carbon among fossil fuel reserves, the atmosphere, oceans and land largely determines the degree of the human perturbation of the atmosphere and the climate system. Here I'll show a number of diagnostics to characterize changes in the global carbon cycle, including: 1) the continued growth in atmospheric CO2 despite an apparent stabilization in the growth of fossil fuel emissions and the likely emissions decline from land use change; 2) the growth in the land and ocean sinks in response to the rise in excess atmospheric CO2 with large annual and decadal variability; and 3) key drivers of these trends including the global greening, spatial distribution of carbon sinks, and responses to inter-annual variability. Efforts to attribute driving processes to the growing sinks require a strong CO2 fertilization effect on vegetation growth, and emerging trends show an under-realized role of semiarid regions in contributing to the mean, trend and variability of the global land sink. Climate variability, including ENSO and the 2000s slowdown in terrestrial global warming, has produced opportunities to explore the drivers of global carbon fluxes as they take large departures from mean states (e.g., high rates of atmospheric CO2 accumulation along with no growth in fossil fuel emissions and strong land greening trends in recent years). Process attribution shows the strong interplay between gross primary productivity and heterotrophic respiration in response to warming, and the role of tropical and sub-tropical systems in the overall sink.
New advances in observations and data handling are critical in reducing uncertainties, including: 1) Bayesian fusion approaches to optimally combine multiple data streams of ocean and land uptake, and of fossil fuel and land use change emissions; 2) continuous landscape carbon density measurements and column CO2 from remotely sensed platforms; and 3) improved estimates of ocean circulation and CO2 uptake at decadal scales; among others. This presentation builds upon the work done by a team of international scientists under the umbrella of the Global Carbon Project.
Parrish, Anne-Maree; Yeatman, Heather; Iverson, Don; Russell, Ken
2012-04-01
School break times provide a daily opportunity for children to be active; however, research indicates this time is underutilized. Research on the reasons for children's low playground activity levels has primarily focused on physical barriers. This study aimed to complement findings on physical environmental influences on children's playground physical activity levels by identifying additional variables through interviews. Thirteen public schools were included in the sample (2946 children in total). Physical activity and environmental data were collected over 3 days. Environmental variables were manually assessed at each of the 13 schools. Observational data were used to determine which three schools were the most active and which three the least active. The principal, three teachers and 20 students in Grades 4-6 from these six schools (four lower and two average socioeconomic status) were invited to participate in the interview process. Student interviews used the paired interview technique. The main themes generated from the school interviews included the effect of non-fixed equipment (including balls), playground markings, playground aesthetics, activity preference, clothing, the amount of break time available for play, teacher playground involvement, gender, bullying, school policies, student confidence in break-time activity and fundamental movement skills. The effect of bullying on playground physical activity levels was concerning.
Delsignore, Aba
2008-08-01
To examine whether and how different patterns of psychotherapy history (no prior therapy, successful therapy experience, and unsuccessful therapy experience) affect the outcome of future treatment among patients undergoing cognitive-behavioural group therapy for social anxiety disorder. Fifty-seven patients with varying histories of psychotherapy participating in cognitive-behavioural group treatment for social anxiety disorder were included in the study. Symptom severity (including anxiety, depression, self-efficacy, and global symptom severity) was assessed at pre- and posttreatment. A therapist-rated measure of patient therapy engagement was included as a process variable. First-time therapy patients showed more favourable pretreatment variables and achieved greater benefit from group therapy. Among patients with unsuccessful therapy experience, substantial gains were attained by those who were able to actively engage in the therapy process. Patients rating previous therapies as successful benefited the least and tended to stagnate. Possible explanations for group differences and clinical implications are discussed. Prior psychotherapy experience affects the course of cognitive-behavioural group therapy in patients with social phobias. While patients with negative therapy experience may need extensive support in being and remaining actively engaged, those rating previous therapies as successful should be assessed very carefully and may benefit from a major focus on relational aspects.
Najafpoor, Ali Asghar; Jonidi Jafari, Ahmad; Hosseinzadeh, Ahmad; Khani Jazani, Reza; Bargozin, Hasan
2018-01-01
Treatment with a non-thermal plasma (NTP) is a new and effective technology recently applied to gas conversion for air pollution control. This research was initiated to optimize the application of the NTP process for benzene, toluene, ethylbenzene, and xylene (BTEX) removal. The effects of four variables, namely temperature, initial BTEX concentration, voltage, and flow rate, on the BTEX elimination efficiency were investigated using response surface methodology (RSM). The constructed model was evaluated by analysis of variance (ANOVA). The model goodness-of-fit and statistical significance were assessed using the coefficients of determination (R² and adjusted R²) and the F-test. The results revealed that R² was greater than 0.96 for BTEX removal efficiency. The statistical analysis demonstrated that the BTEX removal efficiency was significantly correlated with the temperature, BTEX concentration, voltage, and flow rate. Voltage was the most influential variable, exerting a significant effect (p < 0.0001) on the response variable. According to these results, NTP can be applied as a progressive, cost-effective, and practical process for treating airstreams polluted with BTEX under conditions of low residence time and high pollutant concentrations.
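The R² and adjusted R² used to judge the RSM model's goodness-of-fit follow standard definitions; a minimal sketch (generic formulas, not tied to the study's fitted model):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def adjusted_r_squared(r2, n, n_predictors):
    """Adjusted R^2 penalizes the number of model terms:
    1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)
```

Adjusted R² is reported alongside R² in RSM studies because adding terms to the polynomial model can only raise plain R², while the adjusted form falls if a term adds no explanatory value.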
Near-infrared Variability in the Orion Nebula Cluster
NASA Astrophysics Data System (ADS)
Rice, Thomas S.; Reipurth, Bo; Wolk, Scott J.; Vaz, Luiz Paulo; Cross, N. J. G.
2015-10-01
Using UKIRT on Mauna Kea, we have carried out a new near-infrared J, H, K monitoring survey of almost a square degree of the star-forming Orion Nebula Cluster with observations on 120 nights over three observing seasons, spanning a total of 894 days. We monitored ~15,000 stars down to J ≈ 20 using the WFCAM instrument, and have extracted 1203 significantly variable stars from our data. By studying variability in young stellar objects (YSOs) in the H - K, K color-magnitude diagram, we are able to distinguish between physical mechanisms of variability. Many variables show color behavior indicating either dust-extinction or disk/accretion activity, but we find that when monitored for longer periods of time, a number of stars shift between these two variability mechanisms. Further, we show that the intrinsic timescale of disk/accretion variability in young stars is longer than that of dust-extinction variability. We confirm that variability amplitude is statistically correlated with evolutionary class in all bands and colors. Our investigations of these 1203 variables have revealed 73 periodic AA Tau type variables, many large-amplitude and long-period (P > 15 days) YSOs, including three stars showing widely spaced periodic brightening events consistent with circumbinary disk activity, and four new eclipsing binaries. These phenomena and others indicate the activity of long-term disk/accretion variability processes taking place in young stars. We have made the light curves and associated data for these 1203 variables available online.
Clinical process in an integrative psychotherapy for self-wounds.
Wolfe, Barry E
2013-09-01
In this article, I briefly describe the clinical process of an integrative psychotherapy for the healing of self-wounds, including its intended interventions and the variability of their application and outcome. Four specific strategies are considered: (a) the role of empathy throughout the course of therapy; (b) exposure therapy as a paradigmatic approach to the treatment of feared thoughts, behavior, and emotions; (c) focusing and other experiential interventions for eliciting self-wounds; and (d) modification and healing of self-wounds with an individualized array of psychodynamic, experiential, and cognitive-behavioral strategies. In addition, I briefly consider the impact of transference and countertransference on the trajectory of therapy. © 2013 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Karenyi, Natasha; Sink, Kerry; Nel, Ronel
2016-02-01
Marine unconsolidated sediment habitats, the largest benthic ecosystem, are considered physically controlled ecosystems driven by a number of local physical processes. Depth and sediment type are recognised as key drivers of these ecosystems. Seascape (i.e., marine landscape) habitat classifications are based solely on consistent geophysical features and provide an opportunity to define unconsolidated sediment habitats based on processes whose distribution may vary through space and time. This paper aimed to classify unconsolidated sediment seascapes and explore their diversity in an eastern boundary upwelling region at the macro-scale, using the South African west coast as a case study. Physical variables such as sediment grain size, depth and upwelling-related variables (i.e., maximum chlorophyll concentration, austral summer bottom oxygen concentration and sediment organic carbon content) were included in the analyses. These variables were directly measured through sampling, or collated from existing databases and the literature. The data were analysed using multivariate Cluster, Principal Components Ordination and SIMPER analyses (in PRIMER 6 with the PERMANOVA+ add-on package). There were four main findings: (i) eight seascapes were identified for the South African west coast based on depth, slope, sediment grain size and upwelling-related variables; (ii) three depth zones were distinguished (inner, middle and outer shelf); (iii) seascape diversity in the inner and middle shelves was greater than on the outer shelf; and (iv) upwelling-related variables were responsible for the habitat diversity in both the inner and middle shelves. This research demonstrates that the inclusion of productivity and its related variables, such as hypoxia and sedimentary organic carbon, in seascape classifications will enhance the ability to distinguish seascapes on continental shelves, where productivity is most variable.
NASA Astrophysics Data System (ADS)
Chen, M.; Keenan, T. F.; Hufkens, K.; Munger, J. W.; Bohrer, G.; Brzostek, E. R.; Richardson, A. D.
2014-12-01
Carbon dynamics in terrestrial ecosystems are influenced by both abiotic and biotic factors. Abiotic factors, such as variation in meteorological conditions, directly drive biophysical and biogeochemical processes; biotic factors, referring to the inherent properties of the ecosystem components, reflect internal regulating effects including temporal dynamics and memory. The magnitude of the effects of abiotic and biotic factors on forest ecosystem carbon exchange has been suggested to vary at different time scales. In this study, we design and conduct a model-data fusion experiment to investigate the role and relative importance of biotic and abiotic factors for the inter-annual variability of net ecosystem CO2 exchange (NEE) of temperate deciduous forest ecosystems in the Northeastern US. A process-based model (FöBAAR) is parameterized at four eddy-covariance sites using all available flux and biometric measurements. We conducted a "transplant" modeling experiment, that is, cross-site simulations with different combinations of site meteorology and model parameters. Using wavelet analysis and variance partitioning techniques, analysis of the model predictions identifies both spatially variant and spatially invariant parameters. Variability of NEE was primarily modulated by gross primary productivity (GPP), with relative contributions varying from hourly to yearly time scales. The inter-annual variability of GPP and NEE is regulated more by meteorological forcing, but spatial variability in certain model parameters (biotic response) has more substantial effects on the inter-annual variability of ecosystem respiration (Reco) through effects on carbon pools. Both biotic and abiotic factors play significant roles in modulating the spatial and temporal variability of terrestrial carbon cycling in the region.
Together, our study quantifies the relative importance of both, and calls for better understanding of them to better predict regional CO2 exchanges.
The East Asian Jet Stream and Asian-Pacific Climate
NASA Technical Reports Server (NTRS)
Yang, Song; Lau, K.-M.; Kim, K.-M.
1999-01-01
In this study, the NASA GEOS and NCEP/NCAR reanalyses and GPCP rainfall data have been used to study the variability of the East Asian westerly jet stream and its impact on the Asian-Pacific climate, with a focus on interannual time scales. Results indicate that external forcings such as sea surface temperature (SST) and land surface processes also play an important role in the variability of the jet, although this variability is strongly governed by internal dynamics. There is a close link between the jet and Asian-Pacific climate, including the Asian winter monsoon and tropical convection. The atmospheric teleconnection pattern associated with the jet is different from the ENSO-related pattern. The influence of the jet on eastern Pacific and North American climate is also discussed.
Austin, Bradley J; Hardgrave, Natalia; Inlander, Ethan; Gallipeau, Cory; Entrekin, Sally; Evans-White, Michelle A
2015-10-01
Construction of unconventional natural gas (UNG) infrastructure (e.g., well pads, pipelines) is an increasingly common anthropogenic stressor that increases potential sediment erosion. Increased sediment inputs into nearby streams may decrease autotrophic processes through burial and scour, or sediment-bound nutrients could have a positive effect by alleviating potential nutrient limitations. Ten streams with varying catchment UNG well densities (0-3.6 wells/km²) were sampled during winter and spring of 2010 and 2011 to examine relationships between landscape-scale disturbances associated with UNG activity and stream periphyton [chlorophyll a (Chl a)] and gross primary production (GPP). Local-scale variables including light availability and water column physicochemical variables were measured for each study site. Correlation analyses examined the relationships of autotrophic processes and local-scale variables with the landscape-scale variables percent pasture land use and UNG metrics (well density and well pad inverse flow path length). Both GPP and Chl a were primarily positively associated with the UNG activity metrics during most sample periods; however, neither the landscape variables nor the response variables correlated well with local-scale factors. These positive correlations do not confirm causation, but they do suggest that UNG development may alleviate one or more limiting factors on autotrophic production within these streams. A secondary manipulative study was used to examine the link between nutrient limitation and algal growth across a gradient of streams impacted by natural gas activity. Nitrogen limitation was common among minimally impacted stream reaches and was alleviated in streams with high UNG activity. These data provide evidence that UNG may stimulate the primary production of Fayetteville shale streams via alleviation of N-limitation.
Restricting UNG activities from the riparian zone along with better enforcement of best management practices should help reduce these possible impacts of UNG activities on stream autotrophic processes. Copyright © 2015 Elsevier B.V. All rights reserved.
Corsi, Steven R.; Borchardt, M. A.; Spencer, S. K.; Hughes, Peter E.; Baldwin, Austin K.
2014-01-01
To examine the occurrence, hydrologic variability, and seasonal variability of human and bovine viruses in surface water, three stream locations were monitored in the Milwaukee River watershed in Wisconsin, USA, from February 2007 through June 2008. Monitoring sites included an urban subwatershed, a rural subwatershed, and the Milwaukee River at the mouth. To collect samples that characterize variability throughout changing hydrologic periods, a process control system was developed for unattended, large-volume (56–2800 L) filtration over extended durations. This system provided flow-weighted mean concentrations during runoff and extended (24-h) low-flow periods. Human viruses and bovine viruses were detected by real-time qPCR in 49% and 41% of samples (n = 63), respectively. All human viruses analyzed were detected at least once including adenovirus (40% of samples), GI norovirus (10%), enterovirus (8%), rotavirus (6%), GII norovirus (1.6%) and hepatitis A virus (1.6%). Three of seven bovine viruses analyzed were detected including bovine polyomavirus (32%), bovine rotavirus (19%), and bovine viral diarrhea virus type 1 (5%). Human viruses were present in 63% of runoff samples resulting from precipitation and snowmelt, and 20% of low-flow samples. Maximum human virus concentrations exceeded 300 genomic copies/L. Bovine viruses were present in 46% of runoff samples resulting from precipitation and snowmelt and 14% of low-flow samples. The maximum bovine virus concentration was 11 genomic copies/L. Statistical modeling indicated that stream flow, precipitation, and season explained the variability of human viruses in the watershed, and hydrologic condition (runoff event or low-flow) and season explained the variability of the sum of human and bovine viruses; however, no model was identified that could explain the variability of bovine viruses alone. 
Understanding the factors that affect virus fate and transport in rivers will aid watershed management for minimizing human exposure and disease transmission.
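The flow-weighted mean concentrations described above follow the standard definition FWMC = Σ(c_i · q_i · Δt_i) / Σ(q_i · Δt_i). A minimal sketch with hypothetical values (not the study's data or sampling system):

```python
def flow_weighted_mean_conc(concs, flows, dts=None):
    """Flow-weighted mean concentration over a sampling period.

    concs: concentrations c_i, flows: discharges q_i,
    dts: interval lengths (defaults to equal intervals).
    """
    if dts is None:
        dts = [1.0] * len(concs)
    num = sum(c * q * dt for c, q, dt in zip(concs, flows, dts))
    den = sum(q * dt for q, dt in zip(flows, dts))
    return num / den
```

Weighting by discharge ensures that samples collected during high-flow runoff, which carry most of the load, dominate the mean rather than being averaged equally with low-flow samples.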
NASA Astrophysics Data System (ADS)
Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand
2016-04-01
A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes including aqueous speciation, pH-dependent sorption and colloid-facilitated transport is not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g., organic matter pools, rate equations, parameter dependency on environmental conditions) and in this way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model.
The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (bio-diffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic matter models with generic and flexible reactive transport codes offers a valuable tool to enhance insights into coupled physico-chemical processes at different scales within the scope of C-biogeochemical cycles, possibly linked with other chemical elements such as plant nutrients and pollutants.
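The first-order decomposition kinetics in the benchmark have the closed form C(t) = C₀·e^(−kt); the pluggable-rate-law flexibility the abstract emphasizes can be sketched with a generic integrator. This is an illustrative sketch, not the HPx or MIN3P implementation:

```python
from math import exp

def decay_first_order(c0, k, t):
    """Analytical solution of first-order kinetics dC/dt = -k * C."""
    return c0 * exp(-k * t)

def integrate(rate, c0, dt, steps):
    """Explicit Euler integration of dC/dt = rate(C).

    Passing the rate law as a function stands in for the way a
    generic reactive transport code lets the user swap in
    first-order, linear, or nonlinear (biomass-dependent) kinetics.
    """
    c = c0
    for _ in range(steps):
        c += dt * rate(c)
    return c
```

A nonlinear dependence on a biomass pool B, for instance, could be supplied as `lambda c: -k * B * c / (K + c)` without changing the integrator, mirroring the extensibility argument made above.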
Lapping, Karin; Frongillo, Edward A; Nguyen, Phuong H; Coates, Jennifer; Webb, Patrick; Menon, Purnima
2014-09-01
Translating national policies and guidelines into effective action at the subnational level (e.g., province or region) is a prerequisite for ensuring an impact on nutrition. In several countries, including Vietnam, the focus of this paper, this process is affected by the quality of the decentralized process of planning and action. This study examined how provincial planning processes for nutrition occurred in Vietnam during 2009 and 2010. Key goals were to understand variability in processes across provinces, identify factors that influenced the process, and assess the usefulness of the process for individuals involved in planning and action. A qualitative case-study methodology was used. Data were drawn from interviews with 51 government officials in eight provinces. The study found little variability in the planning process among these eight provinces, probably due to a planning process that was predominantly a fiscal exercise within the confines of a largely centralized structure. Respondents were almost unanimous about the main barriers: a top-down approach to planning, limited human capacity for effective planning at subnational levels, and difficulty in integrating actions from multiple sectors. Provincial-level actors were deeply dissatisfied with the nature of their role in the process. Despite the rhetoric to the contrary, too much power is probably still retained at the central level. A strategic multiyear approach is needed to strengthen the provincial planning process and address many of the key barriers identified in this study.
ERIC Educational Resources Information Center
Dornan, Tim; Muijtjens, Arno; Graham, Jennifer; Scherpbier, Albert; Boshuizen, Henny
2012-01-01
The drive to quality-manage medical education has created a need for valid measurement instruments. Validity evidence includes the theoretical and contextual origin of items, choice of response processes, internal structure, and interrelationship of a measure's variables. This research set out to explore the validity and potential utility of an…
The Biasing Effects of Unmodeled ARMA Time Series Processes on Latent Growth Curve Model Estimates
ERIC Educational Resources Information Center
Sivo, Stephen; Fan, Xitao; Witta, Lea
2005-01-01
The purpose of this study was to evaluate the robustness of estimated growth curve models when there is stationary autocorrelation among manifest variable errors. The results suggest that when, in practice, growth curve models are fitted to longitudinal data, alternative rival hypotheses to consider would include growth models that also specify…
A Ballistic Model of Choice Response Time
ERIC Educational Resources Information Center
Brown, Scott; Heathcote, Andrew
2005-01-01
Almost all models of response time (RT) use a stochastic accumulation process. To account for the benchmark RT phenomena, researchers have found it necessary to include between-trial variability in the starting point and/or the rate of accumulation, both in linear (R. Ratcliff & J. N. Rouder, 1998) and nonlinear (M. Usher & J. L. McClelland, 2001)…
Reservoirs are a globally important source of methane (CH4) to the atmosphere, but measuring CH4 emission rates from reservoirs is difficult due to the spatial and temporal variability of the various emission pathways, including ebullition and diffusion. We used the eddy covarian...
Colour in Learning: Its Effect on the Retention Rate of Graduate Students
ERIC Educational Resources Information Center
Olurinola, Oluwakemi; Tayo, Omoniyi
2015-01-01
Cognitive psychologists have discovered different design principles to enhance memory performance. It has been said that the retrieval process depends on many variables, one of which is colour. This paper provides an overview of research on colour and learning. It includes the effect of colour on attention, retention and memory performance, and…
USDA-ARS?s Scientific Manuscript database
For decades, the importance of evapotranspiration processes has been recognized in many disciplines, including hydrologic and drainage studies, irrigation systems design and management. A wide number of equations have been proposed to estimate crop reference evapotranspiration, ET0, based on the var...
Danny L. Fry; Scott L. Stephens; Brandon M. Collins; Malcolm North; Ernesto Franco-Vizcaino; Samantha J. Gill
2014-01-01
In Mediterranean environments in western North America, historic fire regimes in frequent-fire conifer forests are highly variable both temporally and spatially. This complexity influenced forest structure and spatial patterns, but some of this diversity has been lost due to anthropogenic disruption of ecosystem processes, including fire. Information from reference...
ERIC Educational Resources Information Center
Dore, Melissa L.
2017-01-01
This applied dissertation was conducted to provide the graduate program in marine sciences a valid predictor for success in the admissions scoring systems that include the general Graduate Record Exam. The dependent variable was persistence: successfully graduating from the marine sciences master's programs. This dissertation evaluated other…
Spatiotemporal Variability of Hillslope Soil Moisture Across Steep, Highly Dissected Topography
NASA Astrophysics Data System (ADS)
Jarecke, K. M.; Wondzell, S. M.; Bladon, K. D.
2016-12-01
Hillslope ecohydrological processes, including subsurface water flow and plant water uptake, are strongly influenced by soil moisture. However, the factors controlling spatial and temporal variability of soil moisture in steep, mountainous terrain are poorly understood. We asked: How do topography and soils interact to control the spatial and temporal variability of soil moisture in steep, Douglas-fir-dominated hillslopes in the western Cascades? We will present a preliminary analysis of bimonthly soil moisture variability from July to November 2016 at 0-30 and 0-60 cm depth across spatially extensive convergent and divergent topographic positions in Watershed 1 of the H.J. Andrews Experimental Forest in central Oregon. Soil moisture monitoring locations were selected following a 5 m LiDAR analysis of topographic position, aspect, and slope. Topographic position index (TPI) was calculated as the difference between each point's elevation and the mean elevation within a 30 m radius. Convergent (negative TPI values) and divergent (positive TPI values) monitoring locations were established along northwest- to northeast-facing aspects and on 25-55 degree slopes. We hypothesized that topographic position (convergent vs. divergent), as well as soil physical properties (e.g., texture, bulk density), control variation in hillslope soil moisture at the sub-watershed scale. In addition, we expected the relative importance of hillslope topography to the spatial variability in soil moisture to differ seasonally. By comparing the spatiotemporal variability of hillslope soil moisture across topographic positions, our research provides a foundation for additional understanding of subsurface flow processes and plant-available soil water in forests with steep, highly dissected terrain.
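The TPI computation described above (a point's elevation minus the mean elevation within a 30 m radius) can be sketched as follows on a regular raster DEM; the function name and the uniform 5 m grid assumption are mine, not from the study:

```python
import numpy as np
from scipy import ndimage


def topographic_position_index(dem, cell_size=5.0, radius=30.0):
    """TPI: elevation minus mean elevation within a circular radius.

    Negative values indicate convergent (valley-like) positions;
    positive values indicate divergent (ridge-like) positions.
    dem: 2D array of elevations on a regular grid of `cell_size` meters.
    """
    r = int(radius // cell_size)
    # Circular averaging kernel of the given radius, in grid cells.
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    kernel = (x**2 + y**2 <= r**2).astype(float)
    kernel /= kernel.sum()
    mean_elev = ndimage.convolve(dem, kernel, mode="nearest")
    return dem - mean_elev
```

A local depression (pit) in an otherwise flat DEM yields a negative TPI at the pit, matching the convergent/divergent classification used in the abstract.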
Fontana, Silvia Alicia; Raimondi, Waldina; Rizzo, María Laura
2014-09-05
Sleep quality refers not only to sleeping well at night but also to appropriate daytime functioning. Poor sleep quality can affect a variety of attention processes. The aim of this investigation was to evaluate the relationship between perceived sleep quality and selective attention in a group of college students. A descriptive cross-sectional study was carried out with 52 Argentinian college students of the Universidad Adventista del Plata. The Pittsburgh Sleep Quality Index, the Continuous Performance Test and the Trail Making Test were applied. The main results indicate that the students slept an average of 6.48 hours. Roughly half of the population tested had good sleep quality; however, the dispersion seen in some components demonstrates the heterogeneity of the sample on these variables. The attention processes evaluated showed different levels of alteration in the total sample: the greatest variability was detected in the selective-attention and divided-attention processes, while a lower percentage of alteration was observed in attentional support. Poor sleep quality has more impact on the subprocesses with greater participation of corticocortical circuits (selective and divided attention) and greater involvement of the prefrontal cortex. Fewer difficulties were found in the attention-support processes, which rely on subcortical regions and have less frontal involvement.
Cognitive components of a mathematical processing network in 9-year-old children.
Szűcs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence
2014-07-01
We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are phonological processing, verbal knowledge, visuo-spatial short-term and working memory, spatial ability and general executive functioning. The model was highly specific to predicting arithmetic performance. There were no strong relations between mathematical achievement and verbal short-term and working memory, sustained attention, response inhibition, finger knowledge and symbolic number comparison performance. Non-verbal intelligence measures were also non-significant predictors when added to our model. Number sense variables were non-significant predictors in the model and they were also non-significant predictors when entered into regression analysis with only a single visuo-spatial WM measure. Number sense variables were predicted by sustained attention. Results support a network theory of mathematical competence in primary school children and falsify the importance of a proposed modular 'number sense'. We suggest an 'executive memory function centric' model of mathematical processing. Mapping a complex processing network requires that studies consider the complex predictor space of mathematics rather than just focusing on a single or a few explanatory factors.