To twist, roll, stroke or poke? A study of input devices for menu navigation in the cockpit.
Stanton, Neville A; Harvey, Catherine; Plant, Katherine L; Bolton, Luke
2013-01-01
Modern interfaces within the aircraft cockpit integrate many flight management system (FMS) functions into a single system. The success of a user's interaction with an interface depends upon the optimisation between the input device, tasks and environment within which the system is used. In this study, four input devices were evaluated using a range of Human Factors methods, in order to assess aspects of usability including task interaction times, error rates, workload, subjective usability and physical discomfort. The performance of the four input devices was compared using a holistic approach and the findings showed that no single input device produced consistently high performance scores across all of the variables evaluated. The touch screen produced the highest number of 'best' scores; however, discomfort ratings for this device were high, suggesting that it is not an ideal solution as both physical and cognitive aspects of performance must be accounted for in design. This study evaluated four input devices for control of a screen-based flight management system. A holistic approach was used to evaluate both cognitive and physical performance. Performance varied across the dependent variables and between the devices; however, the touch screen produced the largest number of 'best' scores.
Sensitivity analysis and nonlinearity assessment of steam cracking furnace process
NASA Astrophysics Data System (ADS)
Rosli, M. N.; Sudibyo, Aziz, N.
2017-11-01
In this paper, sensitivity analysis and nonlinearity assessment of cracking furnace process are presented. For the sensitivity analysis, the fractional factorial design method is employed as a method to analyze the effect of input parameters, which consist of four manipulated variables and two disturbance variables, to the output variables and to identify the interaction between each parameter. The result of the factorial design method is used as a screening method to reduce the number of parameters, and subsequently, reducing the complexity of the model. It shows that out of six input parameters, four parameters are significant. After the screening is completed, step test is performed on the significant input parameters to assess the degree of nonlinearity of the system. The result shows that the system is highly nonlinear with respect to changes in an air-to-fuel ratio (AFR) and feed composition.
Variance-based interaction index measuring heteroscedasticity
NASA Astrophysics Data System (ADS)
Ito, Keiichi; Couckuyt, Ivo; Poles, Silvia; Dhaene, Tom
2016-06-01
This work is motivated by the need to deal with models with high-dimensional input spaces of real variables. One way to tackle high-dimensional problems is to identify interaction or non-interaction among input parameters. We propose a new variance-based sensitivity interaction index that can detect and quantify interactions among the input variables of mathematical functions and computer simulations. The computation is very similar to first-order sensitivity indices by Sobol'. The proposed interaction index can quantify the relative importance of input variables in interaction. Furthermore, detection of non-interaction for screening can be done with as low as 4 n + 2 function evaluations, where n is the number of input variables. Using the interaction indices based on heteroscedasticity, the original function may be decomposed into a set of lower dimensional functions which may then be analyzed separately.
Data-driven process decomposition and robust online distributed modelling for large-scale processes
NASA Astrophysics Data System (ADS)
Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou
2018-02-01
With the increasing attention of networked control, system decomposition and distributed models show significant importance in the implementation of model-based control strategy. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm was proposed for large-scale chemical processes. The key controlled variables are first partitioned by affinity propagation clustering algorithm into several clusters. Each cluster can be regarded as a subsystem. Then the inputs of each subsystem are selected by offline canonical correlation analysis between all process variables and its controlled variables. Process decomposition is then realised after the screening of input and output variables. When the system decomposition is finished, the online subsystem modelling can be carried out by recursively block-wise renewing the samples. The proposed algorithm was applied in the Tennessee Eastman process and the validity was verified.
NASA Astrophysics Data System (ADS)
Périllat, Raphaël; Girard, Sylvain; Korsakissok, Irène; Mallet, Vinien
2015-04-01
In a previous study, the sensitivity of a long distance model was analyzed on the Fukushima Daiichi disaster case with the Morris screening method. It showed that a few variables, such as horizontal diffusion coefficient or clouds thickness, have a weak influence on most of the chosen outputs. The purpose of the present study is to apply a similar methodology on the IRSN's operational short distance atmospheric dispersion model, called pX. Atmospheric dispersion models are very useful in case of accidental releases of pollutant to minimize the population exposure during the accident and to obtain an accurate assessment of short and long term environmental and sanitary impact. Long range models are mostly used for consequences assessment while short range models are more adapted to the early phases of the crisis and are used to make prognosis. The Morris screening method was used to estimate the sensitivity of a set of outputs and to rank the inputs by their influences. The input ranking is highly dependent on the considered output, but a few variables seem to have a weak influence on most of them. This first step revealed that interactions and non-linearity are much more pronounced with the short range model than with the long range one. Afterward, the Sobol screening method was used to obtain more quantitative results on the same set of outputs. Using this method was possible for the short range model because it is far less computationally demanding than the long range model. The study also confronts two parameterizations, Doury's and Pasquill's models, to contrast their behavior. The Doury's model seems to excessively inflate the influence of some inputs compared to the Pasquill's model, such as the altitude of emission and the air stability which do not have the same role in the two models. The outputs of the long range model were dominated by only a few inputs. On the contrary, in this study the influence is shared more evenly between the inputs.
Development and weighting of a life cycle assessment screening model
NASA Astrophysics Data System (ADS)
Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard
2004-02-01
Nearly all life cycle assessment tools available today are high priced, comprehensive and quantitative models requiring a significant amount of data collection and data input. In addition, most of the available software packages require a great deal of training time to learn how to operate the model software. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development, design teams and environmental specialists need a simplified tool that will allow for the qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health and safety factors, based on site or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site or product-specific variables. The user can then evaluate various design changes and the apparent impact or improvement on the environment, health and safety, compliance cost and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return of investment, and recycle are evaluated. The flexibility of the model format will be discussed in order to demonstrate the applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores will be compared to other population input results.
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, R. C.
1992-01-01
A user's manual is presented for MacPASCO, which is an interactive, graphic, preprocessor for panel design. MacPASCO creates input for PASCO, an existing computer code for structural analysis and sizing of longitudinally stiffened composite panels. MacPASCO provides a graphical user interface which simplifies the specification of panel geometry and reduces user input errors. The user draws the initial structural geometry and reduces user input errors. The user draws the initial structural geometry on the computer screen, then uses a combination of graphic and text inputs to: refine the structural geometry; specify information required for analysis such as panel load and boundary conditions; and define design variables and constraints for minimum mass optimization. Only the use of MacPASCO is described, since the use of PASCO has been documented elsewhere.
1997-04-01
DATA COLLABORATORS 0001N B NQ 8380 NUMBER OF DATA RECEIVERS 0001N B NQ 2533 AUTHORIZED ITEM IDENTIFICATION DATA COLLABORATOR CODE 0002 ,X B 03 18 TD...01 NC 8268 DATA ELEMENT TERMINATOR CODE 000iX VT 9505 TYPE OF SCREENING CODE 0001A 01 NC 8268 DATA ELEMENT TERMINATOR CODE 000iX VT 4690 OUTPUT DATA... 9505 TYPE OF SCREENING CODE 0001A 2 89 2910 REFERENCE NUMBER CATEGORY CODE (RNCC) 0001X 2 89 4780 REFERENCE NUMBER VARIATION CODE (RNVC) 0001 N 2 89
Duarte, José M; Barbier, Içvara; Schaerli, Yolanda
2017-11-17
Synthetic biologists increasingly rely on directed evolution to optimize engineered biological systems. Applying an appropriate screening or selection method for identifying the potentially rare library members with the desired properties is a crucial step for success in these experiments. Special challenges include substantial cell-to-cell variability and the requirement to check multiple states (e.g., being ON or OFF depending on the input). Here, we present a high-throughput screening method that addresses these challenges. First, we encapsulate single bacteria into microfluidic agarose gel beads. After incubation, they harbor monoclonal bacterial microcolonies (e.g., expressing a synthetic construct) and can be sorted according their fluorescence by fluorescence activated cell sorting (FACS). We determine enrichment rates and demonstrate that we can measure the average fluorescent signals of microcolonies containing phenotypically heterogeneous cells, obviating the problem of cell-to-cell variability. Finally, we apply this method to sort a pBAD promoter library at ON and OFF states.
NASA Technical Reports Server (NTRS)
Jones, Denise R.
1990-01-01
A piloted simulation study was conducted comparing three different input methods for interfacing to a large-screen, multiwindow, whole-flight-deck display for management of transport aircraft systems. The thumball concept utilized a miniature trackball embedded in a conventional side-arm controller. The touch screen concept provided data entry through a capacitive touch screen. The voice concept utilized a speech recognition system with input through a head-worn microphone. No single input concept emerged as the most desirable method of interacting with the display. Subjective results, however, indicate that the voice concept was the most preferred method of data entry and had the most potential for future applications. The objective results indicate that, overall, the touch screen concept was the most effective input method. There was also significant differences between the time required to perform specific tasks and the input concept employed, with each concept providing better performance relative to a specific task. These results suggest that a system combining all three input concepts might provide the most effective method of interaction.
Micro Computer Feedback Report for the Strategic Leader Development Inventory
1993-05-01
POS or NEG variables CALL CREATE MEM DIR ;make a memory directory JC SELS ;exat I error CALL SELECT-SCREEN ;dlsplay select screen JC SEL4 ;no flles in...get keyboaI Input CMP AL,1Bh3 ;ls I an Esc key ? JNZ SEL2 ;X not goto nrod test G-95 JMP SEL4 ;Exit SEL2: CMP AL,OOh Iskapick? JZ SEL ;I YES exit loop...position CALL READ DATE ;gat DOS daoe od 4e CALL F4ND -ERO ;kxlae OW In data ue JC SEL.5 SEL4 : CALL RELEASE MEM DIR ;release meu block CLC ;cler carry fag
Ding, Yao; Thompson, John D; Kobrynski, Lisa; Ojodu, Jelili; Zarbalian, Guisou; Grosse, Scott D
2016-05-01
To evaluate the expected cost-effectiveness and net benefit of the recent implementation of newborn screening (NBS) for severe combined immunodeficiency (SCID) in Washington State. We constructed a decision analysis model to estimate the costs and benefits of NBS in an annual birth cohort of 86 600 infants based on projections of avoided infant deaths. Point estimates and ranges for input variables, including the birth prevalence of SCID, proportion detected asymptomatically without screening through family history, screening test characteristics, survival rates, and costs of screening, diagnosis, and treatment were derived from published estimates, expert opinion, and the Washington NBS program. We estimated treatment costs stratified by age of identification and SCID type (with or without adenosine deaminase deficiency). Economic benefit was estimated using values of $4.2 and $9.0 million per death averted. We performed sensitivity analyses to evaluate the influence of key variables on the incremental cost-effectiveness ratio (ICER) of net direct cost per life-year saved. Our model predicts an additional 1.19 newborn infants with SCID detected preclinically through screening, in addition to those who would have been detected early through family history, and 0.40 deaths averted annually. Our base-case model suggests an ICER of $35 311 per life-year saved, and a benefit-cost ratio of either 5.31 or 2.71. Sensitivity analyses found ICER values <$100 000 and positive net benefit for plausible assumptions on all variables. Our model suggests that NBS for SCID in Washington is likely to be cost-effective and to show positive net economic benefit. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Shin, K. H.; Kim, K. H.; Ki, S. J.; Lee, H. G.
2017-12-01
The vulnerability assessment tool at a Tier 1 level, although not often used for regulatory purposes, helps establish pollution prevention and management strategies in the areas of potential environmental concern such as soil and ground water. In this study, the Neural Network Pattern Recognition Tool embedded in MATLAB was used to allow the initial screening of soil and groundwater pollution based on data compiled across about 1000 previously contaminated sites in Korea. The input variables included a series of parameters which were tightly related to downward movement of water and contaminants through soil and ground water, whereas multiple classes were assigned to the sum of concentrations of major pollutants detected. Results showed that in accordance with diverse pollution indices for soil and ground water, pollution levels in both media were strongly modulated by site-specific characteristics such as intrinsic soil and other geologic properties, in addition to pollution sources and rainfall. However, classification accuracy was very sensitive to the number of classes defined as well as the types of the variables incorporated, requiring careful selection of input variables and output categories. Therefore, we believe that the proposed methodology is used not only to modify existing pollution indices so that they are more suitable for addressing local vulnerability, but also to develop a unique assessment tool to support decision making based on locally or nationally available data. This study was funded by a grant from the GAIA project(2016000560002), Korea Environmental Industry & Technology Institute, Republic of Korea.
Back propagation artificial neural network for community Alzheimer's disease screening in China.
Tang, Jun; Wu, Lei; Huang, Helang; Feng, Jiang; Yuan, Yefeng; Zhou, Yueping; Huang, Peng; Xu, Yan; Yu, Chao
2013-01-25
Alzheimer's disease patients diagnosed with the Chinese Classification of Mental Disorders diagnostic criteria were selected from the community through on-site sampling. Levels of macro and trace elements were measured in blood samples using an atomic absorption method, and neurotransmitters were measured using a radioimmunoassay method. SPSS 13.0 was used to establish a database, and a back propagation artificial neural network for Alzheimer's disease prediction was simulated using Clementine 12.0 software. With scores of activities of daily living, creatinine, 5-hydroxytryptamine, age, dopamine and aluminum as input variables, the results revealed that the area under the curve in our back propagation artificial neural network was 0.929 (95% confidence interval: 0.868-0.968), sensitivity was 90.00%, specificity was 95.00%, and accuracy was 92.50%. The findings indicated that the results of back propagation artificial neural network established based on the above six variables were satisfactory for screening and diagnosis of Alzheimer's disease in patients selected from the community.
Back propagation artificial neural network for community Alzheimer's disease screening in China★
Tang, Jun; Wu, Lei; Huang, Helang; Feng, Jiang; Yuan, Yefeng; Zhou, Yueping; Huang, Peng; Xu, Yan; Yu, Chao
2013-01-01
Alzheimer's disease patients diagnosed with the Chinese Classification of Mental Disorders diagnostic criteria were selected from the community through on-site sampling. Levels of macro and trace elements were measured in blood samples using an atomic absorption method, and neurotransmitters were measured using a radioimmunoassay method. SPSS 13.0 was used to establish a database, and a back propagation artificial neural network for Alzheimer's disease prediction was simulated using Clementine 12.0 software. With scores of activities of daily living, creatinine, 5-hydroxytryptamine, age, dopamine and aluminum as input variables, the results revealed that the area under the curve in our back propagation artificial neural network was 0.929 (95% confidence interval: 0.868–0.968), sensitivity was 90.00%, specificity was 95.00%, and accuracy was 92.50%. The findings indicated that the results of back propagation artificial neural network established based on the above six variables were satisfactory for screening and diagnosis of Alzheimer's disease in patients selected from the community. PMID:25206598
Mastoid vibration affects dynamic postural control during gait in healthy older adults
NASA Astrophysics Data System (ADS)
Chien, Jung Hung; Mukherjee, Mukul; Kent, Jenny; Stergiou, Nicholas
2017-01-01
Vestibular disorders are difficult to diagnose early due to the lack of a systematic assessment. Our previous work has developed a reliable experimental design and the result shows promising results that vestibular sensory input while walking could be affected through mastoid vibration (MV) and changes are in the direction of motion. In the present paper, we wanted to extend this work to older adults and investigate how manipulating sensory input through mastoid vibration (MV) could affect dynamic postural control during walking. Three levels of MV (none, unilateral, and bilateral) applied via vibrating elements placed on the mastoid processes were combined with the Locomotor Sensory Organization Test (LSOT) paradigm to challenge the visual and somatosensory systems. We hypothesized that the MV would affect sway variability during walking in older adults. Our results revealed that MV significantly not only increased the amount of sway variability but also decreased the temporal structure of sway variability only in anterior-posterior direction. Importantly, the bilateral MV stimulation generally produced larger effects than the unilateral. This is an important finding that confirmed our experimental design and the results produced could guide a more reliable screening of vestibular system deterioration.
Noise screen for attitude control system
NASA Technical Reports Server (NTRS)
Rodden, John J. (Inventor); Stevens, Homer D. (Inventor); Hong, David P. (Inventor); Hirschberg, Philip C. (Inventor)
2002-01-01
An attitude control system comprising a controller and a noise screen device coupled to the controller. The controller is adapted to control an attitude of a vehicle carrying an actuator system that is adapted to pulse in metered bursts in order to generate a control torque to control the attitude of the vehicle in response to a control pulse. The noise screen device is adapted to generate a noise screen signal in response to the control pulse that is generated when an input attitude error signal exceeds a predetermined deadband attitude level. The noise screen signal comprises a decaying offset signal that when combined with the attitude error input signal results in a net attitude error input signal away from the predetermined deadband level to reduce further control pulse generation.
Sugarman, R.M.
1960-08-30
An oscilloscope is designed for displaying transient signal waveforms having random time and amplitude distributions. The oscilloscopc is a sampling device that selects for display a portion of only those waveforms having a particular range of amplitudes. For this purpose a pulse-height analyzer is provided to screen the pulses. A variable voltage-level shifter and a time-scale rampvoltage generator take the pulse height relative to the start of the waveform. The variable voltage shifter produces a voltage level raised one step for each sequential signal waveform to be sampled and this results in an unsmeared record of input signal waveforms. Appropriate delay devices permit each sample waveform to pass its peak amplitude before the circuit selects it for display.
A method for screening climate change-sensitive infectious diseases.
Wang, Yunjing; Rao, Yuhan; Wu, Xiaoxu; Zhao, Hainan; Chen, Jin
2015-01-14
Climate change is a significant and emerging threat to human health, especially where infectious diseases are involved. Because of the complex interactions between climate variables and infectious disease components (i.e., pathogen, host and transmission environment), systematically and quantitatively screening for infectious diseases that are sensitive to climate change is still a challenge. To address this challenge, we propose a new statistical indicator, Relative Sensitivity, to identify the difference between the sensitivity of the infectious disease to climate variables for two different climate statuses (i.e., historical climate and present climate) in non-exposure and exposure groups. The case study in Anhui Province, China has demonstrated the effectiveness of this Relative Sensitivity indicator. The application results indicate significant sensitivity of many epidemic infectious diseases to climate change in the form of changing climatic variables, such as temperature, precipitation and absolute humidity. As novel evidence, this research shows that absolute humidity has a critical influence on many observed infectious diseases in Anhui Province, including dysentery, hand, foot and mouth disease, hepatitis A, hemorrhagic fever, typhoid fever, malaria, meningitis, influenza and schistosomiasis. Moreover, some infectious diseases are more sensitive to climate change in rural areas than in urban areas. This insight provides guidance for future health inputs that consider spatial variability in response to climate change.
A Method for Screening Climate Change-Sensitive Infectious Diseases
Wang, Yunjing; Rao, Yuhan; Wu, Xiaoxu; Zhao, Hainan; Chen, Jin
2015-01-01
Climate change is a significant and emerging threat to human health, especially where infectious diseases are involved. Because of the complex interactions between climate variables and infectious disease components (i.e., pathogen, host and transmission environment), systematically and quantitatively screening for infectious diseases that are sensitive to climate change is still a challenge. To address this challenge, we propose a new statistical indicator, Relative Sensitivity, to identify the difference between the sensitivity of the infectious disease to climate variables for two different climate statuses (i.e., historical climate and present climate) in non-exposure and exposure groups. The case study in Anhui Province, China has demonstrated the effectiveness of this Relative Sensitivity indicator. The application results indicate significant sensitivity of many epidemic infectious diseases to climate change in the form of changing climatic variables, such as temperature, precipitation and absolute humidity. As novel evidence, this research shows that absolute humidity has a critical influence on many observed infectious diseases in Anhui Province, including dysentery, hand, foot and mouth disease, hepatitis A, hemorrhagic fever, typhoid fever, malaria, meningitis, influenza and schistosomiasis. Moreover, some infectious diseases are more sensitive to climate change in rural areas than in urban areas. This insight provides guidance for future health inputs that consider spatial variability in response to climate change. PMID:25594780
Beddows, Andrew V; Kitwiroon, Nutthida; Williams, Martin L; Beevers, Sean D
2017-06-06
Gaussian process emulation techniques have been used with the Community Multiscale Air Quality model, simulating the effects of input uncertainties on ozone and NO 2 output, to allow robust global sensitivity analysis (SA). A screening process ranked the effect of perturbations in 223 inputs, isolating the 30 most influential from emissions, boundary conditions (BCs), and reaction rates. Community Multiscale Air Quality (CMAQ) simulations of a July 2006 ozone pollution episode in the UK were made with input values for these variables plus ozone dry deposition velocity chosen according to a 576 point Latin hypercube design. Emulators trained on the output of these runs were used in variance-based SA of the model output to input uncertainties. Performing these analyses for every hour of a 21 day period spanning the episode and several days on either side allowed the results to be presented as a time series of sensitivity coefficients, showing how the influence of different input uncertainties changed during the episode. This is one of the most complex models to which these methods have been applied, and here, they reveal detailed spatiotemporal patterns of model sensitivities, with NO and isoprene emissions, NO 2 photolysis, ozone BCs, and deposition velocity being among the most influential input uncertainties.
Veligdan, James T.
1997-01-01
An optical display includes a plurality of stacked optical waveguides having first and second opposite ends collectively defining an image input face and an image screen, respectively, with the screen being oblique to the input face. Each of the waveguides includes a transparent core bound by a cladding layer having a lower index of refraction for effecting internal reflection of image light transmitted into the input face to project an image on the screen, with each of the cladding layers including a cladding cap integrally joined thereto at the waveguide second ends. Each of the cores is beveled at the waveguide second end so that the cladding cap is viewable through the transparent core. Each of the cladding caps is black for absorbing external ambient light incident upon the screen for improving contrast of the image projected internally on the screen.
Effects of input device and motion type on a cursor-positioning task.
Yau, Yi-Jan; Hwang, Sheue-Ling; Chao, Chin-Jung
2008-02-01
Many studies have investigated the performance of using nonkey-board input devices under static situations, but few have considered the effects of motion type on manipulating these input devices. In this study comparison of 12 mens' performance using four input devices (three trackballs: currently used, trackman wheel, and erectly held trackballs, as well as a touch screen) under five motion types of static, heave, roll, pitch, and random movements was conducted. The input device and motion type significantly affected movement speed and accuracy, and their interaction significantly affected the movement speed. The touch screen was the fastest but the least accurate input device. The erectly held trackball was the slowest, whereas the error rate of the currently used trackball was the lowest. Impairments of the random motion on movement time and error rate were larger than those of other motion types. Considering objective and subjective evaluations, the trackman wheel and currently used trackball were more efficient in operation than the erectly held trackball and touch screen under the motion environments.
Ransom, Katherine M.; Nolan, Bernard T.; Traum, Jonathan A.; Faunt, Claudia; Bell, Andrew M.; Gronberg, Jo Ann M.; Wheeler, David C.; Zamora, Celia; Jurgens, Bryant; Schwarz, Gregory E.; Belitz, Kenneth; Eberts, Sandra; Kourakos, George; Harter, Thomas
2017-01-01
Intense demand for water in the Central Valley of California and related increases in groundwater nitrate concentration threaten the sustainability of the groundwater resource. To assess contamination risk in the region, we developed a hybrid, non-linear, machine learning model within a statistical learning framework to predict nitrate contamination of groundwater to depths of approximately 500 m below ground surface. A database of 145 predictor variables representing well characteristics, historical and current field and landscape-scale nitrogen mass balances, historical and current land use, oxidation/reduction conditions, groundwater flow, climate, soil characteristics, depth to groundwater, and groundwater age were assigned to over 6000 private supply and public supply wells measured previously for nitrate and located throughout the study area. The boosted regression tree (BRT) method was used to screen and rank variables to predict nitrate concentration at the depths of domestic and public well supplies. The novel approach included as predictor variables outputs from existing physically based models of the Central Valley. The top five most important predictor variables included two oxidation/reduction variables (probability of manganese concentration to exceed 50 ppb and probability of dissolved oxygen concentration to be below 0.5 ppm), field-scale adjusted unsaturated zone nitrogen input for the 1975 time period, average difference between precipitation and evapotranspiration during the years 1971–2000, and 1992 total landscape nitrogen input. Twenty-five variables were selected for the final model for log-transformed nitrate. In general, increasing probability of anoxic conditions and increasing precipitation relative to potential evapotranspiration had a corresponding decrease in nitrate concentration predictions. Conversely, increasing 1975 unsaturated zone nitrogen leaching flux and 1992 total landscape nitrogen input had an increasing relative impact on nitrate predictions. Three-dimensional visualization indicates that nitrate predictions depend on the probability of anoxic conditions and other factors, and that nitrate predictions generally decreased with increasing groundwater age.
Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine
Liu, Xiaobing
2016-09-21
This calculation engine estimates technoeconomic feasibility for transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from input file (input.txt), and list results into output file (output.txt). Both the input and ouput files are in the same folder as the geotool.exe. To use the tool, the input file containing adequate information of the case should be prepared in the format explained below, and the input file should be put into the same folder as geotool.exe. Then the geotool.exe can be executed, which will generate a output.txt file in the same folder containing all key calculation results. The format and content of the output file is explained below as well.
Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident
NASA Astrophysics Data System (ADS)
Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien
2014-05-01
Atmospheric dispersion models are used in response to accidental releases with two purposes: - minimising the population exposure during the accident; - complementing field measurements for the assessment of short and long term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimations of emitted quantities as function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which derives IRSN's operational long distance atmospheric dispersion model ldX. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: - high dimensional inputs; - correlated inputs or inputs with complex structures; - high dimensional output; - multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet, a few variables, such as horizontal diffusion coefficient or clouds thickness, were found to have a weak influence on most of them and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of the model performance computed on a set of gamma dose rates observations. This original approach is of particular interest since observations could be used later to calibrate the input variables probability distributions. Indeed, only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on emission peaks time matching was elaborated in order to complement classical statistical scores which were dominated by deposit dose rates and almost insensitive to lower atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may be sufficient to represent an essential part of the overall uncertainty.
NASA Astrophysics Data System (ADS)
Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao
2017-03-01
Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. Through input variable selection to eliminate the irrelevant or redundant variables, a suitable subset of variables is identified as the input of a model. Meanwhile, through input variable selection the complexity of the model structure is simplified and the computational efficiency is improved. This paper describes the procedures of the input variable selection for the data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS) are applied in this study. Typical data-driven models incorporating support vector machine (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected from the PMI algorithm provide more effective information for the models to measure liquid mass flowrate while the IIS algorithm provides a fewer but more effective variables for the models to predict gas volume fraction.
Evaluation of globally available precipitation data products as input for water balance models
NASA Astrophysics Data System (ADS)
Lebrenz, H.; Bárdossy, A.
2009-04-01
Subject of this study is the evaluation of globally available precipitation data products, which are intended to be used as input variables for water balance models in ungauged basins. The selected data sources are a) the Global Precipitation Climatology Centre (GPCC), b) the Global Precipitation Climatology Project (GPCP) and c) the Climate Research Unit (CRU), resulting into twelve globally available data products. The data products imply different data bases, different derivation routines and varying resolutions in time and space. For validation purposes, the ground data from South Africa were screened on homogeneity and consistency by various tests and an outlier detection using multi-linear regression was performed. External Drift Kriging was subsequently applied on the ground data and the resulting precipitation arrays were compared to the different products with respect to quantity and variance.
2014-09-01
Redesign .................................122 d. Screen 10/Final Review Redesign ........................................123 F. TEST SET- UP INITIAL TEST...user with a chance to review his or her inputs and send the request by his or her preferred method (digital or voice). The screen breaks down the line...user with a chance to review his or her inputs and send the request by his or her preferred method (digital or voice). The screen breaks down the
CARE3MENU- A CARE III USER FRIENDLY INTERFACE
NASA Technical Reports Server (NTRS)
Pierce, J. L.
1994-01-01
CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of number of modules, minimum number of modules for stage operation, and critical fault threshold. The fault handling and fault occurence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: 1) a users guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and 2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.
When Can Information from Ordinal Scale Variables Be Integrated?
ERIC Educational Resources Information Center
Kemp, Simon; Grace, Randolph C.
2010-01-01
Many theoretical constructs of interest to psychologists are multidimensional and derive from the integration of several input variables. We show that input variables that are measured on ordinal scales cannot be combined to produce a stable weakly ordered output variable that allows trading off the input variables. Instead a partial order is…
Cardiac risk stratification in renal transplantation using a form of artificial intelligence.
Heston, T F; Norman, D J; Barry, J M; Bennett, W M; Wilson, R A
1997-02-15
The purpose of this study was to determine if an expert network, a form of artificial intelligence, could effectively stratify cardiac risk in candidates for renal transplant. Input into the expert network consisted of clinical risk factors and thallium-201 stress test data. Clinical risk factor screening alone identified 95 of 189 patients as high risk. These 95 patients underwent thallium-201 stress testing, and 53 had either reversible or fixed defects. The other 42 patients were classified as low risk. This algorithm made up the "expert system," and during the 4-year follow-up period had a sensitivity of 82%, specificity of 77%, and accuracy of 78%. An artificial neural network was added to the expert system, creating an expert network. Input into the neural network consisted of both clinical variables and thallium-201 stress test data. There were 5 hidden nodes and the output (end point) was cardiac death. The expert network increased the specificity of the expert system alone from 77% to 90% (p < 0.001), the accuracy from 78% to 89% (p < 0.005), and maintained the overall sensitivity at 88%. An expert network based on clinical risk factor screening and thallium-201 stress testing had an accuracy of 89% in predicting the 4-year cardiac mortality among 189 renal transplant candidates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naimi, Ladan J.; Collard, Flavien; Bi, Xiaotao
Size reduction is an unavoidable operation for preparing biomass for biofuels and bioproduct conversion. Yet, there is considerable uncertainty in power input requirement and the uniformity of ground biomass. Considerable gains are possible if the required power input for a size reduction ratio is estimated accurately. In this research three well-known mechanistic equations attributed to Rittinger, Kick, and Bond available for predicting energy input for grinding pine wood chips were tested against experimental grinding data. Prior to testing, samples of pine wood chips were conditioned to 11.7% wb, moisture content. The wood chips were successively ground in a hammer millmore » using screen sizes of 25.4 mm, 10 mm, 6.4 mm, and 3.2 mm. The input power and the flow of material into the grinder were recorded continuously. The recorded power input vs. mean particle size showed that the Rittinger equation had the best fit to the experimental data. The ground particle sizes were 4 to 7 times smaller than the size of installed screen. Geometric mean size of particles were calculated using two methods (1) Tyler sieves and using particle size analysis and (2) Sauter mean diameter calculated from the ratio of volume to surface that were estimated from measured length and width. The two mean diameters agreed well, pointing to the fact that either mechanical sieving or particle imaging can be used to characterize particle size. In conclusion, specific energy input to the hammer mill increased from 1.4 kWh t –1 (5.2 J g –1) for large 25.1-mm screen to 25 kWh t –1 (90.4 J g –1) for small 3.2-mm screen.« less
Naimi, Ladan J.; Collard, Flavien; Bi, Xiaotao; ...
2016-01-05
Size reduction is an unavoidable operation for preparing biomass for biofuels and bioproduct conversion. Yet, there is considerable uncertainty in power input requirement and the uniformity of ground biomass. Considerable gains are possible if the required power input for a size reduction ratio is estimated accurately. In this research three well-known mechanistic equations attributed to Rittinger, Kick, and Bond available for predicting energy input for grinding pine wood chips were tested against experimental grinding data. Prior to testing, samples of pine wood chips were conditioned to 11.7% wb, moisture content. The wood chips were successively ground in a hammer millmore » using screen sizes of 25.4 mm, 10 mm, 6.4 mm, and 3.2 mm. The input power and the flow of material into the grinder were recorded continuously. The recorded power input vs. mean particle size showed that the Rittinger equation had the best fit to the experimental data. The ground particle sizes were 4 to 7 times smaller than the size of installed screen. Geometric mean size of particles were calculated using two methods (1) Tyler sieves and using particle size analysis and (2) Sauter mean diameter calculated from the ratio of volume to surface that were estimated from measured length and width. The two mean diameters agreed well, pointing to the fact that either mechanical sieving or particle imaging can be used to characterize particle size. In conclusion, specific energy input to the hammer mill increased from 1.4 kWh t –1 (5.2 J g –1) for large 25.1-mm screen to 25 kWh t –1 (90.4 J g –1) for small 3.2-mm screen.« less
The art of spacecraft design: A multidisciplinary challenge
NASA Technical Reports Server (NTRS)
Abdi, F.; Ide, H.; Levine, M.; Austel, L.
1989-01-01
Actual design turn-around time has become shorter due to the use of optimization techniques which have been introduced into the design process. It seems that what, how and when to use these optimization techniques may be the key factor for future aircraft engineering operations. Another important aspect of this technique is that complex physical phenomena can be modeled by a simple mathematical equation. The new powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives of input variables. Use of the Taylor's series expansion and finite differencing technique for sensitivity derivatives in each discipline makes this approach unique for screening dominant variables from nondominant variables. In this study, the current Computational Fluid Dynamics (CFD) aerodynamic and sensitivity derivative/optimization techniques are applied for a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction in a hypersonic flight condition.
NASA Astrophysics Data System (ADS)
Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish
2018-06-01
Every model to characterise a real world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. By way of a recently developed attribution metric, this study is aimed at developing a method for analysing variability in model inputs together with model structure variability to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments is used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
Granato, Gregory E.
2006-01-01
The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and data in subsequent rows. The user may choose the columns that contain the independent (X) and dependent (Y) variable. A third column, if present, may contain metadata such as the sample-collection location and date. The program screens the input files and plots the data. The KTRLine software is a graphical tool that facilitates development of regression models by use of graphs of the regression line with data, the regression residuals (with X or Y), and percentile plots of the cumulative frequency of the X variable, Y variable, and the regression residuals. The user may individually transform the independent and dependent variables to reduce heteroscedasticity and to linearize data. The program plots the data and the regression line. The program also prints model specifications and regression statistics to the screen. The user may save and print the regression results. The program can accept data sets that contain up to about 15,000 XY data points, but because the program must sort the array of all pairwise slopes, the program may be perceptibly slow with data sets that contain more than about 1,000 points.
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Parrish, Russell V.
1990-01-01
A piloted simulation study was conducted comparing three different input methods for interfacing to a large screen, multiwindow, whole flight deck display for management of transport aircraft systems. The thumball concept utilized a miniature trackball embedded in a conventional side arm controller. The multifunction control throttle and stick (MCTAS) concept employed a thumb switch located in the throttle handle. The touch screen concept provided data entry through a capacitive touch screen installed on the display surface. The objective and subjective results obtained indicate that, with present implementations, the thumball concept was the most appropriate for interfacing with aircraft systems/subsystems presented on a large screen display. Not unexpectedly, the completion time differences between the three concepts varied with the task being performed, although the thumball implementation consistently outperformed the other two concepts. However, pilot suggestions for improved implementations of the MCTAS and touch screen concepts could reduce some of these differences.
Voice and gesture-based 3D multimedia presentation tool
NASA Astrophysics Data System (ADS)
Fukutake, Hiromichi; Akazawa, Yoshiaki; Okada, Yoshihiro
2007-09-01
This paper proposes a 3D multimedia presentation tool that allows the user to manipulate intuitively only through the voice input and the gesture input without using a standard keyboard or a mouse device. The authors developed this system as a presentation tool to be used in a presentation room equipped a large screen like an exhibition room in a museum because, in such a presentation environment, it is better to use voice commands and the gesture pointing input rather than using a keyboard or a mouse device. This system was developed using IntelligentBox, which is a component-based 3D graphics software development system. IntelligentBox has already provided various types of 3D visible, reactive functional components called boxes, e.g., a voice input component and various multimedia handling components. IntelligentBox also provides a dynamic data linkage mechanism called slot-connection that allows the user to develop 3D graphics applications by combining already existing boxes through direct manipulations on a computer screen. Using IntelligentBox, the 3D multimedia presentation tool proposed in this paper was also developed as combined components only through direct manipulations on a computer screen. The authors have already proposed a 3D multimedia presentation tool using a stage metaphor and its voice input interface. This time, we extended the system to make it accept the user gesture input besides voice commands. This paper explains details of the proposed 3D multimedia presentation tool and especially describes its component-based voice and gesture input interfaces.
High-throughput cultivation and screening platform for unicellular phototrophs.
Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus
2014-09-16
High-throughput cultivation and screening methods allow a parallel, miniaturized and cost efficient processing of many samples. These methods however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter-well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration, MALDI-TOF-MS, as well as a novel vitality measurement protocol, have already been established and can be monitored during cultivation. Measurement of growth parameters can be used as inputs for the system to allow for periodic automatic dilutions and therefore a semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid and long term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows for high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established and the platform is easily expandable both in quality, i.e. with further parameters to screen for additional targets and in quantity, i.e. size or number of processed samples.
Klann, Jeffrey G; Anand, Vibha; Downs, Stephen M
2013-12-01
Over 8 years, we have developed an innovative computer decision support system that improves appropriate delivery of pediatric screening and care. This system employs a guidelines evaluation engine using data from the electronic health record (EHR) and input from patients and caregivers. Because guideline recommendations typically exceed the scope of one visit, the engine uses a static prioritization scheme to select recommendations. Here we extend an earlier idea to create patient-tailored prioritization. We used Bayesian structure learning to build networks of association among previously collected data from our decision support system. Using area under the receiver-operating characteristic curve (AUC) as a measure of discriminability (a sine qua non for expected value calculations needed for prioritization), we performed a structural analysis of variables with high AUC on a test set. Our source data included 177 variables for 29 402 patients. The method produced a network model containing 78 screening questions and anticipatory guidance (107 variables total). Average AUC was 0.65, which is sufficient for prioritization depending on factors such as population prevalence. Structure analysis of seven highly predictive variables reveals both face-validity (related nodes are connected) and non-intuitive relationships. We demonstrate the ability of a Bayesian structure learning method to 'phenotype the population' seen in our primary care pediatric clinics. The resulting network can be used to produce patient-tailored posterior probabilities that can be used to prioritize content based on the patient's current circumstances. This study demonstrates the feasibility of EHR-driven population phenotyping for patient-tailored prioritization of pediatric preventive care services.
A neural circuit mechanism for regulating vocal variability during song learning in zebra finches.
Garst-Orozco, Jonathan; Babadi, Baktash; Ölveczky, Bence P
2014-12-15
Motor skill learning is characterized by improved performance and reduced motor variability. The neural mechanisms that couple skill level and variability, however, are not known. The zebra finch, a songbird, presents a unique opportunity to address this question because production of learned song and induction of vocal variability are instantiated in distinct circuits that converge on a motor cortex analogue controlling vocal output. To probe the interplay between learning and variability, we made intracellular recordings from neurons in this area, characterizing how their inputs from the functionally distinct pathways change throughout song development. We found that inputs that drive stereotyped song-patterns are strengthened and pruned, while inputs that induce variability remain unchanged. A simple network model showed that strengthening and pruning of action-specific connections reduces the sensitivity of motor control circuits to variable input and neural 'noise'. This identifies a simple and general mechanism for learning-related regulation of motor variability.
An inexpensive frequency-modulated (FM) audio monitor of time-dependent analog parameters.
Langdon, R B; Jacobs, R S
1980-02-01
The standard method for quantification and presentation of an experimental variable in real time is the use of visual display on the ordinate of an oscilloscope screen or chart recorder. This paper describes a relatively simple electronic circuit, using commercially available and inexpensive integrated circuits (IC), which generates an audible tone, the pitch of which varies in proportion to a running variable of interest. This device, which we call an "Audioscope," can accept as input the monitor output from any instrument that expresses an experimental parameter as a dc voltage. The Audioscope is particularly useful in implanting microelectrodes intracellularly. It may also function to mediate the first step in data recording on magnetic tape, and/or data analysis and reduction by electronic circuitary. We estimate that this device can be built, with two-channel capability, for less than $50, and in less than 10 hr by an experienced electronics technician.
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
NASA Astrophysics Data System (ADS)
Prasad, Ramendra; Deo, Ravinesh C.; Li, Yan; Maraseni, Tek
2017-11-01
Forecasting streamflow is vital for strategically planning, utilizing and redistributing water resources. In this paper, a wavelet-hybrid artificial neural network (ANN) model integrated with iterative input selection (IIS) algorithm (IIS-W-ANN) is evaluated for its statistical preciseness in forecasting monthly streamflow, and it is then benchmarked against M5 Tree model. To develop hybrid IIS-W-ANN model, a global predictor matrix is constructed for three local hydrological sites (Richmond, Gwydir, and Darling River) in Australia's agricultural (Murray-Darling) Basin. Model inputs comprised of statistically significant lagged combination of streamflow water level, are supplemented by meteorological data (i.e., precipitation, maximum and minimum temperature, mean solar radiation, vapor pressure and evaporation) as the potential model inputs. To establish robust forecasting models, iterative input selection (IIS) algorithm is applied to screen the best data from the predictor matrix and is integrated with the non-decimated maximum overlap discrete wavelet transform (MODWT) applied on the IIS-selected variables. This resolved the frequencies contained in predictor data while constructing a wavelet-hybrid (i.e., IIS-W-ANN and IIS-W-M5 Tree) model. Forecasting ability of IIS-W-ANN is evaluated via correlation coefficient (r), Willmott's Index (WI), Nash-Sutcliffe Efficiency (ENS), root-mean-square-error (RMSE), and mean absolute error (MAE), including the percentage RMSE and MAE. While ANN models are seen to outperform M5 Tree executed for all hydrological sites, the IIS variable selector was efficient in determining the appropriate predictors, as stipulated by the better performance of the IIS coupled (ANN and M5 Tree) models relative to the models without IIS. When IIS-coupled models are integrated with MODWT, the wavelet-hybrid IIS-W-ANN and IIS-W-M5 Tree are seen to attain significantly accurate performance relative to their standalone counterparts. Importantly, IIS-W-ANN model accuracy outweighs IIS-ANN, as evidenced by a larger r and WI (by 7.5% and 3.8%, respectively) and a lower RMSE (by 21.3%). In comparison to the IIS-W-M5 Tree model, IIS-W-ANN model yielded larger values of WI = 0.936-0.979 and ENS = 0.770-0.920. Correspondingly, the errors (RMSE and MAE) ranged from 0.162-0.487 m and 0.139-0.390 m, respectively, with relative errors, RRMSE = (15.65-21.00) % and MAPE = (14.79-20.78) %. Distinct geographic signature is evident where the most and least accurately forecasted streamflow data is attained for the Gwydir and Darling River, respectively. Conclusively, this study advocates the efficacy of iterative input selection, allowing the proper screening of model predictors, and subsequently, its integration with MODWT resulting in enhanced performance of the models applied in streamflow forecasting.
Bottom-up and Top-down Input Augment the Variability of Cortical Neurons
Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.
2016-01-01
SUMMARY Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459
Troutman, Brent M.
1982-01-01
Errors in runoff prediction caused by input data errors are analyzed by treating precipitation-runoff models as regression (conditional expectation) models. Independent variables of the regression consist of precipitation and other input measurements; the dependent variable is runoff. In models using erroneous input data, prediction errors are inflated and estimates of expected storm runoff for given observed input variables are biased. This bias in expected runoff estimation results in biased parameter estimates if these parameter estimates are obtained by a least squares fit of predicted to observed runoff values. The problems of error inflation and bias are examined in detail for a simple linear regression of runoff on rainfall and for a nonlinear U.S. Geological Survey precipitation-runoff model. Some implications for flood frequency analysis are considered. A case study using a set of data from Turtle Creek near Dallas, Texas illustrates the problems of model input errors.
Control Board Digital Interface Input Devices – Touchscreen, Trackpad, or Mouse?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas A. Ulrich; Ronald L. Boring; Roger Lew
The authors collaborated with a power utility to evaluate input devices for use in the human system interface (HSI) for a new digital Turbine Control System (TCS) at a nuclear power plant (NPP) undergoing a TCS upgrade. A standalone dynamic software simulation of the new digital TCS and a mobile kiosk were developed to conduct an input device study to evaluate operator preference and input device effectiveness. The TCS software presented the anticipated HSI for the TCS and mimicked (i.e., simulated) the turbine systems’ responses to operator commands. Twenty-four licensed operators from the two nuclear power units participated in themore » study. Three input devices were tested: a trackpad, mouse, and touchscreen. The subjective feedback from the survey indicates the operators preferred the touchscreen interface. The operators subjectively rated the touchscreen as the fastest and most comfortable input device given the range of tasks they performed during the study, but also noted a lack of accuracy for selecting small targets. The empirical data suggest the mouse input device provides the most consistent performance for screen navigation and manipulating on screen controls. The trackpad input device was both empirically and subjectively found to be the least effective and least desired input device.« less
Simulation models in population breast cancer screening: A systematic review.
Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H
2015-08-01
The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment which incorporated model type; input parameters; modeling approach, transparency of input data sources/assumptions, sensitivity analyses and risk of bias; validation, and outcomes was developed. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized control trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except one model) with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress were not explicitly modeled. The models tended to overestimate the MR (11-24%) due to screening as compared to optimal RCTs 10% (95% CI - 2-21%) MR. Only recently, potential harms due to regular breast cancer screening were reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematical evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.
Particle parameter analyzing system. [x-y plotter circuits and display
NASA Technical Reports Server (NTRS)
Hansen, D. O.; Roy, N. L. (Inventor)
1969-01-01
An X-Y plotter circuit apparatus is described which displays an input pulse representing particle parameter information, that would ordinarily appear on the screen of an oscilloscope as a rectangular pulse, as a single dot positioned on the screen where the upper right hand corner of the input pulse would have appeared. If another event occurs, and it is desired to display this event, the apparatus is provided to replace the dot with a short horizontal line.
Diet shift of lentic dragonfly larvae in response to reduced terrestrial prey subsidies
Kraus, Johanna M.
2010-01-01
Inputs of terrestrial plant detritus and nutrients play an important role in aquatic food webs, but the importance of terrestrial prey inputs in determining aquatic predator distribution and abundance has been appreciated only recently. I examined the numerical, biomass, and diet responses of a common predator, dragonfly larvae, to experimental reduction of terrestrial arthropod input into ponds. I distributed paired enclosures (n = 7), one with a screen between the land and water (reduced subsidy) and one without a screen (ambient subsidy), near the shoreline of 2 small fishless ponds and sampled each month during the growing season in the southern Appalachian Mountains, Virginia (USA). Screens between water and land reduced the number of terrestrial arthropods that fell into screened enclosures relative to the number that fell into unscreened enclosures and open reference plots by 36%. The δ13C isotopic signatures of dragonfly larvae shifted towards those of aquatic prey in reduced-subsidy enclosures, a result suggesting that dragonflies consumed fewer terrestrial prey when fewer were available (ambient subsidy: 30%, reduced subsidy: 19% of diet). Overall abundance and biomass of dragonfly larvae did not change in response to reduced terrestrial arthropod inputs, despite the fact that enclosures permitted immigration/emigration. These results suggest that terrestrial arthropods can provide resources to aquatic predators in lentic systems, but that their effects on abundance and distribution might be subtle and confounded by in situ factors.
Variable Screening for Cluster Analysis.
ERIC Educational Resources Information Center
Donoghue, John R.
Inclusion of irrelevant variables in a cluster analysis adversely affects subgroup recovery. This paper examines using moment-based statistics to screen variables; only variables that pass the screening are then used in clustering. Normal mixtures are analytically shown often to possess negative kurtosis. Two related measures, "m" and…
2014-04-01
surrogate model generation is difficult for high -dimensional problems, due to the curse of dimensionality. Variable screening methods have been...a variable screening model was developed for the quasi-molecular treatment of ion-atom collision [16]. In engineering, a confidence interval of...for high -level radioactive waste [18]. Moreover, the design sensitivity method can be extended to the variable screening method because vital
Input-variable sensitivity assessment for sediment transport relations
NASA Astrophysics Data System (ADS)
Fernández, Roberto; Garcia, Marcelo H.
2017-09-01
A methodology to assess input-variable sensitivity for sediment transport relations is presented. The Mean Value First Order Second Moment Method (MVFOSM) is applied to two bed load transport equations showing that it may be used to rank all input variables in terms of how their specific variance affects the overall variance of the sediment transport estimation. In sites where data are scarce or nonexistent, the results obtained may be used to (i) determine what variables would have the largest impact when estimating sediment loads in the absence of field observations and (ii) design field campaigns to specifically measure those variables for which a given transport equation is most sensitive; in sites where data are readily available, the results would allow quantifying the effect that the variance associated with each input variable has on the variance of the sediment transport estimates. An application of the method to two transport relations using data from a tropical mountain river in Costa Rica is implemented to exemplify the potential of the method in places where input data are limited. Results are compared against Monte Carlo simulations to assess the reliability of the method and validate its results. For both of the sediment transport relations used in the sensitivity analysis, accurate knowledge of sediment size was found to have more impact on sediment transport predictions than precise knowledge of other input variables such as channel slope and flow discharge.
A Multifactor Approach to Research in Instructional Technology.
ERIC Educational Resources Information Center
Ragan, Tillman J.
In a field such as instructional design, explanations of educational outcomes must necessarily consider multiple input variables. To adequately understand the contribution made by the independent variables, it is helpful to have a visual conception of how the input variables interrelate. Two variable models are adequately represented by a two…
Predicting high-risk preterm birth using artificial neural networks.
Catley, Christina; Frize, Monique; Walker, C Robin; Petriu, Dorina C
2006-07-01
A reengineered approach to the early prediction of preterm birth is presented as a complimentary technique to the current procedure of using costly and invasive clinical testing on high-risk maternal populations. Artificial neural networks (ANNs) are employed as a screening tool for preterm birth on a heterogeneous maternal population; risk estimations use obstetrical variables available to physicians before 23 weeks gestation. The objective was to assess if ANNs have a potential use in obstetrical outcome estimations in low-risk maternal populations. The back-propagation feedforward ANN was trained and tested on cases with eight input variables describing the patient's obstetrical history; the output variables were: 1) preterm birth; 2) high-risk preterm birth; and 3) a refined high-risk preterm birth outcome excluding all cases where resuscitation was delivered in the form of free flow oxygen. Artificial training sets were created to increase the distribution of the underrepresented class to 20%. Training on the refined high-risk preterm birth model increased the network's sensitivity to 54.8%, compared to just over 20% for the nonartificially distributed preterm birth model.
78 FR 31967 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
... Historic Preservation Environmental Screening Form. DATES: Comments must be submitted on or before July 29... authorities through one consolidated process. With input from grantees, the EHP Screening Form was revised for clarity and ease of use. The 2013 Screening Form does not require any new information, and includes an...
The Role and Design of Screen Images in Software Documentation.
ERIC Educational Resources Information Center
van der Meij, Hans
2000-01-01
Discussion of learning a new computer software program focuses on how to support the joint handling of a manual, input devices, and screen display. Describes a study that examined three design styles for manuals that included screen images to reduce split-attention problems and discusses theory versus practice and cognitive load theory.…
Recknagel, Friedrich; Orr, Philip T; Cao, Hongqing
2014-01-01
Seven-day-ahead forecasting models of Cylindrospermopsis raciborskii in three warm-monomictic and mesotrophic reservoirs in south-east Queensland have been developed by means of water quality data from 1999 to 2010 and the hybrid evolutionary algorithm HEA. Resulting models using all measured variables as inputs as well as models using electronically measurable variables only as inputs forecasted accurately timing of overgrowth of C. raciborskii and matched well high and low magnitudes of observed bloom events with 0.45≤r 2 >0.61 and 0.4≤r 2 >0.57, respectively. The models also revealed relationships and thresholds triggering bloom events that provide valuable information on synergism between water quality conditions and population dynamics of C. raciborskii. Best performing models based on using all measured variables as inputs indicated electrical conductivity (EC) within the range of 206-280mSm -1 as threshold above which fast growth and high abundances of C. raciborskii have been observed for the three lakes. Best models based on electronically measurable variables for the Lakes Wivenhoe and Somerset indicated a water temperature (WT) range of 25.5-32.7°C within which fast growth and high abundances of C. raciborskii can be expected. By contrast the model for Lake Samsonvale highlighted a turbidity (TURB) level of 4.8 NTU as indicator for mass developments of C. raciborskii. Experiments with online measured water quality data of the Lake Wivenhoe from 2007 to 2010 resulted in predictive models with 0.61≤r 2 >0.65 whereby again similar levels of EC and WT have been discovered as thresholds for outgrowth of C. raciborskii. The highest validity of r 2 =0.75 for an in situ data-based model has been achieved after considering time lags for EC by 7 days and dissolved oxygen by 1 day. These time lags have been discovered by a systematic screening of all possible combinations of time lags between 0 and 10 days for all electronically measurable variables. The so-developed model performs seven-day-ahead forecasts and is currently implemented and tested for early warning of C. raciborskii blooms in the Wivenhoe reservoir. Copyright © 2013 Elsevier B.V. All rights reserved.
A designed screening study with prespecified combinations of factor settings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson-cook, Christine M; Robinson, Timothy J
2009-01-01
In many applications, the experimenter has limited options about what factor combinations can be chosen for a designed study. Consider a screening study for a production process involving five input factors whose levels have been previously established. The goal of the study is to understand the effect of each factor on the response, a variable that is expensive to measure and results in destruction of the part. From an inventory of available parts with known factor values, we wish to identify a best collection of factor combinations with which to estimate the factor effects. Though the observational nature of themore » study cannot establish a causal relationship involving the response and the factors, the study can increase understanding of the underlying process. The study can also help determine where investment should be made to control input factors during production that will maximally influence the response. Because the factor combinations are observational, the chosen model matrix will be nonorthogonal and will not allow independent estimation of factor effects. In this manuscript we borrow principles from design of experiments to suggest an 'optimal' selection of factor combinations. Specifically, we consider precision of model parameter estimates, the issue of replication, and abilities to detect lack of fit and to estimate two-factor interactions. Through an example, we present strategies for selecting a subset of factor combinations that simultaneously balance multiple objectives, conduct a limited sensitivity analysis, and provide practical guidance for implementing our techniques across a variety of quality engineering disciplines.« less
Flight dynamics analysis and simulation of heavy lift airships, volume 4. User's guide: Appendices
NASA Technical Reports Server (NTRS)
Emmen, R. D.; Tischler, M. B.
1982-01-01
This table contains all of the input variables to the three programs. The variables are arranged according to the name list groups in which they appear in the data files. The program name, subroutine name, definition and, where appropriate, a default input value and any restrictions are listed with each variable. The default input values are user supplied, not generated by the computer. These values remove a specific effect from the calculations, as explained in the table. The phrase "not used' indicates that a variable is not used in the calculations and are for identification purposes only. The engineering symbol, where it exists, is listed to assist the user in correlating these inputs with the discussion in the Technical Manual.
Production Function Geometry with "Knightian" Total Product
ERIC Educational Resources Information Center
Truett, Dale B.; Truett, Lila J.
2007-01-01
Authors of principles and price theory textbooks generally illustrate short-run production using a total product curve that displays first increasing and then diminishing marginal returns to employment of the variable input(s). Although it seems reasonable that a temporary range of increasing returns to variable inputs will likely occur as…
Trogdon, Justin G.; Subramanian, Sujha; Crouse, Wesley
2018-01-01
This study investigates the existence of economies of scale in the provision of breast and cervical cancer screening and diagnostic services by state National Breast and Cervical Cancer Early Detection Program (NBCCEDP) grantees. A translog cost function is estimated as a system with input factor share equations. The estimated cost function is then used to determine output levels for which average costs are decreasing (i.e., economies of scale exist). Data were collected from all state NBCCEDP programs and District of Columbia for program years 2006–2007, 2008–2009 and 2009–2010 (N =147). Costs included all programmatic and in-kind contributions from federal and non-federal sources, allocated to breast and cervical cancer screening activities. Output was measured by women served, women screened and cancers detected, separately by breast and cervical services for each measure. Inputs included labor, rent and utilities, clinical services, and quasi-fixed factors (e.g., percent of women eligible for screening by the NBCCEDP). 144 out of 147 program-years demonstrated significant economies of scale for women served and women screened; 136 out of 145 program-years displayed significant economies of scale for cancers detected. The cost data were self-reported by the NBCCEDP State programs. Quasi-fixed inputs were allowed to affect costs but not economies of scale or the share equations. The main analysis accounted for clustering of observations within State programs, but it did not make full use of the panel data. The average cost of providing breast and cervical cancer screening services decreases as the number of women screened and served increases. PMID:24326873
Applications of information theory, genetic algorithms, and neural models to predict oil flow
NASA Astrophysics Data System (ADS)
Ludwig, Oswaldo; Nunes, Urbano; Araújo, Rui; Schnitman, Leizer; Lepikson, Herman Augusto
2009-07-01
This work introduces a new information-theoretic methodology for choosing variables and their time lags in a prediction setting, particularly when neural networks are used in non-linear modeling. The first contribution of this work is the Cross Entropy Function (XEF) proposed to select input variables and their lags in order to compose the input vector of black-box prediction models. The proposed XEF method is more appropriate than the usually applied Cross Correlation Function (XCF) when the relationship among the input and output signals comes from a non-linear dynamic system. The second contribution is a method that minimizes the Joint Conditional Entropy (JCE) between the input and output variables by means of a Genetic Algorithm (GA). The aim is to take into account the dependence among the input variables when selecting the most appropriate set of inputs for a prediction problem. In short, theses methods can be used to assist the selection of input training data that have the necessary information to predict the target data. The proposed methods are applied to a petroleum engineering problem; predicting oil production. Experimental results obtained with a real-world dataset are presented demonstrating the feasibility and effectiveness of the method.
Harvey, Catherine; Stanton, Neville A; Pickering, Carl A; McDonald, Mike; Zheng, Pengjun
2011-07-01
In-vehicle information systems (IVIS) can be controlled by the user via direct or indirect input devices. In order to develop the next generation of usable IVIS, designers need to be able to evaluate and understand the usability issues associated with these two input types. The aim of this study was to investigate the effectiveness of a set of empirical usability evaluation methods for identifying important usability issues and distinguishing between the IVIS input devices. A number of usability issues were identified and their causal factors have been explored. These were related to the input type, the structure of the menu/tasks and hardware issues. In particular, the translation between inputs and on-screen actions and a lack of visual feedback for menu navigation resulted in lower levels of usability for the indirect device. This information will be useful in informing the design of new IVIS, with improved usability. STATEMENT OF RELEVANCE: This paper examines the use of empirical methods for distinguishing between direct and indirect IVIS input devices and identifying usability issues. Results have shown that the characteristics of indirect input devices produce more serious usability issues, compared with direct devices and can have a negative effect on the driver-vehicle interaction.
Nonequilibrium air radiation (Nequair) program: User's manual
NASA Technical Reports Server (NTRS)
Park, C.
1985-01-01
A supplement to the data relating to the calculation of nonequilibrium radiation in flight regimes of aeroassisted orbital transfer vehicles contains the listings of the computer code NEQAIR (Nonequilibrium Air Radiation), its primary input data, and explanation of the user-supplied input variables. The user-supplied input variables are the thermodynamic variables of air at a given point, i.e., number densities of various chemical species, translational temperatures of heavy particles and electrons, and vibrational temperature. These thermodynamic variables do not necessarily have to be in thermodynamic equilibrium. The code calculates emission and absorption characteristics of air under these given conditions.
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters are often happened in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples suggest the effectiveness and validation of our analytic method in the analysis of general models. A practical application of the method is also proposed to the uncertainty and sensitivity analysis of a deterministic HIV model.
Harmonize input selection for sediment transport prediction
NASA Astrophysics Data System (ADS)
Afan, Haitham Abdulmohsin; Keshtegar, Behrooz; Mohtar, Wan Hanna Melini Wan; El-Shafie, Ahmed
2017-09-01
In this paper, three modeling approaches using a Neural Network (NN), Response Surface Method (RSM) and response surface method basis Global Harmony Search (GHS) are applied to predict the daily time series suspended sediment load. Generally, the input variables for forecasting the suspended sediment load are manually selected based on the maximum correlations of input variables in the modeling approaches based on NN and RSM. The RSM is improved to select the input variables by using the errors terms of training data based on the GHS, namely as response surface method and global harmony search (RSM-GHS) modeling method. The second-order polynomial function with cross terms is applied to calibrate the time series suspended sediment load with three, four and five input variables in the proposed RSM-GHS. The linear, square and cross corrections of twenty input variables of antecedent values of suspended sediment load and water discharge are investigated to achieve the best predictions of the RSM based on the GHS method. The performances of the NN, RSM and proposed RSM-GHS including both accuracy and simplicity are compared through several comparative predicted and error statistics. The results illustrated that the proposed RSM-GHS is as uncomplicated as the RSM but performed better, where fewer errors and better correlation was observed (R = 0.95, MAE = 18.09 (ton/day), RMSE = 25.16 (ton/day)) compared to the ANN (R = 0.91, MAE = 20.17 (ton/day), RMSE = 33.09 (ton/day)) and RSM (R = 0.91, MAE = 20.06 (ton/day), RMSE = 31.92 (ton/day)) for all types of input variables.
Akimoto, Yuki; Yugi, Katsuyuki; Uda, Shinsuke; Kudo, Takamasa; Komori, Yasunori; Kubota, Hiroyuki; Kuroda, Shinya
2013-01-01
Cells use common signaling molecules for the selective control of downstream gene expression and cell-fate decisions. The relationship between signaling molecules and downstream gene expression and cellular phenotypes is a multiple-input and multiple-output (MIMO) system and is difficult to understand due to its complexity. For example, it has been reported that, in PC12 cells, different types of growth factors activate MAP kinases (MAPKs) including ERK, JNK, and p38, and CREB, for selective protein expression of immediate early genes (IEGs) such as c-FOS, c-JUN, EGR1, JUNB, and FOSB, leading to cell differentiation, proliferation and cell death; however, how multiple-inputs such as MAPKs and CREB regulate multiple-outputs such as expression of the IEGs and cellular phenotypes remains unclear. To address this issue, we employed a statistical method called partial least squares (PLS) regression, which involves a reduction of the dimensionality of the inputs and outputs into latent variables and a linear regression between these latent variables. We measured 1,200 data points for MAPKs and CREB as the inputs and 1,900 data points for IEGs and cellular phenotypes as the outputs, and we constructed the PLS model from these data. The PLS model highlighted the complexity of the MIMO system and growth factor-specific input-output relationships of cell-fate decisions in PC12 cells. Furthermore, to reduce the complexity, we applied a backward elimination method to the PLS regression, in which 60 input variables were reduced to 5 variables, including the phosphorylation of ERK at 10 min, CREB at 5 min and 60 min, AKT at 5 min and JNK at 30 min. The simple PLS model with only 5 input variables demonstrated a predictive ability comparable to that of the full PLS model. The 5 input variables effectively extracted the growth factor-specific simple relationships within the MIMO system in cell-fate decisions in PC12 cells.
Makoul, Gregory; Cameron, Kenzie A; Baker, David W; Francis, Lee; Scholtens, Denise; Wolf, Michael S
2009-08-01
To test a multimedia patient education program on colorectal cancer (CRC) screening that was designed specifically for the Hispanic/Latino community, and developed with input from community members. A total of 270 Hispanic/Latino adults, age 50-80 years, participated in Spanish for all phases of this pretest-posttest design. Patients were randomly assigned to a version of the multimedia program that opened with either a positive or negative introductory appeal. Structured interviews assessed screening relevant knowledge (anatomy and key terms, screening options, and risk information), past screening behavior, willingness to consider screening options, intention to discuss CRC screening with the doctor, and reactions to the multimedia patient education program. The multimedia program significantly increased knowledge of anatomy and key terms (e.g., polyp), primary screening options (FOBT, flexible sigmoidoscopy, colonoscopy), and risk information as well as willingness to consider screening (p<.001 for all). No significant differences emerged between positive and negative introductory appeals on these measures, intention to discuss CRC screening with their doctor, or rating the multimedia program. Multimedia tools developed with community input that are designed to present important health messages using graphics and audio can reach Hispanic/Latino adults across literacy levels and ethnic backgrounds. Additional research is needed to determine effects on actual screening behavior. Despite promising results for engaging a difficult-to-reach audience, the multimedia program should not be considered a stand-alone intervention or a substitute for communication with physicians. Rather, it is a priming mechanism intended to prepare patients for productive discussions of CRC screening.
Jackson, B Scott
2004-10-01
Many different types of integrate-and-fire models have been designed in order to explain how it is possible for a cortical neuron to integrate over many independent inputs while still producing highly variable spike trains. Within this context, the variability of spike trains has been almost exclusively measured using the coefficient of variation of interspike intervals. However, another important statistical property that has been found in cortical spike trains and is closely associated with their high firing variability is long-range dependence. We investigate the conditions, if any, under which such models produce output spike trains with both interspike-interval variability and long-range dependence similar to those that have previously been measured from actual cortical neurons. We first show analytically that a large class of high-variability integrate-and-fire models is incapable of producing such outputs based on the fact that their output spike trains are always mathematically equivalent to renewal processes. This class of models subsumes a majority of previously published models, including those that use excitation-inhibition balance, correlated inputs, partial reset, or nonlinear leakage to produce outputs with high variability. Next, we study integrate-and-fire models that have (nonPoissonian) renewal point process inputs instead of the Poisson point process inputs used in the preceding class of models. The confluence of our analytical and simulation results implies that the renewal-input model is capable of producing high variability and long-range dependence comparable to that seen in spike trains recorded from cortical neurons, but only if the interspike intervals of the inputs have infinite variance, a physiologically unrealistic condition. Finally, we suggest a new integrate-and-fire model that does not suffer any of the previously mentioned shortcomings. By analyzing simulation results for this model, we show that it is capable of producing output spike trains with interspike-interval variability and long-range dependence that match empirical data from cortical spike trains. This model is similar to the other models in this study, except that its inputs are fractional-gaussian-noise-driven Poisson processes rather than renewal point processes. In addition to this model's success in producing realistic output spike trains, its inputs have long-range dependence similar to that found in most subcortical neurons in sensory pathways, including the inputs to cortex. Analysis of output spike trains from simulations of this model also shows that a tight balance between the amounts of excitation and inhibition at the inputs to cortical neurons is not necessary for high interspike-interval variability at their outputs. Furthermore, in our analysis of this model, we show that the superposition of many fractional-gaussian-noise-driven Poisson processes does not approximate a Poisson process, which challenges the common assumption that the total effect of a large number of inputs on a neuron is well represented by a Poisson process.
Covey, Curt; Lucas, Donald D.; Tannahill, John; ...
2013-07-01
Modern climate models contain numerous input parameters, each with a range of possible values. Since the volume of parameter space increases exponentially with the number of parameters N, it is generally impossible to directly evaluate a model throughout this space even if just 2-3 values are chosen for each parameter. Sensitivity screening algorithms, however, can identify input parameters having relatively little effect on a variety of output fields, either individually or in nonlinear combination.This can aid both model development and the uncertainty quantification (UQ) process. Here we report results from a parameter sensitivity screening algorithm hitherto untested in climate modeling,more » the Morris one-at-a-time (MOAT) method. This algorithm drastically reduces the computational cost of estimating sensitivities in a high dimensional parameter space because the sample size grows linearly rather than exponentially with N. It nevertheless samples over much of the N-dimensional volume and allows assessment of parameter interactions, unlike traditional elementary one-at-a-time (EOAT) parameter variation. We applied both EOAT and MOAT to the Community Atmosphere Model (CAM), assessing CAM’s behavior as a function of 27 uncertain input parameters related to the boundary layer, clouds, and other subgrid scale processes. For radiation balance at the top of the atmosphere, EOAT and MOAT rank most input parameters similarly, but MOAT identifies a sensitivity that EOAT underplays for two convection parameters that operate nonlinearly in the model. MOAT’s ranking of input parameters is robust to modest algorithmic variations, and it is qualitatively consistent with model development experience. Supporting information is also provided at the end of the full text of the article.« less
Poleshuck, Ellen; Wittink, Marsha; Crean, Hugh; Gellasch, Tara; Sandler, Mardy; Bell, Elaine; Juskiewicz, Iwona; Cerulli, Catherine
2015-07-01
Significant health disparities exist among socioeconomically disadvantaged women, who experience elevated rates of depression and increased risk for poor depression treatment engagement and outcomes. We aimed to use stakeholder input to develop innovative methods for a comparative effectiveness trial to address the needs of socioeconomically disadvantaged women with depression in women's health practices. Using a community advisory board, focus groups, and individual patient input, we determined the feasibility and acceptability of an electronic psychosocial screening and referral tool; developed and finalized a prioritization tool for women with depression; and piloted the prioritization tool. Two intervention approaches, enhanced screening and referral using an electronic psychosocial screening, and mentoring using the prioritization tool, were developed as intervention options for socioeconomically disadvantaged women attending women's health practices. We describe the developmental steps and the final design for the comparative effectiveness trial evaluating both intervention approaches. Stakeholder input allowed us to develop an acceptable clinical trial of two patient-centered interventions with patient-driven outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.
Influential input classification in probabilistic multimedia models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.
1999-05-01
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions onemore » should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.« less
Kietrys, David M; Gerg, Michael J; Dropkin, Jonathan; Gold, Judith E
2015-09-01
This study aimed to determine the effects of input device type, texting style, and screen size on upper extremity and trapezius muscle activity and cervical posture during a short texting task in college students. Users of a physical keypad produced greater thumb, finger flexor, and wrist extensor muscle activity than when texting with a touch screen device of similar dimensions. Texting on either device produced greater wrist extensor muscle activity when texting with 1 hand/thumb compared with both hands/thumbs. As touch screen size increased, more participants held the device on their lap, and chose to use both thumbs less. There was also a trend for greater finger flexor, wrist extensor, and trapezius muscle activity as touch screen size increased, and for greater cervical flexion, although mean differences for cervical flexion were small. Future research can help inform whether the ergonomic stressors observed during texting are associated with musculoskeletal disorder risk. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Nguyen, Thi-Phuong-Lan; Wright, E. Pamela; Nguyen, Thanh-Trung; Schuiling-Veninga, C. C. M.; Bijlsma, M. J.; Nguyen, Thi-Bach-Yen; Postma, M. J.
2016-01-01
Objective To inform development of guidelines for hypertension management in Vietnam, we evaluated the cost-effectiveness of different strategies on screening for hypertension in preventing cardiovascular disease (CVD). Methods A decision tree was combined with a Markov model to measure incremental cost-effectiveness of different approaches to hypertension screening. Values used as input parameters for the model were taken from different sources. Various screening intervals (one-off, annually, biannually) and starting ages to screen (35, 45 or 55 years) and coverage of treatment were analysed. We ran both a ten-year and a lifetime horizon. Input parameters for the models were extracted from local and regional data. Probabilistic sensitivity analysis was used to evaluate parameter uncertainty. A threshold of three times GDP per capita was applied. Results Cost per quality adjusted life year (QALY) gained varied in different screening scenarios. In a ten-year horizon, the cost-effectiveness of screening for hypertension ranged from cost saving to Int$ 758,695 per QALY gained. For screening of men starting at 55 years, all screening scenarios gave a high probability of being cost-effective. For screening of females starting at 55 years, the probability of favourable cost-effectiveness was 90% with one-off screening. In a lifetime horizon, cost per QALY gained was lower than the threshold of Int$ 15,883 in all screening scenarios among males. Similar results were found in females when starting screening at 55 years. Starting screening in females at 45 years had a high probability of being cost-effective if screening biannually was combined with increasing coverage of treatment by 20% or even if sole biannual screening was considered. Conclusion From a health economic perspective, integrating screening for hypertension into routine medical examination and related coverage by health insurance could be recommended. Screening for hypertension has a high probability of being cost-effective in preventing CVD. An adequate screening strategy can best be selected based on age, sex and screening interval. PMID:27192051
The role of acculturation and collectivism in cancer screening for Vietnamese American women.
Nguyen, Anh B; Clark, Trenette T
2014-01-01
The aim of this study was to examine the influence of demographic variables and the interplay between collectivism and acculturation on breast and cervical cancer screening outcomes among Vietnamese American women. Convenience sampling was used to recruit 111 Vietnamese women from the Richmond, VA, metropolitan area, who participated in a larger cancer screening intervention. All participants completed measures on demographic variables, collectivism, acculturation, and cancer-screening-related variables (i.e., attitudes, self-efficacy, and screening behavior). Findings indicated that collectivism predicted both positive attitudes and higher levels of self-efficacy with regard to breast and cervical cancer screening. Collectivism also moderated the relationship between acculturation and attitudes toward breast cancer screening such that for women with low levels of collectivistic orientation, increasing acculturation predicted less positive attitudes towards breast cancer screening. This relationship was not found for women with high levels of collectivistic orientation. The current findings highlight the important roles that sociodemographic and cultural variables play in affecting health attitudes, self-efficacy, and behavior among Vietnamese women. The findings potentially inform screening programs that rely on culturally relevant values in helping increase Vietnamese women's motivation to screen.
The Role of Acculturation and Collectivism in Cancer Screening for Vietnamese American Women
NGUYEN, ANH B.; CLARK, TRENETTE T.
2017-01-01
The aim of this study was to examine the influence of demographic variables and the interplay between collectivism and acculturation on breast and cervical cancer screening outcomes among Vietnamese American women. Convenience sampling was used to recruit 111 Vietnamese women from the Richmond, VA, metropolitan area, who participated in a larger cancer screening intervention. All participants completed measures on demographic variables, collectivism, acculturation, and cancer-screening-related variables (i.e., attitudes, self-efficacy, and screening behavior). Findings indicated that collectivism predicted both positive attitudes and higher levels of self-efficacy with regard to breast and cervical cancer screening. Collectivism also moderated the relationship between acculturation and attitudes toward breast cancer screening such that for women with low levels of collectivistic orientation, increasing acculturation predicted less positive attitudes towards breast cancer screening. This relationship was not found for women with high levels of collectivistic orientation. The current findings highlight the important roles that sociodemographic and cultural variables play in affecting health attitudes, self-efficacy, and behavior among Vietnamese women. The findings potentially inform screening programs that rely on culturally relevant values in helping increase Vietnamese women’s motivation to screen. PMID:24313445
Peer Educators and Close Friends as Predictors of Male College Students' Willingness to Prevent Rape
ERIC Educational Resources Information Center
Stein, Jerrold L.
2007-01-01
Astin's (1977, 1991, 1993) input-environment-outcome (I-E-O) model provided a conceptual framework for this study which measured 156 male college students' willingness to prevent rape (outcome variable). Predictor variables included personal attitudes (input variable), perceptions of close friends' attitudes toward rape and rape prevention…
The Effects of a Change in the Variability of Irrigation Water
NASA Astrophysics Data System (ADS)
Lyon, Kenneth S.
1983-10-01
This paper examines the short-run effects upon several variables of an increase in the variability of an input. The measure of an increase in the variability is the "mean preserving spread" suggested by Rothschild and Stiglitz (1970). The variables examined are real income (utility), expected profits, expected output, the quantity used of the controllable input, and the shadow price of the stochastic input. Four striking features of the results follow: (1) The concepts that have been useful in summarizing deterministic comparative static results are nearly absent when an input is stochastic. (2) Most of the signs of the partial derivatives depend upon more than concavity of the utility and production functions. (3) If the utility function is not "too" risk averse, then the risk-neutral results hold for the risk-aversion case. (4) If the production function is Cobb-Douglas, then definite results are achieved if the utility function is linear or if the "degree of risk-aversion" is "small."
Stuckey, Marla H.
2008-01-01
The Water Resources Planning Act, Act 220 of 2002, requires the Pennsylvania Department of Environmental Protection (PaDEP) to update the State Water Plan by 2008. As part of this update, a water-analysis screening tool (WAST) was developed by the U.S. Geological Survey, in cooperation with the PaDEP, to provide assistance to the state in the identification of critical water-planning areas. The WAST has two primary inputs: net withdrawals and the initial screening criteria. A comprehensive water-use database that includes data from registration, estimation, discharge monitoring reports, mining data, and other sources was developed as input into the WAST. Water use in the following categories was estimated using water-use factors: residential, industrial, commercial, agriculture, and golf courses. A percentage of the 7-day, 10-year low flow is used for the initial screenings using the WAST to identify potential critical water-planning areas. This quantity, or initial screening criteria, is 50 percent of the 7-day, 10-year low flow for most streams. Using a basic water-balance equation, a screening indicator is calculated that indicates the potential influences of net withdrawals on aquatic-resource uses for watersheds generally larger than 15 square miles. Points representing outlets of these watersheds are colored-coded within the WAST to show the screening criteria for each watershed.
Optimal allocation of testing resources for statistical simulations
NASA Astrophysics Data System (ADS)
Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick
2015-07-01
Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
Input variable selection and calibration data selection for storm water quality regression models.
Sun, Siao; Bertrand-Krajewski, Jean-Luc
2013-01-01
Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data for developing models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems are mutually interacted. A procedure is developed in order to fulfil the two selection tasks in order. The procedure firstly selects model input variables using a cross validation method. An appropriate number of variables are identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the model input selection results, calibration data selection is studied. Uncertainty of model performances due to calibration data selection is investigated with a random selection method. An approach using the cluster method is applied in order to enhance model calibration practice based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve performances of calibrated models. It is found that the information content in calibration data is important in addition to the size of calibration data.
Effect of screen-based media on energy expenditure and heart rate in 9- to 12-year-old children.
Straker, Leon; Abbott, Rebecca
2007-11-01
This study compared the cardiovascular responses and energy costs of new and traditional screen based entertainments, as played by twenty 9- to 12-year-old children. Playing traditional electronic games resulted in little change to heart rate or energy expenditure compared with watching a DVD. In contrast, playing an active-input game resulted in a 59% increase in heart rate (p < .001) and a 224% increase in energy expenditure (p < .001) for boys and girls. The average heart rate of 130 bpm and energy expenditure of 0.13 kcal . min-1 . kg-1 achieved during active-input game use equates with moderate intensity activities such as basketball and jogging. Active-input electronic games might provide children with opportunities to engage with technology and be physically active at the same time.
Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick
2013-01-01
Introduction Cost effectiveness analyses (CEA) can provide useful information on how to invest limited funds, however they are less useful if different analysis of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences of model inputs taken from the studies identified. Results Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA versus TST test strategies were compared using our common decision analysis model predicted effectiveness largely overlapped. Implications Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
NASA Technical Reports Server (NTRS)
Holzhausen, K. P.; Gaertner, K. P.
1985-01-01
A significant problem concerning the integration of display and switching functions is related to the fact that numerous informative data which have to be processed by man must be read from only a few display devices. A satisfactory ergonomic design of integrated display devices and keyboards is in many cases difficult, because not all functions which can be displayed and selected are simultaneously available. A technical solution which provides an integration of display and functional elements on the basis of the highest flexibility is obtained by using a cathode ray tube with a touch-sensitive screen. The employment of an integrated data input/output system is demonstrated for the cases of onboard and ground-based flight control. Ergonomic studies conducted to investigate the suitability of an employment of touch-sensitive screens are also discussed.
Evolution of separate screening soliton pairs in a biased series photorefractive crystal circuit.
Liu, Jinsong; Hao, Zhonghua
2002-06-01
This paper presents calculations for an idea in photorefractive spatial soliton, namely, screening solitons form in a biased series photorefractive crystal circuit consisting of two photorefractive crystals connected electronically by electrode leads in a chain with a voltage source. A system of two coupled equations is derived under appropriate conditions for two-beam propagation in the crystal circuit. The possibility of obtaining steady-state bright and dark screening soliton solutions is investigated in one dimension and, the existence of dark-dark, bright-dark, and bright-bright separate screening soliton pairs in such a circuit is proved. The numerical results show that the two solitons in a soliton pair can affect each other by the light-induced current and their coupling can affect their spatial profiles, dynamical evolutions, stabilities, and self-deflection. Under the limit in which the optical wave has a spatial extent much less than the width of the crystal, only the dark soliton can affect the other soliton by the light-induced current, but the bright soliton cannot. For a bright-dark or dark-dark soliton pair, the dark soliton in a weak input intensity can be obtained for a larger nonlinearity than for a stronger input intensity. For a bright-dark soliton pair, increasing the input intensity of the dark soliton can increase the bending angle of the bright soliton. Some potential applications are discussed.
Artificial neural network model for ozone concentration estimation and Monte Carlo analysis
NASA Astrophysics Data System (ADS)
Gao, Meng; Yin, Liting; Ning, Jicai
2018-07-01
Air pollution in urban atmosphere directly affects public-health; therefore, it is very essential to predict air pollutant concentrations. Air quality is a complex function of emissions, meteorology and topography, and artificial neural networks (ANNs) provide a sound framework for relating these variables. In this study, we investigated the feasibility of using ANN model with meteorological parameters as input variables to predict ozone concentration in the urban area of Jinan, a metropolis in Northern China. We firstly found that the architecture of network of neurons had little effect on the predicting capability of ANN model. A parsimonious ANN model with 6 routinely monitored meteorological parameters and one temporal covariate (the category of day, i.e. working day, legal holiday and regular weekend) as input variables was identified, where the 7 input variables were selected following the forward selection procedure. Compared with the benchmarking ANN model with 9 meteorological and photochemical parameters as input variables, the predicting capability of the parsimonious ANN model was acceptable. Its predicting capability was also verified in term of warming success ratio during the pollution episodes. Finally, uncertainty and sensitivity analysis were also performed based on Monte Carlo simulations (MCS). It was concluded that the ANN could properly predict the ambient ozone level. Maximum temperature, atmospheric pressure, sunshine duration and maximum wind speed were identified as the predominate input variables significantly influencing the prediction of ambient ozone concentrations.
NASA Technical Reports Server (NTRS)
Stalnaker, Dale K.
1993-01-01
ACARA (Availability, Cost, and Resource Allocation) is a computer program which analyzes system availability, lifecycle cost (LCC), and resupply scheduling using Monte Carlo analysis to simulate component failure and replacement. This manual was written to: (1) explain how to prepare and enter input data for use in ACARA; (2) explain the user interface, menus, input screens, and input tables; (3) explain the algorithms used in the program; and (4) explain each table and chart in the output.
Correction of I/Q channel errors without calibration
Doerry, Armin W.; Tise, Bertice L.
2002-01-01
A method of providing a balanced demodular output for a signal such as a Doppler radar having an analog pulsed input; includes adding a variable phase shift as a function of time to the input signal, applying the phase shifted input signal to a demodulator; and generating a baseband signal from the input signal. The baseband signal is low-pass filtered and converted to a digital output signal. By removing the variable phase shift from the digital output signal, a complex data output is formed that is representative of the output of a balanced demodulator.
Delpierre, Nicolas; Berveiller, Daniel; Granda, Elena; Dufrêne, Eric
2016-04-01
Although the analysis of flux data has increased our understanding of the interannual variability of carbon inputs into forest ecosystems, we still know little about the determinants of wood growth. Here, we aimed to identify which drivers control the interannual variability of wood growth in a mesic temperate deciduous forest. We analysed a 9-yr time series of carbon fluxes and aboveground wood growth (AWG), reconstructed at a weekly time-scale through the combination of dendrometer and wood density data. Carbon inputs and AWG anomalies appeared to be uncorrelated from the seasonal to interannual scales. More than 90% of the interannual variability of AWG was explained by a combination of the growth intensity during a first 'critical period' of the wood growing season, occurring close to the seasonal maximum, and the timing of the first summer growth halt. Both atmospheric and soil water stress exerted a strong control on the interannual variability of AWG at the study site, despite its mesic conditions, whilst not affecting carbon inputs. Carbon sink activity, not carbon inputs, determined the interannual variations in wood growth at the study site. Our results provide a functional understanding of the dependence of radial growth on precipitation observed in dendrological studies. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
Input Variability Facilitates Unguided Subcategory Learning in Adults
Eidsvåg, Sunniva Sørhus; Austad, Margit; Asbjørnsen, Arve E.
2015-01-01
Purpose This experiment investigated whether input variability would affect initial learning of noun gender subcategories in an unfamiliar, natural language (Russian), as it is known to assist learning of other grammatical forms. Method Forty adults (20 men, 20 women) were familiarized with examples of masculine and feminine Russian words. Half of the participants were familiarized with 32 different root words in a high-variability condition. The other half were familiarized with 16 different root words, each repeated twice for a total of 32 presentations in a high-repetition condition. Participants were tested on untrained members of the category to assess generalization. Familiarization and testing was completed 2 additional times. Results Only participants in the high-variability group showed evidence of learning after an initial period of familiarization. Participants in the high-repetition group were able to learn after additional input. Both groups benefited when words included 2 cues to gender compared to a single cue. Conclusions The results demonstrate that the degree of input variability can influence learners' ability to generalize a grammatical subcategory (noun gender) from a natural language. In addition, the presence of multiple cues to linguistic subcategory facilitated learning independent of variability condition. PMID:25680081
Input Variability Facilitates Unguided Subcategory Learning in Adults.
Eidsvåg, Sunniva Sørhus; Austad, Margit; Plante, Elena; Asbjørnsen, Arve E
2015-06-01
This experiment investigated whether input variability would affect initial learning of noun gender subcategories in an unfamiliar, natural language (Russian), as it is known to assist learning of other grammatical forms. Forty adults (20 men, 20 women) were familiarized with examples of masculine and feminine Russian words. Half of the participants were familiarized with 32 different root words in a high-variability condition. The other half were familiarized with 16 different root words, each repeated twice for a total of 32 presentations in a high-repetition condition. Participants were tested on untrained members of the category to assess generalization. Familiarization and testing was completed 2 additional times. Only participants in the high-variability group showed evidence of learning after an initial period of familiarization. Participants in the high-repetition group were able to learn after additional input. Both groups benefited when words included 2 cues to gender compared to a single cue. The results demonstrate that the degree of input variability can influence learners' ability to generalize a grammatical subcategory (noun gender) from a natural language. In addition, the presence of multiple cues to linguistic subcategory facilitated learning independent of variability condition.
Latin Hypercube Sampling (LHS) UNIX Library/Standalone
DOE Office of Scientific and Technical Information (OSTI.GOV)
2004-05-13
The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions, LHS UNIX Library/ Standalone provides a way to generate multi-variate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS UNIX Library/Standalone uses the Latin Hypercube Sampling method (LHS) to generate samples. LHS is a constrained Monte Carlo sampling scheme. Inmore » LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval, If multiple variables are sampled simultaneously, then values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution, To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.« less
75 FR 21645 - Secretary's Advisory Committee on Heritable Disorders in Newborns and Children
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
... with pre- and post-natal care about newborn screening and the potential use of residual dried blood... further access after newborn screening tests are completed. Multidisciplinary input, including from... quality check) or process improvement (e.g., non-commercial, internal program new test development or...
Variable Shadow Screens for Imaging Optical Devices
NASA Technical Reports Server (NTRS)
Lu, Ed; Chretien, Jean L.
2004-01-01
Variable shadow screens have been proposed for reducing the apparent brightnesses of very bright light sources relative to other sources within the fields of view of diverse imaging optical devices, including video and film cameras and optical devices for imaging directly into the human eye. In other words, variable shadow screens would increase the effective dynamic ranges of such devices. Traditionally, imaging sensors are protected against excessive brightness by use of dark filters and/or reduction of iris diameters. These traditional means do not increase dynamic range; they reduce the ability to view or image dimmer features of an image because they reduce the brightness of all parts of an image by the same factor. On the other hand, a variable shadow screen would darken only the excessively bright parts of an image. For example, dim objects in a field of view that included the setting Sun or bright headlights could be seen more readily in a picture taken through a variable shadow screen than in a picture of the same scene taken through a dark filter or a narrowed iris. The figure depicts one of many potential variations of the basic concept of the variable shadow screen. The shadow screen would be a normally transparent liquid-crystal matrix placed in front of a focal-plane array of photodetectors in a charge-coupled-device video camera. The shadow screen would be placed far enough from the focal plane so as not to disrupt the focal-plane image to an unacceptable degree, yet close enough so that the out-of-focus shadows cast by the screen would still be effective in darkening the brightest parts of the image. The image detected by the photodetector array itself would be used as feedback to drive the variable shadow screen: The video output of the camera would be processed by suitable analog and/or digital electronic circuitry to generate a negative partial version of the image to be impressed on the shadow screen. The parts of the shadow screen in front of those parts of the image with brightness below a specified threshold would be left transparent; the parts of the shadow screen in front of those parts of the image where the brightness exceeded the threshold would be darkened by an amount that would increase with the excess above the threshold.
Qualitative and semiquantitative Fourier transformation using a noncoherent system.
Rogers, G L
1979-09-15
A number of authors have pointed out that a system of zone plates combined with a diffuse source, transparent input, lens, and focusing screen will display on the output screen the Fourier transform of the input. Strictly speaking, the transform normally displayed is the cosine transform, and the bipolar output is superimposed on a dc gray level to give a positive-only intensity variation. By phase-shifting one zone plate the sine transform is obtained. Temporal modulation is possible. It is also possible to redesign the system to accept a diffusely reflecting input at the cost of introducing a phase gradient in the output. Results are given of the sine and cosine transforms of a small circular aperture. As expected, the sine transform is a uniform gray. Both transforms show unwanted artifacts beyond 0.1 rad off-axis. An analysis shows this is due to unwanted circularly symmetrical moire patterns between the zone plates.
Variable screening via quantile partial correlation
Ma, Shujie; Tsai, Chih-Ling
2016-01-01
In quantile linear regression with ultra-high dimensional data, we propose an algorithm for screening all candidate variables and subsequently selecting relevant predictors. Specifically, we first employ quantile partial correlation for screening, and then we apply the extended Bayesian information criterion (EBIC) for best subset selection. Our proposed method can successfully select predictors when the variables are highly correlated, and it can also identify variables that make a contribution to the conditional quantiles but are marginally uncorrelated or weakly correlated with the response. Theoretical results show that the proposed algorithm can yield the sure screening set. By controlling the false selection rate, model selection consistency can be achieved theoretically. In practice, we proposed using EBIC for best subset selection so that the resulting model is screening consistent. Simulation studies demonstrate that the proposed algorithm performs well, and an empirical example is presented. PMID:28943683
Speed control system for an access gate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bzorgi, Fariborz M
2012-03-20
An access control apparatus for an access gate. The access gate typically has a rotator that is configured to rotate around a rotator axis at a first variable speed in a forward direction. The access control apparatus may include a transmission that typically has an input element that is operatively connected to the rotator. The input element is generally configured to rotate at an input speed that is proportional to the first variable speed. The transmission typically also has an output element that has an output speed that is higher than the input speed. The input element and the outputmore » element may rotate around a common transmission axis. A retardation mechanism may be employed. The retardation mechanism is typically configured to rotate around a retardation mechanism axis. Generally the retardation mechanism is operatively connected to the output element of the transmission and is configured to retard motion of the access gate in the forward direction when the first variable speed is above a control-limit speed. In many embodiments the transmission axis and the retardation mechanism axis are substantially co-axial. Some embodiments include a freewheel/catch mechanism that has an input connection that is operatively connected to the rotator. The input connection may be configured to engage an output connection when the rotator is rotated at the first variable speed in a forward direction and configured for substantially unrestricted rotation when the rotator is rotated in a reverse direction opposite the forward direction. The input element of the transmission is typically operatively connected to the output connection of the freewheel/catch mechanism.« less
ERIC Educational Resources Information Center
Blandford, A. E.; Smith, P. R.
1986-01-01
Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…
Gender Roles and Acculturation: Relationships With Cancer Screening Among Vietnamese American Women
Nguyen, Anh B.; Clark, Trenette T.; Belgrave, Faye Z.
2017-01-01
The aim of this study was to examine the influence of demographic variables and the interplay between gender roles and acculturation on breast and cervical cancer screening outcomes among Vietnamese American women. Convenience sampling was used to recruit 100 Vietnamese women from the Richmond, VA, metropolitan area. Women were recruited to participate in a larger cancer screening intervention. All participants completed measures on demographic variables, gender roles, acculturation, and cancer screening variables. Findings indicated that traditional masculine gender roles were associated with increased self-efficacy for breast and cervical cancer screening. Higher levels of acculturation were associated with higher probability of having had a Papanicolaou test. In addition, acculturation moderated the relationship between traditional female gender roles and cancer screening variables. For highly acculturated women, higher levels of feminine gender roles predicted higher probability of having had a previous clinical breast exam and higher levels of self-efficacy for cervical cancer screening, while the opposite was true for lower acculturated women. The findings of this study indicate the important roles that sociodemographic variables, gender roles, and acculturation play in affecting health attitudes and behaviors among Vietnamese women. These findings also help to identify a potentially high-risk subgroup and existing gaps that need to be targeted by preventive interventions. PMID:24491129
Gender roles and acculturation: relationships with cancer screening among Vietnamese American women.
Nguyen, Anh B; Clark, Trenette T; Belgrave, Faye Z
2014-01-01
The aim of this study was to examine the influence of demographic variables and the interplay between gender roles and acculturation on breast and cervical cancer screening outcomes among Vietnamese American women. Convenience sampling was used to recruit 100 Vietnamese women from the Richmond, VA, metropolitan area. Women were recruited to participate in a larger cancer screening intervention. All participants completed measures on demographic variables, gender roles, acculturation, and cancer screening variables. Findings indicated that traditional masculine gender roles were associated with increased self-efficacy for breast and cervical cancer screening. Higher levels of acculturation were associated with higher probability of having had a Papanicolaou test. In addition, acculturation moderated the relationship between traditional female gender roles and cancer screening variables. For highly acculturated women, higher levels of feminine gender roles predicted higher probability of having had a previous clinical breast exam and higher levels of self-efficacy for cervical cancer screening, while the opposite was true for lower acculturated women. The findings of this study indicate the important roles that sociodemographic variables, gender roles, and acculturation play in affecting health attitudes and behaviors among Vietnamese women. These findings also help to identify a potentially high-risk subgroup and existing gaps that need to be targeted by preventive interventions.
Gupta, Himanshu; Schiros, Chun G; Sharifov, Oleg F; Jain, Apurva; Denney, Thomas S
2016-08-31
Recently released American College of Cardiology/American Heart Association (ACC/AHA) guideline recommends the Pooled Cohort equations for evaluating atherosclerotic cardiovascular risk of individuals. The impact of the clinical input variable uncertainties on the estimates of ten-year cardiovascular risk based on ACC/AHA guidelines is not known. Using a publicly available the National Health and Nutrition Examination Survey dataset (2005-2010), we computed maximum and minimum ten-year cardiovascular risks by assuming clinically relevant variations/uncertainties in input of age (0-1 year) and ±10 % variation in total-cholesterol, high density lipoprotein- cholesterol, and systolic blood pressure and by assuming uniform distribution of the variance of each variable. We analyzed the changes in risk category compared to the actual inputs at 5 % and 7.5 % risk limits as these limits define the thresholds for consideration of drug therapy in the new guidelines. The new-pooled cohort equations for risk estimation were implemented in a custom software package. Based on our input variances, changes in risk category were possible in up to 24 % of the population cohort at both 5 % and 7.5 % risk boundary limits. This trend was consistently noted across all subgroups except in African American males where most of the cohort had ≥7.5 % baseline risk regardless of the variation in the variables. The uncertainties in the input variables can alter the risk categorization. The impact of these variances on the ten-year risk needs to be incorporated into the patient/clinician discussion and clinical decision making. Incorporating good clinical practices for the measurement of critical clinical variables and robust standardization of laboratory parameters to more stringent reference standards is extremely important for successful implementation of the new guidelines. Furthermore, ability to customize the risk calculator inputs to better represent unique clinical circumstances specific to individual needs would be highly desirable in the future versions of the risk calculator.
Aitkenhead, Matt J; Black, Helaina I J
2018-02-01
Using the International Centre for Research in Agroforestry-International Soil Reference and Information Centre (ICRAF-ISRIC) global soil spectroscopy database, models were developed to estimate a number of soil variables using different input data types. These input types included: (1) site data only; (2) visible-near-infrared (Vis-NIR) diffuse reflectance spectroscopy only; (3) combined site and Vis-NIR data; (4) red-green-blue (RGB) color data only; and (5) combined site and RGB color data. The models produced variable estimation accuracy, with RGB only being generally worst and spectroscopy plus site being best. However, we showed that for certain variables, estimation accuracy levels achieved with the "site plus RGB input data" were sufficiently good to provide useful estimates (r 2 > 0.7). These included major elements (Ca, Si, Al, Fe), organic carbon, and cation exchange capacity. Estimates for bulk density, contrast-to-noise (C/N), and P were moderately good, but K was not well estimated using this model type. For the "spectra plus site" model, many more variables were well estimated, including many that are important indicators for agricultural productivity and soil health. Sum of cation, electrical conductivity, Si, Ca, and Al oxides, and C/N ratio were estimated using this approach with r 2 values > 0.9. This work provides a mechanism for identifying the cost-effectiveness of using different model input data, with associated costs, for estimating soil variables to required levels of accuracy.
Extraordinary optical transmission inside a waveguide: spatial mode dependence.
Reichel, Kimberly S; Lu, Peter Y; Backus, Sterling; Mendis, Rajind; Mittleman, Daniel M
2016-12-12
We study the influence of the input spatial mode on the extraordinary optical transmission (EOT) effect. By placing a metal screen with a 1D array of subwavelength holes inside a terahertz (THz) parallel-plate waveguide (PPWG), we can directly compare the transmission spectra with different input waveguide modes. We observe that the transmitted spectrum depends strongly on the input mode. A conventional description of EOT based on the excitation of surface plasmons is not predictive in all cases. Instead, we utilize a formalism based on impedance matching, which accurately predicts the spectral resonances for both TEM and non-TEM input modes.
NLEdit: A generic graphical user interface for Fortran programs
NASA Technical Reports Server (NTRS)
Curlett, Brian P.
1994-01-01
NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
Computing Shapes Of Cascade Diffuser Blades
NASA Technical Reports Server (NTRS)
Tran, Ken; Prueger, George H.
1993-01-01
Computer program generates sizes and shapes of cascade-type blades for use in axial or radial turbomachine diffusers. Generates shapes of blades rapidly, incorporating extensive cascade data to determine optimum incidence and deviation angle for blade design based on 65-series data base of National Advisory Commission for Aeronautics and Astronautics (NACA). Allows great variability in blade profile through input variables. Also provides for design of three-dimensional blades by allowing variable blade stacking. Enables designer to obtain computed blade-geometry data in various forms: as input for blade-loading analysis; as input for quasi-three-dimensional analysis of flow; or as points for transfer to computer-aided design.
A stacking ensemble learning framework for annual river ice breakup dates
NASA Astrophysics Data System (ADS)
Sun, Wei; Trevor, Bernard
2018-06-01
River ice breakup dates (BDs) are not merely a proxy indicator of climate variability and change, but a direct concern in the management of local ice-caused flooding. A framework of stacking ensemble learning for annual river ice BDs was developed, which included two-level components: member and combining models. The member models described the relations between BD and their affecting indicators; the combining models linked the predicted BD by each member models with the observed BD. Especially, Bayesian regularization back-propagation artificial neural network (BRANN), and adaptive neuro fuzzy inference systems (ANFIS) were employed as both member and combining models. The candidate combining models also included the simple average methods (SAM). The input variables for member models were selected by a hybrid filter and wrapper method. The performances of these models were examined using the leave-one-out cross validation. As the largest unregulated river in Alberta, Canada with ice jams frequently occurring in the vicinity of Fort McMurray, the Athabasca River at Fort McMurray was selected as the study area. The breakup dates and candidate affecting indicators in 1980-2015 were collected. The results showed that, the BRANN member models generally outperformed the ANFIS member models in terms of better performances and simpler structures. The difference between the R and MI rankings of inputs in the optimal member models may imply that the linear correlation based filter method would be feasible to generate a range of candidate inputs for further screening through other wrapper or embedded IVS methods. The SAM and BRANN combining models generally outperformed all member models. The optimal SAM combining model combined two BRANN member models and improved upon them in terms of average squared errors by 14.6% and 18.1% respectively. In this study, for the first time, the stacking ensemble learning was applied to forecasting of river ice breakup dates, which appeared promising for other river ice forecasting problems.
Automated Inference of Chemical Discriminants of Biological Activity.
Raschka, Sebastian; Scott, Anne M; Huertas, Mar; Li, Weiming; Kuhn, Leslie A
2018-01-01
Ligand-based virtual screening has become a standard technique for the efficient discovery of bioactive small molecules. Following assays to determine the activity of compounds selected by virtual screening, or other approaches in which dozens to thousands of molecules have been tested, machine learning techniques make it straightforward to discover the patterns of chemical groups that correlate with the desired biological activity. Defining the chemical features that generate activity can be used to guide the selection of molecules for subsequent rounds of screening and assaying, as well as help design new, more active molecules for organic synthesis.The quantitative structure-activity relationship machine learning protocols we describe here, using decision trees, random forests, and sequential feature selection, take as input the chemical structure of a single, known active small molecule (e.g., an inhibitor, agonist, or substrate) for comparison with the structure of each tested molecule. Knowledge of the atomic structure of the protein target and its interactions with the active compound are not required. These protocols can be modified and applied to any data set that consists of a series of measured structural, chemical, or other features for each tested molecule, along with the experimentally measured value of the response variable you would like to predict or optimize for your project, for instance, inhibitory activity in a biological assay or ΔG binding . To illustrate the use of different machine learning algorithms, we step through the analysis of a dataset of inhibitor candidates from virtual screening that were tested recently for their ability to inhibit GPCR-mediated signaling in a vertebrate.
Port-O-Sim Object Simulation Application
NASA Technical Reports Server (NTRS)
Lanzi, Raymond J.
2009-01-01
Port-O-Sim is a software application that supports engineering modeling and simulation of launch-range systems and subsystems, as well as the vehicles that operate on them. It is flexible, distributed, object-oriented, and realtime. A scripting language is used to configure an array of simulation objects and link them together. The script is contained in a text file, but executed and controlled using a graphical user interface. A set of modules is defined, each with input variables, output variables, and settings. These engineering models can be either linked to each other or run as standalone. The settings can be modified during execution. Since 2001, this application has been used for pre-mission failure mode training for many Range Safety Scenarios. It contains range asset link analysis, develops look-angle data, supports sky-screen site selection, drives GPS (Global Positioning System) and IMU (Inertial Measurement Unit) simulators, and can support conceptual design efforts for multiple flight programs with its capacity for rapid six-degrees-of-freedom model development. Due to the assembly of various object types into one application, the application is applicable across a wide variety of launch range problem domains.
Neural Network Machine Learning and Dimension Reduction for Data Visualization
NASA Technical Reports Server (NTRS)
Liles, Charles A.
2014-01-01
Neural network machine learning in computer science is a continuously developing field of study. Although neural network models have been developed which can accurately predict a numeric value or nominal classification, a general purpose method for constructing neural network architecture has yet to be developed. Computer scientists are often forced to rely on a trial-and-error process of developing and improving accurate neural network models. In many cases, models are constructed from a large number of input parameters. Understanding which input parameters have the greatest impact on the prediction of the model is often difficult to surmise, especially when the number of input variables is very high. This challenge is often labeled the "curse of dimensionality" in scientific fields. However, techniques exist for reducing the dimensionality of problems to just two dimensions. Once a problem's dimensions have been mapped to two dimensions, it can be easily plotted and understood by humans. The ability to visualize a multi-dimensional dataset can provide a means of identifying which input variables have the highest effect on determining a nominal or numeric output. Identifying these variables can provide a better means of training neural network models; models can be more easily and quickly trained using only input variables which appear to affect the outcome variable. The purpose of this project is to explore varying means of training neural networks and to utilize dimensional reduction for visualizing and understanding complex datasets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaFarge, R.A.
1990-05-01
MCPRAM (Monte Carlo PReprocessor for AMEER), a computer program that uses Monte Carlo techniques to create an input file for the AMEER trajectory code, has been developed for the Sandia National Laboratories VAX and Cray computers. Users can select the number of trajectories to compute, which AMEER variables to investigate, and the type of probability distribution for each variable. Any legal AMEER input variable can be investigated anywhere in the input run stream with either a normal, uniform, or Rayleigh distribution. Users also have the option to use covariance matrices for the investigation of certain correlated variables such as boostermore » pre-reentry errors and wind, axial force, and atmospheric models. In conjunction with MCPRAM, AMEER was modified to include the variables introduced by the covariance matrices and to include provisions for six types of fuze models. The new fuze models and the new AMEER variables are described in this report.« less
Input Variability Facilitates Unguided Subcategory Learning in Adults
ERIC Educational Resources Information Center
Eidsvåg, Sunniva Sørhus; Austad, Margit; Plante, Elena; Asbjørnsen, Arve E.
2015-01-01
Purpose: This experiment investigated whether input variability would affect initial learning of noun gender subcategories in an unfamiliar, natural language (Russian), as it is known to assist learning of other grammatical forms. Method: Forty adults (20 men, 20 women) were familiarized with examples of masculine and feminine Russian words. Half…
Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory
2017-12-23
Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics suchmore » as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory
Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics suchmore » as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.« less
Effects of input uncertainty on cross-scale crop modeling
NASA Astrophysics Data System (ADS)
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils and agricultural management in the tropics is in general low or data is scarce leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, are the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input data from very little to very detailed information, and compare the models' abilities to represent the spatial variability and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well from both models (R=0.6-0.8) but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill to reproduce the observed spatial variability. Soil data is less important for the skill of a crop model to reproduce the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than from input data settings and APSIM is more sensitive to input data then LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity in maize cropping systems.
Region-to-area screening methodology for the Crystalline Repository Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1985-04-01
The purpose of this document is to describe the Crystalline Repository Project's (CRP) process for region-to-area screening of exposed and near-surface crystalline rock bodies in the three regions of the conterminous United States where crystalline rock is being evaluated as a potential host for the second nuclear waste repository (i.e., in the North Central, Northeastern, and Southeastern Regions). This document indicates how the US Department of Energy's (DOE) General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories (10 CFR 960) were used to select and apply factors and variables for the region-to-area screening, explains how these factors andmore » variable are to be applied in the region-to-area screening, and indicates how this methodology relates to the decision process leading to the selection of candidate areas. A brief general discussion of the screening process from the national survey through area screening and site recommendation is presented. This discussion sets the scene for detailed discussions which follow concerning the region-to-area screening process, the guidance provided by the DOE Siting Guidelines for establishing disqualifying factors and variables for screening, and application of the disqualifying factors and variables in the screening process. This document is complementary to the regional geologic and environmental characterization reports to be issued in the summer of 1985 as final documents. These reports will contain the geologic and environmental data base that will be used in conjunction with the methodology to conduct region-to-area screening.« less
Statistics of optimal information flow in ensembles of regulatory motifs
NASA Astrophysics Data System (ADS)
Crisanti, Andrea; De Martino, Andrea; Fiorentino, Jonathan
2018-02-01
Genetic regulatory circuits universally cope with different sources of noise that limit their ability to coordinate input and output signals. In many cases, optimal regulatory performance can be thought to correspond to configurations of variables and parameters that maximize the mutual information between inputs and outputs. Since the mid-2000s, such optima have been well characterized in several biologically relevant cases. Here we use methods of statistical field theory to calculate the statistics of the maximal mutual information (the "capacity") achievable by tuning the input variable only in an ensemble of regulatory motifs, such that a single controller regulates N targets. Assuming (i) sufficiently large N , (ii) quenched random kinetic parameters, and (iii) small noise affecting the input-output channels, we can accurately reproduce numerical simulations both for the mean capacity and for the whole distribution. Our results provide insight into the inherent variability in effectiveness occurring in regulatory systems with heterogeneous kinetic parameters.
Missing pulse detector for a variable frequency source
Ingram, Charles B.; Lawhorn, John H.
1979-01-01
A missing pulse detector is provided which has the capability of monitoring a varying frequency pulse source to detect the loss of a single pulse or total loss of signal from the source. A frequency-to-current converter is used to program the output pulse width of a variable period retriggerable one-shot to maintain a pulse width slightly longer than one-half the present monitored pulse period. The retriggerable one-shot is triggered at twice the input pulse rate by employing a frequency doubler circuit connected between the one-shot input and the variable frequency source being monitored. The one-shot remains in the triggered or unstable state under normal conditions even though the source period is varying. A loss of an input pulse or single period of a fluctuating signal input will cause the one-shot to revert to its stable state, changing the output signal level to indicate a missing pulse or signal.
Input and language development in bilingually developing children.
Hoff, Erika; Core, Cynthia
2013-11-01
Language skills in young bilingual children are highly varied as a result of the variability in their language experiences, making it difficult for speech-language pathologists to differentiate language disorder from language difference in bilingual children. Understanding the sources of variability in bilingual contexts and the resulting variability in children's skills will help improve language assessment practices by speech-language pathologists. In this article, we review literature on bilingual first language development for children under 5 years of age. We describe the rate of development in single and total language growth, we describe effects of quantity of input and quality of input on growth, and we describe effects of family composition on language input and language growth in bilingual children. We provide recommendations for language assessment of young bilingual children and consider implications for optimizing children's dual language development. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
NASA Astrophysics Data System (ADS)
Kondapalli, S. P.
2017-12-01
In the present work, pulsed current microplasma arc welding is carried out on AISI 321 austenitic stainless steel of 0.3 mm thickness. Peak current, Base current, Pulse rate and Pulse width are chosen as the input variables, whereas grain size and hardness are considered as output responses. Response surface method is adopted by using Box-Behnken Design, and in total 27 experiments are performed. Empirical relation between input and output response is developed using statistical software and analysis of variance (ANOVA) at 95% confidence level to check the adequacy. The main effect and interaction effect of input variables on output response are also studied.
More comprehensive discussion of CRC screening associated with higher screening.
Mosen, David M; Feldstein, Adrianne C; Perrin, Nancy A; Rosales, A Gabriella; Smith, David H; Liles, Elizabeth G; Schneider, Jennifer L; Meyers, Ronald E; Elston-Lafata, Jennifer
2013-04-01
Examine association of comprehensiveness of colorectal cancer (CRC) screening discussion by primary care physicians (PCPs) with completion of CRC screening. Observational study in Kaiser Permanente Northwest, a group-model health maintenance organization. A total of 883 participants overdue for CRC screening received an automated telephone call (ATC) between April and June 2009 encouraging CRC screening. Between January and March 2010, participants completed a survey on PCPs' discussion of CRC screening and patient beliefs regarding screening. receipt of CRC screening (assessed by electronic medical record [EMR], 9 months after ATC). Primary independent variable: comprehensiveness of CRC screening discussion by PCPs (7-item scale). Secondary independent variables: perceived benefits of screening (4-item scale assessing respondents' agreement with benefits of timely screening) and primary care utilization (EMR; 9 months after ATC). The independent association of variables with CRC screening was assessed with logistic regression. Average scores for comprehensiveness of CRC discussion and perceived benefits were 0.4 (range 0-1) and 4.0 (range 1-5), respectively. A total of 28.2% (n = 249) completed screening, 84% of whom had survey assessments after their screening date. Of screeners, 95.2% completed the fecal immunochemical test. More comprehensive discussion of CRC screening was associated with increased screening (odds ratio [OR] = 1.51, 95% confidence interval [CI] = 1.03-2.21). Higher perceived benefits (OR = 1.46, 95% CI = 1.13-1.90) and 1 or more PCP visits (OR = 5.82, 95% CI = 3.87-8.74) were also associated with increased screening. More comprehensive discussion of CRC screening was independently associated with increased CRC screening. Primary care utilization was even more strongly associated with CRC screening, irrespective of discussion of CRC screening.
Silva, Graciela E.; Vana, Kimberly D.; Goodwin, James L.; Sherrill, Duane L.; Quan, Stuart F.
2011-01-01
Study Objective: The Epworth Sleepiness Scale (ESS) has been used to detect patients with potential sleep disordered breathing (SDB). Recently, a 4-Variable screening tool was proposed to identify patients with SDB, in addition to the STOP and STOP-Bang questionnaires. This study evaluated the abilities of the 4-Variable screening tool, STOP, STOP-Bang, and ESS questionnaires in identifying subjects at risk for SDB. Methods: A total of 4,770 participants who completed polysomnograms in the baseline evaluation of the Sleep Heart Health Study (SHHS) were included. Subjects with RDIs ≥ 15 and ≥ 30 were considered to have moderate-to-severe or severe SDB, respectively. Variables were constructed to approximate those in the questionnaires. The risk of SDB was calculated by the 4-Variable screening tool according to Takegami et al. The STOP and STOP-Bang questionnaires were evaluated including variables for snoring, tiredness/sleepiness, observed apnea, blood pressure, body mass index, age, neck circumference, and gender. Sleepiness was evaluated using the ESS questionnaire and scores were dichotomized into < 11 and ≥ 11. Results: The STOP-Bang questionnaire had higher sensitivity to predict moderate-to-severe (87.0%) and severe (70.4%) SDB, while the 4-Variable screening tool had higher specificity to predict moderate-to-severe and severe SDB (93.2% for both). Conclusions: In community populations such as the SHHS, high specificities may be more useful in excluding low-risk patients, while avoiding false positives. However, sleep clinicians may prefer to use screening tools with high sensitivities, like the STOP-Bang, in order to avoid missing cases that may lead to adverse health consequences and increased healthcare costs. Citation: Silva GE; Vana KD; Goodwin JL; Sherrill DL; Quan SF. Identification of patients with sleep disordered breathing: comparing the Four-Variable screening tool, STOP, STOP-Bang, and Epworth Sleepiness Scales. J Clin Sleep Med 2011;7(5):467-472. PMID:22003341
Kroshus, Emily
2016-05-01
Universal screening for mental health concerns, as part of the preparticipation examination in collegiate sports medicine settings, can be an important and feasible strategy for facilitating early detection of mental health disorders. To assess whether sports medicine departments at National Collegiate Athletic Association (NCAA) member colleges have policies related to identifying student-athlete mental health problems, the nature of preparticipation examination screening related to mental health, and whether other departmental or institutional screening initiatives are in place. I also aimed to characterize the variability in screening by institutional characteristics. Cross-sectional study. College sports medicine departments. Team physicians and head athletic trainers at NCAA member colleges (n = 365, 30.3% response rate). Electronic survey of departmental mental health screening activities. A total of 39% of respondents indicated that their institution had a written plan related to identifying student-athletes with mental health concerns. Fewer than half reported that their sports medicine department administers a written or verbal screening instrument for symptoms of disordered eating (44.5%), depression (32.3%), or anxiety (30.7%). The strongest predictors of mental health screening were the presence of a written plan related to identifying student-athlete mental health concerns and the employment of a clinical psychologist. Additionally, Division I institutions and institutions with a greater ratio of athletic trainers to student-athletes tended to engage in more screening. The substantial among-institutions variability in mental health screening suggests that opportunities exist to make these practices more widespread. To address this variability, recent NCAA mental health best-practice guidelines suggested that institutions should screen for a range of mental health disorders and risk behaviors. However, at some institutions, staffing deficits may need to be addressed to allow for implementation of screening-related activities.
Ventricular repolarization variability for hypoglycemia detection.
Ling, Steve; Nguyen, H T
2011-01-01
Hypoglycemia is the most acute and common complication of Type 1 diabetes and is a limiting factor in a glycemic management of diabetes. In this paper, two main contributions are presented; firstly, ventricular repolarization variabilities are introduced for hypoglycemia detection, and secondly, a swarm-based support vector machine (SVM) algorithm with the inputs of the repolarization variabilities is developed to detect hypoglycemia. By using the algorithm and including several repolarization variabilities as inputs, the best hypoglycemia detection performance is found with sensitivity and specificity of 82.14% and 60.19%, respectively.
Gu, Jianwei; Pitz, Mike; Breitner, Susanne; Birmili, Wolfram; von Klot, Stephanie; Schneider, Alexandra; Soentgen, Jens; Reller, Armin; Peters, Annette; Cyrys, Josef
2012-10-01
The success of epidemiological studies depends on the use of appropriate exposure variables. The purpose of this study is to extract a relatively small selection of variables characterizing ambient particulate matter from a large measurement data set. The original data set comprised a total of 96 particulate matter variables that have been continuously measured since 2004 at an urban background aerosol monitoring site in the city of Augsburg, Germany. Many of the original variables were derived from measured particle size distribution (PSD) across the particle diameter range 3 nm to 10 μm, including size-segregated particle number concentration, particle length concentration, particle surface concentration and particle mass concentration. The data set was complemented by integral aerosol variables. These variables were measured by independent instruments, including black carbon, sulfate, particle active surface concentration and particle length concentration. It is obvious that such a large number of measured variables cannot be used in health effect analyses simultaneously. The aim of this study is a pre-screening and a selection of the key variables that will be used as input in forthcoming epidemiological studies. In this study, we present two methods of parameter selection and apply them to data from a two-year period from 2007 to 2008. We used the agglomerative hierarchical cluster method to find groups of similar variables. In total, we selected 15 key variables from 9 clusters which are recommended for epidemiological analyses. We also applied a two-dimensional visualization technique called "heatmap" analysis to the Spearman correlation matrix. 12 key variables were selected using this method. Moreover, the positive matrix factorization (PMF) method was applied to the PSD data to characterize the possible particle sources. Correlations between the variables and PMF factors were used to interpret the meaning of the cluster and the heatmap analyses. Copyright © 2012 Elsevier B.V. All rights reserved.
The Role of Learner and Input Variables in Learning Inflectional Morphology
ERIC Educational Resources Information Center
Brooks, Patricia J.; Kempe, Vera; Sionov, Ariel
2006-01-01
To examine effects of input and learner characteristics on morphology acquisition, 60 adult English speakers learned to inflect masculine and feminine Russian nouns in nominative, dative, and genitive cases. By varying training vocabulary size (i.e., type variability), holding constant the number of learning trials, we tested whether learners…
Wideband low-noise variable-gain BiCMOS transimpedance amplifier
NASA Astrophysics Data System (ADS)
Meyer, Robert G.; Mack, William D.
1994-06-01
A new monolithic variable gain transimpedance amplifier is described. The circuit is realized in BiCMOS technology and has measured gain of 98 kilo ohms, bandwidth of 128 MHz, input noise current spectral density of 1.17 pA/square root of Hz and input signal-current handling capability of 3 mA.
Integrated controls design optimization
Lou, Xinsheng; Neuschaefer, Carl H.
2015-09-01
A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230) and a cost algorithm (225) and a chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process in puts and output variables are related to the income of the plant; and some others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
Model-Free Conditional Independence Feature Screening For Ultrahigh Dimensional Data.
Wang, Luheng; Liu, Jingyuan; Li, Yong; Li, Runze
2017-03-01
Feature screening plays an important role in ultrahigh dimensional data analysis. This paper is concerned with conditional feature screening when one is interested in detecting the association between the response and ultrahigh dimensional predictors (e.g., genetic makers) given a low-dimensional exposure variable (such as clinical variables or environmental variables). To this end, we first propose a new index to measure conditional independence, and further develop a conditional screening procedure based on the newly proposed index. We systematically study the theoretical property of the proposed procedure and establish the sure screening and ranking consistency properties under some very mild conditions. The newly proposed screening procedure enjoys some appealing properties. (a) It is model-free in that its implementation does not require a specification on the model structure; (b) it is robust to heavy-tailed distributions or outliers in both directions of response and predictors; and (c) it can deal with both feature screening and the conditional screening in a unified way. We study the finite sample performance of the proposed procedure by Monte Carlo simulations and further illustrate the proposed method through two real data examples.
NASA Astrophysics Data System (ADS)
Fang, Wei; Huang, Shengzhi; Huang, Qiang; Huang, Guohe; Meng, Erhao; Luan, Jinkai
2018-06-01
In this study, reference evapotranspiration (ET0) forecasting models are developed for the least economically developed regions subject to meteorological data scarcity. Firstly, the partial mutual information (PMI) capable of capturing the linear and nonlinear dependence is investigated regarding its utility to identify relevant predictors and exclude those that are redundant through the comparison with partial linear correlation. An efficient input selection technique is crucial for decreasing model data requirements. Then, the interconnection between global climate indices and regional ET0 is identified. Relevant climatic indices are introduced as additional predictors to comprise information regarding ET0, which ought to be provided by meteorological data unavailable. The case study in the Jing River and Beiluo River basins, China, reveals that PMI outperforms the partial linear correlation in excluding the redundant information, favouring the yield of smaller predictor sets. The teleconnection analysis identifies the correlation between Nino 1 + 2 and regional ET0, indicating influences of ENSO events on the evapotranspiration process in the study area. Furthermore, introducing Nino 1 + 2 as predictors helps to yield more accurate ET0 forecasts. A model performance comparison also shows that non-linear stochastic models (SVR or RF with input selection through PMI) do not always outperform linear models (MLR with inputs screen by linear correlation). However, the former can offer quite comparable performance depending on smaller predictor sets. Therefore, efforts such as screening model inputs through PMI and incorporating global climatic indices interconnected with ET0 can benefit the development of ET0 forecasting models suitable for data-scarce regions.
Modification of infant hypothyroidism and phenylketonuria screening program using electronic tools.
Taheri, Behjat; Haddadpoor, Asefeh; Mirkhalafzadeh, Mahmood; Mazroei, Fariba; Aghdak, Pezhman; Nasri, Mehran; Bahrami, Gholamreza
2017-01-01
Congenital hypothyroidism and phenylketonuria (PKU) are the most common cause for preventable mental retardation in infants worldwide. Timely diagnosis and treatment of these disorders can have lasting effects on the mental development of newborns. However, there are several problems at different stages of screening programs that along with imposing heavy costs can reduce the precision of the screening, increasing the chance of undiagnosed cases which in turn can have damaging consequences for the society. Therefore, given these problems and the importance of information systems in facilitating the management and improving the quality of health care the aim of this study was to improve the screening process of hypothyroidism and PKU in infants with the help of electronic resources. The current study is a qualitative, action research designed to improve the quality of screening, services, performance, implementation effectiveness, and management of hypothyroidism and PKU screening program in Isfahan province. To this end, web-based software was designed. Programming was carried out using Delphi.net software and used SQL Server 2008 for database management. Given the weaknesses, problems, and limitations of hypothyroidism and PKU screening program, and the importance of these diseases in a national scale, this study resulted in design of hypothyroidism and PKU screening software for infants in Isfahan province. The inputs and outputs of the software were designed in three levels including Health Care Centers in charge of the screening program, provincial reference lab, and health and treatment network of Isfahan province. Immediate registration of sample data at the time and location of sampling, providing the provincial reference Laboratory and Health Centers of different eparchies with the ability to instantly observe, monitor, and follow-up on the samples at any moment, online verification of samples by reference lab, creating a daily schedule for reference lab, and receiving of the results from analysis equipment; and entering the results into the database without the need for user input are among the features of this software. The implementation of hypothyroidism screening software led to an increase in the quality and efficiency of the screening program; minimized the risk of human error in the process and solved many of the previous limitations of the screening program which were the main goals for implementation of this software. The implementation of this software also resulted in improvement in precision and quality of services provided for these two diseases and better accuracy and precision for data inputs by providing the possibility of entering the sample data at the place and time of sampling which then resulted in the possibility of management based on precise data and also helped develop a comprehensive database and improved the satisfaction of service recipients.
Preprocessing for Eddy Dissipation Rate and TKE Profile Generation
NASA Technical Reports Server (NTRS)
Zak, J. Allen; Rodgers, William G., Jr.; McKissick, Burnell T. (Technical Monitor)
2001-01-01
The Aircraft Vortex Spacing System (AVOSS), a set of algorithms to determine aircraft spacing according to wake vortex behavior prediction, requires turbulence profiles to appropriately determine arrival and departure aircraft spacing. The ambient atmospheric turbulence profile must always be produced, even if the result is an arbitrary (canned) profile. The original turbulence profile code was generated By North Carolina State University and used in a non-real-time environment in the past. All the input parameters could be carefully selected and screened prior to input. Since this code must run in real-time using actual measurements in the field as input, it became imperative to begin a data checking and screening process as part of the real-time implementation. The process described herein is a step towards ensuring that the best possible turbulence profile is always provided to AVOSS. Data fill-ins, constant profiles and arbitrary profiles are used only as a last resort, but are essential to ensure uninterrupted application of AVOSS.
NASA Astrophysics Data System (ADS)
Creaco, E.; Berardi, L.; Sun, Siao; Giustolisi, O.; Savic, D.
2016-04-01
The growing availability of field data, from information and communication technologies (ICTs) in "smart" urban infrastructures, allows data modeling to understand complex phenomena and to support management decisions. Among the analyzed phenomena, those related to storm water quality modeling have recently been gaining interest in the scientific literature. Nonetheless, the large amount of available data poses the problem of selecting relevant variables to describe a phenomenon and enable robust data modeling. This paper presents a procedure for the selection of relevant input variables using the multiobjective evolutionary polynomial regression (EPR-MOGA) paradigm. The procedure is based on scrutinizing the explanatory variables that appear inside the set of EPR-MOGA symbolic model expressions of increasing complexity and goodness of fit to target output. The strategy also enables the selection to be validated by engineering judgement. In such context, the multiple case study extension of EPR-MOGA, called MCS-EPR-MOGA, is adopted. The application of the proposed procedure to modeling storm water quality parameters in two French catchments shows that it was able to significantly reduce the number of explanatory variables for successive analyses. Finally, the EPR-MOGA models obtained after the input selection are compared with those obtained by using the same technique without benefitting from input selection and with those obtained in previous works where other data-modeling techniques were used on the same data. The comparison highlights the effectiveness of both EPR-MOGA and the input selection procedure.
Factors influencing the drain and rinse operation of Banana screens
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Brien, M.; Firth, B.
An Australian Coal Association Research Project (ACARP) study to identify the variables and effects on Banana screens is described in this article. The impacts of the following system variables were investigated: panel angle, volumetric feed flow rate, solids content of feed screen motion, vibration frequency, magnetite content and impact of screen aperture. The article was adapted from a presentation at Coal Prep 2005, Lexington, KY, USA in May 2005. 4 refs., 8 figs., 1 tab.
Partial Granger causality--eliminating exogenous inputs and latent variables.
Guo, Shuixia; Seth, Anil K; Kendrick, Keith M; Zhou, Cong; Feng, Jianfeng
2008-07-15
Attempts to identify causal interactions in multivariable biological time series (e.g., gene data, protein data, physiological data) can be undermined by the confounding influence of environmental (exogenous) inputs. Compounding this problem, we are commonly only able to record a subset of all related variables in a system. These recorded variables are likely to be influenced by unrecorded (latent) variables. To address this problem, we introduce a novel variant of a widely used statistical measure of causality--Granger causality--that is inspired by the definition of partial correlation. Our 'partial Granger causality' measure is extensively tested with toy models, both linear and nonlinear, and is applied to experimental data: in vivo multielectrode array (MEA) local field potentials (LFPs) recorded from the inferotemporal cortex of sheep. Our results demonstrate that partial Granger causality can reveal the underlying interactions among elements in a network in the presence of exogenous inputs and latent variables in many cases where the existing conditional Granger causality fails.
Decina, Stephen M; Templer, Pamela H; Hutyra, Lucy R; Gately, Conor K; Rao, Preeti
2017-12-31
Atmospheric deposition of nitrogen (N) is a major input of N to the biosphere and is elevated beyond preindustrial levels throughout many ecosystems. Deposition monitoring networks in the United States generally avoid urban areas in order to capture regional patterns of N deposition, and studies measuring N deposition in cities usually include only one or two urban sites in an urban-rural comparison or as an anchor along an urban-to-rural gradient. Describing patterns and drivers of atmospheric N inputs is crucial for understanding the effects of N deposition; however, little is known about the variability and drivers of atmospheric N inputs or their effects on soil biogeochemistry within urban ecosystems. We measured rates of canopy throughfall N as a measure of atmospheric N inputs, as well as soil net N mineralization and nitrification, soil solution N, and soil respiration at 15 sites across the greater Boston, Massachusetts area. Rates of throughfall N are 8.70±0.68kgNha -1 yr -1 , vary 3.5-fold across sites, and are positively correlated with rates of local vehicle N emissions. Ammonium (NH 4 + ) composes 69.9±2.2% of inorganic throughfall N inputs and is highest in late spring, suggesting a contribution from local fertilizer inputs. Soil solution NO 3 - is positively correlated with throughfall NO 3 - inputs. In contrast, soil solution NH 4 + , net N mineralization, nitrification, and soil respiration are not correlated with rates of throughfall N inputs. Rather, these processes are correlated with soil properties such as soil organic matter. Our results demonstrate high variability in rates of urban throughfall N inputs, correlation of throughfall N inputs with local vehicle N emissions, and a decoupling of urban soil biogeochemistry and throughfall N inputs. Copyright © 2017 Elsevier B.V. All rights reserved.
He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong
2016-03-01
In this paper, a hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small norm of expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficient are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) are selected. Then a hybrid model based on the improved FLNN integrating with partial least square (IFLNN-PLS) was built. In IFLNN-PLS model, the connection weights are calculated using the partial least square method but not the error back propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that the IFLNN-PLS could significant improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Dynamic modal estimation using instrumental variables
NASA Technical Reports Server (NTRS)
Salzwedel, H.
1980-01-01
A method to determine the modes of dynamical systems is described. The inputs and outputs of a system are Fourier transformed and averaged to reduce the error level. An instrumental variable method that estimates modal parameters from multiple correlations between responses of single input, multiple output systems is applied to estimate aircraft, spacecraft, and off-shore platform modal parameters.
Urban vs. Rural CLIL: An Analysis of Input-Related Variables, Motivation and Language Attainment
ERIC Educational Resources Information Center
Alejo, Rafael; Piquer-Píriz, Ana
2016-01-01
The present article carries out an in-depth analysis of the differences in motivation, input-related variables and linguistic attainment of the students at two content and language integrated learning (CLIL) schools operating within the same institutional and educational context, the Spanish region of Extremadura, and differing only in terms of…
Variable Input and the Acquisition of Plural Morphology
ERIC Educational Resources Information Center
Miller, Karen L.; Schmitt, Cristina
2012-01-01
The present article examines the effect of variable input on the acquisition of plural morphology in two varieties of Spanish: Chilean Spanish, where the plural marker is sometimes omitted due to a phonological process of syllable final /s/ lenition, and Mexican Spanish (of Mexico City), with no such lenition process. The goal of the study is to…
Precision digital pulse phase generator
McEwan, T.E.
1996-10-08
A timing generator comprises a crystal oscillator connected to provide an output reference pulse. A resistor-capacitor combination is connected to provide a variable-delay output pulse from an input connected to the crystal oscillator. A phase monitor is connected to provide duty-cycle representations of the reference and variable-delay output pulse phase. An operational amplifier drives a control voltage to the resistor-capacitor combination according to currents integrated from the phase monitor and injected into summing junctions. A digital-to-analog converter injects a control current into the summing junctions according to an input digital control code. A servo equilibrium results that provides a phase delay of the variable-delay output pulse to the output reference pulse that linearly depends on the input digital control code. 2 figs.
Precision digital pulse phase generator
McEwan, Thomas E.
1996-01-01
A timing generator comprises a crystal oscillator connected to provide an output reference pulse. A resistor-capacitor combination is connected to provide a variable-delay output pulse from an input connected to the crystal oscillator. A phase monitor is connected to provide duty-cycle representations of the reference and variable-delay output pulse phase. An operational amplifier drives a control voltage to the resistor-capacitor combination according to currents integrated from the phase monitor and injected into summing junctions. A digital-to-analog converter injects a control current into the summing junctions according to an input digital control code. A servo equilibrium results that provides a phase delay of the variable-delay output pulse to the output reference pulse that linearly depends on the input digital control code.
From CBCL to DSM: A Comparison of Two Methods to Screen for DSM-IV Diagnoses Using CBCL Data
ERIC Educational Resources Information Center
Krol, Nicole P. C. M.; De Bruyn, Eric E. J.; Coolen, Jolanda C.; van Aarle, Edward J. M.
2006-01-01
The screening efficiency of 2 methods to convert Child Behavior Checklist (CBCL) assessment data into Diagnostic and Statistical Manual of Mental Disorders (4th ed. [DSM-IV]; American Psychiatric Association, 1994) diagnoses was compared. The Machine-Aided Diagnosis (MAD) method converts CBCL input data directly into DSM-IV symptom criteria. The…
Advanced Terrain Representation for the Microticcit Workstation: System Maintenance Manual
1986-02-01
enter the */ /* password. */ /* Inputs: passwd - password to compare userfs entry to */ /* Outputs: TRUE - if password entered correctly...include "atrdefs.h" #include "ctype.h" extern char window[]; /* useable portion of screen */ 1 i getpw( passwd ) char passwd []; { int c...blank input window */ pcvgcp(&row,*col); curs_off(); nchars - ntries - 0; len « strlen( passwd ); pcvwca(len,• *,REVIDEO); /* process keys till user
DefEX: Hands-On Cyber Defense Exercise for Undergraduate Students
2011-07-01
Injection, and 4) File Upload. Next, the students patched the associated flawed Perl and PHP Hypertext Preprocessor ( PHP ) code. Finally, students...underlying script. The Zora XSS vulnerability existed in a PHP file that echoed unfiltered user input back to the screen. To eliminate the...vulnerability, students filtered the input using the PHP htmlentities function and retested the code. The htmlentities function translates certain ambiguous
Assessing the Use of Input Devices for Teachers and Children in Early Childhood Education Programs
ERIC Educational Resources Information Center
Wood, Eileen; Willoughby, Teena; Schmidt, Alice; Porter, Lisa; Specht, Jacqueline; Gilbert, Jessica
2004-01-01
The impact of four computer input devices (mouse, EZ ball, touch pad, touch screen) for 81 preschoolers (ranging from 34 to 78 months of age) and 43 early childhood educators (mean age was 29 years and 9 months) was examined. Participants played two computer games with 10 trials for each game followed by a survey assessing their preferences for…
Regenerative braking device with rotationally mounted energy storage means
Hoppie, Lyle O.
1982-03-16
A regenerative braking device for an automotive vehicle includes an energy storage assembly (12) having a plurality of rubber rollers (26, 28) mounted for rotation between an input shaft (30) and an output shaft (32), clutches (50, 56) and brakes (52, 58) associated with each shaft, and a continuously variable transmission (22) connectable to a vehicle drivetrain and to the input and output shafts by the respective clutches. In a second embodiment the clutches and brakes are dispensed with and the variable ratio transmission is connected directly across the input and output shafts. In both embodiments the rubber rollers are torsionally stressed to accumulate energy from the vehicle when the input shaft rotates faster or relative to the output shaft and are torsionally relaxed to deliver energy to the vehicle when the output shaft rotates faster or relative to the input shaft.
NASA Technical Reports Server (NTRS)
Carlson, C. R.
1981-01-01
The user documentation of the SYSGEN model and its links with other simulations is described. The SYSGEN is a production costing and reliability model of electric utility systems. Hydroelectric, storage, and time dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and reports formats are explained. SYSGEN also can be run interactively by using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables which is designed for use with FEPS is presented.
Dumuid, Dorothea; Olds, Timothy; Lewis, Lucy K; Martin-Fernández, Josep Antoni; Katzmarzyk, Peter T; Barreira, Tiago; Broyles, Stephanie T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Kuriyan, Rebecca; Kurpad, Anura; Lambert, Estelle V; Maia, José; Matsudo, Victor; Onywera, Vincent O; Sarmiento, Olga L; Standage, Martyn; Tremblay, Mark S; Tudor-Locke, Catrine; Zhao, Pei; Gillison, Fiona; Maher, Carol
2017-04-01
To evaluate the relationship between children's lifestyles and health-related quality of life and to explore whether this relationship varies among children from different world regions. This study used cross-sectional data from the International Study of Childhood Obesity, Lifestyle and the Environment. Children (9-11 years) were recruited from sites in 12 nations (n = 5759). Clustering input variables were 24-hour accelerometry and self-reported diet and screen time. Health-related quality of life was self-reported with KIDSCREEN-10. Cluster analyses (using compositional analysis techniques) were performed on a site-wise basis. Lifestyle behavior cluster characteristics were compared between sites. The relationship between cluster membership and health-related quality of life was assessed with the use of linear models. Lifestyle behavior clusters were similar across the 12 sites, with clusters commonly characterized by (1) high physical activity (actives); (2) high sedentary behavior (sitters); (3) high screen time/unhealthy eating pattern (junk-food screenies); and (4) low screen time/healthy eating pattern and moderate physical activity/sedentary behavior (all-rounders). Health-related quality of life was greatest in the all-rounders cluster. Children from different world regions clustered into groups of similar lifestyle behaviors. Cluster membership was related to differing health-related quality of life, with children from the all-rounders cluster consistently reporting greatest health-related quality of life at sites around the world. Findings support the importance of a healthy combination of lifestyle behaviors in childhood: low screen time, healthy eating pattern, and balanced daily activity behaviors (physical activity and sedentary behavior). ClinicalTrials.gov: NCT01722500. Copyright © 2016 Elsevier Inc. All rights reserved.
Chang, Anthony C
2012-03-01
The preparticipation screening for athlete participation in sports typically entails a comprehensive medical and family history and a complete physical examination. A 12-lead electrocardiogram (ECG) can increase the likelihood of detecting cardiac diagnoses such as hypertrophic cardiomyopathy, but this diagnostic test as part of the screening process has engendered considerable controversy. The pro position is supported by argument that international screening protocols support its use, positive diagnosis has multiple benefits, history and physical examination are inadequate, primary prevention is essential, and the cost effectiveness is justified. Although the aforementioned myriad of justifications for routine ECG screening of young athletes can be persuasive, several valid contentions oppose supporting such a policy, namely, that the sudden death incidence is very (too) low, the ECG screening will be too costly, the false-positive rate is too high, resources will be allocated away from other diseases, and manpower is insufficient for its execution. Clinicians, including pediatric cardiologists, have an understandable proclivity for avoiding this prodigious national endeavor. The controversy, however, should not be focused on whether an inexpensive, noninvasive test such as an ECG should be mandated but should instead be directed at just how these tests for young athletes can be performed in the clinical imbroglio of these disease states (with variable genetic penetrance and phenotypic expression) with concomitant fiscal accountability and logistical expediency in this era of economic restraint. This monumental endeavor in any city or region requires two crucial elements well known to business scholars: implementation and execution. The eventual solution for the screening ECG dilemma requires a truly innovative and systematic approach that will liberate us from inadequate conventional solutions. Artificial intelligence, specifically the process termed "machine learning" and "neural networking," involves complex algorithms that allow computers to improve the decision-making process based on repeated input of empirical data (e.g., databases and ECGs). These elements all can be improved with a national database, evidence-based medicine, and in the near future, innovation that entails a Kurzweilian artificial intelligence infrastructure with machine learning and neural networking that will construct the ultimate clinical decision-making algorithm.
12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications
Code of Federal Regulations, 2013 CFR
2013-01-01
....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...
12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications
Code of Federal Regulations, 2011 CFR
2011-01-01
....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...
12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications
Code of Federal Regulations, 2012 CFR
2012-01-01
....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...
12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications
Code of Federal Regulations, 2014 CFR
2014-01-01
....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...
DEC Ada interface to Screen Management Guidelines (SMG)
NASA Technical Reports Server (NTRS)
Laomanachareon, Somsak; Lekkos, Anthony A.
1986-01-01
DEC's Screen Management Guidelines are the Run-Time Library procedures that perform terminal-independent screen management functions on a VT100-class terminal. These procedures assist users in designing, composing, and keeping track of complex images on a video screen. There are three fundamental elements in the screen management model: the pasteboard, the virtual display, and the virtual keyboard. The pasteboard is like a two-dimensional area on which a user places and manipulates screen displays. The virtual display is a rectangular part of the terminal screen to which a program writes data with procedure calls. The virtual keyboard is a logical structure for input operation associated with a physical keyboard. SMG can be called by all major VAX languages. Through Ada, predefined language Pragmas are used to interface with SMG. These features and elements of SMG are briefly discussed.
SPACEBAR: Kinematic design by computer graphics
NASA Technical Reports Server (NTRS)
Ricci, R. J.
1975-01-01
The interactive graphics computer program SPACEBAR, conceived to reduce the time and complexity associated with the development of kinematic mechanisms on the design board, was described. This program allows the direct design and analysis of mechanisms right at the terminal screen. All input variables, including linkage geometry, stiffness, and applied loading conditions, can be fed into or changed at the terminal and may be displayed in three dimensions. All mechanism configurations can be cycled through their range of travel and viewed in their various geometric positions. Output data includes geometric positioning in orthogonal coordinates of each node point in the mechanism, velocity and acceleration of the node points, and internal loads and displacements of the node points and linkages. All analysis calculations take at most a few seconds to complete. Output data can be viewed at the scope and also printed at the discretion of the user.
M-DAS: System for multispectral data analysis. [in Saginaw Bay, Michigan
NASA Technical Reports Server (NTRS)
Johnson, R. H.
1975-01-01
M-DAS is a ground data processing system designed for analysis of multispectral data. M-DAS operates on multispectral data from LANDSAT, S-192, M2S and other sources in CCT form. Interactive training by operator-investigators using a variable cursor on a color display was used to derive optimum processing coefficients and data on cluster separability. An advanced multivariate normal-maximum likelihood processing algorithm was used to produce output in various formats: color-coded film images, geometrically corrected map overlays, moving displays of scene sections, coverage tabulations and categorized CCTs. The analysis procedure for M-DAS involves three phases: (1) screening and training, (2) analysis of training data to compute performance predictions and processing coefficients, and (3) processing of multichannel input data into categorized results. Typical M-DAS applications involve iteration between each of these phases. A series of photographs of the M-DAS display are used to illustrate M-DAS operation.
The Effect of Visual Variability on the Learning of Academic Concepts.
Bourgoyne, Ashley; Alt, Mary
2017-06-10
The purpose of this study was to identify effects of variability of visual input on development of conceptual representations of academic concepts for college-age students with normal language (NL) and those with language-learning disabilities (LLD). Students with NL (n = 11) and LLD (n = 11) participated in a computer-based training for introductory biology course concepts. Participants were trained on half the concepts under a low-variability condition and half under a high-variability condition. Participants completed a posttest in which they were asked to identify and rate the accuracy of novel and trained visual representations of the concepts. We performed separate repeated measures analyses of variance to examine the accuracy of identification and ratings. Participants were equally accurate on trained and novel items in the high-variability condition, but were less accurate on novel items only in the low-variability condition. The LLD group showed the same pattern as the NL group; they were just less accurate. Results indicated that high-variability visual input may facilitate the acquisition of academic concepts in college students with NL and LLD. High-variability visual input may be especially beneficial for generalization to novel representations of concepts. Implicit learning methods may be harnessed by college courses to provide students with basic conceptual knowledge when they are entering courses or beginning new units.
NASA Technical Reports Server (NTRS)
1993-01-01
The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert the program to computers other than the VAX computer. Documentation for each subroutine or function based on providing the definitions of the variables and a source listing are included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included which provide a listing of the Root-Sum-Square (RSS) program, a listing of subroutine names and definitions used in the MASTRE User Friendly Interface Program, and listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program is used to aid in the performance of dispersion analyses. The RSS program reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. UFI Program provides a screen user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.
Whiteway, Matthew R; Butts, Daniel A
2017-03-01
The activity of sensory cortical neurons is not only driven by external stimuli but also shaped by other sources of input to the cortex. Unlike external stimuli, these other sources of input are challenging to experimentally control, or even observe, and as a result contribute to variability of neural responses to sensory stimuli. However, such sources of input are likely not "noise" and may play an integral role in sensory cortex function. Here we introduce the rectified latent variable model (RLVM) in order to identify these sources of input using simultaneously recorded cortical neuron populations. The RLVM is novel in that it employs nonnegative (rectified) latent variables and is much less restrictive in the mathematical constraints on solutions because of the use of an autoencoder neural network to initialize model parameters. We show that the RLVM outperforms principal component analysis, factor analysis, and independent component analysis, using simulated data across a range of conditions. We then apply this model to two-photon imaging of hundreds of simultaneously recorded neurons in mouse primary somatosensory cortex during a tactile discrimination task. Across many experiments, the RLVM identifies latent variables related to both the tactile stimulation as well as nonstimulus aspects of the behavioral task, with a majority of activity explained by the latter. These results suggest that properly identifying such latent variables is necessary for a full understanding of sensory cortical function and demonstrate novel methods for leveraging large population recordings to this end. NEW & NOTEWORTHY The rapid development of neural recording technologies presents new opportunities for understanding patterns of activity across neural populations. Here we show how a latent variable model with appropriate nonlinear form can be used to identify sources of input to a neural population and infer their time courses. Furthermore, we demonstrate how these sources are related to behavioral contexts outside of direct experimental control. Copyright © 2017 the American Physiological Society.
Not All Children Agree: Acquisition of Agreement when the Input Is Variable
ERIC Educational Resources Information Center
Miller, Karen
2012-01-01
In this paper we investigate the effect of variable input on the acquisition of grammar. More specifically, we examine the acquisition of the third person singular marker -s on the auxiliary "do" in comprehension and production in two groups of children who are exposed to similar varieties of English but that differ with respect to adult…
Boonjing, Veera; Intakosum, Sarun
2016-01-01
This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span. PMID:27974883
Inthachot, Montri; Boonjing, Veera; Intakosum, Sarun
2016-01-01
This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span.
Variability in Institutional Screening Practices Related to Collegiate Student-Athlete Mental Health
Kroshus, Emily
2016-01-01
Context: Universal screening for mental health concerns, as part of the preparticipation examination in collegiate sports medicine settings, can be an important and feasible strategy for facilitating early detection of mental health disorders. Objective: To assess whether sports medicine departments at National Collegiate Athletic Association (NCAA) member colleges have policies related to identifying student-athlete mental health problems, the nature of preparticipation examination screening related to mental health, and whether other departmental or institutional screening initiatives are in place. I also aimed to characterize the variability in screening by institutional characteristics. Design: Cross-sectional study. Setting: College sports medicine departments. Patients or Other Participants: Team physicians and head athletic trainers at NCAA member colleges (n = 365, 30.3% response rate). Main Outcome Measure(s): Electronic survey of departmental mental health screening activities. Results: A total of 39% of respondents indicated that their institution had a written plan related to identifying student-athletes with mental health concerns. Fewer than half reported that their sports medicine department administers a written or verbal screening instrument for symptoms of disordered eating (44.5%), depression (32.3%), or anxiety (30.7%). The strongest predictors of mental health screening were the presence of a written plan related to identifying student-athlete mental health concerns and the employment of a clinical psychologist. Additionally, Division I institutions and institutions with a greater ratio of athletic trainers to student-athletes tended to engage in more screening. Conclusions: The substantial among-institutions variability in mental health screening suggests that opportunities exist to make these practices more widespread. To address this variability, recent NCAA mental health best-practice guidelines suggested that institutions should screen for a range of mental health disorders and risk behaviors. However, at some institutions, staffing deficits may need to be addressed to allow for implementation of screening-related activities. PMID:27111587
INFANT HEALTH PRODUCTION FUNCTIONS: WHAT A DIFFERENCE THE DATA MAKE
Reichman, Nancy E.; Corman, Hope; Noonan, Kelly; Dave, Dhaval
2008-01-01
SUMMARY We examine the extent to which infant health production functions are sensitive to model specification and measurement error. We focus on the importance of typically unobserved but theoretically important variables (typically unobserved variables, TUVs), other non-standard covariates (NSCs), input reporting, and characterization of infant health. The TUVs represent wantedness, taste for risky behavior, and maternal health endowment. The NSCs include father characteristics. We estimate the effects of prenatal drug use, prenatal cigarette smoking, and First trimester prenatal care on birth weight, low birth weight, and a measure of abnormal infant health conditions. We compare estimates using self-reported inputs versus input measures that combine information from medical records and self-reports. We find that TUVs and NSCs are significantly associated with both inputs and outcomes, but that excluding them from infant health production functions does not appreciably affect the input estimates. However, using self-reported inputs leads to overestimated effects of inputs, particularly prenatal care, on outcomes, and using a direct measure of infant health does not always yield input estimates similar to those when using birth weight outcomes. The findings have implications for research, data collection, and public health policy. PMID:18792077
Thomas, R.E.
1959-01-20
An electronic circuit is presented for automatically computing the product of two selected variables by multiplying the voltage pulses proportional to the variables. The multiplier circuit has a plurality of parallel resistors of predetermined values connected through separate gate circults between a first input and the output terminal. One voltage pulse is applied to thc flrst input while the second voltage pulse is applied to control circuitry for the respective gate circuits. Thc magnitude of the second voltage pulse selects the resistors upon which the first voltage pulse is imprcssed, whereby the resultant output voltage is proportional to the product of the input voltage pulses
Information Extraction for Clinical Data Mining: A Mammography Case Study
Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David
2013-01-01
Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts’ input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level. PMID:23765123
Information Extraction for Clinical Data Mining: A Mammography Case Study.
Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David
2009-01-01
Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F 1 -score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
Rispoli, Matthew; Holt, Janet K.
2017-01-01
Purpose This follow-up study examined whether a parent intervention that increased the diversity of lexical noun phrase subjects in parent input and accelerated children's sentence diversity (Hadley et al., 2017) had indirect benefits on tense/agreement (T/A) morphemes in parent input and children's spontaneous speech. Method Differences in input variables related to T/A marking were compared for parents who received toy talk instruction and a quasi-control group: input informativeness and full is declaratives. Language growth on tense agreement productivity (TAP) was modeled for 38 children from language samples obtained at 21, 24, 27, and 30 months. Parent input properties following instruction and children's growth in lexical diversity and sentence diversity were examined as predictors of TAP growth. Results Instruction increased parent use of full is declaratives (ηp 2 ≥ .25) but not input informativeness. Children's sentence diversity was also a significant time-varying predictor of TAP growth. Two input variables, lexical noun phrase subject diversity and full is declaratives, were also significant predictors, even after controlling for children's sentence diversity. Conclusions These findings establish a link between children's sentence diversity and the development of T/A morphemes and provide evidence about characteristics of input that facilitate growth in this grammatical system. PMID:28892819
Wesolowski, Edwin A.
1996-01-01
Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for icefree conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions.To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen.The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwatersource specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwatersource and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14.During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specificconductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are pointsource total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient. The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
Kernel-PCA data integration with enhanced interpretability
2014-01-01
Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. One of the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly the right kernel is chosen for each data set; secondly the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge. PMID:25032747
NASA Astrophysics Data System (ADS)
Forsythe, N.; Blenkinsop, S.; Fowler, H. J.
2015-05-01
A three-step climate classification was applied to a spatial domain covering the Himalayan arc and adjacent plains regions using input data from four global meteorological reanalyses. Input variables were selected based on an understanding of the climatic drivers of regional water resource variability and crop yields. Principal component analysis (PCA) of those variables and k-means clustering on the PCA outputs revealed a reanalysis ensemble consensus for eight macro-climate zones. Spatial statistics of input variables for each zone revealed consistent, distinct climatologies. This climate classification approach has potential for enhancing assessment of climatic influences on water resources and food security as well as for characterising the skill and bias of gridded data sets, both meteorological reanalyses and climate models, for reproducing subregional climatologies. Through their spatial descriptors (area, geographic centroid, elevation mean range), climate classifications also provide metrics, beyond simple changes in individual variables, with which to assess the magnitude of projected climate change. Such sophisticated metrics are of particular interest for regions, including mountainous areas, where natural and anthropogenic systems are expected to be sensitive to incremental climate shifts.
NASA Astrophysics Data System (ADS)
Hao, Wenrui; Lu, Zhenzhou; Li, Luyi
2013-05-01
In order to explore the contributions by correlated input variables to the variance of the output, a novel interpretation framework of importance measure indices is proposed for a model with correlated inputs, which includes the indices of the total correlated contribution and the total uncorrelated contribution. The proposed indices accurately describe the connotations of the contributions by the correlated input to the variance of output, and they can be viewed as the complement and correction of the interpretation about the contributions by the correlated inputs presented in "Estimation of global sensitivity indices for models with dependent variables, Computer Physics Communications, 183 (2012) 937-946". Both of them contain the independent contribution by an individual input. Taking the general form of quadratic polynomial as an illustration, the total correlated contribution and the independent contribution by an individual input are derived analytically, from which the components and their origins of both contributions of correlated input can be clarified without any ambiguity. In the special case that no square term is included in the quadratic polynomial model, the total correlated contribution by the input can be further decomposed into the variance contribution related to the correlation of the input with other inputs and the independent contribution by the input itself, and the total uncorrelated contribution can be further decomposed into the independent part by interaction between the input and others and the independent part by the input itself. Numerical examples are employed and their results demonstrate that the derived analytical expressions of the variance-based importance measure are correct, and the clarification of the correlated input contribution to model output by the analytical derivation is very important for expanding the theory and solutions of uncorrelated input to those of the correlated one.
A Novel Approach for Efficient Pharmacophore-based Virtual Screening: Method and Applications
Dror, Oranit; Schneidman-Duhovny, Dina; Inbar, Yuval; Nussinov, Ruth; Wolfson, Haim J.
2009-01-01
Virtual screening is emerging as a productive and cost-effective technology in rational drug design for the identification of novel lead compounds. An important model for virtual screening is the pharmacophore. Pharmacophore is the spatial configuration of essential features that enable a ligand molecule to interact with a specific target receptor. In the absence of a known receptor structure, a pharmacophore can be identified from a set of ligands that have been observed to interact with the target receptor. Here, we present a novel computational method for pharmacophore detection and virtual screening. The pharmacophore detection module is able to: (i) align multiple flexible ligands in a deterministic manner without exhaustive enumeration of the conformational space, (ii) detect subsets of input ligands that may bind to different binding sites or have different binding modes, (iii) address cases where the input ligands have different affinities by defining weighted pharmacophores based on the number of ligands that share them, and (iv) automatically select the most appropriate pharmacophore candidates for virtual screening. The algorithm is highly efficient, allowing a fast exploration of the chemical space by virtual screening of huge compound databases. The performance of PharmaGist was successfully evaluated on a commonly used dataset of G-Protein Coupled Receptor alpha1A. Additionally, a large-scale evaluation using the DUD (directory of useful decoys) dataset was performed. DUD contains 2950 active ligands for 40 different receptors, with 36 decoy compounds for each active ligand. PharmaGist enrichment rates are comparable with other state-of-the-art tools for virtual screening. Availability The software is available for download. A user-friendly web interface for pharmacophore detection is available at http://bioinfo3d.cs.tau.ac.il/PharmaGist. PMID:19803502
Modeling nitrate at domestic and public-supply well depths in the Central Valley, California
Nolan, Bernard T.; Gronberg, JoAnn M.; Faunt, Claudia C.; Eberts, Sandra M.; Belitz, Ken
2014-01-01
Aquifer vulnerability models were developed to map groundwater nitrate concentration at domestic and public-supply well depths in the Central Valley, California. We compared three modeling methods for ability to predict nitrate concentration >4 mg/L: logistic regression (LR), random forest classification (RFC), and random forest regression (RFR). All three models indicated processes of nitrogen fertilizer input at the land surface, transmission through coarse-textured, well-drained soils, and transport in the aquifer to the well screen. The total percent correct predictions were similar among the three models (69–82%), but RFR had greater sensitivity (84% for shallow wells and 51% for deep wells). The results suggest that RFR can better identify areas with high nitrate concentration but that LR and RFC may better describe bulk conditions in the aquifer. A unique aspect of the modeling approach was inclusion of outputs from previous, physically based hydrologic and textural models as predictor variables, which were important to the models. Vertical water fluxes in the aquifer and percent coarse material above the well screen were ranked moderately high-to-high in the RFR models, and the average vertical water flux during the irrigation season was highly significant (p < 0.0001) in logistic regression.
Blade loss transient dynamics analysis. Volume 3: User's manual for TETRA program
NASA Technical Reports Server (NTRS)
Black, G. R.; Gallardo, V. C.; Storace, A. S.; Sagendorph, F.
1981-01-01
The users manual for TETRA contains program logic, flow charts, error messages, input sheets, modeling instructions, option descriptions, input variable descriptions, and demonstration problems. The process of obtaining a NASTRAN 17.5 generated modal input file for TETRA is also described with a worked sample.
The appropriate spatial scale for a distributed energy balance model was investigated by: (a) determining the scale of variability associated with the remotely sensed and GIS-generated model input data; and (b) examining the effects of input data spatial aggregation on model resp...
Multiplexer and time duration measuring circuit
Gray, Jr., James
1980-01-01
A multiplexer device is provided for multiplexing data in the form of randomly developed, variable width pulses from a plurality of pulse sources to a master storage. The device includes a first multiplexer unit which includes a plurality of input circuits each coupled to one of the pulse sources, with all input circuits being disabled when one input circuit receives an input pulse so that only one input pulse is multiplexed by the multiplexer unit at any one time.
2008-06-09
This is a photo of an engineering model of the Thermal and Evolved-Gas Analyzer TEGA instrument on board NASA Phoenix Mars Lander. This view shows a TEGA oven-loading mechanism beneath the input screen.
The impact of 14-nm photomask uncertainties on computational lithography solutions
NASA Astrophysics Data System (ADS)
Sturtevant, John; Tejnil, Edita; Lin, Tim; Schultze, Steffen; Buck, Peter; Kalk, Franklin; Nakagawa, Kent; Ning, Guoxiang; Ackmann, Paul; Gans, Fritz; Buergel, Christian
2013-04-01
Computational lithography solutions rely upon accurate process models to faithfully represent the imaging system output for a defined set of process and design inputs. These models, which must balance accuracy demands with simulation runtime boundary conditions, rely upon the accurate representation of multiple parameters associated with the scanner and the photomask. While certain system input variables, such as scanner numerical aperture, can be empirically tuned to wafer CD data over a small range around the presumed set point, it can be dangerous to do so since CD errors can alias across multiple input variables. Therefore, many input variables for simulation are based upon designed or recipe-requested values or independent measurements. It is known, however, that certain measurement methodologies, while precise, can have significant inaccuracies. Additionally, there are known errors associated with the representation of certain system parameters. With shrinking total CD control budgets, appropriate accounting for all sources of error becomes more important, and the cumulative consequence of input errors to the computational lithography model can become significant. In this work, we examine with a simulation sensitivity study, the impact of errors in the representation of photomask properties including CD bias, corner rounding, refractive index, thickness, and sidewall angle. The factors that are most critical to be accurately represented in the model are cataloged. CD Bias values are based on state of the art mask manufacturing data and other variables changes are speculated, highlighting the need for improved metrology and awareness.
Prediction of problematic wine fermentations using artificial neural networks.
Román, R César; Hernández, O Gonzalo; Urtubia, U Alejandra
2011-11-01
Artificial neural networks (ANNs) have been used for the recognition of non-linear patterns, a characteristic of bioprocesses like wine production. In this work, ANNs were tested to predict problems of wine fermentation. A database of about 20,000 data from industrial fermentations of Cabernet Sauvignon and 33 variables was used. Two different ways of inputting data into the model were studied, by points and by fermentation. Additionally, different sub-cases were studied by varying the predictor variables (total sugar, alcohol, glycerol, density, organic acids and nitrogen compounds) and the time of fermentation (72, 96 and 256 h). The input of data by fermentations gave better results than the input of data by points. In fact, it was possible to predict 100% of normal and problematic fermentations using three predictor variables: sugars, density and alcohol at 72 h (3 days). Overall, ANNs were capable of obtaining 80% of prediction using only one predictor variable at 72 h; however, it is recommended to add more fermentations to confirm this promising result.
Amiryousefi, Mohammad Reza; Mohebbi, Mohebbat; Khodaiyan, Faramarz
2014-01-01
The objectives of this study were to use image analysis and artificial neural network (ANN) to predict mass transfer kinetics as well as color changes and shrinkage of deep-fat fried ostrich meat cubes. Two generalized feedforward networks were separately developed by using the operation conditions as inputs. Results based on the highest numerical quantities of the correlation coefficients between the experimental versus predicted values, showed proper fitting. Sensitivity analysis results of selected ANNs showed that among the input variables, frying temperature was the most sensitive to moisture content (MC) and fat content (FC) compared to other variables. Sensitivity analysis results of selected ANNs showed that MC and FC were the most sensitive to frying temperature compared to other input variables. Similarly, for the second ANN architecture, microwave power density was the most impressive variable having the maximum influence on both shrinkage percentage and color changes. Copyright © 2013 Elsevier Ltd. All rights reserved.
Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA
NASA Astrophysics Data System (ADS)
Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.
2018-04-01
Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3-) input functions by characterizing unsaturated zone NO3- transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous "vertical flux method" (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3- source concentration factor (which determines the local NO3- input concentration); unsaturated zone travel time; NO3- concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3- "extinction depth", the eventual steady state depth of the NO3- front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 - 0.86 and 0.22 - 0.38, respectively, and predictions were compiled as maps of the above response variables. Testing performance was reasonable, considering that we limited the metamodel predictor variables to mappable factors as opposed to using all available VFM input variables. Relationships between metamodel predictor variables and mapped outputs were generally consistent with expectations, e.g. with greater source concentrations and NO3- at the groundwater table in areas of intensive crop use and well drained soils. Shorter unsaturated zone travel times in poorly drained areas likely indicated preferential flow through clay soils, and a tendency for fine grained deposits to collocate with areas of shallower water table. Numerical estimates of groundwater recharge were important in the metamodels and may have been a proxy for N input and redox conditions in the northern FWP, which had shallow predicted NO3- extinction depth. The metamodel results provide proof-of-concept for regional characterization of unsaturated zone NO3- transport processes in a statistical framework based on readily mappable GIS input variables.
Two-Stage Variable Sample-Rate Conversion System
NASA Technical Reports Server (NTRS)
Tkacenko, Andre
2009-01-01
A two-stage variable sample-rate conversion (SRC) system has been pro posed as part of a digital signal-processing system in a digital com munication radio receiver that utilizes a variety of data rates. The proposed system would be used as an interface between (1) an analog- todigital converter used in the front end of the receiver to sample an intermediatefrequency signal at a fixed input rate and (2) digita lly implemented tracking loops in subsequent stages that operate at v arious sample rates that are generally lower than the input sample r ate. This Two-Stage System would be capable of converting from an input sample rate to a desired lower output sample rate that could be var iable and not necessarily a rational fraction of the input rate.
Luo, Zhongkui; Feng, Wenting; Luo, Yiqi; Baldock, Jeff; Wang, Enli
2017-10-01
Soil organic carbon (SOC) dynamics are regulated by the complex interplay of climatic, edaphic and biotic conditions. However, the interrelation of SOC and these drivers and their potential connection networks are rarely assessed quantitatively. Using observations of SOC dynamics with detailed soil properties from 90 field trials at 28 sites under different agroecosystems across the Australian cropping regions, we investigated the direct and indirect effects of climate, soil properties, carbon (C) inputs and soil C pools (a total of 17 variables) on SOC change rate (r C , Mg C ha -1 yr -1 ). Among these variables, we found that the most influential variables on r C were the average C input amount and annual precipitation, and the total SOC stock at the beginning of the trials. Overall, C inputs (including C input amount and pasture frequency in the crop rotation system) accounted for 27% of the relative influence on r C , followed by climate 25% (including precipitation and temperature), soil C pools 24% (including pool size and composition) and soil properties (such as cation exchange capacity, clay content, bulk density) 24%. Path analysis identified a network of intercorrelations of climate, soil properties, C inputs and soil C pools in determining r C . The direct correlation of r C with climate was significantly weakened if removing the effects of soil properties and C pools, and vice versa. These results reveal the relative importance of climate, soil properties, C inputs and C pools and their complex interconnections in regulating SOC dynamics. Ignorance of the impact of changes in soil properties, C pool composition and C input (quantity and quality) on SOC dynamics is likely one of the main sources of uncertainty in SOC predictions from the process-based SOC models. © 2017 John Wiley & Sons Ltd.
Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research
Carter-Harris, Lisa; Davis, Lorie L.; Rawl, Susan M.
2017-01-01
Purpose To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Methods Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Results Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. Conclusion This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development. PMID:28304262
Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research.
Carter-Harris, Lisa; Davis, Lorie L; Rawl, Susan M
2016-11-01
To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development.
Kumar, Mukesh; Singh, Amrinder; Beniwal, Vikas; Salar, Raj Kumar
2016-12-01
Tannase (tannin acyl hydrolase E.C 3.1.1.20) is an inducible, largely extracellular enzyme that causes the hydrolysis of ester and depside bonds present in various substrates. Large scale industrial application of this enzyme is very limited owing to its high production costs. In the present study, cost effective production of tannase by Klebsiella pneumoniae KP715242 was studied under submerged fermentation using different tannin rich agro-residues like Indian gooseberry leaves (Phyllanthus emblica), Black plum leaves (Syzygium cumini), Eucalyptus leaves (Eucalyptus glogus) and Babul leaves (Acacia nilotica). Among all agro-residues, Indian gooseberry leaves were found to be the best substrate for tannase production under submerged fermentation. Sequential optimization approach using Taguchi orthogonal array screening and response surface methodology was adopted to optimize the fermentation variables in order to enhance the enzyme production. Eleven medium components were screened primarily by Taguchi orthogonal array design to identify the most contributing factors towards the enzyme production. The four most significant contributing variables affecting tannase production were found to be pH (23.62 %), tannin extract (20.70 %), temperature (20.33 %) and incubation time (14.99 %). These factors were further optimized with central composite design using response surface methodology. Maximum tannase production was observed at 5.52 pH, 39.72 °C temperature, 91.82 h of incubation time and 2.17 % tannin content. The enzyme activity was enhanced by 1.26 fold under these optimized conditions. The present study emphasizes the use of agro-residues as a potential substrate with an aim to lower down the input costs for tannase production so that the enzyme could be used proficiently for commercial purposes.
Innovations in Basic Flight Training for the Indonesian Air Force
1990-12-01
microeconomic theory that could approximate the optimum mix of training hours between an aircraft and simulator, and therefore improve cost effectiveness...The microeconomic theory being used is normally employed when showing production with two variable inputs. An example of variable inputs would be labor...NAS Corpus Christi, Texas, Aerodynamics of the T-34C, 1989. 26. Naval Air Training Command, NAS Corpus Christi, Texas, Meteorological Theory Workbook
A Bayesian approach to model structural error and input variability in groundwater modeling
NASA Astrophysics Data System (ADS)
Xu, T.; Valocchi, A. J.; Lin, Y. F. F.; Liang, F.
2015-12-01
Effective water resource management typically relies on numerical models to analyze groundwater flow and solute transport processes. Model structural error (due to simplification and/or misrepresentation of the "true" environmental system) and input forcing variability (which commonly arises since some inputs are uncontrolled or estimated with high uncertainty) are ubiquitous in groundwater models. Calibration that overlooks errors in model structure and input data can lead to biased parameter estimates and compromised predictions. We present a fully Bayesian approach for a complete assessment of uncertainty for spatially distributed groundwater models. The approach explicitly recognizes stochastic input and uses data-driven error models based on nonparametric kernel methods to account for model structural error. We employ exploratory data analysis to assist in specifying informative prior for error models to improve identifiability. The inference is facilitated by an efficient sampling algorithm based on DREAM-ZS and a parameter subspace multiple-try strategy to reduce the required number of forward simulations of the groundwater model. We demonstrate the Bayesian approach through a synthetic case study of surface-ground water interaction under changing pumping conditions. It is found that explicit treatment of errors in model structure and input data (groundwater pumping rate) has substantial impact on the posterior distribution of groundwater model parameters. Using error models reduces predictive bias caused by parameter compensation. In addition, input variability increases parametric and predictive uncertainty. The Bayesian approach allows for a comparison among the contributions from various error sources, which could inform future model improvement and data collection efforts on how to best direct resources towards reducing predictive uncertainty.
NASA Astrophysics Data System (ADS)
Zounemat-Kermani, Mohammad
2012-08-01
In this study, the ability of two models of multi linear regression (MLR) and Levenberg-Marquardt (LM) feed-forward neural network was examined to estimate the hourly dew point temperature. Dew point temperature is the temperature at which water vapor in the air condenses into liquid. This temperature can be useful in estimating meteorological variables such as fog, rain, snow, dew, and evapotranspiration and in investigating agronomical issues as stomatal closure in plants. The availability of hourly records of climatic data (air temperature, relative humidity and pressure) which could be used to predict dew point temperature initiated the practice of modeling. Additionally, the wind vector (wind speed magnitude and direction) and conceptual input of weather condition were employed as other input variables. The three quantitative standard statistical performance evaluation measures, i.e. the root mean squared error, mean absolute error, and absolute logarithmic Nash-Sutcliffe efficiency coefficient ( {| {{{Log}}({{NS}})} |} ) were employed to evaluate the performances of the developed models. The results showed that applying wind vector and weather condition as input vectors along with meteorological variables could slightly increase the ANN and MLR predictive accuracy. The results also revealed that LM-NN was superior to MLR model and the best performance was obtained by considering all potential input variables in terms of different evaluation criteria.
NASA Technical Reports Server (NTRS)
Meyn, Larry A.
2018-01-01
One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
Soft Mixer Assignment in a Hierarchical Generative Model of Natural Scene Statistics
Schwartz, Odelia; Sejnowski, Terrence J.; Dayan, Peter
2010-01-01
Gaussian scale mixture models offer a top-down description of signal generation that captures key bottom-up statistical characteristics of filter responses to images. However, the pattern of dependence among the filters for this class of models is prespecified. We propose a novel extension to the gaussian scale mixture model that learns the pattern of dependence from observed inputs and thereby induces a hierarchical representation of these inputs. Specifically, we propose that inputs are generated by gaussian variables (modeling local filter structure), multiplied by a mixer variable that is assigned probabilistically to each input from a set of possible mixers. We demonstrate inference of both components of the generative model, for synthesized data and for different classes of natural images, such as a generic ensemble and faces. For natural images, the mixer variable assignments show invariances resembling those of complex cells in visual cortex; the statistics of the gaussian components of the model are in accord with the outputs of divisive normalization models. We also show how our model helps interrelate a wide range of models of image statistics and cortical processing. PMID:16999575
Rowe, Meredith L; Levine, Susan C; Fisher, Joan A; Goldin-Meadow, Susan
2009-01-01
Children with unilateral pre- or perinatal brain injury (BI) show remarkable plasticity for language learning. Previous work highlights the important role that lesion characteristics play in explaining individual variation in plasticity in the language development of children with BI. The current study examines whether the linguistic input that children with BI receive from their caregivers also contributes to this early plasticity, and whether linguistic input plays a similar role in children with BI as it does in typically developing (TD) children. Growth in vocabulary and syntactic production is modeled for 80 children (53 TD, 27 BI) between 14 and 46 months. Findings indicate that caregiver input is an equally potent predictor of vocabulary growth in children with BI and in TD children. In contrast, input is a more potent predictor of syntactic growth for children with BI than for TD children. Controlling for input, lesion characteristics (lesion size, type, seizure history) also affect the language trajectories of children with BI. Thus, findings illustrate how both variability in the environment (linguistic input) and variability in the organism (lesion characteristics) work together to contribute to plasticity in language learning.
Variable input observer for state estimation of high-rate dynamics
NASA Astrophysics Data System (ADS)
Hong, Jonathan; Cao, Liang; Laflamme, Simon; Dodson, Jacob
2017-04-01
High-rate systems operating in the 10 μs to 10 ms timescale are likely to experience damaging effects due to rapid environmental changes (e.g., turbulence, ballistic impact). Some of these systems could benefit from real-time state estimation to enable their full potential. Examples of such systems include blast mitigation strategies, automotive airbag technologies, and hypersonic vehicles. Particular challenges in high-rate state estimation include: 1) complex time varying nonlinearities of system (e.g. noise, uncertainty, and disturbance); 2) rapid environmental changes; 3) requirement of high convergence rate. Here, we propose using a Variable Input Observer (VIO) concept to vary the input space as the event unfolds. When systems experience high-rate dynamics, rapid changes in the system occur. To investigate the VIO's potential, a VIO-based neuro-observer is constructed and studied using experimental data collected from a laboratory impact test. Results demonstrate that the input space is unique to different impact conditions, and that adjusting the input space throughout the dynamic event produces better estimations than using a traditional fixed input space strategy.
Screening large-scale association study data: exploiting interactions using random forests.
Lunetta, Kathryn L; Hayward, L Brooke; Segal, Jonathan; Van Eerdewegh, Paul
2004-12-10
Genome-wide association studies for complex diseases will produce genotypes on hundreds of thousands of single nucleotide polymorphisms (SNPs). A logical first approach to dealing with massive numbers of SNPs is to use some test to screen the SNPs, retaining only those that meet some criterion for further study. For example, SNPs can be ranked by p-value, and those with the lowest p-values retained. When SNPs have large interaction effects but small marginal effects in a population, they are unlikely to be retained when univariate tests are used for screening. However, model-based screens that pre-specify interactions are impractical for data sets with thousands of SNPs. Random forest analysis is an alternative method that produces a single measure of importance for each predictor variable that takes into account interactions among variables without requiring model specification. Interactions increase the importance for the individual interacting variables, making them more likely to be given high importance relative to other variables. We test the performance of random forests as a screening procedure to identify small numbers of risk-associated SNPs from among large numbers of unassociated SNPs using complex disease models with up to 32 loci, incorporating both genetic heterogeneity and multi-locus interaction. Keeping other factors constant, if risk SNPs interact, the random forest importance measure significantly outperforms the Fisher Exact test as a screening tool. As the number of interacting SNPs increases, the improvement in performance of random forest analysis relative to Fisher Exact test for screening also increases. Random forests perform similarly to the univariate Fisher Exact test as a screening tool when SNPs in the analysis do not interact. In the context of large-scale genetic association studies where unknown interactions exist among true risk-associated SNPs or SNPs and environmental covariates, screening SNPs using random forest analyses can significantly reduce the number of SNPs that need to be retained for further study compared to standard univariate screening methods.
Pointright: a system to redirect mouse and keyboard control among multiple machines
Johanson, Bradley E [Palo Alto, CA; Winograd, Terry A [Stanford, CA; Hutchins, Gregory M [Mountain View, CA
2008-09-30
The present invention provides a software system, PointRight, that allows for smooth and effortless control of pointing and input devices among multiple displays. With PointRight, a single free-floating mouse and keyboard can be used to control multiple screens. When the cursor reaches the edge of a screen it seamlessly moves to the adjacent screen and keyboard control is simultaneously redirected to the appropriate machine. Laptops may also redirect their keyboard and pointing device, and multiple pointers are supported simultaneously. The system automatically reconfigures itself as displays go on, go off, or change the machine they display.
Bertollo, David N; Alexander, Mary Jane; Shinn, Marybeth; Aybar, Jalila B
2007-06-01
This column describes the nonproprietary software Talker, used to adapt screening instruments to audio computer-assisted self-interviewing (ACASI) systems for low-literacy populations and other populations. Talker supports ease of programming, multiple languages, on-site scoring, and the ability to update a central research database. Key features include highly readable text display, audio presentation of questions and audio prompting of answers, and optional touch screen input. The scripting language for adapting instruments is briefly described as well as two studies in which respondents provided positive feedback on its use.
Developmental Screening Referrals: Child and Family Factors that Predict Referral Completion
ERIC Educational Resources Information Center
Jennings, Danielle J.; Hanline, Mary Frances
2013-01-01
This study researched the predictive impact of developmental screening results and the effects of child and family characteristics on completion of referrals given for evaluation. Logistical and hierarchical logistic regression analyses were used to determine the significance of 10 independent variables on the predictor variable. The number of…
Effect of solar loading on greenhouse containers used in transpiration efficiency screening
USDA-ARS?s Scientific Manuscript database
Earlier we described a simple high throughput method of screening sorghum for transpiration efficiency (TE). Subsequently it was observed that while results were consistent between lines exhibiting high and low TE, ranking between lines with similar TE was variable. We hypothesized that variable mic...
Hu, Qinglei
2007-10-01
This paper presents a dual-stage control system design method for the flexible spacecraft attitude maneuvering control by use of on-off thrusters and active vibration control by input shaper. In this design approach, attitude control system and vibration suppression were designed separately using lower order model. As a stepping stone, an integral variable structure controller with the assumption of knowing the upper bounds of the mismatched lumped perturbation has been designed which ensures exponential convergence of attitude angle and angular velocity in the presence of bounded uncertainty/disturbances. To reconstruct estimates of the system states for use in a full information variable structure control law, an asymptotic variable structure observer is also employed. In addition, the thruster output is modulated in pulse-width pulse-frequency so that the output profile is similar to the continuous control histories. For actively suppressing the induced vibration, the input shaping technique is used to modify the existing command so that less vibration will be caused by the command itself, which only requires information about the vibration frequency and damping of the closed-loop system. The rationale behind this hybrid control scheme is that the integral variable structure controller can achieve good precision pointing, even in the presence of uncertainties/disturbances, whereas the shaped input attenuator is applied to actively suppress the undesirable vibrations excited by the rapid maneuvers. Simulation results for the spacecraft model show precise attitude control and vibration suppression.
A waste characterisation procedure for ADM1 implementation based on degradation kinetics.
Girault, R; Bridoux, G; Nauleau, F; Poullain, C; Buffet, J; Steyer, J-P; Sadowski, A G; Béline, F
2012-09-01
In this study, a procedure accounting for degradation kinetics was developed to split the total COD of a substrate into each input state variable required for Anaerobic Digestion Model n°1. The procedure is based on the combination of batch experimental degradation tests ("anaerobic respirometry") and numerical interpretation of the results obtained (optimisation of the ADM1 input state variable set). The effects of the main operating parameters, such as the substrate to inoculum ratio in batch experiments and the origin of the inoculum, were investigated. Combined with biochemical fractionation of the total COD of substrates, this method enabled determination of an ADM1-consistent input state variable set for each substrate with affordable identifiability. The substrate to inoculum ratio in the batch experiments and the origin of the inoculum influenced input state variables. However, based on results modelled for a CSTR fed with the substrate concerned, these effects were not significant. Indeed, if the optimal ranges of these operational parameters are respected, uncertainty in COD fractionation is mainly limited to temporal variability of the properties of the substrates. As the method is based on kinetics and is easy to implement for a wide range of substrates, it is a very promising way to numerically predict the effect of design parameters on the efficiency of an anaerobic CSTR. This method thus promotes the use of modelling for the design and optimisation of anaerobic processes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Analysis of Gambling in the Media Related to Screens: Immersion as a Predictor of Excessive Use?
Rémond, Jean-Jacques; Romo, Lucia
2018-01-02
This study investigates the intricacies between the player interface proposed by the screens, (in particular on smartphone applications or in video games) and gambling. Recent research indicates connections between "immersion" and excessive screen practice. We want to understand the causal-effects between online gambling and the "immersion" variable and understand their relationship and its contingencies. This article empirically investigates whether and how it is possible to observe immersion with its sub-dimensions in gambling on different screens. The objective of this study was to analyze: (1) the costs and benefits associated with gambling practice on screens (2) the link between gambling practice and screen practice (video game, Internet, mobile screen); (3) to observe the propensity to immersion for individuals practicing gambling on screens; and (4) to examine the comorbidities and cognitive factors associated with the practice of gambling on screen. A total of 432 adults (212 men, 220 women), recruited from Ile-de-France (France), responded to a battery of questionnaires. Our study suggests that immersion variables make it possible to understand the cognitive participation of individuals towards screens in general, the practice of gambling on screens and the excessive practice of screens.
Analysis of Gambling in the Media Related to Screens: Immersion as a Predictor of Excessive Use?
Rémond, Jean-Jacques; Romo, Lucia
2018-01-01
This study investigates the intricacies between the player interface proposed by the screens, (in particular on smartphone applications or in video games) and gambling. Recent research indicates connections between “immersion” and excessive screen practice. We want to understand the causal-effects between online gambling and the “immersion” variable and understand their relationship and its contingencies. This article empirically investigates whether and how it is possible to observe immersion with its sub-dimensions in gambling on different screens. The objective of this study was to analyze: (1) the costs and benefits associated with gambling practice on screens (2) the link between gambling practice and screen practice (video game, Internet, mobile screen); (3) to observe the propensity to immersion for individuals practicing gambling on screens; and (4) to examine the comorbidities and cognitive factors associated with the practice of gambling on screen. A total of 432 adults (212 men, 220 women), recruited from Ile-de-France (France), responded to a battery of questionnaires. Our study suggests that immersion variables make it possible to understand the cognitive participation of individuals towards screens in general, the practice of gambling on screens and the excessive practice of screens. PMID:29301311
NASA Astrophysics Data System (ADS)
Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris
2015-04-01
Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. For addressing this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axis, multi-factor stacked bar charts and interactive semi-automated filtering for input and output data together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach by two examples for gas storage applications. For the first example our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamical equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. For the second example our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrothite in presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions. This includes low temperatures and absence of microbial catalysators. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.
Marino, Dale J; Starr, Thomas B
2007-12-01
A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in the slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources including the current effort even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.
NETL CO 2 Storage prospeCtive Resource Estimation Excel aNalysis (CO 2-SCREEN) User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanguinito, Sean M.; Goodman, Angela; Levine, Jonathan
This user’s manual guides the use of the National Energy Technology Laboratory’s (NETL) CO 2 Storage prospeCtive Resource Estimation Excel aNalysis (CO 2-SCREEN) tool, which was developed to aid users screening saline formations for prospective CO 2 storage resources. CO 2- SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO 2 storage resources for saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO 2 storage resources that allows for consistent comparison of results between different research efforts, such as the Regional Carbon Sequestration Partnershipsmore » (RCSP). CO 2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO 2 storage resources via Monte Carlo simulation.« less
Variable Delay Element For Jitter Control In High Speed Data Links
Livolsi, Robert R.
2002-06-11
A circuit and method for decreasing the amount of jitter present at the receiver input of high speed data links which uses a driver circuit for input from a high speed data link which comprises a logic circuit having a first section (1) which provides data latches, a second section (2) which provides a circuit generates a pre-destorted output and for compensating for level dependent jitter having an OR function element and a NOR function element each of which is coupled to two inputs and to a variable delay element as an input which provides a bi-modal delay for pulse width pre-distortion, a third section (3) which provides a muxing circuit, and a forth section (4) for clock distribution in the driver circuit. A fifth section is used for logic testing the driver circuit.
A liquid lens switching-based motionless variable fiber-optic delay line
NASA Astrophysics Data System (ADS)
Khwaja, Tariq Shamim; Reza, Syed Azer; Sheikh, Mumtaz
2018-05-01
We present a Variable Fiber-Optic Delay Line (VFODL) module capable of imparting long variable delays by switching an input optical/RF signal between Single Mode Fiber (SMF) patch cords of different lengths through a pair of Electronically Controlled Tunable Lenses (ECTLs) resulting in a polarization-independent operation. Depending on intended application, the lengths of the SMFs can be chosen accordingly to achieve the desired VFODL operation dynamic range. If so desired, the state of the input signal polarization can be preserved with the use of commercially available polarization-independent ECTLs along with polarization-maintaining SMFs (PM-SMFs), resulting in an output polarization that is identical to the input. An ECTL-based design also improves power consumption and repeatability. The delay switching mechanism is electronically-controlled, involves no bulk moving parts, and can be fully-automated. The VFODL module is compact due to the use of small optical components and SMFs that can be packaged compactly.
[Reasearch progress in health economic evaluation of colorectal cancer screening in China].
Huang, Huiyao; Shi, Jufang; Dai, Min
2015-08-01
Burden of colorectal cancer is rising in China. More attention and financial input have been paid to it by central government that colorectal cancer screening program has been carried out recently in many areas in China. Diversity of screening strategies and limited health resources render selecting the best strategy in a population-wide program a challenging task that economy was also required to be considered except safety and efficacy. To provide a reference for the subsequent further economic evaluation, here we reviewed the evidence available on the economic evaluation of colorectal cancer screening in China. Meanwhile, information related to screening strategies, participation and mid-term efficacy of screening, information and results on economic evaluation were extracted and summarized. Three of the four studies finally included evaluated strategies combining immunochemical fecel occult blood test (iFOBT) with high-risk factor questionnaire as initial screening, colonoscopy as diagnostic screening. There was a consensus regarding the efficacy and effectiveness of screening compared to no screening. Whereas the lack and poor comparability between studies, multi-perspective and multi-phase economic evaluation of colorectal cancer screening is needed, relying on current population-based screening program to conduct a comprehensive cost accounting.
Smith, Sian K; Sousa, Mariana S; Essink-Bot, Marie-Louise; Halliday, Jane; Peate, Michelle; Fransen, Mirjam
2016-08-01
Supporting pregnant women to make informed choices about Down syndrome screening is widely endorsed. We reviewed the literature on: (a) the association between socioeconomic position and informed choices and decision-making about Down syndrome screening, and (b) the possible mediating variables (e.g., health literacy, numeracy skills, behavioral and communication variables) that might explain the relationship. EMBASE, MEDLINE, PubMed, CINAHL, and PsycINFO were searched from January 1999 to September 2014. The methodological quality of studies was determined by predefined criteria regarding the research aims, study design, study population and setting, measurement tools, and statistical analysis. A total of 33 studies met the inclusion criteria. Women from lower socioeconomic groups experience greater difficulties making informed choices about Down syndrome screening compared to women from higher socioeconomic groups. Most studies focus on individual dimensions of informed decision-making rather than assessing elements in conjunction with one another. Few studies have explored why there are socioeconomic differences in women's ability to make informed screening decisions. Future work is needed to identify mediating variables in this pathway. Systematic evidence-based intervention development to improve communication, understanding, and decision-making about Down syndrome screening is needed to ensure that women have an equal opportunity to make an informed choice about screening regardless of their socioeconomic position.
Modems and More: The Computer Branches Out.
ERIC Educational Resources Information Center
Dyrli, Odvard Egil
1986-01-01
Surveys new "peripherals," electronic devices that attach to computers. Devices such as videodisc players, desktop laser printers, large screen projectors, and input mechanisms that circumvent the keyboard dramatically expand the computer's instructional uses. (Author/LHW)
Lentic small water bodies: Variability of pesticide transport and transformation patterns.
Ulrich, Uta; Hörmann, Georg; Unger, Malte; Pfannerstill, Matthias; Steinmann, Frank; Fohrer, Nicola
2018-03-15
Lentic small water bodies have a high ecological potential as they fulfill several ecosystem services such as the retention of water and pollutants. They serve as a hot spot of biodiversity. Due to their location in or adjacent to agricultural fields, they can be influenced by inputs of pesticides and their transformation products. Since small water bodies have rarely been part of monitorings/campaigns up to now, their current exposure and processes guiding the pesticide input are not understood, yet. This study presents results of a sampling campaign of 10 lentic small water bodies from 2015 to 2016. They were sampled once after the spring application for a pesticide target screening, before autumn application and three times after rainfall events following the application. The autumn sampling focused on the herbicides metazachlor, flufenacet and their transformation products - oxalic acid and - sulfonic acid as representatives for common pesticides in the study region. The concentrations were associated with rainfall before and after application, characteristics of the site and the water bodies, physicochemical parameters and the applied amount of pesticides. The key results of the pesticide screening in spring indicate positive detections of pesticides which have not been applied for years to the single fields. The autumn sampling showed frequent occurrences of the transformation products, which are formed in soil, from 39% to 94% of all samples (n=71). Discharge patterns were observed for metazachlor with highest concentrations in the first sample after application and then decreasing, but not for flufenacet. The concentrations of the transformation products increased over time and revealed highest values mainly in the last sample. Besides rainfall patterns right after application, the spatial and temporal dissemination of the pesticides to the water bodies seems to play a major role to understand the exposure of lentic small water bodies. Copyright © 2017 Elsevier B.V. All rights reserved.
Aerosol climatology using a tunable spectral variability cloud screening of AERONET data
NASA Technical Reports Server (NTRS)
Kaufman, Yoram J.; Gobbi, Gian Paolo; Koren, Ilan
2005-01-01
Can cloud screening of an aerosol data set, affect the aerosol optical thickness (AOT) climatology? Aerosols, humidity and clouds are correlated. Therefore, rigorous cloud screening can systematically bias towards less cloudy conditions, underestimating the average AOT. Here, using AERONET data we show that systematic rejection of variable atmospheric optical conditions can generate such bias in the average AOT. Therefore we recommend (1) to introduce more powerful spectral variability cloud screening and (2) to change the philosophy behind present aerosol climatologies: Instead of systematically rejecting all cloud contaminations, we suggest to intentionally allow the presence of cloud contamination, estimate the statistical impact of the contamination and correct for it. The analysis, applied to 10 AERONET stations with approx. 4 years of data, shows almost no change for Rome (Italy), but up to a change in AOT of 0.12 in Beijing (PRC). Similar technique may be explored for satellite analysis, e.g. MODIS.
Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation
NASA Astrophysics Data System (ADS)
Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten
2015-04-01
Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero but in a sampling-base framework they regularly take non-zero values. There is little guidance available for these two steps in environmental modelling though. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant
Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa
2013-09-17
System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA
Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.
2018-01-01
Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3−) input functions by characterizing unsaturated zone NO3− transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous “vertical flux method” (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3− source concentration factor (which determines the local NO3− input concentration); unsaturated zone travel time; NO3− concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3− “extinction depth”, the eventual steady state depth of the NO3−front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 – 0.86 and 0.22 – 0.38, respectively, and predictions were compiled as maps of the above response variables. Testing performance was reasonable, considering that we limited the metamodel predictor variables to mappable factors as opposed to using all available VFM input variables. Relationships between metamodel predictor variables and mapped outputs were generally consistent with expectations, e.g. with greater source concentrations and NO3− at the groundwater table in areas of intensive crop use and well drained soils. Shorter unsaturated zone travel times in poorly drained areas likely indicated preferential flow through clay soils, and a tendency for fine grained deposits to collocate with areas of shallower water table. Numerical estimates of groundwater recharge were important in the metamodels and may have been a proxy for N input and redox conditions in the northern FWP, which had shallow predicted NO3− extinction depth. The metamodel results provide proof-of-concept for regional characterization of unsaturated zone NO3− transport processes in a statistical framework based on readily mappable GIS input variables.
Fichez, R; Chifflet, S; Douillet, P; Gérard, P; Gutierrez, F; Jouon, A; Ouillon, S; Grenz, C
2010-01-01
Considering the growing concern about the impact of anthropogenic inputs on coral reefs and coral reef lagoons, surprisingly little attention has been given to the relationship between those inputs and the trophic status of lagoon waters. The present paper describes the distribution of biogeochemical parameters in the coral reef lagoon of New Caledonia where environmental conditions allegedly range from pristine oligotrophic to anthropogenically influenced. The study objectives were to: (i) identify terrigeneous and anthropogenic inputs and propose a typology of lagoon waters, (ii) determine temporal variability of water biogeochemical parameters at time-scales ranging from hours to seasons. Combined ACP-cluster analyses revealed that over the 2000 km(2) lagoon area around the city of Nouméa, "natural" terrigeneous versus oceanic influences affecting all stations only accounted for less than 20% of the spatial variability whereas 60% of that spatial variability could be attributed to significant eutrophication of a limited number of inshore stations. ACP analysis allowed to unambiguously discriminating between the natural trophic enrichment along the offshore-inshore gradient and anthropogenically induced eutrophication. High temporal variability in dissolved inorganic nutrients concentrations strongly hindered their use as indicators of environmental status. Due to longer turn over time, particulate organic material and more specifically chlorophyll a appeared as more reliable nonconservative tracer of trophic status. Results further provided evidence that ENSO occurrences might temporarily lower the trophic status of the New Caledonia lagoon. It is concluded that, due to such high frequency temporal variability, the use of biogeochemical parameters in environmental surveys require adapted sampling strategies, data management and environmental alert methods. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Screening strategies for atrial fibrillation: a systematic review and cost-effectiveness analysis.
Welton, Nicky J; McAleenan, Alexandra; Thom, Howard Hz; Davies, Philippa; Hollingworth, Will; Higgins, Julian Pt; Okoli, George; Sterne, Jonathan Ac; Feder, Gene; Eaton, Diane; Hingorani, Aroon; Fawsitt, Christopher; Lobban, Trudie; Bryden, Peter; Richards, Alison; Sofat, Reecha
2017-05-01
Atrial fibrillation (AF) is a common cardiac arrhythmia that increases the risk of thromboembolic events. Anticoagulation therapy to prevent AF-related stroke has been shown to be cost-effective. A national screening programme for AF may prevent AF-related events, but would involve a substantial investment of NHS resources. To conduct a systematic review of the diagnostic test accuracy (DTA) of screening tests for AF, update a systematic review of comparative studies evaluating screening strategies for AF, develop an economic model to compare the cost-effectiveness of different screening strategies and review observational studies of AF screening to provide inputs to the model. Systematic review, meta-analysis and cost-effectiveness analysis. Primary care. Adults. Screening strategies, defined by screening test, age at initial and final screens, screening interval and format of screening {systematic opportunistic screening [individuals offered screening if they consult with their general practitioner (GP)] or systematic population screening (when all eligible individuals are invited to screening)}. Sensitivity, specificity and diagnostic odds ratios; the odds ratio of detecting new AF cases compared with no screening; and the mean incremental net benefit compared with no screening. Two reviewers screened the search results, extracted data and assessed the risk of bias. A DTA meta-analysis was perfomed, and a decision tree and Markov model was used to evaluate the cost-effectiveness of the screening strategies. Diagnostic test accuracy depended on the screening test and how it was interpreted. In general, the screening tests identified in our review had high sensitivity (> 0.9). Systematic population and systematic opportunistic screening strategies were found to be similarly effective, with an estimated 170 individuals needed to be screened to detect one additional AF case compared with no screening. Systematic opportunistic screening was more likely to be cost-effective than systematic population screening, as long as the uptake of opportunistic screening observed in randomised controlled trials translates to practice. Modified blood pressure monitors, photoplethysmography or nurse pulse palpation were more likely to be cost-effective than other screening tests. A screening strategy with an initial screening age of 65 years and repeated screens every 5 years until age 80 years was likely to be cost-effective, provided that compliance with treatment does not decline with increasing age. A national screening programme for AF is likely to represent a cost-effective use of resources. Systematic opportunistic screening is more likely to be cost-effective than systematic population screening. Nurse pulse palpation or modified blood pressure monitors would be appropriate screening tests, with confirmation by diagnostic 12-lead electrocardiography interpreted by a trained GP, with referral to a specialist in the case of an unclear diagnosis. Implementation strategies to operationalise uptake of systematic opportunistic screening in primary care should accompany any screening recommendations. Many inputs for the economic model relied on a single trial [the Screening for Atrial Fibrillation in the Elderly (SAFE) study] and DTA results were based on a few studies at high risk of bias/of low applicability. Comparative studies measuring long-term outcomes of screening strategies and DTA studies for new, emerging technologies and to replicate the results for photoplethysmography and GP interpretation of 12-lead electrocardiography in a screening population. This study is registered as PROSPERO CRD42014013739. The National Institute for Health Research Health Technology Assessment programme.
Straver, J M; Janssen, A F W; Linnemann, A R; van Boekel, M A J S; Beumer, R R; Zwietering, M H
2007-09-01
This study aimed to characterize the number of Salmonella on chicken breast filet at the retail level and to evaluate if this number affects the risk of salmonellosis. From October to December 2005, 220 chilled raw filets (without skin) were collected from five local retail outlets in The Netherlands. Filet rinses that were positive after enrichment were enumerated with a three-tube most-probable-number (MPN) assay. Nineteen filets (8.6%) were contaminated above the detection limit of the MPN method (10 Salmonella per filet). The number of Salmonella on positive filets varied from 1 to 3.81 log MPN per filet. The obtained enumeration data were applied in a risk assessment model. The model considered possible growth during domestic storage, cross-contamination from filet via a cutting board to lettuce, and possible illness due to consumption of the prepared lettuce. A screening analysis with expected-case and worst-case estimates for the input values of the model showed that variability in the inputs was of relevance. Therefore, a Monte Carlo simulation with probability distributions for the inputs was carried out to predict the annual number of illnesses. Remarkably, over two-thirds of annual predicted illnesses were caused by the small fraction of filets containing more than 3 log Salmonella at retail (0.8% of all filets). The enumeration results can be used to confirm this hypothesis in a more elaborate risk assessment. Modeling of the supply chain can provide insight for possible intervention strategies to reduce the incidence of rare, but extreme levels. Reduction seems feasible within current practices, because the retail market study indicated a significant difference between suppliers.
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support to regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in-model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) by sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
Validation of a Nutrition Screening Tool for Pediatric Patients with Cystic Fibrosis.
Souza Dos Santos Simon, Miriam Isabel; Forte, Gabriele Carra; da Silva Pereira, Juliane; da Fonseca Andrade Procianoy, Elenara; Drehmer, Michele
2016-05-01
In cystic fibrosis (CF), nutrition diagnosis is of critical relevance because the early identification of nutrition-related compromise enables early, adequate intervention and, consequently, influences patient prognosis. Up to now, there has not been a validated nutrition screening tool that takes into consideration clinical variables. To validate a specific nutritional risk screening tool for patients with CF based on clinical variables, anthropometric parameters, and dietary intake. Cross-sectional study. The nutrition screening tool was compared with a risk screening tool proposed by McDonald and the Cystic Fibrosis Foundation criteria. Patients aged 6 to 18 years, with a diagnosis of CF confirmed by two determinations of elevated chloride level in sweat (sweat test) and/or by identification of two CF-associated genetic mutations who were receiving follow-up care through the outpatient clinic of a Cystic Fibrosis Treatment Center. Earlier identification of nutritional risk in CF patients aged 6 to 18 years when a new screening tool was applied. Agreement among the tested methods was assessed by means of the kappa coefficient for categorical variables. Sensitivity, specificity, and accuracy values were calculated. The significance level was set at 5% (P<0.05). Statistical analyses were carried out in PASW Statistics for Windows version 18.0 (2009, SPSS Inc). Eighty-two patients (49% men, aged 6 to 18 years) were enrolled in the study. The agreement between the proposed screening tool and the tool for screening nutritional risk for CF by the McDonald method was good (κ=0.804; P<0.001) and the sensitivity and specificity was 85% and 95%, respectively. Agreement with the Cystic Fibrosis Foundation criteria was lower (κ=0.418; P<0.001), and the sensitivity and specificity were both 72%. The proposed screening tool with defined clinical variables promotes earlier identification of nutritional risk in pediatric patients with CF. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Binary full adder, made of fusion gates, in a subexcitable Belousov-Zhabotinsky system
NASA Astrophysics Data System (ADS)
Adamatzky, Andrew
2015-09-01
In an excitable thin-layer Belousov-Zhabotinsky (BZ) medium a localized perturbation leads to the formation of omnidirectional target or spiral waves of excitation. A subexcitable BZ medium responds to asymmetric local perturbation by producing traveling localized excitation wave-fragments, distant relatives of dissipative solitons. The size and life span of an excitation wave-fragment depend on the illumination level of the medium. Under the right conditions the wave-fragments conserve their shape and velocity vectors for extended time periods. I interpret the wave-fragments as values of Boolean variables. When two or more wave-fragments collide they annihilate or merge into a new wave-fragment. States of the logic variables, represented by the wave-fragments, are changed in the result of the collision between the wave-fragments. Thus, a logical gate is implemented. Several theoretical designs and experimental laboratory implementations of Boolean logic gates have been proposed in the past but little has been done cascading the gates into binary arithmetical circuits. I propose a unique design of a binary one-bit full adder based on a fusion gate. A fusion gate is a two-input three-output logical device which calculates the conjunction of the input variables and the conjunction of one input variable with the negation of another input variable. The gate is made of three channels: two channels cross each other at an angle, a third channel starts at the junction. The channels contain a BZ medium. When two excitation wave-fragments, traveling towards each other along input channels, collide at the junction they merge into a single wave-front traveling along the third channel. If there is just one wave-front in the input channel, the front continues its propagation undisturbed. I make a one-bit full adder by cascading two fusion gates. I show how to cascade the adder blocks into a many-bit full adder. I evaluate the feasibility of my designs by simulating the evolution of excitation in the gates and adders using the numerical integration of Oregonator equations.
Pereira, Gilberto de Araujo; Louzada-Neto, Francisco; Barbosa, Valdirene de Fátima; Ferreira-Silva, Márcia Maria; de Moraes-Souza, Helio
2012-01-01
The frequent occurrence of inconclusive serology in blood banks and the absence of a gold standard test for Chagas'disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive serology screening in respect to some epidemiological variables. To obtain estimates of interest we considered a Bayesian latent class model with inclusion of covariates from the logit link. A better performance was observed with some categories of epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) presented as good alternatives for both screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology, means that probably most of these are non-reactive serology. In addition, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group from the pairs of tests using parallel testing. The epidemiological variables can lead to improved test results and thus assist in the clarification of inconclusive serology screening results. Moreover, all combinations of pairs using the five commercial tests are good alternatives to confirm results.
UWB delay and multiply receiver
Dallum, Gregory E.; Pratt, Garth C.; Haugen, Peter C.; Romero, Carlos E.
2013-09-10
An ultra-wideband (UWB) delay and multiply receiver is formed of a receive antenna; a variable gain attenuator connected to the receive antenna; a signal splitter connected to the variable gain attenuator; a multiplier having one input connected to an undelayed signal from the signal splitter and another input connected to a delayed signal from the signal splitter, the delay between the splitter signals being equal to the spacing between pulses from a transmitter whose pulses are being received by the receive antenna; a peak detection circuit connected to the output of the multiplier and connected to the variable gain attenuator to control the variable gain attenuator to maintain a constant amplitude output from the multiplier; and a digital output circuit connected to the output of the multiplier.
Alpha1 LASSO data bundles Lamont, OK
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Krishna, Bhargavi (ORCID:000000018828528X)
2016-08-03
A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES input includes model configuration information and forcing data. LES output includes profile statistics and full domain fields of cloud and environmental variables. Model evaluation data consists of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
Type 2 Diabetes Screening Test by Means of a Pulse Oximeter.
Moreno, Enrique Monte; Lujan, Maria Jose Anyo; Rusinol, Montse Torrres; Fernandez, Paqui Juarez; Manrique, Pilar Nunez; Trivino, Cristina Aragon; Miquel, Magda Pedrosa; Rodriguez, Marife Alvarez; Burguillos, M Jose Gonzalez
2017-02-01
In this paper, we propose a method for screening for the presence of type 2 diabetes by means of the signal obtained from a pulse oximeter. The screening system consists of two parts: the first analyzes the signal obtained from the pulse oximeter, and the second consists of a machine-learning module. The system consists of a front end that extracts a set of features form the pulse oximeter signal. These features are based on physiological considerations. The set of features were the input of a machine-learning algorithm that determined the class of the input sample, i.e., whether the subject had diabetes or not. The machine-learning algorithms were random forests, gradient boosting, and linear discriminant analysis as benchmark. The system was tested on a database of [Formula: see text] subjects (two samples per subject) collected from five community health centers. The mean receiver operating characteristic area found was [Formula: see text]% (median value [Formula: see text]% and range [Formula: see text]%), with a specificity = [Formula: see text]% for a threshold that gave a sensitivity = [Formula: see text]%. We present a screening method for detecting diabetes that has a performance comparable to the glycated haemoglobin (haemoglobin A1c HbA1c) test, does not require blood extraction, and yields results in less than 5 min.
Circling motion and screen edges as an alternative input method for on-screen target manipulation.
Ka, Hyun W; Simpson, Richard C
2017-04-01
To investigate a new alternative interaction method, called circling interface, for manipulating on-screen objects. To specify a target, the user makes a circling motion around the target. To specify a desired pointing command with the circling interface, each edge of the screen is used. The user selects a command before circling the target. To evaluate the circling interface, we conducted an experiment with 16 participants, comparing the performance on pointing tasks with different combinations of selection method (circling interface, physical mouse and dwelling interface) and input device (normal computer mouse, head pointer and joystick mouse emulator). A circling interface is compatible with many types of pointing devices, not requiring physical activation of mouse buttons, and is more efficient than dwell-clicking. Across all common pointing operations, the circling interface had a tendency to produce faster performance with a head-mounted mouse emulator than with a joystick mouse. The performance accuracy of the circling interface outperformed the dwelling interface. It was demonstrated that the circling interface has the potential as another alternative pointing method for selecting and manipulating objects in a graphical user interface. Implications for Rehabilitation A circling interface will improve clinical practice by providing an alternative pointing method that does not require physically activating mouse buttons and is more efficient than dwell-clicking. The Circling interface can also work with AAC devices.
A Multivariate Analysis of the Early Dropout Process
ERIC Educational Resources Information Center
Fiester, Alan R.; Rudestam, Kjell E.
1975-01-01
Principal-component factor analyses were performed on patient input (demographic and pretherapy expectations), therapist input (demographic), and patient perspective therapy process variables that significantly differentiated early dropout from nondropout outpatients at two community mental health centers. (Author)
Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives
NASA Technical Reports Server (NTRS)
Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.
2002-01-01
An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for quasi 3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
Variable ratio regenerative braking device
Hoppie, Lyle O.
1981-12-15
Disclosed is a regenerative braking device (10) for an automotive vehicle. The device includes an energy storage assembly (12) having a plurality of rubber rollers (26, 28) mounted for rotation between an input shaft (36) and an output shaft (42), clutches (38, 46) and brakes (40, 48) associated with each shaft, and a continuously variable transmission (22) connectable to a vehicle drivetrain and to the input and output shafts by the respective clutches. The rubber rollers are torsionally stressed to accumulate energy from the vehicle when the input shaft is clutched to the transmission while the brake on the output shaft is applied, and are torsionally relaxed to deliver energy to the vehicle when the output shaft is clutched to the transmission while the brake on the input shaft is applied. The transmission ratio is varied to control the rate of energy accumulation and delivery for a given rotational speed of the vehicle drivetrain.
NASA Technical Reports Server (NTRS)
Chen, B. M.; Saber, A.
1993-01-01
A simple and noniterative procedure for the computation of the exact value of the infimum in the singular H(infinity)-optimization problem is presented, as a continuation of our earlier work. Our problem formulation is general and we do not place any restrictions in the finite and infinite zero structures of the system, and the direct feedthrough terms between the control input and the controlled output variables and between the disturbance input and the measurement output variables. Our method is applicable to a class of singular H(infinity)-optimization problems for which the transfer functions from the control input to the controlled output and from the disturbance input to the measurement output satisfy certain geometric conditions. In particular, the paper extends the result of earlier work by allowing these two transfer functions to have invariant zeros on the j(omega) axis.
NASA Astrophysics Data System (ADS)
Yadav, D.; Upadhyay, H. C.
1992-07-01
Vehicles obtain track-induced input through the wheels, which commonly number more than one. Analysis available for the vehicle response in a variable velocity run on a non-homogeneously profiled flexible track supported by compliant inertial foundation is for a linear heave model having a single ground input. This analysis is being extended to two point input models with heave-pitch and heave-roll degrees of freedom. Closed form expressions have been developed for the system response statistics. Results are presented for a railway coach and track/foundation problem, and the performances of heave, heave-pitch and heave-roll models have been compared. The three models are found to agree in describing the track response. However, the vehicle sprung mass behaviour is predicted to be different by these models, indicating the strong effect of coupling on the vehicle vibration.
Shared Bibliographic Input Network (SBIN) Conference Proceedings, October 1982
1982-10-01
records for their document collections. In the process of development of the CIRCANET requirements, we conceived a flowchart (figure 3) of the ideal system...with beginners ). b. If an error is made when taping, it can be detecteA and corrected on screen before writing onto the tape being used for input. c...It’s an easier and quicker method for beginners to become familiar with all of the fields and their entries. d. If a record is lost during the process
Mak, Yim Wah; Wu, Cynthia Sau Ting; Hui, Donna Wing Shun; Lam, Siu Ping; Tse, Hei Yin; Yu, Wing Yan; Wong, Ho Ting
2014-10-28
Screen viewing is considered to have adverse impacts on the sleep of adolescents. Although there has been a considerable amount of research on the association between screen viewing and sleep, most studies have focused on specific types of screen viewing devices such as televisions and computers. The present study investigated the duration with which currently prevalent screen viewing devices (including televisions, personal computers, mobile phones, and portable video devices) are viewed in relation to sleep duration, sleep quality, and daytime sleepiness among Hong Kong adolescents (N = 762). Television and computer viewing remain prevalent, but were not correlated with sleep variables. Mobile phone viewing was correlated with all sleep variables, while portable video device viewing was shown to be correlated only with daytime sleepiness. The results demonstrated a trend of increase in the prevalence and types of screen viewing and their effects on the sleep patterns of adolescents.
Panda, Bhuputra; Thakur, Harshad P
2016-10-31
One of the principal goals of any health care system is to improve health through the provision of clinical and public health services. Decentralization as a reform measure aims to improve inputs, management processes and health outcomes, and has political, administrative and financial connotations. It is argued that the robustness of a health system in achieving desirable outcomes is contingent upon the width and depth of 'decision space' at the local level. Studies have used different approaches to examine one or more facets of decentralization and its effect on health system functioning; however, lack of consensus on an acceptable framework is a critical gap in determining its quantum and quality. Theorists have resorted to concepts of 'trust', 'convenience' and 'mutual benefits' to explain, define and measure components of governance in health. In the emerging 'continuum of health services' model, the challenge lies in identifying variables of performance (fiscal allocation, autonomy at local level, perception of key stakeholders, service delivery outputs, etc.) through the prism of decentralization in the first place, and in establishing directed relationships among them. This focused review paper conducted extensive web-based literature search, using PubMed and Google Scholar search engines. After screening of key words and study objectives, we retrieved 180 articles for next round of screening. One hundred and four full articles (three working papers and 101 published papers) were reviewed in totality. We attempted to summarize existing literature on decentralization and health systems performance, explain key concepts and essential variables, and develop a framework for further scientific scrutiny. Themes are presented in three separate segments of dimensions, difficulties and derivatives. Evaluation of local decision making and its effect on health system performance has been studied in a compartmentalized manner. There is sparse evidence about innovations attributable to decentralization. We observed that in India, there is very scant evaluative study on the subject. We didn't come across a single study examining the perception and experiences of local decision makers about the opportunities and challenges they faced. The existing body of evidences may be inadequate to feed into sound policy making. The principles of management hinge on measurement of inputs, processes and outputs. In the conceptual framework we propose three levels of functions (health systems functions, management functions and measurement functions) being intricately related to inputs, processes and outputs. Each level of function encompasses essential elements derived from the synthesis of information gathered through literature review and non-participant observation. We observed that it is difficult to quantify characteristics of governance at institutional, system and individual levels except through proxy means. There is an urgent need to sensitize governments and academia about how best more objective evaluation of 'shared governance' can be undertaken to benefit policy making. The future direction of enquiry should focus on context-specific evidence of its effect on the entire spectrum of health system, with special emphasis on efficiency, community participation, human resource management and quality of services.
Electrical Advantages of Dendritic Spines
Gulledge, Allan T.; Carnevale, Nicholas T.; Stuart, Greg J.
2012-01-01
Many neurons receive excitatory glutamatergic input almost exclusively onto dendritic spines. In the absence of spines, the amplitudes and kinetics of excitatory postsynaptic potentials (EPSPs) at the site of synaptic input are highly variable and depend on dendritic location. We hypothesized that dendritic spines standardize the local geometry at the site of synaptic input, thereby reducing location-dependent variability of local EPSP properties. We tested this hypothesis using computational models of simplified and morphologically realistic spiny neurons that allow direct comparison of EPSPs generated on spine heads with EPSPs generated on dendritic shafts at the same dendritic locations. In all morphologies tested, spines greatly reduced location-dependent variability of local EPSP amplitude and kinetics, while having minimal impact on EPSPs measured at the soma. Spine-dependent standardization of local EPSP properties persisted across a range of physiologically relevant spine neck resistances, and in models with variable neck resistances. By reducing the variability of local EPSPs, spines standardized synaptic activation of NMDA receptors and voltage-gated calcium channels. Furthermore, spines enhanced activation of NMDA receptors and facilitated the generation of NMDA spikes and axonal action potentials in response to synaptic input. Finally, we show that dynamic regulation of spine neck geometry can preserve local EPSP properties following plasticity-driven changes in synaptic strength, but is inefficient in modifying the amplitude of EPSPs in other cellular compartments. These observations suggest that one function of dendritic spines is to standardize local EPSP properties throughout the dendritic tree, thereby allowing neurons to use similar voltage-sensitive postsynaptic mechanisms at all dendritic locations. PMID:22532875
Nonlinear Dynamic Models in Advanced Life Support
NASA Technical Reports Server (NTRS)
Jones, Harry
2002-01-01
To facilitate analysis, ALS systems are often assumed to be linear and time invariant, but they usually have important nonlinear and dynamic aspects. Nonlinear dynamic behavior can be caused by time varying inputs, changes in system parameters, nonlinear system functions, closed loop feedback delays, and limits on buffer storage or processing rates. Dynamic models are usually cataloged according to the number of state variables. The simplest dynamic models are linear, using only integration, multiplication, addition, and subtraction of the state variables. A general linear model with only two state variables can produce all the possible dynamic behavior of linear systems with many state variables, including stability, oscillation, or exponential growth and decay. Linear systems can be described using mathematical analysis. Nonlinear dynamics can be fully explored only by computer simulations of models. Unexpected behavior is produced by simple models having only two or three state variables with simple mathematical relations between them. Closed loop feedback delays are a major source of system instability. Exceeding limits on buffer storage or processing rates forces systems to change operating mode. Different equilibrium points may be reached from different initial conditions. Instead of one stable equilibrium point, the system may have several equilibrium points, oscillate at different frequencies, or even behave chaotically, depending on the system inputs and initial conditions. The frequency spectrum of an output oscillation may contain harmonics and the sums and differences of input frequencies, but it may also contain a stable limit cycle oscillation not related to input frequencies. We must investigate the nonlinear dynamic aspects of advanced life support systems to understand and counter undesirable behavior.
Predicting language outcomes for children learning AAC: Child and environmental factors
Brady, Nancy C.; Thiemann-Bourque, Kathy; Fleming, Kandace; Matthews, Kris
2014-01-01
Purpose To investigate a model of language development for nonverbal preschool age children learning to communicate with AAC. Method Ninety-three preschool children with intellectual disabilities were assessed at Time 1, and 82 of these children were assessed one year later at Time 2. The outcome variable was the number of different words the children produced (with speech, sign or SGD). Children’s intrinsic predictor for language was modeled as a latent variable consisting of cognitive development, comprehension, play, and nonverbal communication complexity. Adult input at school and home, and amount of AAC instruction were proposed mediators of vocabulary acquisition. Results A confirmatory factor analysis revealed that measures converged as a coherent construct and an SEM model indicated that the intrinsic child predictor construct predicted different words children produced. The amount of input received at home but not at school was a significant mediator. Conclusions Our hypothesized model accurately reflected a latent construct of Intrinsic Symbolic Factor (ISF). Children who evidenced higher initial levels of ISF and more adult input at home produced more words one year later. Findings support the need to assess multiple child variables, and suggest interventions directed to the indicators of ISF and input. PMID:23785187
Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat
2013-01-01
The systems consisting high input spaces require high processing times and memory usage. Most of the attribute selection algorithms have the problems of input dimensions limits and information storage problems. These problems are eliminated by means of developed feature reduction software using new modified selection mechanism with middle region solution candidates adding. The hybrid system software is constructed for reducing the input attributes of the systems with large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In the genetic algorithm based soft computing methods, locking to the local solutions is also a problem which is eliminated by using developed software. Faster and effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to the reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has the advantages in the fields of memory allocation, execution time, classification accuracy, sensitivity, and specificity values when compared with the other reduction algorithms by using the urological test data.
Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat
2013-01-01
The systems consisting high input spaces require high processing times and memory usage. Most of the attribute selection algorithms have the problems of input dimensions limits and information storage problems. These problems are eliminated by means of developed feature reduction software using new modified selection mechanism with middle region solution candidates adding. The hybrid system software is constructed for reducing the input attributes of the systems with large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In the genetic algorithm based soft computing methods, locking to the local solutions is also a problem which is eliminated by using developed software. Faster and effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to the reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has the advantages in the fields of memory allocation, execution time, classification accuracy, sensitivity, and specificity values when compared with the other reduction algorithms by using the urological test data. PMID:23573172
Interacting with notebook input devices: an analysis of motor performance and users' expertise.
Sutter, Christine; Ziefle, Martina
2005-01-01
In the present study the usability of two different types of notebook input devices was examined. The independent variables were input device (touchpad vs. mini-joystick) and user expertise (expert vs. novice state). There were 30 participants, of whom 15 were touchpad experts and the other 15 were mini-joystick experts. The experimental tasks were a point-click task (Experiment 1) and a point-drag-drop task (Experiment 2). Dependent variables were the time and accuracy of cursor control. To assess carryover effects, we had the participants complete both experiments, using not only the input device for which they were experts but also the device for which they were novices. Results showed the touchpad performance to be clearly superior to mini-joystick performance. Overall, experts showed better performance than did novices. The significant interaction of input device and expertise showed that the use of an unknown device is difficult, but only for touchpad experts, who were remarkably slower and less accurate when using a mini-joystick. Actual and potential applications of this research include an evaluation of current notebook input devices. The outcomes allow ergonomic guidelines to be derived for optimized usage and design of the mini-joystick and touchpad devices.
Printer model for dot-on-dot halftone screens
NASA Astrophysics Data System (ADS)
Balasubramanian, Raja
1995-04-01
A printer model is described for dot-on-dot halftone screens. For a given input CMYK signal, the model predicts the resulting spectral reflectance of the printed patch. The model is derived in two steps. First, the C, M, Y, K dot growth functions are determined which relate the input digital value to the actual dot area coverages of the colorants. Next, the reflectance of a patch is predicted as a weighted combination of the reflectances of the four solid C, M, Y, K patches and their various overlays. This approach is analogous to the Neugebauer model, with the random mixing equations being replaced by dot-on-dot mixing equations. A Yule-Neilsen correction factor is incorporated to account for light scattering within the paper. The dot area functions and Yule-Neilsen parameter are chosen to optimize the fit to a set of training data. The model is also extended to a cellular framework, requiring additional measurements. The model is tested with a four color xerographic printer employing a line-on-line halftone screen. CIE L*a*b* errors are obtained between measurements and model predictions. The Yule-Neilsen factor significantly decreases the model error. Accuracy is also increased with the use of a cellular framework.
PyzoFlex: a printed piezoelectric pressure sensing foil for human machine interfaces
NASA Astrophysics Data System (ADS)
Zirkl, M.; Scheipl, G.; Stadlober, B.; Rendl, C.; Greindl, P.; Haller, M.; Hartmann, P.
2013-09-01
Ferroelectric material supports both pyro- and piezoelectric effects that can be used for sensing pressures on large, bended surfaces. We present PyzoFlex, a pressure-sensing input device that is based on a ferroelectric material (PVDF:TrFE). It is constructed by a sandwich structure of four layers that can easily be printed on any substrate. The PyzoFlex foil is sensitive to pressure- and temperature changes, bendable, energy-efficient, and it can easily be produced by a screen-printing routine. Even a hovering input-mode is feasible due to its pyroelectric effect. In this paper, we introduce this novel, fully printed input technology and discuss its benefits and limitations.
NASA Astrophysics Data System (ADS)
Milovančević, Miloš; Nikolić, Vlastimir; Anđelković, Boban
2017-01-01
Vibration-based structural health monitoring is widely recognized as an attractive strategy for early damage detection in civil structures. Vibration monitoring and prediction is important for any system since it can save many unpredictable behaviors of the system. If the vibration monitoring is properly managed, that can ensure economic and safe operations. Potentials for further improvement of vibration monitoring lie in the improvement of current control strategies. One of the options is the introduction of model predictive control. Multistep ahead predictive models of vibration are a starting point for creating a successful model predictive strategy. For the purpose of this article, predictive models of are created for vibration monitoring of planetary power transmissions in pellet mills. The models were developed using the novel method based on ANFIS (adaptive neuro fuzzy inference system). The aim of this study is to investigate the potential of ANFIS for selecting the most relevant variables for predictive models of vibration monitoring of pellet mills power transmission. The vibration data are collected by PIC (Programmable Interface Controller) microcontrollers. The goal of the predictive vibration monitoring of planetary power transmissions in pellet mills is to indicate deterioration in the vibration of the power transmissions before the actual failure occurs. The ANFIS process for variable selection was implemented in order to detect the predominant variables affecting the prediction of vibration monitoring. It was also used to select the minimal input subset of variables from the initial set of input variables - current and lagged variables (up to 11 steps) of vibration. The obtained results could be used for simplification of predictive methods so as to avoid multiple input variables. It was preferable to used models with less inputs because of overfitting between training and testing data. While the obtained results are promising, further work is required in order to get results that could be directly applied in practice.
Giles, Courtney D; Brown, Lawrie K; Adu, Michael O; Mezeli, Malika M; Sandral, Graeme A; Simpson, Richard J; Wendler, Renate; Shand, Charles A; Menezes-Blackburn, Daniel; Darch, Tegan; Stutter, Marc I; Lumsdon, David G; Zhang, Hao; Blackwell, Martin S A; Wearing, Catherine; Cooper, Patricia; Haygarth, Philip M; George, Timothy S
2017-02-01
Phosphorus (P) and nitrogen (N) use efficiency may be improved through increased biodiversity in agroecosystems. Phenotypic variation in plants' response to nutrient deficiency may influence positive complementarity in intercropping systems. A multicomponent screening approach was used to assess the influence of P supply and N source on the phenotypic plasticity of nutrient foraging traits in barley (H. vulgare L.) and legume species. Root morphology and exudation were determined in six plant nutrient treatments. A clear divergence in the response of barley and legumes to the nutrient treatments was observed. Root morphology varied most among legumes, whereas exudate citrate and phytase activity were most variable in barley. Changes in root morphology were minimized in plants provided with ammonium in comparison to nitrate but increased under P deficiency. Exudate phytase activity and pH varied with legume species, whereas citrate efflux, specific root length, and root diameter lengths were more variable among barley cultivars. Three legume species and four barley cultivars were identified as the most responsive to P deficiency and the most contrasting of the cultivars and species tested. Phenotypic response to nutrient availability may be a promising approach for the selection of plant combinations for minimal input cropping systems. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Using a Bayesian network to predict barrier island geomorphologic characteristics
Gutierrez, Ben; Plant, Nathaniel G.; Thieler, E. Robert; Turecek, Aaron
2015-01-01
Quantifying geomorphic variability of coastal environments is important for understanding and describing the vulnerability of coastal topography, infrastructure, and ecosystems to future storms and sea level rise. Here we use a Bayesian network (BN) to test the importance of multiple interactions between barrier island geomorphic variables. This approach models complex interactions and handles uncertainty, which is intrinsic to future sea level rise, storminess, or anthropogenic processes (e.g., beach nourishment and other forms of coastal management). The BN was developed and tested at Assateague Island, Maryland/Virginia, USA, a barrier island with sufficient geomorphic and temporal variability to evaluate our approach. We tested the ability to predict dune height, beach width, and beach height variables using inputs that included longer-term, larger-scale, or external variables (historical shoreline change rates, distances to inlets, barrier width, mean barrier elevation, and anthropogenic modification). Data sets from three different years spanning nearly a decade sampled substantial temporal variability and serve as a proxy for analysis of future conditions. We show that distinct geomorphic conditions are associated with different long-term shoreline change rates and that the most skillful predictions of dune height, beach width, and beach height depend on including multiple input variables simultaneously. The predictive relationships are robust to variations in the amount of input data and to variations in model complexity. The resulting model can be used to evaluate scenarios related to coastal management plans and/or future scenarios where shoreline change rates may differ from those observed historically.
Screen Color and Reading Performance on Closed-Circuit Television.
ERIC Educational Resources Information Center
Jacobs, R. J.
1990-01-01
To investigate whether screen color is an important variable in the prescription of closed circuit television (CCTV) systems, 16 adults with low vision were assessed on reading performance on white, green, and amber screens. When the screen luminance and contrast were equated for each CCTV, subjects' reading performance was unaffected by screen…
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
Quétard, Boris; Quinton, Jean-Charles; Colomb, Michèle; Pezzulo, Giovanni; Barca, Laura; Izaute, Marie; Appadoo, Owen Kevin; Mermillod, Martial
2015-09-01
Detecting a pedestrian while driving in the fog is one situation where the prior expectation about the target presence is integrated with the noisy visual input. We focus on how these sources of information influence the oculomotor behavior and are integrated within an underlying decision-making process. The participants had to judge whether high-/low-density fog scenes displayed on a computer screen contained a pedestrian or a deer by executing a mouse movement toward the response button (mouse-tracking). A variable road sign was added on the scene to manipulate expectations about target identity. We then analyzed the timing and amplitude of the deviation of mouse trajectories toward the incorrect response and, using an eye tracker, the detection time (before fixating the target) and the identification time (fixations on the target). Results revealed that expectation of the correct target results in earlier decisions with less deviation toward the alternative response, this effect being partially explained by the facilitation of target identification.
NASA Technical Reports Server (NTRS)
Allen, R. W.; Mcruer, D. T.
1977-01-01
A simulation experiment was conducted to determine the effect of reduced visibility on driver lateral (steering) control. The simulator included a real car cab and a single lane road image projected on a screen six feet in front of the driver. Simulated equations of motion controlled apparent car lane position in response to driver steering actions, wind gusts, and road curvature. Six drivers experienced a range of visibility conditions at various speeds with assorted roadmaking configurations (mark and gap lengths). Driver describing functions were measured and detailed parametric model fits were determined. A pursuit model employing a road curvature feedforward was very effective in explaining driver behavior in following randomly curving roads. Sampled-data concepts were also effective in explaining the combined effects of reduced visibility and intermittent road markings on the driver's dynamic time delay. The results indicate the relative importance of various perceptual variables as the visual input to the driver's steering control process is changed.
Wagner, Shannon; White, Marc; Schultz, Izabela; Murray, Eleanor; Bradley, Susan M; Hsu, Vernita; McGuire, Lisa; Schulz, Werner
2014-01-01
A challenge facing stakeholders is the identification and translation of relevant high quality research to inform policy and practice. This study engaged academic and community stakeholders in conducting a best evidence-synthesis to identify modifiable risk and protective worker factors across health conditions impacting work-related absence. To identify modifiable worker disability risk and protective factors across common health conditions impacting work-related absence. We searched Medline, Embase, CINHAL, The Cochrane Library, PsycINFO, BusinessSourceComplete, and ABI/Inform from 2000 to 2011. Quantitative, qualitative, or mixed methods systematic reviews of work-focused population were considered for inclusion. Two or more reviewers independently reviewed articles for inclusion and methodological screening. The search strategy, expert input and grey literature identified 2,467 unique records. One hundred and forty-two full text articles underwent comprehensive review. Twenty-four systematic reviews met eligibility criteria. Modifiable worker factors found to have consistent evidence across two or more health conditions included emotional distress, negative enduring psychology/personality factors, negative health and disability perception, decreased physical activity, lack of family support, poor general health, increased functional disability, increased pain, increased fatigue and lack of motivation to return to work. Systematic reviews are limited by availability of high quality studies, lack of consistency of methodological screening and reporting, and variability of outcome measures used.
Wade, James H; Jones, Joshua D; Lenov, Ivan L; Riordan, Colleen M; Sligar, Stephen G; Bailey, Ryan C
2017-08-22
The characterization of integral membrane proteins presents numerous analytical challenges on account of their poor activity under non-native conditions, limited solubility in aqueous solutions, and low expression in most cell culture systems. Nanodiscs are synthetic model membrane constructs that offer many advantages for studying membrane protein function by offering a native-like phospholipid bilayer environment. The successful incorporation of membrane proteins within Nanodiscs requires experimental optimization of conditions. Standard protocols for Nanodisc formation can require large amounts of time and input material, limiting the facile screening of formation conditions. Capitalizing on the miniaturization and efficient mass transport inherent to microfluidics, we have developed a microfluidic platform for efficient Nanodisc assembly and purification, and demonstrated the ability to incorporate functional membrane proteins into the resulting Nanodiscs. In addition to working with reduced sample volumes, this platform simplifies membrane protein incorporation from a multi-stage protocol requiring several hours or days into a single platform that outputs purified Nanodiscs in less than one hour. To demonstrate the utility of this platform, we incorporated Cytochrome P450 into Nanodiscs of variable size and lipid composition, and present spectroscopic evidence for the functional active site of the membrane protein. This platform is a promising new tool for membrane protein biology and biochemistry that enables tremendous versatility for optimizing the incorporation of membrane proteins using microfluidic gradients to screen across diverse formation conditions.
Role of Updraft Velocity in Temporal Variability of Global Cloud Hydrometeor Number
NASA Technical Reports Server (NTRS)
Sullivan, Sylvia C.; Lee, Dong Min; Oreopoulos, Lazaros; Nenes, Athanasios
2016-01-01
Understanding how dynamical and aerosol inputs affect the temporal variability of hydrometeor formation in climate models will help to explain sources of model diversity in cloud forcing, to provide robust comparisons with data, and, ultimately, to reduce the uncertainty in estimates of the aerosol indirect effect. This variability attribution can be done at various spatial and temporal resolutions with metrics derived from online adjoint sensitivities of droplet and crystal number to relevant inputs. Such metrics are defined and calculated from simulations using the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) and the National Center for Atmospheric Research Community Atmosphere Model Version 5.1 (CAM5.1). Input updraft velocity fluctuations can explain as much as 48% of temporal variability in output ice crystal number and 61% in droplet number in GEOS-5 and up to 89% of temporal variability in output ice crystal number in CAM5.1. In both models, this vertical velocity attribution depends strongly on altitude. Despite its importance for hydrometeor formation, simulated vertical velocity distributions are rarely evaluated against observations due to the sparsity of relevant data. Coordinated effort by the atmospheric community to develop more consistent, observationally based updraft treatments will help to close this knowledge gap.
Role of updraft velocity in temporal variability of global cloud hydrometeor number
Sullivan, Sylvia C.; Lee, Dongmin; Oreopoulos, Lazaros; ...
2016-05-16
Understanding how dynamical and aerosol inputs affect the temporal variability of hydrometeor formation in climate models will help to explain sources of model diversity in cloud forcing, to provide robust comparisons with data, and, ultimately, to reduce the uncertainty in estimates of the aerosol indirect effect. This variability attribution can be done at various spatial and temporal resolutions with metrics derived from online adjoint sensitivities of droplet and crystal number to relevant inputs. Such metrics are defined and calculated from simulations using the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) and the National Center for Atmospheric Research Communitymore » Atmosphere Model Version 5.1 (CAM5.1). Input updraft velocity fluctuations can explain as much as 48% of temporal variability in output ice crystal number and 61% in droplet number in GEOS-5 and up to 89% of temporal variability in output ice crystal number in CAM5.1. In both models, this vertical velocity attribution depends strongly on altitude. Despite its importance for hydrometeor formation, simulated vertical velocity distributions are rarely evaluated against observations due to the sparsity of relevant data. Finally, coordinated effort by the atmospheric community to develop more consistent, observationally based updraft treatments will help to close this knowledge gap.« less
Role of updraft velocity in temporal variability of global cloud hydrometeor number
NASA Astrophysics Data System (ADS)
Sullivan, Sylvia C.; Lee, Dongmin; Oreopoulos, Lazaros; Nenes, Athanasios
2016-05-01
Understanding how dynamical and aerosol inputs affect the temporal variability of hydrometeor formation in climate models will help to explain sources of model diversity in cloud forcing, to provide robust comparisons with data, and, ultimately, to reduce the uncertainty in estimates of the aerosol indirect effect. This variability attribution can be done at various spatial and temporal resolutions with metrics derived from online adjoint sensitivities of droplet and crystal number to relevant inputs. Such metrics are defined and calculated from simulations using the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) and the National Center for Atmospheric Research Community Atmosphere Model Version 5.1 (CAM5.1). Input updraft velocity fluctuations can explain as much as 48% of temporal variability in output ice crystal number and 61% in droplet number in GEOS-5 and up to 89% of temporal variability in output ice crystal number in CAM5.1. In both models, this vertical velocity attribution depends strongly on altitude. Despite its importance for hydrometeor formation, simulated vertical velocity distributions are rarely evaluated against observations due to the sparsity of relevant data. Coordinated effort by the atmospheric community to develop more consistent, observationally based updraft treatments will help to close this knowledge gap.
Whitmore, Leanne S.; Davis, Ryan W.; McCormick, Robert L.; ...
2016-09-15
Screening a large number of biologically derived molecules for potential fuel compounds without recourse to experimental testing is important in identifying understudied yet valuable molecules. Experimental testing, although a valuable standard for measuring fuel properties, has several major limitations, including the requirement of testably high quantities, considerable expense, and a large amount of time. This paper discusses the development of a general-purpose fuel property tool, using machine learning, whose outcome is to screen molecules for desirable fuel properties. BioCompoundML adopts a general methodology, requiring as input only a list of training compounds (with identifiers and measured values) and a listmore » of testing compounds (with identifiers). For the training data, BioCompoundML collects open data from the National Center for Biotechnology Information, incorporates user-provided features, imputes missing values, performs feature reduction, builds a classifier, and clusters compounds. BioCompoundML then collects data for the testing compounds, predicts class membership, and determines whether compounds are found in the range of variability of the training data set. We demonstrate this tool using three different fuel properties: research octane number (RON), threshold soot index (TSI), and melting point (MP). Here we provide measures of its success with these properties using randomized train/test measurements: average accuracy is 88% in RON, 85% in TSI, and 94% in MP; average precision is 88% in RON, 88% in TSI, and 95% in MP; and average recall is 88% in RON, 82% in TSI, and 97% in MP. The receiver operator characteristics (area under the curve) were estimated at 0.88 in RON, 0.86 in TSI, and 0.87 in MP. We also measured the success of BioCompoundML by sending 16 compounds for direct RON determination. Finally, we provide a screen of 1977 hydrocarbons/oxygenates within the 8696 compounds in MetaCyc, identifying compounds with high predictive strength for high or low RON.« less
Wrighton-Smith, Peter; Sneed, Laurie; Humphrey, Frances; Tao, Xuguang; Bernacki, Edward
2012-07-01
To determine the price point at which an interferon-γ release assay (IGRA) is less costly than a tuberculin skin test (TST) for health care employee tuberculosis screening. A multidecision tree-based cost model incorporating inputs gathered from time-motion studies and parallel testing by IGRA and TST was conducted in a subset of our employees. Administering a TST testing program costs $73.20 per person screened, $90.80 per new hire, and $63.42 per annual screen. Use of an IGRA for employee health testing is cost saving at an IGRA test cost of $54.83 or less per test and resulted in higher completion rates because of the elimination of the need for a second visit to interpret the TST. Using an IGRA for employee health screening can be an institutional cost saving and results in higher compliance rates.
Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes
NASA Astrophysics Data System (ADS)
Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping
2017-01-01
Batch processes are always characterized by nonlinear and system uncertain properties, therefore, the conventional single model may be ill-suited. A local learning strategy soft sensor based on variable partition ensemble method is developed for the quality prediction of nonlinear and non-Gaussian batch processes. A set of input variable sets are obtained by bootstrapping and PMI criterion. Then, multiple local GPR models are developed based on each local input variable set. When a new test data is coming, the posterior probability of each best performance local model is estimated based on Bayesian inference and used to combine these local GPR models to get the final prediction result. The proposed soft sensor is demonstrated by applying to an industrial fed-batch chlortetracycline fermentation process.
User's Guide for RESRAD-OFFSITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gnanapragasam, E.; Yu, C.
2015-04-01
The RESRAD-OFFSITE code can be used to model the radiological dose or risk to an offsite receptor. This User’s Guide for RESRAD-OFFSITE Version 3.1 is an update of the User’s Guide for RESRAD-OFFSITE Version 2 contained in the Appendix A of the User’s Manual for RESRAD-OFFSITE Version 2 (ANL/EVS/TM/07-1, DOE/HS-0005, NUREG/CR-6937). This user’s guide presents the basic information necessary to use Version 3.1 of the code. It also points to the help file and other documents that provide more detailed information about the inputs, the input forms and features/tools in the code; two of the features (overriding the source termmore » and computing area factors) are discussed in the appendices to this guide. Section 2 describes how to download and install the code and then verify the installation of the code. Section 3 shows ways to navigate through the input screens to simulate various exposure scenarios and to view the results in graphics and text reports. Section 4 has screen shots of each input form in the code and provides basic information about each parameter to increase the user’s understanding of the code. Section 5 outlines the contents of all the text reports and the graphical output. It also describes the commands in the two output viewers. Section 6 deals with the probabilistic and sensitivity analysis tools available in the code. Section 7 details the various ways of obtaining help in the code.« less
Kantún-Manzano, C A; Herrera-Silveira, J A; Arcega-Cabrera, F
2018-01-01
The influence of coastal submarine groundwater discharges (SGD) on the distribution and abundance of seagrass meadows was investigated. In 2012, hydrological variability, nutrient variability in sediments and the biotic characteristics of two seagrass beds, one with SGD present and one without, were studied. Findings showed that SGD inputs were related with one dominant seagrass species. To further understand this, a generalized additive model (GAM) was used to explore the relationship between seagrass biomass and environment conditions (water and sediment variables). Salinity range (21-35.5 PSU) was the most influential variable (85%), explaining why H. wrightii was the sole plant species present at the SGD site. At the site without SGD, GAM could not be performed since environmental variables could not explain a total variance of > 60%. This research shows the relevance of monitoring SGD inputs in coastal karstic areas since they significantly affect biotic characteristics of seagrass beds.
Multiple-input multiple-output causal strategies for gene selection.
Bontempi, Gianluca; Haibe-Kains, Benjamin; Desmedt, Christine; Sotiriou, Christos; Quackenbush, John
2011-11-25
Traditional strategies for selecting variables in high dimensional classification problems aim to find sets of maximally relevant variables able to explain the target variations. If these techniques may be effective in generalization accuracy they often do not reveal direct causes. The latter is essentially related to the fact that high correlation (or relevance) does not imply causation. In this study, we show how to efficiently incorporate causal information into gene selection by moving from a single-input single-output to a multiple-input multiple-output setting. We show in synthetic case study that a better prioritization of causal variables can be obtained by considering a relevance score which incorporates a causal term. In addition we show, in a meta-analysis study of six publicly available breast cancer microarray datasets, that the improvement occurs also in terms of accuracy. The biological interpretation of the results confirms the potential of a causal approach to gene selection. Integrating causal information into gene selection algorithms is effective both in terms of prediction accuracy and biological interpretation.
Hammerstrom, Donald J.
2013-10-15
A method for managing the charging and discharging of batteries wherein at least one battery is connected to a battery charger, the battery charger is connected to a power supply. A plurality of controllers in communication with one and another are provided, each of the controllers monitoring a subset of input variables. A set of charging constraints may then generated for each controller as a function of the subset of input variables. A set of objectives for each controller may also be generated. A preferred charge rate for each controller is generated as a function of either the set of objectives, the charging constraints, or both, using an algorithm that accounts for each of the preferred charge rates for each of the controllers and/or that does not violate any of the charging constraints. A current flow between the battery and the battery charger is then provided at the actual charge rate.
L.R. Grosenbaugh
1967-01-01
Describes an expansible computerized system that provides data needed in regression or covariance analysis of as many as 50 variables, 8 of which may be dependent. Alternatively, it can screen variously generated combinations of independent variables to find the regression with the smallest mean-squared-residual, which will be fitted if desired. The user can easily...
A Within-subjects Experimental Protocol to Assess the Effects of Social Input on Infant EEG.
St John, Ashley M; Kao, Katie; Chita-Tegmark, Meia; Liederman, Jacqueline; Grieve, Philip G; Tarullo, Amanda R
2017-05-03
Despite the importance of social interactions for infant brain development, little research has assessed functional neural activation while infants socially interact. Electroencephalography (EEG) power is an advantageous technique to assess infant functional neural activation. However, many studies record infant EEG only during one baseline condition. This protocol describes a paradigm that is designed to comprehensively assess infant EEG activity in both social and nonsocial contexts as well as tease apart how different types of social inputs differentially relate to infant EEG. The within-subjects paradigm includes four controlled conditions. In the nonsocial condition, infants view objects on computer screens. The joint attention condition involves an experimenter directing the infant's attention to pictures. The joint attention condition includes three types of social input: language, face-to-face interaction, and the presence of joint attention. Differences in infant EEG between the nonsocial and joint attention conditions could be due to any of these three types of input. Therefore, two additional conditions (one with language input while the experimenter is hidden behind a screen and one with face-to-face interaction) were included to assess the driving contextual factors in patterns of infant neural activation. Representative results demonstrate that infant EEG power varied by condition, both overall and differentially by brain region, supporting the functional nature of infant EEG power. This technique is advantageous in that it includes conditions that are clearly social or nonsocial and allows for examination of how specific types of social input relate to EEG power. This paradigm can be used to assess how individual differences in age, affect, socioeconomic status, and parent-infant interaction quality relate to the development of the social brain. Based on the demonstrated functional nature of infant EEG power, future studies should consider the role of EEG recording context and design conditions that are clearly social or nonsocial.
Tan, Nicholas X.; Rydzak, Chara; Yang, Li-Gang; Vickerman, Peter; Yang, Bin; Peeling, Rosanna W.; Hawkes, Sarah; Chen, Xiang-Sheng; Tucker, Joseph D.
2013-01-01
Background Syphilis is a major public health problem in many regions of China, with increases in congenital syphilis (CS) cases causing concern. The Chinese Ministry of Health recently announced a comprehensive 10-y national syphilis control plan focusing on averting CS. The decision analytic model presented here quantifies the impact of the planned strategies to determine whether they are likely to meet the goals laid out in the control plan. Methods and Findings Our model incorporated data on age-stratified fertility, female adult syphilis cases, and empirical syphilis transmission rates to estimate the number of CS cases associated with prenatal syphilis infection on a yearly basis. Guangdong Province was the focus of this analysis because of the availability of high-quality demographic and public health data. Each model outcome was simulated 1,000 times to incorporate uncertainty in model inputs. The model was validated using data from a CS intervention program among 477,656 women in China. Sensitivity analyses were performed to identify which variables are likely to be most influential in achieving Chinese and international policy goals. Increasing prenatal screening coverage was the single most effective strategy for reducing CS cases. An incremental increase in prenatal screening from the base case of 57% coverage to 95% coverage was associated with 106 (95% CI: 101, 111) CS cases averted per 100,000 live births (58% decrease). The policy strategies laid out in the national plan led to an outcome that fell short of the target, while a four-pronged comprehensive syphilis control strategy consisting of increased prenatal screening coverage, increased treatment completion, earlier prenatal screening, and improved syphilis test characteristics was associated with 157 (95% CI: 154, 160) CS cases averted per 100,000 live births (85% decrease). Conclusions The Chinese national plan provides a strong foundation for syphilis control, but more comprehensive measures that include earlier and more extensive screening are necessary for reaching policy goals. Please see later in the article for the Editors' Summary PMID:23349624
Ex Priori: Exposure-based Prioritization across Chemical Space
EPA's Exposure Prioritization (Ex Priori) is a simplified, quantitative visual dashboard that makes use of data from various inputs to provide rank-ordered internalized dose metric. This complements other high throughput screening by viewing exposures within all chemical space si...
Ambrose, Sophie E; Walker, Elizabeth A; Unflat-Berry, Lauren M; Oleson, Jacob J; Moeller, Mary Pat
2015-01-01
The primary objective of this study was to examine the quantity and quality of caregiver talk directed to children who are hard of hearing (CHH) compared with children with normal hearing (CNH). For the CHH only, the study explored how caregiver input changed as a function of child age (18 months versus 3 years), which child and family factors contributed to variance in caregiver linguistic input at 18 months and 3 years, and how caregiver talk at 18 months related to child language outcomes at 3 years. Participants were 59 CNH and 156 children with bilateral, mild-to-severe hearing loss. When children were approximately 18 months and/or 3 years of age, caregivers and children participated in a 5-min semistructured, conversational interaction. Interactions were transcribed and coded for two features of caregiver input representing quantity (number of total utterances and number of total words) and four features representing quality (number of different words, mean length of utterance in morphemes, proportion of utterances that were high level, and proportion of utterances that were directing). In addition, at the 18-month visit, parents completed a standardized questionnaire regarding their child's communication development. At the 3-year visit, a clinician administered a standardized language measure. At the 18-month visit, the CHH were exposed to a greater proportion of directing utterances than the CNH. At the 3-year visit, there were significant differences between the CNH and CHH for number of total words and all four of the quality variables, with the CHH being exposed to fewer words and lower quality input. Caregivers generally provided higher quality input to CHH at the 3-year visit compared with the 18-month visit. At the 18-month visit, quantity variables, but not quality variables, were related to several child and family factors. At the 3-year visit, the variable most strongly related to caregiver input was child language. Longitudinal analyses indicated that quality, but not quantity, of caregiver linguistic input at 18 months was related to child language abilities at 3 years, with directing utterances accounting for significant unique variance in child language outcomes. Although caregivers of CHH increased their use of quality features of linguistic input over time, the differences when compared with CNH suggest that some caregivers may need additional support to provide their children with optimal language learning environments. This is particularly important given the relationships that were identified between quality features of caregivers' linguistic input and children's language abilities. Family supports should include a focus on developing a style that is conversational eliciting as opposed to directive.
Ojinnaka, Chinedum O; Bolin, Jane N; McClellan, David A; Helduser, Janet W; Nash, Philip; Ory, Marcia G
2015-01-01
To determine the association between health literacy, communication habits and colorectal cancer (CRC) screening among low-income patients. Survey responses of patients who received financial assistance for colonoscopy between 2011 and 2014 at a family medicine residency clinic were analyzed using multivariate logistic regression (n = 456). There were two dependent variables: (1) previous CRC screening and (2) CRC screening adherence. Our independent variables of interest were health literacy and communication habits. Over two-thirds (67.13%) of respondents had not been previously screened for CRC. Multivariate analysis showed a decreased likelihood of previous CRC screening among those who had marginal (OR = 0.52; 95% CI = 0.29-0.92) or inadequate health literacy (OR = 0.49; 95% CI = 0.27-0.87) compared to those with adequate health literacy. Controlling for health literacy, the significant association between educational attainment and previous CRC screening was eliminated. Thus, health literacy mediated the relationship between educational attainment and previous CRC screening. There was no significant association between communication habits and previous CRC screening. There was no significant association between screening guideline adherence, and health literacy or communication. Limited health literacy is a potential barrier to CRC screening. Suboptimal CRC screening rates reported among those with lower educational attainment may be mediated by limited health literacy.
A Framework to Guide the Assessment of Human-Machine Systems.
Stowers, Kimberly; Oglesby, James; Sonesh, Shirley; Leyva, Kevin; Iwig, Chelsea; Salas, Eduardo
2017-03-01
We have developed a framework for guiding measurement in human-machine systems. The assessment of safety and performance in human-machine systems often relies on direct measurement, such as tracking reaction time and accidents. However, safety and performance emerge from the combination of several variables. The assessment of precursors to safety and performance are thus an important part of predicting and improving outcomes in human-machine systems. As part of an in-depth literature analysis involving peer-reviewed, empirical articles, we located and classified variables important to human-machine systems, giving a snapshot of the state of science on human-machine system safety and performance. Using this information, we created a framework of safety and performance in human-machine systems. This framework details several inputs and processes that collectively influence safety and performance. Inputs are divided according to human, machine, and environmental inputs. Processes are divided into attitudes, behaviors, and cognitive variables. Each class of inputs influences the processes and, subsequently, outcomes that emerge in human-machine systems. This framework offers a useful starting point for understanding the current state of the science and measuring many of the complex variables relating to safety and performance in human-machine systems. This framework can be applied to the design, development, and implementation of automated machines in spaceflight, military, and health care settings. We present a hypothetical example in our write-up of how it can be used to aid in project success.
Creating a non-linear total sediment load formula using polynomial best subset regression model
NASA Astrophysics Data System (ADS)
Okcu, Davut; Pektas, Ali Osman; Uyumaz, Ali
2016-08-01
The aim of this study is to derive a new total sediment load formula which is more accurate and which has less application constraints than the well-known formulae of the literature. 5 most known stream power concept sediment formulae which are approved by ASCE are used for benchmarking on a wide range of datasets that includes both field and flume (lab) observations. The dimensionless parameters of these widely used formulae are used as inputs in a new regression approach. The new approach is called Polynomial Best subset regression (PBSR) analysis. The aim of the PBRS analysis is fitting and testing all possible combinations of the input variables and selecting the best subset. Whole the input variables with their second and third powers are included in the regression to test the possible relation between the explanatory variables and the dependent variable. While selecting the best subset a multistep approach is used that depends on significance values and also the multicollinearity degrees of inputs. The new formula is compared to others in a holdout dataset and detailed performance investigations are conducted for field and lab datasets within this holdout data. Different goodness of fit statistics are used as they represent different perspectives of the model accuracy. After the detailed comparisons are carried out we figured out the most accurate equation that is also applicable on both flume and river data. Especially, on field dataset the prediction performance of the proposed formula outperformed the benchmark formulations.
NASA Astrophysics Data System (ADS)
Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.
2014-01-01
Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, a particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of Agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte-Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugar cane cultivation in Australia and Brazil. Ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting climate-mediated different sensitivities of modeled sugar cane yield to the model parameters, for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.
NASA Astrophysics Data System (ADS)
Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.
2014-06-01
Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugarcane cultivation in Australia and Brazil. The ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting different climate-mediated sensitivities of modeled sugarcane yield to the model parameters, for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.
NASA Astrophysics Data System (ADS)
Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian
2011-06-01
The rapidly advancing researches on traditional Chinese medicine (TCM) have greatly intrigued pharmaceutical industries worldwide. To take initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening system (iScreen) based on TCM Database@Taiwan. iScreen is compacted web server for TCM docking and followed by customized de novo drug design. We further implemented a protein preparation tool that both extract protein of interest from a raw input file and estimate the size of ligand bind site. In addition, iScreen is designed in user-friendly graphic interface for users who have less experience with the command line systems. For customized docking, multiple docking services, including standard, in-water, pH environment, and flexible docking modes are implemented. Users can download first 200 TCM compounds of best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research to a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.
Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian
2011-06-01
The rapidly advancing researches on traditional Chinese medicine (TCM) have greatly intrigued pharmaceutical industries worldwide. To take initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening system (iScreen) based on TCM Database@Taiwan. iScreen is compacted web server for TCM docking and followed by customized de novo drug design. We further implemented a protein preparation tool that both extract protein of interest from a raw input file and estimate the size of ligand bind site. In addition, iScreen is designed in user-friendly graphic interface for users who have less experience with the command line systems. For customized docking, multiple docking services, including standard, in-water, pH environment, and flexible docking modes are implemented. Users can download first 200 TCM compounds of best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research to a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.
Sobol' sensitivity analysis for stressor impacts on honeybee ...
We employ Monte Carlo simulation and nonlinear sensitivity analysis techniques to describe the dynamics of a bee exposure model, VarroaPop. Daily simulations are performed of hive population trajectories, taking into account queen strength, foraging success, mite impacts, weather, colony resources, population structure, and other important variables. This allows us to test the effects of defined pesticide exposure scenarios versus controlled simulations that lack pesticide exposure. The daily resolution of the model also allows us to conditionally identify sensitivity metrics. We use the variancebased global decomposition sensitivity analysis method, Sobol’, to assess firstand secondorder parameter sensitivities within VarroaPop, allowing us to determine how variance in the output is attributed to each of the input variables across different exposure scenarios. Simulations with VarroaPop indicate queen strength, forager life span and pesticide toxicity parameters are consistent, critical inputs for colony dynamics. Further analysis also reveals that the relative importance of these parameters fluctuates throughout the simulation period according to the status of other inputs. Our preliminary results show that model variability is conditional and can be attributed to different parameters depending on different timescales. By using sensitivity analysis to assess model output and variability, calibrations of simulation models can be better informed to yield more
NASA Astrophysics Data System (ADS)
Rahmati, Mehdi
2017-08-01
Developing accurate and reliable pedo-transfer functions (PTFs) to predict soil non-readily available characteristics is one of the most concerned topic in soil science and selecting more appropriate predictors is a crucial factor in PTFs' development. Group method of data handling (GMDH), which finds an approximate relationship between a set of input and output variables, not only provide an explicit procedure to select the most essential PTF input variables, but also results in more accurate and reliable estimates than other mostly applied methodologies. Therefore, the current research was aimed to apply GMDH in comparison with multivariate linear regression (MLR) and artificial neural network (ANN) to develop several PTFs to predict soil cumulative infiltration point-basely at specific time intervals (0.5-45 min) using soil readily available characteristics (RACs). In this regard, soil infiltration curves as well as several soil RACs including soil primary particles (clay (CC), silt (Si), and sand (Sa)), saturated hydraulic conductivity (Ks), bulk (Db) and particle (Dp) densities, organic carbon (OC), wet-aggregate stability (WAS), electrical conductivity (EC), and soil antecedent (θi) and field saturated (θfs) water contents were measured at 134 different points in Lighvan watershed, northwest of Iran. Then, applying GMDH, MLR, and ANN methodologies, several PTFs have been developed to predict cumulative infiltrations using two sets of selected soil RACs including and excluding Ks. According to the test data, results showed that developed PTFs by GMDH and MLR procedures using all soil RACs including Ks resulted in more accurate (with E values of 0.673-0.963) and reliable (with CV values lower than 11 percent) predictions of cumulative infiltrations at different specific time steps. In contrast, ANN procedure had lower accuracy (with E values of 0.356-0.890) and reliability (with CV values up to 50 percent) compared to GMDH and MLR. The results also revealed that Ks exclusion from input variables list caused around 30 percent decrease in PTFs accuracy for all applied procedures. However, it seems that Ks exclusion resulted in more practical PTFs especially in the case of GMDH network applying input variables which are less time consuming than Ks. In general, it is concluded that GMDH provides more accurate and reliable estimates of cumulative infiltration (a non-readily available characteristic of soil) with a minimum set of input variables (2-4 input variables) and can be promising strategy to model soil infiltration combining the advantages of ANN and MLR methodologies.
Wheeler, David C; Czarnota, Jenna; Jones, Resa M
2017-01-01
Socioeconomic status (SES) is often considered a risk factor for health outcomes. SES is typically measured using individual variables of educational attainment, income, housing, and employment variables or a composite of these variables. Approaches to building the composite variable include using equal weights for each variable or estimating the weights with principal components analysis or factor analysis. However, these methods do not consider the relationship between the outcome and the SES variables when constructing the index. In this project, we used weighted quantile sum (WQS) regression to estimate an area-level SES index and its effect in a model of colonoscopy screening adherence in the Minnesota-Wisconsin Metropolitan Statistical Area. We considered several specifications of the SES index including using different spatial scales (e.g., census block group-level, tract-level) for the SES variables. We found a significant positive association (odds ratio = 1.17, 95% CI: 1.15-1.19) between the SES index and colonoscopy adherence in the best fitting model. The model with the best goodness-of-fit included a multi-scale SES index with 10 variables at the block group-level and one at the tract-level, with home ownership, race, and income among the most important variables. Contrary to previous index construction, our results were not consistent with an assumption of equal importance of variables in the SES index when explaining colonoscopy screening adherence. Our approach is applicable in any study where an SES index is considered as a variable in a regression model and the weights for the SES variables are not known in advance.
How model and input uncertainty impact maize yield simulations in West Africa
NASA Astrophysics Data System (ADS)
Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli
2015-02-01
Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties however exist, not only in the model design and model parameters, but also and maybe even more important in soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models’ response to different levels of input information from little to detailed information on soil, climate (1961-2000) and agricultural management and compare the models’ ability to represent the observed spatial (between locations) and temporal variability (between years) in crop yields. We found that the resolution of different soil, climate and management information influences the simulated crop yields in both models. However, the difference between models is larger than between input data and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well from both models even with little information on soils and management but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors e.g. investment in agricultural research and development between 1987 and 1991 in Burkina Faso which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data does not necessarily improve model performance.
Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy
2017-08-01
Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein spaces are comprised of both categorical and numerical inputs, a situation intractable by traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex-variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mak, Yim Wah; Wu, Cynthia Sau Ting; Hui, Donna Wing Shun; Lam, Siu Ping; Tse, Hei Yin; Yu, Wing Yan; Wong, Ho Ting
2014-01-01
Screen viewing is considered to have adverse impacts on the sleep of adolescents. Although there has been a considerable amount of research on the association between screen viewing and sleep, most studies have focused on specific types of screen viewing devices such as televisions and computers. The present study investigated the duration with which currently prevalent screen viewing devices (including televisions, personal computers, mobile phones, and portable video devices) are viewed in relation to sleep duration, sleep quality, and daytime sleepiness among Hong Kong adolescents (N = 762). Television and computer viewing remain prevalent, but were not correlated with sleep variables. Mobile phone viewing was correlated with all sleep variables, while portable video device viewing was shown to be correlated only with daytime sleepiness. The results demonstrated a trend of increase in the prevalence and types of screen viewing and their effects on the sleep patterns of adolescents. PMID:25353062
Beam self-trapping in a BCT crystal
NASA Astrophysics Data System (ADS)
Matusevich, V.; Kiessling, A.; Kowarschik, R.; Zagorskiy, A. E.; Shepelevich, V. V.
2006-01-01
We present some aspects of wave self-focusing and self-defocusing in a photorefractive Ba 0.77Ca 0.23TiO 3 (BCT) crystal without external electric field and without background illumination. The effects depend on the cross-section of the input beam. We show that by decreasing of the diameter of the input beam from 730 μm the fanning effect disappears at 150 μm. A symmetrical self-focusing is observed for input diameters from 150 um down to 40 μm and a symmetrical self-defocusing for input diameters from 40 μm down to 20 μm. The 1D self-trapping is detected at 65 μm in BCT. Light power and wavelength are correspondingly 3 mW and 633 nm. The experimental results are supplemented with numerical calculations based on both photovoltaic model and model of screening soliton.
Steckelberg, Anke; Kasper, Jürgen; Mühlhauser, Ingrid
2007-08-27
Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. 261 volunteers from Hamburg (157 women), >or=50 years old without diagnosis of colorectal cancer. DESIGN AND VARIABLES: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers' attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro to contra variety of contents to be found in print media about colorectal cancer screening. The dependent variable was the sequence of article headlines. Participants were very much in favour of screening with scores for faecal occult blood test of 4.0 (0.1) and for colonoscopy 3.3 (0.1). According to our hypothesis we found statistically significant positive correlations between the stimuli in favour of screening and attitudes and significant negative correlations between the stimuli against screening and attitudes. The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information.
Steckelberg, Anke; Kasper, Jürgen; Mühlhauser, Ingrid
2007-01-01
Background: Evidence-based patient information (EBPI) is a prerequisite for informed decision-making. However, presentation of EBPI may lead to irrational reactions causing avoidance, minimisation and devaluation of the information. Objective: To explore whether the theory of cognitive dissonance is applicable to medical decision-making and useful to explain these phenomena. Setting and participants: 261 volunteers from Hamburg (157 women), ≥50 years old without diagnosis of colorectal cancer. Design and variables: Within an experiment we simulated information seeking on colorectal cancer screening. Consumers’ attitudes towards screening were surveyed using a rating scale from -5 (participate in no way) to +5 (participate unconditionally) (independent variable). Using a cover story, participants were asked to sort 5 article headlines according to their reading preferences. The headlines simulated the pro to contra variety of contents to be found in print media about colorectal cancer screening. The dependent variable was the sequence of article headlines. Results: Participants were very much in favour of screening with scores for faecal occult blood test of 4.0 (0.1) and for colonoscopy 3.3 (0.1). According to our hypothesis we found statistically significant positive correlations between the stimuli in favour of screening and attitudes and significant negative correlations between the stimuli against screening and attitudes. Conclusion: The theory of cognitive dissonance is applicable to medical decision-making. It may explain some phenomena of irrational reactions to evidence-based patient information. PMID:19675713
e-Drug3D: 3D structure collections dedicated to drug repurposing and fragment-based drug design.
Pihan, Emilie; Colliandre, Lionel; Guichou, Jean-François; Douguet, Dominique
2012-06-01
In the drug discovery field, new uses for old drugs, selective optimization of side activities and fragment-based drug design (FBDD) have proved to be successful alternatives to high-throughput screening. e-Drug3D is a database of 3D chemical structures of drugs that provides several collections of ready-to-screen SD files of drugs and commercial drug fragments. They are natural inputs in studies dedicated to drug repurposing and FBDD. e-Drug3D collections are freely available at http://chemoinfo.ipmc.cnrs.fr/e-drug3d.html either for download or for direct in silico web-based screenings.
Readiness of nursing students to screen women for domestic violence.
Ben Natan, Merav; Khater, Marva; Ighbariyea, Raeqa; Herbet, Hanin
2016-09-01
Although domestic violence against women is common in Israel and elsewhere, and though medical staff in Israel have a universal obligation to screen women for domestic violence, actual screening rates remain low. To examine which variables affect nursing students' intention to screen women for domestic violence when providing treatment, and whether the Theory of Planned Behavior (TPB) developed by Ajzen (1991) predicts this intention. This study is a quantitative cross sectional study. A large academic nursing school in central Israel. A convenience sample of 200 nursing students who had completed at least one year of studies took part in the study. Students completed a questionnaire based on the TPB. Nursing students showed high intention to screen women for domestic violence when providing treatment. Normative beliefs, subjective norms, behavioral beliefs, perceived control, and knowledge were found to affect students' intention to screen women for domestic violence. The opinion of the clinical instructor was most significant for students. The theoretical model predicted 32% of students' intention to screen women for domestic violence, with normative beliefs being the most significant variable. Copyright © 2016 Elsevier Ltd. All rights reserved.
Pelletier, Valerie; Winter, Kelly; Albatineh, Ahmed N.
2013-01-01
Objectives. Physician recommendation plays a crucial role in receiving endoscopic screening for colorectal cancer (CRC). This study explored factors associated with racial/ethnic differences in rates of screening recommendation. Methods. Data on 5900 adults eligible for endoscopic screening were obtained from the National Health Interview Survey. Odds ratios of receiving an endoscopy recommendation were calculated for selected variables. Planned, sequenced logistic regressions were conducted to examine the extent to which socioeconomic and health care variables account for racial/ethnic disparities in recommendation rates. Results. Differential rates were observed for CRC screening and screening recommendations among racial/ethnic groups. Compared with Whites, Hispanics were 34% less likely (P < .01) and Blacks were 26% less likely (P < .05) to receive this recommendation. The main predictors that emerged in sequenced analysis were education for Hispanics and Blacks and income for Blacks. After accounting for the effects of usual source of care, insurance coverage, and education, the disparity reduced and became statistically insignificant. Conclusions. Socioeconomic status and access to health care may explain major racial/ethnic disparities in CRC screening recommendation rates. PMID:23678899
Sympathovagal imbalance in hyperthyroidism.
Burggraaf, J; Tulen, J H; Lalezari, S; Schoemaker, R C; De Meyer, P H; Meinders, A E; Cohen, A F; Pijl, H
2001-07-01
We assessed sympathovagal balance in thyrotoxicosis. Fourteen patients with Graves' hyperthyroidism were studied before and after 7 days of treatment with propranolol (40 mg 3 times a day) and in the euthyroid state. Data were compared with those obtained in a group of age-, sex-, and weight-matched controls. Autonomic inputs to the heart were assessed by power spectral analysis of heart rate variability. Systemic exposure to sympathetic neurohormones was estimated on the basis of 24-h urinary catecholamine excretion. The spectral power in the high-frequency domain was considerably reduced in hyperthyroid patients, indicating diminished vagal inputs to the heart. Increased heart rate and mid-frequency/high-frequency power ratio in the presence of reduced total spectral power and increased urinary catecholamine excretion strongly suggest enhanced sympathetic inputs in thyrotoxicosis. All abnormal features of autonomic balance were completely restored to normal in the euthyroid state. beta-Adrenoceptor antagonism reduced heart rate in hyperthyroid patients but did not significantly affect heart rate variability or catecholamine excretion. This is in keeping with the concept of a joint disruption of sympathetic and vagal inputs to the heart underlying changes in heart rate variability. Thus thyrotoxicosis is characterized by profound sympathovagal imbalance, brought about by increased sympathetic activity in the presence of diminished vagal tone.
Harbaugh, Arien W.
2011-01-01
The MFI2005 data-input (entry) program was developed for use with the U.S. Geological Survey modular three-dimensional finite-difference groundwater model, MODFLOW-2005. MFI2005 runs on personal computers and is designed to be easy to use; data are entered interactively through a series of display screens. MFI2005 supports parameter estimation using the UCODE_2005 program for parameter estimation. Data for MODPATH, a particle-tracking program for use with MODFLOW-2005, also can be entered using MFI2005. MFI2005 can be used in conjunction with other data-input programs so that the different parts of a model dataset can be entered by using the most suitable program.
ERIC Educational Resources Information Center
Seiverling, Laura; Hendy, Helen M.; Williams, Keith
2011-01-01
The present study evaluated the 23-item Screening Tool for Feeding Problems (STEP; Matson & Kuhn, 2001) with a sample of children referred to a hospital-based feeding clinic to examine the scale's psychometric characteristics and then demonstrate how a children's revision of the STEP, the STEP-CHILD is associated with child and parent variables.…
MODELING LEACHING OF VIRUSES BY THE MONTE CARLO METHOD
A predictive screening model was developed for fate and transport
of viruses in the unsaturated zone. A database of input parameters
allowed Monte Carlo analysis with the model. The resulting kernel
densities of predicted attenuation during percolation indicated very ...
Hoang Thi, Thanh Huong; Lemdani, Mohamed; Flament, Marie-Pierre
2013-09-10
In a previous study of ours, the association of sodium caseinate and lecithin was demonstrated to be promising for masking the bitterness of acetaminophen via drug encapsulation. The encapsulating mechanisms were suggested to be based on the segregation of multicomponent droplets occurring during spray-drying. The spray-dried particles delayed the drug release within the mouth during the early time upon administration and hence masked the bitterness. Indeed, taste-masking is achieved if, within the frame of 1-2 min, drug substance is either not released or the released amount is below the human threshold for identifying its bad taste. The aim of this work was (i) to evaluate the effect of various processing and formulation parameters on the taste-masking efficiency and (ii) to determine the optimal formulation for optimal taste-masking effect. Four investigated input variables included inlet temperature (X1), spray flow (X2), sodium caseinate amount (X3) and lecithin amount (X4). The percentage of drug release amount during the first 2 min was considered as the response variable (Y). A 2(4)-full factorial design was applied and allowed screening for the most influential variables i.e. sodium caseinate amount and lecithin amount. Optimizing these two variables was therefore conducted by a simplex approach. The SEM and DSC results of spray-dried powder prepared under optimal conditions showed that drug seemed to be well encapsulated. The drug release during the first 2 min significantly decreased, 7-fold less than the unmasked drug particles. Therefore, the optimal formulation that performed the best taste-masking effect was successfully achieved. Copyright © 2013 Elsevier B.V. All rights reserved.
Sensitivity and uncertainty of input sensor accuracy for grass-based reference evapotranspiration
USDA-ARS?s Scientific Manuscript database
Quantification of evapotranspiration (ET) in agricultural environments is becoming of increasing importance throughout the world, thus understanding input variability of relevant sensors is of paramount importance as well. The Colorado Agricultural and Meteorological Network (CoAgMet) and the Florid...
Assessment of input uncertainty by seasonally categorized latent variables using SWAT
USDA-ARS?s Scientific Manuscript database
Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...
Speaker Invariance for Phonetic Information: an fMRI Investigation
Salvata, Caden; Blumstein, Sheila E.; Myers, Emily B.
2012-01-01
The current study explored how listeners map the variable acoustic input onto a common sound structure representation while being able to retain phonetic detail to distinguish among the identity of talkers. An adaptation paradigm was utilized to examine areas which showed an equal neural response (equal release from adaptation) to phonetic change when spoken by the same speaker and when spoken by two different speakers, and insensitivity (failure to show release from adaptation) when the same phonetic input was spoken by a different speaker. Neural areas which showed speaker invariance were located in the anterior portion of the middle superior temporal gyrus bilaterally. These findings provide support for the view that speaker normalization processes allow for the translation of a variable speech input to a common abstract sound structure. That this process appears to occur early in the processing stream, recruiting temporal structures, suggests that this mapping takes place prelexically, before sound structure input is mapped on to lexical representations. PMID:23264714
The input and output management of solid waste using DEA models: A case study at Jengka, Pahang
NASA Astrophysics Data System (ADS)
Mohamed, Siti Rosiah; Ghazali, Nur Fadzrina Mohd; Mohd, Ainun Hafizah
2017-08-01
Data Envelopment Analysis (DEA) as a tool for obtaining performance indices has been used extensively in several of organizations sector. The ways to improve the efficiency of Decision Making Units (DMUs) is impractical because some of inputs and outputs are uncontrollable and in certain situation its produce weak efficiency which often reflect the impact for operating environment. Based on the data from Alam Flora Sdn. Bhd Jengka, the researcher wants to determine the efficiency of solid waste management (SWM) in town Jengka Pahang using CCRI and CCRO model of DEA and duality formulation with vector average input and output. Three input variables (length collection in meter, frequency time per week in hour and number of garbage truck) and 2 outputs variables (frequency collection and the total solid waste collection in kilogram) are analyzed. As a conclusion, it shows only three roads from 23 roads are efficient that achieve efficiency score 1. Meanwhile, 20 other roads are in an inefficient management.
NASA Technical Reports Server (NTRS)
Fortenbaugh, R. L.
1980-01-01
Equations incorporated in a VATOL six degree of freedom off-line digital simulation program and data for the Vought SF-121 VATOL aircraft concept which served as the baseline for the development of this program are presented. The equations and data are intended to facilitate the development of a piloted VATOL simulation. The equation presentation format is to state the equations which define a particular model segment. Listings of constants required to quantify the model segment, input variables required to exercise the model segment, and output variables required by other model segments are included. In several instances a series of input or output variables are followed by a section number in parentheses which identifies the model segment of origination or termination of those variables.
Mathematical models of the simplest fuzzy PI/PD controllers with skewed input and output fuzzy sets.
Mohan, B M; Sinha, Arpita
2008-07-01
This paper unveils mathematical models for fuzzy PI/PD controllers which employ two skewed fuzzy sets for each of the two-input variables and three skewed fuzzy sets for the output variable. The basic constituents of these models are Gamma-type and L-type membership functions for each input, trapezoidal/triangular membership functions for output, intersection/algebraic product triangular norm, maximum/drastic sum triangular conorm, Mamdani minimum/Larsen product/drastic product inference method, and center of sums defuzzification method. The existing simplest fuzzy PI/PD controller structures derived via symmetrical fuzzy sets become special cases of the mathematical models revealed in this paper. Finally, a numerical example along with its simulation results are included to demonstrate the effectiveness of the simplest fuzzy PI controllers.
Sociodemographic and home environment predictors of screen viewing among Spanish school children.
Hoyos Cillero, Itziar; Jago, Russell
2011-09-01
Higher screen-viewing levels increase the risk of obesity. Understanding the correlates of screen viewing is an important first step in designing interventions but there is lack of information on the correlates among Spanish children. This study examined associations among environmental, sociocultural, age variables and screen viewing among Spanish children. Children completed a questionnaire about time spent in screen viewing. BMI was assessed and children were classified into obesity groups using International Obesity Task Force cut-off points. Parents completed a questionnaire about sociodemographic, environmental and sociocultural variables. Participants were 247 primary and 256 secondary school-aged children and their parents. Time spent in screen viewing increased with age. Males spent more time than females in screen viewing. Greater access to bedroom media sources was associated with higher screen viewing. Younger children from single-parent households and older children having a younger parent, siblings and a father who was not working were higher screen-viewers on weekends and weekdays, respectively. For older children parental TV viewing time appeared to be a significant correlate, while parental rules was a determinant predictor for younger children on weekdays. Environmental and sociocultural factors influence the time children spend in screen viewing. Parents play a central role in child's screen viewing; therefore, interventions that target environmental and family TV viewing practices are likely to be effective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behrang, M.A.; Assareh, E.; Ghanbarzadeh, A.
2010-08-15
The main objective of present study is to predict daily global solar radiation (GSR) on a horizontal surface, based on meteorological variables, using different artificial neural network (ANN) techniques. Daily mean air temperature, relative humidity, sunshine hours, evaporation, and wind speed values between 2002 and 2006 for Dezful city in Iran (32 16'N, 48 25'E), are used in this study. In order to consider the effect of each meteorological variable on daily GSR prediction, six following combinations of input variables are considered: (I)Day of the year, daily mean air temperature and relative humidity as inputs and daily GSR as output.more » (II)Day of the year, daily mean air temperature and sunshine hours as inputs and daily GSR as output. (III)Day of the year, daily mean air temperature, relative humidity and sunshine hours as inputs and daily GSR as output. (IV)Day of the year, daily mean air temperature, relative humidity, sunshine hours and evaporation as inputs and daily GSR as output. (V)Day of the year, daily mean air temperature, relative humidity, sunshine hours and wind speed as inputs and daily GSR as output. (VI)Day of the year, daily mean air temperature, relative humidity, sunshine hours, evaporation and wind speed as inputs and daily GSR as output. Multi-layer perceptron (MLP) and radial basis function (RBF) neural networks are applied for daily GSR modeling based on six proposed combinations. The measured data between 2002 and 2005 are used to train the neural networks while the data for 214 days from 2006 are used as testing data. The comparison of obtained results from ANNs and different conventional GSR prediction (CGSRP) models shows very good improvements (i.e. the predicted values of best ANN model (MLP-V) has a mean absolute percentage error (MAPE) about 5.21% versus 10.02% for best CGSRP model (CGSRP 5)). (author)« less
Fuzzy Neuron: Method and Hardware Realization
NASA Technical Reports Server (NTRS)
Krasowski, Michael J.; Prokop, Norman F.
2014-01-01
This innovation represents a method by which single-to-multi-input, single-to-many-output system transfer functions can be estimated from input/output data sets. This innovation can be run in the background while a system is operating under other means (e.g., through human operator effort), or may be utilized offline using data sets created from observations of the estimated system. It utilizes a set of fuzzy membership functions spanning the input space for each input variable. Linear combiners associated with combinations of input membership functions are used to create the output(s) of the estimator. Coefficients are adjusted online through the use of learning algorithms.
Group interaction and flight crew performance
NASA Technical Reports Server (NTRS)
Foushee, H. Clayton; Helmreich, Robert L.
1988-01-01
The application of human-factors analysis to the performance of aircraft-operation tasks by the crew as a group is discussed in an introductory review and illustrated with anecdotal material. Topics addressed include the function of a group in the operational environment, the classification of group performance factors (input, process, and output parameters), input variables and the flight crew process, and the effect of process variables on performance. Consideration is given to aviation safety issues, techniques for altering group norms, ways of increasing crew effort and coordination, and the optimization of group composition.
Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip
2011-01-01
We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561
Using artificial intelligence to predict permeability from petrographic data
NASA Astrophysics Data System (ADS)
Ali, Maqsood; Chawathé, Adwait
2000-10-01
Petrographic data collected during thin section analysis can be invaluable for understanding the factors that control permeability distribution. Reliable prediction of permeability is important for reservoir characterization. The petrographic elements (mineralogy, porosity types, cements and clays, and pore morphology) interact with each other uniquely to generate a specific permeability distribution. It is difficult to quantify accurately this interaction and its consequent effect on permeability, emphasizing the non-linear nature of the process. To capture these non-linear interactions, neural networks were used to predict permeability from petrographic data. The neural net was used as a multivariate correlative tool because of its ability to learn the non-linear relationships between multiple input and output variables. The study was conducted on the upper Queen formation called the Shattuck Member (Permian age). The Shattuck Member is composed of very fine-grained arkosic sandstone. The core samples were available from the Sulimar Queen and South Lucky Lake fields located in Chaves County, New Mexico. Nineteen petrographic elements were collected for each permeability value using a combined minipermeameter-petrographic technique. In order to reduce noise and overfitting the permeability model, these petrographic elements were screened, and their control (ranking) with respect to permeability was determined using fuzzy logic. Since the fuzzy logic algorithm provides unbiased ranking, it was used to reduce the dimensionality of the input variables. Based on the fuzzy logic ranking, only the most influential petrographic elements were selected as inputs for permeability prediction. The neural net was trained and tested using data from Well 1-16 in the Sulimar Queen field. Relying on the ranking obtained from the fuzzy logic analysis, the net was trained using the most influential three, five, and ten petrographic elements. A fast algorithm (the scaled conjugate gradient method) was used to optimize the network weight matrix. The net was then successfully used to predict the permeability in the nearby South Lucky Lake field, also in the Shattuck Member. This study underscored various important aspects of using neural networks as non-linear estimators. The neural network learnt the complex relationships between petrographic control and permeability. By predicting permeability in a remotely-located, yet geologically similar field, the generalizing capability of the neural network was also demonstrated. In old fields, where conventional petrographic analysis was routine, this technique may be used to supplement core permeability estimates.
Stochastic analysis of multiphase flow in porous media: II. Numerical simulations
NASA Astrophysics Data System (ADS)
Abin, A.; Kalurachchi, J. J.; Kemblowski, M. W.; Chang, C.-M.
1996-08-01
The first paper (Chang et al., 1995b) of this two-part series described the stochastic analysis using spectral/perturbation approach to analyze steady state two-phase (water and oil) flow in a, liquid-unsaturated, three fluid-phase porous medium. In this paper, the results between the numerical simulations and closed-form expressions obtained using the perturbation approach are compared. We present the solution to the one-dimensional, steady-state oil and water flow equations. The stochastic input processes are the spatially correlated logk where k is the intrinsic permeability and the soil retention parameter, α. These solutions are subsequently used in the numerical simulations to estimate the statistical properties of the key output processes. The comparison between the results of the perturbation analysis and numerical simulations showed a good agreement between the two methods over a wide range of logk variability with three different combinations of input stochastic processes of logk and soil parameter α. The results clearly demonstrated the importance of considering the spatial variability of key subsurface properties under a variety of physical scenarios. The variability of both capillary pressure and saturation is affected by the type of input stochastic process used to represent the spatial variability. The results also demonstrated the applicability of perturbation theory in predicting the system variability and defining effective fluid properties through the ergodic assumption.
Simulating maize yield and biomass with spatial variability of soil field capacity
USDA-ARS?s Scientific Manuscript database
Spatial variability in field soil water and other properties is a challenge for system modelers who use only representative values for model inputs, rather than their distributions. In this study, we compared simulation results from a calibrated model with spatial variability of soil field capacity ...
Screen-related sedentary behaviors: children's and parents' attitudes, motivations, and practices.
He, Meizi; Piché, Leonard; Beynon, Charlene; Harris, Stewart
2010-01-01
To investigate school-aged children's and parents' attitudes, social influences, and intentions toward excessive screen-related sedentary behavior (S-RSB). A cross-sectional study using a survey methodology. Elementary schools in London, Ontario, Canada. All grades 5 and 6 students, their parents, and their teachers in the participating schools were invited to voluntarily participate; 508 student-parent pairs completed the surveys. Children's screen-related behaviors. Data were analyzed using the Independent Student t test to compare differences of continuous variables and the chi-square test to test for differences of categorical variables. Children spent 3.3 +/- 0.15 (standard error) hours per day engaged in screen-related activities. Entertainment, spending time with family, and boredom were cited as the top 3 reasons for television viewing and video game playing. Compared to "low-screen users" (ie, < 2 hours/day), "high-screen users" (ie, >or= 2 hours/day) had a less negative attitude toward excessive S-RSB and perceived loosened parental rules on screen use. Parents of high-screen users had a less negative attitude toward children's S-RSB, had fewer rules about their children's screen use, and were more likely to be sedentary themselves. Intervention strategies aimed at reducing S-RSB should involve both parents and children and should focus on fostering behavioral changes and promoting parental role modeling.
Simulated lumped-parameter system reduced-order adaptive control studies
NASA Technical Reports Server (NTRS)
Johnson, C. R., Jr.; Lawrence, D. A.; Taylor, T.; Malakooti, M. V.
1981-01-01
Two methods of interpreting the misbehavior of reduced order adaptive controllers are discussed. The first method is based on system input-output description and the second is based on state variable description. The implementation of the single input, single output, autoregressive, moving average system is considered.
Computational diagnosis of canine lymphoma
NASA Astrophysics Data System (ADS)
Mirkes, E. M.; Alexandrakis, I.; Slater, K.; Tuli, R.; Gorban, A. N.
2014-03-01
One out of four dogs will develop cancer in their lifetime and 20% of those will be lymphoma cases. PetScreen developed a lymphoma blood test using serum samples collected from several veterinary practices. The samples were fractionated and analysed by mass spectrometry. Two protein peaks, with the highest diagnostic power, were selected and further identified as acute phase proteins, C-Reactive Protein and Haptoglobin. Data mining methods were then applied to the collected data for the development of an online computer-assisted veterinary diagnostic tool. The generated software can be used as a diagnostic, monitoring and screening tool. Initially, the diagnosis of lymphoma was formulated as a classification problem and then later refined as a lymphoma risk estimation. Three methods, decision trees, kNN and probability density evaluation, were used for classification and risk estimation and several preprocessing approaches were implemented to create the diagnostic system. For the differential diagnosis the best solution gave a sensitivity and specificity of 83.5% and 77%, respectively (using three input features, CRP, Haptoglobin and standard clinical symptom). For the screening task, the decision tree method provided the best result, with sensitivity and specificity of 81.4% and >99%, respectively (using the same input features). Furthermore, the development and application of new techniques for the generation of risk maps allowed their user-friendly visualization.
Thomas, Duncan C
2017-07-01
Screening behavior depends on previous screening history and family members' behaviors, which can act as both confounders and intermediate variables on a causal pathway from screening to disease risk. Conventional analyses that adjust for these variables can lead to incorrect inferences about the causal effect of screening if high-risk individuals are more likely to be screened. Analyzing the data in a manner that treats screening as randomized conditional on covariates allows causal parameters to be estimated; inverse probability weighting based on propensity of exposure scores is one such method considered here. I simulated family data under plausible models for the underlying disease process and for screening behavior to assess the performance of alternative methods of analysis and whether a targeted screening approach based on individuals' risk factors would lead to a greater reduction in cancer incidence in the population than a uniform screening policy. Simulation results indicate that there can be a substantial underestimation of the effect of screening on subsequent cancer risk when using conventional analysis approaches, which is avoided by using inverse probability weighting. A large case-control study of colonoscopy and colorectal cancer from Germany shows a strong protective effect of screening, but inverse probability weighting makes this effect even stronger. Targeted screening approaches based on either fixed risk factors or family history yield somewhat greater reductions in cancer incidence with fewer screens needed to prevent one cancer than population-wide approaches, but the differences may not be large enough to justify the additional effort required. See video abstract at, http://links.lww.com/EDE/B207.
A new polytopic approach for the unknown input functional observer design
NASA Astrophysics Data System (ADS)
Bezzaoucha, Souad; Voos, Holger; Darouach, Mohamed
2018-03-01
In this paper, a constructive procedure to design Functional Unknown Input Observers for nonlinear continuous time systems is proposed under the Polytopic Takagi-Sugeno framework. An equivalent representation for the nonlinear model is achieved using the sector nonlinearity transformation. Applying the Lyapunov theory and the ? attenuation, linear matrix inequalities conditions are deduced which are solved for feasibility to obtain the observer design matrices. To cope with the effect of unknown inputs, classical approach of decoupling the unknown input for the linear case is used. Both algebraic and solver-based solutions are proposed (relaxed conditions). Necessary and sufficient conditions for the existence of the functional polytopic observer are given. For both approaches, the general and particular cases (measurable premise variables, full state estimation with full and reduced order cases) are considered and it is shown that the proposed conditions correspond to the one presented for standard linear case. To illustrate the proposed theoretical results, detailed numerical simulations are presented for a Quadrotor Aerial Robots Landing and a Waste Water Treatment Plant. Both systems are highly nonlinear and represented in a T-S polytopic form with unmeasurable premise variables and unknown inputs.
Surrogate-based optimization of hydraulic fracturing in pre-existing fracture networks
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Sun, Yunwei; Fu, Pengcheng; Carrigan, Charles R.; Lu, Zhiming; Tong, Charles H.; Buscheck, Thomas A.
2013-08-01
Hydraulic fracturing has been used widely to stimulate production of oil, natural gas, and geothermal energy in formations with low natural permeability. Numerical optimization of fracture stimulation often requires a large number of evaluations of objective functions and constraints from forward hydraulic fracturing models, which are computationally expensive and even prohibitive in some situations. Moreover, there are a variety of uncertainties associated with the pre-existing fracture distributions and rock mechanical properties, which affect the optimized decisions for hydraulic fracturing. In this study, a surrogate-based approach is developed for efficient optimization of hydraulic fracturing well design in the presence of natural-system uncertainties. The fractal dimension is derived from the simulated fracturing network as the objective for maximizing energy recovery sweep efficiency. The surrogate model, which is constructed using training data from high-fidelity fracturing models for mapping the relationship between uncertain input parameters and the fractal dimension, provides fast approximation of the objective functions and constraints. A suite of surrogate models constructed using different fitting methods is evaluated and validated for fast predictions. Global sensitivity analysis is conducted to gain insights into the impact of the input variables on the output of interest, and further used for parameter screening. The high efficiency of the surrogate-based approach is demonstrated for three optimization scenarios with different and uncertain ambient conditions. Our results suggest the critical importance of considering uncertain pre-existing fracture networks in optimization studies of hydraulic fracturing.
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
Screen Layout Design: Research into the Overall Appearance of the Screen.
ERIC Educational Resources Information Center
Grabinger, R. Scott
1989-01-01
Examines the current state of research into the visual effects of screen designs used in computer-assisted instruction and suggests areas for future efforts. Topics discussed include technical elements and comprehensibility elements in layout design; single element and multiple element research methodologies; dependent variables; and learning…
NASA Astrophysics Data System (ADS)
Zhuo, L.; Mekonnen, M. M.; Hoekstra, A. Y.
2014-06-01
Water Footprint Assessment is a fast-growing field of research, but as yet little attention has been paid to the uncertainties involved. This study investigates the sensitivity of and uncertainty in crop water footprint (in m3 t-1) estimates related to uncertainties in important input variables. The study focuses on the green (from rainfall) and blue (from irrigation) water footprint of producing maize, soybean, rice, and wheat at the scale of the Yellow River basin in the period 1996-2005. A grid-based daily water balance model at a 5 by 5 arcmin resolution was applied to compute green and blue water footprints of the four crops in the Yellow River basin in the period considered. The one-at-a-time method was carried out to analyse the sensitivity of the crop water footprint to fractional changes of seven individual input variables and parameters: precipitation (PR), reference evapotranspiration (ET0), crop coefficient (Kc), crop calendar (planting date with constant growing degree days), soil water content at field capacity (Smax), yield response factor (Ky) and maximum yield (Ym). Uncertainties in crop water footprint estimates related to uncertainties in four key input variables: PR, ET0, Kc, and crop calendar were quantified through Monte Carlo simulations. The results show that the sensitivities and uncertainties differ across crop types. In general, the water footprint of crops is most sensitive to ET0 and Kc, followed by the crop calendar. Blue water footprints were more sensitive to input variability than green water footprints. The smaller the annual blue water footprint is, the higher its sensitivity to changes in PR, ET0, and Kc. The uncertainties in the total water footprint of a crop due to combined uncertainties in climatic inputs (PR and ET0) were about ±20% (at 95% confidence interval). The effect of uncertainties in ET0was dominant compared to that of PR. The uncertainties in the total water footprint of a crop as a result of combined key input uncertainties were on average ±30% (at 95% confidence level).
Fogel, Benjamin N; Nguyen, Hong Loan T; Smink, Gayle; Sekhar, Deepa L
2018-04-01
We conducted an inventory of state-based recommendations for follow-up of alpha thalassemia silent carrier and trait identified on newborn screen. We found wide variability in the nature and timing of these recommendations. We recommend a standardized recommendation to guide pediatricians in evidenced-based care for this population. Copyright © 2017 Elsevier Inc. All rights reserved.
Ibrahim, Tamer M; Bauer, Matthias R; Boeckler, Frank M
2015-01-01
Structure-based virtual screening techniques can help to identify new lead structures and complement other screening approaches in drug discovery. Prior to docking, the data (protein crystal structures and ligands) should be prepared with great attention to molecular and chemical details. Using a subset of 18 diverse targets from the recently introduced DEKOIS 2.0 benchmark set library, we found differences in the virtual screening performance of two popular docking tools (GOLD and Glide) when employing two different commercial packages (e.g. MOE and Maestro) for preparing input data. We systematically investigated the possible factors that can be responsible for the found differences in selected sets. For the Angiotensin-I-converting enzyme dataset, preparation of the bioactive molecules clearly exerted the highest influence on VS performance compared to preparation of the decoys or the target structure. The major contributing factors were different protonation states, molecular flexibility, and differences in the input conformation (particularly for cyclic moieties) of bioactives. In addition, score normalization strategies eliminated the biased docking scores shown by GOLD (ChemPLP) for the larger bioactives and produced a better performance. Generalizing these normalization strategies on the 18 DEKOIS 2.0 sets, improved the performances for the majority of GOLD (ChemPLP) docking, while it showed detrimental performances for the majority of Glide (SP) docking. In conclusion, we exemplify herein possible issues particularly during the preparation stage of molecular data and demonstrate to which extent these issues can cause perturbations in the virtual screening performance. We provide insights into what problems can occur and should be avoided, when generating benchmarks to characterize the virtual screening performance. Particularly, careful selection of an appropriate molecular preparation setup for the bioactive set and the use of score normalization for docking with GOLD (ChemPLP) appear to have a great importance for the screening performance. For virtual screening campaigns, we recommend to invest time and effort into including alternative preparation workflows into the generation of the master library, even at the cost of including multiple representations of each molecule. Graphical AbstractUsing DEKOIS 2.0 benchmark sets in structure-based virtual screening to probe the impact of molecular preparation and score normalization.
Gravity dependence of the effect of optokinetic stimulation on the subjective visual vertical.
Ward, Bryan K; Bockisch, Christopher J; Caramia, Nicoletta; Bertolini, Giovanni; Tarnutzer, Alexander Andrea
2017-05-01
Accurate and precise estimates of direction of gravity are essential for spatial orientation. According to Bayesian theory, multisensory vestibular, visual, and proprioceptive input is centrally integrated in a weighted fashion based on the reliability of the component sensory signals. For otolithic input, a decreasing signal-to-noise ratio was demonstrated with increasing roll angle. We hypothesized that the weights of vestibular (otolithic) and extravestibular (visual/proprioceptive) sensors are roll-angle dependent and predicted an increased weight of extravestibular cues with increasing roll angle, potentially following the Bayesian hypothesis. To probe this concept, the subjective visual vertical (SVV) was assessed in different roll positions (≤ ± 120°, steps = 30°, n = 10) with/without presenting an optokinetic stimulus (velocity = ± 60°/s). The optokinetic stimulus biased the SVV toward the direction of stimulus rotation for roll angles ≥ ± 30° ( P < 0.005). Offsets grew from 3.9 ± 1.8° (upright) to 22.1 ± 11.8° (±120° roll tilt, P < 0.001). Trial-to-trial variability increased with roll angle, demonstrating a nonsignificant increase when providing optokinetic stimulation. Variability and optokinetic bias were correlated ( R 2 = 0.71, slope = 0.71, 95% confidence interval = 0.57-0.86). An optimal-observer model combining an optokinetic bias with vestibular input reproduced measured errors closely. These findings support the hypothesis of a weighted multisensory integration when estimating direction of gravity with optokinetic stimulation. Visual input was weighted more when vestibular input became less reliable, i.e., at larger roll-tilt angles. However, according to Bayesian theory, the variability of combined cues is always lower than the variability of each source cue. If the observed increase in variability, although nonsignificant, is true, either it must depend on an additional source of variability, added after SVV computation, or it would conflict with the Bayesian hypothesis. NEW & NOTEWORTHY Applying a rotating optokinetic stimulus while recording the subjective visual vertical in different whole body roll angles, we noted the optokinetic-induced bias to correlate with the roll angle. These findings allow the hypothesis that the established optimal weighting of single-sensory cues depending on their reliability to estimate direction of gravity could be extended to a bias caused by visual self-motion stimuli. Copyright © 2017 the American Physiological Society.
Clarke, Nicholas; McNamara, Deirdre; Kearney, Patricia M; O'Morain, Colm A; Shearer, Nikki; Sharp, Linda
2016-12-01
This study aimed to investigate the effects of sex and deprivation on participation in a population-based faecal immunochemical test (FIT) colorectal cancer screening programme. The study population included 9785 individuals invited to participate in two rounds of a population-based biennial FIT-based screening programme, in a relatively deprived area of Dublin, Ireland. Explanatory variables included in the analysis were sex, deprivation category of area of residence and age (at end of screening). The primary outcome variable modelled was participation status in both rounds combined (with "participation" defined as having taken part in either or both rounds of screening). Poisson regression with a log link and robust error variance was used to estimate relative risks (RR) for participation. As a sensitivity analysis, data were stratified by screening round. In both the univariable and multivariable models deprivation was strongly associated with participation. Increasing affluence was associated with higher participation; participation was 26% higher in people resident in the most affluent compared to the most deprived areas (multivariable RR=1.26: 95% CI 1.21-1.30). Participation was significantly lower in males (multivariable RR=0.96: 95%CI 0.95-0.97) and generally increased with increasing age (trend per age group, multivariable RR=1.02: 95%CI, 1.01-1.02). No significant interactions between the explanatory variables were found. The effects of deprivation and sex were similar by screening round. Deprivation and male gender are independently associated with lower uptake of population-based FIT colorectal cancer screening, even in a relatively deprived setting. Development of evidence-based interventions to increase uptake in these disadvantaged groups is urgently required. Copyright © 2016. Published by Elsevier Inc.
Stochastic empirical loading and dilution model (SELDM) version 1.0.0
Granato, Gregory E.
2013-01-01
The Stochastic Empirical Loading and Dilution Model (SELDM) is designed to transform complex scientific data into meaningful information about the risk of adverse effects of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such management measures for reducing these risks. The U.S. Geological Survey developed SELDM in cooperation with the Federal Highway Administration to help develop planning-level estimates of event mean concentrations, flows, and loads in stormwater from a site of interest and from an upstream basin. Planning-level estimates are defined as the results of analyses used to evaluate alternative management measures; planning-level estimates are recognized to include substantial uncertainties (commonly orders of magnitude). SELDM uses information about a highway site, the associated receiving-water basin, precipitation events, stormflow, water quality, and the performance of mitigation measures to produce a stochastic population of runoff-quality variables. SELDM provides input statistics for precipitation, prestorm flow, runoff coefficients, and concentrations of selected water-quality constituents from National datasets. Input statistics may be selected on the basis of the latitude, longitude, and physical characteristics of the site of interest and the upstream basin. The user also may derive and input statistics for each variable that are specific to a given site of interest or a given area. SELDM is a stochastic model because it uses Monte Carlo methods to produce the random combinations of input variable values needed to generate the stochastic population of values for each component variable. SELDM calculates the dilution of runoff in the receiving waters and the resulting downstream event mean concentrations and annual average lake concentrations. Results are ranked, and plotting positions are calculated, to indicate the level of risk of adverse effects caused by runoff concentrations, flows, and loads on receiving waters by storm and by year. Unlike deterministic hydrologic models, SELDM is not calibrated by changing values of input variables to match a historical record of values. Instead, input values for SELDM are based on site characteristics and representative statistics for each hydrologic variable. Thus, SELDM is an empirical model based on data and statistics rather than theoretical physiochemical equations. SELDM is a lumped parameter model because the highway site, the upstream basin, and the lake basin each are represented as a single homogeneous unit. Each of these source areas is represented by average basin properties, and results from SELDM are calculated as point estimates for the site of interest. Use of the lumped parameter approach facilitates rapid specification of model parameters to develop planning-level estimates with available data. The approach allows for parsimony in the required inputs to and outputs from the model and flexibility in the use of the model. For example, SELDM can be used to model runoff from various land covers or land uses by using the highway-site definition as long as representative water quality and impervious-fraction data are available.
ERIC Educational Resources Information Center
Humphreys, Betsy P.
2013-01-01
Universal developmental screening during pediatric well child care detects early delays in development and is a critical gateway to early intervention for young children at risk for Autism Spectrum Disorders (ASD). Developmental screening practices are highly variable, and few studies have examined screening utilization for children at risk for…
Screening and validation of EXTraS data products
NASA Astrophysics Data System (ADS)
Carpano, Stefania; Haberl, F.; De Luca, A.; Tiengo, A.: Israel, G.; Rodriguez, G.; Belfiore, A.; Rosen, S.; Read, A.; Wilms, J.; Kreikenbohm, A.; Law-Green, D.
2015-09-01
The EXTraS project (Exploring the X-ray Transient and variable Sky) is aimed at fullyexploring the serendipitous content of the XMM-Newton EPIC database in the timedomain. The project is funded within the EU/FP7-Cooperation Space framework and is carried out by a collaboration including INAF (Italy), IUSS (Italy), CNR/IMATI (Italy), University of Leicester (UK), MPE (Germany) and ECAP (Germany). The several tasks consist in characterise aperiodicvariability for all 3XMM sources, search for short-term periodic variability on hundreds of thousands sources, detect new transient sources that are missed by standard source detection and hence not belonging to the 3XMM catalogue, search for long term variability by measuring fluxes or upper limits for both pointed and slew observations, and finally perform multiwavelength characterisation andclassification. Screening and validation of the different products is essentially in order to reject flawed results, generated by the automatic pipelines. We present here the screening tool we developed in the form of a Graphical User Interface and our plans for a systematic screening of the different catalogues.
Evaluating variable rate fungicide applications for control of Sclerotinia
USDA-ARS?s Scientific Manuscript database
Oklahoma peanut growers continue to try to increase yields and reduce input costs. Perhaps the largest input in a peanut crop is fungicide applications. This is especially true for areas in the state that have high disease pressure from Sclerotinia. On average, a single fungicide application cost...
Human encroachment on the coastal zone has led to a rise in the delivery of nitrogen (N) to estuarine and near-shore waters. Potential routes of anthropogenic N inputs include export from estuaries, atmospheric deposition, and dissolved N inputs from groundwater outflow. Stable...
Learning a Novel Pattern through Balanced and Skewed Input
ERIC Educational Resources Information Center
McDonough, Kim; Trofimovich, Pavel
2013-01-01
This study compared the effectiveness of balanced and skewed input at facilitating the acquisition of the transitive construction in Esperanto, characterized by the accusative suffix "-n" and variable word order (SVO, OVS). Thai university students (N = 98) listened to 24 sentences under skewed (one noun with high token frequency) or…
Code of Federal Regulations, 2014 CFR
2014-10-01
... must be considered as essential variables: Number of passes; thickness of plate; heat input per pass... not be used. The number of passes, thickness of plate, and heat input per pass may not vary more than... machine heat processes, provided such surfaces are remelted in the subsequent welding process. Where there...
Code of Federal Regulations, 2013 CFR
2013-10-01
... must be considered as essential variables: Number of passes; thickness of plate; heat input per pass... not be used. The number of passes, thickness of plate, and heat input per pass may not vary more than... machine heat processes, provided such surfaces are remelted in the subsequent welding process. Where there...
Code of Federal Regulations, 2012 CFR
2012-10-01
... must be considered as essential variables: Number of passes; thickness of plate; heat input per pass... not be used. The number of passes, thickness of plate, and heat input per pass may not vary more than... machine heat processes, provided such surfaces are remelted in the subsequent welding process. Where there...
Code of Federal Regulations, 2011 CFR
2011-10-01
... must be considered as essential variables: Number of passes; thickness of plate; heat input per pass... not be used. The number of passes, thickness of plate, and heat input per pass may not vary more than... machine heat processes, provided such surfaces are remelted in the subsequent welding process. Where there...
Mu, Zhijian; Huang, Aiying; Ni, Jiupai; Xie, Deti
2014-01-01
Organic soils are an important source of N2O, but global estimates of these fluxes remain uncertain because measurements are sparse. We tested the hypothesis that N2O fluxes can be predicted from estimates of mineral nitrogen input, calculated from readily-available measurements of CO2 flux and soil C/N ratio. From studies of organic soils throughout the world, we compiled a data set of annual CO2 and N2O fluxes which were measured concurrently. The input of soil mineral nitrogen in these studies was estimated from applied fertilizer nitrogen and organic nitrogen mineralization. The latter was calculated by dividing the rate of soil heterotrophic respiration by soil C/N ratio. This index of mineral nitrogen input explained up to 69% of the overall variability of N2O fluxes, whereas CO2 flux or soil C/N ratio alone explained only 49% and 36% of the variability, respectively. Including water table level in the model, along with mineral nitrogen input, further improved the model with the explanatory proportion of variability in N2O flux increasing to 75%. Unlike grassland or cropland soils, forest soils were evidently nitrogen-limited, so water table level had no significant effect on N2O flux. Our proposed approach, which uses the product of soil-derived CO2 flux and the inverse of soil C/N ratio as a proxy for nitrogen mineralization, shows promise for estimating regional or global N2O fluxes from organic soils, although some further enhancements may be warranted.
Lang, Jason M; Connell, Christian M
2017-05-01
Childhood exposure to trauma, including violence and abuse, is a major public health concern that has resulted in increased efforts to promote trauma-informed child-serving systems. Trauma screening is an important component of such trauma-informed systems, yet widespread use of trauma screening is rare in part due to the lack of brief, validated trauma screening measures for children. We describe development and validation of the Child Trauma Screen (CTS), a 10-item screening measure of trauma exposure and posttraumatic stress disorder (PTSD) symptoms for children consistent with the DSM-5 definition of PTSD. Study 1 describes measure development incorporating analysis to derive items based on existing measures from 1,065 children and caregivers together with stakeholder input to finalize item selection. Study 2 describes validation of the CTS with a clinical sample of 74 children and their caregivers. Results support the CTS as an empirically derived, reliable measure to screen children for trauma exposure and PTSD symptoms with strong convergent, divergent, and criterion validity. The CTS is a promising measure for rapidly and reliably screening children for trauma exposure and PTSD symptoms. Future research is needed to confirm validation and to examine feasibility and utility of its use across various child-serving systems. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Sensitivity analysis of a sound absorption model with correlated inputs
NASA Astrophysics Data System (ADS)
Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.
2017-04-01
Sound absorption in porous media is a complex phenomenon, which is usually addressed with homogenized models, depending on macroscopic parameters. Since these parameters emerge from the structure at microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods of a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard model (JCA) is chosen as the objective model with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method Fourier Amplitude Sensitivity Test with Correlation design (FASTC), based on Iman's transform, is taken into application. This method requires a priori information such as variables' marginal distribution functions and their correlation matrix. The results are compared to the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix are studied. Finally the results of tests shows that the correlation has a very important impact on the results of sensitivity analysis. Assessment of correlation strength among input variables on the sensitivity analysis is also achieved.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
NASA Technical Reports Server (NTRS)
Aggarwal, Arun K.
1993-01-01
The computer program SASHBEAN (Sikorsky Aircraft Spherical Roller High Speed Bearing Analysis) analyzes and predicts the operating characteristics of a Single Row, Angular Contact, Spherical Roller Bearing (SRACSRB). The program runs on an IBM or IBM compatible personal computer, and for a given set of input data analyzes the bearing design for it's ring deflections (axial and radial), roller deflections, contact areas and stresses, induced axial thrust, rolling element and cage rotation speeds, lubrication parameters, fatigue lives, and amount of heat generated in the bearing. The dynamic loading of rollers due to centrifugal forces and gyroscopic moments, which becomes quite significant at high speeds, is fully considered in this analysis. For a known application and it's parameters, the program is also capable of performing steady-state and time-transient thermal analyses of the bearing system. The steady-state analysis capability allows the user to estimate the expected steady-state temperature map in and around the bearing under normal operating conditions. On the other hand, the transient analysis feature provides the user a means to simulate the 'lost lubricant' condition and predict a time-temperature history of various critical points in the system. The bearing's 'time-to-failure' estimate may also be made from this (transient) analysis by considering the bearing as failed when a certain temperature limit is reached in the bearing components. The program is fully interactive and allows the user to get started and access most of its features with a minimal of training. For the most part, the program is menu driven, and adequate help messages were provided to guide a new user through various menu options and data input screens. All input data, both for mechanical and thermal analyses, are read through graphical input screens, thereby eliminating any need of a separate text editor/word processor to edit/create data files. Provision is also available to select and view the contents of output files on the monitor screen if no paper printouts are required. A separate volume (Volume-2) of this documentation describes, in detail, the underlying mathematical formulations, assumptions, and solution algorithms of this program.
Investigation of energy management strategies for photovoltaic systems - An analysis technique
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1982-01-01
Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
Investigation of energy management strategies for photovoltaic systems - An analysis technique
NASA Astrophysics Data System (ADS)
Cull, R. C.; Eltimsahy, A. H.
Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
Troyer, T W; Miller, K D
1997-07-01
To understand the interspike interval (ISI) variability displayed by visual cortical neurons (Softky & Koch, 1993), it is critical to examine the dynamics of their neuronal integration, as well as the variability in their synaptic input current. Most previous models have focused on the latter factor. We match a simple integrate-and-fire model to the experimentally measured integrative properties of cortical regular spiking cells (McCormick, Connors, Lighthall, & Prince, 1985). After setting RC parameters, the post-spike voltage reset is set to match experimental measurements of neuronal gain (obtained from in vitro plots of firing frequency versus injected current). Examination of the resulting model leads to an intuitive picture of neuronal integration that unifies the seemingly contradictory 1/square root of N and random walk pictures that have previously been proposed. When ISIs are dominated by postspike recovery, 1/square root of N arguments hold and spiking is regular; after the "memory" of the last spike becomes negligible, spike threshold crossing is caused by input variance around a steady state and spiking is Poisson. In integrate-and-fire neurons matched to cortical cell physiology, steady-state behavior is predominant, and ISIs are highly variable at all physiological firing rates and for a wide range of inhibitory and excitatory inputs.
Analysis on electronic control unit of continuously variable transmission
NASA Astrophysics Data System (ADS)
Cao, Shuanggui
Continuously variable transmission system can ensure that the engine work along the line of best fuel economy, improve fuel economy, save fuel and reduce harmful gas emissions. At the same time, continuously variable transmission allows the vehicle speed is more smooth and improves the ride comfort. Although the CVT technology has made great development, but there are many shortcomings in the CVT. The CVT system of ordinary vehicles now is still low efficiency, poor starting performance, low transmission power, and is not ideal controlling, high cost and other issues. Therefore, many scholars began to study some new type of continuously variable transmission. The transmission system with electronic systems control can achieve automatic control of power transmission, give full play to the characteristics of the engine to achieve optimal control of powertrain, so the vehicle is always traveling around the best condition. Electronic control unit is composed of the core processor, input and output circuit module and other auxiliary circuit module. Input module collects and process many signals sent by sensor and , such as throttle angle, brake signals, engine speed signal, speed signal of input and output shaft of transmission, manual shift signals, mode selection signals, gear position signal and the speed ratio signal, so as to provide its corresponding processing for the controller core.
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g. touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch.We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
Esmaeili, Alireza; Stewart, Andrew M; Hopkins, William G; Elias, George P; Lazarus, Brendan H; Rowell, Amber E; Aughey, Robert J
2018-01-01
Aim: The sit and reach test (S&R), dorsiflexion lunge test (DLT), and adductor squeeze test (AST) are commonly used in weekly musculoskeletal screening for athlete monitoring and injury prevention purposes. The aim of this study was to determine the normal week to week variability of the test scores, individual differences in variability, and the effects of training load on the scores. Methods: Forty-four elite Australian rules footballers from one club completed the weekly screening tests on day 2 or 3 post-main training (pre-season) or post-match (in-season) over a 10 month season. Ratings of perceived exertion and session duration for all training sessions were used to derive various measures of training load via both simple summations and exponentially weighted moving averages. Data were analyzed via linear and quadratic mixed modeling and interpreted using magnitude-based inference. Results: Substantial small to moderate variability was found for the tests at both season phases; for example over the in-season, the normal variability ±90% confidence limits were as follows: S&R ±1.01 cm, ±0.12; DLT ±0.48 cm, ±0.06; AST ±7.4%, ±0.6%. Small individual differences in variability existed for the S&R and AST (factor standard deviations between 1.31 and 1.66). All measures of training load had trivial effects on the screening scores. Conclusion: A change in a test score larger than the normal variability is required to be considered a true change. Athlete monitoring and flagging systems need to account for the individual differences in variability. The tests are not sensitive to internal training load when conducted 2 or 3 days post-training or post-match, and the scores should be interpreted cautiously when used as measures of recovery.
Advances in EPA’s Rapid Exposure and Dosimetry Project (Interagency Alternatives Assessment Webinar)
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of chemicals. The CSS Rapid Exposure and Dosimetry project seeks to develop the data, tools, and evaluation approaches required to generate rapid and scientifical...
SDMProjectBuilder: SWAT Simulation and Calibration for Nutrient Fate and Transport
This tutorial reviews screens, icons, and basic functions for downloading flow, sediment, and nutrient observations for a watershed of interest; how to prepare SWAT-CUP input files for SWAT parameter calibration; and how to perform SWAT parameter calibration with SWAT-CUP. It dem...
Wastewater treatment plant (WWTP) effluents are a known contributor of chemical mixture inputs into the environment. Whole effluent testing guidelines were developed to screen these complex mixtures for acute toxicity. However, efficient and cost-effective approaches for screenin...
Spatial patterns of throughfall isotopic composition at the event and seasonal timescales
Scott T. Allen; Richard F. Keim; Jeffrey J. McDonnell
2015-01-01
Spatial variability of throughfall isotopic composition in forests is indicative of complex processes occurring in the canopy and remains insufficiently understood to properly characterize precipitation inputs to the catchment water balance. Here we investigate variability of throughfall isotopic composition with the objectives: (1) to quantify the spatial variability...
NASA Astrophysics Data System (ADS)
Garousi Nejad, I.; He, S.; Tang, Q.; Ogden, F. L.; Steinke, R. C.; Frazier, N.; Tarboton, D. G.; Ohara, N.; Lin, H.
2017-12-01
Spatial scale is one of the main considerations in hydrological modeling of snowmelt in mountainous areas. The size of model elements controls the degree to which variability can be explicitly represented versus what needs to be parameterized using effective properties such as averages or other subgrid variability parameterizations that may degrade the quality of model simulations. For snowmelt modeling terrain parameters such as slope, aspect, vegetation and elevation play an important role in the timing and quantity of snowmelt that serves as an input to hydrologic runoff generation processes. In general, higher resolution enhances the accuracy of the simulation since fine meshes represent and preserve the spatial variability of atmospheric and surface characteristics better than coarse resolution. However, this increases computational cost and there may be a scale beyond which the model response does not improve due to diminishing sensitivity to variability and irreducible uncertainty associated with the spatial interpolation of inputs. This paper examines the influence of spatial resolution on the snowmelt process using simulations of and data from the Animas River watershed, an alpine mountainous area in Colorado, USA, using an unstructured distributed physically based hydrological model developed for a parallel computing environment, ADHydro. Five spatial resolutions (30 m, 100 m, 250 m, 500 m, and 1 km) were used to investigate the variations in hydrologic response. This study demonstrated the importance of choosing the appropriate spatial scale in the implementation of ADHydro to obtain a balance between representing spatial variability and the computational cost. According to the results, variation in the input variables and parameters due to using different spatial resolution resulted in changes in the obtained hydrological variables, especially snowmelt, both at the basin-scale and distributed across the model mesh.
NASA Astrophysics Data System (ADS)
Srinivas, Kadivendi; Vundavilli, Pandu R.; Manzoor Hussain, M.; Saiteja, M.
2016-09-01
Welding input parameters such as current, gas flow rate and torch angle play a significant role in determination of qualitative mechanical properties of weld joint. Traditionally, it is necessary to determine the weld input parameters for every new welded product to obtain a quality weld joint which is time consuming. In the present work, the effect of plasma arc welding parameters on mild steel was studied using a neural network approach. To obtain a response equation that governs the input-output relationships, conventional regression analysis was also performed. The experimental data was constructed based on Taguchi design and the training data required for neural networks were randomly generated, by varying the input variables within their respective ranges. The responses were calculated for each combination of input variables by using the response equations obtained through the conventional regression analysis. The performances in Levenberg-Marquardt back propagation neural network and radial basis neural network (RBNN) were compared on various randomly generated test cases, which are different from the training cases. From the results, it is interesting to note that for the above said test cases RBNN analysis gave improved training results compared to that of feed forward back propagation neural network analysis. Also, RBNN analysis proved a pattern of increasing performance as the data points moved away from the initial input values.
Single-cell analysis of population context advances RNAi screening at multiple levels
Snijder, Berend; Sacher, Raphael; Rämö, Pauli; Liberali, Prisca; Mench, Karin; Wolfrum, Nina; Burleigh, Laura; Scott, Cameron C; Verheije, Monique H; Mercer, Jason; Moese, Stefan; Heger, Thomas; Theusner, Kristina; Jurgeit, Andreas; Lamparter, David; Balistreri, Giuseppe; Schelhaas, Mario; De Haan, Cornelis A M; Marjomäki, Varpu; Hyypiä, Timo; Rottier, Peter J M; Sodeik, Beate; Marsh, Mark; Gruenberg, Jean; Amara, Ali; Greber, Urs; Helenius, Ari; Pelkmans, Lucas
2012-01-01
Isogenic cells in culture show strong variability, which arises from dynamic adaptations to the microenvironment of individual cells. Here we study the influence of the cell population context, which determines a single cell's microenvironment, in image-based RNAi screens. We developed a comprehensive computational approach that employs Bayesian and multivariate methods at the single-cell level. We applied these methods to 45 RNA interference screens of various sizes, including 7 druggable genome and 2 genome-wide screens, analysing 17 different mammalian virus infections and four related cell physiological processes. Analysing cell-based screens at this depth reveals widespread RNAi-induced changes in the population context of individual cells leading to indirect RNAi effects, as well as perturbations of cell-to-cell variability regulators. We find that accounting for indirect effects improves the consistency between siRNAs targeted against the same gene, and between replicate RNAi screens performed in different cell lines, in different labs, and with different siRNA libraries. In an era where large-scale RNAi screens are increasingly performed to reach a systems-level understanding of cellular processes, we show that this is often improved by analyses that account for and incorporate the single-cell microenvironment. PMID:22531119
Thinking Style as a Predictor of Men’s Participation in Cancer Screening
McGuiness, Clare E.; Turnbull, Deborah; Wilson, Carlene; Duncan, Amy; Flight, Ingrid H.; Zajac, Ian
2016-01-01
Men’s participation in cancer screening may be influenced by their thinking style. Men’s need for cognition (NFC) and faith in intuition were measured to explore whether they varied by demographic variables or predicted screening behavior. Australian males (n = 585, aged 50-74 years) completed surveys about past screening and were subsequently offered mailed fecal occult blood tests (FOBTs). Demographic predictors included age, socioeconomic status, educational attainment, and language spoken at home. The screening behaviors were self-reported prostate cancer screening (prostate-specific antigen testing and digital rectal examinations [DREs]), and colorectal cancer screening (self-reported FOBT participation and recorded uptake of the FOBT offer). Analysis comprised principal component analysis and structural equation modelling. NFC was positively related to demographic variables education, socioeconomic status, and speaking English at home. Faith in intuition was negatively related to educational attainment. NFC predicted variance in self-reported DRE participation (r = .11, p = .016). No other relationships with thinking style were statistically significant. The relationship of NFC to DRE participation may reflect the way certain attributes of this screening method are processed, or alternatively, it may reflect willingness to report participation. The relationship of thinking style to a range of healthy behaviors should be further explored. PMID:27923966
Pointing Device Performance in Steering Tasks.
Senanayake, Ransalu; Goonetilleke, Ravindra S
2016-06-01
Use of touch-screen-based interactions is growing rapidly. Hence, knowing the maneuvering efficacy of touch screens relative to other pointing devices is of great importance in the context of graphical user interfaces. Movement time, accuracy, and user preferences of four pointing device settings were evaluated on a computer with 14 participants aged 20.1 ± 3.13 years. It was found that, depending on the difficulty of the task, the optimal settings differ for ballistic and visual control tasks. With a touch screen, resting the arm increased movement time for steering tasks. When both performance and comfort are considered, whether to use a mouse or a touch screen for person-computer interaction depends on the steering difficulty. Hence, a input device should be chosen based on the application, and should be optimized to match the graphical user interface. © The Author(s) 2016.
MODELING OF HUMAN EXPOSURE TO IN-VEHICLE PM2.5 FROM ENVIRONMENTAL TOBACCO SMOKE
Cao, Ye; Frey, H. Christopher
2012-01-01
Environmental tobacco smoke (ETS) is estimated to be a significant contributor to in-vehicle human exposure to fine particulate matter of 2.5 µm or smaller (PM2.5). A critical assessment was conducted of a mass balance model for estimating PM2.5 concentration with smoking in a motor vehicle. Recommendations for the range of inputs to the mass-balance model are given based on literature review. Sensitivity analysis was used to determine which inputs should be prioritized for data collection. Air exchange rate (ACH) and the deposition rate have wider relative ranges of variation than other inputs, representing inter-individual variability in operations, and inter-vehicle variability in performance, respectively. Cigarette smoking and emission rates, and vehicle interior volume, are also key inputs. The in-vehicle ETS mass balance model was incorporated into the Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) model to quantify the potential magnitude and variability of in-vehicle exposures to ETS. The in-vehicle exposure also takes into account near-road incremental PM2.5 concentration from on-road emissions. Results of probabilistic study indicate that ETS is a key contributor to the in-vehicle average and high-end exposure. Factors that mitigate in-vehicle ambient PM2.5 exposure lead to higher in-vehicle ETS exposure, and vice versa. PMID:23060732
Computing the structural influence matrix for biological systems.
Giordano, Giulia; Cuba Samaniego, Christian; Franco, Elisa; Blanchini, Franco
2016-06-01
We consider the problem of identifying structural influences of external inputs on steady-state outputs in a biological network model. We speak of a structural influence if, upon a perturbation due to a constant input, the ensuing variation of the steady-state output value has the same sign as the input (positive influence), the opposite sign (negative influence), or is zero (perfect adaptation), for any feasible choice of the model parameters. All these signs and zeros can constitute a structural influence matrix, whose (i, j) entry indicates the sign of steady-state influence of the jth system variable on the ith variable (the output caused by an external persistent input applied to the jth variable). Each entry is structurally determinate if the sign does not depend on the choice of the parameters, but is indeterminate otherwise. In principle, determining the influence matrix requires exhaustive testing of the system steady-state behaviour in the widest range of parameter values. Here we show that, in a broad class of biological networks, the influence matrix can be evaluated with an algorithm that tests the system steady-state behaviour only at a finite number of points. This algorithm also allows us to assess the structural effect of any perturbation, such as variations of relevant parameters. Our method is applied to nontrivial models of biochemical reaction networks and population dynamics drawn from the literature, providing a parameter-free insight into the system dynamics.
An anesthesia information system for monitoring and record keeping during surgical anesthesia.
Klocke, H; Trispel, S; Rau, G; Hatzky, U; Daub, D
1986-10-01
We have developed an anesthesia information system (AIS) that supports the anesthesiologist in monitoring and recording during a surgical operation. In development of the system, emphasis was placed on providing an anesthesiologist-computer interface that can be adapted to typical situations during anesthesia and to individual user behavior. One main feature of this interface is the integration of the input and output of information. The only device for interaction between the anesthesiologist and the AIS is a touch-sensitive, high-resolution color display screen. The anesthesiologist enters information by touching virtual function keys displayed on the screen. A data window displays all data generated over time, such as automatically recorded vital signs, including blood pressure, heart rate, and rectal and esophageal temperatures, and manually entered variables, such as administered drugs, and ventilator settings. The information gathered by the AIS is presented on the cathode ray tube in several pages. A main distributor page gives an overall view of the content of every work page. A one-page record of the anesthesia is automatically plotted on a multicolor digital plotter during the operation. An example of the use of the AIS is presented from a field test of the system during which it was evaluated in the operating room without interfering with the ongoing operation. Medical staff who used the AIS imitated the anesthesiologist's recording and information search behavior but did not have responsibility for the conduct of the anesthetic.
Propagation of variability in railway dynamic simulations: application to virtual homologation
NASA Astrophysics Data System (ADS)
Funfschilling, Christine; Perrin, Guillaume; Kraft, Sönke
2012-01-01
Railway dynamic simulations are increasingly used to predict and analyse the behaviour of the vehicle and of the track during their whole life cycle. Up to now however, no simulation has been used in the certification procedure even if the expected benefits are important: cheaper and shorter procedures, more objectivity, better knowledge of the behaviour around critical situations. Deterministic simulations are nevertheless too poor to represent the whole physical of the track/vehicle system which contains several sources of variability: variability of the mechanical parameters of a train among a class of vehicles (mass, stiffness and damping of different suspensions), variability of the contact parameters (friction coefficient, wheel and rail profiles) and variability of the track design and quality. This variability plays an important role on the safety, on the ride quality, and thus on the certification criteria. When using the simulation for certification purposes, it seems therefore crucial to take into account the variability of the different inputs. The main goal of this article is thus to propose a method to introduce the variability in railway dynamics. A four-step method is described namely the definition of the stochastic problem, the modelling of the inputs variability, the propagation and the analysis of the output. Each step is illustrated with railway examples.
Becságh, Péter; Szakács, Orsolya
2014-10-01
During diagnostic workflow when detecting sequence alterations, sometimes it is important to design an algorithm that includes screening and direct tests in combination. Normally the use of direct test, which is mainly sequencing, is limited. There is an increased need for effective screening tests, with "closed tube" during the whole process and therefore decreasing the risk of PCR product contamination. The aim of this study was to design such a closed tube, detection probe based screening assay to detect different kind of sequence alterations in the exon 11 of the human c-kit gene region. Inside this region there are variable possible deletions and single nucleotide changes. During assay setup, more probe chemistry formats were screened and tested. After some optimization steps the taqman probe format was selected.
Jones, Loretta; Bazargan, Mohsen; Lucas-Wright, Anna; Vadgama, Jaydutt V; Vargas, Roberto; Smith, James; Otoukesh, Salman; Maxwell, Annette E
2013-01-01
Most theoretical formulations acknowledge that knowledge and awareness of cancer screening and prevention recommendations significantly influence health behaviors. This study compares perceived knowledge of cancer prevention and screening with test-based knowledge in a community sample. We also examine demographic variables and self-reported cancer screening and prevention behaviors as correlates of both knowledge scores, and consider whether cancer related knowledge can be accurately assessed using just a few, simple questions in a short and easy-to-complete survey. We used a community-partnered participatory research approach to develop our study aims and a survey. The study sample was composed of 180 predominantly African American and Hispanic community individuals who participated in a full-day cancer prevention and screening promotion conference in South Los Angeles, California, on July 2011. Participants completed a self-administered survey in English or Spanish at the beginning of the conference. Our data indicate that perceived and test-based knowledge scores are only moderately correlated. Perceived knowledge score shows a stronger association with demographic characteristics and other cancer related variables than the test-based score. Thirteen out of twenty variables that are examined in our study showed a statistically significant correlation with the perceived knowledge score, however, only four variables demonstrated a statistically significant correlation with the test-based knowledge score. Perceived knowledge of cancer prevention and screening was assessed with fewer items than test-based knowledge. Thus, using this assessment could potentially reduce respondent burden. However, our data demonstrate that perceived and test-based knowledge are separate constructs.
Adamson, David N; Mustafi, Debarshi; Zhang, John X J; Zheng, Bo; Ismagilov, Rustem F
2006-09-01
This paper reports a method for the production of arrays of nanolitre plugs with distinct chemical compositions. One of the primary constraints on the use of plug-based microfluidics for large scale biological screening is the difficulty of fabricating arrays of chemically distinct plugs on the nanolitre scale. Here, using microfluidic devices with several T-junctions linked in series, a single input array of large (approximately 320 nL) plugs was split to produce 16 output arrays of smaller (approximately 20 nL) plugs; the composition and configuration of these arrays were identical to that of the input. This paper shows how the passive break-up of plugs in T-junction microchannel geometries can be used to produce a set of smaller-volume output arrays useful for chemical screening from a single large-volume array. A simple theoretical description is presented to describe splitting as a function of the Capillary number, the capillary pressure, the total pressure difference across the channel, and the geometric fluidic resistance. By accounting for these considerations, plug coalescence and plug-plug contamination can be eliminated from the splitting process and the symmetry of splitting can be preserved. Furthermore, single-outlet splitting devices were implemented with both valve- and volume-based methods for coordinating the release of output arrays. Arrays of plugs containing commercial sparse matrix screens were obtained from the presented splitting method and these arrays were used in protein crystallization trials. The techniques presented in this paper may facilitate the implementation of high-throughput chemical and biological screening.
Ogunyemi, Omolola; Teklehaimanot, Senait; Patty, Lauren; Moran, Erin; George, Sheba
2013-01-01
Introduction Screening guidelines for diabetic patients recommend yearly eye examinations to detect diabetic retinopathy and other forms of diabetic eye disease. However, annual screening rates for retinopathy in US urban safety net settings remain low. Methods Using data gathered from a study of teleretinal screening in six urban safety net clinics, we assessed whether predictive modeling could be of value in identifying patients at risk of developing retinopathy. We developed and examined the accuracy of two predictive modeling approaches for diabetic retinopathy in a sample of 513 diabetic individuals, using routinely available clinical variables from retrospective medical record reviews. Bayesian networks and radial basis function (neural) networks were learned using ten-fold cross-validation. Results The predictive models were modestly predictive with the best model having an AUC of 0.71. Discussion Using routinely available clinical variables to predict patients at risk of developing retinopathy and to target them for annual eye screenings may be of some usefulness to safety net clinics. PMID:23920536
Ogunyemi, Omolola; Teklehaimanot, Senait; Patty, Lauren; Moran, Erin; George, Sheba
2013-01-01
Screening guidelines for diabetic patients recommend yearly eye examinations to detect diabetic retinopathy and other forms of diabetic eye disease. However, annual screening rates for retinopathy in US urban safety net settings remain low. Using data gathered from a study of teleretinal screening in six urban safety net clinics, we assessed whether predictive modeling could be of value in identifying patients at risk of developing retinopathy. We developed and examined the accuracy of two predictive modeling approaches for diabetic retinopathy in a sample of 513 diabetic individuals, using routinely available clinical variables from retrospective medical record reviews. Bayesian networks and radial basis function (neural) networks were learned using ten-fold cross-validation. The predictive models were modestly predictive with the best model having an AUC of 0.71. Using routinely available clinical variables to predict patients at risk of developing retinopathy and to target them for annual eye screenings may be of some usefulness to safety net clinics.
African crop yield reductions due to increasingly unbalanced Nitrogen and Phosphorus consumption
NASA Astrophysics Data System (ADS)
van der Velde, Marijn; Folberth, Christian; Balkovič, Juraj; Ciais, Philippe; Fritz, Steffen; Janssens, Ivan A.; Obersteiner, Michael; See, Linda; Skalský, Rastislav; Xiong, Wei; Peñuealas, Josep
2014-05-01
The impact of soil nutrient depletion on crop production has been known for decades, but robust assessments of the impact of increasingly unbalanced nitrogen (N) and phosphorus (P) application rates on crop production are lacking. Here, we use crop response functions based on 741 FAO maize crop trials and EPIC crop modeling across Africa to examine maize yield deficits resulting from unbalanced N:P applications under low, medium, and high input scenarios, for past (1975), current, and future N:P mass ratios of respectively, 1:0.29, 1:0.15, and 1:0.05. At low N inputs (10 kg/ha), current yield deficits amount to 10% but will increase up to 27% under the assumed future N:P ratio, while at medium N inputs (50 kg N/ha), future yield losses could amount to over 40%. The EPIC crop model was then used to simulate maize yields across Africa. The model results showed relative median future yield reductions at low N inputs of 40%, and 50% at medium and high inputs, albeit with large spatial variability. Dominant low-quality soils such as Ferralsols, which are strongly adsorbing P, and Arenosols with a low nutrient retention capacity, are associated with a strong yield decline, although Arenosols show very variable crop yield losses at low inputs. Optimal N:P ratios, i.e. those where the lowest amount of applied P produces the highest yield (given N input) where calculated with EPIC to be as low as 1:0.5. Finally, we estimated the additional P required given current N inputs, and given N inputs that would allow Africa to close yield gaps (ca. 70%). At current N inputs, P consumption would have to increase 2.3-fold to be optimal, and to increase 11.7-fold to close yield gaps. The P demand to overcome these yield deficits would provide a significant additional pressure on current global extraction of P resources.
Applying operations research to optimize a novel population management system for cancer screening.
Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J
2014-02-01
To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management.
EPA Exposure Research and the ExpoCast Project: New Methods and New Data (NIEHS Exposome webinar)
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of thousands of chemicals. In a 2009 commentary in Environmental Health Perspectives, Shelden and Hubal proposed that “Novel statistical and informatic approaches...
DOT National Transportation Integrated Search
2012-03-31
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
DOT National Transportation Integrated Search
2012-03-01
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
Recent advances in targeted RNA-Seq technology allow researchers to efficiently and cost-effectively obtain whole transcriptome profiles using picograms of mRNA from human cell lysates. Low mRNA input requirements and sample multiplexing capabilities has made time- and concentrat...
Numerical, mathematical models of water and chemical movement in soils are used as decision aids for determining soil screening levels (SSLs) of radionuclides in the unsaturated zone. Many models require extensive input parameters which include uncertainty due to soil variabil...
SDMProjectBuilder: SWAT Setup for Nutrient Fate and Transport
This tutorial reviews some of the screens, icons, and basic functions of the SDMProjectBuilder (SDMPB) and explains how one uses SDMPB output to populate the Soil and Water Assessment Tool (SWAT) input files for nutrient fate and transport modeling in the Salt River Basin. It dem...
Multimodal Application for Foreign Language Teaching
ERIC Educational Resources Information Center
Magal-Royo, Teresa; Gimenez-Lopez, Jose Luis; Pairy, Blas; Garcia-Laborda, Jesus; Gonzalez-Del Rio, Jimena
2011-01-01
The current development of educational applications for language learning has experienced a qualitative change in the criteria of interaction between users and devices due to the technological advances of input and output data through keyboard, mouse, stylus, tactile screen, etc. The multiple interactions generated in a natural way by humans…
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of chemicals. This project seeks to develop the data, tools, and evaluation approaches required to generate rapid and scientifically-defensible exposure predictio...
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
USDA-ARS?s Scientific Manuscript database
Precipitation patterns and nutrient inputs impact transport of nitrate (NO3-N) and phosphorus (TP) from Midwest watersheds. Nutrient concentrations and yields from two subsurface-drained watersheds, the Little Cobb River (LCR) in southern Minnesota and the South Fork Iowa River (SFIR) in northern Io...
Software development guidelines
NASA Technical Reports Server (NTRS)
Kovalevsky, N.; Underwood, J. M.
1979-01-01
Analysis, modularization, flowcharting, existing programs and subroutines, compatibility, input and output data, adaptability to checkout, and general-purpose subroutines are summarized. Statement ordering and numbering, specification statements, variable names, arrays, arithemtical expressions and statements, control statements, input/output, and subroutines are outlined. Intermediate results, desk checking, checkout data, dumps, storage maps, diagnostics, and program timing are reviewed.
Testing an Instructional Model in a University Educational Setting from the Student's Perspective
ERIC Educational Resources Information Center
Betoret, Fernando Domenech
2006-01-01
We tested a theoretical model that hypothesized relationships between several variables from input, process and product in an educational setting, from the university student's perspective, using structural equation modeling. In order to carry out the analysis, we measured in sequential order the input (referring to students' personal…
Estuaries in the Pacific Northwest have major intraannual and within estuary variation in sources and magnitudes of nutrient inputs. To develop an approach for setting nutrient criteria for these systems, we conducted a case study for Yaquina Bay, OR based on a synthesis of resea...
NASA Technical Reports Server (NTRS)
Dash, S. M.; Pergament, H. S.
1978-01-01
The basic code structure is discussed, including the overall program flow and a brief description of all subroutines. Instructions on the preparation of input data, definitions of key FORTRAN variables, sample input and output, and a complete listing of the code are presented.
Model predictive controller design for boost DC-DC converter using T-S fuzzy cost function
NASA Astrophysics Data System (ADS)
Seo, Sang-Wha; Kim, Yong; Choi, Han Ho
2017-11-01
This paper proposes a Takagi-Sugeno (T-S) fuzzy method to select cost function weights of finite control set model predictive DC-DC converter control algorithms. The proposed method updates the cost function weights at every sample time by using T-S type fuzzy rules derived from the common optimal control engineering knowledge that a state or input variable with an excessively large magnitude can be penalised by increasing the weight corresponding to the variable. The best control input is determined via the online optimisation of the T-S fuzzy cost function for all the possible control input sequences. This paper implements the proposed model predictive control algorithm in real time on a Texas Instruments TMS320F28335 floating-point Digital Signal Processor (DSP). Some experimental results are given to illuminate the practicality and effectiveness of the proposed control system under several operating conditions. The results verify that our method can yield not only good transient and steady-state responses (fast recovery time, small overshoot, zero steady-state error, etc.) but also insensitiveness to abrupt load or input voltage parameter variations.
Leung, Doris Y P; Wong, Eliza M L; Chan, Carmen W H
2016-04-01
The prevalence of colorectal cancer (CRC) among older people is high. Screening for CRC presents a cost-effective secondary prevention and control strategy which results in a significant reduction in mortality. This study aims to describe the prevalence of CRC screening and examine its risk factors among Chinese community-dwelling older people guided by a comprehensive model combining Health Belief Model and Extended Parallel Processing Model. A descriptive correlational study was conducted. A convenience sample of 240 community-dwelling adults aged ≥60 was recruited in May-July in 2012 in Hong Kong. Participants were asked to complete a questionnaire which collected information on demographic variables, CRC-related psychosocial variables and whether they had a CRC screening in the past 10 years. Among the participants, 25.4% reported having a CRC screening test. Results of logistic regression analyses indicated that participants with a higher level in cue to action, and lower perceived knowledge barriers and severity-fear were significantly associated with participation in CRC screening. But there were no significant associations between fatalism and cancer fear with screening. The prevalence of CRC screening was low in Hong Kong Chinese community-dwelling elders. A number of modifiable factors associated with CRC screening were identified which provides specific targets for interventions. This study also adds to the knowledge regarding the associations between fatalism and fear with CRC screening behaviors among Chinese older people. Copyright © 2015 Elsevier Ltd. All rights reserved.
Danyliv, Andriy; Gillespie, Paddy; O'Neill, Ciaran; Tierney, Marie; O'Dea, Angela; McGuire, Brian E; Glynn, Liam G; Dunne, Fidelma P
2016-03-01
The aim of the study was to assess the cost-effectiveness of screening for gestational diabetes mellitus (GDM) in primary and secondary care settings, compared with a no-screening option, in the Republic of Ireland. The analysis was based on a decision-tree model of alternative screening strategies in primary and secondary care settings. It synthesised data generated from a randomised controlled trial (screening uptake) and from the literature. Costs included those relating to GDM screening and treatment, and the care of adverse outcomes. Effects were assessed in terms of quality-adjusted life years (QALYs). The impact of the parameter uncertainty was assessed in a range of sensitivity analyses. Screening in either setting was found to be superior to no screening, i.e. it provided for QALY gains and cost savings. Screening in secondary care was found to be superior to screening in primary care, providing for modest QALY gains of 0.0006 and a saving of €21.43 per screened case. The conclusion held with high certainty across the range of ceiling ratios from zero to €100,000 per QALY and across a plausible range of input parameters. The results of this study demonstrate that implementation of universal screening is cost-effective. This is an argument in favour of introducing a properly designed and funded national programme of screening for GDM, although affordability remains to be assessed. In the current environment, screening for GDM in secondary care settings appears to be the better solution in consideration of cost-effectiveness.
Screening for EIA in India: enhancing effectiveness through ecological carrying capacity approach.
Rajaram, T; Das, Ashutosh
2011-01-01
Developing countries across the world have embraced the policy of high economic growth as a means to reduce poverty. This economic growth largely based on industrial output is fast degrading the ecosystems, jeopardizing their long term sustainability. Environmental Impact Assessment (EIA) has long been recognized as a tool which can help in protecting the ecosystems and aid sustainable development. The Screening guidelines for EIA reflect the level of commitment the nation displays towards tightening its environmental protection system. The paper analyses the screening process for EIA in India and dissects the rationale behind the exclusions and thresholds set in the screening process. The screening process in India is compared with that of the European Union with the aim of understanding the extent of deviations from a screening approach in the context of better economic development. It is found that the Indian system excludes many activities from the purview of screening itself when compared to the EU. The constraints responsible for these exclusions are discussed and the shortcomings of the current command and control system of environmental management in India are also explained. It is suggested that an ecosystem carrying capacity based management system can provide significant inputs to enhance the effectiveness of EIA process from screening to monitoring. Copyright © 2010 Elsevier Ltd. All rights reserved.
Natowicz, Marvin R; Hiller, Elaine H
2002-01-01
Newborn screening programs collectively administer the largest genetic testing initiative in the United States. The redress of grievances is an important mechanism for consumers to provide input into clinical and public health programs. In this study, we evaluated mechanisms for addressing consumer grievances in newborn screening programs. To do this, we surveyed all 50 state plus the District of Columbia newborn screening programs by questionnaire regarding protocols for receipt and redress of problems reported by parents of newborns and ascertained the existence and nature of complaints and how complaints were documented and addressed. Pertinent state and federal legislation and regulation were also reviewed. Six of 49 newborn screening programs reported having formal policies for handling consumer grievances. Four states reported having pertinent legislation or regulation. Thirty-eight of 49 states reported having received complaints from 1993 to 1995. Thirteen of 49 newborn screening programs reported that they actively seek feedback from consumers. Consumer grievances ranged from minor complaints to potentially life-threatening concerns. In general, complaints are managed on an ad hoc basis; formal policies are typically lacking. As newborn screening programs affect a vast number of Americans, a proactive and comprehensive approach, including solicitation of consumer feedback, could benefit both newborn screening programs and the public served by them.
Organizational Factors Affecting the Likelihood of Cancer Screening Among VA Patients.
Chou, Ann F; Rose, Danielle E; Farmer, Melissa; Canelo, Ismelda; Yano, Elizabeth M
2015-12-01
Preventive service delivery, including cancer screenings, continues to pose a challenge to quality improvement efforts. Although many studies have focused on person-level characteristics associated with screening, less is known about organizational influences on cancer screening. This study aims to understand the association between organizational factors and adherence to cancer screenings. This study employed a cross-sectional design using organizational-level, patient-level, and area-level data. Dependent variables included breast, cervical, and colorectal cancer screening. Organizational factors describing resource sufficiency were constructed using factor analyses from a survey of 250 Veterans Affairs primary care directors. We conducted random-effects logistic regression analyses, modeling cancer screening as a function of organizational factors, controlling for patient-level and area-level factors. Overall, 87% of the patients received mammograms, 92% received cervical and 78% had colorectal screening. Quality improvement orientation increased the odds of cervical [odds ratio (OR): 1.27; 95% confidence interval (CI), 1.03-1.57] and colorectal cancer screening (OR: 1.10; 95% CI, 1.00-1.20). Authority in determining primary care components increased the odds of mammography screening (OR: 1.23; 95% CI, 1.03-1.51). Sufficiency in clinical staffing increased the odds of mammography and cervical cancer screenings. Several patient-level factors, serving as control variables, were associated with achievement of screenings. Resource sufficiency led to increased odds of screening possibly because they promote excellence in patient care by conveying organizational goals and facilitate goal achievement with resources. Complementary to patient-level factors, our findings identified organizational processes associated with better performance, which offer concrete strategies in which facilities can evaluate their capabilities to implement best practices to foster and sustain a culture of quality care.
ERIC Educational Resources Information Center
Unic, Ivana; Stalmeier, Peep F. M.; Peer, Petronella G. M.; van Daal, Willem A. J.
1997-01-01
Studies of variables predicting familial breast cancer (N=59) were analyzed to develop screening recommendations for women with nonhereditary familial breast cancer present. The pooled relative risk (RR) and cumulative probability were used to estimate risk. Data and conclusions are presented. Recommendations for screening and counseling are…
Learning from adaptive neural dynamic surface control of strict-feedback systems.
Wang, Min; Wang, Cong
2015-06-01
Learning plays an essential role in autonomous control systems. However, how to achieve learning in the nonstationary environment for nonlinear systems is a challenging problem. In this paper, we present learning method for a class of n th-order strict-feedback systems by adaptive dynamic surface control (DSC) technology, which achieves the human-like ability of learning by doing and doing with learned knowledge. To achieve the learning, this paper first proposes stable adaptive DSC with auxiliary first-order filters, which ensures the boundedness of all the signals in the closed-loop system and the convergence of tracking errors in a finite time. With the help of DSC, the derivative of the filter output variable is used as the neural network (NN) input instead of traditional intermediate variables. As a result, the proposed adaptive DSC method reduces greatly the dimension of NN inputs, especially for high-order systems. After the stable DSC design, we decompose the stable closed-loop system into a series of linear time-varying perturbed subsystems. Using a recursive design, the recurrent property of NN input variables is easily verified since the complexity is overcome using DSC. Subsequently, the partial persistent excitation condition of the radial basis function NN is satisfied. By combining a state transformation, accurate approximations of the closed-loop system dynamics are recursively achieved in a local region along recurrent orbits. Then, the learning control method using the learned knowledge is proposed to achieve the closed-loop stability and the improved control performance. Simulation studies are performed to demonstrate the proposed scheme can not only reuse the learned knowledge to achieve the better control performance with the faster tracking convergence rate and the smaller tracking error but also greatly alleviate the computational burden because of reducing the number and complexity of NN input variables.
NASA Astrophysics Data System (ADS)
Dumedah, Gift; Walker, Jeffrey P.
2017-03-01
The sources of uncertainty in land surface models are numerous and varied, from inaccuracies in forcing data to uncertainties in model structure and parameterizations. Majority of these uncertainties are strongly tied to the overall makeup of the model, but the input forcing data set is independent with its accuracy usually defined by the monitoring or the observation system. The impact of input forcing data on model estimation accuracy has been collectively acknowledged to be significant, yet its quantification and the level of uncertainty that is acceptable in the context of the land surface model to obtain a competitive estimation remain mostly unknown. A better understanding is needed about how models respond to input forcing data and what changes in these forcing variables can be accommodated without deteriorating optimal estimation of the model. As a result, this study determines the level of forcing data uncertainty that is acceptable in the Joint UK Land Environment Simulator (JULES) to competitively estimate soil moisture in the Yanco area in south eastern Australia. The study employs hydro genomic mapping to examine the temporal evolution of model decision variables from an archive of values obtained from soil moisture data assimilation. The data assimilation (DA) was undertaken using the advanced Evolutionary Data Assimilation. Our findings show that the input forcing data have significant impact on model output, 35% in root mean square error (RMSE) for 5cm depth of soil moisture and 15% in RMSE for 15cm depth of soil moisture. This specific quantification is crucial to illustrate the significance of input forcing data spread. The acceptable uncertainty determined based on dominant pathway has been validated and shown to be reliable for all forcing variables, so as to provide optimal soil moisture. These findings are crucial for DA in order to account for uncertainties that are meaningful from the model standpoint. Moreover, our results point to a proper treatment of input forcing data in general land surface and hydrological model estimation.
SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahlfeld, R., E-mail: r.ahlfeld14@imperial.ac.uk; Belkouchi, B.; Montomoli, F.
2016-09-01
A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the algorithm of aPC and extends the method, that was previously only introduced as tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrixmore » is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work, is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules. SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, Daniel D; Wernicke, A Gabriella; Nori, Dattatreyudu
Purpose/Objective(s): The aim of this study is to build the estimator of toxicity using artificial neural network (ANN) for head and neck cancer patients Materials/Methods: An ANN can combine variables into a predictive model during training and considered all possible correlations of variables. We constructed an ANN based on the data from 73 patients with advanced H and N cancer treated with external beam radiotherapy and/or chemotherapy at our institution. For the toxicity estimator we defined input data including age, sex, site, stage, pathology, status of chemo, technique of external beam radiation therapy (EBRT), length of treatment, dose of EBRT,more » status of post operation, length of follow-up, the status of local recurrences and distant metastasis. These data were digitized based on the significance and fed to the ANN as input nodes. We used 20 hidden nodes (for the 13 input nodes) to take care of the correlations of input nodes. For training ANN, we divided data into three subsets such as training set, validation set and test set. Finally, we built the estimator for the toxicity from ANN output. Results: We used 13 input variables including the status of local recurrences and distant metastasis and 20 hidden nodes for correlations. 59 patients for training set, 7 patients for validation set and 7 patients for test set and fed the inputs to Matlab neural network fitting tool. We trained the data within 15% of errors of outcome. In the end we have the toxicity estimation with 74% of accuracy. Conclusion: We proved in principle that ANN can be a very useful tool for predicting the RT outcomes for high risk H and N patients. Currently we are improving the results using cross validation.« less
SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos
NASA Astrophysics Data System (ADS)
Ahlfeld, R.; Belkouchi, B.; Montomoli, F.
2016-09-01
A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the algorithm of aPC and extends the method, that was previously only introduced as tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work, is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules. SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.
Bittner, J.W.; Biscardi, R.W.
1991-03-19
An electronic measurement circuit is disclosed for high speed comparison of the relative amplitudes of a predetermined number of electrical input signals independent of variations in the magnitude of the sum of the signals. The circuit includes a high speed electronic switch that is operably connected to receive on its respective input terminals one of said electrical input signals and to have its common terminal serve as an input for a variable-gain amplifier-detector circuit that is operably connected to feed its output to a common terminal of a second high speed electronic switch. The respective terminals of the second high speed electronic switch are operably connected to a plurality of integrating sample and hold circuits, which in turn have their outputs connected to a summing logic circuit that is operable to develop first, second and third output voltages, the first output voltage being proportional to a predetermined ratio of sums and differences between the compared input signals, the second output voltage being proportional to a second summed ratio of predetermined sums and differences between said input signals, and the third output voltage being proportional to the sum of signals to the summing logic circuit. A servo system that is operably connected to receive said third output signal and compare it with a reference voltage to develop a slowly varying feedback voltage to control the variable-gain amplifier in said common amplifier-detector circuit in order to make said first and second output signals independent of variations in the magnitude of the sum of said input signals. 2 figures.
Bittner, John W.; Biscardi, Richard W.
1991-01-01
An electronic measurement circuit for high speed comparison of the relative amplitudes of a predetermined number of electrical input signals independent of variations in the magnitude of the sum of the signals. The circuit includes a high speed electronic switch that is operably connected to receive on its respective input terminals one of said electrical input signals and to have its common terminal serve as an input for a variable-gain amplifier-detector circuit that is operably connected to feed its output to a common terminal of a second high speed electronic switch. The respective terminals of the second high speed electronic switch are operably connected to a plurality of integrating sample and hold circuits, which in turn have their outputs connected to a summing logic circuit that is operable to develop first, second and third output voltages, the first output voltage being proportional to a predetermined ratio of sums and differences between the compared input signals, the second output voltage being proportional to a second summed ratio of predetermined sums and differences between said input signals, and the third output voltage being proportional to the sum of signals to the summing logic circuit. A servo system that is operably connected to receive said third output signal and compare it with a reference voltage to develop a slowly varying feedback voltage to control the variable-gain amplifier in said common amplifier-detector circuit in order to make said first and second output signals independent of variations in the magnitude of the sum of said input signals.
Biological variability of transferrin saturation and unsaturated iron binding capacity
Adams, PC; Reboussin, DM; Press, RD; Barton, JC; Acton, RT; Moses, GC; Leiendecker-Foster, C; McLaren, GD; Dawkins, FW; Gordeuk, VR; Lovato, L; Eckfeldt, JH
2007-01-01
Background Transferrin saturation is widely considered the preferred screening test for hemochromatosis. Unsaturated iron binding capacity has similar performance at lower cost. However, the within-person biological variability of both these tests may limit their ability at commonly used cut points to detect HFE C282Y homozygous patients. Methods The Hemochromatosis and Iron Overload Screening (HEIRS) Study screened 101,168 primary care participants for iron overload using tansferrin saturation, unsaturated iron binding capacity, ferritin and HFE C282Y and H63D genotyping. Transferrin saturation and unsaturated iron binding capacity were performed at initial screening and again when selected participants and controls returned for a clinical examination several months later. A missed case was defined as a C282Y homozygote who had transferrin saturation below cut point (45 % women, 50 % men) or unsaturated iron binding capacity above cut point (150 μmol/L women, 125 μmol/L men) at either the initial screening or clinical examination, or both, regardless of serum ferritin. Results There were 209 C282Y previously undiagnosed homozygotes with transferrin saturation and unsaturated iron binding capacity testing done at initial screening and clinical examination. Sixty-eight C282Y homozygotes (33%) would have been missed at these transferrin saturation cut points (19 men, 49 women, median SF 170 μg/L, first and third quartiles 50 and 474 μg/L), and 58 homozygotes (28 %) would have been missed at the unsaturated iron binding capacity cut points (20 men, 38 women, median SF 168 μg/L, quartiles 38 and 454 μg/L). There was no advantage to using fasting samples. Conclusions The within-person biological variability of transferrin saturation and unsaturated iron binding capacity limit their usefulness as an initial screening test for expressing C282Y homozygotes. PMID:17976429
An exact algebraic solution of the infimum in H-infinity optimization with output feedback
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Ly, Uy-Loi
1991-01-01
This paper presents a simple and noniterative procedure for the computation of the exact value of the infimum in the standard H-infinity-optimal control with output feedback. The problem formulation is general and does not place any restrictions on the direct feedthrough terms between the control input and the controlled output variables, and between the disturbance input and the measurement output variables. The method is applicable to systems that satisfy (1) the transfer function from the control input to the controlled output is right-invertible and has no invariant zeros on the j(w) axis and, (2) the transfer function from the disturbance to the measurement output is left-invertible and has no invariant zeros on the j(w) axis. A set of necessary and sufficient conditions for the solvability of H-infinity-almost disturbance decoupling problem via measurement feedback with internal stability is also given.
Scenario planning for water resource management in semi arid zone
NASA Astrophysics Data System (ADS)
Gupta, Rajiv; Kumar, Gaurav
2018-06-01
Scenario planning for water resource management in semi arid zone is performed using systems Input-Output approach of time domain analysis. This approach derived the future weights of input variables of the hydrological system from their precedent weights. Input variables considered here are precipitation, evaporation, population and crop irrigation. Ingles & De Souza's method and Thornthwaite model have been used to estimate runoff and evaporation respectively. Difference between precipitation inflow and the sum of runoff and evaporation has been approximated as groundwater recharge. Population and crop irrigation derived the total water demand. Compensation of total water demand by groundwater recharge has been analyzed. Further compensation has been evaluated by proposing efficient methods of water conservation. The best measure to be adopted for water conservation is suggested based on the cost benefit analysis. A case study for nine villages in Chirawa region of district Jhunjhunu, Rajasthan (India) validates the model.
Automatic insulation resistance testing apparatus
Wyant, Francis J.; Nowlen, Steven P.; Luker, Spencer M.
2005-06-14
An apparatus and method for automatic measurement of insulation resistances of a multi-conductor cable. In one embodiment of the invention, the apparatus comprises a power supply source, an input measuring means, an output measuring means, a plurality of input relay controlled contacts, a plurality of output relay controlled contacts, a relay controller and a computer. In another embodiment of the invention the apparatus comprises a power supply source, an input measuring means, an output measuring means, an input switching unit, an output switching unit and a control unit/data logger. Embodiments of the apparatus of the invention may also incorporate cable fire testing means. The apparatus and methods of the present invention use either voltage or current for input and output measured variables.
Medical Image Intensifier In 1980 (What Really Happened)
NASA Astrophysics Data System (ADS)
Baiter, Stephen; Kuhl, Walter
1980-08-01
In 1972, at the first SPIE seminar covering the application of optical instrumentation in medicine, Balter and Stanton presented a paper forecasting the status of x-ray image intensifiers in the year 1980. Now, eight years later, it is 1980, and it seems a good idea to evaluate these forecasts in the light of what has actually happened. The x-ray sensitive image intensifier tube (with cesium iodide as an input phosphor) is used nearly universally. Input screen sizes range from 15 cm to 36 cm in diameter. Real time monitoring of both fluoroscopic and fluorographic examinations is generally performed via closed circuit television. Archival recording of images is carried out using cameras with film formats of approximately 100 mm for single exposure or serial fluorography and 35 mm for cine fluorography. With the detective quantum efficiency of image intensifier tubes remaining near 50% throughout the decade, the noise content of most fluorographic and fluoroscopic images is still determined by the input exposure. Consequently, patient doses today, in 1980, have not substantially changed in the last ten years. There is, however, interest in uncoupling the x-ray dose and the image brightness by providing a variable optical diaphragm between the output of the image intensifier tube and the recording devices. During the past eight years, there has been a major philosophical change in the approach to imaging systems. It is now realized that medical image quality is much more dependent on the reduction of large area contrast losses than on the limiting resolution of the imaging system. It has also been clear that much diagnostic information is carried by spatial frequencies in the neighborhood of one line pair per millimeter (referred to the patient). The design of modern image intensifiers has been directed toward improvement in the large area contrast by minimizing x-ray and optical scatter in both the image intensifier tube and its associated components.
A stochastic model of input effectiveness during irregular gamma rhythms.
Dumont, Grégory; Northoff, Georg; Longtin, André
2016-02-01
Gamma-band synchronization has been linked to attention and communication between brain regions, yet the underlying dynamical mechanisms are still unclear. How does the timing and amplitude of inputs to cells that generate an endogenously noisy gamma rhythm affect the network activity and rhythm? How does such "communication through coherence" (CTC) survive in the face of rhythm and input variability? We present a stochastic modelling approach to this question that yields a very fast computation of the effectiveness of inputs to cells involved in gamma rhythms. Our work is partly motivated by recent optogenetic experiments (Cardin et al. Nature, 459(7247), 663-667 2009) that tested the gamma phase-dependence of network responses by first stabilizing the rhythm with periodic light pulses to the interneurons (I). Our computationally efficient model E-I network of stochastic two-state neurons exhibits finite-size fluctuations. Using the Hilbert transform and Kuramoto index, we study how the stochastic phase of its gamma rhythm is entrained by external pulses. We then compute how this rhythmic inhibition controls the effectiveness of external input onto pyramidal (E) cells, and how variability shapes the window of firing opportunity. For transferring the time variations of an external input to the E cells, we find a tradeoff between the phase selectivity and depth of rate modulation. We also show that the CTC is sensitive to the jitter in the arrival times of spikes to the E cells, and to the degree of I-cell entrainment. We further find that CTC can occur even if the underlying deterministic system does not oscillate; quasicycle-type rhythms induced by the finite-size noise retain the basic CTC properties. Finally a resonance analysis confirms the relative importance of the I cell pacing for rhythm generation. Analysis of whole network behaviour, including computations of synchrony, phase and shifts in excitatory-inhibitory balance, can be further sped up by orders of magnitude using two coupled stochastic differential equations, one for each population. Our work thus yields a fast tool to numerically and analytically investigate CTC in a noisy context. It shows that CTC can be quite vulnerable to rhythm and input variability, which both decrease phase preference.
Karmakar, Chandan; Udhayakumar, Radhagayathri K; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu
2017-01-01
Distribution entropy ( DistEn ) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters-the embedding dimension m , and the number of bins M which replaces the tolerance parameter r that is used by the existing approximation entropy ( ApEn ) and sample entropy ( SampEn ) measures. The performance of DistEn can also be affected by the data length N . In our previous studies, we have analyzed stability and performance of DistEn with respect to one parameter ( m or M ) or combination of two parameters ( N and M ). However, impact of varying all the three input parameters on DistEn is not yet studied. Since DistEn is predominantly aimed at analysing short length heart rate variability (HRV) signal, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn and performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn . The results showed that DistEn values are minimally affected by the variations of input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions with various of input parameters among reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short length HRV time series.
Mu, Zhijian; Huang, Aiying; Ni, Jiupai; Xie, Deti
2014-01-01
Organic soils are an important source of N2O, but global estimates of these fluxes remain uncertain because measurements are sparse. We tested the hypothesis that N2O fluxes can be predicted from estimates of mineral nitrogen input, calculated from readily-available measurements of CO2 flux and soil C/N ratio. From studies of organic soils throughout the world, we compiled a data set of annual CO2 and N2O fluxes which were measured concurrently. The input of soil mineral nitrogen in these studies was estimated from applied fertilizer nitrogen and organic nitrogen mineralization. The latter was calculated by dividing the rate of soil heterotrophic respiration by soil C/N ratio. This index of mineral nitrogen input explained up to 69% of the overall variability of N2O fluxes, whereas CO2 flux or soil C/N ratio alone explained only 49% and 36% of the variability, respectively. Including water table level in the model, along with mineral nitrogen input, further improved the model with the explanatory proportion of variability in N2O flux increasing to 75%. Unlike grassland or cropland soils, forest soils were evidently nitrogen-limited, so water table level had no significant effect on N2O flux. Our proposed approach, which uses the product of soil-derived CO2 flux and the inverse of soil C/N ratio as a proxy for nitrogen mineralization, shows promise for estimating regional or global N2O fluxes from organic soils, although some further enhancements may be warranted. PMID:24798347
Antanasijević, Davor Z; Pocajt, Viktor V; Povrenović, Dragan S; Ristić, Mirjana Đ; Perić-Grujić, Aleksandra A
2013-01-15
This paper describes the development of an artificial neural network (ANN) model for the forecasting of annual PM(10) emissions at the national level, using widely available sustainability and economical/industrial parameters as inputs. The inputs for the model were selected and optimized using a genetic algorithm and the ANN was trained using the following variables: gross domestic product, gross inland energy consumption, incineration of wood, motorization rate, production of paper and paperboard, sawn wood production, production of refined copper, production of aluminum, production of pig iron and production of crude steel. The wide availability of the input parameters used in this model can overcome a lack of data and basic environmental indicators in many countries, which can prevent or seriously impede PM emission forecasting. The model was trained and validated with the data for 26 EU countries for the period from 1999 to 2006. PM(10) emission data, collected through the Convention on Long-range Transboundary Air Pollution - CLRTAP and the EMEP Programme or as emission estimations by the Regional Air Pollution Information and Simulation (RAINS) model, were obtained from Eurostat. The ANN model has shown very good performance and demonstrated that the forecast of PM(10) emission up to two years can be made successfully and accurately. The mean absolute error for two-year PM(10) emission prediction was only 10%, which is more than three times better than the predictions obtained from the conventional multi-linear regression and principal component regression models that were trained and tested using the same datasets and input variables. Copyright © 2012 Elsevier B.V. All rights reserved.
Karmakar, Chandan; Udhayakumar, Radhagayathri K.; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu
2017-01-01
Distribution entropy (DistEn) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters—the embedding dimension m, and the number of bins M which replaces the tolerance parameter r that is used by the existing approximation entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we have analyzed stability and performance of DistEn with respect to one parameter (m or M) or combination of two parameters (N and M). However, impact of varying all the three input parameters on DistEn is not yet studied. Since DistEn is predominantly aimed at analysing short length heart rate variability (HRV) signal, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn and performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn. The results showed that DistEn values are minimally affected by the variations of input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions with various of input parameters among reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short length HRV time series. PMID:28979215
van der Have, Mike; Oldenburg, Bas; Fidder, Herma H; Belderbos, Tim D G; Siersema, Peter D; van Oijen, Martijn G H
2014-03-01
Treatment with tumor necrosis factor-α (TNF-α) inhibitors in patients with Crohn's disease (CD) is associated with potentially serious infections, including tuberculosis (TB) and hepatitis B virus (HBV). We assessed the cost-effectiveness of extensive TB screening and HBV screening prior to initiating TNF-α inhibitors in CD. We constructed two Markov models: (1) comparing tuberculin skin test (TST) combined with chest X-ray (conventional TB screening) versus TST and chest X-ray followed by the interferon-gamma release assay (extensive TB screening) in diagnosing TB; and (2) HBV screening versus no HBV screening. Our base-case included an adult CD patient starting with infliximab treatment. Input parameters were extracted from the literature. Direct medical costs were assessed and discounted following a third-party payer perspective. The main outcome was the incremental cost-effectiveness ratio (ICER). Sensitivity and Monte Carlo analyses were performed over wide ranges of probability and cost estimates. At base-case, the ICERs of extensive screening and HBV screening were €64,340 and €75,760 respectively to gain one quality-adjusted life year. Sensitivity analyses concluded that extensive TB screening was a cost-effective strategy if the latent TB prevalence is more than 12 % or if the false positivity rate of TST is more than 20 %. HBV screening became cost-effective if HBV reactivation or HBV-related mortality is higher than 37 and 62 %, respectively. Extensive TB screening and HBV screening are not cost-effective compared with conventional TB screening and no HBV screening, respectively. However, when targeted at high-risk patient groups, these screening strategies are likely to become cost-effective.
Hoppie, Lyle O.
1982-01-12
Disclosed are several embodiments of a regenerative braking device for an automotive vehicle. The device includes a plurality of rubber rollers (24, 26) mounted for rotation between an input shaft (14) connectable to the vehicle drivetrain and an output shaft (16) which is drivingly connected to the input shaft by a variable ratio transmission (20). When the transmission ratio is such that the input shaft rotates faster than the output shaft, the rubber rollers are torsionally stressed to accumulate energy, thereby slowing the vehicle. When the transmission ratio is such that the output shaft rotates faster than the input shaft, the rubber rollers are torsionally relaxed to deliver accumulated energy, thereby accelerating or driving the vehicle.
Post, R.F.
1958-11-11
An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of tbe voltages of a pair of input signals. ln essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negatlve feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.
NASA Astrophysics Data System (ADS)
Jin, Hyeongmin; Heo, Changyong; Kim, Jong Hyo
2018-02-01
Differing reconstruction kernels are known to strongly affect the variability of imaging biomarkers and thus remain as a barrier in translating the computer aided quantification techniques into clinical practice. This study presents a deep learning application to CT kernel conversion which converts a CT image of sharp kernel to that of standard kernel and evaluates its impact on variability reduction of a pulmonary imaging biomarker, the emphysema index (EI). Forty cases of low-dose chest CT exams obtained with 120kVp, 40mAs, 1mm thickness, of 2 reconstruction kernels (B30f, B50f) were selected from the low dose lung cancer screening database of our institution. A Fully convolutional network was implemented with Keras deep learning library. The model consisted of symmetric layers to capture the context and fine structure characteristics of CT images from the standard and sharp reconstruction kernels. Pairs of the full-resolution CT data set were fed to input and output nodes to train the convolutional network to learn the appropriate filter kernels for converting the CT images of sharp kernel to standard kernel with a criterion of measuring the mean squared error between the input and target images. EIs (RA950 and Perc15) were measured with a software package (ImagePrism Pulmo, Seoul, South Korea) and compared for the data sets of B50f, B30f, and the converted B50f. The effect of kernel conversion was evaluated with the mean and standard deviation of pair-wise differences in EI. The population mean of RA950 was 27.65 +/- 7.28% for B50f data set, 10.82 +/- 6.71% for the B30f data set, and 8.87 +/- 6.20% for the converted B50f data set. The mean of pair-wise absolute differences in RA950 between B30f and B50f is reduced from 16.83% to 1.95% using kernel conversion. Our study demonstrates the feasibility of applying the deep learning technique for CT kernel conversion and reducing the kernel-induced variability of EI quantification. The deep learning model has a potential to improve the reliability of imaging biomarker, especially in evaluating the longitudinal changes of EI even when the patient CT scans were performed with different kernels.
Turbidity of mouthrinsed water as a screening index for oral malodor.
Ueno, Masayuki; Takeuchi, Susumu; Samnieng, Patcharaphol; Morishima, Seiji; Shinada, Kayoko; Kawaguchi, Yoko
2013-08-01
The objectives of this research were to examine the relationship between turbidity of mouthrinsed water and oral malodor, and to evaluate whether the turbidity could be used to screen oral malodor. The subjects were 165 oral malodor patients. Gas chromatography and organoleptic test (OT) were used for oral malodor measurement. Oral examination along with collection of saliva and quantification of bacteria was conducted. Turbidity of mouthrinsed water was measured with turbidimeter. Logistic regression with oral malodor status by OT as the dependent variable and receiver operating characteristic (ROC) analysis were performed. Turbidity had a significant association with oral malodor status. In addition, ROC analysis showed that the turbidity had an ability to screen for presence or absence of oral malodor. Turbidity could reflect or represent other influential variables of oral malodor and may be useful as a screening method for oral malodor. Copyright © 2013 Elsevier Inc. All rights reserved.
Cost-Effectiveness of Routine Screening for Critical Congenital Heart Disease in US Newborns
Peterson, Cora; Grosse, Scott D.; Oster, Matthew E.; Olney, Richard S.; Cassell, Cynthia H.
2015-01-01
OBJECTIVES Clinical evidence indicates newborn critical congenital heart disease (CCHD) screening through pulse oximetry is lifesaving. In 2011, CCHD was added to the US Recommended Uniform Screening Panel for newborns. Several states have implemented or are considering screening mandates. This study aimed to estimate the cost-effectiveness of routine screening among US newborns unsuspected of having CCHD. METHODS We developed a cohort model with a time horizon of infancy to estimate the inpatient medical costs and health benefits of CCHD screening. Model inputs were derived from new estimates of hospital screening costs and inpatient care for infants with late-detected CCHD, defined as no diagnosis at the birth hospital. We estimated the number of newborns with CCHD detected at birth hospitals and life-years saved with routine screening compared with no screening. RESULTS Screening was estimated to incur an additional cost of $6.28 per newborn, with incremental costs of $20 862 per newborn with CCHD detected at birth hospitals and $40 385 per life-year gained (2011 US dollars). We estimated 1189 more newborns with CCHD would be identified at birth hospitals and 20 infant deaths averted annually with screening. Another 1975 false-positive results not associated with CCHD were estimated to occur, although these results had a minimal impact on total estimated costs. CONCLUSIONS This study provides the first US cost-effectiveness analysis of CCHD screening in the United States could be reasonably cost-effective. We anticipate data from states that have recently approved or initiated CCHD screening will become available over the next few years to refine these projections. PMID:23918890
A user's guide for DTIZE an interactive digitizing and graphical editing computer program
NASA Technical Reports Server (NTRS)
Thomas, C. C.
1981-01-01
A guide for DTIZE, a two dimensional digitizing program with graphical editing capability, is presented. DTIZE provides the capability to simultaneously create and display a picture on the display screen. Data descriptions may be permanently saved in three different formats. DTIZE creates the picture graphics in the locator mode, thus inputting one coordinate each time the terminator button is pushed. Graphic input devices (GIN) are also used to select function command menu. These menu commands and the program's interactive prompting sequences provide a complete capability for creating, editing, and permanently recording a graphical picture file. DTIZE is written in FORTRAN IV language for the Tektronix 4081 graphic system utilizing the Plot 80 Distributed Graphics Library (DGL) subroutines. The Tektronix 4953/3954 Graphic Tablet with mouse, pen, or joystick are used as graphics input devices to create picture graphics.
NASA Astrophysics Data System (ADS)
Subara, Deni; Jaswir, Irwandi; Alkhatib, Maan Fahmi Rashid; Noorbatcha, Ibrahim Ali
2018-01-01
The aim of this experiment is to screen and to understand the process variables on the fabrication of fish gelatin nanoparticles by using quality-design approach. The most influencing process variables were screened by using Plackett-Burman design. Mean particles size, size distribution, and zeta potential were found in the range 240±9.76 nm, 0.3, and -9 mV, respectively. Statistical results explained that concentration of acetone, pH of solution during precipitation step and volume of cross linker had a most significant effect on particles size of fish gelatin nanoparticles. It was found that, time and chemical consuming is lower than previous research. This study revealed the potential of quality-by design in understanding the effects of process variables on the fish gelatin nanoparticles production.
Breast cancer screening among Asian immigrant women in Canada.
Sun, Zhuoyu; Xiong, Hui; Kearney, Anne; Zhang, Jin; Liu, Wei; Huang, Guowei; Wang, Peizhong Peter
2010-02-01
The objective of this study was to examine the pattern of breast cancer screening among Asian immigrant women aged 50-69 years and compare it with corresponding non-immigrant women in Canada. Data from the Canadian Community Health Survey cycle 2.1 (2003) were utilized. Self-reported screening histories were used as outcome variables: socioeconomic status and medical histories were used as predictive variables. Analyses were weighted to represent the target population. Multivariate logistic regression analyses were performed to compare the screening pattern among Asian immigrant women and corresponding non-immigrant Canadians. In total, 508 Asian immigrant women were included in this study. The results suggest that 71% and 60% of Asian immigrant women reported ever having had and recent mammogram use, respectively, while the corresponding figures for non-immigrant women were 89% and 72%. The observed differences were statistically significant and could not be explained by confounding factors. The ability to speak one of the two official languages is an important barrier to mammography screening among Asian immigrant women. The findings show lower rates of mammography screening among Asian immigrant women in Canada. If breast screening is to remain a health policy objective in Canada, targeted efforts to increase the recruitment of Asian immigrant women need to be developed or strengthened.
NASA Astrophysics Data System (ADS)
Rathod, Vishal
The objective of the present project was to develop the Ibuprofen-loaded Nanostructured Lipid Carrier (IBU-NLCs) for topical ocular delivery based on substantial pre-formulation screening of the components and understanding the interplay between the formulation and process variables. The BCS Class II drug: Ibuprofen was selected as the model drug for the current study. IBU-NLCs were prepared by melt emulsification and ultrasonication technique. Extensive pre-formulation studies were performed to screen the lipid components (solid and liquid) based on drug's solubility and affinity as well as components compatibility. The results from DSC & XRD assisted in selecting the most suitable ratio to be utilized for future studies. DynasanRTM 114 was selected as the solid lipid & MiglyolRTM 840 was selected as the liquid lipid based on preliminary lipid screening. The ratio of 6:4 was predicted to be the best based on its crystallinity index and the thermal events. As there are many variables involved for further optimization of the formulation, a single design approach is not always adequate. A hybrid-design approach was applied by employing the Plackett Burman design (PBD) for preliminary screening of 7 critical variables, followed by Box-Behnken design (BBD), a sub-type of response surface methodology (RSM) design using 2 relatively significant variables from the former design and incorporating Surfactant/Co-surfactant ratio as the third variable. Comparatively, KolliphorRTM HS15 demonstrated lower Mean Particle Size (PS) & Polydispersity Index (PDI) and KolliphorRTM P188 resulted in Zeta Potential (ZP) < -20 mV during the surfactant screening & stability studies. Hence, Surfactant/Cosurfactant ratio was employed as the third variable to understand its synergistic effect on the response variables. We selected PS, PDI, and ZP as critical response variables in the PBD since they significantly influence the stability & performance of NLCs. Formulations prepared using BBD were further characterized and evaluated concerning PS, PDI, ZP and Entrapment Efficiency (EE) to identify the multi-factor interactions between selected formulation variables. In vitro release studies were performed using Spectra/por dialysis membrane on Franz diffusion cell and Phosphate Saline buffer (7.4 pH) as the medium. Samples for assay, EE, Loading Capacity (LC), Solubility studies & in-vitro release were filtered using Amicon 50K and analyzed via UPLC system (Waters) at a detection wavelength of 220 nm. Significant variables were selected through PBD, and the third variable was incorporated based on surfactant screening & stability studies for the next design. Assay of the BBD based formulations was found to be within 95-104% of the theoretically calculated values. Further studies were investigated for PS, PDI, ZP & EE. PS was found to be in the range of 103-194 nm with PDI ranging from 0.118 to 0.265. The ZP and EE were observed to be in the range of -22.2 to -11 mV & 90 to 98.7 % respectively. Drug release of 30% was observed from the optimized formulation in the first 6 hr of in-vitro studies, and the drug release showed a sustained release of ibuprofen thereafter over several hours. These values also confirm that the production method, and all other selected variables, effectively promoted the incorporation of ibuprofen in NLC. Quality by Design (QbD) approach was successfully implemented in developing a robust ophthalmic formulation with superior physicochemical and morphometric properties. NLCs as the nanocarrier demonstrated promising perspective for topical delivery of poorly water-soluble drugs.
ERIC Educational Resources Information Center
Ramírez-Esparza, Nairán; García-Sierra, Adrián; Kuhl, Patricia K.
2017-01-01
This study tested the impact of child-directed language input on language development in Spanish-English bilingual infants (N = 25, 11- and 14-month-olds from the Seattle metropolitan area), across languages and independently for each language, controlling for socioeconomic status. Language input was characterized by social interaction variables,…
TRANDESNF: A computer program for transonic airfoil design and analysis in nonuniform flow
NASA Technical Reports Server (NTRS)
Chang, J. F.; Lan, C. Edward
1987-01-01
The use of a transonic airfoil code for analysis, inverse design, and direct optimization of an airfoil immersed in propfan slipstream is described. A summary of the theoretical method, program capabilities, input format, output variables, and program execution are described. Input data of sample test cases and the corresponding output are given.
MURI: Impact of Oceanographic Variability on Acoustic Communications
2011-09-01
multiplexing ( OFDM ), multiple- input/multiple-output ( MIMO ) transmissions, and multi-user single-input/multiple-output (SIMO) communications. Lastly... MIMO - OFDM communications: Receiver design for Doppler distorted underwater acoustic channels,” Proc. Asilomar Conf. on Signals, Systems, and... MIMO ) will be of particular interest. Validating experimental data will be obtained during the ONR acoustic communications experiment in summer 2008
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g. touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
NASA Astrophysics Data System (ADS)
Carter, Jeffrey R.; Simon, Wayne E.
1990-08-01
Neural networks are trained using Recursive Error Minimization (REM) equations to perform statistical classification. Using REM equations with continuous input variables reduces the required number of training experiences by factors of one to two orders of magnitude over standard back propagation. Replacing the continuous input variables with discrete binary representations reduces the number of connections by a factor proportional to the number of variables reducing the required number of experiences by another order of magnitude. Undesirable effects of using recurrent experience to train neural networks for statistical classification problems are demonstrated and nonrecurrent experience used to avoid these undesirable effects. 1. THE 1-41 PROBLEM The statistical classification problem which we address is is that of assigning points in ddimensional space to one of two classes. The first class has a covariance matrix of I (the identity matrix) the covariance matrix of the second class is 41. For this reason the problem is known as the 1-41 problem. Both classes have equal probability of occurrence and samples from both classes may appear anywhere throughout the ddimensional space. Most samples near the origin of the coordinate system will be from the first class while most samples away from the origin will be from the second class. Since the two classes completely overlap it is impossible to have a classifier with zero error. The minimum possible error is known as the Bayes error and
Fuzzy parametric uncertainty analysis of linear dynamical systems: A surrogate modeling approach
NASA Astrophysics Data System (ADS)
Chowdhury, R.; Adhikari, S.
2012-10-01
Uncertainty propagation engineering systems possess significant computational challenges. This paper explores the possibility of using correlated function expansion based metamodelling approach when uncertain system parameters are modeled using Fuzzy variables. In particular, the application of High-Dimensional Model Representation (HDMR) is proposed for fuzzy finite element analysis of dynamical systems. The HDMR expansion is a set of quantitative model assessment and analysis tools for capturing high-dimensional input-output system behavior based on a hierarchy of functions of increasing dimensions. The input variables may be either finite-dimensional (i.e., a vector of parameters chosen from the Euclidean space RM) or may be infinite-dimensional as in the function space CM[0,1]. The computational effort to determine the expansion functions using the alpha cut method scales polynomially with the number of variables rather than exponentially. This logic is based on the fundamental assumption underlying the HDMR representation that only low-order correlations among the input variables are likely to have significant impacts upon the outputs for most high-dimensional complex systems. The proposed method is integrated with a commercial Finite Element software. Modal analysis of a simplified aircraft wing with Fuzzy parameters has been used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions have been used and the results have been validated against direct Monte Carlo simulations.
The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilch, Martin M.
Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequensic and Bayesian perspectives) and second-order methods where variabilities andmore » uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.« less
Feeney, Daniel F; Meyer, François G; Noone, Nicholas; Enoka, Roger M
2017-10-01
Motor neurons appear to be activated with a common input signal that modulates the discharge activity of all neurons in the motor nucleus. It has proven difficult for neurophysiologists to quantify the variability in a common input signal, but characterization of such a signal may improve our understanding of how the activation signal varies across motor tasks. Contemporary methods of quantifying the common input to motor neurons rely on compiling discrete action potentials into continuous time series, assuming the motor pool acts as a linear filter, and requiring signals to be of sufficient duration for frequency analysis. We introduce a space-state model in which the discharge activity of motor neurons is modeled as inhomogeneous Poisson processes and propose a method to quantify an abstract latent trajectory that represents the common input received by motor neurons. The approach also approximates the variation in synaptic noise in the common input signal. The model is validated with four data sets: a simulation of 120 motor units, a pair of integrate-and-fire neurons with a Renshaw cell providing inhibitory feedback, the discharge activity of 10 integrate-and-fire neurons, and the discharge times of concurrently active motor units during an isometric voluntary contraction. The simulations revealed that a latent state-space model is able to quantify the trajectory and variability of the common input signal across all four conditions. When compared with the cumulative spike train method of characterizing common input, the state-space approach was more sensitive to the details of the common input current and was less influenced by the duration of the signal. The state-space approach appears to be capable of detecting rather modest changes in common input signals across conditions. NEW & NOTEWORTHY We propose a state-space model that explicitly delineates a common input signal sent to motor neurons and the physiological noise inherent in synaptic signal transmission. This is the first application of a deterministic state-space model to represent the discharge characteristics of motor units during voluntary contractions. Copyright © 2017 the American Physiological Society.
Ting, Hua; Huang, Ren-Jing; Lai, Ching-Hsiang; Chang, Shen-Wen; Chung, Ai-Hui; Kuo, Teng-Yao; Chang, Ching-Haur; Shih, Tung-Sheng; Lee, Shin-Da
2014-01-01
Background: Sleepiness-at-the-wheel has been identified as a major cause of highway accidents. The aim of our study is identifying the candidate measures for home-based screening of sleep disordered breathing in Taiwanese bus drivers, instead of polysomnography. Methods: Overnight polysomnography accompanied with simultaneous measurements of alternative screening devices (pulse oximetry, ApneaLink, and Actigraphy), heart rate variability, wake-up systolic blood pressure and questionnaires were completed by 151 eligible participants who were long-haul bus drivers with a duty period of more than 12 h a day and duty shifting. Results: 63.6% of professional bus drivers were diagnosed as having sleep disordered breathing and had a higher body mass index, neck circumference, systolic blood pressure, arousal index and desaturation index than those professional bus drivers without evidence of sleep disordered breathing. Simple home-based candidate measures: (1) Pulse oximetry, oxygen-desaturation indices by ≥3% and 4% (r = 0.87∼0.92); (2) Pulse oximetry, pulse-rising indices by ≥7% and 8% from a baseline (r = 0.61∼0.89); and (3) ApneaLink airflow detection, apnea-hypopnea indices (r = 0.70∼0.70), based on recording-time or Actigraphy-corrected total sleep time were all significantly correlated with, and had high agreement with, corresponding polysomnographic apnea-hypopnea indices [(1) 94.5%∼96.6%, (2) 93.8%∼97.2%, (3) 91.1%∼91.3%, respectively]. Conversely, no validities of SDB screening were found in the multi-variables apnea prediction questionnaire, Epworth Sleepiness Scale, night-sleep heart rate variability, wake-up systolic blood pressure and anthropometric variables. Conclusions: The indices of pulse oximetry and apnea flow detection are eligible criteria for home-based screening of sleep disordered breathing, specifically for professional drivers. PMID:24803198
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Violette, Daniel M.
Addressing other evaluation issues that have been raised in the context of energy efficiency programs, this chapter focuses on methods used to address the persistence of energy savings, which is an important input to the benefit/cost analysis of energy efficiency programs and portfolios. In addition to discussing 'persistence' (which refers to the stream of benefits over time from an energy efficiency measure or program), this chapter provides a summary treatment of these issues -Synergies across programs -Rebound -Dual baselines -Errors in variables (the measurement and/or accuracy of input variables to the evaluation).
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Mukhopadhyay, V.
1983-01-01
A method for designing robust feedback controllers for multiloop systems is presented. Robustness is characterized in terms of the minimum singular value of the system return difference matrix at the plant input. Analytical gradients of the singular values with respect to design variables in the controller are derived. A cumulative measure of the singular values and their gradients with respect to the design variables is used with a numerical optimization technique to increase the system's robustness. Both unconstrained and constrained optimization techniques are evaluated. Numerical results are presented for a two-input/two-output drone flight control system.
Guo, Changning; Doub, William H; Kauffman, John F
2010-08-01
Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption, and extended the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
NASA Astrophysics Data System (ADS)
Behera, Kishore Kumar; Pal, Snehanshu
2018-03-01
This paper describes a new approach towards optimum utilisation of ferrochrome added during stainless steel making in AOD converter. The objective of optimisation is to enhance end blow chromium content of steel and reduce the ferrochrome addition during refining. By developing a thermodynamic based mathematical model, a study has been conducted to compute the optimum trade-off between ferrochrome addition and end blow chromium content of stainless steel using a predator prey genetic algorithm through training of 100 dataset considering different input and output variables such as oxygen, argon, nitrogen blowing rate, duration of blowing, initial bath temperature, chromium and carbon content, weight of ferrochrome added during refining. Optimisation is performed within constrained imposed on the input parameters whose values fall within certain ranges. The analysis of pareto fronts is observed to generate a set of feasible optimal solution between the two conflicting objectives that provides an effective guideline for better ferrochrome utilisation. It is found out that after a certain critical range, further addition of ferrochrome does not affect the chromium percentage of steel. Single variable response analysis is performed to study the variation and interaction of all individual input parameters on output variables.
Variability of perceptual multistability: from brain state to individual trait
Kleinschmidt, Andreas; Sterzer, Philipp; Rees, Geraint
2012-01-01
Few phenomena are as suitable as perceptual multistability to demonstrate that the brain constructively interprets sensory input. Several studies have outlined the neural circuitry involved in generating perceptual inference but only more recently has the individual variability of this inferential process been appreciated. Studies of the interaction of evoked and ongoing neural activity show that inference itself is not merely a stimulus-triggered process but is related to the context of the current brain state into which the processing of external stimulation is embedded. As brain states fluctuate, so does perception of a given sensory input. In multistability, perceptual fluctuation rates are consistent for a given individual but vary considerably between individuals. There has been some evidence for a genetic basis for these individual differences and recent morphometric studies of parietal lobe regions have identified neuroanatomical substrates for individual variability in spontaneous switching behaviour. Moreover, disrupting the function of these latter regions by transcranial magnetic stimulation yields systematic interference effects on switching behaviour, further arguing for a causal role of these regions in perceptual inference. Together, these studies have advanced our understanding of the biological mechanisms by which the brain constructs the contents of consciousness from sensory input. PMID:22371620
Mandelblatt, J; Traxler, M; Lakin, P; Kanetsky, P; Kao, R
1993-01-01
Factors associated with participation in breast and cervix cancer screening among elderly black women of low socioeconomic status were determined. Data from a baseline cross-sectional random survey were used together with data on whether screening was subsequently completed or refused. The subjects were a random sample of women attending an urban public hospital primary care clinic for routine medical care with a birth year of 1924 or earlier. Among the 271 women in the study group, 70% completed screening. Stated intent was the strongest predictor of participation; women who intended to have both mammography and Pap testing were 2.7 times more likely to participate than those who intended to have neither test (95% confidence interval 1.4, 4.9; P < 0.01), controlling for age, insurance status, and level of chronic illness. Women who had more than three chronic illnesses were twice as likely to participate than those with three or fewer illnesses (95% confidence interval 1.1, 3.4 P < 0.02), controlling for the remaining variables. Other variables, including age, history of a recent screening examination, attitudes, or knowledge, were not related to participation. Stated intent was the only variable that predicted compliance with both mammography and Pap smear completion in separate regression models for the individual tests. A high proportion of elderly, socioeconomically disadvantaged black women will participate in cancer screening when it is offered in a primary care setting. Further research on behavioral intentions should be conducted to refine interventions designed to enhance the use of early cancer detection among vulnerable population groups.
Anniss, Angela M; Young, Alan; O'Driscoll, Denise M
2016-12-15
Multiple sleep latency testing (MSLT) and the maintenance of wakefulness test (MWT) are gold-standard objective tests of daytime sleepiness and alertness; however, there is marked variability in their interpretation and practice. This study aimed to determine the incidence of positive drug screens and their influence on MSLT, MWT, and polysomnographic variables. All patients attending Eastern Health Sleep Laboratory for MSLT or MWT over a 21-mo period were included in the study. Urinary drug screening for amphetamines, barbiturates, benzodiazepines, cannabinoids, cocaine, methadone, and opiates was performed following overnight polysomnography (PSG). Demographics and PSG variables were compared. Of 69 studies, MSLT (43) and MWT (26), 16% of patients had positive urinary drug screening (7 MSLT; 4 MWT). Drugs detected included amphetamines, cannabinoids, opiates, and benzodiazepines. No patient self-reported use of these medications prior to testing. No demographic, MSLT or MWT PSG data or overnight PSG data showed any statistical differences between positive and negative drug screen groups. Of seven MSLT patients testing positive for drug use, one met criteria for the diagnosis of narcolepsy and five for idiopathic hypersomnia. On MWT, three of the four drug-positive patients had a history of a motor vehicle accident and two patients were occupational drivers. These findings indicate drug use is present in patients attending for daytime testing of objective sleepiness and wakefulness. These data support routine urinary drug screening in all patients undergoing MSLT or MWT studies to ensure accurate interpretation in the context of illicit and prescription drug use. © 2016 American Academy of Sleep Medicine
Sequential Modular Position and Momentum Measurements of a Trapped Ion Mechanical Oscillator
NASA Astrophysics Data System (ADS)
Flühmann, C.; Negnevitsky, V.; Marinelli, M.; Home, J. P.
2018-04-01
The noncommutativity of position and momentum observables is a hallmark feature of quantum physics. However, this incompatibility does not extend to observables that are periodic in these base variables. Such modular-variable observables have been suggested as tools for fault-tolerant quantum computing and enhanced quantum sensing. Here, we implement sequential measurements of modular variables in the oscillatory motion of a single trapped ion, using state-dependent displacements and a heralded nondestructive readout. We investigate the commutative nature of modular variable observables by demonstrating no-signaling in time between successive measurements, using a variety of input states. Employing a different periodicity, we observe signaling in time. This also requires wave-packet overlap, resulting in quantum interference that we enhance using squeezed input states. The sequential measurements allow us to extract two-time correlators for modular variables, which we use to violate a Leggett-Garg inequality. Signaling in time and Leggett-Garg inequalities serve as efficient quantum witnesses, which we probe here with a mechanical oscillator, a system that has a natural crossover from the quantum to the classical regime.
Moreira, Fabiana Tavares; Prantoni, Alessandro Lívio; Martini, Bruno; de Abreu, Michelle Alves; Stoiev, Sérgio Biato; Turra, Alexander
2016-01-15
Microplastics such as pellets have been reported for many years on sandy beaches around the globe. Nevertheless, high variability is observed in their estimates and distribution patterns across the beach environment are still to be unravelled. Here, we investigate the small-scale temporal and spatial variability in the abundance of pellets in the intertidal zone of a sandy beach and evaluate factors that can increase the variability in data sets. The abundance of pellets was estimated during twelve consecutive tidal cycles, identifying the position of the high tide between cycles and sampling drift-lines across the intertidal zone. We demonstrate that beach dynamic processes such as the overlap of strandlines and artefacts of the methods can increase the small-scale variability. The results obtained are discussed in terms of the methodological considerations needed to understand the distribution of pellets in the beach environment, with special implications for studies focused on patterns of input. Copyright © 2015 Elsevier Ltd. All rights reserved.
Data analytics using canonical correlation analysis and Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles
2017-07-01
A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying a relatively few combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte-Carlo based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural decriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagents' potency and experimental noise must be accounted for appropriately control for false positives.
Christofaro, Diego Giulliano Destro; De Andrade, Selma Maffei; Mesas, Arthur Eumann; Fernandes, Rômulo Araújo; Farias Júnior, José Cazuza
2016-01-01
To analyse the associations between high screen time and overweight, poor dietary habits and physical activity in adolescents according to sex. The study comprised 515 boys and 716 girls aged 14-17 years from Londrina, Brazil. Nutritional status (normal weight or overweight/obese) was assessed by calculating the body mass index. Eating habits and time spent in physical activity were reported using a questionnaire. The measurement of screen time considered the time spent watching television, using a computer and playing video games during a normal week. Associations between high screen time and dependent variables (nutritional status, eating habits and physical activity levels) were assessed by binary logistic regression, adjusted for sociodemographic and lifestyle variables. Most adolescents (93.8% of boys and 87.2% of girls) spent more than 2 hours per day in screen-time activities. After adjustments, an increasing trend in the prevalence of overweight and physical inactivity with increasing time spent on screen activities was observed for both sexes. Screen times of >4 hours/day compared with <2 hours/day were associated with physical inactivity, low consumption of vegetables and high consumption of sweets only in girls and the consumption of soft drinks in both sexes. The frequency of overweight and physical inactivity increased with increasing screen time in a trending manner and independently of the main confounders. The relationship between high screen time and poor eating habits was particularly relevant for adolescent girls.
Screen-related sedentary behaviours: Children’s and parents’ attitudes, motivations, and practices
He, Meizi; Piché, Leonard; Beynon, Charlene; Harris, Stewart
2016-01-01
Objective To investigate school-aged children’s and parents’ attitudes, social influences, and intentions toward excessive screen-related sedentary behaviour (S-RSB). Design A cross-sectional study using a survey methodology. Setting Elementary schools in London, Ontario, Canada. Participants All grades five and six students, their parents and teachers in the participating schools were invited to voluntarily participate; 508 student-parent pairs completed the surveys. Main Outcome Measure Children’s screen-related behaviours. Analysis Data were analyzed using the Independent Student t-test to compare differences of continuous variables and the Chi-Square test to test for differences of categorical variables. Results Children spent 3.3 ± 0.15 (standard error) hours per day engaged in screen-related activities. Entertainment, spending time with family, and boredom were cited as the top three reasons for television viewing and video game playing. Compared to “low-screen-users” (i.e. < 2hours/day), “high-screen-users” (i.e. ≥2hours/day) held less negative attitudes toward excessive S-RSB and perceived loosened parental rules on screen use. Parents of “high-screen-users” held less negative attitudes towards children’s S-RSB, had fewer rules about their children’s screen use, and were more likely to be sedentary themselves. Conclusions and Implications Intervention strategies aimed at reducing S-RSB should involve both parents and children and should focus on fostering behavioural changes and promoting parental role-modeling. PMID:19914872
Biological control of appetite: A daunting complexity.
MacLean, Paul S; Blundell, John E; Mennella, Julie A; Batterham, Rachel L
2017-03-01
This review summarizes a portion of the discussions of an NIH Workshop (Bethesda, MD, 2015) titled "Self-Regulation of Appetite-It's Complicated," which focused on the biological aspects of appetite regulation. This review summarizes the key biological inputs of appetite regulation and their implications for body weight regulation. These discussions offer an update of the long-held, rigid perspective of an "adipocentric" biological control, taking a broader view that also includes important inputs from the digestive tract, from lean mass, and from the chemical sensory systems underlying taste and smell. It is only beginning to be understood how these biological systems are integrated and how this integrated input influences appetite and food eating behaviors. The relevance of these biological inputs was discussed primarily in the context of obesity and the problem of weight regain, touching on topics related to the biological predisposition for obesity and the impact that obesity treatments (dieting, exercise, bariatric surgery, etc.) might have on appetite and weight loss maintenance. Finally considered is a common theme that pervaded the workshop discussions, which was individual variability. It is this individual variability in the predisposition for obesity and in the biological response to weight loss that makes the biological component of appetite regulation so complicated. When this individual biological variability is placed in the context of the diverse environmental and behavioral pressures that also influence food eating behaviors, it is easy to appreciate the daunting complexities that arise with the self-regulation of appetite. © 2017 The Obesity Society.
Biological Control of Appetite: A Daunting Complexity
MacLean, Paul S.; Blundell, John E.; Mennella, Julie A.; Batterham, Rachel L.
2017-01-01
Objective This review summarizes a portion of the discussions of an NIH Workshop (Bethesda, MD, 2015) entitled, “Self-Regulation of Appetite, It's Complicated,” which focused on the biological aspects of appetite regulation. Methods Here we summarize the key biological inputs of appetite regulation and their implications for body weight regulation. Results These discussions offer an update of the long-held, rigid perspective of an “adipocentric” biological control, taking a broader view that also includes important inputs from the digestive tract, from lean mass, and from the chemical sensory systems underlying taste and smell. We are only beginning to understand how these biological systems are integrated and how this integrated input influences appetite and food eating behaviors. The relevance of these biological inputs was discussed primarily in the context of obesity and the problem of weight regain, touching on topics related to the biological predisposition for obesity and the impact that obesity treatments (dieting, exercise, bariatric surgery, etc.) might have on appetite and weight loss maintenance. Finally, we consider a common theme that pervaded the workshop discussions, which was individual variability. Conclusions It is this individual variability in the predisposition for obesity and in the biological response to weight loss that makes the biological component of appetite regulation so complicated. When this individual biological variability is placed in the context of the diverse environmental and behavioral pressures that also influence food eating behaviors, it is easy to appreciate the daunting complexities that arise with the self-regulation of appetite. PMID:28229538
ERIC Educational Resources Information Center
Aguilar, Jessica M.; Plante, Elena; Sandoval, Michelle
2018-01-01
Purpose: Variability in the input plays an important role in language learning. The current study examined the role of object variability for new word learning by preschoolers with specific language impairment (SLI). Method: Eighteen 4- and 5-year-old children with SLI were taught 8 new words in 3 short activities over the course of 3 sessions.…
Audience Response Systems in Higher Education: Applications and Cases
ERIC Educational Resources Information Center
Banks, David
2006-01-01
Taking advantage of user-friendly technology, Audience Response Systems (ARS) facilitates greater interaction with participants engaged in a variety of group activities. Each participant has an input device that permits them to express a view in complete anonymity, and the composite view of the total group appears on a public screen. ARS can then…
Sparse Measurement Systems: Applications, Analysis, Algorithms and Design
ERIC Educational Resources Information Center
Narayanaswamy, Balakrishnan
2011-01-01
This thesis deals with "large-scale" detection problems that arise in many real world applications such as sensor networks, mapping with mobile robots and group testing for biological screening and drug discovery. These are problems where the values of a large number of inputs need to be inferred from noisy observations and where the…
Diversity and antifungal activity of endophytic diazotrophic bacteria colonizing sugarcane in Egypt
USDA-ARS?s Scientific Manuscript database
The price of nitrogen continues to increase and is a major input in sugarcane production. Sugarcane grown in Egypt was screened for the presence of nitrogen-fixing bacteria. Nitrogen-free medium LGI-P was used to isolate bacteria from cane stalks. Among the 52 isolates subjected to acetylene redu...
Playforth, Krupa B; Coughlan, Alexandria; Upadhya, Krishna K
2016-02-01
The purpose of this study was to evaluate whether providers offer chlamydia screening to teenagers and/or whether screening is accepted at different rates depending on insurance type. Retrospective chart review. Academic center serving urban and suburban patients between April 2009 and October 2011. Nine hundred eighty-three health maintenance visits for asymptomatic, insured female adolescents aged 15-19 years. None. Dichotomous dependent variables of interest indicated whether chlamydia screening was: (1) offered; and (2) accepted. The key independent variable insurance type was coded as 'public' if Medicaid or Medicaid Managed Care and 'private' if a commercial plan. χ(2) and logistic regression analyses were used to assess the significance of differences in screening rates according to insurance type. Of asymptomatic health-maintenance visits 933 (95%) had a documented sexual history and 339 (34%) had a documented history of sexual activity. After excluding those who had a documented chlamydia screen in the 12 months before the visit (n = 79; 23%), 260 visits met eligibility for chlamydia screening. Only 169 (65%) of eligible visits had chlamydia screening offered and there was no difference in offer of screening according to insurance type. Significantly more visits covered by public insurance had chlamydia screening accepted (98%) than those covered by private insurance (82%). Controlling for demographic factors, the odds of accepted chlamydia screening was 8 times higher in visits covered by public insurance than those with private insurance. Although publically and privately insured teens were equally likely to be offered chlamydia screening, publically insured teens were significantly more likely to accept screening. Future research should investigate reasons for this difference in screening acceptance. These findings have implications for interventions to improve chlamydia screening because more adolescents are covered by parental insurance under the Affordable Care Act. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.
Mathematical Model of Solidification During Electroslag Casting of Pilger Roll
NASA Astrophysics Data System (ADS)
Liu, Fubin; Li, Huabing; Jiang, Zhouhua; Dong, Yanwu; Chen, Xu; Geng, Xin; Zang, Ximin
A mathematical model for describing the interaction of multiple physical fields in slag bath and solidification process in ingot during pilger roll casting with variable cross-section which is produced by the electroslag casting (ESC) process was developed. The commercial software ANSYS was applied to calculate the electromagnetic field, magnetic driven fluid flow, buoyancy-driven flow and heat transfer. The transportation phenomenon in slag bath and solidification characteristic of ingots are analyzed for variable cross-section with variable input power under the conditions of 9Cr3NiMo steel and 70%CaF2 - 30%Al2O3 slag system. The calculated results show that characteristic of current density distribution, velocity patterns and temperature profiles in the slag bath and metal pool profiles in ingot have distinct difference at variable cross-sections due to difference of input power and cooling condition. The pool shape and the local solidification time (LST) during Pilger roll ESC process are analyzed.
Spike-Threshold Adaptation Predicted by Membrane Potential Dynamics In Vivo
Fontaine, Bertrand; Peña, José Luis; Brette, Romain
2014-01-01
Neurons encode information in sequences of spikes, which are triggered when their membrane potential crosses a threshold. In vivo, the spiking threshold displays large variability suggesting that threshold dynamics have a profound influence on how the combined input of a neuron is encoded in the spiking. Threshold variability could be explained by adaptation to the membrane potential. However, it could also be the case that most threshold variability reflects noise and processes other than threshold adaptation. Here, we investigated threshold variation in auditory neurons responses recorded in vivo in barn owls. We found that spike threshold is quantitatively predicted by a model in which the threshold adapts, tracking the membrane potential at a short timescale. As a result, in these neurons, slow voltage fluctuations do not contribute to spiking because they are filtered by threshold adaptation. More importantly, these neurons can only respond to input spikes arriving together on a millisecond timescale. These results demonstrate that fast adaptation to the membrane potential captures spike threshold variability in vivo. PMID:24722397
Mobile HIV screening in Cape Town, South Africa: clinical impact, cost and cost-effectiveness.
Bassett, Ingrid V; Govindasamy, Darshini; Erlwanger, Alison S; Hyle, Emily P; Kranzer, Katharina; van Schaik, Nienke; Noubary, Farzad; Paltiel, A David; Wood, Robin; Walensky, Rochelle P; Losina, Elena; Bekker, Linda-Gail; Freedberg, Kenneth A
2014-01-01
Mobile HIV screening may facilitate early HIV diagnosis. Our objective was to examine the cost-effectiveness of adding a mobile screening unit to current medical facility-based HIV testing in Cape Town, South Africa. We used the Cost Effectiveness of Preventing AIDS Complications International (CEPAC-I) computer simulation model to evaluate two HIV screening strategies in Cape Town: 1) medical facility-based testing (the current standard of care) and 2) addition of a mobile HIV-testing unit intervention in the same community. Baseline input parameters were derived from a Cape Town-based mobile unit that tested 18,870 individuals over 2 years: prevalence of previously undiagnosed HIV (6.6%), mean CD4 count at diagnosis (males 423/µL, females 516/µL), CD4 count-dependent linkage to care rates (males 31%-58%, females 49%-58%), mobile unit intervention cost (includes acquisition, operation and HIV test costs, $29.30 per negative result and $31.30 per positive result). We conducted extensive sensitivity analyses to evaluate input uncertainty. Model outcomes included site of HIV diagnosis, life expectancy, medical costs, and the incremental cost-effectiveness ratio (ICER) of the intervention compared to medical facility-based testing. We considered the intervention to be "very cost-effective" when the ICER was less than South Africa's annual per capita Gross Domestic Product (GDP) ($8,200 in 2012). We projected that, with medical facility-based testing, the discounted (undiscounted) HIV-infected population life expectancy was 132.2 (197.7) months; this increased to 140.7 (211.7) months with the addition of the mobile unit. The ICER for the mobile unit was $2,400/year of life saved (YLS). Results were most sensitive to the previously undiagnosed HIV prevalence, linkage to care rates, and frequency of HIV testing at medical facilities. The addition of mobile HIV screening to current testing programs can improve survival and be very cost-effective in South Africa and other resource-limited settings, and should be a priority.
Applying operations research to optimize a novel population management system for cancer screening
Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J
2014-01-01
Objective To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. Materials and methods TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. Results TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Conclusions Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management. PMID:24043318
Jacobsen, Clemma; Corpuz, Rebecca; Forquera, Ralph; Buchwald, Dedra
2017-01-01
This study seeks to ascertain whether a culturally tailored art calendar could improve participation in cancer screening activities. We conducted a randomized, controlled calendar mail-out in which a Native art calendar was sent by first class mail to 5,633 patients seen at an urban American Indian clinic during the prior 2 years. Using random assignment, half of the patients were mailed a “message” calendar with screening information and reminders on breast, colorectal, lung, and prostate cancer; the other half received a calendar without messages. The receipt of cancer screening services was ascertained through chart abstraction in the following 15 months. In total, 5,363 observations (health messages n=2,695; no messages n=2,668) were analyzed. The calendar with health messages did not result in increased receipt of any cancer-related prevention outcome compared to the calendar without health messages. We solicited clinic input to create a culturally appropriate visual intervention to increase cancer screening in a vulnerable, underserved urban population. Our results suggest that printed materials with health messages are likely too weak an intervention to produce the desired behavioral outcomes in cancer screening. PMID:21472495
Doorenbos, Ardith Z; Jacobsen, Clemma; Corpuz, Rebecca; Forquera, Ralph; Buchwald, Dedra
2011-09-01
This study seeks to ascertain whether a culturally tailored art calendar could improve participation in cancer screening activities. We conducted a randomized, controlled calendar mail-out in which a Native art calendar was sent by first class mail to 5,633 patients seen at an urban American Indian clinic during the prior 2 years. Using random assignment, half of the patients were mailed a "message" calendar with screening information and reminders on breast, colorectal, lung, and prostate cancer; the other half received a calendar without messages. The receipt of cancer screening services was ascertained through chart abstraction in the following 15 months. In total, 5,363 observations (health messages n = 2,695; no messages n = 2,668) were analyzed. The calendar with health messages did not result in increased receipt of any cancer-related prevention outcome compared to the calendar without health messages. We solicited clinic input to create a culturally appropriate visual intervention to increase cancer screening in a vulnerable, underserved urban population. Our results suggest that printed materials with health messages are likely too weak an intervention to produce the desired behavioral outcomes in cancer screening.
Drug screening of cancer cell lines and human primary tumors using droplet microfluidics.
Wong, Ada Hang-Heng; Li, Haoran; Jia, Yanwei; Mak, Pui-In; Martins, Rui Paulo da Silva; Liu, Yan; Vong, Chi Man; Wong, Hang Cheong; Wong, Pak Kin; Wang, Haitao; Sun, Heng; Deng, Chu-Xia
2017-08-22
Precision Medicine in Oncology requires tailoring of therapeutic strategies to individual cancer patients. Due to the limited quantity of tumor samples, this proves to be difficult, especially for early stage cancer patients whose tumors are small. In this study, we exploited a 2.4 × 2.4 centimeters polydimethylsiloxane (PDMS) based microfluidic chip which employed droplet microfluidics to conduct drug screens against suspended and adherent cancer cell lines, as well as cells dissociated from primary tumor of human patients. Single cells were dispersed in aqueous droplets and imaged within 24 hours of drug treatment to assess cell viability by ethidium homodimer 1 staining. Our results showed that 5 conditions could be screened for every 80,000 cells in one channel on our chip under current circumstances. Additionally, screening conditions have been adapted to both suspended and adherent cancer cells, giving versatility to potentially all types of cancers. Hence, this study provides a powerful tool for rapid, low-input drug screening of primary cancers within 24 hours after tumor resection from cancer patients. This paves the way for further technological advancement to cutting down sample size and increasing drug screening throughput in advent to personalized cancer therapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard, M.A.; Sommer, S.C.
1995-04-01
AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.
SABERS. Stand-Alone ADIC Binary Exploitation Resources System. Volume II.
1981-09-01
XXXQ S X3W3 w P-’ 00S00 z~ ML Z.14.J SM C . >C.> 3c Q . . . a.E.C HL 0 c Ccc4 z _,> 0o~. 0 0) "XI 8-4 zw0 Ŕ >e 0 - 0 SMK " CELh uw C WnI-EW e cm (A...Recognizer ----- TOINT ASCII to Integer Converter * - ACESS Access Control Word Recognizer "* -.. STRAIN Input Constraint Processor S----GETBND Input...exclude the screen cursor from these areas. The code for the routine is found in the file ACESS.FLX. STRAIN STRAIN recognizes the various forms of user
Rioualen, Claire; Da Costa, Quentin; Chetrit, Bernard; Charafe-Jauffret, Emmanuelle; Ginestier, Christophe
2017-01-01
High-throughput RNAi screenings (HTS) allow quantifying the impact of the deletion of each gene in any particular function, from virus-host interactions to cell differentiation. However, there has been less development for functional analysis tools dedicated to RNAi analyses. HTS-Net, a network-based analysis program, was developed to identify gene regulatory modules impacted in high-throughput screenings, by integrating transcription factors-target genes interaction data (regulome) and protein-protein interaction networks (interactome) on top of screening z-scores. HTS-Net produces exhaustive HTML reports for results navigation and exploration. HTS-Net is a new pipeline for RNA interference screening analyses that proves better performance than simple gene rankings by z-scores, by re-prioritizing genes and replacing them in their biological context, as shown by the three studies that we reanalyzed. Formatted input data for the three studied datasets, source code and web site for testing the system are available from the companion web site at http://htsnet.marseille.inserm.fr/. We also compared our program with existing algorithms (CARD and hotnet2). PMID:28949986
Assanelli, Deodato; Deodato, Assanelli; Ermolao, Andrea; Andrea, Ermolao; Carre, François; François, Carré; Deligiannis, Asterios; Asterios, Deligiannis; Mellwig, Klaus; Mellwig, Klaus; Klaus, Mellwig; Tahmi, Mohamed; Mohamed, Tahmi; Cesana, Bruno Mario; Mario, Cesana Bruno; Levaggi, Rosella; Rosella, Levaggi; Aliverti, Paola; Paola, Aliverti; Sharma, Sanjay; Sanjay, Sharma
2014-06-01
Most of the available data on the cardiovascular screening of athletes come from Italy, with fewer records being available outside of Italy and for non-Caucasian populations. The goals of the SMILE project (Sport Medicine Intervention to save Lives through ECG) are to evaluate the usefulness of 12-lead ECGs for the detection of cardiac diseases in athletes from three European countries and one African country and to estimate how many second-level examinations are needed subsequent to the initial screening in order to classify athletes with abnormal characteristics. A digital network consisting of Sport Centres and second and third opinion centres was set up in Greece, Germany, France and Algeria. Standard digital data input was carried out through the application of 12-lead ECGs, Bethesda questionnaires and physical examinations. Two hundred ninety-three of the 6,634 consecutive athletes required further evaluation, mostly (88.4 %) as a consequence of abnormal ECGs. After careful evaluation, 237 were determined to be healthy or apparently healthy, while 56 athletes were found to have cardiac disorders and were thus disqualified from active participation in sports. There was a large difference in the prevalence of diseases detected in Europe as compared with Algeria (0.23 and 4.01 %, respectively). Our data confirmed the noteworthy value of 12-lead resting ECGs as compared with other first-level evaluations, especially in athletes with asymptomatic cardiac diseases. Its value seems to have been even higher in Algeria than in the European countries. The establishment of a digital network of Sport Centres for second/third opinions in conjunction with the use of standard digital data input seems to be a valuable means for increasing the effectiveness of screening.
Ghitza, Udi E; Gore-Langton, Robert E; Lindblad, Robert; Shide, David; Subramaniam, Geetha; Tai, Betty
2013-01-01
Electronic health records (EHRs) are essential in improving quality and enhancing efficiency of health-care delivery. By 2015, medical care receiving service reimbursement from US Centers for Medicare and Medicaid Services (CMS) must show 'meaningful use' of EHRs. Substance use disorders (SUD) are grossly under-detected and under-treated in current US medical care settings. Hence, an urgent need exists for improved identification of and clinical intervention for SUD in medical settings. The National Institute on Drug Abuse Clinical Trials Network (NIDA CTN) has leveraged its infrastructure and expertise and brought relevant stakeholders together to develop consensus on brief screening and initial assessment tools for SUD in general medical settings, with the objective of incorporation into US EHRs. Stakeholders were identified and queried for input and consensus on validated screening and assessment for SUD in general medical settings to develop common data elements to serve as shared resources for EHRs on screening, brief intervention and referral to treatment (SBIRT), with the intent of supporting interoperability and data exchange in a developing Nationwide Health Information Network. Through consensus of input from stakeholders, a validated screening and brief assessment instrument, supported by Clinical Decision Support tools, was chosen to be used at out-patient general medical settings. The creation and adoption of a core set of validated common data elements and the inclusion of such consensus-based data elements for general medical settings will enable the integration of SUD treatment within mainstream health care, and support the adoption and 'meaningful use' of the US Office of the National Coordinator for Health Information Technology (ONC)-certified EHRs, as well as CMS reimbursement. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.
A cortical motor nucleus drives the basal ganglia-recipient thalamus in singing birds
Goldberg, Jesse H.
2012-01-01
The pallido-recipient thalamus transmits information from the basal ganglia (BG) to the cortex and plays a critical role motor initiation and learning. Thalamic activity is strongly inhibited by pallidal inputs from the BG, but the role of non-pallidal inputs, such as excitatory inputs from cortex, is unclear. We have recorded simultaneously from presynaptic pallidal axon terminals and postsynaptic thalamocortical neurons in a BG-recipient thalamic nucleus necessary for vocal variability and learning in zebra finches. We found that song-locked rate modulations in the thalamus could not be explained by pallidal inputs alone, and persisted following pallidal lesion. Instead, thalamic activity was likely driven by inputs from a motor ‘cortical’ nucleus also necessary for singing. These findings suggest a role for cortical inputs to the pallido-recipient thalamus in driving premotor signals important for exploratory behavior and learning. PMID:22327474
Kamatuka, Kenta; Hattori, Masahiro; Sugiyama, Tomoyasu
2016-12-01
RNA interference (RNAi) screening is extensively used in the field of reverse genetics. RNAi libraries constructed using random oligonucleotides have made this technology affordable. However, the new methodology requires exploration of the RNAi target gene information after screening because the RNAi library includes non-natural sequences that are not found in genes. Here, we developed a web-based tool to support RNAi screening. The system performs short hairpin RNA (shRNA) target prediction that is informed by comprehensive enquiry (SPICE). SPICE automates several tasks that are laborious but indispensable to evaluate the shRNAs obtained by RNAi screening. SPICE has four main functions: (i) sequence identification of shRNA in the input sequence (the sequence might be obtained by sequencing clones in the RNAi library), (ii) searching the target genes in the database, (iii) demonstrating biological information obtained from the database, and (iv) preparation of search result files that can be utilized in a local personal computer (PC). Using this system, we demonstrated that genes targeted by random oligonucleotide-derived shRNAs were not different from those targeted by organism-specific shRNA. The system facilitates RNAi screening, which requires sequence analysis after screening. The SPICE web application is available at http://www.spice.sugysun.org/.
Lee, Hee Yun; Roh, Soonhee; Vang, Suzanne; Jin, Seok Won
2011-01-01
Despite the proven benefits of Pap testing, Korean American women have one of the lowest cervical cancer screening rates in the United States. This study examined how cultural factors are associated with Pap test utilization among Korean American women participants. Quota sampling was used to recruit 202 Korean American women participants residing in New York City. Hierarchical logistic regression was used to assess the association of cultural variables with Pap test receipt. Overall, participants in our study reported significantly lower Pap test utilization; only 58% reported lifetime receipt of this screening test. Logistic regression analysis revealed one of the cultural variables--prevention orientation--was the strongest correlate of recent Pap test use. Older age and married status were also found to be significant predictors of Pap test use. Findings suggest cultural factors should be considered in interventions promoting cervical cancer screening among Korean American women. Furthermore, younger Korean American women and those not living with a spouse/partner should be targeted in cervical cancer screening efforts.
Screening and clustering of sparse regressions with finite non-Gaussian mixtures.
Zhang, Jian
2017-06-01
This article proposes a method to address the problem that can arise when covariates in a regression setting are not Gaussian, which may give rise to approximately mixture-distributed errors, or when a true mixture of regressions produced the data. The method begins with non-Gaussian mixture-based marginal variable screening, followed by fitting a full but relatively smaller mixture regression model to the selected data with help of a new penalization scheme. Under certain regularity conditions, the new screening procedure is shown to possess a sure screening property even when the population is heterogeneous. We further prove that there exists an elbow point in the associated scree plot which results in a consistent estimator of the set of active covariates in the model. By simulations, we demonstrate that the new procedure can substantially improve the performance of the existing procedures in the content of variable screening and data clustering. By applying the proposed procedure to motif data analysis in molecular biology, we demonstrate that the new method holds promise in practice. © 2016, The International Biometric Society.
A Neuron-Based Screening Platform for Optimizing Genetically-Encoded Calcium Indicators
Schreiter, Eric R.; Hasseman, Jeremy P.; Tsegaye, Getahun; Fosque, Benjamin F.; Behnam, Reza; Shields, Brenda C.; Ramirez, Melissa; Kimmel, Bruce E.; Kerr, Rex A.; Jayaraman, Vivek; Looger, Loren L.; Svoboda, Karel; Kim, Douglas S.
2013-01-01
Fluorescent protein-based sensors for detecting neuronal activity have been developed largely based on non-neuronal screening systems. However, the dynamics of neuronal state variables (e.g., voltage, calcium, etc.) are typically very rapid compared to those of non-excitable cells. We developed an electrical stimulation and fluorescence imaging platform based on dissociated rat primary neuronal cultures. We describe its use in testing genetically-encoded calcium indicators (GECIs). Efficient neuronal GECI expression was achieved using lentiviruses containing a neuronal-selective gene promoter. Action potentials (APs) and thus neuronal calcium levels were quantitatively controlled by electrical field stimulation, and fluorescence images were recorded. Images were segmented to extract fluorescence signals corresponding to individual GECI-expressing neurons, which improved sensitivity over full-field measurements. We demonstrate the superiority of screening GECIs in neurons compared with solution measurements. Neuronal screening was useful for efficient identification of variants with both improved response kinetics and high signal amplitudes. This platform can be used to screen many types of sensors with cellular resolution under realistic conditions where neuronal state variables are in relevant ranges with respect to timing and amplitude. PMID:24155972
Prestressed elastomer for energy storage
Hoppie, Lyle O.; Speranza, Donald
1982-01-01
Disclosed is a regenerative braking device for an automotive vehicle. The device includes a power isolating assembly (14), an infinitely variable transmission (20) interconnecting an input shaft (16) with an output shaft (18), and an energy storage assembly (22). The storage assembly includes a plurality of elastomeric rods (44, 46) mounted for rotation and connected in series between the input and output shafts. The elastomeric rods are prestressed along their rotational or longitudinal axes to inhibit buckling of the rods due to torsional stressing of the rods in response to relative rotation of the input and output shafts.
Yiadom, Maame Yaa A B; Baugh, Christopher W; McWade, Conor M; Liu, Xulei; Song, Kyoung Jun; Patterson, Brian W; Jenkins, Cathy A; Tanski, Mary; Mills, Angela M; Salazar, Gilberto; Wang, Thomas J; Dittus, Robert S; Liu, Dandan; Storrow, Alan B
2017-02-23
Timely diagnosis of ST-segment elevation myocardial infarction (STEMI) in the emergency department (ED) is made solely by ECG. Obtaining this test within 10 minutes of ED arrival is critical to achieving the best outcomes. We investigated variability in the timely identification of STEMI across institutions and whether performance variation was associated with the ED characteristics, the comprehensiveness of screening criteria, and the STEMI screening processes. We examined STEMI screening performance in 7 EDs, with the missed case rate (MCR) as our primary end point. The MCR is the proportion of primarily screened ED patients diagnosed with STEMI who did not receive an ECG within 15 minutes of ED arrival. STEMI was defined by hospital discharge diagnosis. Relationships between the MCR and ED characteristics, screening criteria, and STEMI screening processes were assessed, along with differences in door-to-ECG times for captured versus missed patients. The overall MCR for all 7 EDs was 12.8%. The lowest and highest MCRs were 3.4% and 32.6%, respectively. The mean difference in door-to-ECG times for captured and missed patients was 31 minutes, with a range of 14 to 80 minutes of additional myocardial ischemia time for missed cases. The prevalence of primarily screened ED STEMIs was 0.09%. EDs with the greatest informedness (sensitivity+specificity-1) demonstrated superior performance across all other screening measures. The 29.2% difference in MCRs between the highest and lowest performing EDs demonstrates room for improving timely STEMI identification among primarily screened ED patients. The MCR and informedness can be used to compare screening across EDs and to understand variable performance. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Variables in screening for resistance to Huanglongbing
USDA-ARS?s Scientific Manuscript database
A series of experiments were initiated to assess factors which might permit more rapid screening for huanglongbing (HLB) resistance, using sweet orange in all experiments and Carrizo and/or Temple as sources of resistance/tolerance. Numerous researchers working on huanglongbing provided observations...
Feeney, Daniel F; Mani, Diba; Enoka, Roger M
2018-06-07
We investigated the associations between grooved pegboard times, force steadiness (coefficient of variation for force), and variability in an estimate of the common synaptic input to motor neurons innervating the wrist extensor muscles during steady contractions performed by young and older adults. The discharge times of motor units were derived from recordings obtained with high-density surface electrodes while participants performed steady isometric contractions at 10% and 20% of maximal voluntary contraction (MVC) force. The steady contractions were performed with a pinch grip and wrist extension, both independently (single action) and concurrently (double action). The variance in common synaptic input to motor neurons was estimated with a state-space model of the latent common input dynamics. There was a statistically significant association between the coefficient of variation for force during the steady contractions and the estimated variance in common synaptic input in young (r 2 = 0.31) and older (r 2 = 0.39) adults, but not between either the mean or the coefficient of variation for interspike interval of single motor units with the coefficient of variation for force. Moreover, the estimated variance in common synaptic input during the double-action task with the wrist extensors at the 20% target was significantly associated with grooved pegboard time (r 2 = 0.47) for older adults, but not young adults. These findings indicate that longer pegboard times of older adults were associated with worse force steadiness and greater fluctuations in the estimated common synaptic input to motor neurons during steady contractions. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.
Using Natural Language to Enhance Mission Effectiveness
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.; Meszaros, Erica
2016-01-01
The availability of highly capable, yet relatively cheap, unmanned aerial vehicles (UAVs) is opening up new areas of use for hobbyists and for professional-related activities. The driving function of this research is allowing a non-UAV pilot, an operator, to define and manage a mission. This paper describes the preliminary usability measures of an interface that allows an operator to define the mission using speech to make inputs. An experiment was conducted to begin to enumerate the efficacy and user acceptance of using voice commands to define a multi-UAV mission and to provide high-level vehicle control commands such as "takeoff." The primary independent variable was input type - voice or mouse. The primary dependent variables consisted of the correctness of the mission parameter inputs and the time needed to make all inputs. Other dependent variables included NASA-TLX workload ratings and subjective ratings on a final questionnaire. The experiment required each subject to fill in an online form that contained comparable required information that would be needed for a package dispatcher to deliver packages. For each run, subjects typed in a simple numeric code for the package code. They then defined the initial starting position, the delivery location, and the return location using either pull-down menus or voice input. Voice input was accomplished using CMU Sphinx4-5prealpha for speech recognition. They then inputted the length of the package. These were the option fields. The subject had the system "Calculate Trajectory" and then "Takeoff" once the trajectory was calculated. Later, the subject used "Land" to finish the run. After the voice and mouse input blocked runs, subjects completed a NASA-TLX. At the conclusion of all runs, subjects completed a questionnaire asking them about their experience in inputting the mission parameters, and starting and stopping the mission using mouse and voice input. In general, the usability of voice commands is acceptable. With a relatively well-defined and simple vocabulary, the operator can input the vast majority of the mission parameters using simple, intuitive voice commands. However, voice input may be more applicable to initial mission specification rather than for critical commands such as the need to land immediately due to time and feedback constraints. It would also be convenient to retrieve relevant mission information using voice input. Therefore, further on-going research is looking at using intent from operator utterances to provide the relevant mission information to the operator. The information displayed will be inferred from the operator's utterances just before key phrases are spoken. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables us to predict the operator's intent and supply the operator's desired information to the interface. This paper also describes preliminary investigations into the generation of the semantic space of UAV operation and the success at providing information to the interface based on the operator's utterances.
Hamashima, Chisato; Sano, Hiroshi
2018-03-27
Despite the long history of cancer screening in Japan, the participation rates in gastric and colorectal cancer screenings have not increased. Strategies for improving the participation rates have been proposed, but differences in their effects among different age groups remain unclear. The Japanese government conducted a national survey in all municipalities in Japan in 2010 to investigate whether the implementation of promotion strategies increased participation in cancer screening. We investigated the association between age factors and strategies for promoting participation in cancer screening based on this national survey. Multiple regression analysis with generalized linear model was performed using the participation rates in gastric and colorectal cancer screenings as dependent variables, and the following strategies for promoting participation as independent variables: 1) personal invitation letters, 2) household invitation letters, 3) home visits by community nurses, 4) screenings in medical offices, and 5) free cancer screening programs. One thousand six hundred thirty nine municipalities for gastric cancer screening and 1666 municipalities for colorectal cancer screening were selected for the analysis. In gastric and colorectal cancer screenings, the participation rates of individuals aged 60-69 years was higher than those of other age groups. Personal and household invitation letters were effective promotion strategies for all age groups, which encouraged even older people to participate in gastric and colorectal cancer screenings. Screening in medical offices and free screenings were not effective in all age groups. Home visits were effective, but their adoption was limited to small municipalities. To clarify whether promotion strategies can increase the participation rate in cancer screening among different age groups, 5 strategies were assessed on the basis of a national survey. Although personal and household invitation letters were effective strategies for promoting participation in cancer screening for all age groups, these strategies equally encouraged older people to participate in gastric and colorectal cancer screenings. If resource for sending invitation letters are limited, priority should be given to individuals who are in their 50s and 60s for gastric and colorectal cancer screening.
A Model of Career Success: A Longitudinal Study of Emergency Physicians
ERIC Educational Resources Information Center
Pachulicz, Sarah; Schmitt, Neal; Kuljanin, Goran
2008-01-01
Objective and subjective career success were hypothesized to mediate the relationships between sociodemographic variables, human capital indices, individual difference variables, and organizational sponsorship as inputs and a retirement decision and intentions to leave either the specialty of emergency medicine (EM) or medicine as output…
Relationship between cotton yield and soil electrical conductivity, topography, and landsat imagery
USDA-ARS?s Scientific Manuscript database
Understanding spatial and temporal variability in crop yield is a prerequisite to implementing site-specific management of crop inputs. Apparent soil electrical conductivity (ECa), soil brightness, and topography are easily obtained data that can explain yield variability. The objectives of this stu...
Input filter compensation for switching regulators
NASA Technical Reports Server (NTRS)
Kelkar, S. S.; Lee, F. C.
1983-01-01
A novel input filter compensation scheme for a buck regulator that eliminates the interaction between the input filter output impedance and the regulator control loop is presented. The scheme is implemented using a feedforward loop that senses the input filter state variables and uses this information to modulate the duty cycle signal. The feedforward design process presented is seen to be straightforward and the feedforward easy to implement. Extensive experimental data supported by analytical results show that significant performance improvement is achieved with the use of feedforward in the following performance categories: loop stability, audiosusceptibility, output impedance and transient response. The use of feedforward results in isolating the switching regulator from its power source thus eliminating all interaction between the regulator and equipment upstream. In addition the use of feedforward removes some of the input filter design constraints and makes the input filter design process simpler thus making it possible to optimize the input filter. The concept of feedforward compensation can also be extended to other types of switching regulators.
Production Economics of Private Forestry: A Comparison of Industrial and Nonindustrial Forest Owners
David H. Newman; David N. Wear
1993-01-01
This paper compares the producrion behavior of industrial and nonindustrial private forestland owners in the southeastern U.S. using a restricted profit function. Profits are modeled as a function of two outputs, sawtimber and pulpwood. one variable input, regeneration effort. and two quasi-fixed inputs, land and growing stock. Although an identical profit function is...
Nitrogen input from residential lawn care practices in suburban watersheds in Baltimore county, MD
Neely L. Law; Lawrence E. Band; J. Morgan Grove
2004-01-01
A residential lawn care survey was conducted as part of the Baltimore Ecosystem Study, a Long-term Ecological Research project funded by the National Science Foundation and collaborating agencies, to estimate the nitrogen input to urban watersheds from lawn care practices. The variability in the fertilizer N application rates and the factors affecting the application...
Darold P. Batzer; Brian J. Palik
2007-01-01
Aquatic invertebrates are crucial components of foodwebs in seasonal woodland ponds, and leaf litter is probably the most important food resource for those organisms. We quantified the influence of leaf litter inputs on aquatic invertebrates in two seasonal woodland ponds using an interception experiment. Ponds were hydrologically split using a sandbag-plastic barrier...
J.W. Hornbeck; S.W. Bailey; D.C. Buso; J.B. Shanley
1997-01-01
Chemistry of precipitation and streamwater and resulting input-output budgets for nutrient ions were determined concurrently for three years on three upland, forested watersheds located within an 80 km radius in central New England. Chemistry of precipitation and inputs of nutrients via wet deposition were similar among the three watersheds and were generally typical...
Fusion of Hard and Soft Information in Nonparametric Density Estimation
2015-06-10
and stochastic optimization models, in analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum...particular, density estimation is needed for generation of input densities to simulation and stochastic optimization models, in analysis of simulation output...an essential step in simulation analysis and stochastic optimization is the generation of probability densities for input random variables; see for
ERIC Educational Resources Information Center
Huang, Francis L.; Konold, Timothy R.
2014-01-01
Psychometric properties of the Phonological Awareness Literacy Screening for Kindergarten (PALS-K) instrument were investigated in a sample of 2844 first-time public school kindergarteners. PALS-K is a widely used English literacy screening assessment. Exploratory factor analysis revealed a theoretically defensible measurement structure that was…
Variable input observer for structural health monitoring of high-rate systems
NASA Astrophysics Data System (ADS)
Hong, Jonathan; Laflamme, Simon; Cao, Liang; Dodson, Jacob
2017-02-01
The development of high-rate structural health monitoring methods is intended to provide damage detection on timescales of 10 µs -10ms where speed of detection is critical to maintain structural integrity. Here, a novel Variable Input Observer (VIO) coupled with an adaptive observer is proposed as a potential solution for complex high-rate problems. The VIO is designed to adapt its input space based on real-time identification of the system's essential dynamics. By selecting appropriate time-delayed coordinates defined by both a time delay and an embedding dimension, the proper input space is chosen which allows more accurate estimations of the current state and a reduction of the convergence rate. The optimal time-delay is estimated based on mutual information, and the embedding dimension is based on false nearest neighbors. A simulation of the VIO is conducted on a two degree-of-freedom system with simulated damage. Results are compared with an adaptive Luenberger observer, a fixed time-delay observer, and a Kalman Filter. Under its preliminary design, the VIO converges significantly faster than the Luenberger and fixed observer. It performed similarly to the Kalman Filter in terms of convergence, but with greater accuracy.
Family medicine outpatient encounters are more complex than those of cardiology and psychiatry.
Katerndahl, David; Wood, Robert; Jaén, Carlos Roberto
2011-01-01
comparison studies suggest that the guideline-concordant care provided for specific medical conditions is less optimal in primary care compared with cardiology and psychiatry settings. The purpose of this study is to estimate the relative complexity of patient encounters in general/family practice, cardiology, and psychiatry settings. secondary analysis of the 2000 National Ambulatory Medical Care Survey data for ambulatory patients seen in general/family practice, cardiology, and psychiatry settings was performed. The complexity for each variable was estimated as the quantity weighted by variability and diversity. there is minimal difference in the unadjusted input and total encounter complexity of general/family practice and cardiology; psychiatry's input is less complex. Cardiology encounters involved more input quantitatively, but the diversity of general/family practice input eliminated the difference. Cardiology also involved more complex output. However, when the duration of visit is factored in, the complexity of care provided per hour in general/family practice is 33% more relative to cardiology and 5 times more relative to psychiatry. care during family physician visits is more complex per hour than the care during visits to cardiologists or psychiatrists. This may account for a lower rate of completion of process items measured for quality of care.
Chien, Jung Hung; Mukherjee, Mukul; Siu, Ka-Chun; Stergiou, Nicholas
2016-05-01
When maintaining postural stability temporally under increased sensory conflict, a more rigid response is used where the available degrees of freedom are essentially frozen. The current study investigated if such a strategy is also utilized during more dynamic situations of postural control as is the case with walking. This study attempted to answer this question by using the Locomotor Sensory Organization Test (LSOT). This apparatus incorporates SOT inspired perturbations of the visual and the somatosensory system. Ten healthy young adults performed the six conditions of the traditional SOT and the corresponding six conditions on the LSOT. The temporal structure of sway variability was evaluated from all conditions. The results showed that in the anterior posterior direction somatosensory input is crucial for postural control for both walking and standing; visual input also had an effect but was not as prominent as the somatosensory input. In the medial lateral direction and with respect to walking, visual input has a much larger effect than somatosensory input. This is possibly due to the added contributions by peripheral vision during walking; in standing such contributions may not be as significant for postural control. In sum, as sensory conflict increases more rigid and regular sway patterns are found during standing confirming the previous results presented in the literature, however the opposite was the case with walking where more exploratory and adaptive movement patterns are present.
A dual-input nonlinear system analysis of autonomic modulation of heart rate
NASA Technical Reports Server (NTRS)
Chon, K. H.; Mullen, T. J.; Cohen, R. J.
1996-01-01
Linear analyses of fluctuations in heart rate and other hemodynamic variables have been used to elucidate cardiovascular regulatory mechanisms. The role of nonlinear contributions to fluctuations in hemodynamic variables has not been fully explored. This paper presents a nonlinear system analysis of the effect of fluctuations in instantaneous lung volume (ILV) and arterial blood pressure (ABP) on heart rate (HR) fluctuations. To successfully employ a nonlinear analysis based on the Laguerre expansion technique (LET), we introduce an efficient procedure for broadening the spectral content of the ILV and ABP inputs to the model by adding white noise. Results from computer simulations demonstrate the effectiveness of broadening the spectral band of input signals to obtain consistent and stable kernel estimates with the use of the LET. Without broadening the band of the ILV and ABP inputs, the LET did not provide stable kernel estimates. Moreover, we extend the LET to the case of multiple inputs in order to accommodate the analysis of the combined effect of ILV and ABP effect on heart rate. Analyzes of data based on the second-order Volterra-Wiener model reveal an important contribution of the second-order kernels to the description of the effect of lung volume and arterial blood pressure on heart rate. Furthermore, physiological effects of the autonomic blocking agents propranolol and atropine on changes in the first- and second-order kernels are also discussed.
Information management and analysis system for groundwater data in Thailand
NASA Astrophysics Data System (ADS)
Gill, D.; Luckananurung, P.
1992-01-01
The Ground Water Division of the Thai Department of Mineral Resources maintains a large archive of groundwater data with information on some 50,000 water wells. Each well file contains information on well location, well completion, borehole geology, water levels, water quality, and pumping tests. In order to enable efficient use of this information a computer-based system for information management and analysis was created. The project was sponsored by the United Nations Development Program and the Thai Department of Mineral Resources. The system was designed to serve users who lack prior training in automated data processing. Access is through a friendly user/system dialogue. Tasks are segmented into a number of logical steps, each of which is managed by a separate screen. Selective retrieval is possible by four different methods of area definition and by compliance with user-specified constraints on any combination of database variables. The main types of outputs are: (1) files of retrieved data, screened according to users' specifications; (2) an assortment of pre-formatted reports; (3) computed geochemical parameters and various diagrams of water chemistry derived therefrom; (4) bivariate scatter diagrams and linear regression analysis; (5) posting of data and computed results on maps; and (6) hydraulic aquifer characteristics as computed from pumping tests. Data are entered directly from formatted screens. Most records can be copied directly from hand-written documents. The database-management program performs data integrity checks in real time, enabling corrections at the time of input. The system software can be grouped into: (1) database administration and maintenance—these functions are carried out by the SIR/DBMS software package; (2) user communication interface for task definition and execution control—the interface is written in the operating system command language (VMS/DCL) and in FORTRAN 77; and (3) scientific data-processing programs, written in FORTRAN 77. The system was implemented on a DEC MicroVAX II computer.
Effects of Screening and Systemic Adjuvant Therapy on ER-Specific US Breast Cancer Mortality
Munoz, Diego; Near, Aimee M.; van Ravesteyn, Nicolien T.; Lee, Sandra J.; Schechter, Clyde B.; Alagoz, Oguzhan; Berry, Donald A.; Burnside, Elizabeth S.; Chang, Yaojen; Chisholm, Gary; de Koning, Harry J.; Ali Ergun, Mehmet; Heijnsdijk, Eveline A. M.; Huang, Hui; Stout, Natasha K.; Sprague, Brian L.; Trentham-Dietz, Amy; Mandelblatt, Jeanne S.
2014-01-01
Background Molecular characterization of breast cancer allows subtype-directed interventions. Estrogen receptor (ER) is the longest-established molecular marker. Methods We used six established population models with ER-specific input parameters on age-specific incidence, disease natural history, mammography characteristics, and treatment effects to quantify the impact of screening and adjuvant therapy on age-adjusted US breast cancer mortality by ER status from 1975 to 2000. Outcomes included stage-shifts and absolute and relative reductions in mortality; sensitivity analyses evaluated the impact of varying screening frequency or accuracy. Results In the year 2000, actual screening and adjuvant treatment reduced breast cancer mortality by a median of 17 per 100000 women (model range = 13–21) and 5 per 100000 women (model range = 3–6) for ER-positive and ER-negative cases, respectively, relative to no screening and no adjuvant treatment. For ER-positive cases, adjuvant treatment made a higher relative contribution to breast cancer mortality reduction than screening, whereas for ER-negative cases the relative contributions were similar for screening and adjuvant treatment. ER-negative cases were less likely to be screen-detected than ER-positive cases (35.1% vs 51.2%), but when screen-detected yielded a greater survival gain (five-year breast cancer survival = 35.6% vs 30.7%). Screening biennially would have captured a lower proportion of mortality reduction than annual screening for ER-negative vs ER-positive cases (model range = 80.2%–87.8% vs 85.7%–96.5%). Conclusion As advances in risk assessment facilitate identification of women with increased risk of ER-negative breast cancer, additional mortality reductions could be realized through more frequent targeted screening, provided these benefits are balanced against screening harms. PMID:25255803
Performance Studies on Distributed Virtual Screening
Krüger, Jens; de la Garza, Luis; Kohlbacher, Oliver; Nagel, Wolfgang E.
2014-01-01
Virtual high-throughput screening (vHTS) is an invaluable method in modern drug discovery. It permits screening large datasets or databases of chemical structures for those structures binding possibly to a drug target. Virtual screening is typically performed by docking code, which often runs sequentially. Processing of huge vHTS datasets can be parallelized by chunking the data because individual docking runs are independent of each other. The goal of this work is to find an optimal splitting maximizing the speedup while considering overhead and available cores on Distributed Computing Infrastructures (DCIs). We have conducted thorough performance studies accounting not only for the runtime of the docking itself, but also for structure preparation. Performance studies were conducted via the workflow-enabled science gateway MoSGrid (Molecular Simulation Grid). As input we used benchmark datasets for protein kinases. Our performance studies show that docking workflows can be made to scale almost linearly up to 500 concurrent processes distributed even over large DCIs, thus accelerating vHTS campaigns significantly. PMID:25032219
Peikert, Tobias; Duan, Fenghai; Rajagopalan, Srinivasan; Karwoski, Ronald A; Clay, Ryan; Robb, Richard A; Qin, Ziling; Sicks, JoRean; Bartholmai, Brian J; Maldonado, Fabien
2018-01-01
Optimization of the clinical management of screen-detected lung nodules is needed to avoid unnecessary diagnostic interventions. Herein we demonstrate the potential value of a novel radiomics-based approach for the classification of screen-detected indeterminate nodules. Independent quantitative variables assessing various radiologic nodule features such as sphericity, flatness, elongation, spiculation, lobulation and curvature were developed from the NLST dataset using 726 indeterminate nodules (all ≥ 7 mm, benign, n = 318 and malignant, n = 408). Multivariate analysis was performed using least absolute shrinkage and selection operator (LASSO) method for variable selection and regularization in order to enhance the prediction accuracy and interpretability of the multivariate model. The bootstrapping method was then applied for the internal validation and the optimism-corrected AUC was reported for the final model. Eight of the originally considered 57 quantitative radiologic features were selected by LASSO multivariate modeling. These 8 features include variables capturing Location: vertical location (Offset carina centroid z), Size: volume estimate (Minimum enclosing brick), Shape: flatness, Density: texture analysis (Score Indicative of Lesion/Lung Aggression/Abnormality (SILA) texture), and surface characteristics: surface complexity (Maximum shape index and Average shape index), and estimates of surface curvature (Average positive mean curvature and Minimum mean curvature), all with P<0.01. The optimism-corrected AUC for these 8 features is 0.939. Our novel radiomic LDCT-based approach for indeterminate screen-detected nodule characterization appears extremely promising however independent external validation is needed.
An Introduction to the Outcomes of Children with Hearing Loss Study
Moeller, Mary Pat; Tomblin, J. Bruce
2015-01-01
The landscape of service provision for young children with hearing loss has shifted in recent years as a result of newborn hearing screening and the early provision of interventions, including hearing technologies. It is expected that early service provision will minimize or prevent linguistic delays that typically accompany untreated permanent childhood hearing loss. The post-newborn hearing screening era has seen a resurgence of interest in empirically examining the outcomes of children with hearing loss to determine if service innovations have resulted in expected improvements in children’s functioning. The Outcomes of Children with Hearing Loss (OCHL) project was among these recent research efforts, and this introductory article provides background in the form of literature review and theoretical discussion to support the goals of the study. The OCHL project was designed to examine the language and auditory outcomes of infants and preschool-aged children with permanent, bilateral, mild-to-severe hearing loss and to identify factors that moderate the relationship between hearing loss and longitudinal outcomes. We propose that children who are hard of hearing experience limitations in access to linguistic input, which lead to a decrease in uptake of language exposure and an overall reduction in linguistic experience. We explore this hypothesis in relation to three primary factors that are proposed to influence children’s access to linguistic input: aided audibility, duration and consistency of hearing aid (HA) use, and characteristics of caregiver input. PMID:26731159
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Coffey, Victoria N.; Parker, Linda N.; Blackwell, William C., Jr.; Jun, Insoo; Garrett, Henry B.
2007-01-01
The NUMIT 1-dimensional bulk charging model is used as a screening to ol for evaluating time-dependent bulk internal or deep dielectric) ch arging of dielectrics exposed to penetrating electron environments. T he code is modified to accept time dependent electron flux time serie s along satellite orbits for the electron environment inputs instead of using the static electron flux environment input originally used b y the code and widely adopted in bulk charging models. Application of the screening technique ts demonstrated for three cases of spacecraf t exposure within the Earth's radiation belts including a geostationa ry transfer orbit and an Earth-Moon transit trajectory for a range of orbit inclinations. Electric fields and charge densities are compute d for dielectric materials with varying electrical properties exposed to relativistic electron environments along the orbits. Our objectiv e is to demonstrate a preliminary application of the time-dependent e nvironments input to the NUMIT code for evaluating charging risks to exposed dielectrics used on spacecraft when exposed to the Earth's ra diation belts. The results demonstrate that the NUMIT electric field values in GTO orbits with multiple encounters with the Earth's radiat ion belts are consistent with previous studies of charging in GTO orb its and that potential threat conditions for electrostatic discharge exist on lunar transit trajectories depending on the electrical proper ties of the materials exposed to the radiation environment.
Public health advocacy in action: the case of unproven breast cancer screening in Australia.
Johnson, Rebecca S; Croager, Emma J; Kameron, Caitlin B; Pratt, Iain S; Vreugdenburg, Thomas D; Slevin, Terry
2016-09-30
In recent years, nonmammographic breast imaging devices, such as thermography, electrical impedance scanning and elastography, have been promoted directly to consumers, which has captured the attention of governments, researchers and health organisations. These devices are not supported by evidence and risk undermining existing mammographic breast cancer screening services. During a 5-year period, Cancer Council Western Australia (CCWA) used strategic research combined with legal, policy and media advocacy to contest claims that these devices were proven alternatives to mammography for breast cancer screening. The campaign was successful because it had input from people with public health, academic, clinical and legal backgrounds, and took advantage of existing legal and regulatory avenues. CCWA's experience provides a useful advocacy model for public health practitioners who are concerned about unsafe consumer products, unproven medical devices, and misleading health information and advertising.
Investigating the disparities in cervical cancer screening among Namibian women.
Kangmennaang, Joseph; Thogarapalli, Nandini; Mkandawire, Paul; Luginaah, Isaac
2015-08-01
We examined the influence of knowledge and information, health care access and different socio-economic variables on women's decision to screen for cervical cancer using a nationally representative dataset. We use hierarchical binary logit regression models to explore the determinants of screening for cervical cancer among women who reported hearing about cervical cancer. This enabled us to include the effect of unobserved heterogeneity at the cluster level that may affect screening behaviors. Among women who have heard about cervical cancer (N=6542), only 39% of them did undergo screening with a mean age of 33 years. The univariate results reveal that women who are educated, insured, can afford money needed for treatment and reported distance not a barrier to accessing healthcare were more likely to screen. Our multivariate results indicate that insured women (OR=1.89, p=0.001) and women who had access to information through education and contact with a health worker (OR=1.41, p=0.001) were more likely to undertake screening compared to uninsured women and those with no contact with a health personnel, after controlling for relevant variables. The adoption of a universal health insurance scheme that ensures equity in access to health care and extension of public health information targeting women in rural communities especially within the Caprivi region may be needed for a large scale increase in cervical cancer screening in Namibia. Copyright © 2015 Elsevier Inc. All rights reserved.
Sedimentation in the chaparral: how do you handle unusual events?
Raymond M. Rice
1982-01-01
Abstract - Processes of erosion and sedimentation in steep chaparral drainage basins of southern California are described. The word ""hyperschedastic"" is coined to describe the sedimentation regime which is highly variable because of the interaction of marginally stable drainage basins, great variability in storm inputs, and the random occurrence...
A Longitudinal Study of Occupational Aspirations and Attainments of Iowa Young Adults.
ERIC Educational Resources Information Center
Yoesting, Dean R.; And Others
The causal linkage between socioeconomic status, occupational and educational aspiration, and attainment was examined in this attempt to test an existing theoretical model which used socioeconomic status as a major input variable, with significant other influence as a crucial intervening variable between socioeconomic status and aspiration. The…
NASA Astrophysics Data System (ADS)
Fioretti, Guido
2007-02-01
The productions function maps the inputs of a firm or a productive system onto its outputs. This article expounds generalizations of the production function that include state variables, organizational structures and increasing returns to scale. These extensions are needed in order to explain the regularities of the empirical distributions of certain economic variables.
Passik, Steven D; Lowery, Amy
2011-06-01
Opioid-related deaths in the United States have become a public health problem, with accidental and unintended overdoses being especially troubling. Screening for psychological risk factors is an important first step in safeguarding against nonadherence practices and identifying patients who may be vulnerable to the risks associated with opioid therapy. Validated screening instruments can aid in this attempt as a complementary tool to clinicians' assessments. A structured screening is imperative as part of an assessment, as clinician judgment is not the most reliable method of identifying nonadherence. As a complement to formal screening, we present for discussion and possible future study certain psychological variables observed during years of clinical practice that may be linked to medication nonadherence and accidental overdose. These variables include catastrophizing, fear, impulsivity, attention deficit disorders, existential distress, and certain personality disorders. In our experience, chronic pain patients with dual diagnoses may become "chemical copers" as a way of coping with their negative emotion. For these patients, times of stress could lead to accidental overdose. Behavioral, cognitive-behavioral (acceptance and commitment, dialectical behavior), existential (meaning-centered, dignity), and psychotropic therapies have been effective in treating these high-risk comorbidities, while managing expectations of pain relief appears key to preventing accidental overdose. Wiley Periodicals, Inc.
Passik, Steven D.; Lowery, Amy
2014-01-01
Opioid-related deaths in the United States have become a public health problem, with accidental and unintended overdoses being especially troubling. Screening for psychological risk factors is an important first step in safeguarding against nonadherence practices and identifying patients who may be vulnerable to the risks associated with opioid therapy. Validated screening instruments can aid in this attempt as a complementary tool to clinicians’ assessments. A structured screening is imperative as part of an assessment, as clinician judgment is not the most reliable method of identifying nonadherence. As a complement to formal screening, we present for discussion and possible future study certain psychological variables observed during years of clinical practice that may be linked to medication nonadherence and accidental overdose. These variables include catastrophizing, fear, impulsivity, attention deficit disorders, existential distress, and certain personality disorders. In our experience, chronic pain patients with dual diagnoses may become “chemical copers” as a way of coping with their negative emotion. For these patients, times of stress could lead to accidental overdose. Behavioral, cognitive-behavioral (acceptance and commitment, dialectical behavior), existential (meaning-centered, dignity), and psychotropic therapies have been effective in treating these high-risk comorbidities, while managing expectations of pain relief appears key to preventing accidental overdose. PMID:21668755
Guo, Weixing; Langevin, C.D.
2002-01-01
This report documents a computer program (SEAWAT) that simulates variable-density, transient, ground-water flow in three dimensions. The source code for SEAWAT was developed by combining MODFLOW and MT3DMS into a single program that solves the coupled flow and solute-transport equations. The SEAWAT code follows a modular structure, and thus, new capabilities can be added with only minor modifications to the main program. SEAWAT reads and writes standard MODFLOW and MT3DMS data sets, although some extra input may be required for some SEAWAT simulations. This means that many of the existing pre- and post-processors can be used to create input data sets and analyze simulation results. Users familiar with MODFLOW and MT3DMS should have little difficulty applying SEAWAT to problems of variable-density ground-water flow.
Trends in Solidification Grain Size and Morphology for Additive Manufacturing of Ti-6Al-4V
NASA Astrophysics Data System (ADS)
Gockel, Joy; Sheridan, Luke; Narra, Sneha P.; Klingbeil, Nathan W.; Beuth, Jack
2017-12-01
Metal additive manufacturing (AM) is used for both prototyping and production of final parts. Therefore, there is a need to predict and control the microstructural size and morphology. Process mapping is an approach that represents AM process outcomes in terms of input variables. In this work, analytical, numerical, and experimental approaches are combined to provide a holistic view of trends in the solidification grain structure of Ti-6Al-4V across a wide range of AM process input variables. The thermal gradient is shown to vary significantly through the depth of the melt pool, which precludes development of fully equiaxed microstructure throughout the depth of the deposit within any practical range of AM process variables. A strategy for grain size control is demonstrated based on the relationship between melt pool size and grain size across multiple deposit geometries, and additional factors affecting grain size are discussed.
Cordovilla-Guardia, S; Vilar-López, R; Lardelli-Claret, P; Navas, J F; Guerrero-López, F; Fernández-Mondéjar, E
To estimate how many of the trauma patients admitted to ICU would be candidates for a secondary prevention programme for trauma related to alcohol or drug use by brief motivational intervention and to define what factors prevent that intervention being performed. All 16-70year old trauma patients (n=242) admitted to ICU in 32 non-consecutive months (November 2011 to March 2015) were included in the study, coinciding with the implementation of a screening and brief motivational intervention programme for trauma patients related to substance consumption. The programme includes screening for exposure to substances at admission. Sociodemographic and clinical variables were collected prospectively. The screening for substances was not performed in 38 (15.7%) of all admitted patients. Of the patients screened, 101 (49.5%) were negative. The variables that in greater proportion impeded intervention between screening positive patients were neurological damage due to the trauma with 23 patients (37.1%) and prior psychiatric disorder with 18 (29%). Both variables were associated with substance consumption: negatives 9.9% vs positive 22.3% (P=.001) and negatives 3% vs positive 17.5% (P=.016) respectively. The number of candidates for motivational intervention was 41, 16.9% of all admitted patients. Almost 2 out of 10 patients were potential candidates. The factors that in a greater proportion precluded the intervention were the same as those associated with consumption. Mortality in ICU was associated with non-compliance with the screening protocol. Copyright © 2017 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Publicado por Elsevier España, S.L.U. All rights reserved.
A service for the application of data quality information to NASA earth science satellite records
NASA Astrophysics Data System (ADS)
Armstrong, E. M.; Xing, Z.; Fry, C.; Khalsa, S. J. S.; Huang, T.; Chen, G.; Chin, T. M.; Alarcon, C.
2016-12-01
A recurring demand in working with satellite-based earth science data records is the need to apply data quality information. Such quality information is often contained within the data files as an array of "flags", but can also be represented by more complex quality descriptions such as combinations of bit flags, or even other ancillary variables that can be applied as thresholds to the geophysical variable of interest. For example, with Level 2 granules from the Group for High Resolution Sea Surface Temperature (GHRSST) project up to 6 independent variables could be used to screen the sea surface temperature measurements on a pixel-by-pixel basis. Quality screening of Level 3 data from the Soil Moisture Active Passive (SMAP) instrument can be become even more complex, involving 161 unique bit states or conditions a user can screen for. The application of quality information is often a laborious process for the user until they understand the implications of all the flags and bit conditions, and requires iterative approaches using custom software. The Virtual Quality Screening Service, a NASA ACCESS project, is addressing these issues and concerns. The project has developed an infrastructure to expose, apply, and extract quality screening information building off known and proven NASA components for data extraction and subset-by-value, data discovery, and exposure to the user of granule-based quality information. Further sharing of results through well-defined URLs and web service specifications has also been implemented. The presentation will focus on overall description of the technologies and informatics principals employed by the project. Examples of implementations of the end-to-end web service for quality screening with GHRSST and SMAP granules will be demonstrated.
Grant, Richard John; Roberts, Karen; Pointon, Carly; Hodgson, Clare; Womersley, Lynsey; Jones, Darren Craig; Tang, Eric
2009-06-01
Compound handling is a fundamental and critical step in compound screening throughout the drug discovery process. Although most compound-handling processes within compound management facilities use 100% DMSO solvent, conventional methods of manual or robotic liquid-handling systems in screening workflows often perform dilutions in aqueous solutions to maintain solvent tolerance of the biological assay. However, the use of aqueous media in these applications can lead to suboptimal data quality due to compound carryover or precipitation during the dilution steps. In cell-based assays, this effect is worsened by the unpredictable physical characteristics of compounds and the low DMSO tolerance within the assay. In some cases, the conventional approaches using manual or automated liquid handling resulted in variable IC(50) dose responses. This study examines the cause of this variability and evaluates the accuracy of screening data in these case studies. A number of liquid-handling options have been explored to address the issues and establish a generic compound-handling workflow to support cell-based screening across our screening functions. The authors discuss the validation of the Labcyte Echo reformatter as an effective noncontact solution for generic compound-handling applications against diverse compound classes using triple-quad liquid chromatography/mass spectrometry. The successful validation and implementation challenges of this technology for direct dosing onto cells in cell-based screening is discussed.
Relationship of CogScreen-AE to flight simulator performance and pilot age.
Taylor, J L; O'Hara, R; Mumenthaler, M S; Yesavage, J A
2000-04-01
We report on the relationship between CogScreen-Aeromedical Edition (AE) factor scores and flight simulator performance in aircraft pilots aged 50-69. Some 100 licensed, civilian aviators (average age 58+/-5.3 yr) performed aviation tasks in a Frasca model 141 flight simulator and the CogScreen-AE battery. The aviation performance indices were: a) staying on course; b) dialing in communication frequencies; c) avoiding conflicting traffic; d) monitoring cockpit instruments; e) executing the approach; and f) a summary score, which was the mean of these scores. The CogScreen predictors were based on a factor structure reported by Kay (11), which comprised 28 CogScreen scores. Through principal components analysis of Kay's nine factors, we reduced the number of predictors to five composite CogScreen scores: Speed/Working Memory (WM), Visual Associative Memory, Motor Coordination, Tracking, and Attribute Identification. Speed/WM scores had the highest correlation with the flight summary score, Spearman r(rho) = 0.57. A stepwise-forward multiple regression analysis indicated that four CogScreen variables could explain 45% of the variance in flight summary scores. Significant predictors, in order of entry, were: Speed/WM, Visual Associative Memory, Motor Coordination, and Tracking (p<0.05). Pilot age was found to significantly improve prediction beyond that which could be predicted by the four cognitive variables. In addition, there was some evidence for specific ability relationships between certain flight component scores and CogScreen scores, such as approach performance and tracking errors. These data support the validity of CogScreen-AE as a cognitive battery that taps skills relevant to piloting.
NASA Astrophysics Data System (ADS)
Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan
2015-10-01
Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
Brief screening for co-occurring disorders among women entering substance abuse treatment.
Lincoln, Alisa K; Liebschutz, Jane M; Chernoff, Miriam; Nguyen, Dana; Amaro, Hortensia
2006-09-07
Despite the importance of identifying co-occurring psychiatric disorders in substance abuse treatment programs, there are few appropriate and validated instruments available to substance abuse treatment staff to conduct brief screen for these conditions. This paper describes the development, implementation and validation of a brief screening instrument for mental health diagnoses and trauma among a diverse sample of Black, Hispanic and White women in substance abuse treatment. With input from clinicians and consumers, we adapted longer existing validated instruments into a 14 question screen covering demographics, mental health symptoms and physical and sexual violence exposure. All women entering treatment (methadone, residential and out-patient) at five treatment sites were screened at intake (N = 374). Eighty nine percent reported a history of interpersonal violence, and 70% reported a history of sexual assault. Eighty-eight percent reported mental health symptoms in the last 30 days. The screening questions administered to 88 female clients were validated against in-depth psychiatric diagnostic assessments by trained mental health clinicians. We estimated measures of predictive validity, including sensitivity, specificity and predictive values positive and negative. Screening items were examined multiple ways to assess utility. The screen is a useful and valid proxy for PTSD but not for other mental illness. Substance abuse treatment programs should incorporate violence exposure questions into clinical use as a matter of policy. More work is needed to develop brief screening tools measures for front-line treatment staff to accurately assess other mental health needs of women entering substance abuse treatment.
Whittington, Melanie D; Atherly, Adam J; Curtis, Donna J; Lindrooth, Richard C; Bradley, Cathy J; Campbell, Jonathan D
2017-08-01
Patients in the ICU are at the greatest risk of contracting healthcare-associated infections like methicillin-resistant Staphylococcus aureus. This study calculates the cost-effectiveness of methicillin-resistant S aureus prevention strategies and recommends specific strategies based on screening test implementation. A cost-effectiveness analysis using a Markov model from the hospital perspective was conducted to determine if the implementation costs of methicillin-resistant S aureus prevention strategies are justified by associated reductions in methicillin-resistant S aureus infections and improvements in quality-adjusted life years. Univariate and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. ICU. Hypothetical cohort of adults admitted to the ICU. Three prevention strategies were evaluated, including universal decolonization, targeted decolonization, and screening and isolation. Because prevention strategies have a screening component, the screening test in the model was varied to reflect commonly used screening test categories, including conventional culture, chromogenic agar, and polymerase chain reaction. Universal and targeted decolonization are less costly and more effective than screening and isolation. This is consistent for all screening tests. When compared with targeted decolonization, universal decolonization is cost-saving to cost-effective, with maximum cost savings occurring when a hospital uses more expensive screening tests like polymerase chain reaction. Results were robust to sensitivity analyses. As compared with screening and isolation, the current standard practice in ICUs, targeted decolonization, and universal decolonization are less costly and more effective. This supports updating the standard practice to a decolonization approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ide, Toshiki; Hofmann, Holger F.; JST-CREST, Graduate School of Advanced Sciences of Matter, Hiroshima University, Kagamiyama 1-3-1, Higashi Hiroshima 739-8530
The information encoded in the polarization of a single photon can be transferred to a remote location by two-channel continuous-variable quantum teleportation. However, the finite entanglement used in the teleportation causes random changes in photon number. If more than one photon appears in the output, the continuous-variable teleportation accidentally produces clones of the original input photon. In this paper, we derive the polarization statistics of the N-photon output components and show that they can be decomposed into an optimal cloning term and completely unpolarized noise. We find that the accidental cloning of the input photon is nearly optimal at experimentallymore » feasible squeezing levels, indicating that the loss of polarization information is partially compensated by the availability of clones.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Haixia; Zhang, Jing
We propose a scheme for continuous-variable quantum cloning of coherent states with phase-conjugate input modes using linear optics. The quantum cloning machine yields M identical optimal clones from N replicas of a coherent state and N replicas of its phase conjugate. This scheme can be straightforwardly implemented with the setups accessible at present since its optical implementation only employs simple linear optical elements and homodyne detection. Compared with the original scheme for continuous-variable quantum cloning with phase-conjugate input modes proposed by Cerf and Iblisdir [Phys. Rev. Lett. 87, 247903 (2001)], which utilized a nondegenerate optical parametric amplifier, our scheme losesmore » the output of phase-conjugate clones and is regarded as irreversible quantum cloning.« less
Computer program for design analysis of radial-inflow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1976-01-01
A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Kim, Jinwon; Krishna, Bhargavi
2015-08-31
The Alpha 2 release is the second release from the LASSO Pilot Phase that builds upon the Alpha 1 release. Alpha 2 contains additional diagnostics in the data bundles and focuses on cases from spring-summer 2016. A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES input include model configuration information and forcing data. LES output includes profile statistics and full domain fields of cloud and environmental variables. Model evaluation data consists of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.