Comprehensive study of numerical anisotropy and dispersion in 3-D TLM meshes
NASA Astrophysics Data System (ADS)
Berini, Pierre; Wu, Ke
1995-05-01
This paper presents a comprehensive analysis of the numerical anisotropy and dispersion of 3-D TLM meshes constructed using several generalized symmetrical condensed TLM nodes. The dispersion analysis is performed in isotropic lossless, isotropic lossy and anisotropic lossless media and yields a comparison of the simulation accuracy for the different TLM nodes. The effect of mesh grading on the numerical dispersion is also determined. The results compare meshes constructed with Johns' symmetrical condensed node (SCN), two hybrid symmetrical condensed nodes (HSCN) and two frequency domain symmetrical condensed nodes (FDSCN). It has been found that under certain circumstances, the time domain nodes may introduce numerical anisotropy when modelling isotropic media.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.
A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late-times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
NASA Technical Reports Server (NTRS)
Kung, Ernest C.
1994-01-01
The contract research has been conducted in the following three major areas: analysis of numerical simulations and parallel observations of atmospheric blocking, diagnosis of the lower boundary heating and the response of the atmospheric circulation, and comprehensive assessment of long-range forecasting with numerical and regression methods. The essential scientific and developmental purpose of this contract research is to extend our capability of numerical weather forecasting by the comprehensive general circulation model. The systematic work as listed above is thus geared to developing a technological basis for future NASA long-range forecasting.
Zdeněk Kopal: Numerical Analyst
NASA Astrophysics Data System (ADS)
Křížek, M.
2015-07-01
We give a brief overview of Zdeněk Kopal's life, his activities in the Czech Astronomical Society, his collaboration with Vladimír Vand, and his studies at Charles University, Cambridge, Harvard, and MIT. Then we survey Kopal's professional life. He published 26 monographs and 20 conference proceedings. We will concentrate on Kopal's extensive monograph Numerical Analysis (1955, 1961) that is widely accepted to be the first comprehensive textbook on numerical methods. It describes, for instance, methods for polynomial interpolation, numerical differentiation and integration, numerical solution of ordinary differential equations with initial or boundary conditions, and numerical solution of integral and integro-differential equations. Special emphasis will be laid on error analysis. Kopal himself applied numerical methods to celestial mechanics, in particular to the N-body problem. He also used Fourier analysis to investigate light curves of close binaries to discover their properties. This is, in fact, a problem from mathematical analysis.
Improved word comprehension in Global aphasia using a modified semantic feature analysis treatment.
Munro, Philippa; Siyambalapitiya, Samantha
2017-01-01
Limited research has investigated treatment of single word comprehension in people with aphasia, despite numerous studies examining treatment of naming deficits. This study employed a single case experimental design to examine efficacy of a modified semantic feature analysis (SFA) therapy in improving word comprehension in an individual with Global aphasia, who presented with a semantically based comprehension impairment. Ten treatment sessions were conducted over a period of two weeks. Following therapy, the participant demonstrated improved comprehension of treatment items and generalisation to control items, measured by performance on a spoken word picture matching task. Improvements were also observed on other language assessments (e.g. subtests of WAB-R; PALPA subtest 47) and were largely maintained over a period of 12 weeks without further therapy. This study provides support for the efficacy of a modified SFA therapy in remediating single word comprehension in individuals with aphasia with a semantically based comprehension deficit.
Surface electrical properties experiment, Part 3
NASA Technical Reports Server (NTRS)
1974-01-01
A complete unified discussion of the electromagnetic response of a plane stratified structure is reported. A detailed and comprehensive analysis of the theoretical aspects of the electromagnetic problem is given. The numerical problem of computing values of the electromagnetic field strengths is discussed. It is shown that the analysis of conductive media is not far removed from the theoretical analysis, and that the numerical difficulties are not as acute as for the low-loss problem. For Vol. 1, see N75-15570; for Vol. 2, see N75-15571.
Lexical decision as an endophenotype for reading comprehension: An exploration of an association
NAPLES, ADAM; KATZ, LEN; GRIGORENKO, ELENA L.
2012-01-01
Based on numerous suggestions in the literature, we evaluated lexical decision (LD) as a putative endophenotype for reading comprehension by investigating heritability estimates and segregation analyses parameter estimates for both of these phenotypes. Specifically, in a segregation analysis of a large sample of families, we established that there is little to no overlap between genes contributing to LD and reading comprehension and that the genetic mechanism behind LD derived from this analysis appears to be more complex than that for reading comprehension. We conclude that in our sample, LD is not a good candidate as an endophenotype for reading comprehension, despite previous suggestions from the literature. Based on this conclusion, we discuss the role and benefit of the endophenotype approach in studies of complex human cognitive functions. PMID:23062302
Enviroplan—a summary methodology for comprehensive environmental planning and design
Robert Allen Jr.; George Nez; Fred Nicholson; Larry Sutphin
1979-01-01
This paper will discuss a comprehensive environmental assessment methodology that includes a numerical method for visual management and analysis. This methodology employs resource and human activity units as a means to produce a visual form unit which is the fundamental unit of the perceptual environment. The resource unit is based on the ecosystem as the fundamental...
Comprehensive analysis of transport aircraft flight performance
NASA Astrophysics Data System (ADS)
Filippone, Antonio
2008-04-01
This paper reviews the state-of-the-art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.
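To make the specific-air-range figure of merit mentioned above concrete, here is a minimal cruise-performance sketch in Python. The cruise speed, aircraft mass, lift-to-drag ratio and TSFC are illustrative assumptions only, not values taken from the paper's B-777-300/GE-90 validation case.

```python
# Minimal sketch: specific air range (SAR) in steady level cruise.
# All numbers are illustrative assumptions, not values from the paper.

def specific_air_range(tas_ms, mass_kg, lift_to_drag, tsfc_kg_per_Ns, g=9.81):
    """Distance flown per kg of fuel burned in steady level cruise.

    In cruise, thrust ~ drag = weight / (L/D); fuel flow = TSFC * thrust;
    SAR = true airspeed / fuel flow, in metres per kg of fuel.
    """
    thrust_N = mass_kg * g / lift_to_drag
    fuel_flow_kg_s = tsfc_kg_per_Ns * thrust_N
    return tas_ms / fuel_flow_kg_s

# Assumed cruise point: ~250 m/s TAS, 230 t aircraft, L/D = 18, TSFC = 1.6e-5 kg/(N*s)
sar = specific_air_range(250.0, 230e3, 18.0, 1.6e-5)
print(f"specific air range ~ {sar / 1000:.3f} km per kg of fuel")
```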
Hot forming of composite prepreg: Numerical analyses
NASA Astrophysics Data System (ADS)
Guzman-Maldonado, Eduardo; Hamila, Nahiène; Boisse, Philippe; El Azzouzi, Khalid; Tardif, Xavier; Moro, Tanguy; Chatel, Sylvain; Fideu, Paulin
2017-10-01
The work presented here is part of the "FORBANS" project on the Hot Drape Forming (HDF) of unidirectional prepreg laminates. To gain a thorough understanding of this process, a combined experimental and numerical strategy is adopted. This paper focuses on the numerical analysis, using the finite element method (FEM) with a hyperelastic constitutive law. Each prepreg layer is modelled with shell elements that account for the tension, in-plane shear and bending behaviour of the ply at different temperatures. Contact and friction during the forming process are taken into account using forward-increment Lagrange multipliers.
Radio Propagation Prediction Software for Complex Mixed Path Physical Channels
2006-08-14
4.4.6. Applied Linear Regression Analysis in the Frequency Range 1-50 MHz. In order to construct a comprehensive numerical algorithm capable of ...
Bochev, P.; Edwards, H. C.; Kirby, R. C.; ...
2012-01-01
Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use of Intrepid both in the context of numerical PDEs and the more general context of data analysis.
Numerical analysis of ossicular chain lesion of human ear
NASA Astrophysics Data System (ADS)
Liu, Yingxi; Li, Sheng; Sun, Xiuzhen
2009-04-01
Lesion of the ossicular chain is a common ear disease impairing the sense of hearing. A comprehensive numerical model of the human ear can provide a better understanding of sound transmission. In this study, we propose a three-dimensional finite element model of the human ear that incorporates the canal, tympanic membrane, ossicular bones, middle ear suspensory ligaments/muscles, middle ear cavity and inner ear fluid. Numerical analysis is employed to predict the effects of the middle ear cavity, malleus handle defect, hypoplasia of the long process of the incus, and stapedial crus defect on sound transmission. The present finite element model is shown to be reasonable in predicting the ossicular mechanics of the human ear.
From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations
NASA Astrophysics Data System (ADS)
Chun, K.; Kemeny, J.
2017-12-01
Light detection and ranging (LIDAR) is a prevalent remote-sensing technology in the geological fields owing to its high precision and ease of use. One major application is the use of detailed geometrical information of underground structures as the basis for generating three-dimensional numerical models for FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach that integrates LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into finite element models. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed workflow based on LIDAR data is effective and provides a reference for similar engineering projects in practice.
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
Cost-effective use of minicomputers to solve structural problems
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Foster, E. P.
1978-01-01
Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.
NASA Astrophysics Data System (ADS)
Khan, Imad; Ullah, Shafquat; Malik, M. Y.; Hussain, Arif
2018-06-01
The current analysis concentrates on the numerical solution of MHD Carreau fluid flow over a stretching cylinder under the influence of homogeneous-heterogeneous reactions. The modelled non-linear partial differential equations are converted into ordinary differential equations by using suitable transformations. The resulting system of equations is solved with the aid of a shooting algorithm supported by a fifth-order Runge-Kutta integration scheme. The impact of the non-dimensional governing parameters on the velocity, temperature, skin friction coefficient and local Nusselt number is comprehensively delineated with the help of graphs and tables.
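As an illustration of the shooting/Runge-Kutta strategy described above, the sketch below solves the classical Newtonian stretching-sheet boundary-layer problem f''' + f f'' - f'^2 = 0 with f(0) = 0, f'(0) = 1, f'(inf) = 0, whose exact wall value is f''(0) = -1. This is a simpler stand-in, not the paper's Carreau/MHD system (those equations are not reproduced here); the truncated domain and shooting bracket are assumptions.

```python
# Shooting method with an adaptive Runge-Kutta (RK45) integrator, applied to the
# Newtonian stretching-sheet problem as a stand-in for the boundary-layer ODEs
# described above. Domain truncation and the bracket for f''(0) are assumptions.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

ETA_MAX = 5.0  # truncated "infinity"

def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, fp**2 - f * fpp]   # f''' = f'^2 - f f''

def shoot(s):
    """Residual f'(ETA_MAX) for a guessed wall curvature s = f''(0)."""
    sol = solve_ivp(rhs, (0.0, ETA_MAX), [0.0, 1.0, s],
                    method="RK45", rtol=1e-8, atol=1e-10)
    return sol.y[1, -1]   # f' at the far boundary; should vanish

s_star = brentq(shoot, -1.15, -0.85)   # exact value for the untruncated problem: -1
print(f"f''(0) ~ {s_star:.4f}")
```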
Osuch, Tomasz; Markowski, Konrad; Jędrzejewski, Kazimierz
2015-06-10
A versatile numerical model for spectral transmission/reflection, group delay characteristic analysis, and design of tapered fiber Bragg gratings (TFBGs) is presented. This approach ensures flexibility with defining both distribution of refractive index change of the gratings (including apodization) and shape of the taper profile. Additionally, sensing and tunable dispersion properties of the TFBGs were fully examined, considering strain-induced effects. The presented numerical approach, together with Pareto optimization, were also used to design the best tanh apodization profiles of the TFBG in terms of maximizing its spectral width with simultaneous minimization of the group delay oscillations. Experimental verification of the model confirms its correctness. The combination of model versatility and possibility to define the other objective functions of Pareto optimization creates a universal tool for TFBG analysis and design.
Keller, Carmen; Junghans, Alex
2017-11-01
Individuals with low numeracy have difficulties with understanding complex graphs. Combining the information-processing approach to numeracy with graph comprehension and information-reduction theories, we examined whether high numerates' better comprehension might be explained by their closer attention to task-relevant graphical elements, from which they would expect numerical information to understand the graph. Furthermore, we investigated whether participants could be trained in improving their attention to task-relevant information and graph comprehension. In an eye-tracker experiment (N = 110) involving a sample from the general population, we presented participants with 2 hypothetical scenarios (stomach cancer, leukemia) showing survival curves for 2 treatments. In the training condition, participants received written instructions on how to read the graph. In the control condition, participants received another text. We tracked participants' eye movements while they answered 9 knowledge questions. The sum constituted graph comprehension. We analyzed visual attention to task-relevant graphical elements by using relative fixation durations and relative fixation counts. The mediation analysis revealed a significant (P < 0.05) indirect effect of numeracy on graph comprehension through visual attention to task-relevant information, which did not differ between the 2 conditions. Training had a significant main effect on visual attention (P < 0.05) but not on graph comprehension (P < 0.07). Individuals with high numeracy have better graph comprehension due to their greater attention to task-relevant graphical elements than individuals with low numeracy. With appropriate instructions, both groups can be trained to improve their graph-processing efficiency. Future research should examine (e.g., motivational) mediators between visual attention and graph comprehension to develop appropriate instructions that also result in higher graph comprehension.
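The indirect-effect (mediation) computation reported above can be sketched with a percentile bootstrap as follows. The data are simulated placeholders with assumed effect sizes, not the study's eye-tracking measurements.

```python
# Hedged sketch: indirect effect (a*b) of numeracy on graph comprehension via
# visual attention, with a percentile bootstrap. Simulated placeholder data.
import numpy as np

rng = np.random.default_rng(0)
n = 110
numeracy = rng.normal(size=n)
attention = 0.5 * numeracy + rng.normal(scale=0.8, size=n)                 # mediator
comprehension = 0.6 * attention + 0.1 * numeracy + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    """a*b: a = slope of m on x; b = slope of y on m, controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

point = indirect_effect(numeracy, attention, comprehension)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                     # resample with replacement
    boot.append(indirect_effect(numeracy[idx], attention[idx], comprehension[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```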
Presenting numeric information with percentages and descriptive risk labels: A randomized trial
Sinayev, Aleksandr; Peters, Ellen; Tusler, Martin; Fraenkel, Liana
2015-01-01
Background: Previous research demonstrated that providing (vs. not providing) numeric information about medications' adverse effects (AEs) increased comprehension and willingness to use medication, but left open the question about which numeric format is best. Objective: To determine which of four tested formats (percentage, frequency, percentage+risk label, frequency+risk label) maximizes comprehension and willingness to use medication across age and numeracy levels. Design: In a cross-sectional internet survey (N=368; American Life Panel, 5/15/08–6/18/08), respondents were presented with a hypothetical prescription medication for high cholesterol. AE likelihoods were described using one of four tested formats. Main outcome measures were risk comprehension (ability to identify AE likelihood from a table) and willingness to use the medication (7-point scale; not likely=0, very likely=6). Results: The percentage+risk label format resulted in the highest comprehension and willingness to use the medication compared to the other three formats (mean comprehension in percentage + risk label format = 95% vs mean across the other three formats = 81%; mean willingness = 3.3 vs 2.95, respectively). Comprehension differences between percentage and frequency formats were smaller among the less numerate. Willingness to use medication depended less on age and numeracy when labels were used. Limitations: Generalizability is limited by use of a sample that was older, more educated, and better off financially than national averages. Conclusions: Providing numeric AE-likelihood information in a percentage format with risk labels is likely to increase risk comprehension and willingness to use a medication compared to other numeric formats. PMID:25952743
Phonation Types in Marathi: An Acoustic Investigation
ERIC Educational Resources Information Center
Berkson, Kelly Harper
2013-01-01
This dissertation presents a comprehensive instrumental acoustic analysis of phonation type distinctions in Marathi, an Indic language with numerous breathy voiced sonorants and obstruents. Important new facts about breathy voiced sonorants, which are crosslinguistically rare, are established: male and female speakers cue breathy phonation in…
de Knegt, N C; Evenhuis, H M; Lobbezoo, F; Schuengel, C; Scherder, E J A
2013-10-01
People with intellectual disabilities are at high risk for pain and have communication difficulties. Facial and numeric scales for self-report may aid pain identification. It was examined whether the comprehension of a facial affective scale and a numeric scale for pain in adults with Down syndrome (DS) varies with presentation format. Adults with DS were included (N=106, mild to severe ID, mean age 37 years), both with (N=57) and without (N=49) physical conditions that may cause pain or discomfort. The Facial Affect Scale (FAS) and a numeric rating scale (NRS) were compared. One subgroup of participants (N=50) had to choose the two items within each format to indicate 'least pain' and 'most pain'. The other subgroup of participants (N=56) had to order three faces of the FAS from 'least pain' to 'most pain', and to answer questions about the magnitude of numbers for the NRS. Comprehension percentages were compared between two subgroups. More participants understood the FAS than the NRS, irrespective of the presentation format. The comprehension percentage for the FAS did not differ between the least-most extremities format and the ordering/magnitude format. In contrast, comprehension percentages for the NRS differed significantly between the least-most extremities format (61%) and the ordering/magnitude format (32%). The inclusion of ordering and magnitude in a presentation format is essential to assess thorough comprehension of facial and numeric scales for self-reported pain. The use of this format does not influence the number of adults with DS who pass the comprehension test for the FAS, but reduces the number of adults with DS who pass the comprehension test for the NRS.
Homogeneous buoyancy-generated turbulence
NASA Technical Reports Server (NTRS)
Batchelor, G. K.; Canuto, V. M.; Chasnov, J. R.
1992-01-01
Using a theoretical analysis of fundamental equations and a numerical simulation of the flow field, the statistically homogeneous motion that is generated by buoyancy forces after the creation of homogeneous random fluctuations in the density of an infinite fluid at an initial instant is examined. It is shown that analytical results together with numerical results provide a comprehensive description of the 'birth, life, and death' of buoyancy-generated turbulence. Results of numerical simulations yielded the mean-square density and mean-square velocity fluctuations and the associated spectra as functions of time for various initial conditions, and the time required for the mean-square density fluctuation to fall to a specified small value was estimated.
Tailoring risk communication to improve comprehension: Do patient preferences help or hurt?
Barnes, Andrew J; Hanoch, Yaniv; Miron-Shatz, Talya; Ozanne, Elissa M
2016-09-01
Risk communication tools can facilitate patients' understanding of risk information. In this novel study, we examine the hypothesis that risk communication methods tailored to individuals' preferences can increase risk comprehension. Preferences for breast cancer risk formats, and risk comprehension data were collected using an online survey from 361 women at high risk for breast cancer. Women's initial preferences were assessed by asking them which of the following risk formats would be the clearest: (a) percentage, (b) frequency, (c) bar graph, (d) pictogram, and (e) comparison to other women. Next, women were presented with 5 different formats for displaying cancer risks and asked to interpret the risk information presented. Finally, they were asked again which risk format they preferred. Initial preferences for risk formats were not associated with risk comprehension scores. However, women with lower risk comprehension scores were more likely to update their risk format preferences after they evaluated risks in different formats. Less numerate women were more likely to prefer graphical rather than numeric risk formats. Importantly, we found that women preferring graphical risk formats had lower risk comprehension in these formats compared to numeric formats. In contrast, women preferring numeric formats performed equally well across formats. Our findings suggest that tailoring risk communication to patient preferences may not improve understanding of medical risks, particularly for less numerate women, and point to the potential perils of tailoring risk communication formats to patient preferences.
Numerical study of read scheme in one-selector one-resistor crossbar array
NASA Astrophysics Data System (ADS)
Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin
2015-12-01
A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) crossbar array is carried out. Three schemes (the ground, V/2, and V/3 schemes) are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate all current flows and node voltages within the crossbar array. Understanding such phenomena is essential for successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
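For orientation, the sketch below estimates the sensed bit-line current in the V/2 scheme for an ideal array, neglecting line resistance and the selector's full nonlinear characteristic (exactly the effects the paper's iterative solver is built to capture). Array size, read voltage and all resistances are assumed values.

```python
# Idealized V/2 read-scheme sketch for an N x N 1S1R crossbar: no line
# resistance, half-biased selectors lumped into one effective resistance.
# All numbers are assumptions for illustration only.
N = 64                      # array size (assumed)
V_read = 1.0                # read voltage (assumed)
R_on, R_off = 1e4, 1e6      # memory-cell LRS / HRS resistance (assumed)
R_sel_half = 1e8            # effective selector resistance at V/2 bias (assumed)

def bitline_current(selected_state):
    """Current sensed at the grounded selected bit line."""
    r_cell = R_on if selected_state == "LRS" else R_off
    i_selected = V_read / r_cell
    # (N - 1) half-selected cells share the selected bit line; each sees V/2
    # across a path dominated by its highly resistive half-biased selector.
    i_sneak = (N - 1) * (0.5 * V_read / R_sel_half)
    return i_selected + i_sneak

i_lrs, i_hrs = bitline_current("LRS"), bitline_current("HRS")
print(f"I(LRS) = {i_lrs * 1e6:.2f} uA, I(HRS) = {i_hrs * 1e6:.2f} uA, "
      f"read margin ratio = {i_lrs / i_hrs:.1f}")
```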
Substantiated Best Practices in Transition: Fifteen Plus Years Later
ERIC Educational Resources Information Center
Landmark, Leena Jo; Ju, Song; Zhang, Dalun
2010-01-01
Since the transition movement in the 1980s, numerous transition practices have been developed. Kohler (1993) provided a comprehensive review and analysis of transition best practices and divided them into substantiated and implied practices based on the existence of empirical evidence. Since that review was published, the field of transition has…
Development of a Linearized Unsteady Euler Analysis with Application to Wake/Blade-Row Interactions
NASA Technical Reports Server (NTRS)
Verdon, Joseph M.; Montgomery, Matthew D.; Chuang, H. Andrew
1999-01-01
A three-dimensional, linearized, Euler analysis is being developed to provide a comprehensive and efficient unsteady aerodynamic analysis for predicting the aeroacoustic and aeroelastic responses of axial-flow turbomachinery blading. The mathematical models needed to describe nonlinear and linearized, inviscid, unsteady flows through a blade row operating within a cylindrical annular duct are presented in this report. A numerical model for linearized inviscid unsteady flows, which couples a near-field, implicit, wave-split, finite volume analysis to far-field eigen analyses, is also described. The linearized aerodynamic and numerical models have been implemented into the three-dimensional unsteady flow code, LINFLUX. This code is applied herein to predict unsteady subsonic flows driven by wake or vortical excitations. The intent is to validate the LINFLUX analysis via numerical results for simple benchmark unsteady flows and to demonstrate this analysis via application to a realistic wake/blade-row interaction. Detailed numerical results for a three-dimensional version of the 10th Standard Cascade and a fan exit guide vane indicate that LINFLUX is becoming a reliable and useful unsteady aerodynamic prediction capability that can be applied, in the future, to assess the three-dimensional flow physics important to blade-row, aeroacoustic and aeroelastic responses.
Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries
NASA Astrophysics Data System (ADS)
Reeves, H. W.; Fienen, M. N.; Feinstein, D.
2015-12-01
Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may be fulfilling a role often accomplished by application of analytical solutions. The major challenge to transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial scale of the numerical model must be appropriately scaled to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.
Buckling analysis of SMA bonded sandwich structure – using FEM
NASA Astrophysics Data System (ADS)
Katariya, Pankaj V.; Das, Arijit; Panda, Subrata K.
2018-03-01
The thermal buckling strength of a smart sandwich composite structure (bonded with shape memory alloy, SMA) is examined numerically via a higher-order finite element model in association with the marching technique. The large geometrical distortion of the structure under the elevated thermal environment is modelled through Green's strain function, whereas the material nonlinearity is accounted for with the help of the marching method. The system responses are computed numerically by solving the generalized eigenvalue equations via a customized MATLAB code. The comprehensive behaviour of the current finite element solutions (minimum buckling load parameter) is established by solving an adequate number of numerical examples with the given input parameters. The current numerical model is further extended to examine the influence of various structural parameters of the sandwich panel, including the SMA effect, on the buckling temperature, and the results are reported in detail.
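The generalized eigenvalue step at the heart of such buckling analyses can be illustrated on a much simpler system. The sketch below assembles finite-difference stiffness and geometric-stiffness operators for a pinned-pinned Euler column and recovers the classical critical load; it is a toy mechanical analogue, not the paper's higher-order sandwich/SMA model, and all property values are assumed.

```python
# Toy illustration of the generalized eigenvalue problem (K - lambda*Kg) phi = 0
# used in buckling analysis, here for a pinned-pinned Euler column discretized
# with finite differences. Material, section and length are assumed values.
import numpy as np
from scipy.linalg import eigh

E, I, L = 70e9, 1e-8, 1.0          # Young's modulus, second moment, length (assumed)
n = 200                            # interior grid points
h = L / (n + 1)

D2 = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
      + np.diag(np.ones(n - 1), -1)) / h**2     # second-derivative operator

K = E * I * (D2 @ D2)              # bending stiffness (y'''' term)
Kg = -D2                           # geometric stiffness (y'' term), positive definite

loads = eigh(K, Kg, eigvals_only=True)          # generalized eigenvalues, ascending
print(f"critical load: {loads[0]:.1f} N  (Euler formula: {np.pi**2 * E * I / L**2:.1f} N)")
```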
Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly
2016-01-01
This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time saving in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical properties, and its update equations are calculated in much the same way as in the ADI method. Generally, a small number of arithmetic operations, and hence a shorter simulation time, is desired. The alternating direction implicit technique can be considered a significant step forward in improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was achieved by both methods.
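For context, the conventional explicit (Yee) FDTD scheme whose Courant limit motivates the ADI- and LOD-FDTD variants can be written in a few lines; the 1-D vacuum example below is a baseline sketch with assumed grid and source parameters, not an implementation of the implicit schemes compared in the paper.

```python
# Minimal 1-D explicit Yee FDTD baseline (vacuum, PEC ends, soft Gaussian source).
# The time step is bounded by the Courant condition, which is the restriction
# that the implicit ADI-/LOD-FDTD schemes discussed above are designed to relax.
import numpy as np

c0, mu0, eps0 = 299792458.0, 4e-7 * np.pi, 8.8541878128e-12
nz, nt = 400, 600
dz = 1e-3                          # 1 mm cells (assumed)
dt = 0.99 * dz / c0                # explicit (Courant) stability limit
ez, hy = np.zeros(nz), np.zeros(nz - 1)

for n in range(nt):
    hy += dt / (mu0 * dz) * (ez[1:] - ez[:-1])          # H-field update
    ez[1:-1] += dt / (eps0 * dz) * (hy[1:] - hy[:-1])   # E-field update
    ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)      # soft Gaussian source

print(f"max |Ez| after {nt} steps: {np.abs(ez).max():.3e}")
```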
Is Hirsch's "H" the Best Predictor of the Number of a Researcher's Extremely Highly Cited Articles?
ERIC Educational Resources Information Center
Cho, Kit W.; Neely, James H.
2012-01-01
Ruscio et al. (Ruscio, Seaman, D'Oriano, Stremlo, & Mahalchik, this issue) have provided an impressively comprehensive conceptual and empirical psychometric analysis of 22 modern-day citation measures. Their analyses show that although numerous measures have been developed to ameliorate perceived limitations of Hirsch's (2005) "h" index (which is…
ERIC Educational Resources Information Center
Schuster, Jonathan
2012-01-01
Reading is a complex process involving numerous skills and abilities contributing to acquiring meaning from text. Individuals without the requisite reading skills will have difficulty not only in school but throughout their lifetimes. The purpose of the study was to compare the reading ability of incoming college freshmen with that of adults with…
Little, Callie W; Haughbrook, Rasheda; Hart, Sara A
2017-01-01
Numerous twin studies have examined the genetic and environmental etiology of reading comprehension, though it is likely that etiological estimates are influenced by unidentified sample conditions (e.g. Tucker-Drob and Bates, Psychol Sci:0956797615612727, 2015). The purpose of this meta-analysis was to average the etiological influences of reading comprehension and to explore the potential moderators influencing these estimates. Results revealed an average heritability estimate of h² = 0.59, with significant variation in estimates across studies, suggesting potential moderation. Moderation results indicated publication year, grade level, project, zygosity methods, and response type moderated heritability estimates. The average shared environmental estimate was c² = 0.16, with publication year, grade and zygosity methods acting as significant moderators. These findings support the role of genetics on reading comprehension, and a small significant role of shared environmental influences. The results suggest that our interpretation of how genes and environments influence reading comprehension should reflect aspects of study and sample.
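The averaging step in such a meta-analysis is commonly done with a random-effects model. The sketch below applies the DerSimonian-Laird estimator to hypothetical per-study heritability estimates and standard errors; these numbers are placeholders, not the studies pooled in this meta-analysis.

```python
# DerSimonian-Laird random-effects pooling of per-study h^2 estimates.
# The estimates and standard errors are hypothetical placeholders.
import numpy as np

h2 = np.array([0.49, 0.62, 0.70, 0.55, 0.64])    # per-study heritability (assumed)
se = np.array([0.08, 0.06, 0.10, 0.07, 0.09])    # per-study standard errors (assumed)

w = 1.0 / se**2                                   # fixed-effect weights
fixed = np.sum(w * h2) / np.sum(w)
Q = np.sum(w * (h2 - fixed) ** 2)                 # heterogeneity statistic
df = len(h2) - 1
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                     # between-study variance
w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
pooled = np.sum(w_re * h2) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled h^2 = {pooled:.2f} (SE {se_pooled:.2f}), Q = {Q:.2f}, tau^2 = {tau2:.3f}")
```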
NASA Technical Reports Server (NTRS)
Smith, Eric A.; Einaudi, Franco (Technical Monitor)
2001-01-01
A comprehensive understanding of the meteorological and microphysical nature of Mediterranean storms requires a combination of in situ data analysis, radar data analysis, and satellite data analysis, effectively integrated with numerical modeling studies at various scales. An important aspect of understanding microphysical controls of severe storms is first understanding the meteorological controls under which a storm has evolved, and then using that information to help characterize the dominant microphysical processes. For hazardous Mediterranean storms, highlighted by the October 5-6, 1998 Friuli flood event in northern Italy, a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution. This involves intense convective development, stratiform decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing physical processes that affect details of the size distributions and fall rates of the various types of hydrometeors found within the storm environment. This talk overviews the microphysical elements of a severe Mediterranean storm in such a context, investigated with the aid of TRMM satellite and other remote sensing measurements, but guided by a nonhydrostatic mesoscale model simulation of the Friuli flood event. The data analysis for this paper was conducted by my research groups at the Global Hydrology and Climate Center in Huntsville, AL and Florida State University in Tallahassee, and in collaboration with Dr. Alberto Mugnai's research group at the Institute of Atmospheric Physics in Rome. The numerical modeling was conducted by Professor Greg Tripoli and Ms. Giulia Panegrossi at the University of Wisconsin in Madison, using Professor Tripoli's nonhydrostatic modeling system (NMS). This is a scalable, fully nested mesoscale model capable of resolving nonhydrostatic circulations from regional scale down to cloud scale and below.
Vibration and noise analysis of a gear transmission system
NASA Technical Reports Server (NTRS)
Choy, F. K.; Qian, W.; Zakrajsek, J. J.; Oswald, F. B.
1993-01-01
This paper presents a comprehensive procedure to predict both the vibration and noise generated by a gear transmission system under normal operating conditions. The gearbox vibrations were obtained from both numerical simulation and experimental studies using a gear noise test rig. In addition, the noise generated by the gearbox vibrations was recorded during the experimental testing. A numerical method was used to develop linear relationships between the gearbox vibration and the generated noise. The hypercoherence function is introduced to correlate the nonlinear relationship between the fundamental noise frequency and its harmonics. A numerical procedure was developed using both the linear and nonlinear relationships generated from the experimental data to predict noise resulting from the gearbox vibrations. The application of this methodology is demonstrated by comparing the numerical and experimental results from the gear noise test rig.
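The hypercoherence function used in the paper is not available in standard signal-processing libraries, but the related ordinary magnitude-squared coherence between a vibration channel and a noise channel, one building block of such vibration-to-noise correlation, can be sketched as follows on synthetic signals with assumed frequencies and noise levels.

```python
# Ordinary magnitude-squared coherence between a simulated gearbox vibration
# channel and a noise channel (not the paper's hypercoherence function, which
# relates a fundamental to its harmonics). All signal parameters are assumed.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs = 10_000.0                     # sample rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
f_mesh = 857.0                    # gear-mesh frequency, Hz (assumed)
vib = np.sin(2 * np.pi * f_mesh * t) + 0.3 * rng.normal(size=t.size)
noise = 0.8 * np.sin(2 * np.pi * f_mesh * t + 0.6) + 0.5 * rng.normal(size=t.size)

f, Cxy = coherence(vib, noise, fs=fs, nperseg=2048)
print(f"coherence near {f_mesh:.0f} Hz: {Cxy[np.argmin(np.abs(f - f_mesh))]:.2f}")
```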
Numerical Modeling of Electrode Degradation During Resistance Spot Welding Using CuCrZr Electrodes
NASA Astrophysics Data System (ADS)
Gauthier, Elise; Carron, Denis; Rogeon, Philippe; Pilvin, Philippe; Pouvreau, Cédric; Lety, Thomas; Primaux, François
2014-05-01
Resistance spot welding is a technique widely used by the automotive industry to assemble thin steel sheets. The cyclic thermo-mechanical loading associated with the accumulation of weld spots progressively deteriorates the electrodes. This study addresses the development of a comprehensive multi-physical model that describes the sequential deterioration. Welding tests achieved on uncoated and Zn-coated steel sheets are analyzed. Finite element analysis is performed using an electrical-thermal-metallurgical model. A numerical experimental design is carried out to highlight the main process parameters and boundary conditions which affect electrode degradation.
A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.
Padula, Matthew P; Berry, Iain J; O Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P
2017-04-07
Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.
Basic numerical competences in large-scale assessment data: Structure and long-term relevance.
Hirsch, Stefa; Lambert, Katharina; Coppens, Karien; Moeller, Korbinian
2018-03-01
Basic numerical competences are seen as building blocks for later numerical and mathematical achievement. The current study aimed at investigating the structure of early numeracy reflected by different basic numerical competences in kindergarten and its predictive value for mathematical achievement 6 years later using data from large-scale assessment. This allowed analyses based on considerably large sample sizes (N > 1700). A confirmatory factor analysis indicated that a model differentiating five basic numerical competences at the end of kindergarten fitted the data better than a one-factor model of early numeracy representing a comprehensive number sense. In addition, these basic numerical competences were observed to reliably predict performance in a curricular mathematics test in Grade 6 even after controlling for influences of general cognitive ability. Thus, our results indicated a differentiated view on early numeracy considering basic numerical competences in kindergarten reflected in large-scale assessment data. Consideration of different basic numerical competences allows for evaluating their specific predictive value for later mathematical achievement but also mathematical learning difficulties.
ERIC Educational Resources Information Center
Haider, Steven J.; Loughran, David S.
2008-01-01
Despite numerous empirical studies, there is surprisingly little agreement about whether the Social Security earnings test affects male labor supply. In this paper, we provide a comprehensive analysis of the labor supply effects of the earnings test using longitudinal administrative earnings data and more commonly used survey data. We find that…
Fundamentals of digital filtering with applications in geophysical prospecting for oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesko, A.
This book is a comprehensive work bringing together the important mathematical foundations and computing techniques for numerical filtering methods. The first two parts of the book introduce the techniques, fundamental theory and applications, while the third part treats specific applications in geophysical prospecting. Discussion is limited to linear filters, but takes in related fields such as correlational and spectral analysis.
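In the spirit of the book's linear-filter scope, a minimal digital-filtering example is a windowed-sinc FIR low-pass applied to a noisy synthetic trace; the sample rate, cutoff and tap count below are arbitrary illustrative choices.

```python
# Minimal linear-filtering sketch: FIR low-pass (windowed sinc) applied with
# zero-phase filtering to a synthetic noisy trace. Parameters are assumptions.
import numpy as np
from scipy.signal import firwin, filtfilt

rng = np.random.default_rng(2)
fs = 500.0                                  # samples per second (assumed)
t = np.arange(0, 4.0, 1 / fs)
clean = np.sin(2 * np.pi * 10 * t)          # 10 Hz "signal"
trace = clean + 0.5 * rng.normal(size=t.size)

taps = firwin(numtaps=101, cutoff=30.0, fs=fs)   # pass below 30 Hz
smoothed = filtfilt(taps, 1.0, trace)            # forward-backward (zero phase)

print(f"RMS error vs clean signal: before = {np.std(trace - clean):.3f}, "
      f"after = {np.std(smoothed - clean):.3f}")
```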
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry and specifically mass spectrometry has vast potential as a tool for this type of investigation. Metabolomics require special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
Back-support large laser mirror unit: mounting modeling and analysis
NASA Astrophysics Data System (ADS)
Wang, Hui; Zhang, Zheng; Long, Kai; Liu, Tianye; Li, Jun; Liu, Changchun; Xiong, Zhao; Yuan, Xiaodong
2018-01-01
In high-power laser systems, the surface wavefront of large optics is closely linked to their structural design and mounting method. The back-support transport mirror design is currently being investigated in China's high-power laser systems as a means to hold the optical component firmly while minimizing the distortion of its reflecting surface. We have proposed a comprehensive analytical framework integrating numerical modeling and precise metrology for evaluating the mirror's mounting performance, treating the surface distortion as a key decision variable. The combination of numerical simulation and field tests demonstrates that the comprehensive analytical framework provides a detailed and accurate approach to evaluating the performance of the transport mirror. It is also verified that the back-support transport mirror is effectively compatible with state-of-the-art optical quality specifications. This study will pave the way for future research to solidify the design of back-support large laser optics in China's next-generation inertial confinement fusion facility.
Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu
2018-06-01
Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to the multiple criteria involved, spanning technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess the remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
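A compact sketch of the MCDA step is given below: PROMETHEE II net-flow ranking with AHP-style criterion weights and the simple "usual" preference function. The alternatives, criteria, scores and weights are made-up placeholders, not the Cangzhou case-study data.

```python
# PROMETHEE II net-flow ranking with externally supplied (e.g. AHP) weights.
# Alternatives, criteria, scores, weights and the "usual" preference function
# are illustrative assumptions only.
import numpy as np

alternatives = ["natural attenuation", "pump-and-treat", "permeable barrier"]
# rows = alternatives, columns = criteria (cost, duration, removal efficiency)
scores = np.array([[1.0, 10.0, 0.40],
                   [8.0,  3.0, 0.85],
                   [5.0,  5.0, 0.75]])
weights = np.array([0.3, 0.2, 0.5])        # e.g. from an AHP pairwise comparison
maximize = np.array([False, False, True])  # cost and duration are minimized

n = len(alternatives)
pi = np.zeros((n, n))                      # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        d = scores[a] - scores[b]
        d[~maximize] *= -1.0               # flip "lower is better" criteria
        pi[a, b] = np.sum(weights * (d > 0))   # "usual" preference function

phi = (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)   # net outranking flow
for name, flow in sorted(zip(alternatives, phi), key=lambda x: -x[1]):
    print(f"{name:20s} net flow = {flow:+.2f}")
```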
Xiong, Naixue; Wu, Zhao; Huang, Yannong; Xu, Degang
2014-12-01
Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) is confronted with severe challenges due to the open, dynamic, and complex nature of WSNs. Most previous research separated the various QoS indices into different fields and studied them individually because of the computational complexity. This approach ignores the mutual influence between these QoS indices and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) offers speed and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique, the vector universal generating function (VUGF), which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subject to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs.
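The classic single-index u-function that the VUGF generalizes can be sketched in a few lines: each component is a map from a performance level to its probability, and composition combines levels with an operator (minimum for a series/bottleneck structure). The service throughput distributions below are assumed placeholders.

```python
# Classic (scalar) universal generating function: a u-function maps performance
# levels to probabilities; composition combines two components with an operator.
# The paper's VUGF extends the performance value to a vector of QoS indices.
from collections import defaultdict
from itertools import product

def compose(u1, u2, op):
    """Combine two u-functions {performance: probability} with operator op."""
    out = defaultdict(float)
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        out[op(g1, g2)] += p1 * p2
    return dict(out)

# Two sensor services with assumed throughput levels (kb/s) and probabilities.
u_a = {0: 0.05, 50: 0.25, 100: 0.70}
u_b = {0: 0.10, 80: 0.90}

u_series = compose(u_a, u_b, min)   # series composition: bottleneck throughput
print(u_series)
print("P(throughput >= 50 kb/s) =", sum(p for g, p in u_series.items() if g >= 50))
```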
The numerical modelling of falling film thickness flow on horizontal tubes
NASA Astrophysics Data System (ADS)
Hassan, I. A.; Sadikin, A.; Isa, N. Mat
2017-04-01
This paper presents computational modelling of a water falling film flowing over horizontal tubes. The objective of this study is to use numerical predictions from 2-D CFD models to compare the film thickness along the circumferential direction of the tube. The results are then validated against theoretical results from the previous literature. A comprehensive set of 2-D models has been developed according to the real application and actual configuration of the falling film evaporator, as well as previous experimental parameters. The computational modelling of the water falling film is carried out with the aid of the Ansys Fluent software. The Volume of Fluid (VOF) technique is adopted in this analysis since it is highly reliable for determining the film thickness on the tube surface. The numerical analysis is carried out at ambient pressure and a temperature of 27 °C. Three CFD numerical models were analyzed in this simulation, with inter-tube spacings of 30 mm, 20 mm and 10 mm, respectively. The use of a numerical simulation tool on the water falling film has enabled a detailed investigation of the film thickness. Based on the simulated results, it is found that the average values of the water film thickness for the three models are 0.53 mm, 0.58 mm, and 0.63 mm.
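The theoretical reference usually used for such comparisons is the laminar Nusselt-type film-thickness relation, delta(theta) = (3 mu Gamma / (rho^2 g sin theta))^(1/3), evaluated around the tube circumference. The sketch below uses assumed water properties near 27 °C and an assumed film flow rate, not the paper's CFD settings.

```python
# Laminar Nusselt-type film thickness around a horizontal tube,
#   delta(theta) = (3 * mu * Gamma / (rho**2 * g * sin(theta)))**(1/3).
# Fluid properties and the film flow rate Gamma are assumed values.
import numpy as np

mu, rho, g = 0.85e-3, 996.0, 9.81     # Pa*s, kg/m^3, m/s^2 (water near 27 C, assumed)
Gamma = 0.3                           # film flow rate per unit tube length, kg/(m*s) (assumed)

theta = np.deg2rad(np.arange(10, 171, 20))   # avoid the 0/180 degree singularities
delta = (3 * mu * Gamma / (rho**2 * g * np.sin(theta))) ** (1.0 / 3.0)

for th, d in zip(np.rad2deg(theta), delta):
    print(f"theta = {th:5.0f} deg   film thickness = {d * 1e3:.3f} mm")
```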
A Systematic Review of the Research on Vocabulary Instruction That Impacts Text Comprehension
ERIC Educational Resources Information Center
Wright, Tanya S.; Cervetti, Gina N.
2017-01-01
Although numerous studies have identified a correlational relationship between vocabulary and comprehension, we know less about vocabulary interventions that impact reading comprehension. Therefore, this study is a systematic review of vocabulary interventions with comprehension outcomes. Analyses of 36 studies that met criteria are organized…
Residual Stress Analysis in Welded Component.
NASA Astrophysics Data System (ADS)
Rouhi, Shahab; Yoshida, Sanichiro; Miura, Fumiya; Sasaki, Tomohiro
Due to local heating, thermal stresses occur during welding, and residual stress and distortion remain after welding. Welding distortion has negative effects on the accuracy of assembly, the exterior appearance, and various strengths of welded structures. To date, many experiments and numerical analyses have been developed to assess residual stress. However, quantitative estimation of residual stress based on experiment alone may involve large uncertainties and a complex measurement process. To comprehensively understand this phenomenon, it is necessary to carry out further research by means of both experiment and numerical simulation. In this research, we conduct Finite Element Analysis (FEA) for a simple butt-welded metal plate specimen. The thermal input and resultant expansion are modeled with a thermal expansion FEA module, and the resultant constitutive response of the material is modeled with a continuum mechanics FEA module. The residual stress is modeled based on permanent deformation occurring during the heating phase of the material. Experiments have also been carried out to compare with the FEA results. Numerical and experimental results show qualitative agreement. The present work was supported by the Louisiana Board of Regents (LEQSF(2016-17)-RD-C-13).
methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.
Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia
2015-09-29
Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.
Kim, Dong Seong; Park, Jong Sou
2014-01-01
It is important to assess the availability of virtualized systems in IT business infrastructures. Previous work on availability modeling and analysis of virtualized systems used a simplified configuration and assumption in which only one virtual machine (VM) runs on a virtual machine monitor (VMM) hosted on a physical server. In this paper, we present a comprehensive availability model using stochastic reward nets (SRN). The model takes into account (i) the detailed failure and recovery behaviors of multiple VMs, (ii) various other failure modes and corresponding recovery behaviors (e.g., hardware faults, failures and recoveries due to Mandelbugs and aging-related bugs), and (iii) dependencies between different subcomponents (e.g., between physical host failure and the VMM) in a virtualized server system. We also present a numerical analysis of steady-state availability, downtime in hours per year, transaction loss, and a sensitivity analysis. This model provides a new finding on how to increase system availability by judiciously combining software rejuvenation at both the VM and VMM levels. PMID:25165732
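As a minimal illustration of the kind of steady-state availability arithmetic that the SRN model automates over many states (this two-state sketch is assumed for illustration and is not the authors' model), the script below solves pi Q = 0 for a single component with assumed failure and repair rates.

```python
import numpy as np

# Minimal sketch: steady-state availability of one component modeled as a two-state
# continuous-time Markov chain (UP <-> DOWN). Rates are assumed; the paper's SRN
# model covers many more states, failure modes and dependencies.
mttf_hours = 2000.0          # mean time to failure (assumed)
mttr_hours = 4.0             # mean time to repair (assumed)
lam = 1.0 / mttf_hours       # failure rate
mu = 1.0 / mttr_hours        # repair rate

# Generator matrix Q for states [UP, DOWN]; solve pi Q = 0 subject to sum(pi) = 1.
Q = np.array([[-lam, lam],
              [mu, -mu]])
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]
print(f"steady-state availability: {availability:.6f}")
print(f"downtime: {(1.0 - availability) * 8760.0:.2f} hours/year")
```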
Quantitative image processing in fluid mechanics
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus; Helman, James; Ning, Paul
1992-01-01
The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.
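The isosurface step mentioned above is available off the shelf today; the sketch below extracts an isosurface from a synthetic scalar field with scikit-image's marching-cubes routine (the function name and return values are assumed from recent scikit-image versions and may differ in older releases).

```python
import numpy as np
from skimage import measure  # scikit-image marching cubes (API assumed from recent versions)

# Synthetic scalar field standing in for a flow quantity sampled on a regular grid.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
field = x**2 + y**2 + z**2

# Extract the isosurface at level 0.5; returns a triangle mesh.
verts, faces, normals, values = measure.marching_cubes(field, level=0.5)
print(f"isosurface mesh: {len(verts)} vertices, {len(faces)} triangles")
```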
The Nature of the Nodes, Weights and Degree of Precision in Gaussian Quadrature Rules
ERIC Educational Resources Information Center
Prentice, J. S. C.
2011-01-01
We present a comprehensive proof of the theorem that relates the weights and nodes of a Gaussian quadrature rule to its degree of precision. This level of detail is often absent in modern texts on numerical analysis. We show that the degree of precision is maximal, and that the approximation error in Gaussian quadrature is minimal, in a…
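The statement of the theorem can be checked numerically: an n-point Gauss-Legendre rule should integrate monomials exactly up to degree 2n - 1 and (generically) fail at degree 2n. A small self-contained check using NumPy:

```python
import numpy as np

# Check the degree of precision of an n-point Gauss-Legendre rule on [-1, 1]:
# exact for x^k with k <= 2n - 1, inexact for k = 2n.
n = 4
nodes, weights = np.polynomial.legendre.leggauss(n)

for k in range(2 * n + 1):
    quad = np.sum(weights * nodes**k)
    exact = 0.0 if k % 2 == 1 else 2.0 / (k + 1)   # integral of x^k over [-1, 1]
    print(f"degree {k:2d}: error = {abs(quad - exact):.2e}")
# Errors stay at round-off level up to degree 2n - 1 = 7 and jump at degree 2n = 8.
```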
Little, Callie W; Haughbrook, Rasheda; Hart, Sara A
2016-01-01
Numerous twin studies have been published examining the genetic and environmental etiology of reading comprehension, though the etiological estimates may be influenced by currently unidentified sample conditions (e.g., Tucker-Drob & Bates, 2015). The purpose of the current meta-analysis was to average the etiological influence estimates for reading comprehension and to explore the potential moderators that may be influencing these estimates. Results revealed an average heritability estimate of h2 = .59, with significant variation in estimates across studies, suggesting potential moderation. Heritability was moderated by publication year, grade level, project, zygosity determination method, and response type. The average shared environmental estimate was c2 = .16, with publication year, grade level and zygosity determination method acting as significant moderators. These findings support the large role of genetic influences on reading comprehension, and a small but significant role of shared environmental influences. The significant moderators of etiological influences within the current synthesis suggest that our interpretation of how genes and environment influence reading comprehension should reflect aspects of study and sample. PMID:27630039
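For readers unfamiliar with how such estimates arise, Falconer's classical formulas recover h2, c2, and e2 from MZ and DZ twin correlations; this is a simplified illustration only, not the meta-analytic models used in the study, and the correlations below are assumed values chosen to roughly reproduce the reported averages.

```python
# Simplified twin-based variance decomposition using Falconer's formulas (not the
# meta-analytic models used in the study). Twin correlations are assumed values
# chosen only to roughly reproduce h2 ~ .59 and c2 ~ .16.
r_mz = 0.75   # monozygotic twin correlation (assumed)
r_dz = 0.455  # dizygotic twin correlation (assumed)

h2 = 2 * (r_mz - r_dz)   # heritability
c2 = 2 * r_dz - r_mz     # shared environment
e2 = 1 - r_mz            # non-shared environment plus error

print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")
```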
Response of Non-Linear Shock Absorbers-Boundary Value Problem Analysis
NASA Astrophysics Data System (ADS)
Rahman, M. A.; Ahmed, U.; Uddin, M. S.
2013-08-01
A nonlinear boundary value problem of two degrees-of-freedom (DOF) untuned vibration damper systems using nonlinear springs and dampers has been numerically studied. As far as the untuned damper is concerned, sixteen different combinations of linear and nonlinear springs and dampers have been comprehensively analyzed, taking into account transient terms. For the different cases, a comparative study is made of the response versus time for different spring and damper types at three important frequency ratios: one at r = 1, one at r > 1 and one at r < 1. The response of the system is changed because of the spring and damper nonlinearities; the change is different for different cases. Accordingly, an initially stable absorber may become unstable with time and vice versa. The analysis also shows that higher nonlinearity terms make the system more unstable. The numerical simulation includes transient vibrations. Although the problems are much more complicated than those for a tuned absorber, a comparison of the results generated by the present numerical scheme with the exact solution shows quite reasonable agreement.
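A minimal sketch of the kind of transient simulation described, with assumed parameter values, a single cubic hardening absorber spring, and harmonic forcing (the paper analyzes sixteen spring/damper combinations):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal 2-DOF main mass + untuned absorber with a cubic (hardening) absorber spring,
# harmonically forced at frequency ratio r. All parameter values are assumed.
m1, m2 = 1.0, 0.2
k1, k2, k2_nl = 100.0, 20.0, 500.0   # linear stiffnesses and cubic coefficient
c1, c2 = 0.5, 0.1
F0, r = 1.0, 1.0                     # force amplitude and frequency ratio
omega = r * np.sqrt(k1 / m1)

def rhs(t, y):
    x1, v1, x2, v2 = y
    spring2 = k2 * (x2 - x1) + k2_nl * (x2 - x1) ** 3
    damper2 = c2 * (v2 - v1)
    a1 = (F0 * np.sin(omega * t) - k1 * x1 - c1 * v1 + spring2 + damper2) / m1
    a2 = (-spring2 - damper2) / m2
    return [v1, a1, v2, a2]

sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-2)
print("peak main-mass displacement:", np.max(np.abs(sol.y[0])))
```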
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.
1999-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.
2000-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brink, A.; Kilpinen, P.; Hupa, M.
1996-01-01
Two methods to improve the modeling of NOx emissions in numerical flow simulation of combustion are investigated. The models used are a reduced mechanism for nitrogen chemistry in methane combustion and a new model based on regression analysis of perfectly stirred reactor simulations using detailed comprehensive reaction kinetics. The applicability of the methods to numerical flow simulation of practical furnaces, especially in the near-burner region, is tested against experimental data from a pulverized coal fired single burner furnace. The results are also compared to those obtained using a commonly used description for the overall reaction rate of NO.
Numbers matter to informed patient choices: A randomized design across age and numeracy levels
Peters, Ellen; Hart, P. Sol; Tusler, Martin; Fraenkel, Liana
2013-01-01
Background How drug adverse events (AEs) are communicated in the United States may mislead consumers and result in low adherence. Requiring written information to include numeric AE-likelihood information might lessen these effects, but providing numbers may disadvantage less skilled populations. Objective To determine risk comprehension and willingness to use a medication when presented with numeric or non-numeric AE-likelihood information across age, numeracy, and cholesterol-lowering-drug-usage groups. Design In a cross-sectional internet survey (N=905; American Life Panel, 5/15/08–6/18/08), respondents were presented with a hypothetical prescription medication for high cholesterol. AE likelihoods were described using one of six formats (non-numeric: Consumer-Medication-Information (CMI)-like list, risk labels; numeric: percentage, frequency, risk-labels-plus-percentage, risk-labels-plus-frequency). Main outcome measures were risk comprehension (recoded to indicate presence/absence of risk overestimation and underestimation), willingness to use the medication (7-point scale; not likely=0, very likely=6), and main reason for willingness (chosen from eight predefined reasons). Results Individuals given non-numeric information were more likely to overestimate risk, less willing to take the medication, and gave different reasons than those provided numeric information across numeracy and age groups (e.g., among less numerate: 69% and 18% overestimated risks in non-numeric and numeric formats, respectively; among more numerate: these same proportions were 66% and 6%). Less numerate middle-aged and older adults, however, showed less influence of numeric format on willingness to take the medication. Limitations It is unclear whether differences are clinically meaningful although some differences are large. Conclusions Providing numeric AE-likelihood information (compared to non-numeric) is likely to increase risk comprehension across numeracy and age levels. Its effects on uptake and adherence of prescribed drugs should be similar across the population, except perhaps in older, less numerate individuals. PMID:24246563
Mandic, D. P.; Ryan, K.; Basu, B.; Pakrashi, V.
2016-01-01
Although vibration monitoring is a popular method to monitor and assess dynamic structures, quantification of the linearity or nonlinearity of the dynamic responses remains a challenging problem. We investigate the delay vector variance (DVV) method in this regard in a comprehensive manner to establish the degree to which a change in signal nonlinearity can be related to system nonlinearity and how a change in system parameters affects the nonlinearity in the dynamic response of the system. A wide range of theoretical situations is considered in this regard using a single degree of freedom (SDOF) system to obtain numerical benchmarks. A number of experiments are then carried out using a physical SDOF model in the laboratory. Finally, a composite wind turbine blade is tested for different excitations and the dynamic responses are measured at a number of points to extend the investigation to continuum structures. The dynamic responses were measured using accelerometers, strain gauges and a Laser Doppler vibrometer. This comprehensive study creates a numerical and experimental benchmark for structurally dynamical systems where output-only information is typically available, especially in the context of DVV. The study also allows for comparative analysis between different systems driven by similar inputs. PMID:26909175
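A much-simplified sketch of the delay vector variance idea follows: delay-embed the signal, then examine how the variance of the "targets" of neighbouring delay vectors behaves as the neighbourhood size grows. The embedding dimension, span grid, and normalisation below are assumptions and do not reproduce the full published DVV algorithm.

```python
import numpy as np

def dvv_sketch(x, m=3, n_spans=20, min_neighbours=30):
    """Much-simplified delay vector variance (DVV) sketch.

    Delay-embeds x with dimension m, then for a range of neighbourhood sizes computes
    the mean variance of the targets (the sample following each delay vector) over each
    vector's neighbours, normalised by the overall target variance. Details are assumed
    and differ from the full DVV method.
    """
    x = np.asarray(x, dtype=float)
    N = len(x) - m
    vectors = np.array([x[i:i + m] for i in range(N)])
    targets = x[m:m + N]
    dists = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=2)
    spans = np.linspace(np.percentile(dists, 5), np.percentile(dists, 95), n_spans)

    sigma_star = []
    for r in spans:
        variances = [np.var(targets[dists[i] <= r]) for i in range(N)
                     if np.sum(dists[i] <= r) > min_neighbours]
        sigma_star.append(np.mean(variances) / np.var(targets) if variances else np.nan)
    return spans, np.array(sigma_star)

rng = np.random.default_rng(0)
t = np.linspace(0, 40 * np.pi, 2000)
signal = np.sin(t) + 0.1 * rng.standard_normal(t.size)   # assumed test signal
spans, sigma = dvv_sketch(signal)
print("minimum normalised target variance:", np.nanmin(sigma))
```

A strongly deterministic signal yields small normalised target variances at small spans, which is the feature DVV exploits when a measured response is compared with its surrogates.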
Evaluation of risk communication in a mammography patient decision aid.
Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B
2016-07-01
We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Evaluation of risk communication in a mammography patient decision aid
Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.
2016-01-01
Objectives We characterized patients’ comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Methods Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results Participants’ positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Conclusions Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics PMID:26965020
GenoBase: comprehensive resource database of Escherichia coli K-12
Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G.; Bochner, Barry R.; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E.; Tohsato, Yukako; Wanner, Barry L.; Mori, Hirotada
2015-01-01
Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for elucidation of gene function. Data sets by omics analysis using these resources provide key information for functional analysis, modeling and simulation both in individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the annotation has been changed since 2005, which we originally used for the construction, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. PMID:25399415
Plasma and radio waves from Neptune: Source mechanisms and propagation
NASA Technical Reports Server (NTRS)
Menietti, J. Douglas
1994-01-01
The purpose of this project was to conduct a comprehensive investigation of the radio wave emission observed by the planetary radio astronomy (PRA) instrument on board Voyager 2 as it flew by Neptune. The study has included data analysis, theoretical and numerical calculations, and ray tracing to determine the possible source mechanisms and locations of the radiation, including the narrowband bursty and smooth components of the Neptune radio emission.
Cold Saline Springs in Permafrost on Earth and Mars
NASA Technical Reports Server (NTRS)
Heldann, Jennifer; Toon, Owen B.
2003-01-01
This report summarizes the research results which have emanated from work conducted on Cold Saline Springs in Permafrost on Earth and Mars. Three separate avenues of research including 1) terrestrial field work, 2) analysis of spacecraft data, and 3) numerical modeling were explored to provide a comprehensive investigation of water in the polar desert environments of both Earth and Mars. These investigations and their results are summarized.
NASA Astrophysics Data System (ADS)
Tanigawa, Hiroyasu; Katoh, Yutai; Kohyama, Akira
1995-08-01
Effects of applied stress on the early stages of interstitial-type Frank loop evolution were investigated by both numerical calculation and irradiation experiments. The final objective of this research is to propose a comprehensive model of complex stress effects on microstructural evolution under various conditions. In the experimental part of this work, the microstructural analysis revealed that differences in resolved normal stress caused differences in the nucleation rates of Frank loops on the {111} crystallographic family planes, and that the total nucleation rate of Frank loops increased with increasing external applied stress. A numerical calculation was carried out primarily to evaluate the validity of models of stress effects on the nucleation processes of Frank loop evolution. The calculation is based on rate equations which describe the evolution of point defects, small point defect clusters and Frank loops. The rate equations of Frank loop evolution were formulated for the {111} planes, considering the effects of resolved normal stress on the clustering processes of small point defects and the growth processes of Frank loops separately. The experimental results and the predictions from the numerical calculation coincided well qualitatively.
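The rate-equation structure can be illustrated with a much-reduced toy system for vacancy and interstitial concentrations (generic rate-theory form with assumed coefficients; the actual calculation resolves loop nucleation and growth on the four {111} plane variants with stress-dependent terms).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Much-reduced rate-theory toy: coupled evolution of vacancy (Cv) and interstitial (Ci)
# concentrations under irradiation. Generic form with assumed coefficients; the model
# in the paper adds Frank-loop nucleation/growth on each {111} variant with
# resolved-normal-stress-dependent terms.
G = 1.0e-6                    # defect production rate (assumed, arbitrary units)
R = 1.0e2                     # vacancy-interstitial recombination coefficient (assumed)
k2_v, k2_i = 0.1, 0.2         # sink strengths for vacancies / interstitials (assumed)

def rhs(t, y):
    Cv, Ci = y
    recomb = R * Cv * Ci
    return [G - recomb - k2_v * Cv,
            G - recomb - k2_i * Ci]

sol = solve_ivp(rhs, (0.0, 1.0e3), [0.0, 0.0], method="LSODA")
print("quasi-steady concentrations (Cv, Ci):", sol.y[:, -1])
```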
Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications
NASA Technical Reports Server (NTRS)
Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet
2004-01-01
The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are comprised of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes applied independently to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).
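The look-up-table mechanism the comprehensive codes rely on is itself simple; below is a hedged sketch of a two-dimensional table interpolated over angle of attack and Mach number. The lift-coefficient values are fabricated placeholders, not SC1095 data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Illustrative 2-D airfoil table lookup of the kind comprehensive rotorcraft codes use.
# The lift-coefficient entries are fabricated placeholders, NOT SC1095 wind tunnel data.
alpha_deg = np.array([-4.0, 0.0, 4.0, 8.0, 12.0])   # angle-of-attack grid
mach = np.array([0.3, 0.5, 0.7])                     # Mach-number grid
cl_table = np.array([[-0.40, -0.42, -0.45],
                     [ 0.05,  0.05,  0.06],
                     [ 0.50,  0.53,  0.58],
                     [ 0.92,  0.97,  1.02],
                     [ 1.20,  1.15,  1.00]])          # placeholder values

cl_lookup = RegularGridInterpolator((alpha_deg, mach), cl_table)
print("Cl at alpha = 6.3 deg, M = 0.62:", cl_lookup([[6.3, 0.62]])[0])
```

In practice such tables are populated or augmented with CFD-generated entries, which is precisely the substitution the comparison above evaluates.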
NASA Astrophysics Data System (ADS)
Vedeneev, V. V.; Kolotnikov, M. E.; Mossakovskii, P. A.; Kostyreva, L. A.; Abdukhakimov, F. A.; Makarov, P. V.; Pyhalov, A. A.; Dudaev, M. A.
2018-01-01
In this paper we present a complex numerical workflow for analysis of blade flutter and high-amplitude resonant oscillations, impenetrability of casing if the blade is broken off, and the rotor reaction to the blade detachment and following misbalance, with the assessment of a safe flight possibility at the auto-rotation regime. All the methods used are carefully verified by numerical convergence study and correlations with experiments. The use of the workflow developed significantly improves the efficiency of the design process of modern jet engine compressors. It ensures a significant reduction of time and cost of the compressor design with the required level of strength and durability.
A comprehensive view of the web-resources related to sericulture
Singh, Deepika; Chetia, Hasnahana; Kabiraj, Debajyoti; Sharma, Swagata; Kumar, Anil; Sharma, Pragya; Deka, Manab; Bora, Utpal
2016-01-01
Recent progress in the field of sequencing and analysis has led to a tremendous spike in data and the development of data science tools. One of the outcomes of this scientific progress is development of numerous databases which are gaining popularity in all disciplines of biology including sericulture. As economically important organism, silkworms are studied extensively for their numerous applications in the field of textiles, biomaterials, biomimetics, etc. Similarly, host plants, pests, pathogens, etc. are also being probed to understand the seri-resources more efficiently. These studies have led to the generation of numerous seri-related databases which are extremely helpful for the scientific community. In this article, we have reviewed all the available online resources on silkworm and its related organisms, including databases as well as informative websites. We have studied their basic features and impact on research through citation count analysis, finally discussing the role of emerging sequencing and analysis technologies in the field of seri-data science. As an outcome of this review, a web portal named SeriPort, has been created which will act as an index for the various sericulture-related databases and web resources available in cyberspace. Database URL: http://www.seriport.in/ PMID:27307138
1977-02-11
A comprehensive computational procedure is presented for predicting the ... Aeroballistic Reentry Technology (ART) program, with some of the fundamental analytical and numerical work supported by NSWC Independent Research Funds. Most of ... the Aerospace Corporation. The authors gratefully acknowledge the efforts of Mr. R. Feldhuhn, NSWC coordinator for the ART program, who was responsible ...
On the accuracy and precision of numerical waveforms: effect of waveform extraction methodology
NASA Astrophysics Data System (ADS)
Chu, Tony; Fong, Heather; Kumar, Prayush; Pfeiffer, Harald P.; Boyle, Michael; Hemberger, Daniel A.; Kidder, Lawrence E.; Scheel, Mark A.; Szilagyi, Bela
2016-08-01
We present a new set of 95 numerical relativity simulations of non-precessing binary black holes (BBHs). The simulations sample comprehensively both black-hole spins up to a spin magnitude of 0.9, and cover mass ratios 1-3. The simulations cover on average 24 inspiral orbits, plus merger and ringdown, with low initial orbital eccentricities e < 10^-4. A subset of the simulations extends the coverage of non-spinning BBHs up to mass ratio q = 10. Gravitational waveforms at asymptotic infinity are computed with two independent techniques: extrapolation and Cauchy characteristic extraction. An error analysis based on noise-weighted inner products is performed. We find that numerical truncation error, error due to gravitational wave extraction, and errors due to the Fourier transformation of signals with finite length of the numerical waveforms are of similar magnitude, with gravitational wave extraction errors dominating at noise-weighted mismatches of ~3 x 10^-4. This set of waveforms will serve to validate and improve aligned-spin waveform models for gravitational wave science.
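The noise-weighted comparison itself is compact; below is a hedged sketch of the match/mismatch between two waveforms under a flat (white-noise) PSD, maximised over relative time shifts only. The actual analysis uses detector noise curves and also maximises over phase and other parameters.

```python
import numpy as np

def mismatch_white_noise(h1, h2):
    """Mismatch between two real time-domain waveforms assuming a flat (white) noise
    PSD, maximised over circular time shifts only. A simplified stand-in for the full
    noise-weighted analysis described in the abstract."""
    n = len(h1)
    # Circular cross-correlation via FFT gives the unnormalised overlap for every shift.
    cross = np.fft.irfft(np.fft.rfft(h1) * np.conj(np.fft.rfft(h2)), n)
    match = np.max(cross) / np.sqrt(np.sum(h1**2) * np.sum(h2**2))
    return 1.0 - match

t = np.linspace(0.0, 1.0, 4096, endpoint=False)
h_a = np.sin(2 * np.pi * 60 * t) * np.exp(-3 * t)              # toy "waveform"
h_b = np.sin(2 * np.pi * 60 * (t - 0.001)) * np.exp(-3 * t)    # slightly shifted copy
print(f"mismatch ~ {mismatch_white_noise(h_a, h_b):.2e}")
```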
Failure Analysis of Space Shuttle Orbiter Valve Poppet
NASA Technical Reports Server (NTRS)
Russell, Rick
2010-01-01
The poppet failed during STS-126 due to fatigue cracking that most likely was initiated during MDC ground testing. This failure ultimately led to the discovery that the cracking problem was a generic issue affecting numerous poppets throughout the Shuttle program's history. This presentation focuses on the laboratory analysis of the failed hardware, but this analysis was only one aspect of a comprehensive failure investigation. One critical aspect of the overall investigation was modeling of the fluid flow through this valve to determine the possible sources of cyclic loading. This work has led to the conclusion that the poppets are failing due to flow-induced vibration.
NASA Technical Reports Server (NTRS)
Ho, C. Y.; Li, H. H.
1989-01-01
A computerized comprehensive numerical database system on the mechanical, thermophysical, electronic, electrical, magnetic, optical, and other properties of various types of technologically important materials such as metals, alloys, composites, dielectrics, polymers, and ceramics has been established and is operational at the Center for Information and Numerical Data Analysis and Synthesis (CINDAS) of Purdue University. This is an on-line, interactive, menu-driven, user-friendly database system. Users can easily search, retrieve, and manipulate the data from the database system without learning a special query language, special commands, or standardized names of materials, properties, variables, etc. It enables both the direct mode of search/retrieval of data for specified materials, properties, independent variables, etc., and the inverted mode of search/retrieval of candidate materials that meet a set of specified requirements (which is computer-aided materials selection). It also enables tabular and graphical displays and on-line data manipulations such as unit conversion, variable transformation, statistical analysis, etc., of the retrieved data. The development, content, accessibility, etc., of the database system are presented and discussed.
Listening Comprehension Strategies: A Review of the Literature
ERIC Educational Resources Information Center
Berne, Jane E.
2004-01-01
Numerous studies related to listening comprehension strategies have been published in the past two decades. The present study seeks to build upon two previous reviews of listening comprehension strategies research. Of particular interest in this review are studies dealing with the types of cues used by listeners, the sequence of listening,…
Numbers matter to informed patient choices: a randomized design across age and numeracy levels.
Peters, Ellen; Hart, P Sol; Tusler, Martin; Fraenkel, Liana
2014-05-01
How drug adverse events (AEs) are communicated in the United States may mislead consumers and result in low adherence. Requiring written information to include numeric AE-likelihood information might lessen these effects, but providing numbers may disadvantage less skilled populations. The objective was to determine risk comprehension and willingness to use a medication when presented with numeric or nonnumeric AE-likelihood information across age, numeracy, and cholesterol-lowering drug-use groups. In a cross-sectional Internet survey (N = 905; American Life Panel, 15 May 2008 to 18 June 2008), respondents were presented with a hypothetical prescription medication for high cholesterol. AE likelihoods were described using 1 of 6 formats (nonnumeric: consumer medication information (CMI)-like list, risk labels; numeric: percentage, frequency, risk labels + percentage, risk labels + frequency). Main outcome measures were risk comprehension (recoded to indicate presence/absence of risk overestimation and underestimation), willingness to use the medication (7-point scale; not likely = 0, very likely = 6), and main reason for willingness (chosen from 8 predefined reasons). Individuals given nonnumeric information were more likely to overestimate risk, were less willing to take the medication, and gave different reasons than those provided numeric information across numeracy and age groups (e.g., among the less numerate, 69% and 18% overestimated risks in nonnumeric and numeric formats, respectively; among the more numerate, these same proportions were 66% and 6%). Less numerate middle-aged and older adults, however, showed less influence of numeric format on willingness to take the medication. It is unclear whether differences are clinically meaningful, although some differences are large. Providing numeric AE-likelihood information (compared with nonnumeric) is likely to increase risk comprehension across numeracy and age levels. Its effects on uptake and adherence of prescribed drugs should be similar across the population, except perhaps in older, less numerate individuals.
Yang, Jie; Shu, Hua
2012-08-01
Although numerous studies find the premotor cortex and the primary motor cortex are involved in action language comprehension, so far the nature of these motor effects is still in controversy. Some researchers suggest that the motor effects reflect that the premotor cortex and the primary motor cortex make functional contributions to the semantic access of action verbs, while other authors argue that the motor effects are caused by comprehension. In the current study, we used Granger causality analysis to investigate the roles of the premotor cortex and the primary motor cortex in processing of manual-action verbs. Regions of interest were selected in the primary motor cortex (M1) and the premotor cortex based on a hand motion task, and in the left posterior middle temporal gyrus (lexical semantic area) based on the reading task effect. We found that (1) the left posterior middle temporal gyrus had a causal influence on the left M1; and (2) the left posterior middle temporal gyrus and the left premotor cortex had bidirectional causal relations. These results suggest that the premotor cortex and the primary motor cortex play different roles in manual verb comprehension. The premotor cortex may be involved in motor simulation that contributes to action language processing, while the primary motor cortex may be engaged in a processing stage influenced by the meaning access of manual-action verbs. Further investigation combining effective connectivity analysis and technique with high temporal resolution is necessary for better clarification of the roles of the premotor cortex and the primary motor cortex in action language comprehension. Copyright © 2012 Elsevier Inc. All rights reserved.
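Granger causality between two time series can be illustrated with statsmodels; the sketch below tests on synthetic data whether series x helps predict series y. This is generic time-series code, not the fMRI effective-connectivity pipeline used in the study, and the result-dictionary indexing reflects the statsmodels API as commonly documented.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Generic Granger-causality illustration on synthetic data: x drives y with a
# one-sample lag plus noise. Not the fMRI analysis pipeline used in the study.
rng = np.random.default_rng(1)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + 0.3 * rng.standard_normal()

# Column order is [effect, cause]: the test asks whether column 2 Granger-causes column 1.
results = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
# p-value of the lag-1 F-test (a small value indicates x Granger-causes y)
print("lag-1 F-test p-value:", results[1][0]["ssr_ftest"][1])
```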
Numerical-experimental study of internal fixation system "Dufoo" for vertebral fractures.
Nieto-Miranda, J Jesús; Faraón-Carbajal Romero, Manuel; Sánchez-Aguilar, Jons
2012-01-01
We describe a numerical-experimental study of the stress generated by the internal fixation system "Dufoo" used in the treatment of vertebral fractures, with the purpose of validating the numerical model of human lumbar vertebrae under the main physiological loads that the human body is exposed to in this area. The objective is to model and numerically simulate the elements of the musculoskeletal system to obtain the stresses generated and other parameters that are difficult to measure experimentally in the thoracolumbar vertebrae. We used an internal fixator "Dufoo" and L2-L3-L4 vertebrae specimens from pig and human. The system uses a total L3 corpectomy. The fixator acts as a mechanical bridge implant from L2 to L4. Numerical analysis was performed using the finite element method (FEM). For the experimental study, reflective photoelasticity and extensometry were used. Torsion and combined loads generate the main displacements and stresses in the study system, showing that the internal fixation carries out part of the function of the damaged organ structure by absorbing the stresses produced by the applied loads. Numerical analysis allows great freedom in the management of the variables involved in the models developed using radiological images. Geometric models are obtained and entered into FEM programs, which allow testing using parameters that may not be easily applied under actual conditions, making it possible to comprehensively determine the biomechanical behavior of the coupled system under study.
Kreuzmair, Christina; Siegrist, Michael; Keller, Carmen
2017-03-01
Researchers recommend the use of pictographs in medical risk communication to improve people's risk comprehension and decision making. However, it is not yet clear whether the iconicity used in pictographs to convey risk information influences individuals' information processing and comprehension. In an eye-tracking experiment with participants from the general population (N = 188), we examined whether specific types of pictograph icons influence the processing strategy viewers use to extract numerical information. In addition, we examined the effect of iconicity and numeracy on probability estimation, recall, and icon liking. This experiment used a 2 (iconicity: blocks vs. restroom icons) × 2 (scenario: medical vs. nonmedical) between-subject design. Numeracy had a significant effect on information processing strategy, but we found no effect of iconicity or scenario. Results indicated that both icon types enabled high and low numerates to use their default way of processing and extracting the gist of the message from the pictorial risk communication format: high numerates counted icons, whereas low numerates used large-area processing. There was no effect of iconicity in the probability estimation. However, people who saw restroom icons had a higher probability of correctly recalling the exact risk level. Iconicity had no effect on icon liking. Although the effects are small, our findings suggest that person-like restroom icons in pictographs seem to have some advantages for risk communication. Specifically, in nonpersonalized prevention brochures, person-like restroom icons may maintain reader motivation for processing the risk information. © 2016 Society for Risk Analysis.
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
Patterns of linguistic and numerical performance in aphasia.
Rath, Dajana; Domahs, Frank; Dressel, Katharina; Claros-Salinas, Dolores; Klein, Elise; Willmes, Klaus; Krinzinger, Helga
2015-02-04
Empirical research on the relationship between linguistic and numerical processing revealed inconsistent results for different levels of cognitive processing (e.g., lexical, semantic) as well as different stimulus materials (e.g., Arabic digits, number words, letters, non-number words). Information of dissociation patterns in aphasic patients was used in order to investigate the dissociability of linguistic and numerical processes. The aim of the present prospective study was a comprehensive, specific, and systematic investigation of relationships between linguistic and numerical processing, considering the impact of asemantic vs. semantic processing and the type of material employed (numbers compared to letters vs. words). A sample of aphasic patients (n = 60) was assessed with a battery of linguistic and numerical tasks directly comparable for their cognitive processing levels (e.g., perceptual, morpho-lexical, semantic). Mean performance differences and frequencies of (complementary) dissociations in individual patients revealed the most prominent numerical advantage for asemantic tasks when comparing the processing of numbers vs. letters, whereas the least numerical advantage was found for semantic tasks when comparing the processing of numbers vs. words. Different patient subgroups showing differential dissociation patterns were further analysed and discussed. A comprehensive model of linguistic and numerical processing should take these findings into account.
NASA Astrophysics Data System (ADS)
Wang, Jinting; Lu, Liqiao; Zhu, Fei
2018-01-01
Finite element (FE) analysis is a powerful tool and has been applied by investigators to real-time hybrid simulations (RTHSs). This study focuses on the computational efficiency, including the computational time and accuracy, of numerical integrations in solving the FE numerical substructure in RTHSs. First, sparse matrix storage schemes are adopted to decrease the computational time of the FE numerical substructure. In this way, the task execution time (TET) decreases, allowing the scale of the numerical substructure model to increase. Subsequently, several commonly used explicit numerical integration algorithms, including the central difference method (CDM), the Newmark explicit method, the Chang method and the Gui-λ method, are comprehensively compared to evaluate their computational time in solving the FE numerical substructure. The CDM is better than the other explicit integration algorithms when the damping matrix is diagonal, while the Gui-λ (λ = 4) method is advantageous when the damping matrix is non-diagonal. Finally, the effect of time delay on the computational accuracy of RTHSs is investigated by simulating structure-foundation systems. Simulation results show that the influence of time delay on the displacement response becomes obvious as the mass ratio increases, and delay compensation methods may reduce the relative error of the displacement peak value to less than 5% even under a large time step and large time delay.
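A minimal sketch of the central difference update with sparse matrix storage follows (assumed toy 3-DOF system with diagonal damping, the favourable case for CDM noted above; the values are illustrative, not from the study).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Central difference method (CDM) for M*a + C*v + K*u = f(t) with sparse storage.
# Toy 3-DOF chain with diagonal damping; all values are assumed for illustration.
M = sp.diags([2.0, 1.5, 1.0]).tocsc()
C = sp.diags([0.05, 0.05, 0.05]).tocsc()
K = sp.diags([[-100.0, -100.0], [200.0, 200.0, 100.0], [-100.0, -100.0]],
             offsets=[-1, 0, 1]).tocsc()

dt, n_steps = 1e-3, 5000
u_prev = np.zeros(3)
u = np.zeros(3)
A_eff = (M / dt**2 + C / (2 * dt)).tocsc()   # diagonal here, so the solve is trivial

for step in range(n_steps):
    t = step * dt
    f = np.array([np.sin(10 * t), 0.0, 0.0])          # assumed excitation
    rhs = (f - K @ u + (2 * M / dt**2) @ u
           - (M / dt**2 - C / (2 * dt)) @ u_prev)
    u_next = spsolve(A_eff, rhs)
    u_prev, u = u, u_next

print("final displacements:", u)
```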
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
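As a generic illustration of the parameter-calibration step, the sketch below fits two parameters of a toy model to noisy observations with SciPy's differential evolution, used here as a stand-in for the SCE algorithm and FME package actually employed by EDCM-Auto; the model and data are placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Generic calibration sketch with differential evolution standing in for SCE/FME.
# The exponential "model" and synthetic observations are placeholders, not EDCM.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 100)
true_a, true_k = 2.5, 0.4
obs = true_a * np.exp(-true_k * t) + 0.05 * rng.standard_normal(t.size)

def cost(params):
    a, k = params
    return np.sum((a * np.exp(-k * t) - obs) ** 2)   # sum-of-squares cost function

result = differential_evolution(cost, bounds=[(0.1, 10.0), (0.01, 2.0)], seed=1)
print("calibrated parameters:", result.x)
```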
Ball, Maria H.; Schaffranek, Raymond W.
2000-01-01
The U.S. Geological Survey (USGS) is working closely with other Federal and State agencies in a comprehensive program to evaluate and restore the south Florida ecosystem. Within the USGS South Florida Ecosystem Program, a project entitled 'Coupling Models for Canal and Wetland Flow/Transport Interaction' is focused on analysis and numerical simulation of flow and potential transport of constituents between canal C-111 and wetlands adjacent to Everglades National Park. In support of this project, comprehensive sets of flow, vegetation, and water-quality data were collected in September 1997 and 1999. The flow-velocity data are compiled, summarized, and tabulated in this report. The flow, vegetation, and water-quality data are available for downloading from the World Wide Web.
Recurrent Loss of Specific Introns during Angiosperm Evolution
Wang, Hao; Devos, Katrien M.; Bennetzen, Jeffrey L.
2014-01-01
Numerous instances of presence/absence variations for introns have been documented in eukaryotes, and some cases of recurrent loss of the same intron have been suggested. However, there has been no comprehensive or phylogenetically deep analysis of recurrent intron loss. Of 883 cases of intron presence/absence variation that we detected in five sequenced grass genomes, 93 were confirmed as recurrent losses and the rest could be explained by single losses (652) or single gains (118). No case of recurrent intron gain was observed. Deep phylogenetic analysis often indicated that apparent intron gains were actually numerous independent losses of the same intron. Recurrent loss exhibited extreme non-randomness, in that some introns were removed independently in many lineages. The two larger genomes, maize and sorghum, were found to have a higher rate of both recurrent loss and overall loss and/or gain than foxtail millet, rice or Brachypodium. Adjacent introns and small introns were found to be preferentially lost. Intron loss genes exhibited a high frequency of germ line or early embryogenesis expression. In addition, flanking exon A+T-richness and intron TG/CG ratios were higher in retained introns. This last result suggests that epigenetic status, as evidenced by a loss of methylated CG dinucleotides, may play a role in the process of intron loss. This study provides the first comprehensive analysis of recurrent intron loss, makes a series of novel findings on the patterns of recurrent intron loss during the evolution of the grass family, and provides insight into the molecular mechanism(s) underlying intron loss. PMID:25474210
Method for determining the weight of functional objectives on manufacturing system.
Zhang, Qingshan; Xu, Wei; Zhang, Jiekun
2014-01-01
We propose a three-dimensional integrated weight determination method for the functional objectives of a manufacturing system, in which consumer preferences are weighted using triangular fuzzy numbers. The subjective parts of the weights are determined by the expert scoring method, and the objective parts are determined by the entropy method, taking competitive advantage into account. Based on the integration of the three methods into a comprehensive weight, we provide some suggestions for the manufacturing system. A numerical example analysis illustrates the feasibility of this method.
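The objective (entropy-based) part of such a weighting scheme can be sketched compactly; the decision matrix below is an assumed example, and the triangular-fuzzy consumer weights and expert-scoring weights of the integrated method are not shown.

```python
import numpy as np

# Entropy weight method for the objective part of the weighting. The decision matrix
# (alternatives x criteria) is an assumed example; the fuzzy and expert-scoring parts
# of the integrated three-dimensional method are not shown here.
X = np.array([[0.8, 120.0, 3.0],
              [0.6, 150.0, 2.0],
              [0.9, 100.0, 4.0],
              [0.7, 130.0, 3.5]])

P = X / X.sum(axis=0)                               # normalise each criterion column
k = 1.0 / np.log(X.shape[0])
entropy = -k * np.sum(P * np.log(P), axis=0)        # information entropy per criterion
weights = (1.0 - entropy) / np.sum(1.0 - entropy)   # entropy weights
print("objective criterion weights:", np.round(weights, 3))
```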
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.
Delamination Assessment Tool for Spacecraft Composite Structures
NASA Astrophysics Data System (ADS)
Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert
2012-07-01
Fortunately only few cases are known where failure of spacecraft structures due to undetected damage has resulted in a loss of spacecraft and launcher mission. However, several problems related to damage tolerance and in particular delamination of composite materials have been encountered during structure development of various ESA projects and qualification testing. To avoid such costly failures during development, launch or service of spacecraft, launcher and reusable launch vehicles structures a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool” which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and reusable launch vehicles (RLV) structures, addressing analytical and numerical methodologies, material-, subcomponent- and component testing, as well as non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test and inspection methods and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH
Advanced Power System Analysis Capabilities
NASA Technical Reports Server (NTRS)
1997-01-01
As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.
Intense tornadoes in Poland in the years 2000-2012 and their synoptic characteristics
NASA Astrophysics Data System (ADS)
Cwik, Paulina
2015-04-01
Tornadoes, or high-speed rotating columns of air, are among the most extreme natural processes occurring on Earth. Currently, a trend towards more frequent and more severe tornadoes is also apparent in Poland. So far, tornadoes in Poland have resulted in very serious damage to infrastructure and have injured or killed many people. Forecasting tornadoes is not an easy task, especially when the phenomenon is local. It must be based on a comprehensive analysis of mesoscale numerical models, atmospheric soundings and radar data. Unfortunately, there are a number of limitations. One of them is that mesoscale weather models often do not capture local events, so the time and place of a tornado cannot be forecast well in advance. The phenomenon of tornadoes must be better understood and become the object of a comprehensive analysis, so that the resulting information can be used for both research and educational purposes.
Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang
2017-01-01
Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much increased separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide applications of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized by gamma distribution and a new peak detection algorithm using the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches, which are fast Fourier transform (FFT), the first-order and the second-order delta methods (D1 and D2), are introduced. The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.
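The first- and second-order delta methods mentioned above can be illustrated on a simple transformation: for X ~ N(mu, sigma^2) and g(x) = exp(x) the exact mean is known, so the two approximations can be compared directly. This is a generic illustration, not the NGB peak model itself, where the approximations are needed because no closed form exists.

```python
import numpy as np

# First- and second-order delta-method approximations of E[g(X)] for X ~ N(mu, sigma^2)
# with g(x) = exp(x), compared to the known lognormal mean. Generic illustration only.
mu, sigma = 1.0, 0.3
g, g2 = np.exp, np.exp                        # g and its second derivative

first_order = g(mu)                           # E[g(X)] ~ g(mu)
second_order = g(mu) + 0.5 * g2(mu) * sigma**2
exact = np.exp(mu + sigma**2 / 2)             # closed form for the lognormal mean

print(f"first order : {first_order:.4f}")
print(f"second order: {second_order:.4f}")
print(f"exact       : {exact:.4f}")
```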
Chen, Zhiwei; Chen, Bo
2014-01-01
Many long-span bridges have been built throughout the world in recent years but they are often subject to multiple types of dynamic loads, especially those located in wind-prone regions and carrying both trains and road vehicles. To ensure the safety and functionality of these bridges, dynamic responses of long-span bridges are often required for bridge assessment. Given that there are several limitations for the assessment based on field measurement of dynamic responses, a promising approach is based on numerical simulation technologies. This paper provides a detailed review of key issues involved in dynamic response analysis of long-span multiload bridges based on numerical simulation technologies, including dynamic interactions between running trains and bridge, between running road vehicles and bridge, and between wind and bridge, and in the wind-vehicle-bridge coupled system. Then a comprehensive review is conducted for engineering applications of newly developed numerical simulation technologies to safety assessment of long-span bridges, such as assessment of fatigue damage and assessment under extreme events. Finally, the existing problems and promising research efforts for the numerical simulation technologies and their applications to assessment of long-span multiload bridges are explored.
Numerical model for the thermal behavior of thermocline storage tanks
NASA Astrophysics Data System (ADS)
Ehtiwesh, Ismael A. S.; Sousa, Antonio C. M.
2018-03-01
Energy storage is a critical factor in the advancement of solar thermal power systems for the sustained delivery of electricity. In addition, the incorporation of thermal energy storage into the operation of concentrated solar power systems (CSPs) offers the potential of delivering electricity without fossil-fuel backup even during peak demand, independent of weather conditions and daylight. Despite this potential, some areas of the design and performance of thermocline systems still require further attention for future incorporation in commercial CSPs, particularly their operation and control. Therefore, the present study aims to develop a simple but efficient numerical model to allow the comprehensive analysis of thermocline storage systems, aimed at a better understanding of their dynamic temperature response. The validation results, despite the simplifying assumptions of the numerical model, agree well with the experiments for the time evolution of the thermocline region. Three different cases are considered to test the versatility of the numerical model; for the particular case of a storage tank with a top round impingement inlet, a simple analytical model was developed to take into account the increased turbulence level in the mixing region. The numerical predictions for the three cases are in generally good agreement with the experimental results.
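A minimal one-dimensional sketch of a thermocline model is given below (single-phase fluid, constant charging from the top, explicit upwind advection plus axial diffusion). All parameter values are assumed, and the impingement-mixing model developed in the paper is not included.

```python
import numpy as np

# Minimal 1-D thermocline sketch: hot fluid charged into the top of an initially cold
# tank. Explicit upwind advection + axial diffusion; all values are assumed, and the
# top-impingement mixing model discussed in the paper is not included.
n, height = 200, 2.0                 # grid cells, tank height (m)
dz = height / n
u = 1.0e-3                           # downward charging velocity (m/s), assumed
alpha = 1.5e-7                       # effective axial thermal diffusivity (m^2/s), assumed
dt = 0.4 * min(dz / u, dz**2 / (2 * alpha))

T = np.full(n, 20.0)                 # initially cold tank (deg C)
T_in = 80.0                          # hot inlet temperature (deg C)

for _ in range(int(1200.0 / dt)):    # simulate 20 minutes of charging
    T_up = np.concatenate(([T_in], T[:-1]))              # upstream (upwind) values
    adv = -u * (T - T_up) / dz
    dif = alpha * (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dz**2
    dif[0] = dif[-1] = 0.0                                # crude insulated ends
    T = T + dt * (adv + dif)

print("outlet (bottom) temperature:", round(float(T[-1]), 2), "deg C")
print("thermocline located near cell", int(np.argmin(np.abs(T - 50.0))), "of", n)
```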
Implicit LES using adaptive filtering
NASA Astrophysics Data System (ADS)
Sun, Guangrui; Domaradzki, Julian A.
2018-04-01
In implicit large eddy simulations (ILES) numerical dissipation prevents buildup of small scale energy in a manner similar to the explicit subgrid scale (SGS) models. If spectral methods are used the numerical dissipation is negligible but it can be introduced by applying a low-pass filter in the physical space, resulting in an effective ILES. In the present work we provide a comprehensive analysis of the numerical dissipation produced by different filtering operations in a turbulent channel flow simulated using a non-dissipative, pseudo-spectral Navier-Stokes solver. The amount of numerical dissipation imparted by filtering can be easily adjusted by changing how often a filter is applied. We show that when the additional numerical dissipation is close to the subgrid-scale (SGS) dissipation of an explicit LES the overall accuracy of ILES is also comparable, indicating that periodic filtering can replace explicit SGS models. A new method is proposed, which does not require any prior knowledge of a flow, to determine the filtering period adaptively. Once an optimal filtering period is found, the accuracy of ILES is significantly improved at low implementation complexity and computational cost. The method is general, performing well for different Reynolds numbers, grid resolutions, and filter shapes.
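The core operation is just a periodic low-pass filter; below is a hedged one-dimensional sketch that applies a sharp spectral cutoff every filter_period steps and records the energy it removes, which plays the role of the numerical dissipation discussed above. The random-forcing "time step" is a placeholder for a real non-dissipative solver, and the adaptive selection of the filtering period proposed in the paper is not implemented.

```python
import numpy as np

def lowpass_filter(u_hat, keep_fraction=0.8):
    """Sharp spectral low-pass filter: zero the highest-wavenumber modes."""
    n = len(u_hat)
    cutoff = int(keep_fraction * (n // 2))
    filtered = u_hat.copy()
    filtered[cutoff:n - cutoff] = 0.0     # symmetric band of high wavenumbers
    return filtered

# 1-D stand-in field; in an actual ILES this would be a velocity field advanced by a
# non-dissipative pseudo-spectral solver, with the filter applied every few steps.
rng = np.random.default_rng(0)
n = 256
u = rng.standard_normal(n)
filter_period, removed_energy = 5, 0.0

for step in range(1, 101):
    u = u + 0.01 * rng.standard_normal(n)           # placeholder for one solver step
    if step % filter_period == 0:
        u_hat = np.fft.fft(u)
        e_before = np.sum(np.abs(u_hat) ** 2) / n   # energy via Parseval
        u_hat = lowpass_filter(u_hat)
        e_after = np.sum(np.abs(u_hat) ** 2) / n
        removed_energy += e_before - e_after        # proxy for numerical dissipation
        u = np.real(np.fft.ifft(u_hat))

print(f"energy removed by filtering (numerical dissipation proxy): {removed_energy:.3f}")
```

Filtering more often removes more energy, which is exactly the knob the adaptive method described above tunes so that the implicit dissipation tracks the expected subgrid-scale dissipation.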
Analysis of a Stabilized CNLF Method with Fast Slow Wave Splittings for Flow Problems
Jiang, Nan; Tran, Hoang A.
2015-04-01
In this work, we study Crank-Nicolson leap-frog (CNLF) methods with fast-slow wave splittings for the Navier-Stokes equations (NSE) with a rotation/Coriolis force term, which is a simplification of geophysical flows. We propose a new stabilized CNLF method where the added stabilization completely removes the method's CFL time step condition. A comprehensive stability and error analysis is given. We also prove that for the Oseen equations with the rotation term, the unstable mode (for which u^(n+1) + u^(n-1) ≡ 0) of CNLF is asymptotically stable. Numerical results are provided to verify the stability and the convergence of the methods.
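As a reference for the time discretization being analyzed, the sketch below applies the standard (unstabilized) CNLF splitting to a two-component linear test problem, treating a dissipative part by Crank-Nicolson and a rotation part by leap-frog; the model coefficients are illustrative and this is not the stabilized scheme or the NSE setting of the paper.

```python
import numpy as np

# Model problem: u' = A u + B u, with A (diffusive part) treated by Crank-Nicolson
# and B (rotation/Coriolis-like part) by leap-frog, i.e. the CNLF splitting.
nu, omega, dt, nsteps = 0.1, 2.0, 0.01, 2000
A = -nu * np.eye(2)
B = np.array([[0.0, omega], [-omega, 0.0]])

u_prev = np.array([1.0, 0.0])                       # u^0
u_curr = u_prev + dt * (A + B) @ u_prev             # u^1 from one explicit Euler step
M_left  = np.eye(2) - dt * A                        # acts on u^{n+1}
M_right = np.eye(2) + dt * A                        # acts on u^{n-1}

for n in range(1, nsteps):
    # (u^{n+1} - u^{n-1})/(2 dt) = A (u^{n+1} + u^{n-1})/2 + B u^n
    u_next = np.linalg.solve(M_left, M_right @ u_prev + 2.0 * dt * B @ u_curr)
    u_prev, u_curr = u_curr, u_next

print(u_curr)   # solution decays while rotating; stability depends on dt and omega
```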
Predictors, Indicators, and Validated Measures of Dependence in Menthol Smokers
Muhammad-Kah, Raheema; Rimmer, Lonnie; Liang, Qiwei
2014-01-01
This article presents a comprehensive review of the menthol cigarette dependence-related literature and results from an original analysis of the Total Exposure Study (TES), which included 1,100 menthol and 2,400 nonmenthol adult smokers. The substantial scientific evidence available related to age of first cigarette, age of regular use, single-item dependence indicators (smoking frequency, cigarettes per day, time to first cigarette, night waking to smoke), smoking duration, numerous validated and widely accepted measures of nicotine/cigarette dependence, and our analysis of the TES do not support the conclusion that menthol smokers are more dependent than nonmenthol smokers or that menthol increases dependence. PMID:24738914
NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.
2009-02-28
The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), as well as updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher resolution, largely deterministic analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes the development of PORFLOW models supporting the SDF PA and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.
Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model
NASA Astrophysics Data System (ADS)
Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi
Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual data from a numerical simulation. The result shows that the new SCV is able to describe more complex scientific perceptions.
NASA Astrophysics Data System (ADS)
Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin
2015-03-01
Reliability allocation of computerized numerical control (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming at solving the problem of CNC lathe reliability allocation, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponentially transformed FMEA method are investigated. Subsequently, a cubic transformed function is established in order to overcome these limitations. Properties of the new transformed function are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as examples to verify the new allocation method. Seven criteria are considered to compare the results of the new method with those of traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
Zhang, Linjun; Yue, Qiuhai; Zhang, Yang; Shu, Hua; Li, Ping
2015-01-01
Numerous studies have revealed the essential role of the left lateral temporal cortex in auditory sentence comprehension along with evidence of the functional specialization of the anterior and posterior temporal sub-areas. However, it is unclear whether task demands (e.g., active vs. passive listening) modulate the functional specificity of these sub-areas. In the present functional magnetic resonance imaging (fMRI) study, we addressed this issue by applying both independent component analysis (ICA) and general linear model (GLM) methods. Consistent with previous studies, intelligible sentences elicited greater activity in the left lateral temporal cortex relative to unintelligible sentences. Moreover, responses to intelligibility in the sub-regions were differentially modulated by task demands. While the overall activation patterns of the anterior and posterior superior temporal sulcus and middle temporal gyrus (STS/MTG) were equivalent during both passive and active tasks, a middle portion of the STS/MTG was found to be selectively activated only during the active task under a refined analysis of sub-regional contributions. Our results not only confirm the critical role of the left lateral temporal cortex in auditory sentence comprehension but further demonstrate that task demands modulate functional specialization of the anterior-middle-posterior temporal sub-areas. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Numerical study of wave propagation around an underground cavity: acoustic case
NASA Astrophysics Data System (ADS)
Esterhazy, Sofi; Perugia, Ilaria; Schöberl, Joachim; Bokelmann, Götz
2015-04-01
Motivated by the need to detect, within the procedure of an On-Site Inspection (OSI) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO), an underground cavity that might have been caused by a nuclear explosion or weapon test, we aim to provide a basic numerical study of the wave propagation around and inside such an underground cavity. The aim of the CTBTO is to ban all nuclear explosions of any size anywhere, by anyone. Therefore, it is essential to build a powerful strategy to efficiently investigate and detect critical signatures such as gas-filled cavities, rubble zones and fracture networks below the surface. One method to investigate the geophysical properties of an underground cavity allowed by the Comprehensive Nuclear-Test-Ban Treaty is referred to as 'resonance seismometry' - a resonance method that uses passive or active seismic techniques, relying on seismic cavity vibrations. This method is in fact not yet entirely determined by the Treaty, and there are also only few experimental examples that have been suitably documented to build a proper scientific groundwork. This motivates us to investigate this problem on a purely numerical level and to simulate these events based on recent advances in the mathematical understanding of the underlying physical phenomena. Here, we focus our numerical study on the propagation of P-waves in two dimensions. An extension to three dimensions as well as the inclusion of the full elastic wave field is planned as a follow-up. For the numerical simulations of wave propagation we use a high-order finite element discretization, which has the significant advantage that it can be extended easily from simple toy designs to complex and irregularly shaped geometries without excessive effort. Our computations are done with the parallel finite element library NGSOLVE on top of the automatic 2D/3D tetrahedral mesh generator NETGEN (http://sourceforge.net/projects/ngsolve/). Using the basic mathematical understanding of the physical equations and the numerical algorithms, it is possible for us to investigate the wave field over a large bandwidth of wave numbers. This means we can apply our calculations to a wide range of parameters, while keeping the numerical error explicitly under control. The accurate numerical modeling can facilitate the development of proper analysis techniques to detect the remnants of an underground nuclear test, help to set a rigorous scientific basis for OSI and contribute to bringing the Treaty into force.
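To give a concrete picture of the kind of 2-D P-wave simulation described, here is a minimal finite-difference sketch of the scalar wave equation with a circular low-velocity region standing in for a gas-filled cavity. It is not the high-order NGSolve/NETGEN finite element setup of the study; the grid, velocities, source and boundary treatment (none of the absorbing boundaries a production run would need) are illustrative assumptions.

```python
import numpy as np

# 2-D scalar (acoustic/P-wave) finite-difference sketch with a circular low-velocity
# "cavity"; all sizes, velocities and source parameters are illustrative choices.
nx = ny = 300
dx = 10.0                                      # grid spacing, m
c = np.full((ny, nx), 3000.0)                  # background P-wave speed, m/s
yy, xx = np.mgrid[0:ny, 0:nx]
cavity = (xx - 150)**2 + (yy - 180)**2 < 15**2
c[cavity] = 340.0                              # cavity approximated as a low-velocity zone

dt = 0.4 * dx / c.max()                        # CFL-limited time step
u_prev = np.zeros((ny, nx)); u_curr = np.zeros((ny, nx))

def ricker(t, f0=15.0, t0=0.08):
    a = (np.pi * f0 * (t - t0))**2
    return (1.0 - 2.0*a) * np.exp(-a)

for it in range(800):
    lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0)
           + np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1) - 4.0*u_curr) / dx**2
    u_next = 2.0*u_curr - u_prev + (c*dt)**2 * lap   # leap-frog update of the wave equation
    u_next[50, 150] += ricker(it*dt) * dt**2         # point source above the cavity
    u_prev, u_curr = u_curr, u_next
```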
Interaction of Kelvin waves and nonlocality of energy transfer in superfluids
NASA Astrophysics Data System (ADS)
Laurie, Jason; L'Vov, Victor S.; Nazarenko, Sergey; Rudenko, Oleksii
2010-03-01
We argue that the physics of interacting Kelvin waves (KWs) is highly nontrivial and cannot be understood on the basis of pure dimensional reasoning. A consistent theory of KW turbulence in superfluids should be based upon explicit knowledge of their interactions. To achieve this, we present a detailed calculation and comprehensive analysis of the interaction coefficients for KW turbulence, thereby resolving previous mistakes stemming from unaccounted contributions. As a first application of this analysis, we derive a local nonlinear (partial differential) equation. This equation is much simpler for analysis and numerical simulations of KWs than the Biot-Savart equation, and, in contrast to the completely integrable local induction approximation (in which the energy exchange between KWs is absent), describes the nonlinear dynamics of KWs. Second, we show that the previously suggested Kozik-Svistunov energy spectrum for KWs, which has often been used in the analysis of experimental and numerical data in superfluid turbulence, is irrelevant, because it is based upon an erroneous assumption of the locality of the energy transfer through scales. Moreover, we demonstrate the weak nonlocality of the inverse cascade spectrum with a constant particle-number flux and find the resulting logarithmic corrections to this spectrum.
Analysis of chaos in high-dimensional wind power system.
Wang, Cong; Zhang, Hongli; Fan, Wenhui; Ma, Ping
2018-01-01
A comprehensive analysis of the chaos of a high-dimensional wind power system is performed in this study. A high-dimensional wind power system is more complex than most power systems. An 11-dimensional wind power system proposed by Huang, which has not been analyzed in previous studies, is investigated. When the system is affected by external disturbances, including single-parameter and periodic disturbances, or when its parameters change, the chaotic dynamics of the wind power system are analyzed and the parameter ranges leading to chaos are obtained. The existence of chaos is confirmed by calculation and analysis of the Lyapunov exponents of all state variables and of the state variable sequence diagram. Theoretical analysis and numerical simulations show that chaos will occur in the wind power system when parameter variations and external disturbances change to a certain degree.
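A standard way to confirm chaos numerically, as referenced above, is to estimate the largest Lyapunov exponent from the divergence of nearby trajectories. The sketch below applies a Benettin-type renormalization procedure to the Lorenz system, which stands in for the 11-dimensional wind power model (whose equations are not reproduced here); step sizes and tolerances are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Benettin-type estimate of the largest Lyapunov exponent from two nearby trajectories.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0/3.0):
    x, y, z = s
    return [sigma*(y - x), x*(rho - z) - y, x*y - beta*z]

dt, nsteps, d0 = 0.01, 5000, 1e-8
x = np.array([1.0, 1.0, 1.0])
xp = x + np.array([d0, 0.0, 0.0])       # slightly perturbed copy
lyap_sum = 0.0
for _ in range(nsteps):
    x  = solve_ivp(lorenz, (0.0, dt), x,  rtol=1e-9, atol=1e-12).y[:, -1]
    xp = solve_ivp(lorenz, (0.0, dt), xp, rtol=1e-9, atol=1e-12).y[:, -1]
    d = np.linalg.norm(xp - x)
    lyap_sum += np.log(d / d0)
    xp = x + (xp - x) * (d0 / d)        # renormalize the perturbation each step
print("largest Lyapunov exponent ~", lyap_sum / (nsteps * dt))   # positive value indicates chaos
```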
NASA Astrophysics Data System (ADS)
Esterhazy, Sofi; Schneider, Felix; Perugia, Ilaria; Bokelmann, Götz
2017-04-01
Motivated by the need to detect, within the procedure of an On-Site Inspection (OSI) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO), an underground cavity that might have been caused by a nuclear explosion or weapon test, we aim to provide a basic numerical study of the wave propagation around and inside such an underground cavity. One method to investigate the geophysical properties of an underground cavity allowed by the Comprehensive Nuclear-Test-Ban Treaty is referred to as "resonance seismometry" - a resonance method that uses passive or active seismic techniques, relying on seismic cavity vibrations. This method is in fact not yet entirely determined by the Treaty and, so far, there are only very few experimental examples that have been suitably documented to build a proper scientific groundwork. This motivates us to investigate this problem on a purely numerical level and to simulate these events based on recent advances in the numerical modeling of wave propagation problems. Our numerical study includes the full elastic wave field in three dimensions. We consider the effects of an incoming plane wave as well as of a point source located at the surface in the surroundings of the cavity. While the former can be considered a passive source, such as a tele-seismic earthquake, the latter represents a man-made explosion or a vibroseis source as used in active seismic techniques. Further, we want to demonstrate the specific characteristics of the scattered wave field from P-waves and S-waves separately. For our simulations in 3D we use the discontinuous Galerkin spectral element code SPEED developed by MOX (The Laboratory for Modeling and Scientific Computing, Department of Mathematics) and DICA (Department of Civil and Environmental Engineering) at the Politecnico di Milano. The computations are carried out on the Vienna Scientific Cluster (VSC). The accurate numerical modeling can facilitate the development of proper analysis techniques to detect the remnants of an underground nuclear test, help to set a rigorous scientific basis for OSI and contribute to bringing the Treaty into force.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. The tool, called E-CAT (Economic Consequence Analysis Tool), accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid-turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
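The reduced-form idea can be sketched in a few lines: generate many scenario runs (here synthetic stand-ins for CGE outputs), then fit a single regression that maps threat characteristics to economic loss. The explanatory variables, functional form and coefficients below are hypothetical and only illustrate the workflow, not E-CAT's actual equations.

```python
import numpy as np

# Illustrative "reduced-form" step: fit one regression to synthetic model runs.
rng = np.random.default_rng(0)
n_runs = 500
magnitude  = rng.uniform(0.1, 10.0, n_runs)    # threat characteristic (hypothetical)
duration   = rng.uniform(1.0, 90.0, n_runs)    # days of disruption (hypothetical)
resilience = rng.uniform(0.0, 0.8, n_runs)     # fraction of loss avoided (hypothetical)

# Stand-in for the GDP losses a CGE model would produce for each simulated scenario
gdp_loss = 2.0*magnitude + 0.05*duration*magnitude - 15.0*resilience + rng.normal(0, 1, n_runs)

X = np.column_stack([np.ones(n_runs), magnitude, duration*magnitude, resilience])
coef, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
print("reduced-form coefficients:", coef)

# Rapid estimate for a new scenario without re-running the complex model
new = np.array([1.0, 5.0, 30.0*5.0, 0.3])
print("approximate loss:", new @ coef)
```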
Yu, Zhongtang; Yu, Marie; Morrison, Mark
2006-04-01
Serial analysis of ribosomal sequence tags (SARST) is a recently developed technology that can generate large 16S rRNA gene (rrs) sequence data sets from microbiomes, but numerous enzymatic and purification steps are required to construct the ribosomal sequence tag (RST) clone libraries. We report here an improved SARST method, which still targets the V1 hypervariable region of rrs genes but reduces the number of enzymes, oligonucleotides, reagents, and technical steps needed to produce the RST clone libraries. The new method, hereafter referred to as SARST-V1, was used to examine the eubacterial diversity present in community DNA recovered from the microbiome resident in the ovine rumen. The 190 sequenced clones contained 1055 RSTs and no fewer than 236 unique phylotypes (based on ≥95% sequence identity) that were assigned to eight different eubacterial phyla. Rarefaction and monomolecular curve analyses predicted that the complete RST clone library contains 99% of the 353 unique phylotypes predicted to exist in this microbiome. When compared with ribosomal intergenic spacer analysis (RISA) of the same community DNA sample, as well as a compilation of nine previously published conventional rrs clone libraries prepared from the same type of samples, the RST clone library provided a more comprehensive characterization of the eubacterial diversity present in rumen microbiomes. As such, SARST-V1 should be a useful tool for the comprehensive examination of diversity and composition in microbiomes and offers an affordable, sequence-based method for diversity analysis.
Method for Determining the Weight of Functional Objectives on Manufacturing System
Zhang, Qingshan; Xu, Wei; Zhang, Jiekun
2014-01-01
We propose a three-dimensional integrated weight determination method for the functional objectives of manufacturing systems, in which consumer preferences are weighted by triangular fuzzy numbers. The subjective parts of the weights are determined by the expert scoring method, and the objective parts are determined by the entropy method, with competitive advantage also taken into account in the determination. Based on the integration of the three methods into a comprehensive weight, we provide some suggestions for the manufacturing system. A numerical example is provided to illustrate the feasibility of the method. PMID:25243203
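For the objective part of the weighting, the entropy method referred to above can be sketched as follows; the decision matrix, the benefit-type normalization, and the linear rule for blending with hypothetical expert (subjective) weights are all illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of the entropy method for the "objective" part of the weights.
# The decision matrix (alternatives x criteria) is made up for illustration.
X = np.array([[0.8, 120.0, 3.2],
              [0.6, 150.0, 2.8],
              [0.9, 100.0, 3.5],
              [0.7, 130.0, 3.0]])

P = X / X.sum(axis=0)                         # normalize each criterion column
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)          # entropy of each criterion
w_objective = (1.0 - E) / (1.0 - E).sum()     # higher dispersion -> larger weight

# A combined weight could then blend this with expert (subjective) scores, e.g.:
w_subjective = np.array([0.5, 0.3, 0.2])      # hypothetical expert weights
alpha = 0.5
w_combined = alpha*w_subjective + (1 - alpha)*w_objective
print(w_objective, w_combined / w_combined.sum())
```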
NASA Astrophysics Data System (ADS)
Diaz-Egea, Carlos; Sigle, Wilfried; van Aken, Peter A.; Molina, Sergio I.
2013-07-01
We present the mapping of the full plasmonic mode spectrum for single and aggregated gold nanoparticles linked through DNA strands to a silicon nitride substrate. A comprehensive analysis of the electron energy loss spectroscopy (EELS) maps was performed on stand-alone nanoparticles, dimers, and clusters of nanoparticles. The experimental results were confirmed by numerical calculations based on Mie theory and Gans-Mie theory for solving Maxwell's equations. Both bright and dark surface plasmon modes have been unveiled.
Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu
2012-06-08
Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issue of microarray data analysis due to the well known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting edge Bioconductor packages for researchers with no knowledge in R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.
Estimating the Diets of Animals Using Stable Isotopes and a Comprehensive Bayesian Mixing Model
Hopkins, John B.; Ferguson, Jake M.
2012-01-01
Using stable isotope mixing models (SIMMs) as a tool to investigate the foraging ecology of animals is gaining popularity among researchers. As a result, statistical methods are rapidly evolving and numerous models have been produced to estimate the diets of animals, each with its benefits and limitations. Deciding which SIMM to use is contingent on factors such as the consumer of interest, its food sources, sample size, the familiarity a user has with a particular framework for statistical analysis, or the level of inference the researcher desires to make (e.g., population- or individual-level). In this paper, we provide a review of commonly used SIMMs and describe a comprehensive SIMM, IsotopeR, that includes all features commonly used in SIMM analysis and two new features. We used data collected in Yosemite National Park to demonstrate IsotopeR's ability to estimate dietary parameters. We then examined the importance of each feature in the model and compared our results to inferences from commonly used SIMMs. IsotopeR's user interface (in R) will provide researchers with a user-friendly tool for SIMM analysis. The model is also applicable to paleontology, archaeology, and forensic studies as well as to estimating pollution inputs. PMID:22235246
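For orientation, the basic mass-balance idea underlying any SIMM can be shown with a deterministic sketch: two isotope signatures plus the requirement that proportions sum to one determine the contributions of three sources exactly. The values below are hypothetical, and IsotopeR's Bayesian treatment adds the uncertainty handling this sketch omits.

```python
import numpy as np

# Deterministic linear mixing sketch (not IsotopeR's Bayesian model): solve for the
# diet proportions of three sources from two isotope balances plus the mass balance.
# Source and consumer values are hypothetical delta-13C / delta-15N measurements.
sources = np.array([[-26.0,  3.0],    # source 1
                    [-21.0,  7.0],    # source 2
                    [-12.0, 12.0]])   # source 3
consumer = np.array([-19.0, 8.0])

A = np.vstack([sources.T, np.ones(3)])   # two isotope balances + proportions sum to 1
b = np.append(consumer, 1.0)
proportions = np.linalg.solve(A, b)
print(proportions)    # estimated diet fractions; a Bayesian SIMM would add uncertainty
```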
Numerical Hydrodynamics in Special Relativity.
Martí, J M; Müller, E
1999-01-01
This review is concerned with a discussion of numerical methods for the solution of the equations of special relativistic hydrodynamics (SRHD). Particular emphasis is put on a comprehensive review of the application of high-resolution shock-capturing methods in SRHD. Results obtained with different numerical SRHD methods are compared, and two astrophysical applications of SRHD flows are discussed. An evaluation of the various numerical methods is given and future developments are analyzed. Supplementary material is available for this article at 10.12942/lrr-1999-3.
NASA Technical Reports Server (NTRS)
Thompson, J. F.; Warsi, Z. U. A.; Mastin, C. W.
1982-01-01
A comprehensive review of methods of numerically generating curvilinear coordinate systems with coordinate lines coincident with all boundary segments is given. Some general mathematical framework and error analysis common to such coordinate systems is also included. The general categories of generating systems are those based on conformal mapping, orthogonal systems, nearly orthogonal systems, systems produced as the solution of elliptic and hyperbolic partial differential equations, and systems generated algebraically by interpolation among the boundaries. Also covered are the control of coordinate line spacing by functions embedded in the partial differential operators of the generating system and by subsequent stretching transformation. Dynamically adaptive coordinate systems, coupled with the physical solution, and time-dependent systems that follow moving boundaries are treated. References reporting experience using such coordinate systems are reviewed as well as those covering the system development.
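Among the generating systems listed, algebraic generation by interpolation among the boundaries is the simplest to illustrate. The sketch below builds a 2-D boundary-fitted grid by transfinite interpolation from four illustrative boundary curves; it is a minimal example of the idea, not a reproduction of any particular system covered in the review.

```python
import numpy as np

# Minimal 2-D transfinite interpolation: a boundary-fitted grid built algebraically
# from four boundary curves. The boundary shapes here are arbitrary illustrative curves.
ni, nj = 41, 21
s = np.linspace(0.0, 1.0, ni)
t = np.linspace(0.0, 1.0, nj)

def bottom(s): return np.column_stack([s, 0.1*np.sin(np.pi*s)])
def top(s):    return np.column_stack([s, 1.0 + 0.1*np.sin(2*np.pi*s)])
def left(t):   return np.column_stack([np.zeros_like(t), t])
def right(t):  return np.column_stack([np.ones_like(t), t])

B, T = bottom(s), top(s)
L, R = left(t), right(t)
grid = np.zeros((ni, nj, 2))
for i in range(ni):
    for j in range(nj):
        si, tj = s[i], t[j]
        # linear blending of the four boundaries minus the doubly counted corners
        grid[i, j] = ((1-tj)*B[i] + tj*T[i] + (1-si)*L[j] + si*R[j]
                      - ((1-si)*(1-tj)*B[0] + si*tj*T[-1]
                         + si*(1-tj)*B[-1] + (1-si)*tj*T[0]))
```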
Zhou, Wei; Feng, Chuqiao; Liu, Xinghong; Liu, Shuhua; Zhang, Chao; Yuan, Wei
2016-01-01
This work is a comparative investigation using numerical simulations to improve the understanding of coupled thermo-structural phenomena in mass concrete structures during construction. Finite element (FE) analysis of the thermo-structural behavior is used to investigate the applicability of supersulfated cement (SSC) in mass concrete structures. A multi-scale framework based on a homogenization scheme is adopted in the parameter studies to describe the nonlinear concrete behavior. Based on experimental data for the hydration heat evolution rate and quantity of SSC and fly ash Portland cement, the hydration properties of the various cements are studied. Simulations are run on a concrete dam section with a conventional method and with a chemo-thermo-mechanical coupled method. The results show that SSC is more suitable for mass concrete structures from the standpoint of temperature control and crack prevention. PMID:28773517
Li, Duxin; Schmitz, Oliver J
2015-01-01
The analysis of chemical constituents in Chinese herbal medicines (CHMs) is a challenge because of numerous compounds with various polarities and functional groups. Liquid chromatography coupled with quadrupole time-of-flight (QTOF) mass spectrometry (LC/MS) is of particular interest in the analysis of herbal components. One of the main attributes of QTOF that makes it an attractive analytical technique is its accurate mass measurement for both precursor and product ions. For the separation of CHMs, comprehensive two-dimensional chromatography (LCxLC) provides much higher resolving power than traditional one-dimensional separation. Therefore, a LCxLC-QTOF-MS system was developed and applied to the analysis of flavonoids and iridoid glycosides in aqueous extracts of Hedyotis diffusa (Rubiaceae). Shift gradient was applied in the two-dimensional separation in the LCxLC system to increase the orthogonality and effective peak distribution area of the analysis. Tentative identification of compounds was done by accurate mass interpretation and validation by UV spectrum. A clear classification of flavonol glycosides (FGs), acylated FGs, and iridoid glycosides (IGs) was shown in different regions of the LCxLC contour plot. In total, five FGs, four acylated FGs, and three IGs were tentatively identified. In addition, several novel flavonoids were found, which demonstrates that LCxLC-QTOF-MS detection also has great potential in herbal medicine analysis.
The most common technologies and tools for functional genome analysis.
Gasperskaja, Evelina; Kučinskas, Vaidutis
2017-01-01
Since the sequence of the human genome is complete, the main issue is how to understand the information written in the DNA sequence. Despite the numerous genome-wide studies that have already been performed, the challenge of determining the function of genes and gene products, as well as their interactions, remains open. As changes in the human genome are highly likely to cause pathological conditions, functional analysis is vitally important for human health. For many years a variety of technologies and tools have been used in functional genome analysis. However, only in the past decade has there been rapid, revolutionary progress and improvement in high-throughput methods, which range from traditional real-time polymerase chain reaction to more complex systems, such as next-generation sequencing or mass spectrometry. Furthermore, not only laboratory investigation but also accurate bioinformatic analysis is required for reliable scientific results. These methods provide an opportunity for accurate and comprehensive functional analysis that involves various fields of study: genomics, epigenomics, proteomics, and interactomics. This is essential for filling the gaps in the knowledge about dynamic biological processes at both the cellular and organismal level. However, each method has both advantages and limitations that should be taken into account before choosing the right method for particular research in order to ensure a successful study. For this reason, the present review paper aims to describe the most frequent and widely used methods for comprehensive functional analysis.
ERIC Educational Resources Information Center
Vahabi, Mandana
2010-01-01
Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…
Implementing Comprehensive School Physical Activity Programs: A Wayne State University Case Study
ERIC Educational Resources Information Center
Centeio, Erin E.; McCaughtry, Nate
2017-01-01
Comprehensive school physical activity programs (CSPAPs) have been highlighted by numerous public health and education agencies for their potential to improve the health and academic achievement of American youth. A CSPAP integrates physical activity throughout the school environment before, during and after school by engaging educators, children,…
Numerical modeling of inorganic aerosol processes is useful in air quality management, but comprehensive evaluation of modeled aerosol processes is rarely possible due to the lack of comprehensive datasets. During the Nitrogen, Aerosol Composition, and Halogens on a Tall Tower (N...
An e-Learning System for Extracting Text Comprehension and Learning Style Characteristics
ERIC Educational Resources Information Center
Samarakou, Maria; Tsaganou, Grammatiki; Papadakis, Andreas
2018-01-01
Technology-mediated learning is very actively and widely researched, with numerous e-learning environments designed for different educational purposes developed during the past few decades. Still, their organization and texts are not structured according to any theory of educational comprehension. Modern education is even more flexible and, thus,…
Extracting numeric measurements and temporal coordinates from Japanese radiological reports
NASA Astrophysics Data System (ADS)
Imai, Takeshi; Onogi, Yuzo
2004-04-01
Medical records are written mainly in natural language. The focus of this study is narrative radiological reports written in natural Japanese. These reports cannot be used for advanced retrieval, data mining, and so on, unless they are stored using a structured format such as DICOM-SR. The goal is to structure narrative reports progressively using natural language processing (NLP). Structure has many different levels; for example, DICOM-SR has three established levels -- basic text, enhanced and comprehensive. At the enhanced level, it is necessary to use numerical measurements and spatial and temporal coordinates. In this study, the wording used in the reports was first standardized, dictionaries were organized, and morphological analysis was performed. Next, numerical measurements and temporal coordinates were extracted, and the objects to which they referred were analyzed. 10,000 CT and MR reports were separated into 82,122 sentences, and 34,269 of the 36,444 numerical descriptions were tagged. Periods, slashes, hyphens, and parentheses are used ambiguously in the description of enumerated lists, dates, image numbers, and anatomical names, as well as at the end of sentences; to resolve this ambiguity, descriptions were processed in the order date, size, unit, enumerated list, and abbreviation, and the tagged reports were then separated into sentences.
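The extraction step can be illustrated with a small (English-language) sketch using regular expressions for dates and size measurements; the patterns below are illustrative only and are far simpler than the dictionaries, morphological analysis and disambiguation rules actually used for the Japanese reports.

```python
import re

# Illustrative patterns only -- not the dictionaries or rules used in the study.
text = ("2004/04/01 CT: nodule in right upper lobe, 12 x 8 mm "
        "(previously 10 x 7 mm on 2003/10/15).")

date_pat = re.compile(r"\b(\d{4})[/-](\d{1,2})[/-](\d{1,2})\b")
size_pat = re.compile(r"\b(\d+(?:\.\d+)?)\s*(?:x\s*(\d+(?:\.\d+)?)\s*)?(mm|cm)\b")

dates = [m.group(0) for m in date_pat.finditer(text)]
sizes = [(m.group(1), m.group(2), m.group(3)) for m in size_pat.finditer(text)]
print(dates)   # temporal coordinates, e.g. ['2004/04/01', '2003/10/15']
print(sizes)   # numeric measurements, e.g. [('12', '8', 'mm'), ('10', '7', 'mm')]
```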
NASA Astrophysics Data System (ADS)
Zannouni, K.; El Abrach, H.; Dhahri, H.; Mhimid, A.
2017-06-01
The present paper reports a numerical study investigating the drying of a rectangular gypsum sample based on a diffusive model. Both the vertical and lower sides of the porous medium are treated as adiabatic and impermeable surfaces, while the upper face of the plate is the permeable interface. The energy equation model is based on the assumption of local thermal equilibrium between the fluid and the solid phases. The lattice Boltzmann method (LBM) is used to solve the governing system of differential equations. The obtained numerical results concerning the moisture content and the temperature within the gypsum sample are discussed. A comprehensive analysis of the influence of the mass transfer coefficient, the convective heat transfer coefficient, the external temperature, the relative humidity and the diffusion coefficient on the macroscopic fields is also presented. All results presented in this paper were obtained in the stable regime, corresponding to times greater than 4000 s, for which the numerical error is below 2%. The experimental data and the descriptive information on the approach indicate excellent agreement between the results of our numerical code based on the LBM and the published ones.
CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis
Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran
2015-01-01
Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (csv) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275
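For readers who prefer a quick script over the web interface, a handful of common centrality indices can be computed with the NetworkX Python package as a rough stand-in; the example network and the choice of indices below are illustrative and unrelated to CentiServer's own implementation.

```python
import networkx as nx

# Illustrative calculation of a few common centrality indices with NetworkX;
# CentiServer itself offers many more (55) through its web interface and R package.
G = nx.karate_club_graph()            # small built-in example network

centralities = {
    "degree":      nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness":   nx.closeness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}

# Rank nodes by betweenness, analogous to picking candidate "essential" nodes
top5 = sorted(centralities["betweenness"].items(), key=lambda kv: kv[1], reverse=True)[:5]
print(top5)
```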
Kinetic Analysis for Macrocyclizations Involving Anionic Template at the Transition State
Martí-Centelles, Vicente; Burguete, M. Isabel; Luis, Santiago V.
2012-01-01
Several kinetic models for the macrocyclization of a C2 pseudopeptide with a dihalide through an SN2 reaction have been developed. These models not only focus on the kinetic analysis of the main macrocyclization reaction but also consider the competitive oligomerization/polymerization processes yielding undesired oligomeric/polymeric byproducts. The effect of anions has also been included in the kinetic models, as they can act as catalytic templates, reducing the energy of and stabilizing the transition state. The corresponding differential equation systems for each kinetic model can be solved numerically. Through a comprehensive analysis of these results, it is possible to obtain a better understanding of the different parameters involved in the macrocyclization reaction mechanism and to develop strategies for the optimization of the desired processes. PMID:22666148
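A generic version of such a competition between unimolecular cyclization and bimolecular oligomerization can be integrated numerically as sketched below; the species, rate constants and the simplistic representation of the anion-template effect (a larger cyclization rate constant) are hypothetical and are not the specific kinetic models developed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic competition between unimolecular macrocyclization and bimolecular
# oligomerization of an open-chain intermediate I; all rate constants are hypothetical.
k_couple, k_oligo = 0.5, 0.2        # L mol^-1 s^-1 (illustrative)

def rhs(t, y, k_cyc):
    P, D, I, M, O = y               # pseudopeptide, dihalide, intermediate, macrocycle, oligomer
    r_couple = k_couple * P * D
    r_cyc    = k_cyc * I
    r_oligo  = k_oligo * I * P
    return [-r_couple - r_oligo, -r_couple, r_couple - r_cyc - r_oligo, r_cyc, r_oligo]

y0 = [0.01, 0.01, 0.0, 0.0, 0.0]    # 10 mM reagents, high-dilution-type conditions
for k_cyc, label in [(1e-3, "no template"), (1e-2, "anion template")]:
    sol = solve_ivp(rhs, (0.0, 5e4), y0, args=(k_cyc,), rtol=1e-8)
    P, D, I, M, O = sol.y[:, -1]
    print(label, "macrocycle yield ~", M / y0[0])
```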
Making Green Cleaning Easy for Local School Boards
ERIC Educational Resources Information Center
Ashkin, Stephen
2012-01-01
Ten or even five years ago, it would have been a major undertaking for a school district to convert to a comprehensive green cleaning program. At that time there was little precedence and no "roadmaps" for doing so. Thus schools faced numerous challenges that included: (1) what defined a green cleaning product; (2) should a comprehensive program…
Writing for Learning to Improve Students' Comprehension at the College Level
ERIC Educational Resources Information Center
Alharbi, Fahad
2015-01-01
This literature review will illustrate how writing could improve students' comprehension. Writing is one of the most important skills that students need to master for college level work. Therefore, students should be prepared with these skills before moving to the college level because they are required to write numerous papers that tend to be…
ERIC Educational Resources Information Center
Oakhill, Jane; Yuill, Nicola; Garnham, Alan
2011-01-01
Working memory predicts children's reading comprehension but it is not clear whether this relation is due to a modality-specific or general working memory. This study, which investigated the relations between children's reading skills and working memory (WM) abilities in 3 modalities, extends previous work by including measures of both reading…
Storybook Read-Alouds to Enhance Students' Comprehension Skills in ESL Classrooms: A Case Study
ERIC Educational Resources Information Center
Omar, Ainon; Saufi, Maizatulliza Mohd.
2015-01-01
The effectiveness of using storybooks during read-alouds to develop children's comprehension skills as well as in understanding the story has been widely studied. The reading aloud strategy has also been proven through numerous researches to be the most highly recommended activity for encouraging language and literacy. The study identified the…
The Effects of Think-Aloud in a Collaborative Environment to Improve Comprehension of L2 Texts
ERIC Educational Resources Information Center
Seng, Goh Hock
2007-01-01
Numerous studies have shown that thinking aloud while reading can be an effective instructional technique in helping students improve their reading comprehension. However, most of the studies that examined the effects of think-aloud involve subjects reading individually and carried out in isolation away from the classroom context. Recently,…
IMC/RMC Network Professional Film Collection.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Special Education Instructional Materials Center.
The compilation is a comprehensive listing of films available from the centers in the Instructional Materials Centers/Regional Media Centers (IMC/RMC) Network. Each IMC/RMC location is given a numerical code in a preliminary listing. These numerical codes are used within the film listing, which is arranged alphabetically according to film titles,…
Cognitive correlates of performance in advanced mathematics.
Wei, Wei; Yuan, Hongbo; Chen, Chuansheng; Zhou, Xinlin
2012-03-01
Much research has been devoted to understanding cognitive correlates of elementary mathematics performance, but little such research has been done for advanced mathematics (e.g., modern algebra, statistics, and mathematical logic). To promote mathematical knowledge among college students, it is necessary to understand what factors (including cognitive factors) are important for acquiring advanced mathematics. We recruited 80 undergraduates from four universities in Beijing. The current study investigated the associations between students' performance on a test of advanced mathematics and a battery of 17 cognitive tasks on basic numerical processing, complex numerical processing, spatial abilities, language abilities, and general cognitive processing. The results showed that spatial abilities were significantly correlated with performance in advanced mathematics after controlling for other factors. In addition, certain language abilities (i.e., comprehension of words and sentences) also made unique contributions. In contrast, basic numerical processing and computation were generally not correlated with performance in advanced mathematics. Results suggest that spatial abilities and language comprehension, but not basic numerical processing, may play an important role in advanced mathematics. These results are discussed in terms of their theoretical significance and practical implications. ©2011 The British Psychological Society.
Comprehensive Numerical Simulation of Filling and Solidification of Steel Ingots
Pola, Annalisa; Gelfi, Marcello; La Vecchia, Giovina Marina
2016-01-01
In this paper, a complete three-dimensional numerical model of the mold filling and solidification of steel ingots is presented. The risk of powder entrapment and defect formation during filling is analyzed in detail, demonstrating the importance of using a comprehensive geometry, including the trumpet and runner, compared to conventional simplified models. Using a case study, it is shown that the simplified model significantly underestimates the sources of defects, reducing the utility of simulations in supporting mold and process design. An experimental test was also performed on an instrumented mold, and the measurements were compared to the calculation results. The good agreement between calculation and trial allowed the simulation to be validated. PMID:28773890
2009-01-01
Background: The maintenance of internal pH in bacterial cells is challenged by natural stress conditions, during host infection, and in biotechnological production processes. Comprehensive transcriptomic and proteomic analyses have been conducted in several bacterial model systems, yet questions remain as to the mechanisms of pH homeostasis. Results: Here we present a comprehensive analysis of pH homeostasis in C. glutamicum, a bacterium of industrial importance. At pH values between 6 and 9, effective maintenance of the internal pH at 7.5 ± 0.5 pH units was found. Differential mRNA patterns were identified by DNA microarray analyses. The expression profiles were validated and extended by 1D-LC-ESI-MS/MS-based quantification of soluble and membrane proteins. The regulators involved were identified, revealing the participation of numerous signaling modules in the pH response. The functional analysis revealed for the first time the occurrence of oxidative stress in C. glutamicum cells at neutral and low pH conditions, accompanied by activation of the iron starvation response. Intracellular metabolite pool analysis unraveled inhibition of the TCA cycle and other pathways at low pH. Methionine and cysteine synthesis were found to be activated via the McbR regulator, cysteine accumulation was observed, and the addition of cysteine was shown to be toxic under acidic conditions. Conclusions: Novel limitations for C. glutamicum at non-optimal pH values were identified by a comprehensive analysis at the level of the transcriptome, proteome, and metabolome, indicating a functional link between pH acclimatization, oxidative stress, iron homeostasis, and metabolic alterations. The results offer new insights into bacterial stress physiology and new starting points for bacterial strain design or pathogen defense. PMID:20025733
Evaluation of the chondral modeling theory using fe-simulation and numeric shape optimization
Plochocki, Jeffrey H; Ward, Carol V; Smith, Douglas E
2009-01-01
The chondral modeling theory proposes that hydrostatic pressure within articular cartilage regulates joint size, shape, and congruence through regional variations in rates of tissue proliferation. The purpose of this study is to develop a computational model using a nonlinear two-dimensional finite element analysis in conjunction with numeric shape optimization to evaluate the chondral modeling theory. The model employed in this analysis is generated from an MR image of the medial portion of the tibiofemoral joint in a subadult male. Stress-regulated morphological changes are simulated until skeletal maturity and evaluated against the chondral modeling theory. The computed results are found to support the chondral modeling theory. The shape-optimized model exhibits increased joint congruence, broader stress distributions in articular cartilage, and a relative decrease in joint diameter. The results for the computational model correspond well with experimental data and provide valuable insights into the mechanical determinants of joint growth. The model also provides a crucial first step toward developing a comprehensive model that can be employed to test the influence of mechanical variables on joint conformation. PMID:19438771
Ashab, A.S.M. Ayman; Ruan, Dong; Lu, Guoxing; Bhuiyan, Arafat A.
2016-01-01
The mechanical behavior of aluminum hexagonal honeycombs subjected to out-of-plane dynamic indentation and compression loads has been investigated numerically using ANSYS/LS-DYNA in this paper. The finite element (FE) models have been verified by previous experimental results in terms of deformation pattern, stress-strain curve, and energy dissipation. The verified FE models have then been used in comprehensive numerical analysis of different aluminum honeycombs. Plateau stress, σpl, and dissipated energy (EI for indentation and EC for compression) have been calculated at different strain rates ranging from 10² to 10⁴ s⁻¹. The effects of strain rate and t/l ratio on the plateau stress, dissipated energy, and tearing energy have been discussed. An empirical formula is proposed to describe the relationship between the tearing energy per unit fracture area, relative density, and strain rate for honeycombs. Moreover, it has been found that a generic formula can be used to describe the relationship between tearing energy per unit fracture area and relative density for both aluminum honeycombs and foams. PMID:28773288
Numerical Investigation of the Turbulent Wind Flow Through Elevated Windbreak
NASA Astrophysics Data System (ADS)
Agarwal, Ashish; Irtaza, Hassan
2018-06-01
Analysis of the airflow through elevated windbreaks is presented in this paper. Permeable nets and impermeable films attract considerable wind forces on windbreaks, which are therefore susceptible to damage during high winds. A comprehensive numerical investigation has been carried out to analyze the effects of wind on a standalone elevated windbreak clad with various permeable nets and an impermeable film. The variation of the airflow behavior around and through the permeable nets and around the impermeable film was also investigated. Computational fluid dynamics techniques based on the Reynolds-averaged Navier-Stokes equations have been used to predict the wind force coefficients, and thus the wind forces, on panels supporting permeable nets and an impermeable film under turbulent wind flow. Elevated windbreak panels were analyzed for seven different permeable nets having various solidity ratios, specific permeabilities and aerodynamic resistance coefficients. The permeable nets were modelled as porous jump media obeying Forchheimer's law, and the impermeable film was modelled as a rigid wall.
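The porous-jump treatment mentioned above amounts to a Darcy-Forchheimer pressure drop imposed across a thin face. A minimal sketch of that relation is given below; the permeability, inertial resistance factor and net thickness are hypothetical values, not the coefficients fitted for the seven nets in the study.

```python
# Darcy-Forchheimer "porous jump" pressure drop across a thin net, of the form used
# to model permeable cladding in CFD codes; all coefficient values are illustrative.
mu, rho = 1.8e-5, 1.225          # air viscosity (Pa s) and density (kg/m^3)
alpha   = 1.0e-8                 # face permeability, m^2 (hypothetical for a given net)
C2      = 2.0e4                  # inertial resistance factor, 1/m (hypothetical)
dm      = 1.0e-3                 # net thickness, m

def pressure_drop(v):
    # viscous (Darcy) term plus inertial (Forchheimer) term, integrated over thickness
    return (mu / alpha * v + C2 * 0.5 * rho * v**2) * dm

for v in (2.0, 5.0, 10.0):       # normal velocity through the net, m/s
    print(v, "m/s ->", round(pressure_drop(v), 1), "Pa")
```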
Analysis of the thermal performance of heat pipe radiators
NASA Technical Reports Server (NTRS)
Boo, J. H.; Hartley, J. G.
1990-01-01
A comprehensive mathematical model and computational methodology are presented to obtain numerical solutions for the transient behavior of a heat pipe radiator in a space environment. The modeling is focused on a typical radiator panel having a long heat pipe at the center and two extended surfaces attached to opposing sides of the heat pipe shell in the condenser section. In the set of governing equations developed for the model, each region of the heat pipe - shell, liquid, and vapor - is thermally lumped to the extent possible, while the fin is lumped only in the direction normal to its surface. Convection is considered to be the only significant heat transfer mode in the vapor, and the evaporation and condensation velocity at the liquid-vapor interface is calculated from kinetic theory. A finite-difference numerical technique is used to predict the transient behavior of the entire radiator in response to changing loads.
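As a much-reduced illustration of the lumped approach, the sketch below marches a single radiating fin node in time with an explicit finite-difference update; the thermal properties, heat load and sink temperature are illustrative, and the full model couples several such lumped regions (shell, liquid, vapor) with a distributed fin.

```python
# Lumped-capacitance sketch of one radiator fin node rejecting heat to space by
# radiation; properties and the imposed heat load are illustrative only.
sigma = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
m_c   = 2.0 * 900.0       # mass (kg) x specific heat (J/kg-K) of the fin node
A, eps = 0.5, 0.85        # radiating area (m^2) and surface emissivity
T_env  = 4.0              # deep-space sink temperature, K

def simulate(Q_in, T0=290.0, dt=1.0, t_end=20000.0):
    T = T0
    for _ in range(int(t_end / dt)):          # explicit Euler time marching
        T += dt * (Q_in - eps * sigma * A * (T**4 - T_env**4)) / m_c
    return T

print(simulate(Q_in=100.0))   # approaches the radiative-balance temperature (Q/(eps*sigma*A))**0.25
```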
Theory and design of variable conductance heat pipes
NASA Technical Reports Server (NTRS)
Marcus, B. D.
1972-01-01
A comprehensive review and analysis of all aspects of heat pipe technology pertinent to the design of self-controlled, variable conductance devices for spacecraft thermal control is presented. Subjects considered include hydrostatics, hydrodynamics, heat transfer into and out of the pipe, fluid selection, materials compatibility and variable conductance control techniques. The report includes a selected bibliography of pertinent literature, analytical formulations of various models and theories describing variable conductance heat pipe behavior, and the results of numerous experiments on the steady state and transient performance of gas controlled variable conductance heat pipes. Also included is a discussion of VCHP design techniques.
Ball Bearing Analysis with the ORBIS Tool
NASA Technical Reports Server (NTRS)
Halpin, Jacob D.
2016-01-01
Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code closely correlates to predictions on bearing internal load distributions, stiffness, deflection and stresses.
SCISEAL: A CFD code for analysis of fluid dynamic forces in seals
NASA Technical Reports Server (NTRS)
Athavale, Mahesh; Przekwas, Andrzej
1994-01-01
A viewgraph presentation is made of the objectives, capabilities, and test results of the computer code SCISEAL. Currently, the seal code has: a finite volume, pressure-based integration scheme; colocated variables with strong conservation approach; high-order spatial differencing, up to third-order; up to second-order temporal differencing; a comprehensive set of boundary conditions; a variety of turbulence models and surface roughness treatment; moving grid formulation for arbitrary rotor whirl; rotor dynamic coefficients calculated by the circular whirl and numerical shaker methods; and small perturbation capabilities to handle centered and eccentric seals.
NASA Astrophysics Data System (ADS)
Wang, Yunong; Cheng, Rongjun; Ge, Hongxia
2017-08-01
In this paper, a lattice hydrodynamic model is derived considering not only the effect of the flow rate difference but also a delayed feedback control signal that includes more comprehensive information. The control method is used to analyze the stability of the model. Furthermore, the critical condition for linear stability of the steady traffic flow is deduced, and numerical simulations are carried out to investigate the advantages of the proposed model with and without the effect of the flow rate difference and the control signal. The results are consistent with the theoretical analysis.
NASA Astrophysics Data System (ADS)
Andreev, Vladimir
2018-03-01
The paper deals with the problem of determining the stress state of a pressure vessel (PV), taking into account the temperature-induced inhomogeneity of the concrete. Such structures are widely used in heat power engineering, for example in nuclear power engineering. The structures of such buildings are quite complex, and a comprehensive analysis of the stress state in them can be carried out either by numerical or experimental methods. However, a number of fundamental questions can be solved on the basis of simplified models, in particular, studies of the effect of the inhomogeneity caused by the temperature field on the stress state.
Simulation of non-Newtonian oil-water core annular flow through return bends
NASA Astrophysics Data System (ADS)
Jiang, Fan; Wang, Ke; Skote, Martin; Wong, Teck Neng; Duan, Fei
2018-01-01
The volume of fluid (VOF) model is used together with the continuum surface force (CSF) model to numerically simulate the non-Newtonian oil-water core annular flow across return bends. A comprehensive study is conducted to generate the profiles of pressure, velocity, volume fraction and wall shear stress for different oil properties, flow directions, and bend geometries. It is revealed that the oil core may adhere to the bend wall under certain operating conditions. Through the analysis of the total pressure gradient and fouling angle, suitable bend geometric parameters are identified for avoiding the risk of fouling.
Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli
van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.
2016-01-01
Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
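The counting step itself reduces to labeling connected bright objects in a 3-D image stack. The sketch below does this on a synthetic volume with SciPy; the noise level, threshold, smoothing and voxel size are illustrative stand-ins for real confocal data and the validated pipeline described in the paper.

```python
import numpy as np
from scipy import ndimage

# Toy version of the counting step: threshold a 3-D fluorescence stack and count
# connected nuclear objects. The synthetic volume stands in for real confocal data.
rng = np.random.default_rng(1)
volume = rng.normal(0.0, 0.05, size=(60, 128, 128))      # background noise
for _ in range(40):                                       # add 40 bright "podocyte nuclei"
    z, y, x = rng.integers(5, 55), rng.integers(10, 118), rng.integers(10, 118)
    volume[z-2:z+3, y-3:y+4, x-3:x+4] += 1.0

binary = ndimage.gaussian_filter(volume, sigma=1.0) > 0.5
labels, n_objects = ndimage.label(binary)
print("podocyte count:", n_objects)

# Relative depletion index: podocytes per unit volume (voxel size is hypothetical)
voxel_volume_um3 = 0.5 * 0.5 * 1.0
total_volume = binary.size * voxel_volume_um3             # crude stand-in for glomerular volume
print("density per 1000 um^3:", 1000.0 * n_objects / total_volume)
```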
Berretta, Massimiliano; Micek, Agnieszka; Lafranconi, Alessandra; Rossetti, Sabrina; Di Francia, Raffaele; De Paoli, Paolo; Rossi, Paola; Facchini, Gaetano
2018-04-17
Coffee consumption has been associated with numerous cancers, but evidence on ovarian cancer risk is controversial. Therefore, we performed a meta-analysis on prospective cohort studies in order to review the evidence on coffee consumption and risk of ovarian cancer. Studies were identified through searching the PubMed and MEDLINE databases up to March 2017. Risk estimates were retrieved from the studies, and dose-response analysis was modelled by using restricted cubic splines. Additionally, a stratified analysis by menopausal status was performed. A total of 8 studies were eligible for the dose-response meta-analysis. Studies included in the analysis comprised 787,076 participants and 3,541 ovarian cancer cases. The results showed that coffee intake was not associated with ovarian cancer risk (RR = 1.06, 95% CI: 0.89, 1.26). Stratified and subgroup analyses showed consistent results. This comprehensive meta-analysis did not find evidence of an association between the consumption of coffee and risk of ovarian cancer.
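As background on the pooling step of such a meta-analysis, the sketch below implements a standard DerSimonian-Laird random-effects combination of log relative risks in Python. It is not the restricted-cubic-spline dose-response model of the paper, and the study-level numbers are made up purely for illustration.

```python
import numpy as np

def dersimonian_laird(rr, ci_low, ci_high):
    """Pool relative risks with a DerSimonian-Laird random-effects model.

    Inputs are per-study RR point estimates and 95% CI bounds (illustrative).
    Returns the pooled RR and its 95% confidence interval.
    """
    y = np.log(rr)                                   # log relative risks
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                  # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe)**2)                    # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                    # between-study variance
    w_re = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)

# illustrative (made-up) study-level estimates, NOT the data of the meta-analysis
rr = np.array([0.95, 1.10, 1.20, 0.88, 1.05])
lo = np.array([0.70, 0.85, 0.90, 0.60, 0.80])
hi = np.array([1.29, 1.42, 1.60, 1.29, 1.38])
print(dersimonian_laird(rr, lo, hi))
```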
Numeracy Skills in Patients With Degenerative Disorders and Focal Brain Lesions
Cappelletti, Marinella; Butterworth, Brian; Kopelman, Michael
2012-01-01
Objective: To characterize the numerical profile of patients with acquired brain disorders. Method: We investigated numeracy skills in 76 participants—40 healthy controls and 36 patients with neurodegenerative disorders (Alzheimer dementia, frontotemporal dementia, semantic dementia, progressive aphasia) and with focal brain lesions affecting parietal, frontal, and temporal areas, as in herpes simplex encephalitis (HSE). All patients were tested with the same comprehensive battery of paper-and-pencil and computerized tasks assessing numerical abilities and calculation. Degenerative and HSE patients also performed nonnumerical semantic tasks. Results: Our results, based on nonparametric group statistics as well as on the analysis of individual patients, all highly significant, show that: (a) all patients, including those with parietal lesions—a key brain area for numeracy processing—had intact processing of number quantity; (b) patients with impaired semantic knowledge had much better preserved numerical knowledge; and (c) most patients showed impaired calculation skills, with the exception of most semantic dementia and HSE patients. Conclusion: Our results allow us, for the first time, to characterize the numeracy skills of patients with a variety of neurological conditions and to suggest that the pattern of numerical performance can vary considerably across different neurological populations. Moreover, the selective sparing of calculation skills in most semantic dementia and HSE patients suggests that numerical abilities are an independent component of the semantic system. Finally, our data suggest that, besides the parietal areas, other brain regions might be critical to the understanding and processing of numerical concepts. PMID:22122516
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee
2015-09-01
This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.
ERIC Educational Resources Information Center
Bjork, Isabel Maria; Bowyer-Crane, Claudine
2013-01-01
This study investigates the relationship between skills that underpin mathematical word problems and those that underpin numerical operations, such as addition, subtraction, division and multiplication. Sixty children aged 6-7 years were tested on measures of mathematical ability, reading accuracy, reading comprehension, verbal intelligence and…
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M
2006-10-13
Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Vibration Signature Analysis of a Faulted Gear Transmission System
NASA Technical Reports Server (NTRS)
Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.
1994-01-01
A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time-synchronous-averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time-averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques, including time domain and frequency domain analysis methods. Using photographs of the gear tooth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
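To illustrate the kind of time-frequency computation involved, the following is a textbook sketch of a discrete (pseudo) Wigner-Ville distribution built from the analytic signal. It is not the exact processing chain applied to the time-synchronous-averaged rig data, and the chirp test signal is illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution of a real signal.

    Returns a (len(x), len(x)) array: rows are time samples, columns are
    frequency bins. A textbook sketch, not the processing of the cited study.
    """
    z = hilbert(x)                       # analytic signal reduces cross-terms
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)
        tau = np.arange(-taumax, taumax + 1)
        r = np.zeros(N, dtype=complex)
        r[tau % N] = z[n + tau] * np.conj(z[n - tau])   # instantaneous autocorrelation
        W[n] = np.real(np.fft.fft(r))
    return W

# toy example: a chirp whose instantaneous frequency rises with time
t = np.linspace(0, 1, 256, endpoint=False)
sig = np.cos(2 * np.pi * (20 * t + 40 * t**2))
W = wigner_ville(sig)
print(W.shape, W.max())
```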
MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.
Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming
2016-01-01
High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software tool named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).
Measuring the Lense-Thirring precession using a second Lageos satellite
NASA Technical Reports Server (NTRS)
Tapley, B. D.; Ciufolini, I.
1989-01-01
A complete numerical simulation and error analysis was performed for the proposed experiment with the objective of establishing an accurate assessment of the feasibility and potential accuracy of a measurement of the Lense-Thirring precession. Consideration was given to identifying the error sources that limit the accuracy of the experiment and to proposing procedures for eliminating or reducing the effect of these errors. Analytic investigations were conducted to study the effects of major error sources with the objective of providing error bounds on the experiment. The analysis of realistic simulated data is used to demonstrate that satellite laser ranging of two Lageos satellites, orbiting with supplementary inclinations, collected for a period of 3 years or more, can be used to verify the Lense-Thirring precession. A comprehensive covariance analysis for the solution was also developed.
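For orientation, the size of the effect being sought can be checked with the standard Lense-Thirring nodal precession formula; the sketch below uses approximate LAGEOS-like orbital elements (assumed values, not the simulation inputs of the study) and recovers the familiar figure of roughly 31 milliarcseconds per year.

```python
import numpy as np

# Lense-Thirring nodal precession rate for a satellite orbit:
#   dOmega/dt = 2*G*J / (c^2 * a^3 * (1 - e^2)^(3/2))
# The orbital elements below are approximate LAGEOS-like values (assumptions),
# used only to illustrate the magnitude of the effect.
G = 6.674e-11            # m^3 kg^-1 s^-2
c = 2.998e8              # m/s
J_earth = 5.86e33        # Earth's spin angular momentum, kg m^2/s (approximate)
a = 12_270e3             # semi-major axis, m
e = 0.0045               # eccentricity

omega_dot = 2 * G * J_earth / (c**2 * a**3 * (1 - e**2) ** 1.5)   # rad/s
mas_per_year = omega_dot * 3.156e7 * (180 / np.pi) * 3600e3       # rad/s -> mas/yr
print(f"Lense-Thirring nodal precession: {mas_per_year:.1f} mas/yr")
```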
A systems biology analysis of autophagy in cancer therapy.
Shi, Zheng; Li, Chun-yang; Zhao, Si; Yu, Yang; An, Na; Liu, Yong-xi; Wu, Chuan-fang; Yue, Bi-song; Bao, Jin-ku
2013-09-01
Autophagy, which degrades redundant or damaged cellular constituents, is intricately relevant to a variety of human diseases, most notably cancer. Autophagy exerts distinct effects on cancer initiation and progression, due to the intrinsic overlapping of autophagic and cancer signalling pathways. However, due to the complexity of cancer as a systemic disease, the fate of cancer cells is not decided by any one signalling pathway. Numerous autophagic inter-connectivity and cross-talk pathways need to be further clarified at a systems level. In this review, we propose a systems biology perspective for the comprehensive analysis of the autophagy-cancer network, focusing on systems biology analysis in autophagy and cancer therapy. Together, these analyses may not only improve our understanding on autophagy-cancer relationships, but also facilitate cancer drug discovery. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Hollunder, Jens; Friedel, Maik; Kuiper, Martin; Wilhelm, Thomas
2010-04-01
Many large 'omics' datasets have been published and many more are expected in the near future. New analysis methods are needed for best exploitation. We have developed a graphical user interface (GUI) for easy data analysis. Our discovery of all significant substructures (DASS) approach elucidates the underlying modularity, a typical feature of complex biological data. It is related to biclustering and other data mining approaches. Importantly, DASS-GUI also allows handling of multi-sets and calculation of statistical significances. DASS-GUI contains tools for further analysis of the identified patterns: analysis of the pattern hierarchy, enrichment analysis, module validation, analysis of additional numerical data, easy handling of synonymous names, clustering, filtering and merging. Different export options allow easy usage of additional tools such as Cytoscape. Source code, pre-compiled binaries for different systems, a comprehensive tutorial, case studies and many additional datasets are freely available at http://www.ifr.ac.uk/dass/gui/. DASS-GUI is implemented in Qt.
Judd, Charles M; Westfall, Jacob; Kenny, David A
2012-07-01
Throughout social and cognitive psychology, participants are routinely asked to respond in some way to experimental stimuli that are thought to represent categories of theoretical interest. For instance, in measures of implicit attitudes, participants are primed with pictures of specific African American and White stimulus persons sampled in some way from possible stimuli that might have been used. Yet seldom is the sampling of stimuli taken into account in the analysis of the resulting data, in spite of numerous warnings about the perils of ignoring stimulus variation (Clark, 1973; Kenny, 1985; Wells & Windschitl, 1999). Part of this failure to attend to stimulus variation is due to the demands imposed by traditional analysis of variance procedures for the analysis of data when both participants and stimuli are treated as random factors. In this article, we present a comprehensive solution using mixed models for the analysis of data with crossed random factors (e.g., participants and stimuli). We show the substantial biases inherent in analyses that ignore one or the other of the random factors, and we illustrate the substantial advantages of the mixed models approach with both hypothetical and actual, well-known data sets in social psychology (Bem, 2011; Blair, Chapleau, & Judd, 2005; Correll, Park, Judd, & Wittenbrink, 2002). PsycINFO Database Record (c) 2012 APA, all rights reserved
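A hedged sketch of fitting crossed random effects for participants and stimuli on simulated data is given below, using the variance-components interface of Python's statsmodels with the whole dataset treated as a single group. The paper itself is not tied to this software, and the variable names and simulated effect sizes are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data in which every participant responds to every stimulus, and
# both participants and stimuli contribute random variation. Crossed random
# effects are approximated with statsmodels' variance components by treating
# the data as one group; this mirrors the mixed-model logic discussed in the
# paper but is not the authors' own code, and all effect sizes are arbitrary.
rng = np.random.default_rng(1)
n_part, n_stim = 30, 20
part = np.repeat(np.arange(n_part), n_stim)
stim = np.tile(np.arange(n_stim), n_part)
cond = (stim < n_stim // 2).astype(float)            # stimulus-level condition
y = (0.5 * cond
     + rng.normal(0, 0.8, n_part)[part]              # participant random effect
     + rng.normal(0, 0.6, n_stim)[stim]              # stimulus random effect
     + rng.normal(0, 1.0, n_part * n_stim))          # residual noise
df = pd.DataFrame({"y": y, "cond": cond, "part": part, "stim": stim, "one": 1})

vc = {"part": "0 + C(part)", "stim": "0 + C(stim)"}  # crossed variance components
model = smf.mixedlm("y ~ cond", df, groups="one", vc_formula=vc, re_formula="0")
print(model.fit(method="lbfgs").summary())
```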
Numerical modeling of bubble dynamics in viscoelastic media with relaxation
NASA Astrophysics Data System (ADS)
Warnez, M. T.; Johnsen, E.
2015-06-01
Cavitation occurs in a variety of non-Newtonian fluids and viscoelastic materials. The large-amplitude volumetric oscillations of cavitation bubbles give rise to high temperatures and pressures at collapse, as well as induce large and rapid deformation of the surroundings. In this work, we develop a comprehensive numerical framework for spherical bubble dynamics in isotropic media obeying a wide range of viscoelastic constitutive relationships. Our numerical approach solves the compressible Keller-Miksis equation with full thermal effects (inside and outside the bubble) when coupled to a highly generalized constitutive relationship (which allows Newtonian, Kelvin-Voigt, Zener, linear Maxwell, upper-convected Maxwell, Jeffreys, Oldroyd-B, Giesekus, and Phan-Thien-Tanner models). For the latter two models, partial differential equations (PDEs) must be solved in the surrounding medium; for the remaining models, we show that the PDEs can be reduced to ordinary differential equations. To solve the general constitutive PDEs, we present a Chebyshev spectral collocation method, which is robust even for violent collapse. Combining this numerical approach with theoretical analysis, we simulate bubble dynamics in various viscoelastic media to determine the impact of relaxation time, a constitutive parameter, on the associated physics. Relaxation time is found to increase bubble growth and permit rebounds driven purely by residual stresses in the surroundings. Different regimes of oscillations occur depending on the relaxation time.
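As a point of reference for the kind of ODE integration underlying such simulations, the sketch below integrates the much simpler incompressible Rayleigh-Plesset equation for a gas bubble in a Newtonian liquid with scipy. It is not the compressible Keller-Miksis viscoelastic framework of the paper, and all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal Rayleigh-Plesset sketch for a gas bubble in a Newtonian liquid driven
# by an acoustic pressure field. The paper's framework (compressible
# Keller-Miksis with thermal effects and viscoelastic constitutive PDEs) is far
# more general; this only illustrates the type of ODE being integrated.
rho, mu, sigma = 1000.0, 1.0e-3, 0.072      # water: density, viscosity, surface tension
p0, kappa = 101325.0, 1.4                   # ambient pressure, polytropic index
R0 = 10e-6                                  # equilibrium radius, m
pa, f = 0.8e5, 100e3                        # drive amplitude (Pa) and frequency (Hz)

def rhs(t, y):
    R, Rdot = y
    p_gas = (p0 + 2 * sigma / R0) * (R0 / R) ** (3 * kappa)
    p_inf = p0 + pa * np.sin(2 * np.pi * f * t)
    Rddot = ((p_gas - p_inf - 2 * sigma / R - 4 * mu * Rdot / R) / rho
             - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rhs, (0, 5 / f), [R0, 0.0], rtol=1e-8, atol=1e-12, max_step=1e-8)
print("max radius / R0:", sol.y[0].max() / R0)
```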
A novel approach to the analysis of squeezed-film air damping in microelectromechanical systems
NASA Astrophysics Data System (ADS)
Yang, Weilin; Li, Hongxia; Chatterjee, Aveek N.; Elfadel, Ibrahim (Abe M.; Ender Ocak, Ilker; Zhang, TieJun
2017-01-01
Squeezed-film damping (SFD) is a phenomenon that significantly affects the performance of micro-electro-mechanical systems (MEMS). The total damping force in MEMS mainly comprises the viscous and elastic damping forces. The quality factor (Q factor) is usually used to evaluate damping in MEMS. In this work, we measure the Q factor of a resonator through experiments over a wide range of pressure levels. In practice, experimental characterization of MEMS has limitations: it is difficult to conduct experiments at very high vacuum, and it is hard to differentiate the damping mechanisms from the overall Q factor measurements. On the other hand, classical theoretical analysis of SFD is restricted to strong assumptions and simple geometries. In this paper, a novel numerical approach based on lattice Boltzmann simulations is proposed to investigate SFD in MEMS. Our method considers the dynamics of the squeezed air flow as well as fluid-solid interactions in MEMS. It is demonstrated that the Q factor can be directly predicted by numerical simulation, and our simulation results agree well with experimental data. Factors that influence SFD, such as pressure, oscillation amplitude, and driving frequency, are investigated separately. Furthermore, the viscous and elastic damping forces are quantitatively compared based on comprehensive simulations. The proposed numerical approach, together with the experimental characterization, reveals the underlying physics of squeezed-film air damping in MEMS.
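A small illustration of how a Q factor is typically extracted from a measured or simulated frequency sweep (half-power bandwidth) is sketched below. This is generic post-processing, not the paper's lattice Boltzmann model, and the Lorentzian test data are synthetic.

```python
import numpy as np

def q_from_sweep(freq, amp):
    """Estimate the quality factor Q = f0 / (half-power bandwidth) from a
    frequency sweep of vibration amplitude. Illustrative post-processing,
    not the lattice Boltzmann simulation of the paper."""
    i0 = np.argmax(amp)
    half = amp[i0] / np.sqrt(2.0)
    left = freq[:i0][amp[:i0] >= half][0]     # first frequency above half power
    right = freq[i0:][amp[i0:] >= half][-1]   # last frequency above half power
    return freq[i0] / (right - left)

# synthetic resonance with f0 = 20 kHz and a true Q of 150
f = np.linspace(15e3, 25e3, 4001)
f0, Q_true = 20e3, 150.0
amp = 1.0 / np.sqrt((1 - (f / f0) ** 2) ** 2 + (f / (Q_true * f0)) ** 2)
print("estimated Q:", round(q_from_sweep(f, amp), 1))
```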
On-the-fly Numerical Surface Integration for Finite-Difference Poisson-Boltzmann Methods.
Cai, Qin; Ye, Xiang; Wang, Jun; Luo, Ray
2011-11-01
Most implicit solvation models require the definition of a molecular surface as the interface that separates the solute in atomic detail from the solvent approximated as a continuous medium. Commonly used surface definitions include the solvent accessible surface (SAS), the solvent excluded surface (SES), and the van der Waals surface. In this study, we present an efficient numerical algorithm to compute the SES and SAS areas to facilitate the applications of finite-difference Poisson-Boltzmann methods in biomolecular simulations. Different from previous numerical approaches, our algorithm is physics-inspired and intimately coupled to the finite-difference Poisson-Boltzmann methods to fully take advantage of its existing data structures. Our analysis shows that the algorithm can achieve very good agreement with the analytical method in the calculation of the SES and SAS areas. Specifically, in our comprehensive test of 1,555 molecules, the average unsigned relative error is 0.27% in the SES area calculations and 1.05% in the SAS area calculations at the grid spacing of 1/2Å. In addition, a systematic correction analysis can be used to improve the accuracy for the coarse-grid SES area calculations, with the average unsigned relative error in the SES areas reduced to 0.13%. These validation studies indicate that the proposed algorithm can be applied to biomolecules over a broad range of sizes and structures. Finally, the numerical algorithm can also be adapted to evaluate the surface integral of either a vector field or a scalar field defined on the molecular surface for additional solvation energetics and force calculations.
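For comparison, the classic Shrake-Rupley point-sampling estimate of the solvent-accessible surface area is sketched below. It is a reference-style numerical method, not the grid-coupled algorithm proposed in the paper, and the two-atom test case is illustrative.

```python
import numpy as np

def sasa_shrake_rupley(coords, radii, probe=1.4, n_points=960):
    """Solvent-accessible surface area by Shrake-Rupley point sampling.

    Offered only as a classic reference method; the paper's algorithm is
    instead coupled to the finite-difference Poisson-Boltzmann grid.
    """
    # quasi-uniform points on a unit sphere (Fibonacci lattice)
    i = np.arange(n_points) + 0.5
    phi = np.arccos(1 - 2 * i / n_points)
    theta = np.pi * (1 + 5 ** 0.5) * i
    sphere = np.c_[np.cos(theta) * np.sin(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(phi)]

    expanded = radii + probe
    total = 0.0
    for k, (c, r) in enumerate(zip(coords, expanded)):
        pts = c + r * sphere
        buried = np.zeros(n_points, dtype=bool)
        for j, (cj, rj) in enumerate(zip(coords, expanded)):
            if j == k:
                continue
            buried |= np.sum((pts - cj) ** 2, axis=1) < rj ** 2
        total += (1.0 - buried.mean()) * 4 * np.pi * r ** 2
    return total

# two overlapping "atoms" of radius 1.7 A (illustrative, not a real molecule)
coords = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
radii = np.array([1.7, 1.7])
print(f"SAS area: {sasa_shrake_rupley(coords, radii):.1f} A^2")
```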
Nizio, Katie D; Harynuk, James J
2012-08-24
Alkyl phosphate-based gellants, used as viscosity builders for fracturing fluids in the process of hydraulic fracturing, have been implicated in numerous refinery-fouling incidents in North America. In response, industry developed an inductively coupled plasma optical emission spectroscopy (ICP-OES) based method for the analysis of total volatile phosphorus in distillate fractions of crude oil; however, this method is plagued by poor precision and a high limit of detection (0.5±1 μg phosphorus mL⁻¹). Furthermore, this method cannot provide speciation information, which is critical for developing an understanding of the challenge of alkyl phosphates at a molecular level. An approach using comprehensive two-dimensional gas chromatography with nitrogen phosphorus detection (GC×GC-NPD) and post-column Deans switching is presented. This method provides qualitative and quantitative profiles of alkyl phosphates in industrial petroleum samples with increased precision and at levels comparable to or below those achievable by ICP-OES. A recovery study in a fracturing fluid sample and a profiling study of alkyl phosphates in four recovered fracturing fluid/crude oil mixtures (flowback) are also presented. Copyright © 2012 Elsevier B.V. All rights reserved.
Quantifying the Variability in Species' Vulnerability to Ocean Acidification
NASA Astrophysics Data System (ADS)
Kroeker, K. J.; Kordas, R. L.; Crim, R.; Gattuso, J.; Hendriks, I.; Singh, G. G.
2012-12-01
Ocean acidification represents a threat to marine species and ecosystems worldwide. As such, understanding the potential ecological impacts of acidification is a high priority for science, management, and policy. As research on the biological impacts of ocean acidification continues to expand at an exponential rate, a comprehensive understanding of the generalities and/or variability in organisms' responses and the corresponding levels of certainty of these potential responses is essential. Meta-analysis is a quantitative technique for summarizing the results of primary research studies and provides a transparent method to examine the generalities and/or variability in scientific results across numerous studies. Here, we perform the most comprehensive meta-analysis to date by synthesizing the results of 228 studies examining the biological impacts of ocean acidification. Our results reveal decreased survival, calcification, growth, reproduction and development in response to acidification across a broad range of marine organisms, as well as significant trait-mediated variation among taxonomic groups and enhanced sensitivity among early life history stages. In addition, our results reveal a pronounced sensitivity of molluscs to acidification, especially among the larval stages, and enhanced vulnerability to acidification with concurrent exposure to increased seawater temperatures across a diversity of organisms.
ERIC Educational Resources Information Center
Kaplin, William A.; Lee, Barbara A.
This volume is the third edition of a comprehensive treatment of higher education law and is current through approximately August 1994. The nine chapters are divided into numerous sections and subsections. Chapter 1 provides a framework for understanding and integrating what is presented in subsequent chapters and a perspective for future…
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM
2006-01-01
Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis. PMID:17038197
Thought-action fusion: a comprehensive analysis using structural equation modeling.
Marino, Teresa L; Lunt, Rachael A; Negy, Charles
2008-07-01
Thought-action fusion (TAF), the phenomenon whereby one has difficulty separating cognitions from corresponding behaviors, has implications in a wide variety of disturbances, including eating disorders, obsessive-compulsive disorder, generalized anxiety disorder, and panic disorder. Numerous constructs believed to contribute to the etiology or maintenance of TAF have been identified in the literature, but to date, no study has empirically integrated these findings into a comprehensive model. In this study, we examined simultaneously an array of variables thought to be related to TAF, and subsequently developed a model that elucidates the role of those variables that seem most involved in this phenomenon using a structural equation modeling approach. Results indicated that religiosity, as predicted by ethnic identity, was a significant predictor of TAF. Additionally, the relation between ethnic identity and TAF was partially mediated by an inflated sense of responsibility. Both TAF and obsessive-compulsive symptoms were found to be significant predictors of engagement in neutralization activities. Clinical and theoretical implications are discussed.
NASA Astrophysics Data System (ADS)
Kumar, Ajay; Tripathi, M. M.; Chaujar, Rishu
2018-04-01
In this work, comprehensive analog and RF performance of a novel Black Phosphorus-Junctionless-Recessed Channel (BP-JL-RC) MOSFET has been explored at the 45 nm technology node (gate length = 20 nm). The integration of black phosphorus with the junctionless recessed channel MOSFET leads to a higher drain current of about 0.3 mA and an excellent switching ratio (of the order of 10¹¹) due to the reduced off-current, which also improves the sub-threshold slope (SS) to 67 mV/dec. Further, RF performance metrics have been studied with an aim to analyze high-frequency performance. The following FOMs have been evaluated: cut-off frequency (fT), maximum oscillation frequency (fMAX), Stern stability factor, various power gains, and parasitic capacitances in the THz frequency range. Thus, in addition to the high packing density offered by the RC MOSFET, the proposed design finds numerous applications at THz frequencies, making it a promising candidate at the wafer-scale integration level.
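As a reminder of how such RF figures of merit relate to small-signal parameters, the sketch below evaluates first-order expressions for fT and fMAX. The parameter values are placeholders for illustration, not values extracted from the paper's device simulations, and the fMAX expression is one common approximation among several in use.

```python
import numpy as np

# First-order RF figures of merit from small-signal parameters:
#   f_T   ~ gm / (2*pi*(Cgs + Cgd))
#   f_MAX ~ f_T / (2*sqrt(Rg*(gds + 2*pi*f_T*Cgd)))   (one common approximation)
# The numbers below are placeholders, not the BP-JL-RC MOSFET values of the paper.
gm  = 2.0e-3      # transconductance, S
cgs = 40e-18      # gate-source capacitance, F
cgd = 15e-18      # gate-drain capacitance, F
gds = 0.2e-3      # output conductance, S
rg  = 50.0        # gate resistance, ohm

f_t = gm / (2 * np.pi * (cgs + cgd))
f_max = f_t / (2 * np.sqrt(rg * (gds + 2 * np.pi * f_t * cgd)))
print(f"f_T   = {f_t / 1e12:.2f} THz")
print(f"f_MAX = {f_max / 1e12:.2f} THz")
```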
Development of preschool and academic skills in children born very preterm.
Aarnoudse-Moens, Cornelieke Sandrine Hanan; Oosterlaan, Jaap; Duivenvoorden, Hugo Joseph; van Goudoever, Johannes Bernard; Weisglas-Kuperus, Nynke
2011-01-01
To examine performance in preschool and academic skills in very preterm (gestational age ≤ 30 weeks) and term-born comparison children aged 4 to 12 years. Very preterm children (n = 200; mean age, 8.2 ± 2.5 years) born between 1996 and 2004 were compared with 230 term-born children (mean age, 8.3 ± 2.3 years). The Dutch National Pupil Monitoring System was used to measure preschool numerical reasoning and early linguistics, and primary school simple and complex word reading, reading comprehension, spelling, and mathematics/arithmetic. With univariate analyses of variance, we assessed the effects of preterm birth on performance across grades and on grade retention. In preschool, very preterm children performed comparably with term-born children in early linguistics, but performed more poorly (by 0.7 standard deviation [SD]) in numerical reasoning skills. In primary school, very preterm children scored 0.3 SD lower in complex word reading and 0.6 SD lower in mathematics/arithmetic, but performed comparably with peers in reading comprehension and spelling. They had a higher grade repeat rate (25.5%), although grade repetition did not improve their academic skills. Very preterm children do well in early linguistics, reading comprehension, and spelling, but have clinically significant deficits in numerical reasoning skills and mathematics/arithmetic, which persist over time. Copyright © 2011 Mosby, Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollias, Pavlos
2017-08-08
This is a multi-institutional, collaborative project using observations and modeling to study the evolution (e.g. formation and growth) of hydrometeors in continental convective clouds. Our contribution was in data analysis for the generation of high-value cloud and precipitation products and the derivation of cloud statistics for model validation. There are two areas in data analysis to which we contributed: i) the development of novel, state-of-the-art dual-wavelength radar algorithms for the retrieval of cloud microphysical properties and ii) the evaluation of large-domain, high-resolution models using comprehensive multi-sensor observations. Our research group developed statistical summaries from numerous sensors and developed retrievals of vertical air motion in deep convection.
Simulation analysis of an integrated model for dynamic cellular manufacturing system
NASA Astrophysics Data System (ADS)
Hao, Chunfeng; Luan, Shichao; Kong, Jili
2017-05-01
Application of a dynamic cellular manufacturing system (DCMS) is a well-known strategy to improve manufacturing efficiency in production environments with high variety and low volume. Often, neither the trade-off of inter- and intra-cell material movements nor the trade-off of hiring and firing of operators is examined in detail. This paper presents simulation results of an integrated mixed-integer model, including a sensitivity analysis for several numerical examples. The comprehensive model includes cell formation, inter- and intra-cell material handling, inventory and backorder holding, operator assignment (including resource adjustment), and flexible production routing. The model considers multi-period production planning with flexible resources (machines and operators), where each period has different demands. The results verify the validity and sensitivity of the proposed model using a genetic algorithm.
Statistical analysis of trypanosomes' motility
NASA Astrophysics Data System (ADS)
Zaburdaev, Vasily; Uppaluri, Sravanti; Pfohl, Thomas; Engstler, Markus; Stark, Holger; Friedrich, Rudolf
2010-03-01
The trypanosome is a parasite that causes sleeping sickness. The way it moves in the bloodstream and penetrates various obstacles is an area of active research. Our goal was to investigate the free motion of trypanosomes in a planar geometry. Our analysis of trypanosome trajectories reveals that there are two correlation times: one is associated with the fast motion of its body and the second with the slower rotational diffusion of the trypanosome as a point object. We propose a system of Langevin equations to model such motion. One of its peculiarities is the presence of multiplicative noise, which predicts a higher level of noise at higher trypanosome velocities. Theoretical and numerical results give a comprehensive description of the experimental data, such as the mean squared displacement, velocity distribution, and autocorrelation function.
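A minimal sketch of this kind of Langevin model, with multiplicative velocity noise and rotational diffusion of the heading, is given below using an Euler-Maruyama integrator. The functional forms, parameters, and lag times are assumptions for illustration, not the authors' fitted model.

```python
import numpy as np

# Euler-Maruyama integration of a 2D Langevin model for a self-propelled
# particle: speed relaxes to v0 with multiplicative noise (noise grows with
# speed), while the heading undergoes rotational diffusion. All functional
# forms and parameter values are illustrative assumptions.
rng = np.random.default_rng(2)
dt, steps = 1e-3, 200_000
gamma, v0 = 5.0, 10.0          # speed relaxation rate and preferred speed
D0, alpha = 2.0, 0.05          # base noise strength and multiplicative factor
Dr = 0.5                       # rotational diffusion of the heading

v, phi = v0, 0.0
pos = np.zeros((steps, 2))
for i in range(1, steps):
    noise_v = np.sqrt(2 * D0 * (1 + alpha * v**2) * dt) * rng.standard_normal()
    v = v - gamma * (v - v0) * dt + noise_v          # multiplicative speed noise
    phi = phi + np.sqrt(2 * Dr * dt) * rng.standard_normal()
    pos[i] = pos[i - 1] + v * dt * np.array([np.cos(phi), np.sin(phi)])

# mean squared displacement at a few lag times (averaged over time origins)
lags = np.array([10, 100, 1000, 10000])
msd = [np.mean(np.sum((pos[l:] - pos[:-l]) ** 2, axis=1)) for l in lags]
print(dict(zip((lags * dt).round(3), np.round(msd, 2))))
```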
Integrated web visualizations for protein-protein interaction databases.
Jeanquartier, Fleur; Jean-Quartier, Claire; Holzinger, Andreas
2015-06-16
Understanding living systems is crucial for curing diseases. To achieve this task we have to understand biological networks based on protein-protein interactions. Bioinformatics has come up with a great number of databases and tools that support analysts in exploring protein-protein interactions on an integrated level for knowledge discovery. They provide predictions and correlations, indicate possibilities for future experimental research and fill the gaps to complete the picture of biochemical processes. There are numerous and huge databases of protein-protein interactions used to gain insights into answering some of the many questions of systems biology. Many computational resources integrate interaction data with additional information on molecular background. However, the vast number of diverse Bioinformatics resources poses an obstacle to the goal of understanding. We present a survey of databases that enable the visual analysis of protein networks. We selected M=10 out of N=53 resources supporting visualization, and tested them against the following set of criteria: interoperability, data integration, quantity of possible interactions, data visualization quality and data coverage. The study reveals differences in usability, visualization features and quality as well as the quantity of interactions. StringDB is the recommended first choice. CPDB presents a comprehensive dataset and IntAct lets the user change the network layout. A comprehensive comparison table is available via web. The supplementary table can be accessed on http://tinyurl.com/PPI-DB-Comparison-2015. Only some web resources featuring graph visualization can be successfully applied to interactive visual analysis of protein-protein interactions. Study results underline the necessity for further enhancement of visualization integration in biochemical analysis tools. Identified challenges are data comprehensiveness, confidence, interactive features, and visualization maturity.
ERIC Educational Resources Information Center
Spencer, Mercedes; Wagner, Richard K.
2018-01-01
The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language…
Delay-dependent coupling for a multi-agent LTI consensus system with inter-agent delays
NASA Astrophysics Data System (ADS)
Qiao, Wei; Sipahi, Rifat
2014-01-01
Delay-dependent coupling (DDC) is considered in this paper in a broadly studied linear time-invariant multi-agent consensus system in which agents communicate with each other under homogeneous delays, while attempting to reach consensus. The coupling among the agents is designed here as an explicit parameter of this delay, allowing couplings to autonomously adapt based on the delay value, and in order to guarantee stability and a certain degree of robustness in the network despite the destabilizing effect of delay. Design procedures, analysis of convergence speed of consensus, comprehensive numerical studies for the case of time-varying delay, and limitations are presented.
NASA Astrophysics Data System (ADS)
Zhang, Zheng; Tian, Menjiya; Quan, Xusong; Pei, Guoqing; Wang, Hui; Liu, Tianye; Long, Kai; Xiong, Zhao; Rong, Yiming
2017-11-01
Surface control and phase matching of large laser conversion optics are urgent requirements and huge challenges in high-power solid-state laser facilities. A self-adaptive, nanocompensating mounting configuration for a large-aperture potassium dihydrogen phosphate (KDP) frequency doubler is proposed, based on a lever-type surface correction mechanism. A mechanical, numerical, and optical model is developed and employed to evaluate the comprehensive performance of this mounting method. The results validate the method's advantages in surface adjustment and phase-matching improvement. In addition, the optimal value of the modulation force is determined through a series of simulations and calculations.
Numerical Hydrodynamics in General Relativity.
Font, José A
2000-01-01
The current status of numerical solutions for the equations of ideal general relativistic hydrodynamics is reviewed. Different formulations of the equations are presented, with special mention of conservative and hyperbolic formulations well-adapted to advanced numerical methods. A representative sample of available numerical schemes is discussed and particular emphasis is paid to solution procedures based on schemes exploiting the characteristic structure of the equations through linearized Riemann solvers. A comprehensive summary of relevant astrophysical simulations in strong gravitational fields, including gravitational collapse, accretion onto black holes and evolution of neutron stars, is also presented. Supplementary material is available for this article at 10.12942/lrr-2000-2.
Bock, I; Raveh-Amit, H; Losonczi, E; Carstea, A C; Feher, A; Mashayekhi, K; Matyas, S; Dinnyes, A; Pribenszky, C
2016-04-01
The efficiency of various assisted reproductive techniques can be improved by preconditioning the gametes and embryos with sublethal hydrostatic pressure treatment. However, the underlying molecular mechanism responsible for this protective effect remains unknown and requires further investigation. Here, we studied the effect of optimised hydrostatic pressure treatment on the global gene expression of mouse oocytes after embryonic genome activation. Based on a gene expression microarray analysis, a significant effect of treatment was observed in 4-cell embryos derived from treated oocytes, revealing a transcriptional footprint of hydrostatic pressure-affected genes. Functional analysis identified numerous genes involved in protein synthesis that were downregulated in 4-cell embryos in response to hydrostatic pressure treatment, suggesting that regulation of translation has a major role in optimised hydrostatic pressure-induced stress tolerance. We present a comprehensive microarray analysis and further delineate a potential mechanism responsible for the protective effect of hydrostatic pressure treatment.
Analysis of the Vibration Propagation in the Subsoil
NASA Astrophysics Data System (ADS)
Jastrzębska, Małgorzata; Łupieżowiec, Marian; Uliniarz, Rafał; Jaroń, Artur
2015-02-01
The paper comprehensively presents issues related to the propagation in the subsoil of vibrations originating during vibratory driving of sheet piling. The considerations comprised an FEM analysis of the initial-boundary behaviour of the subsoil during the impacts accompanying the works performed. The analysis used the authors' RU+MCC constitutive model, which can realistically describe the complex deformation characteristics of soils in the small-strain range that accompanies shock propagation. The basis for creating the model and for specifying its material parameters consisted of high-quality tests performed in a triaxial apparatus using proximity detectors, guaranteeing proper measurement of strains ranging from 10⁻¹ to 10⁻³ %, and bender elements. Results obtained from the numerical analyses were compared with results of field tests consisting of measurements of acceleration amplitudes generated on the ground surface by technological impacts versus the distance from the vibration source.
Nursing workload in the acute-care setting: A concept analysis of nursing workload.
Swiger, Pauline A; Vance, David E; Patrician, Patricia A
2016-01-01
A pressing need in the field of nursing is the identification of optimal staffing levels to ensure patient safety. Effective staffing requires comprehensive measurement of nursing workload to determine staffing needs. Issues surrounding nursing workload are complex, and the volume of workload is growing; however, many workload systems do not consider the numerous workload factors that impact nursing today. The purpose of this concept analysis was to better understand and define nursing workload as it relates to the acute-care setting. Rogers' evolutionary method was used for this literature-based concept analysis. Nursing workload is influenced by more than patient care. The proposed definition of nursing workload may help leaders identify workload that is unnoticed and unmeasured. These findings could help leaders consider and identify workload that is unnecessary, redundant, or more appropriate for assignment to other members of the health care team. Published by Elsevier Inc.
Third-Order Memristive Morris-Lecar Model of Barnacle Muscle Fiber
NASA Astrophysics Data System (ADS)
Rajamani, Vetriveeran; Sah, Maheshwar Pd.; Mannan, Zubaer Ibna; Kim, Hyongsuk; Chua, Leon
This paper presents a detailed analysis of various oscillatory behaviors observed in relation to the calcium and potassium ions in the third-order Morris-Lecar model of the giant barnacle muscle fiber. Since both the calcium and potassium ions exhibit all of the characteristics of memristor fingerprints, we claim that the time-varying calcium and potassium ions in the third-order Morris-Lecar model are actually time-invariant calcium and potassium memristors in the third-order memristive Morris-Lecar model. We confirmed the existence of a small unstable limit cycle oscillation in both the second-order and the third-order Morris-Lecar models by numerically calculating the basin of attraction of the asymptotically stable equilibrium point associated with two subcritical Hopf bifurcation points. We also describe a comprehensive analysis of the generation of oscillations in the third-order memristive Morris-Lecar model via small-signal circuit analysis and a subcritical Hopf bifurcation phenomenon.
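For context, the classic second-order Morris-Lecar equations can be integrated in a few lines; the sketch below uses commonly quoted "Hopf regime" parameter values and is not the third-order memristive formulation analyzed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic (second-order) Morris-Lecar model of the barnacle muscle fiber,
# integrated with solve_ivp. Standard textbook formulation with commonly used
# "Hopf regime" parameters, not the third-order memristive version of the paper.
C, gCa, gK, gL = 20.0, 4.4, 8.0, 2.0        # uF/cm^2, mS/cm^2
VCa, VK, VL = 120.0, -84.0, -60.0           # mV
V1, V2, V3, V4 = -1.2, 18.0, 2.0, 30.0      # mV
phi, I_ext = 0.04, 90.0                      # 1/ms, uA/cm^2

def m_inf(V): return 0.5 * (1 + np.tanh((V - V1) / V2))
def w_inf(V): return 0.5 * (1 + np.tanh((V - V3) / V4))
def tau_w(V): return 1.0 / np.cosh((V - V3) / (2 * V4))

def rhs(t, y):
    V, w = y
    dV = (I_ext - gL * (V - VL) - gCa * m_inf(V) * (V - VCa)
          - gK * w * (V - VK)) / C
    dw = phi * (w_inf(V) - w) / tau_w(V)
    return [dV, dw]

sol = solve_ivp(rhs, (0, 500), [-30.0, 0.1], max_step=0.1)
print("membrane potential range (mV):", sol.y[0].min(), sol.y[0].max())
```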
A comparative study of integrated pest management strategies based on impulsive control.
Páez Chávez, Joseph; Jungmann, Dirk; Siegmund, Stefan
2018-12-01
The paper presents a comprehensive numerical study of mathematical models used to describe complex biological systems in the framework of integrated pest management. Our study considers two specific ecosystems that describe the application of control mechanisms based on pesticides and natural enemies, implemented in an impulsive and periodic manner, due to which the considered models belong to the class of impulsive differential equations. The present work proposes a numerical approach to study such type of models in detail, via the application of path-following (continuation) techniques for nonsmooth dynamical systems, via the novel continuation platform COCO (Dankowicz and Schilder). In this way, a detailed study focusing on the influence of selected system parameters on the effectiveness of the pest control scheme is carried out for both ecological scenarios. Furthermore, a comparative study is presented, with special emphasis on the mechanisms upon which a pest outbreak can occur in the considered ecosystems. Our study reveals that such outbreaks are determined by the presence of a branching point found during the continuation analysis. The numerical investigation concludes with an in-depth study of the state-dependent pesticide mortality considered in one of the ecological scenarios.
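A toy version of such an impulsively controlled ecosystem is sketched below: Lotka-Volterra pest-predator dynamics integrated piecewise, with periodic pesticide spraying and natural-enemy release applied as impulses. The model form and parameters are illustrative, and the path-following analysis with COCO is not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lotka-Volterra pest (x) / natural enemy (y) dynamics with impulsive control
# applied every T time units: spraying kills a fraction of both populations and
# a fixed number of natural enemies is released. Parameters and functional form
# are illustrative, not those of the models studied in the paper.
r, K, a, b, d = 1.2, 10.0, 0.6, 0.3, 0.4     # growth, capacity, predation, conversion, death
p_x, p_y, mu = 0.6, 0.1, 0.5                 # kill fractions and release size
T, n_periods = 4.0, 25

def rhs(t, z):
    x, y = z
    return [r * x * (1 - x / K) - a * x * y, b * a * x * y - d * y]

state, traj = np.array([5.0, 1.0]), []
for k in range(n_periods):
    sol = solve_ivp(rhs, (k * T, (k + 1) * T), state, max_step=0.05)
    traj.append(sol.y)
    x_end, y_end = sol.y[:, -1]
    state = np.array([(1 - p_x) * x_end,                 # pesticide impulse
                      (1 - p_y) * y_end + mu])           # enemy-release impulse

x_all = np.concatenate([t[0] for t in traj])
print("pest density range over the run:", x_all.min().round(3), x_all.max().round(3))
```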
Analysis of the STAT3 interactome using in-situ biotinylation and SILAC.
Blumert, Conny; Kalkhof, Stefan; Brocke-Heidrich, Katja; Kohajda, Tibor; von Bergen, Martin; Horn, Friedemann
2013-12-06
Signal transducer and activator of transcription 3 (STAT3) is activated by a variety of cytokines and growth factors. To generate a comprehensive data set of proteins interacting specifically with STAT3, we applied stable isotope labeling with amino acids in cell culture (SILAC). For high-affinity pull-down using streptavidin, we fused STAT3 with a short peptide tag allowing biotinylation in situ (bio-tag), which did not affect STAT3 functions. By this approach, 3642 coprecipitated proteins were detected in human embryonic kidney-293 cells. Filtering using statistical and functional criteria finally extracted 136 proteins as putative interaction partners of STAT3. Both, a physical interaction network analysis and the enrichment of known and predicted interaction partners suggested that our filtering criteria successfully enriched true STAT3 interactors. Our approach identified numerous novel interactors, including ones previously predicted to associate with STAT3. By reciprocal coprecipitation, we were able to verify the physical association between STAT3 and selected interactors, including the novel interaction with TOX4, a member of the TOX high mobility group box family. Applying the same method, we next investigated the activation-dependency of the STAT3 interactome. Again, we identified both known and novel interactions. Thus, our approach allows to study protein-protein interaction effectively and comprehensively. The location, activity, function, degradation, and synthesis of proteins are significantly regulated by interactions of proteins with other proteins, biopolymers and small molecules. Thus, the comprehensive characterization of interactions of proteins in a given proteome is the next milestone on the path to understanding the biochemistry of the cell. In order to generate a comprehensive interactome dataset of proteins specifically interacting with a selected bait protein, we fused our bait protein STAT3 with a short peptide tag allowing biotinylation in situ (bio-tag). This bio-tag allows an affinity pull-down using streptavidin but affected neither the activation of STAT3 by tyrosine phosphorylation nor its transactivating potential. We combined SILAC for accurate relative protein quantification, subcellular fractionation to increase the coverage of interacting proteins, high-affinity pull-down and a stringent filtering method to successfully analyze the interactome of STAT3. With our approach we confirmed several already known and identified numerous novel STAT3 interactors. The approach applied provides a rapid and effective method, which is broadly applicable for studying protein-protein interactions and their dependency on post-translational modifications. © 2013. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lali, Mehdi
2009-03-01
A comprehensive computer program is designed in MATLAB to analyze, design, and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-staging rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of a rocket previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations, which are done through sophisticated algorithms using numerical methods and incremental integrations. Innovative techniques are applied to calculate the optimal parameters for the engine and to design the "optimal pitch program." This computer program is stand-alone in the sense that it calculates almost every design parameter with regard to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as for educational and research purposes.
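The core staging calculation that such a program performs can be illustrated with the ideal (Tsiolkovsky) rocket equation applied stage by stage. The sketch below is in Python rather than MATLAB, is not the author's code, and uses made-up stage masses and specific impulses.

```python
import math

def total_delta_v(stages, payload):
    """Sum the ideal (Tsiolkovsky) delta-v over serial stages.

    Each stage is (propellant_mass, dry_mass, Isp_seconds), listed from the
    first stage to burn to the last. Upper stages and the payload ride along
    as dead mass for the lower stages. A textbook staging calculation, not
    the MATLAB program described above.
    """
    g0, dv = 9.80665, 0.0
    for i, (mp, md, isp) in enumerate(stages):
        above = sum(mp_j + md_j for mp_j, md_j, _ in stages[i + 1:]) + payload
        m0 = above + mp + md          # mass before the burn
        mf = above + md               # mass after the burn
        dv += isp * g0 * math.log(m0 / mf)
    return dv

# illustrative two-stage vehicle (masses in kg, Isp in s)
stages = [(120_000, 10_000, 290.0),   # first stage
          (30_000, 3_000, 340.0)]     # second stage
print(f"ideal delta-v: {total_delta_v(stages, payload=1_500):.0f} m/s")
```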
Underhill, Kristen; Morrow, Kathleen M; Colleran, Christopher; Calabrese, Sarah K; Operario, Don; Salovey, Peter; Mayer, Kenneth H
2016-07-01
We investigated message comprehension and message framing preferences for communicating about PrEP efficacy with US MSM. We conducted eight focus groups (n = 38) and n = 56 individual interviews with MSM in Providence, RI. Facilitators probed comprehension, credibility, and acceptability of efficacy messages, including percentages, non-numerical paraphrases, efficacy ranges versus point estimates, and success- versus failure-framed messages. Our findings indicated a range of comprehension and operational understandings of efficacy messages. Participants tended to prefer percentage-based and success-framed messages, although preferences varied for communicating about efficacy using a single percentage versus a range. Participants reported uncertainty about how to interpret numerical estimates, and many questioned whether trial results would predict personal effectiveness. These results suggest that providers and researchers implementing PrEP may face challenges in communicating with users about efficacy. Efforts to educate MSM about PrEP should incorporate percentage-based information, and message framing decisions may influence message credibility and overall PrEP acceptability.
NASA Astrophysics Data System (ADS)
Peng, Guoyi; Cao, Shuliang; Ishizuka, Masaru; Hayama, Shinji
2002-06-01
This paper is concerned with the design optimization of axial-flow hydraulic turbine runner blade geometry. In order to obtain a better design plan with good performance, a new comprehensive performance optimization procedure has been presented by combining a multi-variable, multi-objective constrained optimization model with a Q3D inverse computation and a performance prediction procedure. With careful analysis of the inverse design of the axial hydraulic turbine runner, the total hydraulic loss and the cavitation coefficient are taken as optimization objectives, and a comprehensive objective function is defined using weight factors. Parameters of a newly proposed blade bound circulation distribution function and parameters describing the positions of the blade leading and trailing edges in the meridional flow passage are taken as optimization variables. The optimization procedure has been applied to the design optimization of a Kaplan runner with a specific speed of 440 kW. Numerical results show that the performance of the designed runner is successfully improved through the optimization computation. The optimization model is found to be valid and exhibits good convergence. With the multi-objective optimization model, it is possible to control the performance of the designed runner by adjusting the values of the weight factors defining the comprehensive objective function.
Numerical Simulation and Chaotic Analysis of an Aluminum Holding Furnace
NASA Astrophysics Data System (ADS)
Wang, Ji-min; Zhou, Yuan-yuan; Lan, Shen; Chen, Tao; Li, Jie; Yan, Hong-jie; Zhou, Jie-min; Tian, Rui-jiao; Tu, Yan-wu; Li, Wen-ke
2014-12-01
To achieve high heat efficiency, low pollutant emission, and a homogeneous melt temperature during the thermal processing of secondary aluminum, and taking into account the features of the aluminum alloying process, a CFD process model was developed and integrated with a heat load and aluminum temperature control model. This paper presents numerical simulations of aluminum holding furnaces using customized code based on the FLUENT package. The thermal behavior of aluminum holding furnaces was investigated by probing the main physical fields, such as flue gas temperature, velocity, and concentration, and the combustion instability of the aluminum holding process was characterized using chaos theory. The results show that the aluminum temperature uniformity coefficient first decreases during the heating phase, then alternately increases and decreases during the holding phase, and finally rises during the standing phase. The correlation dimension drops with fuel velocity. The maximal Lyapunov exponent reaches a maximum when the air-fuel ratio is close to 1. A clear understanding of each phase of aluminum holding furnace operation helps in finding new technologies, retrofitting furnace designs, and optimizing parameter combinations.
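To indicate how a correlation dimension can be estimated from a scalar signal, the sketch below applies delay embedding and the Grassberger-Procaccia correlation sum to a test series. This is generic chaos post-processing with assumed embedding parameters, not the furnace combustion data or the exact estimators used in the paper.

```python
import numpy as np

def correlation_dimension(x, m=3, tau=5, radii=None):
    """Grassberger-Procaccia correlation-dimension estimate from a scalar
    series: delay-embed, compute the correlation sum C(r), and fit the slope
    of log C(r) against log r. Illustrative post-processing only."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))
    d = d[np.triu_indices(n, k=1)]                  # pairwise distances
    if radii is None:
        radii = np.geomspace(np.percentile(d, 2), np.percentile(d, 50), 12)
    C = np.array([(d < r).mean() for r in radii])   # correlation sum
    slope = np.polyfit(np.log(radii), np.log(C), 1)[0]
    return slope

# test on the Henon map (correlation dimension known to be about 1.2)
x = np.empty(2000); y = 0.0; x[0] = 0.1
for i in range(1999):
    x[i + 1] = 1.0 - 1.4 * x[i] ** 2 + y
    y = 0.3 * x[i]
print("estimated correlation dimension:",
      round(correlation_dimension(x, m=2, tau=1), 2))
```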
Plasma and radio waves from Neptune: Source mechanisms and propagation
NASA Astrophysics Data System (ADS)
Wong, H. K.
1994-03-01
This report summarizes results obtained through the support of NASA Grant NAGW-2412. The objective of this project is to conduct a comprehensive investigation of the radio wave emission observed by the planetary radio astronomy (PRA) instrument on board Voyager 2 as it flew by Neptune. This study has included data analysis, theoretical and numerical calculations, ray tracing, and modeling to determine the possible source mechanism(s) and locations of the Neptune radio emissions. We have completed four papers, which are included in the appendix. The paper 'Modeling of Whistler Ray Paths in the Magnetosphere of Neptune' investigated the propagation and dispersion of lightning-generated whistlers in the magnetosphere of Neptune by using three-dimensional ray tracing. The two papers 'Numerical Simulations of Bursty Radio Emissions from Planetary Magnetospheres' and 'Numerical Simulations of Bursty Planetary Radio Emissions' employed numerical simulations to investigate an alternate source mechanism of bursty radio emissions in addition to the cyclotron maser instability. We have also studied the possible generation of Z and whistler mode waves by the temperature anisotropic beam instability, and the result was published in 'Electron Cyclotron Wave Generation by Relativistic Electrons.' Besides the aforementioned studies, we have also collaborated with members of the PRA team to investigate various aspects of the radio wave data. Two papers have been submitted for publication and the abstracts of these papers are also listed in the appendix.
NASA Astrophysics Data System (ADS)
Sanchez, M. J.; Santamarina, C.; Gai, X., Sr.; Teymouri, M., Sr.
2017-12-01
The stability and behavior of hydrate-bearing sediments (HBS) are characterized by the metastable character of the gas hydrate structure, which strongly depends on thermo-hydro-chemo-mechanical (THCM) actions. Hydrate formation, dissociation, and methane production from hydrate-bearing sediments are coupled THCM processes that involve, among others, exothermic formation and endothermic dissociation of hydrate and ice phases, mixed fluid flow, and large changes in fluid pressure. The analysis of available data from past field and laboratory experiments, and the optimization of future field production studies, require a formal and robust numerical framework able to capture the very complex behavior of this type of soil. A comprehensive fully coupled THCM formulation has been developed and implemented into a finite element code to tackle problems involving gas hydrate sediments. Special attention is paid to the geomechanical behavior of HBS, and particularly to their response upon hydrate dissociation under loading. The numerical framework has been validated against recent experiments conducted under controlled conditions in the laboratory that challenge the proposed approach and highlight the complex interaction among THCM processes in HBS. The performance of the models in these case studies is highly satisfactory. Finally, the numerical code is applied to analyze the behavior of gas hydrate soils under field-scale conditions, exploring different features of material behavior under possible reservoir conditions.
Numerical modeling of bubble dynamics in viscoelastic media with relaxation
Warnez, M. T.; Johnsen, E.
2015-01-01
Cavitation occurs in a variety of non-Newtonian fluids and viscoelastic materials. The large-amplitude volumetric oscillations of cavitation bubbles give rise to high temperatures and pressures at collapse, as well as induce large and rapid deformation of the surroundings. In this work, we develop a comprehensive numerical framework for spherical bubble dynamics in isotropic media obeying a wide range of viscoelastic constitutive relationships. Our numerical approach solves the compressible Keller–Miksis equation with full thermal effects (inside and outside the bubble) when coupled to a highly generalized constitutive relationship (which allows Newtonian, Kelvin–Voigt, Zener, linear Maxwell, upper-convected Maxwell, Jeffreys, Oldroyd-B, Giesekus, and Phan-Thien-Tanner models). For the latter two models, partial differential equations (PDEs) must be solved in the surrounding medium; for the remaining models, we show that the PDEs can be reduced to ordinary differential equations. To solve the general constitutive PDEs, we present a Chebyshev spectral collocation method, which is robust even for violent collapse. Combining this numerical approach with theoretical analysis, we simulate bubble dynamics in various viscoelastic media to determine the impact of relaxation time, a constitutive parameter, on the associated physics. Relaxation time is found to increase bubble growth and permit rebounds driven purely by residual stresses in the surroundings. Different regimes of oscillations occur depending on the relaxation time. PMID:26130967
Plasma and radio waves from Neptune: Source mechanisms and propagation
NASA Technical Reports Server (NTRS)
Wong, H. K.
1994-01-01
This report summarizes results obtained through the support of NASA Grant NAGW-2412. The objective of this project is to conduct a comprehensive investigation of the radio wave emission observed by the planetary radio astronomy (PRA) instrument on board Voyager 2 as it flew by Neptune. This study has included data analysis, theoretical and numerical calculations, ray tracing, and modeling to determine the possible source mechanism(s) and locations of the Neptune radio emissions. We have completed four papers, which are included in the appendix. The paper 'Modeling of Whistler Ray Paths in the Magnetosphere of Neptune' investigated the propagation and dispersion of lightning-generated whistlers in the magnetosphere of Neptune by using three-dimensional ray tracing. The two papers 'Numerical Simulations of Bursty Radio Emissions from Planetary Magnetospheres' and 'Numerical Simulations of Bursty Planetary Radio Emissions' employed numerical simulations to investigate an alternative source mechanism of bursty radio emissions in addition to the cyclotron maser instability. We have also studied the possible generation of Z and whistler mode waves by the temperature anisotropic beam instability, and the results were published in 'Electron Cyclotron Wave Generation by Relativistic Electrons.' Besides the aforementioned studies, we have also collaborated with members of the PRA team to investigate various aspects of the radio wave data. Two papers have been submitted for publication and the abstracts of these papers are also listed in the appendix.
A 3D Numerical Survey of Seismic Waves Inside and Around an Underground Cavity
NASA Astrophysics Data System (ADS)
Esterhazy, S.; Schneider, F. M.; Perugia, I.; Bokelmann, G.
2016-12-01
Motivated by the need to detect an underground cavity within the procedure of an On-Site Inspection (OSI) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), which might be caused by a nuclear explosion or weapon test, we present our findings from a numerical study of elastic wave propagation inside and around such an underground cavity. The aim of the CTBTO is to ban all nuclear explosions of any size, anywhere, by anyone. It is therefore essential to build a powerful strategy to efficiently investigate and detect critical signatures such as gas-filled cavities, rubble zones and fracture networks below the surface. One method to investigate the geophysical properties of an underground cavity allowed by the Comprehensive Nuclear-Test-Ban Treaty is referred to as "resonance seismometry" - a resonance method that uses passive or active seismic techniques relying on seismic cavity vibrations. This method is in fact not yet entirely determined by the Treaty, and there are only a few experimental examples that have been documented well enough to build a proper scientific groundwork. This motivates investigating the problem on a purely numerical level and simulating these events based on recent advances in the mathematical understanding of the underlying physical phenomena. Our numerical study includes the full elastic wave field in three dimensions. We consider the effects of an incoming plane wave as well as of a point source located in the surroundings of the cavity at the surface. While the former can be considered a passive source, like a tele-seismic earthquake, the latter represents a man-made explosion or a vibroseis source as used in active seismic techniques. For our simulations in 3D we use the discontinuous Galerkin Spectral Element code SPEED developed by MOX (The Laboratory for Modeling and Scientific Computing, Department of Mathematics) and DICA (Department of Civil and Environmental Engineering) at the Politecnico di Milano. The computations are carried out on the Vienna Scientific Cluster (VSC). Accurate numerical modeling can facilitate the development of proper analysis techniques to detect the remnants of an underground nuclear test, help to set a rigorous scientific basis for OSI, and contribute to bringing the Treaty into force.
Comprehension and computation in Bayesian problem solving
Johnson, Eric D.; Tubau, Elisabet
2015-01-01
Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian inferences relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on “transparent” Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct vs. incorrect reasoners depart, and how individual differences might influence this time point. PMID:26283976
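To make the representational contrast concrete, the following sketch runs the arithmetic of a typical Bayesian word problem in both formats; the base rate and test characteristics are hypothetical, not taken from the studies reviewed.

```python
# Sketch of the numerical core of a typical Bayesian word problem
# (hypothetical base rate and test characteristics, not data from the paper).
prevalence  = 0.01   # P(disease)
sensitivity = 0.80   # P(positive | disease)
false_pos   = 0.096  # P(positive | no disease)

# Probability-format computation (Bayes' rule).
p_pos = sensitivity * prevalence + false_pos * (1.0 - prevalence)
posterior = sensitivity * prevalence / p_pos
print(f"P(disease | positive) = {posterior:.3f}")

# Natural-frequency format: the same calculation framed as counts out of 1000
# people, which is the representational facilitation discussed in the abstract.
n = 1000
sick        = prevalence * n                  # 10 people have the disease
sick_pos    = sensitivity * sick              # 8 of them test positive
healthy_pos = false_pos * (n - sick)          # ~95 healthy people test positive
print(f"{sick_pos:.0f} / ({sick_pos:.0f} + {healthy_pos:.0f}) = "
      f"{sick_pos / (sick_pos + healthy_pos):.3f}")
```

Both routes give the same posterior (about 0.08 here); the argument in the abstract is that the second framing reduces the numerical processing load without changing the underlying computation.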
Airplane numerical simulation for the rapid prototyping process
NASA Astrophysics Data System (ADS)
Roysdon, Paul F.
Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into the most up-to-date methods for airplane development and design. Uses of modern engineering software tools, such as MatLab and Excel, are presented with examples of batch and optimization algorithms that combine the computing power of MatLab with robust aerodynamic tools such as XFOIL and AVL. The resulting data are demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. The applications of this numerical tool-box range from unmanned aerial vehicles to first-order analysis of manned aircraft. A blended-wing-body airplane is used for the analysis to demonstrate the flexibility of the code, from classic wing-and-tail configurations to less common configurations like the blended-wing-body. This configuration has been shown to have superior aerodynamic performance compared with its classic wing-and-tube-fuselage counterparts, reduced sensitivity to aerodynamic flutter, and potential for increased engine noise abatement. Of course, without a classic tail elevator to damp the nose-up pitching moment and a vertical tail rudder to damp the yaw and possible rolling motion, the challenges in roll and yaw stability, as well as in pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of the tool-box through examples and comparison of the results with similar airplane performance characteristics published in the literature.
Tsou, Paul M; Daffner, Scott D; Holly, Langston T; Shamie, A Nick; Wang, Jeffrey C
2012-02-10
Multiple factors contribute to the determination for surgical intervention in the setting of cervical spinal injury, yet to date no unified classification system exists that predicts this need. The goals of this study were twofold: to create a comprehensive subaxial cervical spine injury severity numeric scoring model, and to determine the predictive value of this model for the probability of surgical intervention. In a retrospective cohort study of 333 patients, neural impairment, patho-morphology, and available spinal canal sagittal diameter post-injury were selected as injury severity determinants. A common numeric scoring trend was created; smaller values indicated less favorable clinical conditions. Neural impairment was graded from 2-10, patho-morphology scoring ranged from 2-15, and post-injury available canal sagittal diameter (SD) was measured in millimeters at the narrowest point of injury. Logistic regression analysis was performed using the numeric scores to predict the probability for surgical intervention. Complete neurologic deficit was found in 39 patients, partial deficits in 108, root injuries in 19, and 167 were neurologically intact. The pre-injury mean canal SD was 14.6 mm; the post-injury measurement mean was 12.3 mm. The mean patho-morphology score for all patients was 10.9 and the mean neurologic function score was 7.6. There was a statistically significant difference in mean scores for neural impairment, canal SD, and patho-morphology for surgical compared to nonsurgical patients. At the lowest clinical score for each determinant, the probability for surgery was 0.949 for neural impairment, 0.989 for post-injury available canal SD, and 0.971 for patho-morphology. The unit odds ratio for each determinant was 1.73, 1.61, and 1.45, for neural impairment, patho-morphology, and canal SD scores, respectively. The subaxial cervical spine injury severity determinants of neural impairment, patho-morphology, and post-injury available canal SD have well defined probability for surgical intervention when scored separately. Our data showed that each determinant alone could act as a primary predictor for surgical intervention.
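For readers unfamiliar with how unit odds ratios enter such a model, the sketch below shows a logistic form in which each determinant's coefficient is the natural log of its unit odds ratio. The intercept, the score scaling, and the coefficient signs are hypothetical illustrations, not the fitted model from this study.

```python
# Illustrative-only logistic model of the kind fit in the study: probability of
# surgical intervention from the three determinant scores. The intercept and the
# sign/scaling of the scores are hypothetical; only the coefficient = ln(odds ratio)
# relationship mirrors how the reported unit odds ratios would enter such a model.
import math

def surgery_probability(neural, pathomorph, canal_sd,
                        b0=5.0,                       # hypothetical intercept
                        or_neural=1.73, or_patho=1.61, or_canal=1.45):
    # Unit odds ratios convert to log-odds slopes; negative signs reflect the
    # scoring convention that larger scores are more favorable (an assumption).
    b1, b2, b3 = (math.log(or_neural), math.log(or_patho), math.log(or_canal))
    logit = b0 - b1 * neural - b2 * pathomorph - b3 * canal_sd
    return 1.0 / (1.0 + math.exp(-logit))

# Lower scores (less favorable clinical condition) give higher probabilities.
print(surgery_probability(neural=2, pathomorph=2, canal_sd=5))
print(surgery_probability(neural=10, pathomorph=15, canal_sd=14))
```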
Tang, Yuchun; Zhao, Lu; Lou, Yunxia; Shi, Yonggang; Fang, Rui; Lin, Xiangtao; Liu, Shuwei; Toga, Arthur
2018-05-01
Numerous behavioral observations and brain function studies have demonstrated that neurological differences exist between East Asians and Westerners. However, the extent to which these factors relate to differences in brain structure is still not clear. As the basis of brain functions, the anatomical differences in brain structure play a primary and critical role in the origination of functional and behavior differences. To investigate the underlying differences in brain structure between the two cultural/ethnic groups, we conducted a comparative study on education-matched right-handed young male adults (age = 22-29 years) from two cohorts, Han Chinese (n = 45) and Caucasians (n = 45), using high-dimensional structural magnetic resonance imaging (MRI) data. Using two well-validated imaging analysis techniques, surface-based morphometry (SBM) and voxel-based morphometry (VBM), we performed a comprehensive vertex-wise morphometric analysis of the brain structures between Chinese and Caucasian cohorts. We identified consistent significant between-group differences in cortical thickness, volume, and surface area in the frontal, temporal, parietal, occipital, and insular lobes as well as the cingulate cortices. The SBM analyses revealed that compared with Caucasians, the Chinese population showed larger cortical structures in the temporal and cingulate regions, and smaller structural measures in the frontal and parietal cortices. The VBM data of the same sample was well-aligned with the SBM findings. Our findings systematically revealed comprehensive brain structural differences between young male Chinese and Caucasians, and provided new neuroanatomical insights to the behavioral and functional distinctions in the two cultural/ethnic populations. © 2018 Wiley Periodicals, Inc.
A Single and Comprehensive Helios Data Archive
NASA Astrophysics Data System (ADS)
Salem, C. S.
2017-12-01
Helios 1 & 2 rank among the most important missions in Heliophysics, and the more than 11 years of data returned by these spacecraft remain of paramount interest to researchers. Their unique trajectories, which brought them closer to the Sun than any spacecraft before or since, enabled their diverse suite of in-situ instruments to return measurements of unprecedented scientific richness. There is, however, no comprehensive public repository of all Helios in-situ data. Currently, most of the highest resolution data can be obtained from a variety of places, although highly processed and with very little documentation, especially on calibration. Analysis of this data set requires overcoming a number of technical and instrumental issues, knowledge and expertise of which is only possessed by the original PIs of the Helios experiments. We present here a NASA-funded effort to aggregate, analyze, evaluate, document and archive the available Helios 1 and 2 in-situ data. This work at the UC Berkeley Space Sciences Laboratory is being undertaken in close collaboration with colleagues at the University of Koln, the University of Kiel, Imperial College London and the Paris Observatory. A careful, detailed analysis of the Helios fluxgate and search coil magnetic field data as well as the plasma data has revealed numerous issues and problems with the available, processed datasets, which we are still working to solve. We anticipate that this comprehensive single archive of all Helios in-situ data, beyond its inherent scientific value, will also be an invaluable asset to both the Solar Probe Plus and Solar Orbiter missions.
Wright, Alison J; Whitwell, Sophia C L; Takeichi, Chika; Hankins, Matthew; Marteau, Theresa M
2009-02-01
Numeracy, the ability to process basic mathematical concepts, may affect responses to graphical displays of health risk information. Displays of probabilistic risk information using grouped dots are easier to understand than displays using dispersed dots. However, dispersed dots may better convey the randomness with which health threats occur, so increasing perceived susceptibility. We hypothesized that low numeracy participants would better understand risks presented using grouped dot displays, while high numeracy participants would have good understanding, regardless of display type. Moreover, we predicted that dispersed dot displays, in contrast to grouped dot displays, would increase risk perceptions and worry only for highly numerate individuals. One hundred and forty smokers read vignettes asking them to imagine being at risk of Crohn's disease, in a 2(display type: dispersed/grouped dots) x 3(risk magnitude: 3%/6%/50%) x 2(numeracy: high/low) design. They completed measures of risk comprehension, perceived susceptibility and worry. More numerate participants had better objective risk comprehension, but this effect was not moderated by display type. There was marginally significant support for the predicted numeracy x display type interaction for worry about Crohn's disease, but not for perceived susceptibility to the condition. Dispersed dot displays somewhat increase worry in highly numerate individuals, but only numeracy influenced objective risk comprehension. The most effective display type for communicating risk information will depend on the numeracy of the population and the goal(s) of the communication.
Correlation analysis on real-time tab-delimited network monitoring data
Pan, Aditya; Majumdar, Jahin; Bansal, Abhay; ...
2016-01-01
End-to-end performance monitoring in the Internet, also called PingER, is a part of SLAC National Accelerator Laboratory's research program. It was created to answer the growing need to monitor the network, both to analyze current performance and to allocate resources to optimize execution between research centers and the universities and institutes co-operating on present and future operations. The monitoring support reflects the broad geographical area of the collaborations and requires a comprehensive number of research and financial channels. The data retrieval architecture and the interpretation methodology have evolved over numerous years. Analyzing this data is the main challenge due to its high volume. Finally, by using correlation analysis, we can draw crucial conclusions about how the network data affect the performance of the hosts and how this varies from country to country.
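A minimal sketch of the kind of correlation analysis described, assuming a tab-delimited export with hypothetical column names (PingER's actual schema may differ):

```python
# Minimal sketch of correlating tab-delimited network-monitoring metrics.
# The file name and column names are hypothetical, not PingER's actual schema.
import pandas as pd

df = pd.read_csv("pinger_metrics.tsv", sep="\t")          # tab-delimited export
metrics = df[["min_rtt_ms", "avg_rtt_ms", "packet_loss_pct", "throughput_kbps"]]

# Pairwise Pearson correlations between the monitored quantities.
corr = metrics.corr(method="pearson")
print(corr.round(2))

# Example: which metric correlates most strongly with packet loss.
print(corr["packet_loss_pct"].sort_values(ascending=False))
```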
The Cooperative VAS Program with the Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Diak, George R.; Menzel, W. Paul
1988-01-01
Work was divided between the analysis/forecast model development and evaluation of the impact of satellite data in mesoscale numerical weather prediction (NWP), development of the Multispectral Atmospheric Mapping Sensor (MAMS), and other related research. The Cooperative Institute for Meteorological Satellite Studies (CIMSS) Synoptic Scale Model (SSM) has progressed from a relatively basic analysis/forecast system to a package which includes such features as nonlinear vertical mode initialization, comprehensive Planetary Boundary Layer (PBL) physics, and the core of a fully four-dimensional data assimilation package. The MAMS effort has produced a calibrated visible and infrared sensor that produces imagery at high spatial resolution. The MAMS was developed in order to study small scale atmospheric moisture variability, to monitor and classify clouds, and to investigate the role of surface characteristics in the production of clouds, precipitation, and severe storms.
Formal Solutions for Polarized Radiative Transfer. I. The DELO Family
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janett, Gioele; Carlin, Edgar S.; Steiner, Oskar
The discussion regarding the numerical integration of the polarized radiative transfer equation is still open, and the comparison between the different numerical schemes proposed by different authors in the past is not fully clear. Aiming at facilitating the comprehension of the advantages and drawbacks of the different formal solvers, this work presents a reference paradigm for their characterization based on the concepts of order of accuracy, stability, and computational cost. Special attention is paid to understanding the numerical methods belonging to the Diagonal Element Lambda Operator family, in an attempt to highlight their specificities.
Chiu, Charles Y
2015-01-01
Viral pathogen discovery is of critical importance to clinical microbiology, infectious diseases, and public health. Genomic approaches for pathogen discovery, including consensus polymerase chain reaction (PCR), microarrays, and unbiased next-generation sequencing (NGS), have the capacity to comprehensively identify novel microbes present in clinical samples. Although numerous challenges remain to be addressed, including the bioinformatics analysis and interpretation of large datasets, these technologies have been successful in rapidly identifying emerging outbreak threats, screening vaccines and other biological products for microbial contamination, and discovering novel viruses associated with both acute and chronic illnesses. Downstream studies such as genome assembly, epidemiologic screening, and a culture system or animal model of infection are necessary to establish an association of a candidate pathogen with disease. PMID:23725672
Advanced finite element modeling of rotor blade aeroelasticity
NASA Technical Reports Server (NTRS)
Straub, F. K.; Sangha, K. B.; Panda, B.
1994-01-01
An advanced beam finite element has been developed for modeling rotor blade dynamics and aeroelasticity. This element is part of the Element Library of the Second Generation Comprehensive Helicopter Analysis System (2GCHAS). The element allows modeling of arbitrary rotor systems, including bearingless rotors. It accounts for moderately large elastic deflections, anisotropic properties, large frame motion for maneuver simulation, and allows for variable order shape functions. The effects of gravity, mechanically applied and aerodynamic loads are included. All kinematic quantities required to compute airloads are provided. In this paper, the fundamental assumptions and derivation of the element matrices are presented. Numerical results are shown to verify the formulation and illustrate several features of the element.
Thermodynamically self-consistent theory for the Blume-Capel model.
Grollau, S; Kierlik, E; Rosinberg, M L; Tarjus, G
2001-04-01
We use a self-consistent Ornstein-Zernike approximation to study the Blume-Capel ferromagnet on three-dimensional lattices. The correlation functions and the thermodynamics are obtained from the solution of two coupled partial differential equations. The theory provides a comprehensive and accurate description of the phase diagram in all regions, including the wing boundaries in a nonzero magnetic field. In particular, the coordinates of the tricritical point are in very good agreement with the best estimates from simulation or series expansion. Numerical and analytical analysis strongly suggest that the theory predicts a universal Ising-like critical behavior along the lambda line and the wing critical lines, and a tricritical behavior governed by mean-field exponents.
NASA Technical Reports Server (NTRS)
Manhardt, P. D.
1982-01-01
The CMC fluid mechanics program system was developed to translate the theoretical finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow field analysis. Data procedures for the CMC three-dimensional Parabolic Navier-Stokes (PNS) algorithm are presented. General data procedures are described, along with a juncture corner flow standard test case data deck. A listing of the data deck and an explanation of the grid generation methodology are presented. Tabulations of all commands and variables available to the user are described. These are in alphabetical order with cross-reference numbers which refer to storage addresses.
Infectious Bronchitis Virus Variants: Molecular Analysis and Pathogenicity Investigation
Lin, Shu-Yi
2017-01-01
Infectious bronchitis virus (IBV) variants constantly emerge and pose economic threats to poultry farms worldwide. Numerous studies on the molecular and pathogenic characterization of IBV variants have been performed between 2007 and 2017, which we have reviewed herein. We noted that viral genetic mutations and recombination events commonly gave rise to distinct IBV genotypes, serotypes and pathotypes. In addition to characterizing the S1 genes, full viral genomic sequencing, comprehensive antigenicity, and pathogenicity studies on emerging variants have advanced our understanding of IBV infections, which is valuable for developing countermeasures against IBV field outbreaks. This review of IBV variants provides practical value for understanding their phylogenetic relationships and epidemiology from both regional and worldwide viewpoints. PMID:28937583
NASA Astrophysics Data System (ADS)
Bonanno, A.; Bozzo, G.; Sapia, P.
2017-11-01
In this work, we present a coherent sequence of experiments on electromagnetic (EM) induction and eddy currents, appropriate for university undergraduate students, based on a magnet falling through a drilled aluminum disk. The sequence, leveraging on the didactical interplay between the EM and mechanical aspects of the experiments, allows us to exploit the students’ awareness of mechanics to elicit their comprehension of EM phenomena. The proposed experiments feature two kinds of measurements: (i) kinematic measurements (performed by means of high-speed video analysis) give information on the system’s kinematics and, via appropriate numerical data processing, allow us to get dynamic information, in particular on energy dissipation; (ii) induced electromotive force (EMF) measurements (by using a homemade multi-coil sensor connected to a cheap data acquisition system) allow us to quantitatively determine the inductive effects of the moving magnet on its neighborhood. The comparison between experimental results and the predictions from an appropriate theoretical model (of the dissipative coupling between the moving magnet and the conducting disk) offers many educational hints on relevant topics related to EM induction, such as Maxwell’s displacement current, magnetic field flux variation, and the conceptual link between induced EMF and induced currents. Moreover, the didactical activity gives students the opportunity to be trained in video analysis, data acquisition and numerical data processing.
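The numerical data processing mentioned for measurement type (i) can be illustrated by a short sketch: differentiate tracked positions to get velocity, then estimate the energy dissipated by eddy currents from the energy balance. The synthetic positions, mass, and frame rate below are placeholders for real video-analysis output.

```python
# Sketch of the post-processing step described in the abstract: differentiate the
# tracked fall positions of the magnet to obtain velocity, then estimate the energy
# dissipated by eddy currents from the mechanical energy balance.
# The synthetic "tracked" data below stand in for real high-speed-video output.
import numpy as np

m, g  = 0.010, 9.81                         # magnet mass [kg], gravity [m/s^2]
fps   = 240.0                               # assumed camera frame rate
t     = np.arange(0, 0.5, 1.0 / fps)
z     = 0.5 * g * t**2 * np.exp(-2.0 * t)   # synthetic drop positions [m]

v = np.gradient(z, t)                       # numerical differentiation of position
kinetic    = 0.5 * m * v**2
potential  = m * g * z                      # energy released by the fall
dissipated = potential - kinetic            # energy lost to eddy currents (estimate)

print(f"velocity at last frame: {v[-1]:.3f} m/s")
print(f"estimated dissipated energy: {dissipated[-1]*1e3:.2f} mJ")
```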
FORTRAN programming - A self-taught course
NASA Technical Reports Server (NTRS)
Blecher, S.; Butler, R. V.; Horton, M.; Norrod, V.
1971-01-01
Comprehensive programming course begins with numerical systems and basic concepts, proceeds systematically through FORTRAN language elements, and concludes with discussion of programming techniques. Course is suitable either for individual study or for group study on informal basis.
Excel spreadsheet in teaching numerical methods
NASA Astrophysics Data System (ADS)
Djamila, Harimi
2017-09-01
One of the important objectives in teaching numerical methods to undergraduate students is to bring about comprehension of numerical method algorithms. Although manual calculation is important for understanding the procedure, it is time consuming and prone to error. This is specifically the case when considering the iteration procedures used in many numerical methods. Currently, many commercial programs are useful in teaching numerical methods, such as Matlab, Maple, and Mathematica. These are usually not user-friendly for the uninitiated. An Excel spreadsheet offers an initial level of programming, and it can be used either on or off campus. The students will not be distracted with writing code. It must be emphasized that general commercial software will still need to be introduced later for more elaborate problems. This article reports on a strategy for teaching numerical methods in undergraduate engineering programs. It is directed at students, lecturers and researchers in the engineering field.
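As an example of the iteration procedure students can build row by row in a spreadsheet, the sketch below tabulates Newton-Raphson steps for a simple equation; the function and starting guess are illustrative.

```python
# The kind of iteration table students can build cell-by-cell in a spreadsheet,
# written here as a short script: Newton-Raphson for f(x) = x**2 - 2.
# The function and starting guess are illustrative.
def f(x):
    return x**2 - 2.0

def fprime(x):
    return 2.0 * x

x = 1.0                                  # initial guess (the first spreadsheet row)
print(f"{'iter':>4} {'x':>12} {'f(x)':>12}")
for i in range(6):                       # each loop pass mirrors the next row
    print(f"{i:>4} {x:>12.8f} {f(x):>12.2e}")
    x = x - f(x) / fprime(x)             # the formula copied down the column
```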
Gallego, Carlos; Martín-Aragoneses, M Teresa; López-Higes, Ramón; Pisón, Guzmán
2016-01-01
Deaf students have traditionally exhibited reading comprehension difficulties. In recent years, these comprehension problems have been partially offset through cochlear implantation (CI), and the subsequent improvement in spoken language skills. However, the use of cochlear implants has not managed to fully bridge the gap in language and reading between normally hearing (NH) and deaf children, as its efficacy depends on variables such as the age at implant. This study compared the reading comprehension of sentences in 19 children who received a cochlear implant before 24 months of age (early-CI) and 19 who received it after 24 months (late-CI) with a control group of 19 NH children. The task involved completing sentences in which the last word had been omitted. To complete each sentence, children had to choose a word from among several alternatives that included one syntactic and two semantic foils in addition to the target word. The results showed that deaf children with late-CI performed this task significantly worse than NH children, while those with early-CI exhibited no significant differences from NH children, except under more demanding processing conditions (long sentences with infrequent target words). Further, the error analysis revealed a preference of deaf students with early-CI for selecting the syntactic foil over a semantic one, which suggests that they draw upon syntactic cues during sentence processing in the same way as NH children do. In contrast, deaf children with late-CI do not appear to use a syntactic strategy, nor a semantic strategy based on the use of key words, as the literature suggests. Rather, the numerous errors of both kinds that the late-CI group made seem to indicate an inconsistent and erratic response when faced with a lack of comprehension. These findings are discussed in relation to differences in receptive vocabulary and short-term memory and their implications for sentence reading comprehension. Copyright © 2015 Elsevier Ltd. All rights reserved.
Numerical models for fluid-grains interactions: opportunities and limitations
NASA Astrophysics Data System (ADS)
Esteghamatian, Amir; Rahmani, Mona; Wachs, Anthony
2017-06-01
In the framework of a multi-scale approach, we develop numerical models for suspension flows. At the micro scale level, we perform particle-resolved numerical simulations using a Distributed Lagrange Multiplier/Fictitious Domain approach. At the meso scale level, we use a two-way Euler/Lagrange approach with a Gaussian filtering kernel to model fluid-solid momentum transfer. At both the micro and meso scale levels, particles are individually tracked in a Lagrangian way and all inter-particle collisions are computed by a Discrete Element/Soft-sphere method. These numerical models have been extended to handle particles of arbitrary shape (non-spherical, angular and even non-convex) as well as to treat heat and mass transfer. All simulation tools are fully MPI-parallel with standard domain decomposition and run on supercomputers with satisfactory scalability on up to a few thousand cores. The main asset of multi-scale analysis is the ability to extend our comprehension of the dynamics of suspension flows based on the knowledge acquired from the high-fidelity micro scale simulations and to use that knowledge to improve the meso scale model. We illustrate how we can benefit from this strategy for a fluidized bed, where we introduce a stochastic drag force model derived from micro-scale simulations to recover the proper level of particle fluctuations. Conversely, we discuss the limitations of such modelling tools, such as their limited ability to capture lubrication forces and boundary layers in highly inertial flows. We suggest ways to overcome these limitations in order to enhance further the capabilities of the numerical models.
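To illustrate the soft-sphere collision treatment mentioned above, the sketch below implements a linear spring-dashpot normal contact force between two particles; the stiffness and damping values are illustrative, and the full DLM/FD and Euler/Lagrange machinery is of course far more involved.

```python
# Minimal soft-sphere (linear spring-dashpot) normal contact force between two
# spherical particles, the kind of collision model referred to in the abstract.
# Stiffness and damping values are illustrative placeholders.
import numpy as np

def normal_contact_force(x1, x2, v1, v2, r1, r2, kn=1.0e4, eta_n=5.0):
    """Return the contact force on particle 1 (zero vector if no overlap)."""
    d_vec = x1 - x2
    dist  = np.linalg.norm(d_vec)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return np.zeros(3)
    n = d_vec / dist                          # unit normal pointing from 2 to 1
    vn = np.dot(v1 - v2, n)                   # normal relative velocity
    return (kn * overlap - eta_n * vn) * n    # spring + dashpot

# Example: two slightly overlapping particles approaching each other.
F = normal_contact_force(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.0019]),
                         np.array([0.0, 0.0, 0.1]), np.array([0.0, 0.0, -0.1]),
                         r1=0.001, r2=0.001)
print(F)   # repulsive force on particle 1, directed away from particle 2
```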
A virtual climate library of surface temperature over North America for 1979-2015
NASA Astrophysics Data System (ADS)
Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras
2017-10-01
The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context—for example, to document trends in extreme events in response to climate change—is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979-2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life.
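The idea of a statistical surrogate ensemble can be illustrated with a toy example: a univariate AR(1) emulator generating 100 surrogate anomaly series. The actual model is spatially extended and non-stationary, so the sketch below, with assumed coefficients, only conveys the concept.

```python
# Toy illustration only: a univariate AR(1) "emulator" generating an ensemble of
# surrogate daily temperature anomalies. The paper's model is spatially extended
# and non-stationary; the coefficients below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
phi, sigma_eps = 0.8, 1.5              # assumed AR(1) coefficient and noise std [K]
n_members, n_days = 100, 365 * 37      # 100-member ensemble, 1979-2015-length record

anom = np.zeros((n_members, n_days))
eps  = rng.normal(0.0, sigma_eps, size=(n_members, n_days))
for t in range(1, n_days):
    anom[:, t] = phi * anom[:, t - 1] + eps[:, t]

# Each member is an independent surrogate realization; ensemble statistics can be
# used for risk analysis (e.g., exceedance probabilities of extreme anomalies).
print("ensemble std of anomalies:", anom[:, 365:].std().round(2))
print("P(anomaly > 7 K):", (anom[:, 365:] > 7.0).mean().round(4))
```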
Cardiac rehabilitation: a comprehensive review
Lear, Scott A; Ignaszewski, Andrew
2001-01-01
Cardiac rehabilitation (CR) is a commonly used treatment for men and women with cardiovascular disease. To date, no single study has conclusively demonstrated a comprehensive benefit of CR. Numerous individual studies, however, have demonstrated beneficial effects such as improved risk-factor profile, slower disease progression, decreased morbidity, and decreased mortality. This paper will review the evidence for the use of CR and discuss the implications and limitations of these studies. The safety, relevance to special populations, challenges, and future directions of CR will also be reviewed. PMID:11806801
Chiropractic management of a 47-year-old firefighter with lumbar disk extrusion
Schwab, Matthew J.
2008-01-01
Objective: This case report describes the effect of exercise-based chiropractic treatment on chronic and intractable low back pain complicated by lumbar disk extrusion. Clinical Features: A 47-year-old male firefighter experienced chronic, unresponsive low back pain. Pre- and posttreatment outcome analysis was performed on a numeric (0-10) pain scale, a functional rating index, and the low back pain Oswestry data. Secondary outcome assessments included a 1-rep maximum leg press, balancing times, push-ups and sit-ups the patient performed in 60 seconds, and radiographic analysis. Intervention and Outcome: The patient was treated with Pettibon manipulative and rehabilitative techniques. At 4 weeks, spinal decompression therapy was incorporated. After 12 weeks of treatment, the patient's self-reported numeric pain scale had reduced from 6 to 1. There was also overall improvement in muscular strength, balance times, self-rated functional status, low back Oswestry scores, and lumbar lordosis using pre- and posttreatment radiographic information. Conclusion: Comprehensive, exercise-based chiropractic management may contribute to an improvement of physical fitness and to restoration of function, and may be a protective factor for low back injury. This case suggests promising interventions with otherwise intractable low back pain using a multimodal chiropractic approach that includes isometric strengthening, neuromuscular reeducation, and lumbar spinal decompression therapy. PMID:19646377
The Effects of Music on Pain: A Meta-Analysis.
Lee, Jin Hyung
2016-01-01
Numerous meta-analyses have been conducted on the topic of music and pain, with the latest comprehensive study published in 2006. Since that time, more than 70 randomized controlled trials (RCTs) have been published, necessitating a new and comprehensive review. The aim of this meta-analysis was to examine published RCT studies investigating the effect of music on pain. The present study included RCTs published between 1995 and 2014. Studies were obtained by searching 12 databases and hand-searching related journals and reference lists. Main outcomes were pain intensity, emotional distress from pain, vital signs, and amount of analgesic intake. Study quality was evaluated according to the Cochrane Collaboration guidelines. Analysis of the 97 included studies revealed that music interventions had statistically significant effects in decreasing pain on 0-10 pain scales (MD = -1.13), other pain scales (SMD = -0.39), emotional distress from pain (MD = -10.83), anesthetic use (SMD = -0.56), opioid intake (SMD = -0.24), non-opioid intake (SMD = -0.54), heart rate (MD = -4.25), systolic blood pressure (MD = -3.34), diastolic blood pressure (MD = -1.18), and respiration rate (MD = -1.46). Subgroup and moderator analyses yielded additional clinically informative outcomes. Considering all the possible benefits, music interventions may provide an effective complementary approach for the relief of acute, procedural, and cancer/chronic pain in the medical setting. © the American Music Therapy Association 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.
2008-01-01
Predicting failure in a composite can be done with ply level mechanisms and/or micro level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
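For reference, the plane-stress Tsai-Wu criterion compared in the study can be written in its standard textbook form as follows (the strength parameters come from the lamina tensile, compressive, and shear strengths; no values from the paper are implied):

```latex
% Plane-stress Tsai-Wu criterion: failure is predicted when f >= 1.
\begin{aligned}
f &= F_{1}\sigma_{1} + F_{2}\sigma_{2} + F_{11}\sigma_{1}^{2} + F_{22}\sigma_{2}^{2}
     + F_{66}\tau_{12}^{2} + 2F_{12}\sigma_{1}\sigma_{2},\\[4pt]
F_{1} &= \frac{1}{X_{T}} - \frac{1}{X_{C}}, \qquad
F_{2} = \frac{1}{Y_{T}} - \frac{1}{Y_{C}}, \qquad
F_{11} = \frac{1}{X_{T}X_{C}}, \qquad
F_{22} = \frac{1}{Y_{T}Y_{C}}, \qquad
F_{66} = \frac{1}{S^{2}}.
\end{aligned}
```

Here $X_T$, $X_C$, $Y_T$, $Y_C$ and $S$ are the longitudinal, transverse and shear strengths of the lamina (compressive strengths taken as magnitudes), and the interaction term $F_{12}$ is commonly approximated as $-\tfrac{1}{2}\sqrt{F_{11}F_{22}}$ when biaxial test data are unavailable.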
Cytogenetics of melanoma and nonmelanoma skin cancer.
Carless, Melanie A; Griffiths, Lyn R
2014-01-01
Cytogenetic analysis of melanoma and nonmelanoma skin cancers has revealed recurrent aberrations, the frequency of which is reflective of malignant potential. Highly aberrant karyotypes are seen in melanoma, squamous cell carcinoma, actinic keratosis, Merkel cell carcinoma and cutaneous lymphomas with more stable karyotypes seen in basal cell carcinoma, keratoacanthoma, Bowen's disease and dermatofibrosarcoma protuberans. Some aberrations are common among a number of skin cancer types including rearrangements and numerical abnormalities of chromosome 1, -3p, +3q, partial or entire trisomy 6, trisomy 7, +8q, -9p, +9q, partial or entire loss of chromosome 10, -17p, +17q and partial or entire gain of chromosome 20. Combination of cytogenetic analysis with other molecular genetic techniques has enabled the identification of not only aberrant chromosomal regions, but also the genes that contribute to a malignant phenotype. This review provides a comprehensive summary of the pertinent cytogenetic aberrations associated with a variety of melanoma and nonmelanoma skin cancers.
Computational analysis of Variable Thrust Engine (VTE) performance
NASA Technical Reports Server (NTRS)
Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.
1993-01-01
The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.
The portrait of eikonal instability in Lovelock theories
NASA Astrophysics Data System (ADS)
Konoplya, R. A.; Zhidenko, A.
2017-05-01
Perturbations and eikonal instabilities of black holes and branes in the Einstein-Gauss-Bonnet theory and its Lovelock generalization were considered in the literature for several particular cases, where the asymptotic conditions (flat, dS, AdS), the number of spacetime dimensions D, non-vanishing coupling constants (α1, α2, α3 etc.) and other parameters have been chosen in a specific way. Here we give a comprehensive analysis of the eikonal instabilities of black holes and branes for the most general Lovelock theory, not limited by any of the above cases. Although the part of the stability analysis is performed here purely analytically and formulated in terms of the inequalities for the black hole parameters, the most general case is treated numerically and the accurate regions of instabilities are presented. The shared Mathematica® code allows the reader to construct the regions of eikonal instability for any desired values of the parameters.
NASA Astrophysics Data System (ADS)
Tang, Yang; Wei, Juan; Costello, Catherine E.; Lin, Cheng
2018-04-01
The occurrence of numerous structural isomers in glycans from biological sources presents a severe challenge for structural glycomics. The subtle differences among isomeric structures demand analytical methods that can provide structural details while working efficiently with on-line glycan separation methods. Although liquid chromatography-tandem mass spectrometry (LC-MS/MS) is a powerful tool for mixture analysis, the commonly utilized collision-induced dissociation (CID) method often does not generate a sufficient number of fragments at the MS2 level for comprehensive structural characterization. Here, we studied the electronic excitation dissociation (EED) behaviors of metal-adducted, permethylated glycans, and identified key spectral features that could facilitate both topology and linkage determinations. We developed an EED-based, nanoscale, reversed phase (RP)LC-MS/MS platform, and demonstrated its ability to achieve complete structural elucidation of up to five structural isomers in a single LC-MS/MS analysis.
Snapshot Hyperspectral Volumetric Microscopy
NASA Astrophysics Data System (ADS)
Wu, Jiamin; Xiong, Bo; Lin, Xing; He, Jijun; Suo, Jinli; Dai, Qionghai
2016-04-01
The comprehensive analysis of biological specimens brings about the demand for capturing the spatial, temporal and spectral dimensions of visual information together. However, such high-dimensional video acquisition faces major challenges in developing large data throughput and effective multiplexing techniques. Here, we report the snapshot hyperspectral volumetric microscopy that computationally reconstructs hyperspectral profiles for high-resolution volumes of ~1000 μm × 1000 μm × 500 μm at video rate by a novel four-dimensional (4D) deconvolution algorithm. We validated the proposed approach with both numerical simulations for quantitative evaluation and various real experimental results on the prototype system. Different applications such as biological component analysis in bright field and spectral unmixing of multiple fluorescence are demonstrated. The experiments on moving fluorescent beads and GFP labelled drosophila larvae indicate the great potential of our method for observing multiple fluorescent markers in dynamic specimens.
Acoustically Generated Flows in Flexural Plate Wave Sensors: a Multifield Analysis
NASA Astrophysics Data System (ADS)
Sayar, Ersin; Farouk, Bakhtier
2011-11-01
Acoustically excited flows in a microchannel flexural plate wave device are explored numerically with a coupled solid-fluid mechanics model. The device can be exploited to integrate micropumps with microfluidic chips. A comprehensive understanding of the device requires the development of coupled two or three-dimensional fluid structure interactive (FSI) models. The channel walls are composed of layers of ZnO, Si3N4 and Al. An isothermal equation of state for the fluid (water) is employed. The flexural motions of the channel walls and the resulting flowfields are solved simultaneously. A parametric analysis is performed by varying the values of the driving frequency, voltage of the electrical signal and the channel height. The time averaged axial velocity is found to be proportional to the square of the wave amplitude. The present approach is superior to the method of successive approximations where the solid-liquid coupling is weak.
On the Dynamical Regimes of Pattern-Accelerated Electroconvection.
Davidson, Scott M; Wessling, Matthias; Mani, Ali
2016-03-03
Recent research has established that electroconvection can enhance ion transport at polarized surfaces such as membranes and electrodes where it would otherwise be limited by diffusion. The onset of such overlimiting transport can be influenced by the surface topology of the ion selective membranes as well as inhomogeneities in their electrochemical properties. However, there is little knowledge regarding the mechanisms through which these surface variations promote transport. We use high-resolution direct numerical simulations to develop a comprehensive analysis of electroconvective flows generated by geometric patterns of impermeable stripes and investigate their potential to regularize electrokinetic instabilities. Counterintuitively, we find that reducing the permeable area of an ion exchange membrane, with appropriate patterning, increases the overall ion transport rate by up to 80%. In addition, we present analysis of nonpatterned membranes, and find a novel regime of electroconvection where a multivalued current is possible due to the coexistence of multiple convective states.
NASA Astrophysics Data System (ADS)
Hashmi, M. S.; Khan, N.; Ullah Khan, Sami; Rashidi, M. M.
In this study, we have constructed a mathematical model to investigate the heat source/sink effects in mixed convection axisymmetric flow of an incompressible, electrically conducting Oldroyd-B fluid between two infinite isothermal stretching disks. The effects of viscous dissipation and Joule heating are also considered in the heat equation. The governing partial differential equations are converted into ordinary differential equations by using appropriate similarity variables. The series solution of these dimensionless equations is constructed by using the homotopy analysis method. The convergence of the obtained solution is carefully examined. The effects of various involved parameters on pressure, velocity and temperature profiles are comprehensively studied. A graphical analysis has been presented for various values of the problem parameters. The numerical values of wall shear stress and Nusselt number are computed at both the upper and lower disks. Moreover, a graphical and tabular explanation is provided for the critical values of the Frank-Kamenetskii parameter with respect to the other flow parameters.
Ouyang, Jie; An, Dongli; Chen, Tengteng; Lin, Zhiwei
2017-10-01
In recent years, cosmetic industry profits soared due to the widespread use of cosmetics, which resulted in illicit manufacturers and products of poor quality. Therefore, the rapid and accurate detection of the composition of cosmetics has become crucial. At present, numerous methods, such as gas chromatography and liquid chromatography-mass spectrometry, were available for the analysis of cosmetic ingredients. However, these methods present several limitations, such as failure to perform comprehensive and rapid analysis of the samples. Compared with other techniques, matrix-assisted laser desorption ionization time-of-flight mass spectrometry offered the advantages of wide detection range, fast speed and high accuracy. In this article, we briefly summarized how to select a suitable matrix and adjust the appropriate laser energy. We also discussed the rapid identification of undesired ingredients, focusing on antibiotics and hormones in cosmetics.
Jeon, Jin; Kim, Jae Kwang; Kim, HyeRan; Kim, Yeon Jeong; Park, Yun Ji; Kim, Sun Ju; Kim, Changsoo; Park, Sang Un
2018-02-15
Kale (Brassica oleracea var. acephala) is a rich source of numerous health-benefiting compounds, including vitamins, glucosinolates, phenolic compounds, and carotenoids. However, the genetic resources for exploiting the phyto-nutritional traits of kales are limited. To acquire precise information on secondary metabolites in kales, we performed a comprehensive analysis of the transcriptome and metabolome of green and red kale seedlings. Kale transcriptome datasets revealed 37,149 annotated genes and several secondary metabolite biosynthetic genes. HPLC analysis revealed 14 glucosinolates, 20 anthocyanins, 3 phenylpropanoids, and 6 carotenoids in the kale seedlings that were examined. Red kale contained more glucosinolates, anthocyanins, and phenylpropanoids than green kale, whereas the carotenoid contents were much higher in green kale than in red kale. Ultimately, our data will be a valuable resource for future research on kale bio-engineering and will provide basic information to define gene-to-metabolite networks in kale. Copyright © 2017 Elsevier Ltd. All rights reserved.
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces the computational cost. The presented method is thus capable of investigating, accurately and efficiently, the day-to-day time-variant natural frequency of structures under the intrinsic creep effect of concrete with both probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, progressing in both structure type and uncertainty variables, are presented to demonstrate the computational applicability, accuracy and efficiency of the proposed method.
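To make the target quantities concrete, the sketch below estimates the mean and standard deviation of a natural frequency under a random stiffness by brute-force Monte Carlo on a single-degree-of-freedom oscillator, and sweeps an assumed interval endpoint; it does not reproduce the paper's Chebyshev surrogate or random perturbation machinery.

```python
# Minimal sketch of the target quantities (mean and standard deviation of a natural
# frequency under random input), estimated here by brute-force Monte Carlo on a
# single-DOF oscillator with lognormally distributed stiffness. The distribution,
# masses, and interval endpoints are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
m = 100.0                                   # mass [kg]
k = rng.lognormal(mean=np.log(4.0e6), sigma=0.05, size=200_000)  # stiffness [N/m]

freq_hz = np.sqrt(k / m) / (2.0 * np.pi)    # natural frequency samples
print(f"mean = {freq_hz.mean():.3f} Hz")
print(f"std  = {freq_hz.std():.3f} Hz")

# An interval input (e.g., creep-degraded stiffness bounds) would bound these
# statistics; sweeping the interval endpoints gives the extreme mean/std sought.
for k_scale in (0.9, 1.0):                  # illustrative interval endpoints
    f_s = np.sqrt(k * k_scale / m) / (2.0 * np.pi)
    print(f"scale {k_scale}: mean = {f_s.mean():.3f} Hz, std = {f_s.std():.3f} Hz")
```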
Ji, Yue; Xu, Mengjie; Li, Xingfei; Wu, Tengfei; Tuo, Weixiao; Wu, Jun; Dong, Jiuzhi
2018-06-13
The magnetohydrodynamic (MHD) angular rate sensor (ARS), with its low noise level over an ultra-wide bandwidth, has been developed for lasing and imaging applications, especially line-of-sight (LOS) systems. A modified MHD ARS combined with the Coriolis effect was studied in this paper to expand the sensor's bandwidth at low frequency (<1 Hz), which is essential for precision LOS pointing and wide-bandwidth LOS jitter suppression. The model and the simulation method were constructed, and a comprehensive solving method based on the magnetic and electric interaction methods was proposed. The numerical results on the Coriolis effect and the frequency response of the modified MHD ARS are detailed. In addition, since the experimental results of the designed sensor were consistent with the simulation results, an analysis of the model errors is discussed. Our study provides an error analysis method for an MHD ARS combined with the Coriolis effect and offers a framework for future studies to minimize the error.
BOOK REVIEW: Vortex Methods: Theory and Practice
NASA Astrophysics Data System (ADS)
Cottet, G.-H.; Koumoutsakos, P. D.
2001-03-01
The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez
RTE: A computer code for Rocket Thermal Evaluation
NASA Technical Reports Server (NTRS)
Naraghi, Mohammad H. N.
1995-01-01
The numerical model for a rocket thermal analysis code (RTE) is discussed. RTE is a comprehensive thermal analysis code for thermal analysis of regeneratively cooled rocket engines. The input to the code consists of the composition of the fuel/oxidant mixture and flow rates, chamber pressure, coolant temperature and pressure, dimensions of the engine, materials, and the number of nodes in different parts of the engine. The code allows for temperature variation in axial, radial and circumferential directions. By implementing an iterative scheme, it provides nodal temperature distribution, rates of heat transfer, and hot gas and coolant thermal and transport properties. The fuel/oxidant mixture ratio can be varied along the thrust chamber. This feature allows the user to incorporate a non-equilibrium model or an energy release model for the hot-gas side. The user has the option of bypassing the hot-gas-side calculations and directly inputting the gas-side fluxes. This feature is used to link RTE to a boundary layer module for the hot-gas-side heat flux calculations.
Ehashi, Tomo; Takemura, Taro; Hanagata, Nobutaka; Minowa, Takashi; Kobayashi, Hisatoshi; Ishihara, Kazuhiko; Yamaoka, Tetsuji
2014-01-01
To design scaffolds for tissue regeneration, details of the host body reaction to the scaffolds must be studied. Host body reactions have long been investigated mainly by immunohistological observation. Despite recent dramatic developments in genetic analysis technologies, genetically comprehensive changes in host body reactions are rarely studied. There is no information about host body reactions that can predict successful tissue regeneration in the future. In the present study, porous polyethylene scaffolds were coated with bioactive collagen or bio-inert poly(2-methacryloyloxyethyl phosphorylcholine-co-n-butyl methacrylate) (PMB) and implanted subcutaneously, and the host body reactions to these substrates were compared by normalizing the results against a control non-coated polyethylene scaffold. Comprehensive analyses of the early host body reactions to the scaffolds were carried out using a DNA microarray assay. Among the numerous genes that were expressed differently among these scaffolds, particular genes related to inflammation, wound healing, and angiogenesis were focused upon. Interleukin (IL)-1β and IL-10 are important cytokines in tissue responses to biomaterials because IL-1β promotes both inflammation and wound healing and IL-10 suppresses both of them. IL-1β was up-regulated in the collagen-coated scaffold. Collagen-specifically up-regulated genes contained both M1- and M2-macrophage-related genes. Marked vessel formation in the collagen-coated scaffold occurred in accordance with the up-regulation of many angiogenesis-inducing factors. The DNA microarray assay provided global information regarding the host body reaction. Interestingly, several up-regulated genes were detected even on the very bio-inert PMB-coated surfaces, and those genes include the inflammation-suppressive and wound-healing-suppressive IL-10, suggesting that not only the active tissue response but also the inert response may relate to these genetic regulations. PMID:24454803
2013-09-30
numerical efforts undertaken here implement established aspects of Boussinesq-type modeling, developed by the PI and other researchers. These aspects...the Boussinesq-type framework, and then implement in a numerical model. Once this comprehensive model is developed and tested against established...phenomena that might be observed at New River. WORK COMPLETED In FY13 we have continued the development of a Boussinesq-type formulation that
The Role of Computer Simulation in Nanoporous Metals—A Review
Xia, Re; Wu, Run Ni; Liu, Yi Lun; Sun, Xiao Yu
2015-01-01
Nanoporous metals (NPMs) have proven to be all-round candidates for versatile and diverse applications. Over the past decade, interest has grown in the fabrication, characterization and applications of these intriguing materials. Most existing reviews focus on experimental and theoretical work rather than on numerical simulation. Alongside the numerous experiments and theoretical analyses, studies based on computer simulation, which can model complex microstructures in more realistic ways, play a key role in understanding and predicting the behavior of NPMs. In this review, we present a comprehensive overview of computer simulations of NPMs prepared through chemical dealloying. Firstly, we summarize the various simulation approaches to the preparation, processing, and basic physical and chemical properties of NPMs, with emphasis on work involving dealloying, coarsening and mechanical properties. Then, we conclude with the latest progress as well as the future challenges in simulation studies. We believe that highlighting the importance of simulations will help to better understand the properties of novel materials and support new scientific research on these materials. PMID:28793491
Numerical Analysis of the Performance of Millimeter-Wave RoF-Based Cellular Backhaul Links
NASA Astrophysics Data System (ADS)
Pham, Thu A.; Pham, Hien T. T.; Le, Hai-Chau; Dang, Ngoc T.
2017-08-01
In this paper, we study the performance of a next-generation cellular backhaul network based on a hybrid architecture using radio-over-fiber (RoF) and millimeter-wave (MMW) techniques. We develop a mathematical model and comprehensively analyze the performance of an MMW/RoF-based backhaul downlink under the impact of various physical-layer impairments originating from both the optical fiber and wireless links. More specifically, the effects of nonlinear distortion, chromatic dispersion, fading, and several types of noise, including shot noise, thermal noise, amplifier noise, and relative intensity noise, are investigated. The numerical results show that nonlinear distortion, fiber dispersion, and wireless fading are the key factors limiting system performance. Setting the modulation index properly helps minimize the effect of nonlinear distortion, while deploying dispersion-shifted optical fibers reduces the impact of dispersion; together, these measures improve the bit-error rate. Moreover, it is also verified that, to mitigate the effect of multipath fading, remote radio heads should be located as near the remote antenna units as possible.
Perceptual and academic patterns of learning-disabled/gifted students.
Waldron, K A; Saphire, D G
1992-04-01
This research explored ways gifted children with learning disabilities perceive and recall auditory and visual input and apply this information to reading, mathematics, and spelling. 24 learning-disabled/gifted children and a matched control group of normally achieving gifted students were tested for oral reading, word recognition and analysis, listening comprehension, and spelling. In mathematics, they were tested for numeration, mental and written computation, word problems, and numerical reasoning. To explore perception and memory skills, students were administered formal tests of visual and auditory memory as well as auditory discrimination of sounds. Their responses to reading and to mathematical computations were further considered for evidence of problems in visual discrimination, visual sequencing, and visual spatial areas. Analyses indicated that these learning-disabled/gifted students were significantly weaker than controls in their decoding skills, in spelling, and in most areas of mathematics. They were also significantly weaker in auditory discrimination and memory, and in visual discrimination, sequencing, and spatial abilities. Conclusions are that these underlying perceptual and memory deficits may be related to students' academic problems.
Shahbazi, Mohammad; Saranlı, Uluç; Babuška, Robert; Lopes, Gabriel A D
2016-12-05
This paper introduces approximate time-domain solutions to the otherwise non-integrable double-stance dynamics of the 'bipedal' spring-loaded inverted pendulum (B-SLIP) in the presence of non-negligible damping. We first introduce an auxiliary system whose behavior under certain conditions is approximately equivalent to that of the B-SLIP in double stance. Then, we derive approximate solutions to the dynamics of the new system following two different methods: (i) an updated-momentum approach that can deal with both the lossy and lossless B-SLIP models, and (ii) a perturbation-based approach, for which we derive a solution only for the lossless case. The prediction performance of each method is characterized via a comprehensive numerical analysis. The derived representations are computationally very efficient compared to numerical integration and are hence suitable for online planning, increasing the autonomy of walking robots. Two application examples of walking gait control are presented. The proposed solutions can serve as instrumental tools in various fields such as control in legged robotics and human motion understanding in biomechanics.
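For context, the baseline against which such closed-form approximations are typically judged is direct numerical integration of the stance dynamics. The sketch below integrates a single-stance SLIP model with a viscous leg damper in Python; it is a minimal illustration, not the authors' double-stance B-SLIP model, and the mass, stiffness, damping, and touchdown state are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Single-stance SLIP dynamics in polar coordinates (r, theta) about the toe,
# with a viscous damper in the leg representing the lossy case.
m, g = 80.0, 9.81        # point mass [kg], gravity [m/s^2]
k, c = 20_000.0, 80.0    # leg stiffness [N/m], leg damping [N s/m]
l0 = 1.0                 # leg rest length [m]

def stance_dynamics(t, y):
    r, dr, th, dth = y   # theta measured from the vertical through the toe
    ddr = r * dth**2 - g * np.cos(th) + (k * (l0 - r) - c * dr) / m
    ddth = (g * np.sin(th) - 2.0 * dr * dth) / r
    return [dr, ddr, dth, ddth]

# Touchdown state (illustrative); liftoff detection (event on r = l0) is omitted.
y0 = [l0, -0.6, np.deg2rad(-15.0), 2.0]
sol = solve_ivp(stance_dynamics, (0.0, 0.3), y0, max_step=1e-3)
print(f"leg length after 0.3 s of stance: {sol.y[0, -1]:.3f} m")
```

Closed-form approximations of the kind derived in the paper replace this integration loop with a handful of algebraic evaluations, which is what makes them attractive for online gait planning.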
Sensitivity of Rayleigh wave ellipticity and implications for surface wave inversion
NASA Astrophysics Data System (ADS)
Cercato, Michele
2018-04-01
The use of Rayleigh wave ellipticity has gained increasing popularity in recent years for investigating earth structures, especially for near-surface soil characterization. In spite of its widespread application, the sensitivity of the ellipticity function to the soil structure has rarely been explored in a comprehensive and systematic manner. To this end, a new analytical method is presented for computing the sensitivity of Rayleigh wave ellipticity with respect to the structural parameters of a layered elastic half-space. This method takes advantage of the minor decomposition of the surface wave eigenproblem and is numerically stable at high frequency. The procedure allows the sensitivity to be retrieved for typical near-surface and crustal geological scenarios, pointing out the key parameters for ellipticity interpretation under different circumstances. On this basis, a thorough analysis is performed to assess how ellipticity data can efficiently complement surface wave dispersion information in a joint inversion algorithm. Synthetic and real-world examples are presented to analyse quantitatively the diagnostic potential of the ellipticity data with respect to the soil structure, focusing on possible sources of misinterpretation in data inversion.
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., reading comprehension, or translation from graphic to numerical representation, that may be learned in... scored by a computer. Disabled student: A student who has a physical or mental impairment that..., DEPARTMENT OF EDUCATION STUDENT ASSISTANCE GENERAL PROVISIONS Approval of Independently Administered Tests...
ERIC Educational Resources Information Center
Eklund, Lowell
1976-01-01
Adult educators, providing necessary lifelong, comprehensive education, face the major imperative of solving the problem of racism. Numerous other imperatives concerning program design and content must be fulfilled in effectively meeting adult learners' needs. Guidance from the principles expressed in a proposed adult educator's Hippocratic Oath…
Indications of a positive feedback between coastal development and beach nourishment
Armstrong, Scott; Lazarus, Eli; Limber, Patrick W.; Goldstein, Evan; Thorpe, Curtis; Ballinger, Rhoda
2016-01-01
Beach nourishment, a method for mitigating coastal storm damage or chronic erosion by deliberately replacing sand on an eroded beach, has been the leading form of coastal protection in the U.S. for four decades. However, investment in hazard protection can have the unintended consequence of encouraging development in places especially vulnerable to damage. In a comprehensive, parcel-scale analysis of all shorefront single-family homes in the state of Florida, we find that houses in nourishing zones are significantly larger and more numerous than in non-nourishing zones. The predominance of larger homes in nourishing zones suggests a positive feedback between nourishment and development that is compounding coastal risk in zones already characterized by high vulnerability.
Factor structure of the Hooper Visual Organization Test: a cross-cultural replication and extension.
Merten, Thomas
2005-01-01
To investigate construct validity of the Hooper Visual Organization Test (VOT), a principal-axis analysis was performed on the neuropsychological test results of 200 German-speaking neurological patients who received a comprehensive battery, encompassing tests of visuospatial functions, memory, attention, executive functions, naming ability, and vocabulary. A four-factor solution was obtained with substantial loadings of the VOT only on the first factor, interpreted as a global dimension of non-verbal cognitive functions. This factor loaded significantly on numerous measures of visuospatial processing and attention (with particularly high loadings on WAIS-R Block Design, Trails A and B, and Raven's Standard Progressive Matrices). The remaining three factors were interpreted as memory, verbal abilities (vocabulary), and a separate factor of naming abilities.
Evaluation of DNA mixtures from database search.
Chung, Yuk-Ka; Hu, Yue-Qing; Fung, Wing K
2010-03-01
With the aim of bridging the gap between DNA mixture analysis and DNA database search, a novel approach is proposed to evaluate the forensic evidence of DNA mixtures when the suspect is identified by the search of a database of DNA profiles. General formulae are developed for the calculation of the likelihood ratio for a two-person mixture under general situations including multiple matches and imperfect evidence. The influence of the prior probabilities on the weight of evidence under the scenario of multiple matches is demonstrated by a numerical example based on Hong Kong data. Our approach is shown to be capable of presenting the forensic evidence of DNA mixtures in a comprehensive way when the suspect is identified through database search.
Epigenetic and genetic diagnosis of Silver-Russell syndrome.
Eggermann, Thomas; Spengler, Sabrina; Gogiel, Magdalena; Begemann, Matthias; Elbracht, Miriam
2012-06-01
Silver-Russell syndrome (SRS) is a congenital imprinting disorder characterized by intrauterine and postnatal growth restriction and further characteristic features. SRS is genetically heterogeneous: 7-10% of patients carry a maternal uniparental disomy of chromosome 7; >38% show a hypomethylation in imprinting control region 1 in 11p15; and a further class of mutations is copy number variations affecting different chromosomes, but mainly 11p15 and 7. The diagnostic work-up should thus aim to detect these three molecular subtypes. Numerous techniques are currently applied in genetic SRS testing, but none of them covers all known (epi)mutations, and they should therefore be used synergistically. However, future next-generation sequencing approaches will allow a comprehensive analysis of all types of alterations in SRS.
Indications of a positive feedback between coastal development and beach nourishment
NASA Astrophysics Data System (ADS)
Armstrong, Scott B.; Lazarus, Eli D.; Limber, Patrick W.; Goldstein, Evan B.; Thorpe, Curtis; Ballinger, Rhoda C.
2016-12-01
Beach nourishment, a method for mitigating coastal storm damage or chronic erosion by deliberately replacing sand on an eroded beach, has been the leading form of coastal protection in the United States for four decades. However, investment in hazard protection can have the unintended consequence of encouraging development in places especially vulnerable to damage. In a comprehensive, parcel-scale analysis of all shorefront single-family homes in the state of Florida, we find that houses in nourishing zones are significantly larger and more numerous than in non-nourishing zones. The predominance of larger homes in nourishing zones suggests a positive feedback between nourishment and development that is compounding coastal risk in zones already characterized by high vulnerability.
Rdzanek, Wojciech P
2016-06-01
This study deals with the classical problem of sound radiation from an excited clamped circular plate embedded in a flat rigid baffle. The system of two coupled differential equations is solved: one for the excited and damped vibrations of the plate, and the other, the Helmholtz equation. An approach using an expansion into radial polynomials leads to results for the modal impedance coefficients useful for a comprehensive numerical analysis of sound radiation. The results obtained are accurate and efficient in a wide low-frequency range and can easily be adapted for a simply supported circular plate. The fluid loading is included, providing accurate results at resonance.
[What kind of information do German health information pamphlets provide on mammography screening?].
Kurzenhäuser, Stephanie
2003-02-01
To make an informed decision on participation in mammography screening, women have to be educated about all the risks and benefits of the procedure in a manner that is detailed and understandable. However, an analysis of 27 German health pamphlets on mammography screening shows that much relevant information about the benefits, the risks, and especially the meaning of screening results is insufficiently communicated. Many statements were presented narratively rather than as precise statistics. Depending on content, 17 to 62% of the quantifiable statements were actually given as numerical data. To provide comprehensive information and to avoid misunderstandings, it is necessary to supplement the currently available health pamphlets and make the information on mammography screening more precise.
Communicating data about the benefits and harms of treatment: a randomized trial.
Woloshin, Steven; Schwartz, Lisa M
2011-07-19
Despite limited evidence, it is often asserted that natural frequencies (for example, 2 in 1000) are the best way to communicate absolute risks. To compare comprehension of treatment benefit and harm when absolute risks are presented as natural frequencies, percents, or both. Parallel-group randomized trial with central allocation and masking of investigators to group assignment, conducted through an Internet survey in September 2009. (ClinicalTrials.gov registration number: NCT00950014) National sample of U.S. adults randomly selected from a professional survey firm's research panel of about 30,000 households. 2944 adults aged 18 years or older (all with complete follow-up). Tables presenting absolute risks in 1 of 5 numeric formats: natural frequency (x in 1000), variable frequency (x in 100, x in 1000, or x in 10,000, as needed to keep the numerator >1), percent, percent plus natural frequency, or percent plus variable frequency. Comprehension as assessed by 18 questions (primary outcome) and judgment of treatment benefit and harm. The average number of comprehension questions answered correctly was lowest in the variable frequency group and highest in the percent group (13.1 vs. 13.8; difference, 0.7 [95% CI, 0.3 to 1.1]). The proportion of participants who "passed" the comprehension test (≥13 correct answers) was lowest in the natural and variable frequency groups and highest in the percent group (68% vs. 73%; difference, 5 percentage points [CI, 0 to 10 percentage points]). The largest format effect was seen for the 2 questions about absolute differences: the proportion correct in the natural frequency versus percent groups was 43% versus 72% (P < 0.001) and 73% versus 87% (P < 0.001). Even when data were presented in the percent format, one third of participants failed the comprehension test. Natural frequencies are not the best format for communicating the absolute benefits and harms of treatment. The more succinct percent format resulted in better comprehension: Comprehension was slightly better overall and notably better for absolute differences. Attorney General Consumer and Prescriber Education grant program, the Robert Wood Johnson Pioneer Program, and the National Cancer Institute.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, Christopher; Nelson, Robert
The development of comprehensive two-dimensional gas chromatography (GC x GC) has expanded the analytical window for studying complex mixtures like oil. Compared to traditional gas chromatography, this technology separates and resolves at least an order of magnitude more compounds, has a much larger signal to noise ratio, and sorts compounds based on their chemical class; hence providing highly refined inventories of petroleum hydrocarbons in geochemical samples that were previously unattainable. In addition to the increased resolution afforded by GC x GC, the resulting chromatograms have been used to estimate the liquid vapor pressures, aqueous solubilities, octanol-water partition coefficients, and vaporization enthalpies of petroleum hydrocarbons. With these relationships, powerful and incisive analyses of phase-transfer processes affecting petroleum hydrocarbon mixtures in the environment are available. For example, GC x GC retention data has been used to quantitatively deconvolve the effects of phase transfer processes such as water washing and evaporation. In short, the positive attributes of GC x GC analysis have led to a methodology that has revolutionized the analysis of petroleum hydrocarbons. Overall, this research has opened numerous fields of study on the biogeochemical "genetics" (referred to as petroleomics) of petroleum samples in both subsurface and surface environments. Furthermore, these new findings have already been applied to the behavior of oil at other seeps as well, for petroleum exploration and oil spill studies.
Deafness for the meanings of number words
Caño, Agnès; Rapp, Brenda; Costa, Albert; Juncadella, Montserrat
2008-01-01
We describe the performance of an aphasic individual who showed a selective impairment affecting his comprehension of auditorily presented number words and not other word categories. His difficulty in number word comprehension was restricted to the auditory modality, given that with visual stimuli (written words, Arabic numerals and pictures) his comprehension of number and non-number words was intact. While there have been previous reports of selective difficulty or sparing of number words at the semantic and post-semantic levels, this is the first reported case of a pre-semantic deficit that is specific to the category of number words. This constitutes evidence that lexical semantic distinctions are respected by modality-specific neural mechanisms responsible for providing access to the meanings of words. PMID:17915265
Comprehensive T-Matrix Reference Database: A 2007-2009 Update
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Zakharova, Nadia T.; Videen, Gorden; Khlebtsov, Nikolai G.; Wriedt, Thomas
2010-01-01
The T-matrix method is among the most versatile, efficient, and widely used theoretical techniques for the numerically exact computation of electromagnetic scattering by homogeneous and composite particles, clusters of particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of T-matrix publications compiled by us previously and includes the publications that appeared since 2007. It also lists several earlier publications not included in the original database.
Mapping analysis and planning system for the John F. Kennedy Space Center
NASA Technical Reports Server (NTRS)
Hall, C. R.; Barkaszi, M. J.; Provancha, M. J.; Reddick, N. A.; Hinkle, C. R.; Engel, B. A.; Summerfield, B. R.
1994-01-01
Environmental management, impact assessment, research and monitoring are multidisciplinary activities which are ideally suited to incorporate a multi-media approach to environmental problem solving. Geographic information systems (GIS), simulation models, neural networks and expert-system software are some of the advancing technologies being used for data management, query, analysis and display. At the 140,000-acre John F. Kennedy Space Center, the Advanced Software Technology group has been supporting the development and implementation of a program that integrates these and other rapidly evolving hardware and software capabilities into a comprehensive Mapping, Analysis and Planning System (MAPS) based in a workstation/local area network environment. An expert-system shell is being developed to link the various databases and to guide users through the numerous stages of a facility siting and environmental assessment. The expert-system shell approach is appealing for its ease of data access by management-level decision makers while maintaining the involvement of the data specialists. This, as well as increased efficiency and accuracy in data analysis and report preparation, can benefit any organization involved in natural resources management.
Linearized Unsteady Aerodynamic Analysis of the Acoustic Response to Wake/Blade-Row Interaction
NASA Technical Reports Server (NTRS)
Verdon, Joseph M.; Huff, Dennis L. (Technical Monitor)
2001-01-01
The three-dimensional, linearized Euler analysis, LINFLUX, is being developed to provide a comprehensive and efficient unsteady aerodynamic scheme for predicting the aeroacoustic and aeroelastic responses of axial-flow turbomachinery blading. LINFLUX couples a near-field, implicit, wave-split, finite-volume solution to far-field acoustic eigensolutions, to predict the aerodynamic responses of a blade row to prescribed structural and aerodynamic excitations. It is applied herein to predict the acoustic responses of a fan exit guide vane (FEGV) to rotor wake excitations. The intent is to demonstrate and assess the LINFLUX analysis via application to realistic wake/blade-row interactions. Numerical results are given for the unsteady pressure responses of the FEGV, including the modal pressure responses at inlet and exit. In addition, predictions for the modal and total acoustic power levels at the FEGV exit are compared with measurements. The present results indicate that the LINFLUX analysis should be useful in the aeroacoustic design process, and for understanding the three-dimensional flow physics relevant to blade-row noise generation and propagation.
Solving Upwind-Biased Discretizations: Defect-Correction Iterations
NASA Technical Reports Server (NTRS)
Diskin, Boris; Thomas, James L.
1999-01-01
This paper considers defect-correction solvers for a second order upwind-biased discretization of the 2D convection equation. The following important features are reported: (1) The asymptotic convergence rate is about 0.5 per defect-correction iteration. (2) If the operators involved in defect-correction iterations have different approximation order, then the initial convergence rates may be very slow. The number of iterations required to get into the asymptotic convergence regime might grow on fine grids as a negative power of h. In the case of a second order target operator and a first order driver operator, this number of iterations is roughly proportional to h^(-1/3). (3) If both the operators have the second approximation order, the defect-correction solver demonstrates the asymptotic convergence rate after three iterations at most. The same three iterations are required to converge algebraic error below the truncation error level. A novel comprehensive half-space Fourier mode analysis (which, by the way, can take into account the influence of discretized outflow boundary conditions as well) for the defect-correction method is developed. This analysis explains many phenomena observed in solving non-elliptic equations and provides a close prediction of the actual solution behavior. It predicts the convergence rate for each iteration and the asymptotic convergence rate. As a result of this analysis, a new very efficient adaptive multigrid algorithm solving the discrete problem to within a given accuracy is proposed. Numerical simulations confirm the accuracy of the analysis and the efficiency of the proposed algorithm. The results of the numerical tests are reported.
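To make the defect-correction iteration itself concrete, the sketch below applies it to a 1-D model problem du/dx = f with a first-order upwind driver operator and a second-order upwind-biased target operator; the paper studies the 2-D convection equation, so this is only a minimal illustration with an assumed grid size, right-hand side, and iteration count.

```python
import numpy as np

def build_operators(N, h):
    """First-order upwind 'driver' L1 and second-order upwind-biased 'target' L2
    for du/dx on x_i = i*h, i = 1..N, with the inflow value u(0) = 0."""
    L1 = np.zeros((N, N))
    L2 = np.zeros((N, N))
    for j in range(N):
        L1[j, j] = 1.0 / h
        if j >= 1:
            L1[j, j - 1] = -1.0 / h
        if j == 0:                      # too few upstream points: fall back to first order
            L2[j, j] = 1.0 / h
        else:                           # (3u_i - 4u_{i-1} + u_{i-2}) / (2h)
            L2[j, j] = 1.5 / h
            L2[j, j - 1] = -2.0 / h
            if j >= 2:
                L2[j, j - 2] = 0.5 / h
    return L1, L2

def defect_correction(N=64, iters=8):
    h = 1.0 / N
    x = h * np.arange(1, N + 1)
    u_exact = np.sin(2 * np.pi * x)          # satisfies the inflow condition u(0) = 0
    f = 2 * np.pi * np.cos(2 * np.pi * x)    # f = du/dx
    L1, L2 = build_operators(N, h)
    u = np.zeros(N)
    for k in range(iters):
        defect = f - L2 @ u                  # residual of the target discretization
        u = u + np.linalg.solve(L1, defect)  # correction computed with the cheap driver
        print(f"iteration {k + 1}: max error = {np.max(np.abs(u - u_exact)):.3e}")
    return u

if __name__ == "__main__":
    defect_correction()
```

Each iteration costs one residual evaluation with the target operator plus one solve with the lower-order driver, which is the practical appeal of the approach.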
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it
2014-10-06
This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is considerably lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in considerably reducing the seismic vulnerability of such historical structures.
NASA Astrophysics Data System (ADS)
Darius, D.; Misaran, M. S.; Rahman, Md. M.; Ismail, M. A.; Amaludin, A.
2017-07-01
The study of the effect of working parameters such as pipe material, pipe length, pipe diameter, depth of burial of the pipe, air flow rate and different types of soils on the thermal performance of earth-air heat exchanger (EAHE) systems is crucial to ensure that thermal comfort can be achieved. In the past decade, researchers have performed studies to develop numerical models for the analysis of EAHE systems. Two-dimensional models replaced the early numerical models of the 1990s, and more recently, advanced analyses have employed three-dimensional models, specifically Computational Fluid Dynamics (CFD) simulation, in the analysis of EAHE systems. This paper reviews the models previously used to analyse EAHE systems and the working parameters that affect EAHE thermal performance, as of February 2017. Recent findings on the parameters affecting EAHE performance are also presented and discussed. In conclusion, with the advent of CFD methods, investigative work has shifted toward modelling and simulation, as this saves time and cost. Comprehension of the EAHE working parameters and their effect on system performance is largely established. However, studies of the effect of soil type and soil characteristics on the performance of EAHE systems are surprisingly scarce. Therefore, future studies should focus on the effect of soil characteristics such as moisture content, soil density, and soil type on the thermal performance of EAHE systems.
Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.
Shamir, Lior; Long, Joe
2016-01-01
While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild change separated by shorter periods of more rapid alteration. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear but exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.
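As a rough illustration of this kind of analysis, the sketch below cross-validates an age predictor on per-scan feature vectors and reports the Pearson correlation between predicted and chronological age. The feature matrix is synthetic stand-in data and the model choice is arbitrary; neither reproduces the descriptors or pipeline used in the study.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

# Hypothetical inputs: X holds low-level image content descriptors (one row per
# MRI scan) and y holds the chronological ages. Here both are simulated.
rng = np.random.default_rng(0)
n_scans, n_features = 463, 200
y = rng.uniform(18, 90, n_scans)
X = np.outer(y, rng.normal(size=n_features)) + rng.normal(scale=5.0, size=(n_scans, n_features))

# Cross-validated age prediction from the descriptors.
model = RandomForestRegressor(n_estimators=100, random_state=0)
y_pred = cross_val_predict(model, X, y, cv=5)

# Correlation between predicted ("brain") age and chronological age.
r, p = pearsonr(y_pred, y)
print(f"Pearson r = {r:.3f} (p = {p:.2g})")
```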
Greening the consent decree: the ORD-NRMRL experience
The prevalence of combined and septic sewer overflows in most US cities has led to numerous enforcement actions under the Clean Water Act (1972). Contemporary circumstances require a more comprehensive redress of violations due to CSO activity. The integration of green infrastru...
Overaccommodation in a Singapore Eldercare Facility
ERIC Educational Resources Information Center
Cavallaro, Francesco; Seilhamer, Mark Fifer; Chee, Yi Tian Felicia; Ng, Bee Chin
2016-01-01
Numerous studies have shown that some speech accommodation in interactions with the elderly can aid communication. "Over"accommodaters, however, employing features such as high pitch, exaggerated prosody, and child-like forms of address, often demean, infantilise, and patronise elderly interlocutors rather than facilitate comprehension.…
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.; Schairer, Edward; Hicks, Gary; Wander, Stephen; Blankson, Isiaiah; Rose, Raymond; Olson, Lawrence; Unger, George
1990-01-01
Presented here is a comprehensive review of the following aerodynamics elements: computational methods and applications, computational fluid dynamics (CFD) validation, transition and turbulence physics, numerical aerodynamic simulation, drag reduction, test techniques and instrumentation, configuration aerodynamics, aeroacoustics, aerothermodynamics, hypersonics, subsonic transport/commuter aviation, fighter/attack aircraft and rotorcraft.
Wilson, Anna J; Revkin, Susannah K; Cohen, David; Cohen, Laurent; Dehaene, Stanislas
2006-01-01
Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9-year-old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week, over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However, these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention also need to be addressed. PMID:16734906
Requirements for Next Generation Comprehensive Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Data, Anubhav
2008-01-01
The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. Requirements for what must be included are described and substantiated, and justification is provided for what should be excluded. With this guide, a path to the next generation code can be found.
Thermal radiative properties: Nonmetallic solids.
NASA Technical Reports Server (NTRS)
Touloukian, Y. S.; Dewitt, D. P.
1972-01-01
The volume consists of a text on theory, estimation, and measurement, together with its bibliography, the main body of numerical data and its references, and the material index. The text material assumes a role complementary to the main body of numerical data. The physics and basic concepts of thermal radiation are discussed in detail, focusing attention on treatment of nonmetallic materials: theory, estimation, and methods of measurement. Numerical data is presented in a comprehensive manner. The scope of coverage includes the nonmetallic elements and their compounds, intermetallics, polymers, glasses, and minerals. Analyzed data graphs provide an evaluative review of the data. All data have been obtained from their original sources, and each data set is so referenced.
NASA Astrophysics Data System (ADS)
López-Venegas, Alberto M.; Horrillo, Juan; Pampell-Manis, Alyssa; Huérfano, Victor; Mercado, Aurelio
2015-06-01
The most recent tsunami observed along the coast of the island of Puerto Rico occurred on October 11, 1918, after a magnitude 7.2 earthquake in the Mona Passage. The earthquake initiated a tsunami that mostly affected the northwestern coast of the island. Runup values from a post-tsunami survey indicated the waves reached up to 6 m. A controversy regarding the source of the tsunami has resulted in several numerical simulations involving either fault rupture or a submarine landslide as the most probable cause. Here we follow up on previous simulations of the tsunami from a submarine landslide source off the western coast of Puerto Rico, as initiated by the earthquake. Improvements over our previous study include: (1) higher-resolution bathymetry; (2) a 3D-2D coupled numerical model specifically developed for the tsunami; (3) use of the non-hydrostatic numerical model NEOWAVE (non-hydrostatic evolution of ocean WAVE) featuring two-way nesting capabilities; and (4) a comprehensive energy analysis to determine the time of full tsunami wave development. The three-dimensional tsunami solution, using a Navier-Stokes algorithm with multiple interfaces for the two fluids (water and landslide), was used to determine the initial wave characteristics generated by the submarine landslide. Use of NEOWAVE enabled us to solve for coastal inundation, wave propagation, and detailed runup. Our results agree with previous work in which a submarine landslide is favored as the most probable source of the tsunami, and the improved bathymetric resolution yielded inundation of the coastal areas that compares well with values from the post-tsunami survey. Our energy analysis indicates that most of the wave energy is isolated in the wave generation region, particularly at depths near the landslide, and once the initial wave propagates out of the generation region its energy begins to stabilize.
Numerical Hydrodynamics in General Relativity.
Font, José A
2003-01-01
The current status of numerical solutions for the equations of ideal general relativistic hydrodynamics is reviewed. With respect to an earlier version of the article, the present update provides additional information on numerical schemes, and extends the discussion of astrophysical simulations in general relativistic hydrodynamics. Different formulations of the equations are presented, with special mention of conservative and hyperbolic formulations well-adapted to advanced numerical methods. A large sample of available numerical schemes is discussed, paying particular attention to solution procedures based on schemes exploiting the characteristic structure of the equations through linearized Riemann solvers. A comprehensive summary of astrophysical simulations in strong gravitational fields is presented. These include gravitational collapse, accretion onto black holes, and hydrodynamical evolutions of neutron stars. The material contained in these sections highlights the numerical challenges of various representative simulations. It also follows, to some extent, the chronological development of the field, concerning advances on the formulation of the gravitational field and hydrodynamic equations and the numerical methodology designed to solve them. Supplementary material is available for this article at 10.12942/lrr-2003-4.
Spencer, Mercedes; Wagner, Richard K
2018-06-01
The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language (vocabulary, listening comprehension, storytelling ability, and semantic and syntactic knowledge). Results indicated that children with SCD had deficits in oral language (d = -0.78, 95% CI [-0.89, -0.68]), but these deficits were not as severe as their deficit in reading comprehension (d = -2.78, 95% CI [-3.01, -2.54]). When compared to reading comprehension age-matched normal readers, the oral language skills of the two groups were comparable (d = 0.32, 95% CI [-0.49, 1.14]), which suggests that the oral language weaknesses of children with SCD represent a developmental delay rather than developmental deviance. Theoretical and practical implications of these findings are discussed.
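For readers who want to reproduce this kind of summary statistic, the sketch below computes a standardized mean difference (Cohen's d) with an approximate 95% confidence interval from group summary data; the means, standard deviations, and sample sizes are invented for illustration and are not taken from the meta-analysis.

```python
import math

def cohens_d_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Standardized mean difference (Cohen's d) with an approximate 95% CI."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical group summaries: an SCD group versus a comparison group on an
# oral-language measure (illustrative numbers only).
d, (lo, hi) = cohens_d_ci(m1=92.0, sd1=11.0, n1=60, m2=100.0, sd2=10.0, n2=60)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```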
A Victorian Experiment in Economic Efficiency in Education.
ERIC Educational Resources Information Center
Rapple, Brendan A.
1992-01-01
There are numerous historical precedents for today's demands for economic efficiency and accountability in schools. This paper examines comprehensive nineteenth-century accountability system ("Payment by Results") that endured in English and Welsh elementary schools from 1862 until 1897. The amount of the yearly governmental grant…
WISC-R Examiner Errors: Cause for Concern.
ERIC Educational Resources Information Center
Slate, John R.; Chick, David
1989-01-01
Clinical psychology graduate students (N=14) administered Wechsler Intelligence Scale for Children-Revised. Found numerous scoring and mechanical errors that influenced full-scale intelligence quotient scores on two-thirds of protocols. Particularly prone to error were Verbal subtests of Vocabulary, Comprehension, and Similarities. Noted specific…
Inward, Daegan J G; Vogler, Alfried P; Eggleton, Paul
2007-09-01
The first comprehensive combined molecular and morphological phylogenetic analysis of the major groups of termites is presented. This was based on the analysis of three genes (cytochrome oxidase II, 12S and 28S) and worker characters for approximately 250 species of termites. Parsimony analysis of the aligned dataset showed that the monophyly of Hodotermitidae, Kalotermitidae and Termitidae were well supported, while Termopsidae and Rhinotermitidae were both paraphyletic on the estimated cladogram. Within Termitidae, the most diverse and ecologically most important family, the monophyly of Macrotermitinae, Foraminitermitinae, Apicotermitinae, Syntermitinae and Nasutitermitinae were all broadly supported, but Termitinae was paraphyletic. The pantropical genera Termes, Amitermes and Nasutitermes were all paraphyletic on the estimated cladogram, with at least 17 genera nested within Nasutitermes, given the presently accepted generic limits. Key biological features were mapped onto the cladogram. It was not possible to reconstruct the evolution of true workers unambiguously, as it was as parsimonious to assume a basal evolution of true workers and subsequent evolution of pseudergates, as to assume a basal condition of pseudergates and subsequent evolution of true workers. However, true workers were only found in species with either separate- or intermediate-type nests, so that the mapping of nest habit and worker type onto the cladogram were perfectly correlated. Feeding group evolution, however, showed a much more complex pattern, particularly within the Termitidae, where it proved impossible to estimate unambiguously the ancestral state within the family (which is associated with the loss of worker gut flagellates). However, one biologically plausible optimization implies an initial evolution from wood-feeding to fungus-growing, proposed as the ancestral condition within the Termitidae, followed by the very early evolution of soil-feeding and subsequent re-evolution of wood-feeding in numerous lineages.
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
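For readers who want a feel for the kind of problem such a tool solves, the sketch below is a minimal explicit finite-difference solution of 1-D advection-dispersion with first-order decay, written in Python rather than the authors' EXCEL/VBA implementation; the domain, transport parameters, reaction rate, and boundary conditions are illustrative assumptions.

```python
import numpy as np

def adr_1d(L=1.0, nx=101, v=0.5, D=0.01, k=0.2, t_end=1.0, c0=1.0):
    """Solve dc/dt = -v dc/dx + D d2c/dx2 - k c on [0, L] with a constant
    inlet concentration and a zero-gradient outlet, by explicit time stepping."""
    dx = L / (nx - 1)
    # time step limited by the advective (CFL) and diffusive stability criteria
    dt = 0.4 * min(dx / v, dx**2 / (2.0 * D))
    nt = int(np.ceil(t_end / dt))
    c = np.zeros(nx)
    c[0] = c0
    for _ in range(nt):
        adv = -v * (c[1:-1] - c[:-2]) / dx                  # first-order upwind (v > 0)
        dif = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2  # central dispersion
        c[1:-1] += dt * (adv + dif - k * c[1:-1])           # first-order decay reaction
        c[0] = c0                                           # inlet boundary
        c[-1] = c[-2]                                       # zero-gradient outlet
    return np.linspace(0.0, L, nx), c

x, c = adr_1d()
print(np.round(c[::20], 4))   # concentration profile at t_end, every 20th node
```

A benchmark-style check would compare this profile against an analytical solution or another code, which is exactly the role the five literature-derived benchmarks play for the tool described above.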
Kelly, Ronald R; Berent, Gerald P
2011-01-01
This research contrasted deaf and hearing students' interpretive knowledge of English sentences containing numeral quantifier phrases and indefinite noun phrases. A multiple-interpretation picture task methodology was used to assess 305 participants' judgments of the compatibility of sentence meanings with depicted discourse contexts. Participants' performance was assessed on the basis of hearing level (deaf, hearing) and grade level (middle school, high school, college). The deaf students were predicted to have differential access to specific sentence interpretations in accordance with the relative derivational complexity of the targeted sentence types. Hypotheses based on the pressures of derivational economy on acquisition were largely supported. The results also revealed that the deaf participants tended to overactivate pragmatic processes that yielded principled, though non-target, sentence interpretations. Collectively, the results not only contribute to the understanding of English acquisition under conditions of restricted access to spoken language input, they also suggest that pragmatic factors may play a broad role in influencing, and compromising, deaf students' reading comprehension and written expression.
Helitzer, Deborah; Hollis, Christine; Cotner, Jane; Oestreicher, Nancy
2009-01-01
Health literacy requires reading and writing skills as well as knowledge of health topics and health systems. Materials written at high reading levels with ambiguous, technical, or dense text, often place great comprehension demands on consumers with lower literacy skills. This study developed and used an instrument to analyze cervical cancer prevention materials for readability, comprehensibility, suitability, and message design. The Suitability Assessment of Materials (SAM) was amended for ease of use, inclusivity, and objectivity with the encouragement of the original developers. Other novel contributions were specifically related to "comprehensibility" (CAM). The resulting SAM + CAM was used to score 69 materials for content, literacy demand, numeric literacy, graphics, layout/typography, and learning stimulation variables. Expert reviewers provided content validation. Inter-rater reliability was "substantial" (kappa = .77). The mean reading level of materials was 11th grade. Most materials (68%) scored as "adequate" for comprehensibility, suitability, and message design; health education brochures scored better than other materials. Only one-fifth were ranked "superior" for ease of use and comprehensibility. Most written materials have a readability level that is too high and require improvement in ease of use and comprehensibility for the majority of readers.
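As an aside on the reliability statistic reported here, the snippet below shows one common way to compute inter-rater agreement (Cohen's kappa) from two reviewers' categorical scores using scikit-learn; the ratings are invented placeholders, not the study's data. On the widely used Landis and Koch scale, values between 0.61 and 0.80 are labelled "substantial" agreement.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings by two reviewers scoring the same 20 materials:
# 0 = not suitable, 1 = adequate, 2 = superior (illustrative values only).
rater_a = [2, 1, 1, 0, 2, 1, 1, 1, 0, 2, 1, 1, 2, 0, 1, 1, 2, 1, 1, 0]
rater_b = [2, 1, 1, 0, 2, 1, 0, 1, 0, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")
```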
NASA Astrophysics Data System (ADS)
Zheng, Yibo; Zhang, Lei; Wang, Yuan
2017-10-01
In this letter, surface plasmon resonance sensors based on grapefruit-type photonic crystal fiber (PCF) with different silver nano-filling structures have been analyzed and compared through the finite element method (FEM). The variation of the resonant wavelength with the refractive index of the sample has been numerically simulated. The surface plasmon resonance (SPR) sensing properties have been numerically evaluated for both resonant-wavelength and intensity detection. Numerical results show that an excellent sensor resolution of 4.17×10^-5 RIU can be achieved when the radius of the filling silver nanowires is 150 nm, using the spectral detection method. A comprehensive comparison indicates that the 150 nm silver-wire filling structure is suitable for spectral detection and the 30 nm silver-film coating structure is suitable for amplitude detection.
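To illustrate how a resolution figure of this kind is typically derived from simulated spectra, the sketch below converts resonance-wavelength shifts into a wavelength sensitivity and then into a refractive-index resolution, assuming a 0.1 nm spectrometer resolution; the resonance wavelengths and index values are invented for illustration and are not the paper's FEM results.

```python
import numpy as np

# Hypothetical loss-peak (resonance) wavelengths for a range of analyte
# refractive indices, as would be extracted from FEM spectra.
n_analyte = np.array([1.33, 1.34, 1.35, 1.36])
lam_res = np.array([560.0, 584.0, 610.0, 638.0])     # nm

# Wavelength-interrogation sensitivity S = d(lambda_res)/dn.
S = np.gradient(lam_res, n_analyte)                  # nm per RIU

# Resolution for an assumed 0.1 nm spectrometer wavelength resolution.
delta_lambda_min = 0.1                               # nm
resolution = delta_lambda_min / S                    # RIU
for n, s, r in zip(n_analyte, S, resolution):
    print(f"n = {n:.2f}: S = {s:7.1f} nm/RIU, resolution = {r:.2e} RIU")
```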
A numerical analysis of the performance of unpumped SBE 41 sensors at low flushing rates
NASA Astrophysics Data System (ADS)
Alvarez, A.
2018-05-01
The thermal and hydrodynamic response of a Sea-Bird unpumped CTD SBE 41 is numerically modeled to assess the biases occurring at the slow flushing rates typical of glider operations. Based on symmetry considerations, the sensor response is approximated by coupling the incompressible Navier-Stokes and the thermal advection-diffusion equations in two dimensions. Numerical results illustrate three regimes in the thermal response of the SBE 41 sensor when crossing water layers with different thermal signatures. A linear decay in time of the bulk temperature of the conductivity cell is initially found. This is induced by the transit of the inflow through the conductivity cell in the form of a relatively narrow jet. Water masses with new thermal signatures do not immediately fill the sensor chambers, where the cross-section widens. Thermal equilibrium of these water masses is then achieved, in a second regime, via cross-flow thermal diffusion between the boundary of the jet and the walls. Consequently, the evolution of the bulk temperature scales with the square root of time. In a third regime, the evolution of the bulk temperature depends on the thermal gradient between the fluid and the coating material. This results in an exponential decay of the bulk temperature with time. A comprehensive analytical model of the time evolution of the bulk temperature inside a cell is proposed based on these results.
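To make the three regimes concrete, the toy model below stitches together a linear segment, a square-root-of-time segment, and an exponential tail for the normalized bulk-temperature anomaly after a step change, enforcing continuity at the transitions; the transition times, rate constants, and time constant are illustrative assumptions, not the calibrated model from the paper.

```python
import numpy as np

def bulk_temperature_anomaly(t, t1=2.0, t2=8.0, a=0.05, b=0.12, tau=20.0):
    """Normalized anomaly theta(t) = (T - T_new) / (T_old - T_new) after a step
    change, pieced together from the three regimes described above."""
    theta1 = 1.0 - a * t1                              # value at the end of regime 1
    theta2 = theta1 - b * (np.sqrt(t2) - np.sqrt(t1))  # value at the end of regime 2
    t = np.asarray(t, dtype=float)
    out = np.empty_like(t)
    m1, m2, m3 = t < t1, (t >= t1) & (t < t2), t >= t2
    out[m1] = 1.0 - a * t[m1]                              # regime 1: linear decay (jet transit)
    out[m2] = theta1 - b * (np.sqrt(t[m2]) - np.sqrt(t1))  # regime 2: sqrt(t) cross-flow diffusion
    out[m3] = theta2 * np.exp(-(t[m3] - t2) / tau)         # regime 3: exponential relaxation
    return out

t = np.linspace(0.0, 60.0, 7)
print(np.round(bulk_temperature_anomaly(t), 3))
```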
Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V
2009-06-03
The prokaryotic toxin-antitoxin systems (TAS, also referred to as TA loci) are widespread, mobile two-gene modules that can be viewed as selfish genetic elements because they evolved mechanisms to become addictive for replicons and cells in which they reside, but also possess "normal" cellular functions in various forms of stress response and management of prokaryotic population. Several distinct TAS of type 1, where the toxin is a protein and the antitoxin is an antisense RNA, and numerous, unrelated TAS of type 2, in which both the toxin and the antitoxin are proteins, have been experimentally characterized, and it is suspected that many more remain to be identified. We report a comprehensive comparative-genomic analysis of Type 2 toxin-antitoxin systems in prokaryotes. Using sensitive methods for distant sequence similarity search, genome context analysis and a new approach for the identification of mobile two-component systems, we identified numerous, previously unnoticed protein families that are homologous to toxins and antitoxins of known type 2 TAS. In addition, we predict 12 new families of toxins and 13 families of antitoxins, and also, predict a TAS or TAS-like activity for several gene modules that were not previously suspected to function in that capacity. In particular, we present indications that the two-gene module that encodes a minimal nucleotidyl transferase and the accompanying HEPN protein, and is extremely abundant in many archaea and bacteria, especially, thermophiles might comprise a novel TAS. We present a survey of previously known and newly predicted TAS in 750 complete genomes of archaea and bacteria, quantitatively demonstrate the exceptional mobility of the TAS, and explore the network of toxin-antitoxin pairings that combines plasticity with selectivity. The defining properties of the TAS, namely, the typically small size of the toxin and antitoxin genes, fast evolution, and extensive horizontal mobility, make the task of comprehensive identification of these systems particularly challenging. However, these same properties can be exploited to develop context-based computational approaches which, combined with exhaustive analysis of subtle sequence similarities were employed in this work to substantially expand the current collection of TAS by predicting both previously unnoticed, derived versions of known toxins and antitoxins, and putative novel TAS-like systems. In a broader context, the TAS belong to the resistome domain of the prokaryotic mobilome which includes partially selfish, addictive gene cassettes involved in various aspects of stress response and organized under the same general principles as the TAS. The "selfish altruism", or "responsible selfishness", of TAS-like systems appears to be a defining feature of the resistome and an important characteristic of the entire prokaryotic pan-genome given that in the prokaryotic world the mobilome and the "stable" chromosomes form a dynamic continuum. This paper was reviewed by Kenn Gerdes (nominated by Arcady Mushegian), Daniel Haft, Arcady Mushegian, and Andrei Osterman. For full reviews, go to the Reviewers' Reports section.
Computational Models of Laryngeal Aerodynamics: Potentials and Numerical Costs.
Sadeghi, Hossein; Kniesburges, Stefan; Kaltenbacher, Manfred; Schützenberger, Anne; Döllinger, Michael
2018-02-07
Human phonation is based on the interaction between tracheal airflow and laryngeal dynamics. This fluid-structure interaction is based on the energy exchange between airflow and vocal folds. Major challenges in analyzing the phonatory process in-vivo are the small dimensions and the poor accessibility of the region of interest. For improved analysis of the phonatory process, numerical simulations of the airflow and the vocal fold dynamics have been suggested. Even though most of the models reproduced the phonatory process fairly well, development of comprehensive larynx models is still a subject of research. In the context of clinical application, physiological accuracy and computational model efficiency are of great interest. In this study, a simple numerical larynx model is introduced that incorporates the laryngeal fluid flow. It is based on a synthetic experimental model with silicone vocal folds. The degree of realism was successively increased in separate computational models and each model was simulated for 10 oscillation cycles. Results show that relevant features of the laryngeal flow field, such as glottal jet deflection, develop even when applying rather simple static models with oscillating flow rates. Including further phonatory components such as vocal fold motion, mucosal wave propagation, and ventricular folds, the simulations show phonatory key features like intraglottal flow separation and increased flow rate in presence of ventricular folds. The simulation time on 100 CPU cores ranged between 25 and 290 hours, currently restricting clinical application of these models. Nevertheless, results show high potential of numerical simulations for better understanding of phonatory process. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Regulation of Glycan Structures in Animal Tissues
Nairn, Alison V.; York, William S.; Harris, Kyle; Hall, Erica M.; Pierce, J. Michael; Moremen, Kelley W.
2008-01-01
Glycan structures covalently attached to proteins and lipids play numerous roles in mammalian cells, including protein folding, targeting, recognition, and adhesion at the molecular or cellular level. Regulating the abundance of glycan structures on cellular glycoproteins and glycolipids is a complex process that depends on numerous factors. Most models for glycan regulation hypothesize that transcriptional control of the enzymes involved in glycan synthesis, modification, and catabolism determines glycan abundance and diversity. However, few broad-based studies have examined correlations between glycan structures and transcripts encoding the relevant biosynthetic and catabolic enzymes. Low transcript abundance for many glycan-related genes has hampered broad-based transcript profiling for comparison with glycan structural data. In an effort to facilitate comparison with glycan structural data and to identify the molecular basis of alterations in glycan structures, we have developed a medium-throughput quantitative real time reverse transcriptase-PCR platform for the analysis of transcripts encoding glycan-related enzymes and proteins in mouse tissues and cells. The method employs a comprehensive list of >700 genes, including enzymes involved in sugar-nucleotide biosynthesis, transporters, glycan extension, modification, recognition, catabolism, and numerous glycosylated core proteins. Comparison with parallel microarray analyses indicates a significantly greater sensitivity and dynamic range for our quantitative real time reverse transcriptase-PCR approach, particularly for the numerous low abundance glycan-related enzymes. Mapping of the genes and transcript levels to their respective biosynthetic pathway steps allowed a comparison with glycan structural data and provides support for a model where many, but not all, changes in glycan abundance result from alterations in transcript expression of corresponding biosynthetic enzymes. PMID:18411279
Bletzer, Keith V
2015-01-01
Satisfaction surveys are common in the field of health education as a means of helping organizations improve the appropriateness of training materials and the effectiveness of facilitation and presentation. Such data can be qualitative, and their analysis often becomes specialized. This technical article aims to reveal whether qualitative survey results can be visualized by presenting them as a Word Cloud. Qualitative materials in the form of written comments on an agency-specific satisfaction survey were coded and quantified. The resulting quantitative data were used to convert comments into "input terms" to generate Word Clouds, increasing comprehension and accessibility through visualization of the written responses. A three-tier display incorporated a Word Cloud at the top, followed by the corresponding frequency table and a textual summary of the qualitative data represented by the Word Cloud imagery. This mixed format recognizes that people vary in which format is most effective for assimilating new information. The combination of visual representation through Word Clouds, complemented by quantified qualitative materials, is one means of increasing comprehensibility for a range of stakeholders who might not be familiar with numerical tables or statistical analyses.
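As a concrete illustration of the workflow described in the abstract above, the following is a minimal Python sketch assuming the third-party wordcloud package and hypothetical coded term frequencies; the article itself does not name a particular tool.

```python
# Minimal sketch: coded comment frequencies -> Word Cloud image
# (assumes the third-party "wordcloud" package is installed).
from wordcloud import WordCloud

# Hypothetical frequencies obtained by coding and quantifying written comments.
coded_terms = {"helpful": 42, "clear": 31, "materials": 27, "pace": 12, "handouts": 9}

cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate_from_frequencies(coded_terms)
cloud.to_file("survey_wordcloud.png")  # image for the top tier of the three-tier display
```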
Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes
NASA Astrophysics Data System (ADS)
Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana
2014-05-01
In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The identification of relevant vulnerability indicators allowed for the identification of the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility to critical infrastructures was assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining hazard and vulnerability analysis.
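The hazard-and-vulnerability combination described above can be sketched generically; the following Python fragment is only an assumed illustration with hypothetical node names and values, not the authors' actual risk computation.

```python
import numpy as np

# Illustrative only: combine a probabilistic hazard value (e.g. the probability
# of exceeding a tephra-load threshold) with a normalized vulnerability index
# to rank territorial nodes by relative risk. All values are hypothetical.
nodes = ["node_A", "node_B", "node_C"]
p_exceed = np.array([0.30, 0.10, 0.55])    # hazard: exceedance probability
vulnerability = np.array([0.8, 0.9, 0.2])  # vulnerability index in [0, 1]

relative_risk = p_exceed * vulnerability
for name, risk in sorted(zip(nodes, relative_risk), key=lambda t: -t[1]):
    print(f"{name}: relative risk {risk:.2f}")
```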
Peng, Zhi-yu; Zhou, Xin; Li, Linchuan; Yu, Xiangchun; Li, Hongjiang; Jiang, Zhiqiang; Cao, Guangyu; Bai, Mingyi; Wang, Xingchun; Jiang, Caifu; Lu, Haibin; Hou, Xianhui; Qu, Lijia; Wang, Zhiyong; Zuo, Jianru; Fu, Xiangdong; Su, Zhen; Li, Songgang; Guo, Hongwei
2009-01-01
Plant hormones are small organic molecules that influence almost every aspect of plant growth and development. Genetic and molecular studies have revealed a large number of genes that are involved in responses to numerous plant hormones, including auxin, gibberellin, cytokinin, abscisic acid, ethylene, jasmonic acid, salicylic acid, and brassinosteroid. Here, we develop an Arabidopsis hormone database, which aims to provide a systematic and comprehensive view of genes participating in plant hormonal regulation, as well as morphological phenotypes controlled by plant hormones. Based on data from mutant studies, transgenic analysis and gene ontology (GO) annotation, we have identified a total of 1026 genes in the Arabidopsis genome that participate in plant hormone functions. Meanwhile, a phenotype ontology is developed to precisely describe myriad hormone-regulated morphological processes with standardized vocabularies. A web interface (http://ahd.cbi.pku.edu.cn) would allow users to quickly get access to information about these hormone-related genes, including sequences, functional category, mutant information, phenotypic description, microarray data and linked publications. Several applications of this database in studying plant hormonal regulation and hormone cross-talk will be presented and discussed. PMID:19015126
Profit Analysis Model of Smart Item Implementation in Integrated Supply Chain Process
NASA Astrophysics Data System (ADS)
Tritularsih, Yustina; Rinanto, Andhy; Prasetyo, Hoedi; Nur Rosyidi, Cucuk
2018-03-01
Nowadays, all links of the supply chain need to integrate their different infrastructures and gain better control over them to drive higher profits. This integration should give companies an overall and transparent insight into their supply chain activities. An intelligent supply chain, mainly supported by Smart Item technology, can satisfy the need for such integration. By means of Smart Items, a company can gain advantages such as cost reduction and value creation. However, no comprehensive Smart Item infrastructure currently exists, so it is difficult to calculate the true benefits. This paper attempts to recommend a model for estimating the benefits of implementing Smart Items in a company which has an integrated supply chain process. Here, an integrated supply chain means that the three echelons of the supply chain (supplier, shipper and retailer) belong to one company. The proposed model was used to determine the shrinkage value and RFID tag price which give the maximum benefit of Smart Item implementation. A numerical example is also provided to give a better comprehension of the model calculation.
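A hedged sketch of the kind of trade-off such a model evaluates is given below; the cost structure, parameter names, and numbers are assumptions for illustration, not the paper's proposed model.

```python
# Simplified, assumed cost structure: savings from reduced shrinkage and
# handling versus the cost of RFID tags across an integrated supply chain.
def net_benefit(units_per_year, item_value, shrinkage_rate, shrinkage_reduction,
                tag_price, read_points=3, handling_saving_per_read=0.02):
    shrinkage_saving = units_per_year * item_value * shrinkage_rate * shrinkage_reduction
    handling_saving = units_per_year * read_points * handling_saving_per_read
    tag_cost = units_per_year * tag_price
    return shrinkage_saving + handling_saving - tag_cost

# Example: scan candidate tag prices for the highest one with a positive net benefit.
for price in [0.05, 0.10, 0.20, 0.40]:
    print(price, round(net_benefit(1_000_000, 8.0, 0.02, 0.5, price), 2))
```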
Cacciola, Francesco; Mangraviti, Domenica; Rigano, Francesca; Donato, Paola; Dugo, Paola; Mondello, Luigi; Cortes, Hernan J
2018-06-01
Shikimic acid is an intermediate of aromatic amino acid biosynthesis and the preferred starting material for production of the most commonly prescribed anti-influenza drug, Tamiflu. Its six-membered carbocyclic ring is adorned with several chiral centers and various functionalities, making shikimic acid a valuable chiral synthon. When microbially produced, in addition to shikimic acid, numerous other metabolites are exported out of the cytoplasm and accumulate in the culture medium. This extracellular matrix of metabolites is referred to as the microbosphere. Due to the high sample complexity, in this study the microbosphere of shikimate-producing Escherichia coli SP1.1/pKD15.071 was analyzed by liquid chromatography and comprehensive two-dimensional liquid chromatography coupled to photodiode array and mass spectrometry detection. GC analysis of the trimethylsilyl derivatives was also carried out in order to support the elucidation of the selected metabolites in the microbosphere. The elucidation of the metabolic fraction of this bacterial strain might be a valuable aid for improving, through genetic changes, the concentration and yield of shikimic acid synthesized from glucose.
Multi-fracture response of cross-ply ceramic composites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erdman, D.L.; Weitsman, Y.J.
1996-12-31
Ceramic matrix composites are candidate materials for high temperature applications due to their ability to retain mechanical properties. However, in view of the relatively low transverse strength and ductility associated with unidirectional ceramic matrix lay-ups, it is necessary to consider multi-directional reinforcement for any practical structural application. The simplest laminate that would provide multi-directional toughness would be the cross-ply lay-up. Although there are numerous publications concerned with modeling of the stress-strain response of unidirectional ceramic matrix laminates, there are relatively few investigations in the current literature which deal with laminates such as the cross-ply lay-up. Additionally, the aforementioned publications are often incomplete since they fail to address the failure mechanisms associated with this lay-up in a comprehensive manner and consequently have limited success in correlating experimental stress-strain response with mechanical test results. Furthermore, many current experimental investigations fail to report the details of damage evolution and stress-strain response which are required for correlation with analyses. This investigation presents a comprehensive extended shear-lag type analysis that considers transverse matrix cracking in the 90° plies, the non-linearity of the 0° plies, and slip at the 0/90 ply interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SacconePhD, Scott F; Chesler, Elissa J; Bierut, Laura J
Commercial SNP microarrays now provide comprehensive and affordable coverage of the human genome. However, some diseases have biologically relevant genomic regions that may require additional coverage. Addiction, for example, is thought to be influenced by complex interactions among many relevant genes and pathways. We have assembled a list of 486 biologically relevant genes nominated by a panel of experts on addiction. We then added 424 genes that showed evidence of association with addiction phenotypes through mouse QTL mappings and gene co-expression analysis. We demonstrate that there are a substantial number of SNPs in these genes that are not well represented by commercial SNP platforms. We address this problem by introducing a publicly available SNP database for addiction. The database is annotated using numeric prioritization scores indicating the extent of biological relevance. The scores incorporate a number of factors such as SNP/gene functional properties (including synonymy and promoter regions), data from mouse systems genetics and measures of human/mouse evolutionary conservation. We then used HapMap genotyping data to determine if a SNP is tagged by a commercial microarray through linkage disequilibrium. This combination of biological prioritization scores and LD tagging annotation will enable addiction researchers to supplement commercial SNP microarrays to ensure comprehensive coverage of biologically relevant regions.
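The LD-tagging step described above can be illustrated with a minimal Python sketch; the r²-from-genotype-dosages approximation and the 0.8 threshold are common conventions assumed here, and the genotype vectors are hypothetical.

```python
import numpy as np

def ld_r2(genotypes_a, genotypes_b):
    """Squared Pearson correlation between genotype dosages (0/1/2) of two SNPs,
    a common approximation of the LD statistic r^2."""
    return np.corrcoef(genotypes_a, genotypes_b)[0, 1] ** 2

def is_tagged(target, array_snps, threshold=0.8):
    """Return True if any SNP on the array tags the target SNP at r^2 >= threshold."""
    return any(ld_r2(target, snp) >= threshold for snp in array_snps)

# Hypothetical genotype vectors for eight individuals.
target = np.array([0, 1, 2, 1, 0, 2, 1, 0])
array_snps = [np.array([0, 1, 2, 1, 0, 2, 1, 1]),
              np.array([2, 1, 0, 1, 2, 0, 1, 2])]
print(is_tagged(target, array_snps))
```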
Hernandez-Prieto, Miguel A; Futschik, Matthias E
2012-01-01
Synechocystis sp. PCC6803 is one of the best studied cyanobacteria and an important model organism for our understanding of photosynthesis. The early availability of its complete genome sequence initiated numerous transcriptome studies, which have generated a wealth of expression data. Analysis of the accumulated data can be a powerful tool to study transcription in a comprehensive manner and to reveal underlying regulatory mechanisms, as well as to annotate genes whose functions are yet unknown. However, use of divergent microarray platforms, as well as distributed data storage make meta-analyses of Synechocystis expression data highly challenging, especially for researchers with limited bioinformatic expertise and resources. To facilitate utilisation of the accumulated expression data for a wider research community, we have developed CyanoEXpress, a web database for interactive exploration and visualisation of transcriptional response patterns in Synechocystis. CyanoEXpress currently comprises expression data for 3073 genes and 178 environmental and genetic perturbations obtained in 31 independent studies. At present, CyanoEXpress constitutes the most comprehensive collection of expression data available for Synechocystis and can be freely accessed. The database is available for free at http://cyanoexpress.sysbiolab.eu.
2013-01-01
There is considerable interest in the structural and functional properties of the angular gyrus (AG). Located in the posterior part of the inferior parietal lobule, the AG has been shown in numerous meta-analysis reviews to be consistently activated in a variety of tasks. This review discusses the involvement of the AG in semantic processing, word reading and comprehension, number processing, default mode network, memory retrieval, attention and spatial cognition, reasoning, and social cognition. This large functional neuroimaging literature depicts a major role for the AG in processing concepts rather than percepts when interfacing perception-to-recognition-to-action. More specifically, the AG emerges as a cross-modal hub where converging multisensory information is combined and integrated to comprehend and give sense to events, manipulate mental representations, solve familiar problems, and reorient attention to relevant information. In addition, this review discusses recent findings that point to the existence of multiple subdivisions in the AG. This spatial parcellation can serve as a framework for reporting AG activations with greater definition. This review also acknowledges that the role of the AG cannot comprehensibly be identified in isolation but needs to be understood in parallel with the influence from other regions. Several interesting questions that warrant further investigations are finally emphasized. PMID:22547530
Examining the Effects of Classroom Discussion on Students' Comprehension of Text: A Meta-Analysis
ERIC Educational Resources Information Center
Murphy, P. Karen; Wilkinson, Ian A. G.; Soter, Anna O.; Hennessey, Maeghan N.; Alexander, John F.
2009-01-01
The role of classroom discussions in comprehension and learning has been the focus of investigations since the early 1960s. Despite this long history, no syntheses have quantitatively reviewed the vast body of literature on classroom discussions for their effects on students' comprehension and learning. This comprehensive meta-analysis of…
ERIC Educational Resources Information Center
Lan, Yi-Chin; Lo, Yu-Ling; Hsu, Ying-Shao
2014-01-01
Comprehension is the essence of reading. Finding appropriate and effective reading strategies to support students' reading comprehension has always been a critical issue for educators. This article presents findings from a meta-analysis of 17 studies of metacognitive strategy instruction on students' reading comprehension in computerized…
Numerical Hydrodynamics and Magnetohydrodynamics in General Relativity.
Font, José A
2008-01-01
This article presents a comprehensive overview of numerical hydrodynamics and magneto-hydrodynamics (MHD) in general relativity. Some significant additions have been incorporated with respect to the previous two versions of this review (2000, 2003), most notably the coverage of general-relativistic MHD, a field in which remarkable activity and progress has occurred in the last few years. Correspondingly, the discussion of astrophysical simulations in general-relativistic hydrodynamics is enlarged to account for recent relevant advances, while those dealing with general-relativistic MHD are amply covered in this review for the first time. The basic outline of this article is nevertheless similar to its earlier versions, save for the addition of MHD-related issues throughout. Hence, different formulations of both the hydrodynamics and MHD equations are presented, with special mention of conservative and hyperbolic formulations well adapted to advanced numerical methods. A large sample of numerical approaches for solving such hyperbolic systems of equations is discussed, paying particular attention to solution procedures based on schemes exploiting the characteristic structure of the equations through linearized Riemann solvers. As previously stated, a comprehensive summary of astrophysical simulations in strong gravitational fields is also presented. These are detailed in three basic sections, namely gravitational collapse, black-hole accretion, and neutron-star evolutions; despite the boundaries, these sections may (and in fact do) overlap throughout the discussion. The material contained in these sections highlights the numerical challenges of various representative simulations. It also follows, to some extent, the chronological development of the field, concerning advances in the formulation of the gravitational field, hydrodynamics and MHD equations and the numerical methodology designed to solve them. To keep the length of this article reasonable, an effort has been made to focus on multidimensional studies, directing the interested reader to earlier versions of the review for discussions on one-dimensional works. Supplementary material is available for this article at 10.12942/lrr-2008-7.
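As a pointer to the class of methods mentioned above (approximate Riemann solvers for hyperbolic systems), the following Python sketch implements the simple HLL flux for the 1-D Euler equations; it is a generic textbook scheme given under assumed ideal-gas parameters, not code from the review.

```python
import numpy as np

GAMMA = 1.4  # ideal-gas ratio of specific heats (assumed)

def euler_flux(U):
    """Physical flux of the 1-D Euler equations for U = [rho, rho*u, E]."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, u * (E + p)])

def hll_flux(UL, UR):
    """HLL approximate Riemann solver flux at an interface between states UL and UR."""
    def speed_bounds(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
        c = np.sqrt(GAMMA * p / rho)
        return u - c, u + c
    sLm, sLp = speed_bounds(UL)
    sRm, sRp = speed_bounds(UR)
    sL, sR = min(sLm, sRm), max(sLp, sRp)
    FL, FR = euler_flux(UL), euler_flux(UR)
    if sL >= 0.0:
        return FL
    if sR <= 0.0:
        return FR
    return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

# Sod-like interface states given as [rho, rho*u, E] with E = p/(GAMMA-1) for u = 0.
UL = np.array([1.0, 0.0, 1.0 / (GAMMA - 1.0)])
UR = np.array([0.125, 0.0, 0.1 / (GAMMA - 1.0)])
print(hll_flux(UL, UR))
```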
The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.
Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent
2018-05-02
RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances enabled the analysis of larger, complex datasets and the investigation of microRNAs and the less known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate genuine patterns from low-level, noise-like variation. Numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, so the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools and workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data, e.g. tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows that guide users through common first steps in sRNA-seq analyses, such as quality checking of the input data, normalization of abundances, and detection of differential expression. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. v.moulton@uea.ac.uk.
Benavides-Varela, S; Piva, D; Burgio, F; Passarini, L; Rolma, G; Meneghello, F; Semenza, C
2017-03-01
Arithmetical deficits in right-hemisphere damaged patients have been traditionally considered secondary to visuo-spatial impairments, although the exact relationship between the two deficits has rarely been assessed. The present study implemented a voxelwise lesion analysis among 30 right-hemisphere damaged patients and a controlled, matched-sample, cross-sectional analysis with 35 cognitively normal controls, regressing three composite cognitive measures on standardized numerical measures. The results showed that patients and controls significantly differed in Number comprehension, Transcoding, and Written operations, particularly subtractions and multiplications. The percentage of patients performing below the cutoffs ranged between 27% and 47% across these tasks. Spatial errors were associated with extensive lesions in fronto-temporo-parietal regions, which frequently lead to neglect, whereas pure arithmetical errors appeared related to more confined lesions in the right angular gyrus and its proximity. Stepwise regression models consistently revealed that spatial errors were primarily predicted by composite measures of visuo-spatial attention/neglect and representational abilities. Conversely, specific errors of an arithmetical nature were linked to representational abilities only. Crucially, the proportion of arithmetical errors (ranging from 65% to 100% across tasks) was higher than that of spatial ones. These findings thus suggest that unilateral right hemisphere lesions can directly affect core numerical/arithmetical processes, and that right-hemisphere acalculia is not only ascribable to visuo-spatial deficits as traditionally thought. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nakashima, Yasuaki; Mano, Masayuki; Tomita, Yasuhiko; Nagasaki, Ikumitsu; Kubo, Toshikazu; Araki, Nobuhito; Haga, Hironori; Toguchida, Junya; Ueda, Takafumi; Sakuma, Toshiko; Imahori, Masaya; Morii, Eiichi; Yoshikawa, Hideki; Tsukamoto, Yoshitane; Futani, Hiroyuki; Wakasa, Kenichi; Hoshi, Manabu; Hamada, Shinshichi; Takeshita, Hideyuki; Inoue, Takeshi; Aono, Masanari; Kawabata, Kenji; Murata, Hiroaki; Katsura, Kanade; Urata, Yoji; Ueda, Hideki; Yanagisawa, Akio
2015-01-01
The aims of this study were: (i) to elucidate clinicopathological characteristics of pcCHS of long bones (L), limb girdles (LG) and trunk (T) in Japan; (ii) to investigate predictive pathological findings for outcome of pcCHS of L, LG and T, objectively; and (iii) to elucidate a discrepancy of grade between biopsy and resected specimens. Clinicopathological profiles of 174 pcCHS (79 male, 95 female), of L, LG, and T were retrieved. For each case, a numerical score was given to 18 pathological findings. The average age was 50.5 years (15–80 years). Frequently involved sites were femur, humerus, pelvis and rib. The 5‐year and 10‐year disease‐specific survival (DSS) rates [follow‐up: 1–258 months (average 65.5)] were 87.0% and 80.4%, respectively. By Cox hazards analysis on pathological findings, age, sex and location, histologically higher grade and older age were unfavorable predictors, and calcification was a favorable predictor in DSS. The histological grade of resected specimen was higher than that of biopsy in 37.7% (26/69 cases). In conclusion, higher histological grade and older age were predictors for poor, but calcification was for good prognosis. Because there was a discrepancy in grade between biopsy and resected specimens, comprehensive evaluation is necessary before definitive operation for pcCHS. PMID:26126783
Space shuttle booster multi-engine base flow analysis
NASA Technical Reports Server (NTRS)
Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.
1972-01-01
A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. Preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.
NASA Astrophysics Data System (ADS)
Liu, Qiang; Chattopadhyay, Aditi
2000-06-01
Aeromechanical stability plays a critical role in helicopter design and lead-lag damping is crucial to this design. In this paper, the use of segmented constrained damping layer (SCL) treatment and composite tailoring is investigated for improved rotor aeromechanical stability using formal optimization technique. The principal load-carrying member in the rotor blade is represented by a composite box beam, of arbitrary thickness, with surface bonded SCLs. A comprehensive theory is used to model the smart box beam. A ground resonance analysis model and an air resonance analysis model are implemented in the rotor blade built around the composite box beam with SCLs. The Pitt-Peters dynamic inflow model is used in air resonance analysis under hover condition. A hybrid optimization technique is used to investigate the optimum design of the composite box beam with surface bonded SCLs for improved damping characteristics. Parameters such as stacking sequence of the composite laminates and placement of SCLs are used as design variables. Detailed numerical studies are presented for aeromechanical stability analysis. It is shown that optimum blade design yields significant increase in rotor lead-lag regressive modal damping compared to the initial system.
Keeping Up with What You Have.
ERIC Educational Resources Information Center
Krysiak, Barbara H.
1999-01-01
Numerous studies have reported the deteriorating conditions in school buildings. One of the primary causes of this national problem is lack of proper maintenance of school facilities. Outlines a comprehensive assessment and planning process to provide a district with a road map for making decisions about facility improvement. (MLF)
Leveled Reading and Engagement with Complex Texts
ERIC Educational Resources Information Center
Hastings, Kathryn
2016-01-01
The benefits of engaging with age-appropriate reading materials in classroom settings are numerous. For example, students' comprehension is developed as they acquire new vocabulary and concepts. The Common Core requires all students have daily opportunities to engage with "complex text" regardless of students' decoding levels. However,…
PETE Preparation for CSPAP at the University of Kentucky
ERIC Educational Resources Information Center
Erwin, Heather E.; Beighle, Aaron; Eckler, Seth
2017-01-01
Numerous strategies to increase physical activity levels among American youth have been recommended and implemented in schools, and physical education teachers have been identified as the logical personnel in schools to spearhead these attempts. Comprehensive school physical activity programs (CSPAPs) are being promoted, implemented and endorsed…
An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application
USDA-ARS?s Scientific Manuscript database
A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...
Preventing Alcohol-Related Problems on Campus: Vandalism.
ERIC Educational Resources Information Center
Epstein, Joel; Finn, Peter
This bulletin provides suggestions for the components of a comprehensive approach to reducing student vandalism on college and university campuses. Numerous facets of the problem are addressed, including: the association of binge drinking with vandalism and school policies that tolerate or even facilitate binge drinking; a school's drinking…
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
ERIC Educational Resources Information Center
Spencer, Mercedes; Wagner, Richard K.
2017-01-01
We conducted a meta-analysis of 16 existing studies to examine the nature of the comprehension problems for children who were second-language learners with poor reading comprehension despite adequate decoding. Results indicated that these children had deficits in oral language (d = -0.80), but these deficits were not as severe as their reading…
CH-47D Rotating System Fault Sensing for Condition Based Maintenance
2011-03-01
replacement. This research seeks to create an analytical model in the Rotorcraft Comprehensive Analysis System which will enable the identification of...
Catalani, Simona; Berra, Alessandro; Tomasi, Cesare; Romano, Canzio; Pira, Enrico; Garzaro, Giacomo; Apostoli, Pietro
2015-01-01
In recent years, due to the need to process the vast amount of information available in the scientific literature, meta-analyses and systematic reviews have become very numerous. Meta-analyses are carried out to evaluate the association between two events when individual studies have not provided comprehensive data. On the other hand, a good meta-analysis must satisfy certain criteria, from the selection of studies to the evaluation of outcomes; to this end, the application of methods for quality assessment is crucial to obtain data of adequate reliability. The aim of this review is to provide some introductory tools for a critical approach to meta-analyses and systematic reviews, which have also become useful instruments in occupational medicine.
Drug-associated pancreatitis: facts and fiction.
Rünzi, M; Layer, P
1996-07-01
In the past, numerous reports on drugs probably causing acute pancreatitis have been published. However, most of these case reports were anecdotal, lacked clear evidence, and did not present a comprehensive summary. Although drug-associated pancreatitis is rare, it is gaining increasing importance with the introduction of several potent new agents, such as anti-acquired immunodeficiency syndrome drugs. The following comprehensive review scrutinizes the evidence in the world literature on drugs associated with acute or chronic pancreatitis and, based on this, categorizes each drug as having a definite, probable, or possible causal association. In addition, explanations of the pathophysiological mechanisms are discussed.
Young Children Bet On Their Numerical Skills: Metacognition in the Numerical Domain
Vo, Vy A.; Li, Rosa; Kornell, Nate; Pouget, Alexandre; Cantlon, Jessica F.
2014-01-01
Metacognition, the ability to assess one’s own knowledge, has been targeted as a critical learning mechanism in mathematics education. Yet, the early childhood origins of metacognition have proven difficult to study. Using a novel nonverbal task and a comprehensive set of metacognitive measures, we provide the strongest evidence to date that young children are metacognitive. We show that children as young as 5 years make metacognitive “bets” on their numerical discriminations in a wagering task. However, contrary to previous reports from adults, children’s metacognition proved to be domain-specific: children’s metacognition in the numerical domain was unrelated to their metacognition in another domain (emotion discrimination). Moreover, children’s metacognitive ability in only the numerical domain predicted their school-based mathematics knowledge. The data provide novel evidence that metacognition is a fundamental, domain-dependent cognitive ability in children. The findings have implications for theories of uncertainty and reveal new avenues for training metacognition in children. PMID:24973137
NASA Astrophysics Data System (ADS)
Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.
2015-11-01
We offer a versatile workflow to convert geological models built with the ParadigmTM GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
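To make the conversion target concrete, the following minimal Python sketch writes a single hypothetical tetrahedral cell to a VTU file using the VTK bindings; the actual GOCAD-to-VTU workflow described above handles full layered and faulted models.

```python
# Minimal sketch of the VTU output format using the VTK Python bindings:
# one tetrahedral cell with four points, written as an XML unstructured grid.
import vtk

points = vtk.vtkPoints()
for xyz in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]:
    points.InsertNextPoint(*xyz)

grid = vtk.vtkUnstructuredGrid()
grid.SetPoints(points)

tet = vtk.vtkTetra()
for i in range(4):
    tet.GetPointIds().SetId(i, i)
grid.InsertNextCell(tet.GetCellType(), tet.GetPointIds())

writer = vtk.vtkXMLUnstructuredGridWriter()
writer.SetFileName("single_tet.vtu")
writer.SetInputData(grid)
writer.Write()
```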
Salguero-Alcañiz, M P; Lorca-Marín, J A; Alameda-Bailén, J R
The ultimate purpose of cognitive neuropsychology is to find out how normal cognitive processes work. To this end, it studies subjects who have suffered brain damage but who, until their accident, were competent in the skills that are later to become the object of study. It is therefore necessary to study patients who have difficulty in processing numbers and in calculating in order to further our knowledge of these processes in the normal population. Our aim was to analyse the relationships between the different cognitive processes involved in numeric knowledge. We studied the case of a female patient who suffered an ischemic infarct in the perisylvian region, on both a superficial and deep level. She presented predominantly expressive mixed aphasia and predominantly brachial hemiparesis. Numeric processing and calculation were evaluated. The patient still had her lexical numeric knowledge but her quantitative numeric knowledge was impaired. These alterations in the quantitative numeric knowledge are evidenced by the difficulties the patient had in numeric comprehension tasks, as well as the severe impairments displayed in calculation. These findings allow us to conclude that quantitative numeric knowledge is functionally independent of lexical or non-quantitative numeric knowledge. From this functional autonomy, a possible structural independence can be inferred.
Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.
2005-01-01
Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In contrast, FEMA Flood Insurance Rate Maps (FIRMs) based on the FAN model predict uniformly high flood risk across the study areas without regard for small-scale topography and surficial geology. © 2005 Geological Society of America.
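For reference, Manning's equation in SI units, here approximated for wide, shallow overland flow (hydraulic radius taken as the flow depth, an assumption for illustration), can be sketched as:

```python
def manning_velocity(depth_m, slope, n):
    """SI Manning's equation, V = (1/n) * R**(2/3) * S**0.5, with the hydraulic
    radius R approximated by the flow depth for wide, shallow overland flow."""
    return (1.0 / n) * depth_m ** (2.0 / 3.0) * slope ** 0.5

def unit_discharge(depth_m, slope, n):
    """Discharge per unit width, q = V * h (continuity for steady uniform flow)."""
    return manning_velocity(depth_m, slope, n) * depth_m

print(unit_discharge(depth_m=0.3, slope=0.02, n=0.035))  # m^2/s, illustrative values
```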
Reynolds, Michael G; Schlöffel, Sophie; Peressotti, Francesca
2015-01-01
One approach used to gain insight into the processes underlying bilingual language comprehension and production examines the costs that arise from switching languages. For unbalanced bilinguals, asymmetric switch costs are reported in speech production, where the switch cost for L1 is larger than the switch cost for L2, whereas, symmetric switch costs are reported in language comprehension tasks, where the cost of switching is the same for L1 and L2. Presently, it is unclear why asymmetric switch costs are observed in speech production, but not in language comprehension. Three experiments are reported that simultaneously examine methodological explanations of task related differences in the switch cost asymmetry and the predictions of three accounts of the switch cost asymmetry in speech production. The results of these experiments suggest that (1) the type of language task (comprehension vs. production) determines whether an asymmetric switch cost is observed and (2) at least some of the switch cost asymmetry arises within the language system.
Collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation
NASA Astrophysics Data System (ADS)
Yang, Shangwen; Guo, Baohua; Xiao, Xuefei; Gao, Haichao
2018-01-01
To allocate en-routes and slots to flights with collaborative decision making, a collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation is proposed. Evaluation indexes include flight delay costs, delay time, and the number of turning points. The analytic hierarchy process is applied to determine the index weights. A remark set is established for the current pair of flights in the schedule that have not yet obtained an en-route and slot. Fuzzy comprehensive evaluation is then performed, and the en-route and slot for these two flights are determined. The next flight without an en-route and slot is selected, and the fuzzy comprehensive evaluation is repeated until all flights have obtained en-routes and slots. Matlab R2007b was used for a numerical test based on simulated data for a civil en-route. The results show that, compared with the traditional first-come-first-served strategy, the algorithm achieves better performance, verifying its effectiveness.
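A minimal sketch of one fuzzy comprehensive evaluation step, assuming the common weighted-average synthesis B = w · R, is shown below; the AHP-derived weights, remark set, and membership matrix are all hypothetical.

```python
import numpy as np

# Assumed weighted-average synthesis B = w . R for one evaluation step.
w = np.array([0.5, 0.3, 0.2])           # weights: delay cost, delay time, turning points
remark_set = ["flight_1_first", "flight_2_first"]

# R[i, j]: membership of evaluation index i in remark j (hypothetical values).
R = np.array([[0.6, 0.4],
              [0.3, 0.7],
              [0.5, 0.5]])

B = w @ R                                # comprehensive evaluation vector
choice = remark_set[int(np.argmax(B))]   # pick the remark with the highest membership
print(B, choice)
```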
Pulse propagation in granular chains
NASA Astrophysics Data System (ADS)
Rosas, Alexandre; Lindenberg, Katja
2018-03-01
In this comprehensive review we present, discuss, and compare a number of theoretical approaches to the propagation of impulses in granular chains found in the literature, emphasizing the strengths and weaknesses of each. Experimental and numerical results are compared, and common features of the dynamics of pulse propagation for distinct chain setups are highlighted.
How Numeracy Influences Risk Comprehension and Medical Decision Making
ERIC Educational Resources Information Center
Reyna, Valerie F.; Nelson, Wendy L.; Han, Paul K.; Dieckmann, Nathan F.
2009-01-01
We review the growing literature on health numeracy, the ability to understand and use numerical information, and its relation to cognition, health behaviors, and medical outcomes. Despite the surfeit of health information from commercial and noncommercial sources, national and international surveys show that many people lack basic numerical…
Sex and Inhumanity: Review Article
ERIC Educational Resources Information Center
Corbett, Patrick
1974-01-01
The first three of these books are attempts by numerous writers to describe, analyze, explain, castigate and remedy the present wave of commercialized pornography in advanced capitalist societies. The fourth is a manual of sexual instruction for adolescents and the fifth a comprehensive account of the Abortion Act 1967. (Author/RK)
Juvenile Offenders and Victims: 2006 National Report
ERIC Educational Resources Information Center
Snyder, Howard N.; Sickmund, Melissa
2006-01-01
This report presents comprehensive information on juvenile crime, violence, and victimization and on the juvenile justice system. This report brings together the latest available statistics from a variety of sources and includes numerous tables, graphs, and maps, accompanied by analyses in clear, nontechnical language. The report offers Congress,…
Building Blocks for School IPM: A Least-Toxic Pest Management Manual.
ERIC Educational Resources Information Center
Crouse, Becky, Ed.; Owens, Kagan, Ed.
This publication is a compilation of original and republished materials from numerous individuals and organizations working on pesticide reform and integrated pest management (IPM)--using alternatives to prevailing chemical-intensive practices. The manual provides comprehensive information on implementing school IPM, including a practical guide to…
Using the Quirk-Schofield Diagram to Explain Environmental Colloid Dispersion Phenomena
ERIC Educational Resources Information Center
Mays, David C.
2007-01-01
Colloid dispersion, through its role in soil science, hydrology, and contaminant transport, is a basic component of many natural resources and environmental education programs. However, comprehension of colloid dispersion phenomena is limited by the numerous variables involved. This article demonstrates how the Quirk-Schofield diagram can be used…
Cooperative Alaska Forest Inventory
Thomas Malone; Jingjing Liang; Edmond C. Packee
2009-01-01
The Cooperative Alaska Forest Inventory (CAFI) is a comprehensive database of boreal forest conditions and dynamics in Alaska. The CAFI consists of field-gathered information from numerous permanent sample plots distributed across interior and south-central Alaska including the Kenai Peninsula. The CAFI currently has 570 permanent sample plots on 190 sites...
Number Games, Magnitude Representation, and Basic Number Skills in Preschoolers
ERIC Educational Resources Information Center
Whyte, Jemma Catherine; Bull, Rebecca
2008-01-01
The effect of 3 intervention board games (linear number, linear color, and nonlinear number) on young children's (mean age = 3.8 years) counting abilities, number naming, magnitude comprehension, accuracy in number-to-position estimation tasks, and best-fit numerical magnitude representations was examined. Pre- and posttest performance was…
ERIC Educational Resources Information Center
Colenbrander, Danielle; Nickels, Lyndsey; Kohnen, Saskia
2017-01-01
Background: Identifying reading comprehension difficulties is challenging. There are many comprehension tests to choose from, and a child's diagnosis can be influenced by various factors such as a test's format and content and the choice of diagnostic criteria. We investigate these issues with reference to the Neale Analysis of Reading Ability…
Towards Test Driven Development for Computational Science with pFUnit
NASA Technical Reports Server (NTRS)
Rilee, Michael L.; Clune, Thomas L.
2014-01-01
Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
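Since pFUnit targets Fortran, the following Python/pytest-style sketch only illustrates the underlying idea of fine-grained numerical testing against an analytical oracle, with an error tolerance tied to the method's convergence rate; it is not pFUnit code.

```python
# Not pFUnit itself (which targets Fortran): a pytest-style sketch of testing a
# fine-grained numerical routine against an analytical oracle with an explicit
# tolerance for finite-precision and truncation error.
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def test_trapezoid_against_analytical_oracle():
    # Oracle: the integral of sin(x) on [0, pi] is exactly 2.
    approx = trapezoid(math.sin, 0.0, math.pi, 1000)
    # The trapezoidal rule converges as O(h^2); allow a tolerance consistent with that.
    assert abs(approx - 2.0) < 1e-5

test_trapezoid_against_analytical_oracle()
```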
ERIC Educational Resources Information Center
García, J. Ricardo; Cain, Kate
2014-01-01
The twofold purpose of this meta-analysis was to determine the relative importance of decoding skills to reading comprehension in reading development and to identify which reader characteristics and reading assessment characteristics contribute to differences in the decoding and reading comprehension correlation. A meta-analysis of 110 studies…
NASA Astrophysics Data System (ADS)
Hozman, J.; Tichý, T.
2017-12-01
Stochastic volatility models make it possible to capture the real-world features of options better than the classical Black-Scholes treatment. Here we focus on the pricing of European-style options under the Stein-Stein stochastic volatility model, where the option value depends on the time, on the price of the underlying asset, and on the volatility as a function of a mean-reverting Ornstein-Uhlenbeck process. A standard mathematical approach to this model leads to a non-stationary second-order degenerate partial differential equation in two spatial variables, completed by a system of boundary and terminal conditions. In order to improve the numerical valuation process for such a pricing equation, we propose a numerical technique based on the discontinuous Galerkin method and the Crank-Nicolson scheme. Finally, reference numerical experiments on real market data illustrate comprehensive empirical findings on options with stochastic volatility.
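As a rough cross-check of the pricing problem (not the discontinuous Galerkin/Crank-Nicolson solver described above), a minimal Monte Carlo sketch of assumed Stein-Stein-type dynamics with an Ornstein-Uhlenbeck volatility can be written as:

```python
import numpy as np

def stein_stein_mc_price(S0=100.0, K=100.0, r=0.02, T=1.0,
                         v0=0.2, kappa=4.0, theta=0.2, sigma_v=0.1, rho=-0.5,
                         n_steps=250, n_paths=50_000, seed=0):
    """Euler-type Monte Carlo price of a European call under assumed Stein-Stein
    dynamics: dS = r S dt + v S dW1, dv = kappa (theta - v) dt + sigma_v dW2,
    corr(dW1, dW2) = rho. Parameters are illustrative, not calibrated."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        S *= np.exp((r - 0.5 * v**2) * dt + v * np.sqrt(dt) * z1)  # log-Euler step
        v += kappa * (theta - v) * dt + sigma_v * np.sqrt(dt) * z2  # OU volatility
    payoff = np.maximum(S - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

print(stein_stein_mc_price())
```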
A gradient enhanced plasticity-damage microplane model for concrete
NASA Astrophysics Data System (ADS)
Zreid, Imadeddin; Kaliske, Michael
2018-03-01
Computational modeling of concrete poses two main types of challenges. The first is the mathematical description of local response for such a heterogeneous material under all stress states, and the second is the stability and efficiency of the numerical implementation in finite element codes. The paper at hand presents a comprehensive approach addressing both issues. Adopting the microplane theory, a combined plasticity-damage model is formulated and regularized by an implicit gradient enhancement. The plasticity part introduces a new microplane smooth 3-surface cap yield function, which provides a stable numerical solution within an implicit finite element algorithm. The damage part utilizes a split, which can describe the transition of loading between tension and compression. Regularization of the model by the implicit gradient approach eliminates the mesh sensitivity and numerical instabilities. Identification methods for model parameters are proposed and several numerical examples of plain and reinforced concrete are carried out for illustration.
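For orientation, the implicit gradient enhancement is commonly written as a Helmholtz-type (screened Poisson) equation for a nonlocal history variable; the generic form below is assumed from the gradient-enhanced damage literature, and the paper's microplane-specific formulation may differ.

```latex
% Common implicit gradient (Helmholtz-type) regularization of a local history
% variable \eta_{loc} by its nonlocal counterpart \bar{\eta}:
\bar{\eta} - c\,\nabla^{2}\bar{\eta} = \eta_{\mathrm{loc}} \quad \text{in } \Omega,
\qquad
\nabla\bar{\eta}\cdot\mathbf{n} = 0 \quad \text{on } \partial\Omega
```

Here c sets the internal length scale that removes the pathological mesh sensitivity of the softening response.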
Numerical Modeling of Ablation Heat Transfer
NASA Technical Reports Server (NTRS)
Ewing, Mark E.; Laker, Travis S.; Walker, David T.
2013-01-01
A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
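The conduction core of such a scheme can be sketched as a simple explicit control-volume update on a fixed, uniform grid; the moving-grid treatment, material decomposition, and pyrolysis-gas terms described above are omitted, and all values are hypothetical.

```python
import numpy as np

def conduction_step(T, dx, dt, alpha):
    """One explicit control-volume update for 1-D heat conduction on a fixed,
    uniform grid (stable for dt <= dx**2 / (2 * alpha)). The ablation model
    described above adds surface recession, decomposition, and pyrolysis terms."""
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T_new  # boundary nodes held fixed here (assumed Dirichlet conditions)

T = np.full(51, 300.0)
T[0] = 1500.0  # heated surface temperature (hypothetical)
for _ in range(1000):
    T = conduction_step(T, dx=1e-3, dt=1e-4, alpha=1e-6)
print(T[:5])
```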
ERIC Educational Resources Information Center
Lonchamp, F.
This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…
NASA Astrophysics Data System (ADS)
Liu, Hu-Chen; Liu, Long; Li, Ping
2014-10-01
Failure mode and effects analysis (FMEA) has shown its effectiveness in examining potential failures in products, processes, designs or services and has been extensively used for safety and reliability analysis in a wide range of industries. However, its approach of prioritising failure modes through a crisp risk priority number (RPN) has been criticised as having several shortcomings. The aim of this paper is to develop an efficient and comprehensive risk assessment methodology using the intuitionistic fuzzy hybrid weighted Euclidean distance (IFHWED) operator to overcome these limitations and improve the effectiveness of the traditional FMEA. The diversified and uncertain assessments given by FMEA team members are treated as linguistic terms expressed in intuitionistic fuzzy numbers (IFNs). The intuitionistic fuzzy weighted averaging (IFWA) operator is used to aggregate the FMEA team members' individual assessments into a group assessment. The IFHWED operator is applied thereafter to the prioritisation and selection of failure modes. In particular, both subjective and objective weights of risk factors are considered during the risk evaluation process. Finally, a numerical example of risk assessment is given to illustrate the proposed method.
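Assuming the standard form of the IFWA operator from the intuitionistic fuzzy literature (the paper may use an equivalent formulation), the aggregation of team-member assessments can be sketched as:

```python
import numpy as np

def ifwa(mu, nu, w):
    """Intuitionistic fuzzy weighted averaging of IFNs (mu_i, nu_i) with weights
    w_i summing to 1, in the commonly used form assumed here:
    mu_agg = 1 - prod((1 - mu_i)**w_i), nu_agg = prod(nu_i**w_i)."""
    mu, nu, w = map(np.asarray, (mu, nu, w))
    agg_mu = 1.0 - np.prod((1.0 - mu) ** w)
    agg_nu = np.prod(nu ** w)
    return agg_mu, agg_nu

# Hypothetical assessments of one failure mode by three team members
# (membership mu = degree of agreement that risk is high, non-membership nu).
print(ifwa(mu=[0.7, 0.6, 0.8], nu=[0.2, 0.3, 0.1], w=[0.4, 0.35, 0.25]))
```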
VAAPA: a web platform for visualization and analysis of alternative polyadenylation.
Guan, Jinting; Fu, Jingyi; Wu, Mingcheng; Chen, Longteng; Ji, Guoli; Quinn Li, Qingshun; Wu, Xiaohui
2015-02-01
Polyadenylation [poly(A)] is an essential process during the maturation of most mRNAs in eukaryotes. Alternative polyadenylation (APA) as an important layer of gene expression regulation has been increasingly recognized in various species. Here, a web platform for visualization and analysis of alternative polyadenylation (VAAPA) was developed. This platform can visualize the distribution of poly(A) sites and poly(A) clusters of a gene or a section of a chromosome. It can also highlight genes with switched APA sites among different conditions. VAAPA is an easy-to-use web-based tool that provides functions of poly(A) site query, data uploading, downloading, and APA sites visualization. It was designed in a multi-tier architecture and developed based on Smart GWT (Google Web Toolkit) using Java as the development language. VAAPA will be a valuable addition to the community for the comprehensive study of APA, not only by making the high quality poly(A) site data more accessible, but also by providing users with numerous valuable functions for poly(A) site analysis and visualization. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kayastha, Shilva; Kunimoto, Ryo; Horvath, Dragos; Varnek, Alexandre; Bajorath, Jürgen
2017-11-01
The analysis of structure-activity relationships (SARs) becomes rather challenging when large and heterogeneous compound data sets are studied. In such cases, many different compounds and their activities need to be compared, which quickly goes beyond the capacity of subjective assessments. For a comprehensive large-scale exploration of SARs, computational analysis and visualization methods are required. Herein, we introduce a two-layered SAR visualization scheme specifically designed for increasingly large compound data sets. The approach combines a new compound pair-based variant of generative topographic mapping (GTM), a machine learning approach for nonlinear mapping, with chemical space networks (CSNs). The GTM component provides a global view of the activity landscapes of large compound data sets, in which informative local SAR environments are identified, augmented by a numerical SAR scoring scheme. Prioritized local SAR regions are then projected into CSNs that resolve these regions at the level of individual compounds and their relationships. Analysis of CSNs makes it possible to distinguish between regions having different SAR characteristics and select compound subsets that are rich in SAR information.
Chemical analysis of Panax quinquefolius (North American ginseng): A review.
Wang, Yaping; Choi, Hyung-Kyoon; Brinckmann, Josef A; Jiang, Xue; Huang, Linfang
2015-12-24
Panax quinquefolius (PQ) is one of the best-selling natural health products due to its proposed beneficial anti-aging, anti-cancer, anti-stress, anti-fatigue, and anxiolytic effects. In recent years, the quality of PQ has received considerable attention. Sensitive and accurate methods for qualitative and quantitative analyses of chemical constituents are necessary for the comprehensive quality control to ensure the safety and efficacy of PQ. This article reviews recent progress in the chemical analysis of PQ and its preparations. Numerous analytical techniques, including spectroscopy, thin-layer chromatography (TLC), gas chromatography (GC), high-performance liquid chromatography (HPLC), liquid chromatography/mass spectrometry (LC/MS), high-speed centrifugal partition chromatography (HSCPC), high-performance counter-current chromatography (HPCCC), nuclear magnetic resonance spectroscopy (NMR), and immunoassay, are described. Among these techniques, HPLC coupled with mass spectrometry (MS) is the most promising method for quality control. The challenges encountered in the chemical analysis of PQ are also briefly discussed, and the remaining questions regarding the quality control of PQ that require further investigation are highlighted. Copyright © 2015 Elsevier B.V. All rights reserved.
Wang, Jianzhou; Niu, Tong; Wang, Rui
2017-03-02
The worsening atmospheric pollution increases the necessity of air quality early warning systems (EWSs). Although a massive amount of research on EWSs, in both theory and practice, has been conducted by numerous researchers, studies concerning the quantification of uncertain information and comprehensive evaluation are still lacking, which impedes further development in the area. In this paper, a comprehensive warning system is proposed that consists of two indispensable modules: effective forecasting and scientific evaluation. For the forecasting module, a novel hybrid model combining data preprocessing and numerical optimization is developed to forecast air pollutant concentrations effectively. In particular, to further enhance the accuracy and robustness of the warning system, interval forecasting is implemented to quantify the uncertainties of the forecasts, providing decision-makers with significant risk signals beyond point forecasts. For the evaluation module, a cloud model based on probability and fuzzy set theory is developed to perform comprehensive evaluations of air quality, which can realize the transformation between qualitative concepts and quantitative data. To verify the effectiveness and efficiency of the warning system, extensive simulations based on air pollutant data from Dalian, China, were carried out, illustrating that the warning system is not only high-performing but also widely applicable.
NASA Orbital Debris Engineering Model ORDEM2008 (Beta Version)
NASA Technical Reports Server (NTRS)
Stansbery, Eugene G.; Krisko, Paula H.
2009-01-01
This is an interim document intended to accompany the beta-release of the ORDEM2008 model. As such it provides the user with a guide for its use, a list of its capabilities, a brief summary of model development, and appendices included to educate the user as to typical runtimes for different orbit configurations. More detailed documentation will be delivered with the final product. ORDEM2008 supersedes NASA's previous model - ORDEM2000. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical techniques, has enabled the construction of this more comprehensive and sophisticated model. Integrated with the software is an upgraded graphical user interface (GUI), which uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional average debris size vs. flux magnitude for chosen analysis orbits, to the more complex color-contoured two-dimensional (2-D) directional flux diagrams in terms of local spacecraft pitch and yaw.
Comprehensive proteomic analysis of the human spliceosome
NASA Astrophysics Data System (ADS)
Zhou, Zhaolan; Licklider, Lawrence J.; Gygi, Steven P.; Reed, Robin
2002-09-01
The precise excision of introns from pre-messenger RNA is performed by the spliceosome, a macromolecular machine containing five small nuclear RNAs and numerous proteins. Much has been learned about the protein components of the spliceosome from analysis of individual purified small nuclear ribonucleoproteins and salt-stable spliceosome `core' particles. However, the complete set of proteins that constitutes intact functional spliceosomes has yet to be identified. Here we use maltose-binding protein affinity chromatography to isolate spliceosomes in highly purified and functional form. Using nanoscale microcapillary liquid chromatography tandem mass spectrometry, we identify ~145 distinct spliceosomal proteins, making the spliceosome the most complex cellular machine so far characterized. Our spliceosomes comprise all previously known splicing factors and 58 newly identified components. The spliceosome contains at least 30 proteins with known or putative roles in gene expression steps other than splicing. This complexity may be required not only for splicing multi-intronic metazoan pre-messenger RNAs, but also for mediating the extensive coupling between splicing and other steps in gene expression.
Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.
Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges
2017-01-01
Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis (MFA-ILE) in isotope labeling experiments. The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be speeded up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when it is compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
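To make the adjoint idea concrete, consider a simplified stationary setting in which the labeling state x(θ) satisfies a linear balance system A(θ)x = b (with b independent of θ) and the residual sum of squares S = ||y − Cx||² is minimized; the identity below is a generic adjoint-gradient result under these assumptions and is not taken from the sysmetab implementation.

```latex
% Generic adjoint gradient for a stationary least-squares fit (illustrative):
% state equation A(\theta)\,x = b, residual r = y - Cx, objective S = r^{\top} r.
\[
\frac{\partial S}{\partial \theta}
 = \lambda^{\top}\,\frac{\partial A}{\partial \theta}\,x,
\qquad\text{where } A^{\top}\lambda = 2\,C^{\top} r,
\]
% so a single adjoint solve replaces one direct solve per parameter component.
```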
NASA Astrophysics Data System (ADS)
Aziz, Asim; Jamshed, Wasim; Aziz, Taha
2018-04-01
In the present research, a simplified mathematical model for solar thermal collectors is considered in the form of a non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid along with slip and convective boundary conditions, and a comprehensive analysis of entropy generation in the system is also performed. The effects of thermal radiation and variable thermal conductivity are also included in the present model. The mathematical formulation is carried out through a boundary layer approach and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, skin friction coefficient and Nusselt number. The discussion is concluded on the effect of various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and the rate of heat transfer at the boundary.
Simulation of UV atomic radiation for application in exhaust plume spectrometry
NASA Astrophysics Data System (ADS)
Wallace, T. L.; Powers, W. T.; Cooper, A. E.
1993-06-01
Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
Windshield splatter analysis with the Galaxy metagenomic pipeline
Kosakovsky Pond, Sergei; Wadhawan, Samir; Chiaromonte, Francesca; Ananda, Guruprasad; Chung, Wen-Yu; Taylor, James; Nekrutenko, Anton
2009-01-01
How many species inhabit our immediate surroundings? A straightforward collection technique suitable for answering this question is known to anyone who has ever driven a car at highway speeds. The windshield of a moving vehicle is subjected to numerous insect strikes and can be used as a collection device for representative sampling. Unfortunately the analysis of biological material collected in that manner, as with most metagenomic studies, proves to be rather demanding due to the large number of required tools and considerable computational infrastructure. In this study, we use organic matter collected by a moving vehicle to design and test a comprehensive pipeline for phylogenetic profiling of metagenomic samples that includes all steps from processing and quality control of data generated by next-generation sequencing technologies to statistical analyses and data visualization. To the best of our knowledge, this is also the first publication that features a live online supplement providing access to exact analyses and workflows used in the article. PMID:19819906
Li, Tiejun; Min, Bin; Wang, Zhiming
2013-03-14
The stochastic integral ensuring the Newton-Leibniz chain rule is essential in stochastic energetics. The Marcus canonical integral has this property and can be understood as the Wong-Zakai type smoothing limit when the driving process is non-Gaussian. However, this important concept seems not to be well known to physicists. In this paper, we discuss the Marcus integral for non-Gaussian processes and its computation in the context of stochastic energetics. We give a comprehensive introduction to the Marcus integral and compare three equivalent definitions in the literature. We introduce the exact pathwise simulation algorithm and give its error analysis. We show how to compute thermodynamic quantities based on the pathwise simulation algorithm. We highlight the information hidden in the Marcus mapping, which plays the key role in determining thermodynamic quantities. We further propose a tau-leaping algorithm, which advances the process with deterministic time steps when the tau-leaping condition is satisfied. The numerical experiments and the efficiency analysis show that the method is very promising.
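A minimal pathwise sketch of the Marcus interpretation for a jump-driven equation dX = f(X) dt + g(X) ◊ dL, with L a compound Poisson process, is given below: between jumps the drift is integrated, and each jump of size ΔL is applied by flowing the auxiliary ODE dφ/ds = g(φ) ΔL from s = 0 to 1 (the Marcus mapping). This is a generic illustration under these assumptions, not the exact algorithm or error analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def marcus_map(x, dL, g, n_sub=50):
    """Apply one Marcus jump: integrate d(phi)/ds = g(phi) * dL over s in [0, 1]."""
    phi, ds = x, 1.0 / n_sub
    for _ in range(n_sub):          # midpoint (RK2) flow of the auxiliary ODE
        k1 = g(phi) * dL
        k2 = g(phi + 0.5 * ds * k1) * dL
        phi = phi + ds * k2
    return phi

def simulate_marcus(x0, f, g, T, dt, jump_rate, jump_sampler):
    """Pathwise simulation of dX = f(X) dt + g(X) (Marcus) dL, L compound Poisson."""
    jump_times = np.cumsum(rng.exponential(1.0 / jump_rate, size=1000))  # enough draws for this toy horizon
    jump_times = jump_times[jump_times < T]
    t, x, path = 0.0, x0, [(0.0, x0)]
    for tj in np.append(jump_times, T):
        while t + dt < tj:          # deterministic drift between jumps (Euler)
            x += f(x) * dt
            t += dt
        x += f(x) * (tj - t)        # partial step up to the jump time
        t = tj
        if tj < T:
            x = marcus_map(x, jump_sampler(), g)
        path.append((t, x))
    return path

# Toy example: linear drift, multiplicative noise, Gaussian jump amplitudes.
path = simulate_marcus(x0=1.0, f=lambda x: -x, g=lambda x: 0.5 * x,
                       T=5.0, dt=1e-3, jump_rate=2.0,
                       jump_sampler=lambda: rng.normal(0.0, 1.0))
print(path[-1])
```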
McElroy, Lisa M; Khorzad, Rebeca; Rowe, Theresa A; Abecassis, Zachary A; Apley, Daniel W; Barnard, Cynthia; Holl, Jane L
The purpose of this study was to use fault tree analysis to evaluate the adequacy of quality reporting programs in identifying root causes of postoperative bloodstream infection (BSI). A systematic review of the literature was used to construct a fault tree to evaluate 3 postoperative BSI reporting programs: National Surgical Quality Improvement Program (NSQIP), Centers for Medicare and Medicaid Services (CMS), and The Joint Commission (JC). The literature review revealed 699 eligible publications, 90 of which were used to create the fault tree containing 105 faults. A total of 14 identified faults are currently mandated for reporting to NSQIP, 5 to CMS, and 3 to JC; 2 or more programs require 4 identified faults. The fault tree identifies numerous contributing faults to postoperative BSI and reveals substantial variation in the requirements and ability of national quality data reporting programs to capture these potential faults. Efforts to prevent postoperative BSI require more comprehensive data collection to identify the root causes and develop high-reliability improvement strategies.
NASA Technical Reports Server (NTRS)
Starr, David O. (Technical Monitor); Smith, Eric A.
2002-01-01
Comprehensive understanding of the microphysical nature of Mediterranean storms can be accomplished by a combination of in situ meteorological data analysis and radar-passive microwave data analysis, effectively integrated with numerical modeling studies at various scales, from the synoptic scale down through the mesoscale, the cloud macrophysical scale, and ultimately the cloud microphysical scale. The microphysical properties of severe storms and their controls are intrinsically related to the meteorological processes under which the storms have evolved, processes which eventually select and control the dominant microphysical properties themselves. This involves intense convective development, stratiform decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing physical processes that affect details of the size distributions and fall rates of the various types of hydrometeors found within the storm environment. For hazardous Mediterranean storms, highlighted in this study by three mountain storms producing damaging floods in northern Italy between 1992 and 2000, developing a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution and the heterogeneous nature of precipitation fields within a storm domain. This involves convective development, stratiform transition and decay, orographic lifting, and sloped frontal lifting processes. It also involves the vertical motions and thermodynamical instabilities governing physical processes that determine details of the liquid/ice water contents, size distributions, and fall rates of the various modes of hydrometeors found within hazardous storm environments.
Hébert, Martine; Daspe, Marie-Ève; Lapierre, Andréanne; Godbout, Natacha; Blais, Martin; Fernet, Mylène; Lavoie, Francine
2017-01-01
Dating violence (DV) is a widespread social issue that has numerous deleterious repercussions on youths' health. Family and peer risk factors for DV have been widely studied, but with inconsistent methodologies, which complicates global comprehension of the phenomenon. Protective factors, although understudied, constitute a promising line of research for prevention. To date, there is no comprehensive quantitative review attempting to summarize knowledge on both family and peer factors that increase or decrease the risk of DV victimization among adolescents and emerging adults. The current meta-analysis draws on 87 studies with a total sample of 278,712 adolescents and young adults to examine effect sizes of the association between various family and peer correlates of DV victimization. Results suggest small, significant effect sizes for all the family (various forms of child maltreatment, parental support, and parental monitoring) and peer factors (peer victimization, sexual harassment, affiliation with deviant peers, and supportive/prosocial peers) in the prediction of DV. With few exceptions, forms of DV (psychological, physical, and sexual), gender, and age did not moderate the strength of these associations. In addition, no difference was found between the magnitude of family and peer factors' effect sizes, suggesting that these determinants are equally important in predicting DV. The current results provide future directions for examining relations between risk and protective factors for DV and indicate that both peers and family should be part of the development of efficient prevention options.
Kazarian, Artaches A; Nesterenko, Pavel N; Soisungnoen, Phimpha; Burakham, Rodjana; Srijaranai, Supalax; Paull, Brett
2014-08-01
Liquid chromatographic assays were developed using a mixed-mode column coupled in sequence with a hydrophilic interaction liquid chromatography column to allow the simultaneous comprehensive analysis of inorganic/organic anions and cations, active pharmaceutical ingredients, and excipients (carbohydrates). The approach utilized dual sample injection and valve-mediated column switching and was based upon a single high-performance liquid chromatography gradient pump. The separation consisted of three distinct sequential separation mechanisms, namely, (i) ion-exchange, (ii) mixed-mode interactions under an applied dual gradient (reversed-phase/ion-exchange), and (iii) hydrophilic interaction chromatography. Upon first injection, the Scherzo SS C18 column (Imtakt) provided resolution of inorganic anions and cations under isocratic conditions, followed by a dual organic/salt gradient to elute active pharmaceutical ingredients and their respective organic counterions and potential degradants. At the top of the mixed-mode gradient (high acetonitrile content), the mobile phase flow was switched to a preconditioned hydrophilic interaction liquid chromatography column, and the standard/sample was reinjected for the separation of hydrophilic carbohydrates, some of which are commonly known excipients in drug formulations. The approach afforded reproducible separation and resolution of up to 23 chemically diverse solutes in a single run. The method was applied to investigate the composition of commercial cough syrups (Robitussin®), allowing resolution and determination of inorganic ions, active pharmaceutical ingredients, excipients, and numerous well-resolved unknown peaks. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Spencer, Mercedes; Wagner, Richard K.
2016-01-01
We conducted a meta-analysis of 16 existing studies to examine the nature of the comprehension problems for children who were second-language learners with poor reading comprehension despite adequate decoding. Results indicated that these children had deficits in oral language (d = −0.80), but these deficits were not as severe as their reading comprehension deficit (d = −2.47). Second-language learners also had weaker oral language skills compared to native-speaking children regardless of comprehension status (d = −0.84). We discuss theoretical and practical implications of the finding that second-language learners who are poor at reading comprehension despite adequate decoding have deficits in oral language but the deficit is not sufficient to explain their deficit in reading comprehension. PMID:28461711
Red Meat and Colorectal Cancer: A Quantitative Update on the State of the Epidemiologic Science
Alexander, Dominik D.; Weed, Douglas L.; Miller, Paula E.; Mohamed, Muhima A.
2015-01-01
The potential relationship between red meat consumption and colorectal cancer (CRC) has been the subject of scientific debate. Given the high degree of resulting uncertainty, our objective was to update the state of the science by conducting a systematic quantitative assessment of the epidemiologic literature. Specifically, we updated and expanded our previous meta-analysis by integrating data from new prospective cohort studies and conducting a broader evaluation of the relative risk estimates by specific intake categories. Data from 27 independent prospective cohort studies were meta-analyzed using random-effects models, and sources of potential heterogeneity were examined through subgroup and sensitivity analyses. In addition, a comprehensive evaluation of potential dose-response patterns was conducted. In the meta-analysis of all cohorts, a weakly elevated summary relative risk was observed (1.11, 95% CI: 1.03–1.19); however, statistically significant heterogeneity was present. In general, summary associations were attenuated (closer to the null and less heterogeneous) in models that isolated fresh red meat (from processed meat), adjusted for more relevant factors, analyzed women only, and were conducted in countries outside of the United States. Furthermore, no clear patterns of dose-response were apparent. In conclusion, the state of the epidemiologic science on red meat consumption and CRC is best described in terms of weak associations, heterogeneity, an inability to disentangle effects from other dietary and lifestyle factors, lack of a clear dose-response effect, and weakening evidence over time.
Key Teaching Points:
• The role of red meat consumption in colorectal cancer risk has been widely contested among the scientific community.
• In the current meta-analysis of red meat intake and colorectal cancer, we comprehensively examined associations by creating numerous sub-group stratifications, conducting extensive sensitivity analyses, and evaluating dose-response using several different methods.
• Overall, all summary associations were weak in magnitude with no clear dose-response patterns.
• Interpretation of findings from epidemiologic studies investigating diet and health outcomes involves numerous methodological considerations, such as accurately measuring food intake, dietary pattern differences across populations, food definitions, outcome classifications, bias and confounding, multicollinearity, biological mechanisms, genetic variation in metabolizing enzymes, and differences in analytical metrics and statistical testing parameters.
PMID:25941850
Numerical Simulation of the Working Process in the Twin Screw Vacuum Pump
NASA Astrophysics Data System (ADS)
Lu, Yang; Fu, Yu; Guo, Bei; Fu, Lijuan; Zhang, Qingqing; Chen, Xiaole
2017-08-01
Twin screw vacuum pumps inherit the advantages of screw machinery, such as high reliability, stable medium conveying, low vibration, simple and compact structure, and convenient operation, and have been widely used in the petrochemical and air industries. On the basis of previous studies, this study analyzed the geometric features of the variable-pitch twin screw vacuum pump, such as the sealing line, the meshing line and the volume between teeth. A mathematical model for numerical simulation of the twin screw vacuum pump was established. The leakage paths of the working volume, including the sealing line and the addendum arc, were comprehensively considered. Corresponding simplified geometric models of leakage flow were built for the different leak paths and the flow coefficients were calculated. The range of flow coefficient values for the different leak paths was given. The results showed that the flow coefficient of each leak path can be taken as a constant value for the studied geometry. The analysis of recorded indicator diagrams showed that increasing the rotational speed can dramatically decrease the exhaust pressure, whereas lower rotational speeds can lead to over-compression. The pressure of the isentropic process, which was affected by leakage, was higher than that of the theoretical process.
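For reference, leakage through clearances in screw machines is often modeled as quasi-one-dimensional isentropic nozzle flow scaled by an empirical flow coefficient; the textbook form below illustrates the kind of leak-path model described, though the paper's simplified geometric models and coefficient values may differ.

```latex
% Illustrative leak-path model: isentropic nozzle flow scaled by flow coefficient C.
% A: clearance area, p_0, T_0: upstream stagnation state, p: downstream pressure,
% \kappa: isentropic exponent, R: specific gas constant.
\[
\dot{m} = C\,A\,p_{0}
 \sqrt{\frac{2\kappa}{R\,T_{0}\,(\kappa-1)}
 \left[\left(\frac{p}{p_{0}}\right)^{2/\kappa}
     -\left(\frac{p}{p_{0}}\right)^{(\kappa+1)/\kappa}\right]}.
\]
```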
Analytical modeling and numerical simulation of the short-wave infrared electron-injection detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Movassaghi, Yashar; Fathipour, Morteza; Fathipour, Vala
2016-03-21
This paper describes comprehensive analytical and simulation models for the design and optimization of the electron-injection based detectors. The electron-injection detectors evaluated here operate in the short-wave infrared range and utilize a type-II band alignment in the InP/GaAsSb/InGaAs material system. The unique geometry of detectors along with an inherent negative-feedback mechanism in the device allows for achieving high internal avalanche-free amplifications without any excess noise. Physics-based closed-form analytical models are derived for the detector rise time and dark current. Our optical gain model takes into account the drop in the optical gain at high optical power levels. Furthermore, numerical simulation studies of the electrical characteristics of the device show good agreement with our analytical models as well as experimental data. Performance comparison between devices with different injector sizes shows that enhancement in the gain and speed is anticipated by reducing the injector size. Sensitivity analysis for the key detector parameters shows the relative importance of each parameter. The results of this study may provide useful information and guidelines for development of future electron-injection based detectors as well as other heterojunction photodetectors.
NASA Astrophysics Data System (ADS)
Ghosh, Uddipta; Mandal, Shubhadeep; Chakraborty, Suman
2017-06-01
Here we attempt to solve the fully coupled Poisson-Nernst-Planck-Navier-Stokes equations, to ascertain the influence of finite electric double layer (EDL) thickness on coupled charge and fluid dynamics over patterned charged surfaces. We go beyond the well-studied "weak-field" limit and obtain numerical solutions for a wide range of EDL thicknesses, applied electric field strengths, and surface potentials. Asymptotic solutions to the coupled system are also derived using a combination of singular and regular perturbation for thin EDLs and low surface potentials, and good agreement between the two solutions is observed. Counter to common arguments, our analysis reveals that finite EDL thickness may either increase or decrease the "free-stream velocity" (equivalent to net throughput), depending on the strength of the applied electric field. We also unveil a critical EDL thickness for which the effect of finite EDL thickness on the free-stream velocity is the most prominent. Finally, we demonstrate that increasing the surface potential and the applied field tend to influence the overall flow patterns in contrasting manners. These results may be of profound importance in developing a comprehensive theoretical basis for designing electro-osmotically actuated microfluidic mixers.
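For context, the fully coupled system referred to above can be written, in a standard dimensional form for a symmetric binary electrolyte, as sketched below; the paper's nondimensionalization, the boundary conditions on the patterned surface, and the parameter ranges are not reproduced here.

```latex
% Illustrative Poisson-Nernst-Planck-Stokes system for a symmetric binary electrolyte:
% c_\pm: ion concentrations, \phi: electric potential, \rho_e: free charge density.
\[
\nabla\cdot\mathbf{u}=0,\qquad
\rho\!\left(\frac{\partial\mathbf{u}}{\partial t}+\mathbf{u}\cdot\nabla\mathbf{u}\right)
 =-\nabla p+\mu\nabla^{2}\mathbf{u}-\rho_{e}\nabla\phi,
\]
\[
\frac{\partial c_{\pm}}{\partial t}+\mathbf{u}\cdot\nabla c_{\pm}
 =\nabla\cdot\!\left(D_{\pm}\nabla c_{\pm}
   \pm\frac{D_{\pm}ze}{k_{B}T}\,c_{\pm}\nabla\phi\right),
\qquad
-\varepsilon\nabla^{2}\phi=\rho_{e}=ze\,(c_{+}-c_{-}).
\]
```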
Behavior of Industrial Steel Rack Connections
NASA Astrophysics Data System (ADS)
Shah, S. N. R.; Ramli Sulong, N. H.; Khan, R.; Jumaat, M. Z.; Shariati, M.
2016-03-01
Beam-to-column connections (BCCs) used in steel pallet racks (SPRs) play a significant role in maintaining the stability of rack structures in the down-aisle direction. The variety in the geometry of commercially available beam end connectors hampers the development of a generalized analytic design approach for SPR BCCs. The experimental prediction of flexibility in SPR BCCs is prohibitively expensive and difficult for all types of commercially available beam end connectors. A suitable solution to derive a particular uniform M-θ relationship for each connection type in terms of geometric parameters may be achieved through finite element (FE) modeling. This study first presents a comprehensive description of the experimental investigations that were performed and used as the calibration basis for the numerical study that constituted its main contribution. A three-dimensional (3D) non-linear finite element (FE) model was developed and calibrated against the experimental results. The FE model took into account material nonlinearities, geometrical properties and large displacements. Comparisons between numerical and experimental data for the observed failure modes and the M-θ relationship showed close agreement. The validated FE model was further extended to perform parametric analysis to identify the effects of various parameters which may affect the overall performance of the connection.
Analysis of suspended solids transport processes in primary settling tanks.
Patziger, Miklós; Kiss, Katalin
2015-01-01
The paper shows the results of long-term research comprising FLUENT-based numerical modeling, in situ measurements and laboratory tests to analyze suspended solids (SS) transport processes in primary settling tanks (PSTs). The investigated PST was one of the rectangular horizontal-flow PSTs at a large municipal wastewater treatment plant (WWTP) with a capacity of 500,000 population equivalents. Many middle-sized and large WWTPs are equipped with such PSTs. The numerical PST model was calibrated and validated based on the results of comprehensive in situ flow and SS concentration measurements, from low (5 m/h) up to quite high surface overflow rates of 9.5 and 13.0 m/h, and on settling and other laboratory tests. The calibrated and validated PST model was also successfully used for the evaluation of some slight modifications of the inlet geometry (removing lamellas, installing a flocculation 'box', shifting the inlet into a 'bottom-near' or into a 'high' position), which largely affect PST behavior and performance. The investigations provided detailed insight into the flow and SS transport processes within the investigated PST, which strongly contributes to the hydrodynamically driven design and upgrading of PSTs.
Systems biology approaches to understand the effects of nutrition and promote health.
Badimon, Lina; Vilahur, Gemma; Padro, Teresa
2017-01-01
In recent years, the implementation of systems biology in nutritional research has emerged as a powerful tool to understand the mechanisms by which dietary components promote health and prevent disease as well as to identify the biologically active molecules involved in such effects. Systems biology, by combining several '-omics' disciplines (mainly genomics/transcriptomics, proteomics and metabolomics), creates large data sets that upon computational integration provide in silico predictive networks that allow a more extensive analysis of the individual response to a nutritional intervention and provide a more global comprehensive understanding of how diet may influence health and disease. Numerous studies have demonstrated that diet and particularly bioactive food components play a pivotal role in helping to counteract environment-related oxidative damage. Oxidative stress is considered to be strongly implicated in ageing and the pathophysiology of numerous diseases including neurodegenerative disease, cancers, metabolic disorders and cardiovascular diseases. In the following review we will provide insights into the role of systems biology in nutritional research and focus on transcriptomic, proteomic and metabolomics studies that have demonstrated the ability of functional foods and their bioactive components to fight against oxidative damage and contribute to health benefits. © 2016 The British Pharmacological Society.
Numerical and Experimental Methods for Wake Flow Analysis in Complex Terrain
NASA Astrophysics Data System (ADS)
Castellani, Francesco; Astolfi, Davide; Piccioni, Emanuele; Terzi, Ludovico
2015-06-01
Assessment and interpretation of the quality of wind farm power output is a non-trivial task that poses at least three main challenges: reliable comprehension of the free wind flow, which is stretched to the limit on very complex terrain; realistic modeling of how wake interactions shape the wind flow; and awareness of the consequences for turbine control systems, including alignment patterns to the wind and, consequently, power output. The present work deals with an onshore wind farm in southern Italy, which has been a test case of the IEA Task 31 Wakebench project: 17 turbines, with 2.3 MW of rated power each, are sited on very complex terrain. A cluster of machines is investigated through numerical and experimental methods: CFD is employed for simulating wind fields, while power extraction and wakes are estimated through the Actuator Disc model. SCADA data mining techniques are employed for comparison between models and actual performances. The simulations are performed both on the real terrain and on flat terrain, in order to disentangle the effects of complex flow from wake effects. Attention is devoted to the comparison between the actual alignment patterns of the cluster of turbines and the predicted flow deviation.
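For reference, in the Actuator Disc approach the rotor is represented as a momentum sink distributed over the cells covering the disc rather than resolved geometrically; a commonly used expression for the extracted thrust is shown below as an illustration, noting that implementations differ in whether the free-stream or a locally corrected velocity is used.

```latex
% Illustrative actuator-disc thrust applied as a distributed momentum sink:
% \rho: air density, A_d: disc area, C_T: thrust coefficient, U_\infty: reference wind speed.
\[
T=\tfrac{1}{2}\,\rho\,A_{d}\,C_{T}\,U_{\infty}^{2}.
\]
```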
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1988-11-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents has continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code, called PROPACK, has also been developed, which is used as a preprocessor for UTCHEM. Specifically, it is used to evaluate input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.
[Improving comprehension and communication of risks about health].
García-Retamero, Rocío; Galesic, Mirta; Gigerenzer, Gerd
2011-11-01
Many patients have severe difficulties grasping several numerical concepts that are related to their health, such as, for instance, the risk of suffering various diseases. Visual aids have been proposed as a promising method for enhancing risk communication and comprehension. In medical practice, however, not all patients benefit from visual aids. In a study conducted on a probabilistic, representative national sample in the United States, we identified a group of patients for whom visual aids are most useful: Those who have low numeracy but high graphical literacy skills. These patients have high scores in three abilities involved in graph comprehension: (1) the ability to read the data, (2) the ability to read the relations between the data, and (3) the ability to read beyond the data. Our results can have important implications for medical practice and for risk communication about health.
Comprehensive School Reform and Student Achievement: A Meta-Analysis.
ERIC Educational Resources Information Center
Borman, Geoffrey D.; Hewes, Gina M.; Overman, Laura T.; Brown, Shelly
Using 232 studies, this meta-analysis reviewed the research on the achievement effects of the nationally disseminated and externally developed school improvement programs known as "whole-school" or "comprehensive" reforms. In addition to reviewing the overall achievement effects of comprehensive school reform (CSR), the meta…
Complexity analysis based on generalized deviation for financial markets
NASA Astrophysics Data System (ADS)
Li, Chao; Shang, Pengjian
2018-03-01
In this paper, a new modified method is proposed as a measure to investigate the correlation between past price and future volatility for financial time series, known as complexity analysis based on generalized deviation. In comparison with the former retarded volatility model, the new approach is both simple and computationally efficient. The method based on the generalized deviation function presents an exhaustive way of showing the quantization of financial market rules. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After analyzing the data from the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.
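As a point of comparison for the "former retarded volatility model" mentioned above, the sketch below computes the classic lagged return-volatility correlation on a synthetic return series; this is a standard baseline diagnostic (with one common choice of normalization), not the generalized deviation measure introduced in the paper.

```python
import numpy as np

def leverage_correlation(returns, max_lag):
    """Lagged return-volatility correlation L(tau) = <r_t * r_{t+tau}^2> / <r^2>^2.

    A standard baseline for past-return / future-volatility coupling; the
    normalization by <r^2>^2 is one common convention.
    """
    r = np.asarray(returns, dtype=float)
    norm = np.mean(r ** 2) ** 2
    L = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        L[tau - 1] = np.mean(r[:-tau] * r[tau:] ** 2) / norm
    return L

# Synthetic example: i.i.d. Gaussian returns (no leverage effect expected, L ~ 0).
rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, size=10_000)
print(leverage_correlation(r, max_lag=5))
```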
Thermal radiative properties: Coatings.
NASA Technical Reports Server (NTRS)
Touloukian, Y. S.; Dewitt, D. P.; Hernicz, R. S.
1972-01-01
This volume consists, for the most part, of a presentation of numerical data compiled over the years in a most comprehensive manner on coatings for all applications, in particular, thermal control. After a moderately detailed discussion of the theoretical nature of the thermal radiative properties of coatings, together with an overview of predictive procedures and recognized experimental techniques, extensive numerical data on the thermal radiative properties of pigmented, contact, and conversion coatings are presented. These data cover metallic and nonmetallic pigmented coatings, enamels, metallic and nonmetallic contact coatings, antireflection coatings, resin coatings, metallic black coatings, and anodized and oxidized conversion coatings.
CFD Techniques for Propulsion Applications
NASA Technical Reports Server (NTRS)
1992-01-01
The symposium was composed of the following sessions: turbomachinery computations and validations; flow in ducts, intakes, and nozzles; and reacting flows. Forty papers were presented, and they covered full 3-D code validation and numerical techniques; multidimensional reacting flow; and unsteady viscous flow for the entire spectrum of propulsion system components. The capabilities of the various numerical techniques were assessed and significant new developments were identified. The technical evaluation spells out where progress has been made and concludes that the present state of the art has almost reached the level necessary to tackle the comprehensive topic of computational fluid dynamics (CFD) validation for propulsion.
Impact of comprehensive two-dimensional gas chromatography with mass spectrometry on food analysis.
Tranchida, Peter Q; Purcaro, Giorgia; Maimone, Mariarosa; Mondello, Luigi
2016-01-01
Comprehensive two-dimensional gas chromatography with mass spectrometry has been on the separation-science scene for about 15 years. This three-dimensional method has made a great positive impact on various fields of research, and among these that related to food analysis is certainly at the forefront. The present critical review is based on the use of comprehensive two-dimensional gas chromatography with mass spectrometry in the untargeted (general qualitative profiling and fingerprinting) and targeted analysis of food volatiles; attention is focused not only on its potential in such applications, but also on how recent advances in comprehensive two-dimensional gas chromatography with mass spectrometry will potentially be important for food analysis. Additionally, emphasis is devoted to the many instances in which straightforward gas chromatography with mass spectrometry is a sufficiently-powerful analytical tool. Finally, possible future scenarios in the comprehensive two-dimensional gas chromatography with mass spectrometry food analysis field are discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Instruction of Research-Based Comprehension Strategies in Basal Reading Programs
ERIC Educational Resources Information Center
Pilonieta, Paola
2010-01-01
Research supports using research-based comprehension strategies; however, comprehension strategy instruction is not highly visible in basal reading programs or classroom instruction, resulting in many students who struggle with comprehension. A content analysis examined which research-based comprehension strategies were presented in five…
Comprehensive rotorcraft analysis methods
NASA Technical Reports Server (NTRS)
Stephens, Wendell B.; Austin, Edward E.
1988-01-01
The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).
DOT National Transportation Integrated Search
1995-05-01
In October, of 1992, the Housatonic Area Regional Transit (HART) District published a planning study providing an in-depth analysis of its fixed route bus transit service. This comprehensive operational analysis (COA) was the first detailed analysis ...
Direct analysis in real time mass spectrometry for analysis of sexual assault evidence.
Musah, Rabi A; Cody, Robert B; Dane, A John; Vuong, Angela L; Shepard, Jason R E
2012-05-15
Sexual assault crimes are vastly underreported and suffer from alarmingly low prosecution and conviction rates. The key scientific method to aid in prosecution of such cases is forensic DNA analysis, where biological evidence such as semen collected using a rape test kit is used to determine a suspect's DNA profile. However, the growing awareness by criminals of the importance of DNA in the prosecution of sexual assaults has resulted in increased condom use by assailants as a means to avoid leaving behind their DNA. Thus, other types of trace evidence are important to help corroborate victims' accounts, exonerate the innocent, link suspects to the crime, or confirm penetration. Direct Analysis in Real Time Mass Spectrometry (DART-MS) was employed for the comprehensive characterization of non-DNA trace evidence associated with sexual assault. The ambient ionization method associated with DART-MS is extremely rapid and samples are processed instantaneously, without the need for extraction, sample preparation, or other means that might compromise forensic evidence for future analyses. In a single assay, we demonstrated the ability to identify lubricant formulations associated with sexual assault, such as the spermicide nonoxynol-9, compounds used in condom manufacture, and numerous other trace components as probative evidence. In addition, the method can also serve to identify compounds within trace biological residues, such as fatty acids commonly identified in latent fingerprints. Characterization of lubricant residues as probative evidence serves to establish a connection between the victim and the perpetrator, and the availability of these details may lead to higher rates of prosecution and conviction, as well as more severe penalties. The methodology described here opens the way for the adoption of a comprehensive, rapid, and sensitive analysis for use in crime labs, while providing knowledge that can inform and guide criminal justice policy and practice. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Chen, C. P.
1990-01-01
An existing Computational Fluid Dynamics code for simulating complex turbulent flows inside a liquid rocket combustion chamber was validated and further developed. The Advanced Rocket Injector/Combustor Code (ARICC) is simplified and validated against benchmark flow situations for laminar and turbulent flows. The numerical method used in the ARICC code is re-examined for incompressible flow calculations. For turbulent flows, both the subgrid and the two-equation k-epsilon turbulence models are studied. Cases tested include the idealized Burgers' equation in complex geometries and boundaries, a laminar pipe flow, a high Reynolds number turbulent flow, and a confined coaxial jet with recirculations. The accuracy of the algorithm is examined by comparing the numerical results with analytical solutions as well as experimental data for different grid sizes.
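As an example of the kind of idealized benchmark mentioned (the Burgers equation), the sketch below advances the 1-D viscous Burgers equation u_t + u u_x = nu u_xx with a simple explicit upwind/central finite-difference scheme; this is a generic illustration of such a test case, not the ARICC discretization.

```python
import numpy as np

def burgers_1d(nx=201, nu=0.07, t_end=0.5, cfl=0.4):
    """Explicit finite-difference solution of u_t + u u_x = nu u_xx on [0, 2*pi]."""
    x = np.linspace(0.0, 2.0 * np.pi, nx)
    dx = x[1] - x[0]
    u = np.sin(x) + 1.5                       # smooth initial condition, u > 0
    t = 0.0
    while t < t_end:
        # time step limited by both convective CFL and explicit diffusion stability
        dt = min(cfl * dx / np.abs(u).max(), 0.25 * dx * dx / nu, t_end - t)
        un = u.copy()
        # first-order upwind convection (u > 0) + central diffusion, periodic BCs
        conv = un * (un - np.roll(un, 1)) / dx
        diff = nu * (np.roll(un, -1) - 2.0 * un + np.roll(un, 1)) / dx**2
        u = un - dt * conv + dt * diff
        t += dt
    return x, u

x, u = burgers_1d()
print(u.min(), u.max())
```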
Number games, magnitude representation, and basic number skills in preschoolers.
Whyte, Jemma Catherine; Bull, Rebecca
2008-03-01
The effect of 3 intervention board games (linear number, linear color, and nonlinear number) on young children's (mean age = 3.8 years) counting abilities, number naming, magnitude comprehension, accuracy in number-to-position estimation tasks, and best-fit numerical magnitude representations was examined. Pre- and posttest performance was compared following four 25-min intervention sessions. The linear number board game significantly improved children's performance in all posttest measures and facilitated a shift from a logarithmic to a linear representation of numerical magnitude, emphasizing the importance of spatial cues in estimation. Exposure to the number card games involving nonsymbolic magnitude judgments and association of symbolic and nonsymbolic quantities, but without any linear spatial cues, improved some aspects of children's basic number skills but not numerical estimation precision.
ERIC Educational Resources Information Center
Lusk, Stephanie L.; Koch, Lynn C.; Paul, Teresia M.
2016-01-01
Purpose: In this article, we examined how individuals with co-occurring psychiatric disabilities and substance use disorders encounter numerous challenges when it comes to the vocational rehabilitation (VR) process. Method: A comprehensive review of the literature demonstrated barriers to service delivery (e.g., access to services, exclusionary…
Once in a Million Years: Teaching Geologic Time
ERIC Educational Resources Information Center
Lewis, Susan E.; Lampe, Kristen A.; Lloyd, Andrew J.
2005-01-01
The authors advocate that students frequently lack fundamental numerical literacy on the order of millions or billions, and that this comprehension is critical to grasping key evolutionary concepts related to the geologic time scale, the origin and diversification of life on earth, and other concepts such as the national debt, human population…
Wildlife-Habitat Relationships in California's Oak Woodlands: Where Do We Go From Here?
Michael L. Morrison; William M. Block; Jared Verner
1991-01-01
We discuss management goals and research directions for a comprehensive study of wildlife in California's oak woodlands. Oak woodlands are under intensive multiple use, including urbanization, recreation, grazing, fuel wood cutting, and hunting. Research in oak woodlands is thus complicated by these numerous, often competing, interests. Complicating understanding...
ERIC Educational Resources Information Center
Brusseau, Timothy A.; Hannon, James C.
2015-01-01
Physical activity is associated with numerous academic and health benefits. Furthermore, schools have been identified as an ideal location to promote physical activity as most youth attend school regularly from ages 5-18. Unfortunately, in an effort to increase academic learning time, schools have been eliminating traditional activity…
The Decline in the Number of Black Teachers Can Be Reversed.
ERIC Educational Resources Information Center
Hackley, Lloyd V.
1985-01-01
Rising state standards and competency testing of entering teachers is resulting in a numerical decline in Black candidates who are ill equipped due to weak programs in the secondary schools. A comprehensive program at the University of Arkansas Pine Bluff to provide both educational equity and educational improvement is described. (BS)
Jentes, Emily S.; Millman, Alexander J.; Decenteceo, Michelle; Klevos, Andrew; Biggs, Holly M.; Esposito, Douglas H.; McPherson, Heidi; Sullivan, Carmen; Voorhees, Dayton; Watkins, Jim; Anzalone, Fanancy L.; Gaul, Linda; Flores, Sal; Brunette, Gary W.; Sotir, Mark J.
2017-01-01
Public health investigations can require intensive collaboration between numerous governmental and nongovernmental organizations. We describe an investigation involving several governmental and nongovernmental partners that was successfully planned and performed in an organized, comprehensive, and timely manner. PMID:27601520
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-28
... and beyond. The modeling was based on PM Source Apportionment Technology (PSAT) for the Comprehensive... sources and the State adequately determined the apportionment of those pollutants from sources located... Class I areas caused by emissions of air pollutants from numerous sources located over a wide geographic...
ERIC Educational Resources Information Center
Zhou, Xiaolin; Jiang, Xiaoming; Ye, Zheng; Zhang, Yaxu; Lou, Kaiyang; Zhan, Weidong
2010-01-01
An event-related potential (ERP) study was conducted to investigate the temporal neural dynamics of semantic integration processes at different levels of syntactic hierarchy during Chinese sentence reading. In a hierarchical structure, "subject noun" + "verb" + "numeral" + "classifier" + "object noun," the object noun is constrained by selectional…
Formative Assessment in the Digital Age: Blogging with Third Graders
ERIC Educational Resources Information Center
Stover, Katie; Yearta, Lindsay; Harris, Caroline
2016-01-01
There are numerous benefits of using blogs to discuss reading in the elementary classroom. Teachers can assess reading comprehension for individual students while managing several book clubs in a digital space. The resulting assessment-based data can be used to differentiate instruction. Additionally, students can experience growth as independent,…
Educating for Ecological Sustainability: Montessori Education Leads the Way
ERIC Educational Resources Information Center
Sutton, Ann
2009-01-01
These days, the word "green," and the more comprehensive term "sustainability," surface in numerous arenas, whether it be exhortations to recycle more, employ compact fluorescent lightbulbs, use less hot water, avoid products with excess packaging, adjust thermostats, plant trees, turn off electronic devices when not in use, or buy organic and…
Women Workers: Trends & Issues, 1993 Handbook.
ERIC Educational Resources Information Center
Women's Bureau (DOL), Washington, DC.
This handbook offers a comprehensive view of the labor force activity of women in the United States and describes a range of legal and socioeconomic developments that have had an effect upon women's participation and progress in the work force. Through numerous statistical charts and tables, the book depicts change and reactions to change in the…
Evaluating Evidence for Ed-Tech Product Effectiveness: Guidelines for School Districts
ERIC Educational Resources Information Center
Center for Research and Reform in Education, 2017
2017-01-01
Educational technology products offer potentially effective means of supporting teaching and learning in K-12 classrooms. But for any given instructional need, there are likely to be numerous product options available for purchase. What can school districts do to help ensure that good selections are made? In a recent, comprehensive study of…
Where Do I Start? A School Library Handbook
ERIC Educational Resources Information Center
Linworth, 2001
2001-01-01
Whether you are a new school library media specialist just starting in the profession, or a more experienced librarian who may not have the time to research and gather the information you need, this comprehensive guide provides an excellent overview of numerous topics related to school libraries. Written in easy-to-understand language, this…
Perceived Personal and Social Competence: Development of Valid and Reliable Measures
ERIC Educational Resources Information Center
Fetro, Joyce V.; Rhodes, Darson L.; Hey, David W.
2010-01-01
During the last 20 years, youth programming has shifted from risk reduction to youth development. While numerous instruments exist to measure selected individual characteristics/competencies among youth, a comprehensive instrument to measure four constructs of personal and social skills could not be identified. The purpose of this study was to…
Deaf Students' Reading and Writing in College: Fluency, Coherence, and Comprehension
ERIC Educational Resources Information Center
Albertini, John A.; Marschark, Marc; Kincheloe, Pamela J.
2016-01-01
Research in discourse reveals numerous cognitive connections between reading and writing. Rather than one being the inverse of the other, there are parallels and interactions between them. To understand the variables and possible connections in the reading and writing of adult deaf students, we manipulated writing conditions and reading texts.…
Cross-Cultural Peer Mentoring: One Approach to Enhancing White Faculty Adjustment at Black Colleges
ERIC Educational Resources Information Center
Louis, Dave A.
2015-01-01
White faculty members at Black colleges in the United States face numerous social obstacles. Exploring the experiences of White faculty members at four historically Black colleges and universities (HBCUs) and their adjustment to a minority status assists the comprehension of issues surrounding this subgroup. Utilizing a phenomenological approach,…
ERIC Educational Resources Information Center
Varjo, Janne; Kalalahti, Mira
2015-01-01
Since the 1980s, numerous education reforms have sought to dismantle centralised bureaucracies and replace them with devolved systems of schooling that emphasise parental choice and competition between increasingly diversified types of schools. Nevertheless, the "Finnish variety of 'post-comprehensivism'" continues to emphasise…
Low gravity two-phase flow with heat transfer
NASA Technical Reports Server (NTRS)
Antar, Basil N.
1991-01-01
A realistic model for the transfer line chilldown operation under low-gravity conditions is developed to provide a comprehensive predictive capability for the behavior of liquid-vapor, two-phase diabatic flows in pipes. The tasks described involve the development of a numerical code and the establishment of the necessary experimental data base for low-gravity simulation.
Conducting Adolescent Violence Risk Assessments: A Framework for School Counselors
ERIC Educational Resources Information Center
Bernes, Kerry B.; Bardick, Angela D.
2007-01-01
There have been numerous publications devoted to preventing violence and bullying in schools, resulting in school counselors being well equipped with school-wide violence prevention ideas and programs. Despite these violence prevention efforts, some students may pose a threat to others and thus may require a comprehensive assessment for violence…
Svider, Peter F; Keeley, Brieze R; Zumba, Osvaldo; Mauro, Andrew C; Setzen, Michael; Eloy, Jean Anderson
2013-08-01
Malpractice litigation has increased in recent decades, contributing to higher health-care costs. Characterization of complications leading to litigation is of special interest to practitioners of facial plastic surgery procedures because of the higher proportion of elective cases relative to other subspecialties. In this analysis, we comprehensively examine malpractice litigation in facial plastic surgery procedures and characterize factors important in determining legal responsibility, as this information may be of great interest and use to practitioners in several specialties. Retrospective analysis. The Westlaw legal database was examined for court records pertaining to facial plastic surgery procedures. The term "medical malpractice" was searched in combination with numerous procedures obtained from the American Academy of Facial Plastic and Reconstructive Surgery website. Of the 88 cases included, 62.5% were decided in the physician's favor, 9.1% were resolved with an out-of-court settlement, and 28.4% ended in a jury awarding damages for malpractice. The mean settlement was $577,437 and mean jury award was $352,341. The most litigated procedures were blepharoplasties and rhinoplasties. Alleged lack of informed consent was noted in 38.6% of cases; other common complaints were excessive scarring/disfigurement, functional considerations, and postoperative pain. This analysis characterized factors in determining legal responsibility in facial plastic surgery cases. Several factors were identified as potential targets for minimizing liability. Informed consent was the most reported entity in these malpractice suits. This finding emphasizes the importance of open communication between physicians and their patients regarding expectations as well as documentation of specific risks, benefits, and alternatives. © 2013 The American Laryngological, Rhinological, and Otological Society, Inc.
Rubble masonry response under cyclic actions: The experience of L’Aquila city (Italy)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fonti, Roberta, E-mail: roberta.fonti@tum.de; Barthel, Rainer, E-mail: r.barthel@lrz.tu-muenchen.de; Formisano, Antonio, E-mail: antoform@unina.it
2015-12-31
Several methods of analysis are available in engineering practice to study old masonry constructions. Two commonly used approaches in the field of seismic engineering are global and local analyses. Despite several years of research in this field, the various methodologies suffer from a lack of comprehensive experimental validation. This is mainly due to the difficulty in simulating the many different kinds of masonry and, accordingly, the non-linear response under horizontal actions. This issue can be addressed by examining the local response of isolated panels under monotonic and/or alternate actions. Different testing methodologies are commonly used to identify the local response of old masonry. These range from simplified pull-out tests to sophisticated in-plane monotonic tests. However, there is a lack of both knowledge and critical comparison between experimental validations and numerical simulations. This is mainly due to the difficulties in implementing irregular settings within both simplified and advanced numerical analyses. Similarly, the simulation of degradation effects within laboratory tests is difficult with respect to old masonry in-situ boundary conditions. Numerical models, particularly of rubble masonry, are commonly simplified. They are mainly based on a kinematic chain of rigid blocks able to reproduce the different "modes of damage" of structures subjected to horizontal actions. This paper presents an innovative methodology for testing; its aim is to identify a simplified model for the out-of-plane response of rubblework with respect to the experimental evidence. The case study of the L'Aquila district is discussed.
Comprehensive Thematic T-Matrix Reference Database: A 2015-2017 Update
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas
2017-01-01
The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.
A retrospective analysis of the Dermatology Foundation's Career Development Award Program.
Boris, Chris; Lessin, Stuart R; Wintroub, Bruce U; Yancey, Kim B
2012-11-01
To provide research support that develops and retains leaders, educators, and investigators in dermatology and cutaneous biology, the Dermatology Foundation (DF) has designed and implemented a comprehensive Career Development Award (CDA) Program. To assess the impact of the DF's 3-year CDA, a comprehensive survey of recipients who received this mechanism of support between 1990 and 2007 was performed. Of 196 individuals receiving a DF CDA, 181 were identified and asked to complete a comprehensive questionnaire concerning their career status, employment history, professional rank, and record of independent research funding (private foundation, federal, other). A personal assessment of the impact of this funding on these individuals' career trajectory was also requested. Eighty percent of 181 CDA recipients identified currently hold full- or part-time positions in academic medicine. The faculty rank of 112 survey respondents included 46 assistant professors (41%), 41 associate professors (37%), 18 professors (16%), and 7 division or departmental chairs (6%). Of respondents, 84% reported that they have received subsequent independent research funding; 95 of these individuals (86%) have received funding from a federal agency (235 federal grants awarded to date with funding >$318M). The study was retrospective and self-reported; some awardees did not respond to the survey. The DF's CDA Program has succeeded in supporting the early career development of talented investigators, educators, and leaders; fostered the promotion and retention of these individuals in academic medicine; and nucleated numerous investigative careers that have successfully acquired independent research funding. Copyright © 2012 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
ERIC Educational Resources Information Center
Robson, Holly; Keidel, James L.; Lambon Ralph, Matthew A.; Sage, Karen
2012-01-01
Wernicke's aphasia is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in Wernicke's aphasia, a view consistent with current functional neuroimaging which finds…
Comprehensive analysis of heat transfer of gold-blood nanofluid (Sisko-model) with thermal radiation
NASA Astrophysics Data System (ADS)
Eid, Mohamed R.; Alsaedi, Ahmed; Muhammad, Taseer; Hayat, Tasawar
Characteristics of heat transfer of gold nanoparticles (Au-NPs) in flow past a power-law stretching surface are discussed. Sisko bio-nanofluid flow (with blood as the base fluid) in the presence of non-linear thermal radiation is studied. The governing equations of the suggested problem are modelled as a system of non-linear PDEs. Along with the initial and boundary conditions, the equations are made non-dimensional and then solved numerically using the fourth-fifth order Runge-Kutta-Fehlberg (RKF45) technique with a shooting integration procedure. The behaviors of various flow quantities are examined for parametric variations such as the Au-NPs volume fraction, the exponential stretching and thermal radiation parameters. It is observed that radiation tends to reduce the thermal boundary-layer thickness and therefore results in better heat transfer at the surface.
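The solution strategy above (non-dimensionalize, then integrate with RKF45 plus shooting) can be illustrated with a minimal sketch. The toy boundary value problem below (f'' = -f, f(0) = 0, f(1) = 1) and SciPy's adaptive Runge-Kutta integrator stand in for the paper's Sisko bio-nanofluid equations and its RKF45 routine; both are assumptions for illustration, not the authors' code.

```python
# Hedged sketch: shooting method with an adaptive Runge-Kutta (RK45) integrator,
# illustrating the general RKF45-plus-shooting strategy mentioned in the abstract.
# The ODE below (f'' = -f, f(0)=0, f(1)=1) is a stand-in, NOT the Sisko nanofluid model.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def rhs(x, y):
    # y[0] = f, y[1] = f'
    return [y[1], -y[0]]

def boundary_mismatch(slope):
    # Integrate from x=0 with a guessed initial slope and measure the error at x=1.
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope], method="RK45",
                    rtol=1e-8, atol=1e-10)
    return sol.y[0, -1] - 1.0   # want f(1) = 1

# Root-find the initial slope so the far boundary condition is met (shooting step).
slope = brentq(boundary_mismatch, 0.1, 5.0)
print(f"converged initial slope f'(0) = {slope:.6f} (exact: 1/sin(1) = {1/np.sin(1):.6f})")
```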
NASA Astrophysics Data System (ADS)
Zhou, Zhenhuan; Li, Yuejie; Fan, Junhai; Rong, Dalun; Sui, Guohao; Xu, Chenghui
2018-05-01
A new Hamiltonian-based approach is presented for finding exact solutions for transverse vibrations of double-nanobeam-systems embedded in an elastic medium. The continuum model is established within the frameworks of the symplectic methodology and the nonlocal Euler-Bernoulli and Timoshenko beam theories. The symplectic eigenfunctions are obtained after expressing the governing equations in a Hamiltonian form. Exact frequency equations, vibration modes and displacement amplitudes are obtained by using the symplectic eigenfunctions and end conditions. Comparisons with previously published work are presented to illustrate the accuracy and reliability of the proposed method. The comprehensive results for arbitrary boundary conditions could serve as benchmark results for verifying numerically obtained solutions. In addition, a study on the difference between the nonlocal beam and the nonlocal plate is also included.
Characteristic Analysis and Experiment of a Dynamic Flow Balance Valve
NASA Astrophysics Data System (ADS)
Bin, Li; Song, Guo; Xuyao, Mao; Chao, Wu; Deman, Zhang; Jin, Shang; Yinshui, Liu
2017-12-01
The comprehensive characteristics of a dynamic flow balance valve for a water system were analysed. The flow balance valve changes its drag coefficient automatically according to the condition of the system, so that the effective controlled flowrate remains constant within the working pressure range. The structure of the flow balance valve is introduced, and the theoretical calculation formula for the variable opening of the valve core is derived. A rated pressure range of 20 kPa to 200 kPa and a rated flowrate of 10 m3/h were used in the numerical work. Static and CFX fluid analyses show good behaviour: through optimization of the valve core structure and improved design of the compression spring, the dynamic flow balance valve can stabilize the system flowrate effectively. Experiments show that the flow control accuracy is within 5%.
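As a rough illustration of how holding a constant flowrate over the 20-200 kPa range constrains the valve opening, the sketch below uses the generic incompressible orifice relation Q = Cd*A*sqrt(2*dP/rho). The discharge coefficient and the orifice model itself are assumptions, not the theoretical valve-core formula derived in the paper.

```python
# Hedged sketch: required orifice area vs. pressure drop to hold a constant 10 m^3/h,
# using the generic orifice equation Q = Cd * A * sqrt(2*dP/rho).
# Cd and the incompressible-orifice model are illustrative assumptions,
# not the valve-core formula derived in the paper.
import numpy as np

rho = 1000.0          # water density, kg/m^3
Cd = 0.62             # assumed discharge coefficient
Q = 10.0 / 3600.0     # 10 m^3/h in m^3/s

for dp_kpa in (20, 50, 100, 200):   # rated pressure range from the abstract
    dp = dp_kpa * 1e3
    area = Q / (Cd * np.sqrt(2.0 * dp / rho))   # m^2
    print(f"dP = {dp_kpa:3d} kPa -> required flow area ~ {area*1e6:7.1f} mm^2")
```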
Grenier, Marie-Lyne
2015-01-01
The purpose of this study was to gain a comprehensive understanding of the facilitators of and barriers to learning within occupational therapy fieldwork education from the perspective of both Canadian and American students. A qualitative study using an online open survey format was conducted to gather data from 29 occupational therapy students regarding their fieldwork experiences. An inductive grounded theory approach to content analysis was used. Individual, environmental, educational, and institutional facilitators of and barriers to learning within occupational therapy fieldwork education were identified. This study's findings suggest that learning within fieldwork education is a highly individual and dynamic process that is influenced by numerous factors. The new information generated by this study has the potential to positively affect the future design and implementation of fieldwork education. Copyright © 2015 by the American Occupational Therapy Association, Inc.
A computerized multidimensional measurement of mental workload via handwriting analysis.
Luria, Gil; Rosenblum, Sara
2012-06-01
The goal of this study was to test the effect of mental workload on handwriting behavior and to identify characteristics of low versus high mental workload in handwriting. We hypothesized differences between handwriting under three different load conditions and tried to establish a profile that integrated these indicators. Fifty-six participants wrote three numerical progressions of varying difficulty on a digitizer attached to a computer so that we could evaluate their handwriting behavior. Differences were found in temporal, spatial, and angular velocity handwriting measures, but no significant differences were found for pressure measures. Using data reduction, we identified three clusters of handwriting, two of which differentiated well according to the three mental workload conditions. We concluded that handwriting behavior is affected by mental workload and that each measure provides distinct information, so that they present a comprehensive indicator of mental workload.
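A minimal sketch of the data-reduction-plus-clustering step reported above (principal components followed by three-cluster k-means) is given below; the synthetic handwriting features are invented for illustration and do not reproduce the study's measures.

```python
# Hedged sketch: data reduction + 3-cluster k-means of the kind described in the abstract.
# The synthetic "temporal / spatial / angular-velocity / pressure" features are invented here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 56 participants x 3 workload conditions, 4 handwriting measures each (synthetic)
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(56, 4))
               for m in ([0, 0, 0, 0], [1.5, 1.0, 0.8, 0.1], [3.0, 2.2, 1.6, 0.2])])

X_std = StandardScaler().fit_transform(X)
X_red = PCA(n_components=2).fit_transform(X_std)        # data reduction step
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_red)
print(np.bincount(labels))   # cluster sizes
```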
NASA Technical Reports Server (NTRS)
Demerdash, N. A.; Nehl, T. W.
1980-01-01
A comprehensive digital model for the analysis and possible optimization of the closed loop dynamic (instantaneous) performance of a power conditioner fed, brushless dc motor powered, electromechanical actuator system (EMA) is presented. This model was developed for the simulation of the dynamic performance of an actual prototype EMA built for NASA-JSC as a possible alternative to hydraulic actuators for consideration in Space Shuttle Orbiter applications. Excellent correlation was achieved between numerical model simulation and experimental test results obtained from the actual hardware. These results include: various current and voltage waveforms in the machine-power conditioner (MPC) unit, flap position as well as other control loop variables in response to step commands of change of flap position. These results with consequent conclusions are detailed in the paper.
Dark matter from a classically scale-invariant SU(3)_X
NASA Astrophysics Data System (ADS)
Karam, Alexandros; Tamvakis, Kyriakos
2016-09-01
In this work we study a classically scale-invariant extension of the Standard Model in which the dark matter and electroweak scales are generated through the Coleman-Weinberg mechanism. The extra SU(3)_X gauge factor gets completely broken by the vacuum expectation values of two scalar triplets. Out of the eight resulting massive vector bosons, the three lightest are stable due to an intrinsic Z2×Z2' discrete symmetry and can constitute dark matter candidates. We analyze the phenomenological viability of the predicted multi-Higgs sector imposing theoretical and experimental constraints. We perform a comprehensive analysis of the dark matter predictions of the model, solving numerically the set of coupled Boltzmann equations involving all relevant dark matter processes, and explore the direct detection prospects of the dark matter candidates.
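A heavily simplified stand-in for the numerical step described above is the textbook single-species freeze-out Boltzmann equation, dY/dx = -(lambda/x^2)(Y^2 - Yeq^2); the sketch below integrates it with SciPy. The coupling strength and equilibrium yield are schematic assumptions and do not represent the model's coupled multi-state system.

```python
# Hedged sketch: textbook single-species freeze-out Boltzmann equation,
# dY/dx = -(lambda/x^2) * (Y^2 - Yeq^2), x = m/T.  This is a schematic stand-in,
# not the coupled multi-state system solved in the paper.
import numpy as np
from scipy.integrate import solve_ivp

lam = 1e9                                  # schematic annihilation strength
def y_eq(x):                               # non-relativistic equilibrium yield (up to constants)
    return 0.145 * x**1.5 * np.exp(-x)

def boltzmann(x, Y):
    return -(lam / x**2) * (Y[0]**2 - y_eq(x)**2)

sol = solve_ivp(boltzmann, (1.0, 1000.0), [y_eq(1.0)],
                method="LSODA", rtol=1e-8, atol=1e-30)
print(f"relic yield Y(x=1000) ~ {sol.y[0, -1]:.3e}")
```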
NASA Astrophysics Data System (ADS)
Liu, Liang-kui; Shi, Cheng; Zhang, Yi-bo; Sun, Lei
2017-04-01
A tri-gate Ge-based tunneling field-effect transistor (TFET) has been numerically studied with technology computer aided design (TCAD) tools. A dopant-segregated Schottky source/drain is applied to the device structure design (DS-TFET). The characteristics of the DS-TFET are compared and analyzed comprehensively. It is found that the performance of the n-channel tri-gate DS-TFET with a positive bias is insensitive to the dopant concentration and barrier height at the n-type drain, and that the dopant concentration and barrier height at the p-type source considerably affect the device performance. The domination of the electron current in the entire BTBT current of this device accounts for this phenomenon, and the tri-gate DS-TFET is shown to have a higher performance than its dual-gate counterpart.
Badhwar - O'Neill 2014 Galactic Cosmic Ray Flux Model Description
NASA Technical Reports Server (NTRS)
O'Neill, P. M.; Golge, S.; Slaba, T. C.
2014-01-01
The Badhwar-O'Neill (BON) Galactic Cosmic Ray (GCR) model is based on GCR measurements from particle detectors. The model has mainly been used by NASA to certify microelectronic systems and to analyze radiation health risks to astronauts on space missions. The BON14 model numerically solves the Fokker-Planck differential equation to account for particle transport in the heliosphere due to diffusion, convection, and adiabatic deceleration under the assumption of a spherically symmetric heliosphere. The model also incorporates an empirical time delay function to account for the lag of the solar activity to reach the boundary of the heliosphere. This technical paper describes the most recent improvements in parameter fits to the BON model (BON14). Using a comprehensive measurement database, it is shown that BON14 is significantly improved over the previous version, BON11.
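BON14 itself solves the Fokker-Planck transport equation; a far simpler and widely used stand-in for illustrating solar modulation is the force-field approximation sketched below, in which a local interstellar spectrum (LIS) is shifted by a modulation potential. The power-law LIS and the 600 MV potential are placeholders, not BON14 parameters.

```python
# Hedged sketch: force-field approximation to solar modulation of a GCR proton spectrum.
# J(E) = J_LIS(E + Phi) * E*(E + 2*E0) / ((E + Phi)*(E + Phi + 2*E0)),
# with Phi = (Z/A)*phi.  The power-law LIS and phi = 600 MV are illustrative
# placeholders, NOT the BON14 Fokker-Planck solution or its fitted parameters.
import numpy as np

E0 = 938.272          # proton rest energy, MeV
phi = 600.0           # modulation potential, MV (placeholder)
Z, A = 1, 1
Phi = (Z / A) * phi   # MeV for protons

def j_lis(E_kin):
    # crude power law in total energy as a stand-in for a real LIS parameterization
    return 1.0e4 * (E_kin + E0) ** -2.7

def j_modulated(E_kin):
    Es = E_kin + Phi
    return j_lis(Es) * E_kin * (E_kin + 2 * E0) / (Es * (Es + 2 * E0))

for E in (100.0, 1000.0, 10000.0):   # kinetic energy, MeV
    print(f"E = {E:7.0f} MeV  J_LIS = {j_lis(E):.3e}  J_mod = {j_modulated(E):.3e}")
```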
NASA Astrophysics Data System (ADS)
Li, Xin; Menenti, Massimo
2010-10-01
The general objective of project 5322 in the Dragon 2 programme is to quantitatively retrieve key eco-hydrological parameters by using remotely sensed data, especially from ESA, Chinese, and Third Party Mission (TPM) sensors. To achieve this goal, a comprehensive observation experiment, the Watershed Allied Telemetry Experimental Research (WATER), was carried out. WATER is a simultaneous airborne, satellite-borne, and ground-based remote sensing experiment that took place in the Heihe River Basin, a typical inland river basin in the northwest of China. This paper introduces the background and implementation of WATER. The data obtained so far are described in detail. After two years of data analysis, numerous results have also been achieved. This paper presents some early results of WATER as well.
Tan, Ming-Hui; Chong, Kok-Keong; Wong, Chee-Woon
2014-01-20
Optimization of the design of a nonimaging dish concentrator (NIDC) for a dense-array concentrator photovoltaic system is presented. A new algorithm has been developed to determine the configuration of facet mirrors in a NIDC. Analytical formulas were derived to analyze the optical performance of a NIDC and then compared with simulated results obtained from a numerical method. A comprehensive analysis of optical performance via the analytical method has been carried out based on the facet dimension and focal distance of a concentrator with a total reflective area of 120 m2. The result shows that a facet dimension of 49.8 cm, a focal distance of 8 m, and a solar concentration ratio of 411.8 suns give the most optimized design for the lowest cost-per-output power, which is US$1.93 per watt.
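As a quick arithmetic check on the quoted figures (my back-calculation, not part of the paper), the geometric concentration ratio is roughly the reflective area divided by the illuminated receiver area:

```python
# Hedged arithmetic check (my back-calculation, not from the paper):
# geometric concentration ratio  C ~ reflective area / receiver area.
reflective_area = 120.0      # m^2, from the abstract
concentration = 411.8        # suns, from the abstract

receiver_area = reflective_area / concentration
print(f"implied receiver area ~ {receiver_area:.3f} m^2 "
      f"(~{(receiver_area ** 0.5) * 100:.0f} cm on a side if square)")
```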
Global dynamics in a stoichiometric food chain model with two limiting nutrients.
Chen, Ming; Fan, Meng; Kuang, Yang
2017-07-01
Ecological stoichiometry studies the balance of energy and multiple chemical elements in ecological interactions to establish how nutrient content affects food-web dynamics and nutrient cycling in ecosystems. In this study, we formulate a food chain with two limiting nutrients in the form of a stoichiometric population model. A comprehensive global analysis of the rich dynamics of the targeted model is explored both analytically and numerically. Chaotic dynamics are observed in this simple stoichiometric food chain model and are compared with a traditional model without stoichiometry. The detailed comparison reveals that stoichiometry can reduce the parameter space for chaotic dynamics. Our findings also show that decreasing producer production efficiency may have only a small effect on consumer growth but a more profound impact on top predator growth. Copyright © 2017 Elsevier Inc. All rights reserved.
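A minimal sketch of numerically integrating a three-species food chain is shown below, using the classic (non-stoichiometric) Hastings-Powell system and its well-known chaotic parameter set as a stand-in for the paper's stoichiometric model.

```python
# Hedged sketch: integrating a classic (non-stoichiometric) Hastings-Powell
# three-species food chain as a stand-in for the paper's stoichiometric model.
import numpy as np
from scipy.integrate import solve_ivp

a1, b1, a2, b2, d1, d2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01   # classic chaotic parameter set

def food_chain(t, u):
    x, y, z = u                       # producer, consumer, top predator
    f1 = a1 * x / (1.0 + b1 * x)      # Holling type-II functional responses
    f2 = a2 * y / (1.0 + b2 * y)
    return [x * (1.0 - x) - f1 * y,
            f1 * y - f2 * z - d1 * y,
            f2 * z - d2 * z]

sol = solve_ivp(food_chain, (0.0, 5000.0), [0.8, 0.2, 8.0],
                t_eval=np.linspace(4000, 5000, 2000), rtol=1e-9)
print("top-predator range on the attractor:", sol.y[2].min(), "-", sol.y[2].max())
```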
Robson, Holly; Keidel, James L; Ralph, Matthew A Lambon; Sage, Karen
2012-01-01
Wernicke's aphasia is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in Wernicke's aphasia, a view consistent with current functional neuroimaging which finds areas in the superior temporal cortex responsive to phonological stimuli. However, behavioural evidence to support the link between a phonological analysis deficit and auditory comprehension has not yet been shown. This study extends seminal work by Blumstein, Baker, and Goodglass (1977) to investigate the relationship between acoustic-phonological perception, measured through phonological discrimination, and auditory comprehension in a case series of Wernicke's aphasia participants. A novel adaptive phonological discrimination task was used to obtain reliable thresholds of the phonological perceptual distance required between nonwords before they could be discriminated. Wernicke's aphasia participants showed significantly elevated thresholds compared to age- and hearing-matched control participants. Acoustic-phonological thresholds correlated strongly with auditory comprehension abilities in Wernicke's aphasia. In contrast, nonverbal semantic skills showed no relationship with auditory comprehension. The results are evaluated in the context of recent neurobiological models of language and suggest that impaired acoustic-phonological perception underlies the comprehension impairment in Wernicke's aphasia and favour models of language which propose a leftward asymmetry in phonological analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Wong, Bernice Y. L.
1986-01-01
Successful instructional strategies for enhancing the reading comprehension and comprehension test performance of learning disabled students are described. Students are taught to self-monitor their comprehension of expository materials and stories through recognition and analysis of recurrent elements and problem passages, content summarization,…
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, sonic boom and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
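The Kreisselmeier-Steinhauser (KS) aggregation mentioned above has a standard closed form, KS(g) = g_max + (1/rho) ln sum_i exp(rho*(g_i - g_max)); the sketch below evaluates it on invented constraint values to show how it smoothly envelopes the maximum.

```python
# Hedged sketch: standard Kreisselmeier-Steinhauser aggregation of several
# objective/constraint values into one smooth envelope function.
# The sample values g and the draw-down factor rho are illustrative only.
import numpy as np

def ks(g, rho=50.0):
    """Smooth, conservative approximation of max(g): KS >= max(g)."""
    g = np.asarray(g, dtype=float)
    g_max = g.max()
    return g_max + np.log(np.sum(np.exp(rho * (g - g_max)))) / rho

g = [0.12, -0.30, 0.05, 0.11]          # e.g. normalized constraint violations
for rho in (5.0, 50.0, 500.0):
    print(f"rho = {rho:5.1f}  KS = {ks(g, rho):.4f}  (max = {max(g):.4f})")
```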
Konishi, Eiichi; Nakashima, Yasuaki; Mano, Masayuki; Tomita, Yasuhiko; Nagasaki, Ikumitsu; Kubo, Toshikazu; Araki, Nobuhito; Haga, Hironori; Toguchida, Junya; Ueda, Takafumi; Sakuma, Toshiko; Imahori, Masaya; Morii, Eiichi; Yoshikawa, Hideki; Tsukamoto, Yoshitane; Futani, Hiroyuki; Wakasa, Kenichi; Hoshi, Manabu; Hamada, Shinshichi; Takeshita, Hideyuki; Inoue, Takeshi; Aono, Masanari; Kawabata, Kenji; Murata, Hiroaki; Katsura, Kanade; Urata, Yoji; Ueda, Hideki; Yanagisawa, Akio
2015-09-01
The aims of this study were: (i) to elucidate the clinicopathological characteristics of pcCHS of long bones (L), limb girdles (LG) and trunk (T) in Japan; (ii) to investigate, objectively, pathological findings predictive of the outcome of pcCHS of L, LG and T; and (iii) to elucidate the discrepancy in grade between biopsy and resected specimens. Clinicopathological profiles of 174 pcCHS (79 male, 95 female) of L, LG, and T were retrieved. For each case, a numerical score was given to 18 pathological findings. The average age was 50.5 years (15-80 years). Frequently involved sites were the femur, humerus, pelvis and rib. The 5-year and 10-year disease-specific survival (DSS) rates [follow-up: 1-258 months (average 65.5)] were 87.0% and 80.4%, respectively. In Cox hazards analysis of the pathological findings together with age, sex and location, higher histological grade and older age were unfavorable predictors, and calcification was a favorable predictor of DSS. The histological grade of the resected specimen was higher than that of the biopsy in 37.7% (26/69 cases). In conclusion, higher histological grade and older age were predictors of poor prognosis, whereas calcification predicted a good prognosis. Because there was a discrepancy in grade between biopsy and resected specimens, comprehensive evaluation is necessary before definitive operation for pcCHS. © 2015 The Authors. Pathology International published by Japanese Society of Pathology and Wiley Publishing Asia Pty Ltd.
Real-time decay of a highly excited charge carrier in the one-dimensional Holstein model
NASA Astrophysics Data System (ADS)
Dorfner, F.; Vidmar, L.; Brockt, C.; Jeckelmann, E.; Heidrich-Meisner, F.
2015-03-01
We study the real-time dynamics of a highly excited charge carrier coupled to quantum phonons via a Holstein-type electron-phonon coupling. This is a prototypical example for the nonequilibrium dynamics in an interacting many-body system where excess energy is transferred from electronic to phononic degrees of freedom. We use diagonalization in a limited functional space (LFS) to study the nonequilibrium dynamics on a finite one-dimensional chain. This method agrees with exact diagonalization and the time-evolving block-decimation method, in both the relaxation regime and the long-time stationary state, and among these three methods it is the most efficient and versatile one for this problem. We perform a comprehensive analysis of the time evolution by calculating the electron, phonon and electron-phonon coupling energies, and the electronic momentum distribution function. The numerical results are compared to analytical solutions for short times, for a small hopping amplitude and for a weak electron-phonon coupling. In the latter case, the relaxation dynamics obtained from the Boltzmann equation agrees very well with the LFS data. We also study the time dependence of the eigenstates of the single-site reduced density matrix, which defines the so-called optimal phonon modes. We discuss their structure in nonequilibrium and the distribution of their weights. Our analysis shows that the structure of optimal phonon modes contains very useful information for the interpretation of the numerical data.
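As a toy stand-in for the finite-chain calculation described above, the sketch below builds a two-site Holstein model with a hard phonon cutoff and evolves an excited electronic state with the exact propagator, tracking how electronic energy flows into the phonons. The chain length, phonon cutoff and couplings are illustrative choices, not the paper's LFS setup.

```python
# Hedged sketch: exact time evolution of a two-site Holstein model with a hard
# phonon cutoff, illustrating electron-to-phonon energy transfer. All parameters
# (t0, omega0, g, nmax) are illustrative; this is not the paper's LFS method.
import numpy as np
from scipy.linalg import expm

t0, omega0, g, nmax = 1.0, 1.0, 1.0, 6       # hopping, phonon freq, coupling, cutoff
b = np.diag(np.sqrt(np.arange(1, nmax)), 1)  # phonon annihilation operator (nmax levels)
n_ph = b.T @ b
I_ph = np.eye(nmax)
I_el = np.eye(2)
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]   # electron site projectors

hop = -t0 * np.array([[0.0, 1.0], [1.0, 0.0]])
H_el = np.kron(np.kron(hop, I_ph), I_ph)
H_ph = omega0 * (np.kron(np.kron(I_el, n_ph), I_ph) + np.kron(np.kron(I_el, I_ph), n_ph))
H_c = g * (np.kron(np.kron(P[0], b + b.T), I_ph) + np.kron(np.kron(P[1], I_ph), b + b.T))
H = H_el + H_ph + H_c

# initial state: excited (antibonding) electron, both phonon modes in their vacuum
psi_el = np.array([1.0, -1.0]) / np.sqrt(2.0)
vac = np.zeros(nmax); vac[0] = 1.0
psi = np.kron(np.kron(psi_el, vac), vac)

for t in (0.0, 2.0, 5.0, 10.0):
    phi = expm(-1j * H * t) @ psi
    print(f"t={t:5.1f}  <H_el>={np.real(phi.conj() @ H_el @ phi):+.3f}"
          f"  <H_ph>={np.real(phi.conj() @ H_ph @ phi):.3f}")
```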
Linear and nonlinear dynamic analysis of redundant load path bearingless rotor systems
NASA Technical Reports Server (NTRS)
Murthy, V. R.; Shultz, Louis A.
1994-01-01
The goal of this research is to develop the transfer matrix method to treat nonlinear autonomous boundary value problems with multiple branches. The application is the complete nonlinear aeroelastic analysis of multiple-branched rotor blades. Once the development is complete, it can be incorporated into the existing transfer matrix analyses. There are several difficulties to be overcome in reaching this objective. The conventional transfer matrix method is limited in that it is applicable only to linear branch chain-like structures, but consideration of multiple branch modeling is important for bearingless rotors. Also, hingeless and bearingless rotor blade dynamic characteristics (particularly their aeroelasticity problems) are inherently nonlinear. The nonlinear equations of motion and the multiple-branched boundary value problem are treated together using a direct transfer matrix method. First, the formulation is applied to a nonlinear single-branch blade to validate the nonlinear portion of the formulation. The nonlinear system of equations is iteratively solved using a form of Newton-Raphson iteration scheme developed for differential equations of continuous systems. The formulation is then applied to determine the nonlinear steady state trim and aeroelastic stability of a rotor blade in hover with two branches at the root. A comprehensive computer program is developed and is used to obtain numerical results for the (1) free vibration, (2) nonlinearly deformed steady state, (3) free vibration about the nonlinearly deformed steady state, and (4) aeroelastic stability tasks. The numerical results obtained by the present method agree with results from other methods.
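The nonlinear solution step described above boils down to a Newton-Raphson iteration on a residual vector; the sketch below shows that generic iteration with a finite-difference Jacobian on a small stand-in system, not the rotor-blade equations themselves.

```python
# Hedged sketch: generic Newton-Raphson iteration with a finite-difference Jacobian,
# of the kind used to drive the nonlinear transfer-matrix boundary-value solution.
# The 2x2 test system below is a stand-in for the rotor-blade equations.
import numpy as np

def residual(x):
    return np.array([x[0]**2 + x[1]**2 - 4.0,      # circle
                     np.exp(x[0]) + x[1] - 1.0])   # exponential curve

def newton(f, x0, tol=1e-12, max_iter=50, h=1e-7):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = f(x)
        if np.linalg.norm(r) < tol:
            break
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):                      # finite-difference Jacobian
            xp = x.copy(); xp[j] += h
            J[:, j] = (f(xp) - r) / h
        x = x - np.linalg.solve(J, r)
    return x

print("root:", newton(residual, [1.0, -1.0]))
```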
Liu, Weihua; Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng
2014-01-01
In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial for increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual times of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis for a specific example. Results show that the order completion time of the LSSC can be delayed or brought ahead of schedule, but cannot be infinitely advanced or infinitely delayed. The optimal comprehensive performance can be obtained if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by the increase in the relationship coefficient of the logistics service integrator (LSI) is limited. The relative degree of concern of the LSI for cost and service delivery punctuality leads not only to changes in the CODP but also to changes in the scheduling performance of the LSSC.
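A hedged sketch of the same kind of weighted trade-off between operation cost, schedule deviation and provider satisfaction is given below using SciPy; the three component functions and weights are invented placeholders, not the paper's LSSC model.

```python
# Hedged sketch: weighted multi-objective choice of a planned completion time T,
# trading off operation cost, deviation from the expected completion time and
# provider satisfaction. All three component functions are invented placeholders.
import numpy as np
from scipy.optimize import minimize_scalar

T_expected = 10.0                     # expected completion time (days, placeholder)

def cost(T):          return 100.0 / T + 2.0 * T               # rushing is costly, so is dragging on
def deviation(T):     return (T - T_expected) ** 2              # penalty on schedule deviation
def satisfaction(T):  return 1.0 / (1.0 + np.exp(-(T - 8.0)))   # providers prefer looser schedules

w = (0.4, 0.4, 0.2)                   # relative concern degrees (placeholder weights)
objective = lambda T: w[0] * cost(T) + w[1] * deviation(T) - w[2] * satisfaction(T)

res = minimize_scalar(objective, bounds=(5.0, 20.0), method="bounded")
print(f"optimal planned completion time ~ {res.x:.2f} (expected was {T_expected})")
```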
Mealor, Andy D; Simner, Julia; Rothen, Nicolas; Carmichael, Duncan A; Ward, Jamie
2016-01-01
We developed the Sussex Cognitive Styles Questionnaire (SCSQ) to investigate visual and verbal processing preferences and incorporate global/local processing orientations and systemising into a single, comprehensive measure. In Study 1 (N = 1542), factor analysis revealed six reliable subscales to the final 60 item questionnaire: Imagery Ability (relating to the use of visual mental imagery in everyday life); Technical/Spatial (relating to spatial mental imagery, and numerical and technical cognition); Language & Word Forms; Need for Organisation; Global Bias; and Systemising Tendency. Thus, we replicate previous findings that visual and verbal styles are separable, and that types of imagery can be subdivided. We extend previous research by showing that spatial imagery clusters with other abstract cognitive skills, and demonstrate that global/local bias can be separated from systemising. Study 2 validated the Technical/Spatial and Language & Word Forms factors by showing that they affect performance on memory tasks. In Study 3, we validated Imagery Ability, Technical/Spatial, Language & Word Forms, Global Bias, and Systemising Tendency by issuing the SCSQ to a sample of synaesthetes (N = 121) who report atypical cognitive profiles on these subscales. Thus, the SCSQ consolidates research from traditionally disparate areas of cognitive science into a comprehensive cognitive style measure, which can be used in the general population, and special populations.
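A minimal sketch of the exploratory factor-analysis step reported in Study 1 (six factors extracted from 60 items) is given below on synthetic Likert-style data; the respondent and item counts follow the abstract, but the responses themselves are random.

```python
# Hedged sketch: six-factor exploratory factor analysis on synthetic questionnaire data.
# Respondent count (1542) and item count (60) follow the abstract; responses are random.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
latent = rng.normal(size=(1542, 6))                 # six hidden "styles"
loadings = rng.normal(scale=0.8, size=(6, 60))      # how items load on styles
items = latent @ loadings + rng.normal(scale=1.0, size=(1542, 60))

fa = FactorAnalysis(n_components=6, random_state=0).fit(items)
explained = np.sum(fa.components_ ** 2, axis=1)     # crude per-factor loading strength
print("per-factor summed squared loadings:", np.round(explained, 1))
```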
DNAtraffic--a new database for systems biology of DNA dynamics during the cell life.
Kuchta, Krzysztof; Barszcz, Daniela; Grzesiuk, Elzbieta; Pomorski, Pawel; Krwawicz, Joanna
2012-01-01
DNAtraffic (http://dnatraffic.ibb.waw.pl/) is designed to be a unique, comprehensive and richly annotated database of genome dynamics during the cell life. It contains extensive data on the nomenclature, ontology, structure and function of proteins related to DNA integrity mechanisms such as chromatin remodeling, histone modifications, DNA repair and damage response from eight organisms: Homo sapiens, Mus musculus, Drosophila melanogaster, Caenorhabditis elegans, Saccharomyces cerevisiae, Schizosaccharomyces pombe, Escherichia coli and Arabidopsis thaliana. DNAtraffic contains comprehensive information on the diseases related to the assembled human proteins. DNAtraffic is richly annotated with systemic information on the nomenclature, chemistry and structure of DNA damage and its sources, including environmental agents and commonly used drugs targeting nucleic acids and/or proteins involved in the maintenance of genome stability. One of the aims of the DNAtraffic database is to create the first platform for analysis of the combinatorial complexity of the DNA network. The database includes illustrations of pathways, damage, proteins and drugs. Since DNAtraffic is designed to cover a broad spectrum of scientific disciplines, it has to be extensively linked to numerous external data sources. Our database represents the result of manual annotation work aimed at making DNAtraffic much more useful for a wide range of systems biology applications.
Late paleozoic fusulinoidean gigantism driven by atmospheric hyperoxia.
Payne, Jonathan L; Groves, John R; Jost, Adam B; Nguyen, Thienan; Moffitt, Sarah E; Hill, Tessa M; Skotheim, Jan M
2012-09-01
Atmospheric hyperoxia, with pO(2) in excess of 30%, has long been hypothesized to account for late Paleozoic (360-250 million years ago) gigantism in numerous higher taxa. However, this hypothesis has not been evaluated statistically because comprehensive size data have not been compiled previously at sufficient temporal resolution to permit quantitative analysis. In this study, we test the hyperoxia-gigantism hypothesis by examining the fossil record of fusulinoidean foraminifers, a dramatic example of protistan gigantism with some individuals exceeding 10 cm in length and exceeding their relatives by six orders of magnitude in biovolume. We assembled and examined comprehensive regional and global, species-level datasets containing 270 and 1823 species, respectively. A statistical model of size evolution forced by atmospheric pO(2) is conclusively favored over alternative models based on random walks or a constant tendency toward size increase. Moreover, the ratios of volume to surface area in the largest fusulinoideans are consistent in magnitude and trend with a mathematical model based on oxygen transport limitation. We further validate the hyperoxia-gigantism model through an examination of modern foraminiferal species living along a measured gradient in oxygen concentration. These findings provide the first quantitative confirmation of a direct connection between Paleozoic gigantism and atmospheric hyperoxia. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.
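A toy version of the model comparison described above, contrasting a pO2-forced model of size change with a constant-drift alternative by AIC, is sketched below on simulated data; it is not the fusulinoidean compilation or the authors' evolutionary model.

```python
# Hedged sketch: comparing a pO2-forced model of per-interval size change against a
# constant-drift (random-walk-like) alternative via AIC. Data are simulated, not the
# fusulinoidean size compilation analysed in the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
po2 = rng.uniform(0.21, 0.32, size=80)                     # atmospheric O2 fraction per interval
size_change = 3.0 * (po2 - 0.21) + rng.normal(0, 0.1, 80)  # simulated log-size change

forced = sm.OLS(size_change, sm.add_constant(po2)).fit()       # size change depends on pO2
drift = sm.OLS(size_change, np.ones_like(size_change)).fit()   # constant tendency only

print(f"AIC  pO2-forced: {forced.aic:.1f}   constant drift: {drift.aic:.1f}")
print("pO2-forced model favored" if forced.aic < drift.aic else "drift model favored")
```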
The role of suppression in figurative language comprehension
Gernsbacher, Morton Ann; Robertson, Rachel R.W.
2014-01-01
In this paper, we describe the crucial role that suppression plays in many aspects of language comprehension. We define suppression as a general, cognitive mechanism, the purpose of which is to attenuate the interference caused by the activation of extraneous, unnecessary, or inappropriate information. We illustrate the crucial role that suppression plays in general comprehension by reviewing numerous experiments. These experiments demonstrate that suppression attenuates interference during lexical access (how word meanings are ‘accessed’), anaphoric reference (how referents for anaphors, like pronouns, are computed), cataphoric reference (how concepts that are marked by devices, such as spoken stress, gain a privileged status), syntactic parsing (how grammatical forms of sentences are decoded), and individual differences in (adult) language comprehension skill. We also review research that suggests that suppression plays a crucial role in the understanding of figurative language, in particular, metaphors, idioms, and proverbs. PMID:25520540
Comprehensive benefit analysis of regional water resources based on multi-objective evaluation
NASA Astrophysics Data System (ADS)
Chi, Yixia; Xue, Lianqing; Zhang, Hui
2018-01-01
The purpose of the comprehensive benefit analysis of regional water resources is to maximize the comprehensive benefits with respect to society, the economy and the ecological environment. Aiming at the defects of the traditional analytic hierarchy process in the evaluation of water resources, this study proposed a comprehensive benefit evaluation index covering social, economic and environmental benefits from the perspective of the comprehensive benefit of water resources in the social, economic and environmental systems; determined the index weights by an improved fuzzy analytic hierarchy process (AHP); calculated the relative index of the comprehensive benefit of water resources; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on the water resources data of Xiangshui County, 20 main comprehensive benefit assessment factors of 5 districts belonging to Xiangshui County were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317; meanwhile, the social economy has further room for development under the current water resources situation.
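As a simplified stand-in for the improved fuzzy AHP weighting, the sketch below derives index weights from a crisp pairwise-comparison matrix via the principal eigenvector and forms a weighted comprehensive-benefit score; the comparison matrix and sub-index values are illustrative, not Xiangshui County data.

```python
# Hedged sketch: classic (crisp) AHP weight derivation via the principal eigenvector,
# then a weighted comprehensive-benefit score. The pairwise comparison matrix and the
# three sub-index values are illustrative, not the paper's fuzzy-AHP data.
import numpy as np

# pairwise comparisons among social, economic, environmental benefit (Saaty scale)
A = np.array([[1.0, 1/2, 2.0],
              [2.0, 1.0, 3.0],
              [1/2, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized priority weights

CI = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
sub_index = np.array([0.70, 0.78, 0.68])        # illustrative sub-benefit indices
print("weights:", np.round(w, 3), " CI:", round(CI, 4),
      " comprehensive benefit:", round(float(w @ sub_index), 4))
```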
Schaeffer, Blake A; Hagy, James D; Conmy, Robyn N; Lehrter, John C; Stumpf, Richard P
2012-01-17
Human activities on land increase nutrient loads to coastal waters, which can increase phytoplankton production and biomass and associated ecological impacts. Numeric nutrient water quality standards are needed to protect coastal waters from eutrophication impacts. The Environmental Protection Agency determined that numeric nutrient criteria were necessary to protect designated uses of Florida's waters. The objective of this study was to evaluate a reference condition approach for developing numeric water quality criteria for coastal waters, using data from Florida. Florida's coastal waters have not been monitored comprehensively via field sampling to support numeric criteria development. However, satellite remote sensing had the potential to provide adequate data. Spatial and temporal measures of SeaWiFS OC4 chlorophyll-a (Chl(RS)-a, mg m(-3)) were resolved across Florida's coastal waters between 1997 and 2010 and compared with in situ measurements. Statistical distributions of Chl(RS)-a were evaluated to determine a quantitative reference baseline. A binomial approach was implemented to consider how new data could be assessed against the criteria. The proposed satellite remote sensing approach to derive numeric criteria may be generally applicable to other coastal waters.
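The two-step logic described above, setting a reference baseline from the chlorophyll distribution and then testing new data for excessive exceedance with a binomial model, can be sketched as follows; the 90th-percentile baseline, 10% allowed exceedance rate and synthetic data are assumptions, not Florida's adopted criteria.

```python
# Hedged sketch: reference-percentile baseline plus a binomial exceedance test,
# mirroring the two-step logic described in the abstract. The 90th-percentile baseline,
# 10% allowed exceedance rate and synthetic chlorophyll values are assumptions.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(3)
reference_chl = rng.lognormal(mean=0.5, sigma=0.4, size=500)   # baseline-period Chl-a (mg m^-3)
criterion = np.percentile(reference_chl, 90)                   # reference-condition baseline

new_chl = rng.lognormal(mean=0.7, sigma=0.4, size=36)          # new assessment-period data
exceedances = int(np.sum(new_chl > criterion))
test = binomtest(exceedances, n=len(new_chl), p=0.10, alternative="greater")
print(f"criterion = {criterion:.2f} mg m^-3, exceedances = {exceedances}/{len(new_chl)}, "
      f"p-value = {test.pvalue:.4f}")
```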
Improving Reading Comprehension Using Digital Text: A Meta-Analysis of Interventions
ERIC Educational Resources Information Center
Berkeley, Sheri; Kurz, Leigh Ann; Boykin, Andrea; Evmenova, Anya S.
2015-01-01
Much is known about how to improve students' comprehension when reading printed text; less is known about outcomes when reading digital text. The purpose of this meta-analysis was to analyze research on the impact of digital text interventions. A comprehensive literature search resulted in 27 group intervention studies with 16,513 participants.…
Kim, Hee-Ju; Abraham, Ivo
2017-01-01
Evidence is needed on the clinicometric properties of single-item or short measures as alternatives to comprehensive measures. We examined whether two single-item fatigue measures (i.e., Likert scale, numeric rating scale) or a short fatigue measure were comparable to a comprehensive measure in reliability (i.e., internal consistency and test-retest reliability) and validity (i.e., convergent, concurrent, and predictive validity) in Korean young adults. For this quantitative study, we selected the Functional Assessment of Chronic Illness Therapy-Fatigue for the comprehensive measure and the Profile of Mood States-Brief, Fatigue subscale for the short measure; and constructed two single-item measures. A total of 368 students from four nursing colleges in South Korea participated. We used Cronbach's alpha and item-total correlation for internal consistency reliability and intraclass correlation coefficient for test-retest reliability. We assessed Pearson's correlation with a comprehensive measure for convergent validity, with perceived stress level and sleep quality for concurrent validity and the receiver operating characteristic curve for predictive validity. The short measure was comparable to the comprehensive measure in internal consistency reliability (Cronbach's alpha=0.81 vs. 0.88); test-retest reliability (intraclass correlation coefficient=0.66 vs. 0.61); convergent validity (r with comprehensive measure=0.79); concurrent validity (r with perceived stress=0.55, r with sleep quality=0.39) and predictive validity (area under curve=0.88). Single-item measures were not comparable to the comprehensive measure. A short fatigue measure exhibited similar levels of reliability and validity to the comprehensive measure in Korean young adults. Copyright © 2016 Elsevier Ltd. All rights reserved.
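A compact sketch of two of the statistics named above, Cronbach's alpha (from its standard formula) and ROC AUC, is given below on synthetic item-level fatigue data; the five-item short form and the "severe fatigue" label are invented for illustration.

```python
# Hedged sketch: Cronbach's alpha (standard formula) and ROC AUC on synthetic data,
# two workhorse statistics named in the abstract. Item counts and scores are invented.
import numpy as np
from sklearn.metrics import roc_auc_score

def cronbach_alpha(items):
    """items: respondents x items matrix. alpha = k/(k-1) * (1 - sum(var_i)/var(total))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(4)
true_fatigue = rng.normal(size=368)                                   # latent fatigue, n = 368
items = true_fatigue[:, None] + rng.normal(scale=0.8, size=(368, 5))  # 5-item short measure
severe = (true_fatigue > 1.0).astype(int)                             # "severe fatigue" label

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
print(f"ROC AUC (mean item score vs. severe fatigue) = "
      f"{roc_auc_score(severe, items.mean(axis=1)):.2f}")
```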
Hedonic valuation of the spatial competition for urban circumstance utilities: case Wuhan, China
NASA Astrophysics Data System (ADS)
Zheng, Bin; Liu, Yaolin; Huang, Lina
2008-10-01
It has been generally accepted since Alonso's [1] theory that commercial, residential and industrial land uses are allocated differently within the urban area. A number of studies have addressed the relationships between urban circumstances and urban land uses, either in terms of the influence of one or several designated circumstance factors on different land uses, or through a comprehensive analysis of the influence of all kinds of circumstances on one selected land use (e.g. residential use). There is still no comprehensive analysis of the influence of all kinds of spatial characteristics that is applicable to the location selection of different land uses. This research therefore studies the differences among "consumer preferences" for location amenities in the city. Here we regard this behavior as "spatial competition for locations". Hedonic regression model (HRM) analysis is employed as the basic framework of the research. Tabular comparison of HRM parameters, performed with principal components analysis (PCA) and Geographic Information Science (GIS), provides the necessary numerical investigation and spatial analysis leading to the final results. The research can be helpful in putting forward a further integrated investigation of the relationship between urban circumstances and real land use values.
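A minimal sketch of the core HRM step, a log-linear hedonic regression of land value on location amenities fitted by OLS, is given below; the amenity variables and synthetic parcel data are placeholders, not the Wuhan case-study data.

```python
# Hedged sketch: log-linear hedonic regression of land value on location amenities,
# the core HRM step described in the paper. Amenity variables and data are synthetic
# placeholders, not the Wuhan case-study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 400
df = pd.DataFrame({
    "dist_cbd_km": rng.uniform(0.5, 20, n),        # distance to city centre
    "park_access": rng.uniform(0, 1, n),           # green-space accessibility score
    "road_density": rng.uniform(0, 1, n),          # transport accessibility score
})
df["log_value"] = (8.0 - 0.05 * df.dist_cbd_km + 0.30 * df.park_access
                   + 0.40 * df.road_density + rng.normal(0, 0.15, n))

model = smf.ols("log_value ~ dist_cbd_km + park_access + road_density", data=df).fit()
print(model.params.round(3))   # implicit (hedonic) prices of each amenity
```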
Silagi, Marcela Lima; Rabelo, Camila Maia; Schochat, Eliane; Mansur, Letícia Lessa
2017-11-13
To analyze the effect of education on sentence listening comprehension in cognitively healthy elderly. A total of 111 healthy elderly, aged 60-80 years and of both genders, were divided into two groups according to educational level: low education (0-8 years of formal education) and high education (≥9 years of formal education). The participants were assessed using the Revised Token Test, an instrument that supports the evaluation of auditory comprehension of orders with different working memory and syntactic complexity demands. The indicators used for performance analysis were the number of correct responses (accuracy analysis) and task execution time (temporal analysis) in the different blocks. The low-education group had a lower number of correct responses than the high-education group on all blocks of the test. In the temporal analysis, participants with low education had longer execution times for commands on the first four blocks, which are more related to working memory. However, the two groups had similar execution times for blocks more related to syntactic comprehension. Education influenced sentence listening comprehension in the elderly. Temporal analysis allowed us to infer the relationship between comprehension and other cognitive abilities, and to observe that the low-educated elderly did not use effective compensation strategies to improve their performance on the task. Therefore, low educational level, associated with aging, may potentiate the risks for language decline.
[Text Comprehensibility of Hospital Report Cards].
Sander, U; Kolb, B; Christoph, C; Emmert, M
2016-12-01
Objectives: Recently, the number of hospital report cards that compare the quality of hospitals and present information from German quality reports has greatly increased. The objectives of this study were to a) identify suitable methods for measuring the readability and comprehensibility of hospital report cards, b) obtain reliable information on the comprehensibility of the texts for laymen, c) give recommendations for improvements and d) recommend public health actions. Methods: The readability and comprehensibility of the texts were tested with a) a computer-aided evaluation of formal text characteristics (the readability indices Flesch Reading Ease, German formula, and the first Wiener Sachtextformel), b) an expert-based heuristic analysis of the readability and comprehensibility of the texts (counting technical terms and analysing text simplicity as well as brevity and conciseness using the Hamburg intelligibility model) and c) a survey of subjects about the comprehensibility of individual technical terms, their assessment of the comprehensibility of the presentations and their decisions in favour of one of the 5 presented clinics on the basis of better quality data. In addition, the correlation between the results of the text analysis and the results from the survey of subjects was tested. Results: The assessment of the texts with the computer-aided evaluations showed poor comprehensibility values. The assessment of text simplicity using the Hamburg intelligibility model also showed poor comprehensibility values (-0.3). On average, 6.8% of the words used were technical terms. A review of 10 technical terms revealed that in all cases only a minority of respondents (from 4.4% to 39.1%) knew exactly what was meant by each of them. Most subjects (62.4%) also believed that unclear terms worsened their understanding of the information offered. The correlation analysis showed that presentations with a lower frequency of technical terms and better values for text simplicity were better understood. Conclusion: The determination of the frequency of technical terms and the assessment of text simplicity using the Hamburg intelligibility model were suitable methods to determine the readability and comprehensibility of presentations of quality indicators. The analysis showed predominantly poor comprehensibility values and indicated the need to improve the texts of report cards. © Georg Thieme Verlag KG Stuttgart · New York.
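Both readability indices used in the computer-aided evaluation have simple closed forms. The sketch below implements the coefficient sets commonly quoted for the German Flesch adaptation (Amstad) and the first Wiener Sachtextformel, taking word, sentence, and syllable statistics as inputs; the figures passed in are hypothetical, and the coefficients should be checked against the study's implementation.

```python
# Sketch of the two readability indices named in the abstract, using the
# coefficient sets commonly quoted for the German Flesch adaptation (Amstad)
# and the first Wiener Sachtextformel (verify against the study's implementation).

def flesch_german(words, sentences, syllables):
    asl = words / sentences          # average sentence length (words per sentence)
    asw = syllables / words          # average syllables per word
    return 180.0 - asl - 58.5 * asw  # higher score = easier to read

def wiener_sachtextformel_1(pct_three_plus_syll, mean_sent_len,
                            pct_long_words, pct_monosyllabic):
    # Result approximates a German school grade (about 4 = easy ... 15 = very hard).
    return (0.1935 * pct_three_plus_syll + 0.1672 * mean_sent_len
            + 0.1297 * pct_long_words - 0.0327 * pct_monosyllabic - 0.875)

# Illustrative text statistics (hypothetical, not taken from the report cards):
print(flesch_german(words=250, sentences=12, syllables=520))
print(wiener_sachtextformel_1(28.0, 20.8, 32.0, 38.0))
```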
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
...-time staff of five employees and various summer temporaries manage and study the refuge habitats and... majority of fishing occurring on Red Rock. In addition, the refuge is open to limited hunting of ducks..., conducting numerous studies to determine the effects and best methods of restoration, including any effects...
To Read or Not to Read: A Question of National Consequence. Research Report #47
ERIC Educational Resources Information Center
National Endowment for the Arts, 2007
2007-01-01
This report contains the best national data available to provide a reliable and comprehensive overview of American reading today. While it incorporates some statistics from the National Endowment for the Arts' 2004 report, "Reading at Risk," this new study contains vastly more data from numerous sources. Although most of this information is…
ERIC Educational Resources Information Center
Stock, S. E.; Davies, D. K.; Wehmeyer, M. L.; Palmer, S. B.
2008-01-01
Background: There are over two billion telephones in use worldwide. Yet, for millions of Americans with intellectual disabilities (ID), access to the benefits of cellphone technology is limited because of deficits in literacy, numerical comprehension, the proliferation of features and shrinking size of cellphone hardware and user interfaces.…
Alcohol Abuse Prevention: A Comprehensive Guide for Youth Organizations.
ERIC Educational Resources Information Center
Boys' Clubs of America, New York, NY.
This guide, the culmination of a three year Project TEAM effort by the Boys' Clubs of America, describes numerous strategies for developing an alcohol abuse prevention program. The core of this guide consists of program models developed by the Boys' Club project at seven pilot sites. The models presented cover the following areas: peer leadership,…
ERIC Educational Resources Information Center
Finesilver, Carla
2017-01-01
The move from additive to multiplicative thinking requires significant change in children's comprehension and manipulation of numerical relationships, involves various conceptual components, and can be a slow, multistage process for some. Unit arrays are a key visuospatial representation for supporting learning, but most research focuses on 2D…
Interfacing Simulations with Training Content
2006-09-01
a panelist at numerous international training and elearning conferences, ADL Plugfests and IMS Global Learning Consortium Open Technical Forums. Dr...communication technologies has enabled higher quality learning to be made available through increasingly sophisticated modes of presentation. Traditional...However, learning is a comprehensive process which does not simply consist of the transmission and learning of content. While simulations offer the
ERIC Educational Resources Information Center
Segool, Natasha K.; Mathiason, Jacob B.; Majewicz-Hefley, Amy; Carlson, John S.
2009-01-01
Currently, more than two thirds of school-aged children with mental health needs do not receive treatment. By exploring the numerous barriers that limit children's access to mental health care, the authors argue that school psychologists have a key role to play in supporting comprehensive mental health services for children. This article provides…
The Struggle for the American Extracurriculum at Ithaca High School, 1890-1917
ERIC Educational Resources Information Center
Terzian, Sevan G.
2005-01-01
Numerous scholars have observed that the comprehensive high school has never succeeded in socially unifying the entire student body. This article attempts to explain why this has been the case by tracing the origins and development of the extracurriculum at Ithaca High School from 1890 to 1917. It demonstrates that students struggled with…
History of sugar maple decline
David R. Houston
1999-01-01
Only a few episodes of sugar maple dieback or decline were recorded during the first half of the 20th Century. In contrast, the last 50 years have provided numerous reports of both urban and forest dieback/decline. In the late 1950s, a defoliation-triggered decline, termed maple blight, that occurred in Wisconsin prompted the first comprehensive, multidisciplinary...
ERIC Educational Resources Information Center
Evans, Gary W.; Ricciuti, Henry N.; Hope, Steven; Schoon, Ingrid; Bradley, Robert H.; Corwyn, Robert F.; Hazan, Cindy
2010-01-01
Residential crowding in both U.S. and U.K. samples of 36-month-old children is related concurrently to the Bracken scale, a standard index of early cognitive development skills including letter and color identification, shape recognition, and elementary numeric comprehension. In the U.S. sample, these effects also replicate prospectively.…
ERIC Educational Resources Information Center
Wu, Yelena P.; Steele, Ric G.
2011-01-01
School nurses represent an important resource for addressing pediatric obesity and weight-related health. However, school nurses perceive numerous barriers that prevent them from addressing the weight-related health of students. The current study developed and tested a new, comprehensive measure of nurses' perceptions of 10 types of barriers to…
ERIC Educational Resources Information Center
Berkeley, Sheri; King-Sears, Margaret E.; Vilbas, Jessica; Conklin, Sarah
2016-01-01
Textbooks are heavily used in secondary-level content area classes, but previous research has identified numerous challenges for students associated with reading and understanding these texts. While students can learn reading strategies that help them better understand text, it is unclear the extent to which textbooks are written to promote or…
The Robust Learning Model (RLM): A Comprehensive Approach to a New Online University
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith F.
2010-01-01
This paper outlines the components of the Robust Learning Model (RLM) as a conceptual framework for creating a new online university offering numerous degree programs at all degree levels. The RLM is a multi-factorial model based on the basic belief that successful learning outcomes depend on multiple factors employed together in a holistic…
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Manyin, Michael; Burian, Steve; Garza, Carlos
2003-01-01
There is renewed interest in the impacts of urbanization on global change as witnessed by special sessions at the Fall AGU and Annual AMS meeting. A comprehensive satellite, modeling, and field campaign program is underway to assess the impact of urbanization on precipitation.
Physics-based simulations of the impacts forest management practices have on hydrologic response
Adrianne Carr; Keith Loague
2012-01-01
The impacts of logging on near-surface hydrologic response at the catchment and watershed scales were examined quantitatively using numerical simulation. The simulations were conducted with the Integrated Hydrology Model (InHM) for the North Fork of Caspar Creek Experimental Watershed, located near Fort Bragg, California. InHM is a comprehensive physics-based...
Summer Youth Employment: The Corporate Experience. Research Bulletin Number 141.
ERIC Educational Resources Information Center
Lund, Leonard; Weber, Nathan
During the summer of 1982, summer jobs programs organized and operated by the private sector were underway in numerous cities, often in addition to or in cooperation with the government-funded Comprehensive Employment and Training Act (CETA) projects. Of the 176 companies that responded to a Conference Board survey of 480 of the largest…
NASA Astrophysics Data System (ADS)
Druzgalski, Clara; Mani, Ali
2016-11-01
We investigate electroconvection and its impact on ion transport in a model system comprised of an ion-selective membrane, an aqueous electrolyte, and an external electric field applied normal to the membrane. We develop a direct numerical simulation code to solve the governing Poisson-Nernst-Planck and Navier-Stokes equations in three dimensions using a specialized parallel numerical algorithm and sufficient resolution to capture the high frequency and high wavenumber physics. We show a comprehensive statistical analysis of the transport phenomena in the highly chaotic regime. Qualitative and quantitative comparisons of two-dimensional (2D) and 3D simulations include prediction of the mean concentration fields as well as the spectra of concentration, charge density, and velocity signals. Our analyses reveal a significant quantitative difference between 2D and 3D electroconvection. Furthermore, we show that high-intensity yet short-lived current density hot spots appear randomly on the membrane surface, contributing significantly to the mean current density. By examining cross correlations between current density on the membrane and other field quantities we explore the physical mechanisms leading to current hot spots. We also present analysis of transport fluxes in the context of ensemble-averaged equations. Our analysis reveals that in the highly chaotic regime the mixing layer (ML), which spans the majority of the domain extent, is governed by advective fluctuations. Furthermore, we show that in the ML the mean electromigration fluxes cancel out for positive and negative ions, indicating that the mean transport of total salt content within the ML can be represented via the electroneutral approximation. Finally, we present an assessment of the importance of different length scales in enhancing transport by computing the cross covariance of concentration and velocity fluctuations in the wavenumber space. Our analysis indicates that in the majority of the domain the large scales contribute most significantly to transport, while the effects of small scales become more appreciable in regions very near the membrane.
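For reference, the coupled system named above combines ion transport, electrostatics, and incompressible flow. A standard dimensional form for a symmetric binary electrolyte is sketched below (requires amsmath); the study's own nondimensionalization and parameters are not reproduced here.

```latex
% Poisson--Nernst--Planck / Navier--Stokes system for a symmetric binary
% electrolyte (standard textbook form; not the paper's nondimensionalization).
\[
\begin{aligned}
\frac{\partial c^{\pm}}{\partial t} + \mathbf{u}\cdot\nabla c^{\pm}
  &= D\,\nabla\cdot\!\left(\nabla c^{\pm} \pm \frac{ze}{k_{B}T}\,c^{\pm}\nabla\phi\right),\\[2pt]
-\varepsilon\,\nabla^{2}\phi &= \rho_{e} = ze\,(c^{+}-c^{-}),\\[2pt]
\rho\!\left(\frac{\partial\mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  &= -\nabla p + \mu\,\nabla^{2}\mathbf{u} - \rho_{e}\,\nabla\phi,
\qquad \nabla\cdot\mathbf{u} = 0 .
\end{aligned}
\]
```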
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
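For the region-wide, Monte Carlo branch of PTHA described above, the calculation has a simple generic skeleton: sample earthquake sources from a recurrence model, map each event to a runup at the site of interest, and accumulate exceedance rates into a hazard curve. The Python sketch below uses an assumed Gutenberg-Richter recurrence and an entirely hypothetical magnitude-to-runup scaling; it illustrates the structure only and carries no site-specific meaning.

```python
# Skeleton of a Monte Carlo tsunami hazard curve (structure only). The recurrence
# parameters and the magnitude-to-runup scaling are placeholders, not study values.
import numpy as np

rng = np.random.default_rng(42)
n_events = 200_000
annual_rate = 0.05                      # assumed rate of tsunamigenic earthquakes (1/yr)
b_value, m_min, m_max = 1.0, 7.0, 9.2   # assumed Gutenberg-Richter parameters

# Sample magnitudes from a truncated Gutenberg-Richter distribution.
u = rng.random(n_events)
beta = b_value * np.log(10.0)
mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

# Hypothetical runup scaling with lognormal scatter (for illustration only).
runup = 10 ** (0.8 * (mags - 7.5)) * rng.lognormal(0.0, 0.5, n_events)

thresholds = np.array([0.5, 1.0, 2.0, 5.0, 10.0])            # runup levels (m)
exceed_rate = annual_rate * np.array([(runup > r).mean() for r in thresholds])
prob_50yr = 1.0 - np.exp(-exceed_rate * 50.0)                # Poissonian exceedance in 50 yr
for r, p in zip(thresholds, prob_50yr):
    print(f"P(runup > {r:4.1f} m in 50 yr) = {p:.3f}")
```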
Kelvin-wave cascade in the vortex filament model
NASA Astrophysics Data System (ADS)
Baggaley, Andrew W.; Laurie, Jason
2014-01-01
The small-scale energy-transfer mechanism in zero-temperature superfluid turbulence of helium-4 is still a widely debated topic. Currently, the main hypothesis is that weakly nonlinear interacting Kelvin waves (KWs) transfer energy to sufficiently small scales such that energy is dissipated as heat via phonon excitations. Theoretically, there are at least two proposed theories for Kelvin-wave interactions. We perform the most comprehensive numerical simulation of weakly nonlinear interacting KWs to date and show, using a specially designed numerical algorithm incorporating the full Biot-Savart equation, that our results are consistent with the nonlocal six-wave KW interactions as proposed by L'vov and Nazarenko.
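The full Biot-Savart equation referred to above gives the self-induced velocity of a vortex filament of circulation Γ. In its standard vortex-filament form (a sketch of the usual expression, with the filament parametrized by s₁),

```latex
% Self-induced velocity of a point s on a vortex filament (standard Biot--Savart form).
\[
\frac{d\mathbf{s}}{dt} \;=\; \frac{\Gamma}{4\pi}\oint
\frac{(\mathbf{s}_{1}-\mathbf{s})\times d\mathbf{s}_{1}}
     {\lvert \mathbf{s}_{1}-\mathbf{s}\rvert^{3}} .
\]
```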
NASA Astrophysics Data System (ADS)
Abbasi, F. M.; Gul, Maimoona; Shehzad, S. A.
2018-05-01
The current study provides a comprehensive numerical investigation of the peristaltic transport of a boron nitride-ethylene glycol nanofluid through a symmetric channel in the presence of a magnetic field. Significant effects of Brownian motion and thermophoresis have been included in the energy equation. Hall and Ohmic heating effects are also taken into consideration. The resulting system of non-linear equations is solved numerically using NDSolve in Mathematica. Expressions for velocity, temperature, concentration and streamlines are derived and plotted under the assumption of long wavelength and low Reynolds number. The influence of various parameters on heat and mass transfer rates has been discussed with the help of bar charts.
Generalized Fourier analyses of the advection-diffusion equation - Part I: one-dimensional domains
NASA Astrophysics Data System (ADS)
Christon, Mark A.; Martinez, Mario J.; Voth, Thomas E.
2004-07-01
This paper presents a detailed multi-methods comparison of the spatial errors associated with finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. The errors are reported in terms of non-dimensional phase and group speed, discrete diffusivity, artificial diffusivity, and grid-induced anisotropy. It is demonstrated that Fourier analysis provides an automatic process for separating the discrete advective operator into its symmetric and skew-symmetric components and characterizing the spectral behaviour of each operator. For each of the numerical methods considered, asymptotic truncation error and resolution estimates are presented for the limiting cases of pure advection and pure diffusion. It is demonstrated that streamline upwind Petrov-Galerkin and its control-volume finite element analogue, the streamline upwind control-volume method, produce both an artificial diffusivity and a concomitant phase speed adjustment in addition to the usual semi-discrete artifacts observed in the phase speed, group speed and diffusivity. The Galerkin finite element method and its streamline upwind derivatives are shown to exhibit super-convergent behaviour in terms of phase and group speed when a consistent mass matrix is used in the formulation. In contrast, the CVFEM method and its streamline upwind derivatives yield strictly second-order behaviour. In Part II of this paper, we consider two-dimensional semi-discretizations of the advection-diffusion equation and also assess the effects of grid-induced anisotropy observed in the non-dimensional phase speed, and the discrete and artificial diffusivities. Although this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common analysis framework. Published in 2004 by John Wiley & Sons, Ltd.
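The semi-discrete Fourier (modified-wavenumber) analysis described above can be illustrated for the simplest case: second-order centered finite differences applied to the 1-D advection-diffusion equation. The Python sketch below computes the non-dimensional phase speed and discrete diffusivity as functions of kh; it is a generic illustration of the technique, not a reproduction of the paper's multi-method results.

```python
# Modified-wavenumber (semi-discrete Fourier) analysis of u_t + a u_x = nu u_xx
# with second-order centered differences: non-dimensional phase speed and
# discrete diffusivity versus kh (generic example, not the paper's data).
import numpy as np

kh = np.linspace(1e-3, np.pi, 200)          # resolved non-dimensional wavenumbers

# Centered first derivative: i*k_eff = i*sin(kh)/h  ->  phase speed ratio c*/c
phase_speed_ratio = np.sin(kh) / kh

# Centered second derivative: -k_eff^2 = -2(1 - cos(kh))/h^2  ->  diffusivity ratio
diffusivity_ratio = 2.0 * (1.0 - np.cos(kh)) / kh**2

for i in (0, 49, 99, 149, 199):
    print(f"kh = {kh[i]:5.3f}: c*/c = {phase_speed_ratio[i]:.4f}, "
          f"nu*/nu = {diffusivity_ratio[i]:.4f}")
```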
Top 100 Cited Articles on Back Pain Research: A Citation Analysis.
Huang, Weimin; Wang, Lei; Wang, Bing; Yu, Lili; Yu, Xiuchun
2016-11-01
A bibliometric review of the literature. Back pain is a global burden that leads people to seek medical service and results in work disability. Numerous studies are published annually to give new insights into back pain. However, the characteristics of high-impact articles on back pain have not been explored. The current study aimed to identify the 100 most cited articles on back pain and determine their characteristics. Back pain is a globally leading cause of work disability. Numerous studies have been published annually to give new insights into back pain; however, a comprehensive analysis to identify the most influential articles has not been available until now. The Web of Science core database was searched using the subject terms "back NEAR pain," "dorsalgia," "backache," "lumbar NEAR pain," "lumbago," "back NEAR disorder*," "discitis." The search results were ranked by citation count, and the top 100 cited articles on back pain were identified. Important information such as author, journal, publication year, country, institution, and study type was extracted. A total of 44,460 articles on back pain were retrieved. Citation counts of the 100 included articles ranged from 249 to 1638, with a mean of 418. The most productive periods were 1991 to 1995 and 1996 to 2000. The journal Spine published the largest number of articles (45), followed by Pain with seven articles. A total of 11 countries contributed to the 100 articles, and the United States topped the list. None of the high-impact articles were produced in Asia or Africa. The current citation analysis demonstrates the essential advances in the history of back pain research and identifies the influential authors, institutions, countries, and journals that have made outstanding contributions to the study of back pain. 3.
Detailed modeling of the statistical uncertainty of Thomson scattering measurements
NASA Astrophysics Data System (ADS)
Morton, L. A.; Parke, E.; Den Hartog, D. J.
2013-11-01
The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined.
Araujo, Luiz H.; Timmers, Cynthia; Bell, Erica Hlavin; Shilo, Konstantin; Lammers, Philip E.; Zhao, Weiqiang; Natarajan, Thanemozhi G.; Miller, Clinton J.; Zhang, Jianying; Yilmaz, Ayse S.; Liu, Tom; Coombes, Kevin; Amann, Joseph; Carbone, David P.
2015-01-01
Purpose Technologic advances have enabled the comprehensive analysis of genetic perturbations in non–small-cell lung cancer (NSCLC); however, African Americans have often been underrepresented in these studies. This ethnic group has higher lung cancer incidence and mortality rates, and some studies have suggested a lower incidence of epidermal growth factor receptor mutations. Herein, we report the most in-depth molecular profile of NSCLC in African Americans to date. Methods A custom panel was designed to cover the coding regions of 81 NSCLC-related genes and 40 ancestry-informative markers. Clinical samples were sequenced on a massively parallel sequencing instrument, and anaplastic lymphoma kinase translocation was evaluated by fluorescent in situ hybridization. Results The study cohort included 99 patients (61% males, 94% smokers) comprising 31 squamous and 68 nonsquamous cell carcinomas. We detected 227 nonsilent variants in the coding sequence, including 24 samples with nonoverlapping, classic driver alterations. The frequency of driver mutations was not significantly different from that of whites, and no association was found between genetic ancestry and the presence of somatic mutations. Copy number alteration analysis disclosed distinguishable amplifications in the 3q chromosome arm in squamous cell carcinomas and pointed toward a handful of targetable alterations. We also found frequent SMARCA4 mutations and protein loss, mostly in driver-negative tumors. Conclusion Our data suggest that African American ancestry may not be significantly different from European/white background for the presence of somatic driver mutations in NSCLC. Furthermore, we demonstrated that using a comprehensive genotyping approach could identify numerous targetable alterations, with potential impact on therapeutic decisions. PMID:25918285
Provost, Mélanie; Koompalum, Dayin; Dong, Diane; Martin, Bradley C
2006-01-01
To develop a comprehensive instrument assessing quality of health-related web sites. Phase I consisted of a literature review to identify constructs thought to indicate web site quality and to identify items. During content analysis, duplicate items were eliminated and items that were not clear, meaningful, or measurable were reworded or removed. Some items were generated by the authors. Phase II: a panel consisting of six healthcare and MIS reviewers was convened to assess each item for its relevance and importance to the construct and to assess item clarity and measurement feasibility. Three hundred and eighty-four items were generated from 26 sources. The initial content analysis reduced the scale to 104 items. Four of the six expert reviewers responded; high concordance on the relevance, importance and measurement feasibility of each item was observed: 3 out of 4, or all raters agreed on 76-85% of items. Based on the panel ratings, 9 items were removed, 3 added, and 10 revised. The WebMedQual consists of 8 categories, 8 sub-categories, 95 items and 3 supplemental items to assess web site quality. The constructs are: content (19 items), authority of source (18 items), design (19 items), accessibility and availability (6 items), links (4 items), user support (9 items), confidentiality and privacy (17 items), e-commerce (6 items). The "WebMedQual" represents a first step toward a comprehensive and standard quality assessment of health web sites. This scale will allow relatively easy assessment of quality with possible numeric scoring.
Comprehension of Co-Speech Gestures in Aphasic Patients: An Eye Movement Study.
Eggenberger, Noëmi; Preisig, Basil C; Schumacher, Rahel; Hopfner, Simone; Vanbellingen, Tim; Nyffeler, Thomas; Gutbrod, Klemens; Annoni, Jean-Marie; Bohlhalter, Stephan; Cazzoli, Dario; Müri, René M
2016-01-01
Co-speech gestures are omnipresent and a crucial element of human interaction by facilitating language comprehension. However, it is unclear whether gestures also support language comprehension in aphasic patients. Using visual exploration behavior analysis, the present study aimed to investigate the influence of congruence between speech and co-speech gestures on comprehension in terms of accuracy in a decision task. Twenty aphasic patients and 30 healthy controls watched videos in which speech was either combined with meaningless (baseline condition), congruent, or incongruent gestures. Comprehension was assessed with a decision task, while remote eye-tracking allowed analysis of visual exploration. In aphasic patients, the incongruent condition resulted in a significant decrease of accuracy, while the congruent condition led to a significant increase in accuracy compared to baseline accuracy. In the control group, the incongruent condition resulted in a decrease in accuracy, while the congruent condition did not significantly increase the accuracy. Visual exploration analysis showed that patients fixated significantly less on the face and tended to fixate more on the gesturing hands compared to controls. Co-speech gestures play an important role for aphasic patients as they modulate comprehension. Incongruent gestures evoke significant interference and deteriorate patients' comprehension. In contrast, congruent gestures enhance comprehension in aphasic patients, which might be valuable for clinical and therapeutic purposes.
Predicting Employment Outcomes of Consumers of State-Operated Comprehensive Rehabilitation Centers
ERIC Educational Resources Information Center
Beach, David Thomas
2009-01-01
This study used records from a state-operated comprehensive rehabilitation center to investigate possible predictive factors related to completing comprehensive rehabilitation center programs and successful vocational rehabilitation (VR) case closure. An analysis of demographic data of randomly selected comprehensive rehabilitation center…
Yeari, Menahem; van den Broek, Paul
2016-09-01
It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
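Latent semantic analysis of the kind integrated with the landscape model here is typically built from a truncated SVD of a term-by-passage count matrix, with semantic relatedness taken as cosine similarity in the reduced space. The snippet below is a minimal, generic LSA sketch on a toy corpus using scikit-learn components; it is not the authors' integrated model, and the relatedness values would simply feed the activation-weighting step of a landscape-style simulation.

```python
# Minimal latent semantic analysis sketch (toy corpus; not the authors' model):
# term-document counts -> truncated SVD -> cosine similarity as semantic relatedness.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the knight drew his sword before the battle",
    "the soldier sharpened the blade for combat",
    "the baker kneaded dough for the morning bread",
]
counts = CountVectorizer().fit_transform(corpus)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(counts)

# Semantic relatedness between text units = cosine similarity in the LSA space.
print(cosine_similarity(lsa))
```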
Comprehensive HIV Prevention for Transgender Persons.
Neumann, Mary Spink; Finlayson, Teresa J; Pitts, Nicole L; Keatley, JoAnne
2017-02-01
Transgender persons are at high risk for HIV infection, but prevention efforts specifically targeting these people have been minimal. Part of the challenge of HIV prevention for transgender populations is that numerous individual, interpersonal, social, and structural factors contribute to their risk. By combining HIV prevention services with complementary medical, legal, and psychosocial services, transgender persons' HIV risk behaviors, risk determinants, and overall health can be affected simultaneously. For maximum health impact, comprehensive HIV prevention for transgender persons warrants efforts targeted to various impact levels-socioeconomic factors, decision-making contexts, long-lasting protections, clinical interventions, and counseling and education. We present current HIV prevention efforts that reach transgender persons and present others for future consideration.
Capturing the 'ome': the expanding molecular toolbox for RNA and DNA library construction.
Boone, Morgane; De Koker, Andries; Callewaert, Nico
2018-04-06
All sequencing experiments and most functional genomics screens rely on the generation of libraries to comprehensively capture pools of targeted sequences. In the past decade especially, driven by the progress in the field of massively parallel sequencing, numerous studies have comprehensively assessed the impact of particular manipulations on library complexity and quality, and characterized the activities and specificities of several key enzymes used in library construction. Fortunately, careful protocol design and reagent choice can substantially mitigate many of the biases these manipulations introduce, and enable reliable representation of sequences in libraries. This review aims to guide the reader through the vast expanse of literature on the subject to promote informed library generation, independent of the application.
Phytoscreening with SPME: Variability Analysis.
Limmer, Matt A; Burken, Joel G
2015-01-01
Phytoscreening has been demonstrated at a variety of sites over the past 15 years as a low-impact, sustainable tool in delineation of shallow groundwater contaminated with chlorinated solvents. Collection of tree cores is rapid and straightforward, but low concentrations in tree tissues require sensitive analytics. Solid-phase microextraction (SPME) is amenable to the complex matrix while allowing for solvent-less extraction. Accurate quantification requires the absence of competitive sorption, examined here both in laboratory experiments and through comprehensive examination of field data. Analysis of approximately 2,000 trees at numerous field sites also allowed testing of the effects of tree genus and diameter on measured tree contaminant concentrations. Collectively, while these variables were found to significantly affect site-adjusted perchloroethylene (PCE) concentrations, the explanatory power of these effects was small (adjusted R² = 0.031). The 90th-quantile chemical concentrations in trees were significantly reduced by increasing Henry's constant and increasing hydrophobicity. Analysis of replicate tree core data showed no correlation between replicate relative standard deviation (RSD) and wood type or tree diameter, with an overall median RSD of 30%. Collectively, these findings indicate SPME is an appropriate technique for sampling and analyzing chlorinated solvents in wood and that phytoscreening is robust against changes in tree type and diameter.
Analysis of Rainfall Infiltration Law in Unsaturated Soil Slope
Zhang, Gui-rong; Qian, Ya-jun; Wang, Zhang-chun; Zhao, Bo
2014-01-01
In the study of unsaturated soil slope stability under rainfall infiltration, it is worth continuing to explore how much rainfall infiltrates into the slope during a rain event, since the amount of rainfall infiltrating into the slope is an important factor influencing stability. Rainfall infiltration capacity is therefore an important issue in unsaturated seepage analysis for slopes. On the basis of previous studies, the rainfall infiltration law of unsaturated soil slopes is analyzed. Considering the characteristics of slope and rainfall, the key factors affecting rainfall infiltration into a slope, including hydraulic properties, water storage capacity (θs − θr), soil type, rainfall intensity, and antecedent and subsequent infiltration rates on the unsaturated soil slope, are discussed using theoretical analysis and numerical simulation. Based on changes in these critical factors, this paper presents three calculation models of rainfall infiltrability for unsaturated slopes: (1) an infiltration model considering rainfall intensity; (2) an effective rainfall model considering antecedent rainfall; and (3) an infiltration model considering comprehensive factors. Based on system-response techniques, the relationship between rainfall and infiltration is described, and a prototype regression model of rainfall infiltration is given in order to determine the amount of rainfall infiltration during a rain event. PMID:24672332
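As an illustration of the first of the three listed models (infiltration limited by rainfall intensity), the actual infiltration rate at any time is commonly taken as the smaller of the rainfall intensity and the soil's infiltration capacity. The Python sketch below uses a Horton-type capacity curve purely as an assumed example; the paper's own formulations and regression model are not reproduced.

```python
# Illustrative infiltration calculation (assumed Horton-type capacity curve;
# not the paper's model): actual infiltration = min(rainfall intensity, capacity).
import numpy as np

t = np.linspace(0.0, 6.0, 61)                 # hours into the rain event
rain = np.where(t < 3.0, 12.0, 4.0)           # hypothetical rainfall intensity (mm/h)

f0, fc, k = 30.0, 5.0, 1.2                    # assumed Horton parameters (mm/h, mm/h, 1/h)
capacity = fc + (f0 - fc) * np.exp(-k * t)    # infiltration capacity over time

dt = t[1] - t[0]
infiltrated = np.minimum(rain, capacity)      # rainfall-intensity-limited infiltration
total_infiltration = infiltrated.sum() * dt   # total infiltrated depth (mm)
runoff = np.maximum(rain - capacity, 0.0).sum() * dt
print(f"infiltrated {total_infiltration:.1f} mm, runoff {runoff:.1f} mm")
```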
Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin
2017-10-01
The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of the sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical Abstract Missing the bullseye - a graphical respresentation of non-targeted screening. Image courtesy of Christian Alonzo.
Phenomenology of NMSSM in TeV scale mirage mediation
NASA Astrophysics Data System (ADS)
Hagimoto, Kei; Kobayashi, Tatsuo; Makino, Hiroki; Okumura, Ken-ichi; Shimomura, Takashi
2016-02-01
We study the next-to-minimal supersymmetric standard model (NMSSM) with TeV-scale mirage mediation, which is known as a solution to the little hierarchy problem in supersymmetry. Our previous study showed that a 125 GeV Higgs boson is realized with O(10)% fine-tuning for a 1.5 TeV gluino (1 TeV stop) mass. The μ term could be as large as 500 GeV without sacrificing the fine-tuning, thanks to a cancellation mechanism. The singlet-doublet mixing is suppressed by tan β. In this paper, we further extend this analysis. We argue that approximate scale symmetries play a role behind the suppression of the singlet-doublet mixing. They reduce the mixing matrix to a simple form that is useful for understanding the results of the numerical analysis. We perform a comprehensive analysis of the fine-tuning, including the singlet sector, by introducing a simple formula for the fine-tuning measure. This shows that the singlet mass with the least fine-tuning is favored by the LEP anomaly for moderate tan β. We also discuss prospects for precision measurements of the Higgs couplings at the LHC and ILC and for direct/indirect dark matter searches in the model.
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Melby-Lervag, Monica; Lervag, Arne
2011-01-01
We present a meta-analysis of cross-linguistic transfer of oral language (vocabulary and listening comprehension), phonology (decoding and phonological awareness) and reading comprehension. Our findings show a small meta-correlation between first (L1) and second (L2) oral language and a moderate to large correlation between L1 and L2 phonological…
NASA Astrophysics Data System (ADS)
Maharani, S.; Suprapto, E.
2018-03-01
Critical thinking is very important in mathematics; it can help students better understand mathematical concepts. Critical thinking is also needed in numerical analysis, yet existing numerical analysis textbooks do not incorporate it. This research aims to develop a group-investigation-based book on numerical analysis to increase students' critical thinking ability, and to determine whether the book is valid, practical, and effective. The research method is Research and Development (R&D), with 30 college students from the Department of Mathematics Education at Universitas PGRI Madiun as subjects. The development model used is the 4-D model, modified to 3-D up to the development stage. The type of data used is descriptive qualitative data. The instruments used are validation sheets, tests, and questionnaires. The development results indicate that the group-investigation-based book on numerical analysis falls in the valid category, with a value of 84.25%. Student response to the book was very positive, so the book also falls in the practical category, at 86.00%. The use of the book met the classical learning completeness criterion, at 84.32%. Based on these results, it is concluded that the group-investigation-based book on numerical analysis is feasible because it meets the criteria of validity, practicality, and effectiveness, and it can therefore be used by mathematics academics. Future research can examine group-investigation-based books in other subjects.
Numerical modelling techniques of soft soil improvement via stone columns: A brief review
NASA Astrophysics Data System (ADS)
Zukri, Azhani; Nazir, Ramli
2018-04-01
There are a number of numerical studies on stone column systems in the literature. Most of the studies found involved two-dimensional analysis of stone column behaviour, while only a few used three-dimensional analysis. The most popular software utilised in those studies was Plaxis 2D and 3D. Other software used for numerical analysis includes DIANA, EXAMINE, ZSoil, ABAQUS, ANSYS, NISA, GEOSTUDIO, CRISP, TOCHNOG, CESAR, GEOFEM (2D & 3D), FLAC, and FLAC 3. This paper reviews the methodological approaches to modelling stone columns numerically, in both two-dimensional and three-dimensional analyses. The numerical techniques and suitable constitutive models used in the studies are also discussed, along with the validation methods used to verify the numerical analyses. This review paper also serves as a guide for junior engineers through the applicable procedures and considerations when constructing and running a two- or three-dimensional numerical analysis, while also citing numerous relevant references.
Design of refractive laser beam shapers to generate complex irradiance profiles
NASA Astrophysics Data System (ADS)
Li, Meijie; Meuret, Youri; Duerr, Fabian; Vervaeke, Michael; Thienpont, Hugo
2014-05-01
A Gaussian laser beam is reshaped to have a specific irradiance distribution in many applications in order to ensure optimal system performance. Refractive optics are commonly used for laser beam shaping. A refractive laser beam shaper is typically formed by either two plano-aspheric lenses or by one thick lens with two aspherical surfaces. Ray mapping is a general optical design technique for designing refractive beam shapers based on geometric optics. This design technique in principle allows any rotationally symmetric irradiance profile to be generated, yet in the literature ray mapping has mainly been developed to transform a Gaussian irradiance profile into a uniform profile. For more complex profiles, especially those with low intensity in the inner region, such as a Dark Hollow Gaussian (DHG) irradiance profile, the ray mapping technique is not directly applicable in practice. In order to generate these complex profiles, the numerical effort of calculating the aspherical surface points and fitting a surface with sufficient accuracy increases considerably. In this work we evaluate different sampling approaches and surface fitting methods. This allows us to propose and demonstrate a comprehensive numerical approach to efficiently design refractive laser beam shapers that generate rotationally symmetric collimated beams with a complex irradiance profile. Ray tracing analysis for several complex irradiance profiles demonstrates excellent performance of the designed lenses and the versatility of our design procedure.
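The core of the ray-mapping condition is conservation of energy between corresponding annular zones of the input and output beams: the encircled power of the Gaussian input at a radius r_in must equal that of the target profile at the mapped radius r_out. The Python sketch below computes such a radial mapping numerically for an assumed dark-hollow-Gaussian target; the sparse sampling of the low-intensity core is exactly where the fitting difficulties discussed above arise.

```python
# Radial ray mapping by cumulative-energy matching (generic sketch): map a
# Gaussian input irradiance onto an assumed dark-hollow-Gaussian target profile.
import numpy as np

r = np.linspace(0.0, 5.0, 4001)                    # radial coordinate (arbitrary units)
I_in = np.exp(-2.0 * r**2)                         # collimated Gaussian input
I_out = (r**2) * np.exp(-2.0 * (r / 1.5)**2)       # assumed dark-hollow-Gaussian target

def encircled_power(I, r):
    # Cumulative power within radius r for a rotationally symmetric profile.
    integrand = 2.0 * np.pi * r * I
    P = np.concatenate(([0.0],
                        np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))))
    return P / P[-1]                               # normalize total power to 1

P_in, P_out = encircled_power(I_in, r), encircled_power(I_out, r)

# Energy conservation: P_in(r_in) = P_out(r_out)  =>  r_out = P_out^{-1}(P_in(r_in))
r_out_of_r_in = np.interp(P_in, P_out, r)
for ri in (0.25, 0.5, 1.0, 1.5):
    print(f"r_in = {ri:.2f} -> r_out = {np.interp(ri, r, r_out_of_r_in):.3f}")
```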
Discrete Particle Method for Simulating Hypervelocity Impact Phenomena.
Watson, Erkai; Steinhauser, Martin O
2017-04-02
In this paper, we introduce a computational model for the simulation of hypervelocity impact (HVI) phenomena which is based on the Discrete Element Method (DEM). Our paper constitutes the first application of DEM to the modeling and simulating of impact events for velocities beyond 5 km/s. We present here the results of a systematic numerical study on HVI of solids. For modeling the solids, we use discrete spherical particles that interact with each other via potentials. In our numerical investigations we are particularly interested in the dynamics of material fragmentation upon impact. We model a typical HVI experiment configuration where a sphere strikes a thin plate and investigate the properties of the resulting debris cloud. We provide a quantitative computational analysis of the resulting debris cloud caused by impact and a comprehensive parameter study by varying key parameters of our model. We compare our findings from the simulations with recent HVI experiments performed at our institute. Our findings are that the DEM method leads to very stable, energy-conserving simulations of HVI scenarios that map the experimental setup where a sphere strikes a thin plate at hypervelocity speed. Our chosen interaction model works particularly well in the velocity range where the local stresses caused by impact shock waves markedly exceed the ultimate material strength.
The dynamic failure behavior of tungsten heavy alloys subjected to transverse loads
NASA Astrophysics Data System (ADS)
Tarcza, Kenneth Robert
Tungsten heavy alloys (WHA), a category of particulate composites used in defense applications as kinetic energy penetrators, have been studied for many years. Even so, their dynamic failure behavior is not fully understood and cannot be predicted by numerical models presently in use. In this experimental investigation, a comprehensive understanding of the high-rate transverse-loading fracture behavior of WHA has been developed. Dynamic fracture events spanning a range of strain rates and loading conditions were created via mechanical testing and used to determine the influence of surface condition and microstructure on damage initiation, accumulation, and sample failure under different loading conditions. Using standard scanning electron microscopy metallographic and fractographic techniques, sample surface condition is shown to be extremely influential to the manner in which WHA fails, causing a fundamental change from externally to internally nucleated failures as surface condition is improved. Surface condition is characterized using electron microscopy and surface profilometry. Fracture surface analysis is conducted using electron microscopy, and linear elastic fracture mechanics is used to understand the influence of surface condition, specifically initial flaw size, on sample failure behavior. Loading conditions leading to failure are deduced from numerical modeling and experimental observation. The results highlight parameters and considerations critical to the understanding of dynamic WHA fracture and the development of dynamic WHA failure models.
A Comprehensive Numerical Model for Simulating Fluid Transport in Nanopores
Zhang, Yuan; Yu, Wei; Sepehrnoori, Kamy; Di, Yuan
2017-01-01
Since a large number of nanopores exist in tight oil reservoirs, fluid transport in nanopores is complex due to large capillary pressure. Recent studies only focus on the effect of nanopore confinement on single-well performance with simple planar fractures in tight oil reservoirs. Its impacts on multi-well performance with complex fracture geometries have not been reported. In this study, a numerical model was developed to investigate the effect of confined phase behavior on the cumulative oil and gas production of four horizontal wells with different fracture geometries. The pore sizes were divided into five regions based on the nanopore size distribution. Then, fluid properties were evaluated under different levels of capillary pressure using the Peng-Robinson equation of state. Afterwards, an efficient Embedded Discrete Fracture Model (EDFM) approach was applied to explicitly model hydraulic and natural fractures in the reservoirs. Finally, three fracture geometries, i.e., non-planar hydraulic fractures, non-planar hydraulic fractures with one set of natural fractures, and non-planar hydraulic fractures with two sets of natural fractures, are evaluated. The multi-well performance with confined phase behavior is analyzed for permeabilities of 0.01 md and 0.1 md. This work improves the analysis of the capillarity effect on multi-well performance with complex fracture geometries in tight oil reservoirs. PMID:28091599
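Two of the building blocks named above have compact forms: the capillary pressure across a curved interface in a pore of radius r (Young-Laplace) and the Peng-Robinson equation of state solved for the compressibility factor. The Python sketch below shows both for a single pure component with assumed properties; the study's multicomponent, five-region workflow and EDFM coupling are far more involved and are not reproduced.

```python
# Sketch of two building blocks mentioned in the abstract: Young-Laplace
# capillary pressure for a nanopore, and the Peng-Robinson compressibility
# factor for a pure component (assumed properties; illustrative only).
import numpy as np

R = 8.314  # J/(mol K)

def capillary_pressure(sigma, theta_deg, pore_radius):
    """Young-Laplace: Pc = 2*sigma*cos(theta)/r (Pa)."""
    return 2.0 * sigma * np.cos(np.radians(theta_deg)) / pore_radius

def peng_robinson_Z(T, P, Tc, Pc, omega):
    """Real root(s) of the Peng-Robinson cubic in the compressibility factor Z."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T)**2, b * P / (R * T)
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    return np.sort(roots[np.isreal(roots)].real)   # smallest ~ liquid, largest ~ vapor

# Example: 10 nm pore, assumed interfacial tension 0.02 N/m, zero contact angle.
print("Pc (MPa):", capillary_pressure(0.02, 0.0, 10e-9) / 1e6)
# Methane-like component at assumed reservoir conditions (350 K, 20 MPa).
print("Z roots :", peng_robinson_Z(350.0, 20e6, Tc=190.6, Pc=4.599e6, omega=0.011))
```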
Photochemistry in the Atmospheres of Denver and Mexico City
NASA Astrophysics Data System (ADS)
Cantrell, C. A.
2016-12-01
The composition of atmospheres in and downwind of urban centers has been the subject of study for decades. While early campaigns involved measurements exclusively from the ground, more recent studies have included airborne observations. Improved understanding has hinged critically on the development of instrumentation for better quantification of pollutants, and on the measurement of previously unobserved species in the gas and particulate phases. Comprehensive, well-planned studies have, over time, led to more detailed understanding of chemical transformations and thus improved model representations and directions for further research. This presentation focuses on findings from two case studies of urban atmospheres, namely the MILAGRO study in the Mexico City metropolitan area and the FRAPPE study in the Denver metropolitan region. Both studies made use of extensive ground-based networks and multiple aircraft platforms. The data collected during these studies have been combined with numerical models to derive assessments of the evolution of atmospheric composition due to photochemistry, mixing, and surface processes. Here, analysis of MILAGRO data focuses on the evolution of outflow downwind of the urban region. In FRAPPE, the focus is the possible role of oil and gas exploration in urban air quality. These findings are used to assess the ability of current numerical models to reproduce observations, and to point toward areas possibly needing further study.
Nonlinear dynamic analysis and optimal trajectory planning of a high-speed macro-micro manipulator
NASA Astrophysics Data System (ADS)
Yang, Yi-ling; Wei, Yan-ding; Lou, Jun-qiang; Fu, Lei; Zhao, Xiao-wei
2017-09-01
This paper reports the nonlinear dynamic modeling and optimal trajectory planning for a flexure-based macro-micro manipulator dedicated to large-scale, high-speed tasks. In particular, a macro-micro manipulator composed of a servo motor, a rigid arm and a compliant microgripper is considered. Moreover, both flexure hinges and flexible beams are taken into account. By combining the pseudo-rigid-body-model method, the assumed mode method and the Lagrange equation, the overall dynamic model is derived. Then, the rigid-flexible coupling characteristics are analyzed by numerical simulations. After that, the microscopic-scale vibration excited by the large-scale motion is reduced through a trajectory planning approach. In particular, a fitness function based on the comprehensive excitation torque of the compliant microgripper is proposed. The reference curve and the interpolation curve use quintic polynomial trajectories. Afterwards, an improved genetic algorithm is used to identify the optimal trajectory by minimizing the fitness function. Finally, numerical simulations and experiments validate the feasibility and effectiveness of the established dynamic model and the trajectory planning approach. The amplitude of the residual vibration is reduced by approximately 54.9%, and the settling time decreases by 57.1%. Therefore, the operation efficiency and manipulation stability are significantly improved.
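The quintic polynomial trajectories mentioned above are fully determined by the position, velocity, and acceleration imposed at the start and end of the motion: six boundary conditions fix the six coefficients through a small linear system. The Python sketch below solves that system for an assumed point-to-point move; the genetic-algorithm optimization of the excitation-torque fitness function is not reproduced.

```python
# Quintic point-to-point trajectory: six boundary conditions (position, velocity,
# acceleration at t = 0 and t = T) determine the six polynomial coefficients.
# The boundary values below are illustrative; the GA optimization is not shown.
import numpy as np

def quintic_coeffs(q0, v0, a0, qT, vT, aT, T):
    M = np.array([
        [1, 0,    0,      0,       0,        0],
        [0, 1,    0,      0,       0,        0],
        [0, 0,    2,      0,       0,        0],
        [1, T,  T**2,   T**3,    T**4,     T**5],
        [0, 1,  2*T,  3*T**2,  4*T**3,   5*T**4],
        [0, 0,    2,    6*T,  12*T**2,  20*T**3],
    ], dtype=float)
    b = np.array([q0, v0, a0, qT, vT, aT], dtype=float)
    return np.linalg.solve(M, b)                    # coefficients c0..c5

c = quintic_coeffs(q0=0.0, v0=0.0, a0=0.0, qT=0.5, vT=0.0, aT=0.0, T=0.2)  # 0.5 rad in 0.2 s
t = np.linspace(0.0, 0.2, 5)
q = np.polyval(c[::-1], t)                          # position along the planned trajectory
print(np.round(q, 4))
```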
Time-dependent changes in protein expression in rainbow trout muscle following hypoxia.
Wulff, Tune; Jokumsen, Alfred; Højrup, Peter; Jessen, Flemming
2012-04-18
Adaptation to hypoxia is a complex process, and individual proteins are up- or down-regulated in order to address the main challenges at any given time. To investigate the dynamics of this adaptation, rainbow trout (Oncorhynchus mykiss) were exposed to 30% of normal oxygen tension for 1, 2, 5 and 24 h, after which muscle samples were taken. The successful investigation of numerous proteins in a single study was achieved by selectively separating the sarcoplasmic proteins using 2-DE. In total, 46 protein spots were identified as changing in abundance in response to hypoxia using one-way ANOVA and multivariate data analysis. Proteins of interest were subsequently identified by MS/MS following tryptic digestion. The observed regulation in skeletal muscle following hypoxia was found to be time-specific, as only a limited number of proteins were regulated at more than one time point. The cellular response to hypoxia included regulation of proteins involved in maintaining iron homeostasis, energy levels and muscle structure. In conclusion, this proteome-based study presents a comprehensive investigation of the expression profiles of numerous proteins at four time points, increasing our understanding of timed changes in protein expression in rainbow trout muscle following hypoxia. Copyright © 2012 Elsevier B.V. All rights reserved.
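For readers unfamiliar with the spot-wise testing step, a minimal sketch of a one-way ANOVA across exposure times for a single protein spot is shown below. The intensities are made up, and treating time as the only factor is a simplification of the study's actual design:

```python
import numpy as np
from scipy.stats import f_oneway

# Illustrative one-way ANOVA for one protein spot across the four
# hypoxia exposure times (1, 2, 5, 24 h); intensities are invented.
spot_1h  = np.array([1.02, 0.98, 1.10, 1.05])
spot_2h  = np.array([1.35, 1.28, 1.41, 1.30])
spot_5h  = np.array([1.12, 1.08, 1.20, 1.15])
spot_24h = np.array([0.95, 0.90, 1.01, 0.97])

F, p = f_oneway(spot_1h, spot_2h, spot_5h, spot_24h)
print(f"F = {F:.2f}, p = {p:.4f}")
```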
NASA Astrophysics Data System (ADS)
Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.
2017-10-01
Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers. Sensitivity analysis of model outputs to variations or natural uncertainties of model inputs is crucial for improving the reliability of mathematical models. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been carried out, and a comparison with Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers is presented. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model, and the sensitivity study has been performed for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration, and especially for computing sensitivity indices that are small in value. This is crucial, since even small indices may need to be estimated in order to achieve a more accurate distribution of the inputs' influence and a more reliable interpretation of the model results.
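A minimal sketch of Sobol-sequence quasi-Monte Carlo integration over the unit hypercube is given below; the six-dimensional toy integrand (whose exact integral is 1) stands in for the air-pollution model, which is obviously not reproduced here:

```python
import numpy as np
from scipy.stats import qmc

# Quasi-Monte Carlo integration with a scrambled Sobol sequence.
def integrand(x):
    # Separable test function with exact integral 1 over [0, 1]^d.
    return np.prod(np.abs(4.0 * x - 2.0), axis=1)

dim = 6
sampler = qmc.Sobol(d=dim, scramble=True, seed=42)
points = sampler.random_base2(m=14)            # 2**14 Sobol points
estimate = integrand(points).mean()
print(f"QMC estimate over {len(points)} points: {estimate:.5f} (exact 1.0)")
```

Sobol sensitivity indices are then obtained from ratios of partial variances estimated with such integrals; the sketch shows only the integration building block.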
A description of the tides in the Eastern North Atlantic
NASA Astrophysics Data System (ADS)
Fanjul, Enrique Alvarez; Gómez, Begoña Pérez; Sánchez-Arévalo, Ignacio Rodríguez
A description of the Eastern North Atlantic tidal dynamics (in a region spanning from 20°N to 48°N in latitude and from 34°W to 0° in longitude) is obtained by means of new in situ measurements and numerical modelling based on TOPEX/POSEIDON-derived data sets. The main source of measurements is the tide gauge network REDMAR (RED de MAReógrafos de Puertos del Estado), operational since July 1992 and managed by Clima Marítimo (Puertos del Estado). Results derived from the harmonic analysis of the first years of measurements are presented and compared with model results. In order to obtain a global picture of the tides in the region, a large compilation of harmonic constants obtained from other institutes is included. The availability of new TOPEX/POSEIDON-derived harmonic constant data sets provides an opportunity to incorporate the benefits of satellite altimetry in high-resolution regional applications of numerical models. Richard Ray's tidal model (Ray et al., 1994), based on a response-type tidal analysis of TOPEX/POSEIDON data, was employed within a model of the studied area. The numerical model employed is HAMSOM, a 3-D finite-difference code developed jointly by the Institut für Meereskunde (Hamburg University) and Clima Marítimo. Results from simulations of the seven major harmonics are presented, providing a comprehensive view of tidal dynamics, including tidal currents. The tidal simulations show good agreement between the semidiurnal harmonic components and the values measured by both coastal and pelagic tide gauges and by current meters. The modelled diurnal constituents show larger relative differences with measurements than the semidiurnal harmonics, especially in the phase lags. The non-linear transfer of energy from semidiurnal to higher-order harmonics, such as M4 and M6, was mapped; these transfers were found to be important only in two areas: the French continental shelf in the Bay of Biscay and the widest part of the African shelf, south of Cabo Bojador.
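The harmonic analysis of a tide-gauge record reduces to a least-squares fit of sinusoids at known constituent frequencies. Below is a minimal sketch for two constituents (M2 and S2, with their standard angular speeds) applied to a synthetic record; nodal and astronomical corrections, which a real analysis such as the one above requires, are omitted:

```python
import numpy as np

# Standard angular speeds in degrees per hour.
speeds_deg_per_h = {"M2": 28.9841042, "S2": 30.0000000}

def harmonic_fit(t_hours, eta):
    """Fit eta(t) = Z0 + sum_k [A_k cos(w_k t) + B_k sin(w_k t)] and
    return amplitude and phase per constituent."""
    cols = [np.ones_like(t_hours)]
    for w in speeds_deg_per_h.values():
        wt = np.deg2rad(w) * t_hours
        cols += [np.cos(wt), np.sin(wt)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, eta, rcond=None)
    out = {"Z0": coef[0]}
    for i, name in enumerate(speeds_deg_per_h):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        out[name] = (np.hypot(a, b), np.degrees(np.arctan2(b, a)) % 360.0)
    return out

# Synthetic one-month hourly record: 1.0 m M2 and 0.3 m S2 plus noise.
t = np.arange(0.0, 30 * 24.0, 1.0)
eta = (1.0 * np.cos(np.deg2rad(28.9841042) * t - 0.7)
       + 0.3 * np.cos(np.deg2rad(30.0) * t - 1.2)
       + 0.05 * np.random.default_rng(0).normal(size=t.size))
print(harmonic_fit(t, eta))
```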
Thermal, size and surface effects on the nonlinear pull-in of small-scale piezoelectric actuators
NASA Astrophysics Data System (ADS)
SoltanRezaee, Masoud; Ghazavi, Mohammad-Reza
2017-09-01
Electrostatically actuated miniature wires/tubes have many operational applications in high-tech industries. In this research, the nonlinear pull-in instability of piezoelectric thermal small-scale switches subjected to Coulomb and dissipative forces is analyzed using the strain gradient and modified couple stress theories. The discretized governing equation is solved numerically by means of the step-by-step linearization method. The correctness of the formulated model and solution procedure is validated through comparison with experimental and several theoretical results. The length-scale effect, surface energy, van der Waals attraction and nonlinear curvature are considered in the present comprehensive model, and the thermo-electro-mechanical behavior of cantilever piezo-beams is discussed in detail. It is found that the piezoelectric actuation can be used as a design parameter to control the pull-in phenomenon. The obtained results are applicable to stability analysis, practical design and control of actuated miniature intelligent devices.
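For orientation, the pull-in phenomenon is often introduced with the classic lumped parallel-plate estimate, which ignores the size, surface, thermal and piezoelectric effects treated in the paper. A sketch with illustrative (assumed) numbers:

```python
import math

# Classic lumped parallel-plate pull-in voltage:
#   V_PI = sqrt(8 * k * g**3 / (27 * eps0 * A))
eps0 = 8.854e-12        # vacuum permittivity, F/m
k = 0.5                 # effective cantilever stiffness, N/m (assumed)
g = 2.0e-6              # initial gap, m (assumed)
A = 100e-6 * 10e-6      # electrode area, m^2 (assumed)

v_pull_in = math.sqrt(8.0 * k * g**3 / (27.0 * eps0 * A))
print(f"Lumped-model pull-in voltage = {v_pull_in:.2f} V")
```

The higher-order theories and extra force terms in the paper shift this threshold, which is precisely why piezoelectric actuation can be used to tune it.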
Dynamical mass generation in unquenched QED using the Dyson-Schwinger equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kızılersü, Ayse; Sizer, Tom; Pennington, Michael R.
We present a comprehensive numerical study of dynamical mass generation for unquenched QED in four dimensions, in the absence of four-fermion interactions, using the Dyson-Schwinger approach. We begin with an overview of previous investigations of criticality in the quenched approximation. To this we add an analysis using a new fermion-antifermion-boson interaction ansatz, the Kizilersu-Pennington (KP) vertex, developed for an unquenched treatment. After surveying criticality in previous unquenched studies, we investigate the performance of the KP vertex in dynamical mass generation using a renormalized fully unquenched system of equations. This we compare with the results for two hybrid vertices incorporating the Curtis-Pennington vertex in the fermion equation. We conclude that the KP vertex is as yet incomplete, and its relative gauge-variance is due to its lack of massive transverse components in its design.
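The fixed-point iteration at the heart of such studies can be illustrated with a textbook toy problem: the quenched, rainbow-approximation gap equation in Landau gauge with the standard angular approximation. This is emphatically not the renormalized unquenched system with the KP vertex studied in the paper; coupling, cutoff and grid below are arbitrary illustrative choices:

```python
import numpy as np

# Schematic gap equation (quenched QED, rainbow approximation, Landau gauge):
#   M(p2) = m0 + (3*alpha/(4*pi)) * Int dk2  k2*M(k2)/(k2 + M(k2)**2) / max(p2, k2)
alpha, m0, L2 = 1.2, 0.0, 1.0e10        # coupling above critical, bare mass, UV cutoff^2
p2 = np.logspace(-2, 10, 200)           # momentum-squared grid
w = np.gradient(p2)                     # simple quadrature weights
M = np.full_like(p2, 10.0)              # nonzero seed so chiral symmetry can break

for it in range(500):
    integrand = p2 * M / (p2 + M**2)                 # k2*M/(k2+M^2) on the grid
    kernel = 1.0 / np.maximum.outer(p2, p2)          # 1/max(p2, k2)
    M_new = m0 + 3.0 * alpha / (4.0 * np.pi) * kernel @ (w * integrand)
    if np.max(np.abs(M_new - M)) < 1e-8 * (1 + np.max(np.abs(M))):
        break
    M = M_new

print(f"Converged after {it} iterations; M(0) = {M[0]:.4g}")
```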
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Earth science information: Planning for the integration and use of global change information
NASA Technical Reports Server (NTRS)
Lousma, Jack R.
1992-01-01
Activities and accomplishments of the first six months of the Consortium for International Earth Science Information Network's (CIESIN's) 1992 technical program have focused on four main missions: (1) the development and implementation of plans for initiation of the Socioeconomic Data and Applications Center (SEDAC) as part of the EOSDIS Program; (2) the pursuit and development of a broad-based global change information cooperative by providing systems analysis and integration between natural science and social science data bases held by numerous federal agencies and other sources; (3) the fostering of scientific research into the human dimensions of global change and the provision of integration between natural science and social science data and information; and (4) the serving of CIESIN as a gateway for global change data and information distribution through development of the Global Change Research Information Office and other comprehensive knowledge-sharing systems.
Perception and analysis of Spanish accents in English speech
NASA Astrophysics Data System (ADS)
Chism, Cori; Lass, Norman
2002-05-01
The purpose of the present study was to determine what relates most closely to the degree of perceived foreign accent in the English speech of native Spanish speakers: intonation, vowel length, stress, voice onset time (VOT), or segmental accuracy. Nineteen native English-speaking listeners rated speech samples from 7 native English speakers and 15 native Spanish speakers for comprehensibility and degree of foreign accent. The speech samples were analyzed spectrographically and perceptually to obtain numerical values for each variable. Correlation coefficients were computed to determine the relationship between these values and the average foreign accent scores. Results showed that the average foreign accent scores were statistically significantly correlated with three variables: the length of stressed vowels (r=-0.48, p=0.05), voice onset time (r=-0.62, p=0.01), and segmental accuracy (r=0.92, p=0.001). Implications of these findings and suggestions for future research are discussed.
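The correlations reported above are ordinary Pearson coefficients between per-speaker acoustic measures and mean accent ratings. A minimal sketch with invented data (the measure name and values are placeholders, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative only: one per-speaker acoustic measure (e.g. mean stressed-vowel
# duration in ms) against mean foreign-accent ratings; numbers are made up.
accent_rating = np.array([2.1, 3.4, 4.0, 4.8, 5.5, 6.2, 6.9, 7.5])
acoustic_measure = np.array([148, 141, 150, 129, 122, 118, 117, 105])

r, p = pearsonr(acoustic_measure, accent_rating)
print(f"r = {r:.2f}, p = {p:.3f}")
```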
Monitoring the Wobbe Index of Natural Gas Using Fiber-Enhanced Raman Spectroscopy.
Sandfort, Vincenz; Trabold, Barbara M; Abdolvand, Amir; Bolwien, Carsten; Russell, Philip St. J; Wöllenstein, Jürgen; Palzer, Stefan
2017-11-24
The fast and reliable analysis of the natural gas composition requires the simultaneous quantification of numerous gaseous components. To this end, fiber-enhanced Raman spectroscopy is a powerful tool to detect most components in a single measurement using a single laser source. However, practical issues such as detection limit, gas exchange time and background Raman signals from the fiber material still pose obstacles to utilizing the scheme in real-world settings. This paper compares the performance of two types of hollow-core photonic crystal fiber (PCF), namely photonic bandgap PCF and kagomé-style PCF, and assesses their potential for online determination of the Wobbe index. In contrast to bandgap PCF, kagomé-PCF allows for reliable detection of Raman-scattered photons even below 1200 cm−1, which in turn enables fast and comprehensive assessment of the natural gas quality of arbitrary mixtures.
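Once the composition is known, the Wobbe index follows from the mixture heating value and its relative density to air, W = HHV / sqrt(d). A minimal sketch with approximate, reference-condition-dependent heating values and a made-up composition (not calibration data from the paper):

```python
import math

# Approximate volumetric gross heating values (MJ/m^3) and molar masses (g/mol);
# illustrative numbers only -- exact values depend on reference conditions.
HHV = {"CH4": 39.8, "C2H6": 70.3, "C3H8": 101.2, "N2": 0.0, "CO2": 0.0}
M   = {"CH4": 16.04, "C2H6": 30.07, "C3H8": 44.10, "N2": 28.01, "CO2": 44.01}
M_AIR = 28.96

def wobbe_index(mole_fractions):
    """Wobbe index W = HHV_mix / sqrt(relative density to air)."""
    hhv_mix = sum(x * HHV[c] for c, x in mole_fractions.items())
    m_mix = sum(x * M[c] for c, x in mole_fractions.items())
    return hhv_mix / math.sqrt(m_mix / M_AIR)

# Hypothetical composition (mole fractions), e.g. retrieved from calibrated Raman band areas.
gas = {"CH4": 0.92, "C2H6": 0.04, "C3H8": 0.01, "N2": 0.02, "CO2": 0.01}
print(f"Wobbe index = {wobbe_index(gas):.1f} MJ/m^3")
```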
NASA Technical Reports Server (NTRS)
Hiser, H. W.; Lee, S. S.; Veziroglu, T. N.; Sengupta, S.
1975-01-01
A comprehensive numerical model development program for near-field thermal plume discharge and far-field general circulation in coastal regions is being carried out at the University of Miami Clean Energy Research Institute. The objective of the program is to develop a generalized, three-dimensional, predictive model for thermal pollution studies. Two regions of specific application of the model are the power plant sites in the Biscayne Bay and Hutchinson Island areas along the Florida coastline. Remote sensing from aircraft as well as satellites is used in parallel with in situ measurements to provide the information needed for the development and verification of the mathematical model. This paper describes the efforts that have been made to identify problems and limitations of the presently available satellite data and to develop methods for enhancing and enlarging thermal infrared displays for mesoscale sea surface temperature measurements.
BioInt: an integrative biological object-oriented application framework and interpreter.
Desai, Sanket; Burra, Prasad
2015-01-01
BioInt, a biological programming application framework and interpreter, is an attempt to equip researchers with seamless integration, efficient extraction and effortless analysis of data from various biological databases and algorithms. Based on the types of biological data, algorithms and related functionalities, a biology-specific framework with nine modules was developed. The modules are a compilation of numerous reusable BioADTs. This software ecosystem, containing more than 450 biological objects underneath the interpreter, makes it flexible, integrative and comprehensive. Similar to Python, BioInt eliminates the compilation and linking steps, cutting development time significantly. Researchers can write scripts using the available BioADTs (following C++ syntax) and execute them interactively, or use BioInt as a command-line application. It has features that enable automation, extension of the framework with new/external BioADTs/libraries, and deployment of complex workflows.
Dynamical mass generation in unquenched QED using the Dyson-Schwinger equations
Kızılersü, Ayse; Sizer, Tom; Pennington, Michael R.; ...
2015-03-13
We present a comprehensive numerical study of dynamical mass generation for unquenched QED in four dimensions, in the absence of four-fermion interactions, using the Dyson-Schwinger approach. We begin with an overview of previous investigations of criticality in the quenched approximation. To this we add an analysis using a new fermion-antifermion-boson interaction ansatz, the Kizilersu-Pennington (KP) vertex, developed for an unquenched treatment. After surveying criticality in previous unquenched studies, we investigate the performance of the KP vertex in dynamical mass generation using a renormalized fully unquenched system of equations. This we compare with the results for two hybrid vertices incorporating the Curtis-Pennington vertex in the fermion equation. We conclude that the KP vertex is as yet incomplete, and its relative gauge-variance is due to its lack of massive transverse components in its design.
NASA Technical Reports Server (NTRS)
2003-01-01
In order to rapidly and efficiently grow crystals, tools were needed to automatically identify and analyze the growth process of protein crystals. To meet this need, Diversified Scientific, Inc. (DSI), with the support of a Small Business Innovation Research (SBIR) contract from NASA's Marshall Space Flight Center, developed CrystalScore(trademark), the first automated image acquisition, analysis, and archiving system designed specifically for the macromolecular crystal growing community. It offers automated hardware control, image and data archiving, image processing, a searchable database, and surface plotting of experimental data. CrystalScore is currently being used by numerous pharmaceutical companies and academic and nonprofit research centers. DSI, located in Birmingham, Alabama, was awarded the patent "Method for acquiring, storing, and analyzing crystal images" on March 4, 2003. Another DSI product made possible by Marshall SBIR funding is VaporPro(trademark), a unique, comprehensive system that allows for automated control of vapor diffusion for crystallization experiments.
One Message, Many Voices: Mobile Audio Counselling in Health Education.
Pimmer, Christoph; Mbvundula, Francis
2018-01-01
Health workers' use of counselling information on their mobile phones for health education is a central but little understood phenomenon in numerous mobile health (mHealth) projects in Sub-Saharan Africa. Drawing on empirical data from an interpretive case study in the setting of the Millennium Villages Project in rural Malawi, this research investigates the ways in which community health workers (CHWs) perceive that audio-counselling messages support their health education practice. Three main themes emerged from the analysis: phone-aided audio counselling (1) legitimises the CHWs' use of mobile phones during household visits; (2) helps CHWs to deliver a comprehensive counselling message; (3) supports CHWs in persuading communities to change their health practices. The findings show the complexity and interplay of the multi-faceted, sociocultural, political, and socioemotional meanings associated with audio-counselling use. Practical implications and the demand for further research are discussed.
Acoustic Waves in Medical Imaging and Diagnostics
Sarvazyan, Armen P.; Urban, Matthew W.; Greenleaf, James F.
2013-01-01
Until about two decades ago, acoustic imaging and ultrasound imaging were synonymous. The term “ultrasonography,” or its abbreviated version “sonography,” meant an imaging modality based on the use of ultrasonic compressional bulk waves. Since the 1990s, numerous acoustic imaging modalities have emerged that are based on a different mode of acoustic wave: shear waves. It has been demonstrated that imaging with these waves can provide very useful and very different information about the biological tissue being examined. We discuss the physical basis for the differences between these two basic modes of acoustic waves used in medical imaging and analyze the advantages associated with shear acoustic imaging. A comprehensive analysis of the range of acoustic wavelengths, velocities, and frequencies that have been used in different imaging applications is presented. We also discuss the potential for future shear wave imaging applications. PMID:23643056
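The wavelength/velocity/frequency comparison behind this discussion rests on two simple relations: lambda = c / f and, for shear waves in soft tissue, mu = rho * c^2. A sketch with typical orders of magnitude (not values from the paper):

```python
# lambda = c / f for the two wave modes, and shear modulus mu = rho * c_s**2.
rho = 1000.0            # soft-tissue density, kg/m^3 (typical value)

for name, c, f in [("compressional (ultrasound)", 1540.0, 5.0e6),
                   ("shear (elastography)", 3.0, 200.0)]:
    wavelength = c / f
    print(f"{name}: c = {c} m/s, f = {f:.0f} Hz, lambda = {wavelength*1e3:.3f} mm")

mu = rho * 3.0**2       # shear modulus for c_s = 3 m/s
print(f"Shear modulus mu = {mu/1e3:.0f} kPa")
```

The orders-of-magnitude gap between the two modes (sub-millimetre versus centimetre wavelengths, MHz versus sub-kHz frequencies) is what makes shear-wave imaging sensitive to tissue elasticity.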
NASA Technical Reports Server (NTRS)
Fried, Alan; Drummond, James
2003-01-01
This final report summarizes the progress achieved over the entire 3-year proposal period, including two extensions spanning 1 year. These activities include: 1) preparation for and participation in the NASA 2001 TRACE-P campaign using our airborne tunable diode laser system to acquire measurements of formaldehyde (CH2O); 2) comprehensive data analysis and data submittal to the NASA archive; 3) follow-up data interpretation, working with NASA modelers to place our ambient CH2O measurements into a broader photochemical context; 4) publication of numerous JGR papers using these data; 5) extensive follow-up laboratory tests on the selectivity and efficiency of our CH2O scrubbing system; and 6) an extensive follow-up effort to assess and study the mechanical stability of our entire optical system, particularly the multipass absorption cell, with changes in aircraft cabin pressure.
Monitoring the Wobbe Index of Natural Gas Using Fiber-Enhanced Raman Spectroscopy
Sandfort, Vincenz; Trabold, Barbara M.; Abdolvand, Amir; Bolwien, Carsten; Russell, Philip St. J.; Wöllenstein, Jürgen
2017-01-01
The fast and reliable analysis of the natural gas composition requires the simultaneous quantification of numerous gaseous components. To this end, fiber-enhanced Raman spectroscopy is a powerful tool to detect most components in a single measurement using a single laser source. However, practical issues such as detection limit, gas exchange time and background Raman signals from the fiber material still pose obstacles to utilizing the scheme in real-world settings. This paper compares the performance of two types of hollow-core photonic crystal fiber (PCF), namely photonic bandgap PCF and kagomé-style PCF, and assesses their potential for online determination of the Wobbe index. In contrast to bandgap PCF, kagomé-PCF allows for reliable detection of Raman-scattered photons even below 1200 cm−1, which in turn enables fast and comprehensive assessment of the natural gas quality of arbitrary mixtures. PMID:29186768
Bryant, Fred B
2016-12-01
This paper introduces a special section of the current issue of the Journal of Evaluation in Clinical Practice that includes a set of 6 empirical articles showcasing a versatile new machine-learning statistical method, known as optimal data (or discriminant) analysis (ODA), specifically designed to produce statistical models that maximize predictive accuracy. As this set of papers clearly illustrates, ODA offers numerous important advantages over traditional statistical methods, advantages that enhance the validity and reproducibility of statistical conclusions in empirical research. This issue of the journal also includes a review of a recently published book that provides a comprehensive introduction to the logic, theory, and application of ODA in empirical research. It is argued that researchers have much to gain by using ODA to analyze their data. © 2016 John Wiley & Sons, Ltd.
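The core idea of univariable ODA for a binary outcome can be sketched as an exhaustive search for the cutpoint that maximizes classification accuracy. The sketch below omits the published method's class weighting, multi-category handling and permutation-based p values, and the data are synthetic:

```python
import numpy as np

def oda_cutpoint(x, y):
    """Search cutpoints on one predictor and return the one (with its
    direction) that maximizes overall classification accuracy."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    candidates = (x[:-1] + x[1:]) / 2.0          # midpoints between sorted values
    best_cut, best_acc, best_dir = None, -1.0, 1
    for c in candidates:
        for direction in (1, -1):                # which side predicts class 1
            pred = (direction * (x - c) > 0).astype(int)
            acc = np.mean(pred == y)
            if acc > best_acc:
                best_cut, best_acc, best_dir = c, acc, direction
    return best_cut, best_dir, best_acc

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.5, 1, 50)])
y = np.concatenate([np.zeros(50, int), np.ones(50, int)])
print(oda_cutpoint(x, y))
```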