Instability of a solidifying binary mixture
NASA Technical Reports Server (NTRS)
Antar, B. N.
1982-01-01
An analysis is performed of the stability of a solidifying binary mixture subject to surface tension variation along the free liquid surface. The basic state solution is obtained numerically as a nonstationary function of time. Because the basic state is time dependent, the stability analysis is of the global type and utilizes a variational technique. Because the basic state is also a complex function of both space and time, the stability analysis is carried out by numerical means.
1976-11-11
exchange. The basis for this choice was derived from several factors. One was a timing analysis that was made for certain basic time-critical software...candidate system designs were developed and examined with respect to their capability to demonstrate the workability of the basic concept and for factors...algorithm requires a bit time completion, while SOF production allows byte timing, and the involved SOF correlation procedure may be performed during
Efficacy of a Single Dose of Basic Fibroblast Growth Factor: Clinical Observation for 1 Year.
Suzuki, Hirotaka; Makiyama, Kiyoshi; Hirai, Ryoji; Matsuzaki, Hiroumi; Furusaka, Toru; Oshima, Takeshi
2016-11-01
Basic fibroblast growth factor promotes wound healing by accelerating healthy granulation and epithelialization. However, the duration of the effects of a single intracordal injection of basic fibroblast growth factor has not been established, and administration intervals and timing have yet to be standardized. Here, we administered a single injection to patients with insufficient glottic closure and conducted follow-up examinations with high-speed digital imaging to determine the duration of the treatment response. Case series. For treatment, 20 µg/mL recombinant human basic fibroblast growth factor was injected into two vocal cords. The following examinations were performed before the procedure and at 3-month intervals for 12 months starting at 1 month postinjection: Grade, Roughness, Breathiness, Asthenia, and Strain (GRBAS) scale assessment, maximum phonation time, acoustic analysis, high-speed digital imaging, glottal wave analysis, and kymographic analysis. Postinjection, the GRBAS scale score decreased, and the maximum phonation time was prolonged. In addition, the mean minimum glottal area and mean minimum glottal distance decreased. These changes were significant at 12 months postinjection compared with preinjection. However, there were no significant changes in the vibrations of the vocal cord margins. The intracordal injection of basic fibroblast growth factor improved insufficient glottic closure without reducing the vibrations of the vocal cord margins. This effect remained evident at 12 months postinjection. A single injection can be expected to yield a sufficient and persistent long-term effect. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Analysis of Ballast Water Sampling Port Designs Using Computational Fluid Dynamics
2008-02-01
straight, vertical, upward-flowing pipe having a sample port diameter between 1.5 and 2.0 times the basic isokinetic diameter as defined in this report. Sample ports should use ball valves for isolation purposes and diaphragm or...Keywords: water, flow modeling, sample port, sample pipe, particle trajectory, isokinetic sampling
Analysis of Time-Series Quasi-Experiments. Final Report.
ERIC Educational Resources Information Center
Glass, Gene V.; Maguire, Thomas O.
The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, in either a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
A Kinetic Study of the Effect of Basicity on the Mold Fluxes Crystallization
NASA Astrophysics Data System (ADS)
Zhou, Lejun; Wang, Wanlin; Ma, Fanjun; Li, Jin; Wei, Juan; Matsuura, Hiroyuki; Tsukihashi, Fumitaka
2012-04-01
The effect of basicity on mold flux crystallization was investigated in this article. The time-temperature-transformation (TTT) diagrams and continuous-cooling-transformation (CCT) diagrams of mold fluxes with different basicity were constructed using the single hot thermocouple technique (SHTT). The results showed that with increasing basicity, the incubation time of isothermal crystallization became shorter, the crystallization temperature became higher, and the critical cooling rate of continuous cooling crystallization became faster. X-ray diffraction analysis suggested that calcium silicate (CaO·SiO2) precipitated in the upper part of the TTT diagram and cuspidine (Ca4Si2O7F2) formed in the lower part when the basicity of the mold fluxes was within 1.0 to 1.2; when the basicity was 0.8, only the cuspidine phase formed. A kinetic study of the isothermal crystallization process indicated that increasing basicity tended to enhance mold flux crystallization, with the crystallization activation energy becoming smaller. The crystallization mechanism of cuspidine changed from one-dimensional growth to three-dimensional growth with a constant number of nuclei as the basicity of the mold fluxes varied from 0.8 to 1.2.
Seismpol, a Visual Basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation arising from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information; in fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1993-01-01
PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications, tools, and performance models for the analysis, evaluation and measurement of real-time systems and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentations with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.
Who Gets What? Is Improved Access to Basic Education Pro-Poor in Sub-Saharan Africa?
ERIC Educational Resources Information Center
Lewin, Keith M.; Sabates, Ricardo
2012-01-01
This paper explores changing patterns of access to basic education in six Sub-Saharan Africa countries using data from Demographic and Health Surveys at two points in time. In general the analysis confirms that participation of children in schooling has increased over the last decade. However, access to education remains strongly associated with…
Neuropsychological basic deficits in preschoolers at risk for ADHD: a meta-analysis.
Pauli-Pott, Ursula; Becker, Katja
2011-06-01
Widely accepted neuropsychological theories on attention deficit hyperactivity disorder (ADHD) assume that the complex symptoms of the disease arise from developmentally preceding neuropsychological basic deficits. These deficits in executive functions and delay aversion are presumed to emerge in the preschool period. The corresponding normative developmental processes include phases of relative stability and rapid change. These non-linear developmental processes might have implications for concurrent and predictive associations between basic deficits and ADHD symptoms. To derive a description of the nature and strength of these associations, a meta-analysis was conducted. It is assumed that weighted mean effect sizes differ between basic deficits and depend on age. The meta-analysis included 25 articles (n=3005 children) in which associations between assessments of basic deficits (i.e. response inhibition, interference control, delay aversion, working memory, flexibility, and vigilance/arousal) in the preschool period and concurrent or subsequent ADHD symptoms or diagnosis of ADHD had been analyzed. For response inhibition and delay aversion, mean effect sizes were of medium to large magnitude while the mean effect size for working memory was small. Meta-regression analyses revealed that effect sizes of delay aversion tasks significantly decreased with increasing age while effect sizes of interference control tasks and Continuous Performance Tests (CPTs) significantly increased. Depending on the normative maturational course of each skill, time windows might exist that allow for a more or less valid assessment of a specific deficit. In future research these time windows might help to describe early developing forms of ADHD and to identify children at risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
From Discrete Space-Time to Minkowski Space: Basic Mechanisms, Methods and Perspectives
NASA Astrophysics Data System (ADS)
Finster, Felix
This survey article reviews recent results on fermion systems in discrete space-time and corresponding systems in Minkowski space. After a basic introduction to the discrete setting, we explain a mechanism of spontaneous symmetry breaking which leads to the emergence of a discrete causal structure. As methods to study the transition between discrete space-time and Minkowski space, we describe a lattice model for a static and isotropic space-time, outline the analysis of regularization tails of vacuum Dirac sea configurations, and introduce a Lorentz invariant action for the masses of the Dirac seas. We mention the method of the continuum limit, which makes it possible to analyze interacting systems. Open problems are discussed.
2013-03-01
response in an effort to determine "What can be done better next time?" and "How do we prevent this from happening again?" One basic expectation that citizens have of...After the tragic events of September 11, 2001
von Websky, Martin W; Raptis, Dimitri A; Vitz, Martina; Rosenthal, Rachel; Clavien, P A; Hahnloser, Dieter
2013-11-01
Virtual reality (VR) simulators are widely used to familiarize surgical novices with laparoscopy, but VR training methods differ in efficacy. In the present trial, self-controlled basic VR training (SC-training) was tested against training based on peer-group-derived benchmarks (PGD-training). First, novice laparoscopic residents were randomized into an SC group (n = 34) and a group using PGD benchmarks (n = 34) for basic laparoscopic training. After completing basic training, both groups performed 60 VR laparoscopic cholecystectomies for performance analysis. Primary endpoints were simulator metrics; secondary endpoints were program adherence, trainee motivation, and training efficacy. Altogether, 66 residents completed basic training, and 3,837 of 3,960 (96.8%) cholecystectomies were available for analysis. Course adherence was good, with only two dropouts, both in the SC group. The PGD group spent more time and repetitions in basic training until the benchmarks were reached and subsequently showed better performance in the readout cholecystectomies: median time to gallbladder extraction differed significantly, at 520 s (IQR 354-738 s) for SC-training versus 390 s (IQR 278-536 s) for the PGD group (p < 0.001), compared with 215 s (IQR 175-276 s) for experts. Path length of the right instrument also showed significant differences, again with the PGD-training group being more efficient. Basic VR laparoscopic training based on PGD benchmarks with external assessment is superior to SC training, resulting in higher trainee motivation and better performance in simulated laparoscopic cholecystectomies. We recommend such a basic course based on PGD benchmarks before advancing to more elaborate VR training.
Partiprajak, Suphamas; Thongpo, Pichaya
2016-01-01
This study explored the retention of basic life support knowledge, self-efficacy, and chest compression performance among Thai nursing students at a university in Thailand. A one-group, pre-test/post-test time-series design was used. Participants were 30 nursing students undertaking basic life support training as care providers. Repeated-measures analysis of variance was used to test the retention of knowledge and self-efficacy across the pre-test, immediate post-test, and re-test after 3 months. A Wilcoxon signed-rank test was used to compare chest compression performance between the two post-training time points. Basic life support knowledge was measured using the Basic Life Support Standard Test for Cognitive Knowledge. Self-efficacy was measured using the Basic Life Support Self-Efficacy Questionnaire. Chest compression performance was evaluated using a data printout from a Resusci Anne and Laerdal SkillMeter over two cycles. The training had an immediate significant effect on knowledge, self-efficacy, and chest compression skill; however, knowledge and self-efficacy had declined significantly 3 months after training. Chest compression performance 3 months after training was retained relative to the first post-test, but the difference was not significant. Therefore, a retraining program should be established 3 months after initial training to maintain knowledge and self-efficacy over a longer period. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter
2012-01-01
Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
NASA Technical Reports Server (NTRS)
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-01-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify, and in many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is computer-intensive because it requires repeating the original data analysis procedure several times using replicate datasets; this is commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are in the vast majority of cases small and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs they are larger than the acceleration itself.
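The height-time resampling idea above can be sketched in Python. This is a minimal illustration, not the catalog's actual procedure: it assumes a straight-line (constant-velocity) height-time fit, and the function names are made up for the example.

```python
import random

def bootstrap_velocity_error(times, heights, n_boot=1000, seed=0):
    """Estimate a linear velocity and its error by resampling the
    height-time points with replacement and refitting a straight line."""
    rng = random.Random(seed)
    n = len(times)

    def fit_slope(ts, hs):
        # ordinary least-squares slope of height versus time
        mt = sum(ts) / len(ts)
        mh = sum(hs) / len(hs)
        num = sum((t - mt) * (h - mh) for t, h in zip(ts, hs))
        den = sum((t - mt) ** 2 for t in ts)
        return num / den

    velocity = fit_slope(times, heights)
    slopes = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        if len(set(idx)) < 2:  # degenerate resample: slope undefined
            continue
        slopes.append(fit_slope([times[i] for i in idx],
                                [heights[i] for i in idx]))
    mean = sum(slopes) / len(slopes)
    error = (sum((s - mean) ** 2 for s in slopes) / (len(slopes) - 1)) ** 0.5
    return velocity, error
```

Resamples that draw fewer than two distinct points are skipped, since no slope can be fitted to them; as the abstract notes, the spread of the replicate fits shrinks as the number of measured height-time points grows.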
Clayton, H M
1993-05-01
The time-motion characteristics of Canadian basic- and medium-level dressage competitions are described, and the results are applied in formulating sport-specific conditioning programs. One competition was analyzed at the six levels from basic 1 to medium 3. Each test was divided into a series of sequences based on the type and speed of activity. The durations of the sequences were measured from videotapes. The basic-level tests had fewer sequences, and they were shorter in distance and duration than the medium tests (P < 0.10), but the average speed did not differ between the two levels. It is recommended that horses competing at the basic levels be conditioned using 5-min exercise periods, with short (10-s) bursts of lengthened trot and canter included at basic 2 and above. In preparation for medium-level competitions, the duration of the work periods increases to 7 min, 10- to 12-s bursts of medium or extended trot and canter are included, and transitions are performed frequently to simulate the energy expenditure in overcoming inertia.
Laser Microprobe Mass Spectrometry 1: Basic Principles and Performance Characteristics.
ERIC Educational Resources Information Center
Denoyer, Eric; And Others
1982-01-01
Describes the historical development, performance characteristics (sample requirements, analysis time, ionization characteristics, speciation capabilities, and figures of merit), and applications of laser microprobe mass spectrometry. (JN)
The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport
NASA Astrophysics Data System (ADS)
Poliaková, Adela
2015-06-01
The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree; however, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
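The gate logic can be made concrete with a small numeric sketch. Assuming the basic events are independent (a common simplification in fault tree analysis), AND and OR gate probabilities combine as below; the event names and probabilities are hypothetical, not drawn from the paper:

```python
def p_and(*probs):
    """Probability that an AND gate fires: all independent inputs occur."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """Probability that an OR gate fires: at least one independent input occurs."""
    none = 1.0
    for p in probs:
        none *= (1.0 - p)
    return 1.0 - none

# Hypothetical tree: the undesired top event occurs if power is lost,
# or if both the primary pump and its backup fail.
p_pump, p_backup, p_power = 0.05, 0.10, 0.01
p_top = p_or(p_power, p_and(p_pump, p_backup))
```

With these illustrative numbers the top-event probability works out to about 1.5%, dominated by the single-point power-loss event rather than the redundant pump pair.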
NASA Astrophysics Data System (ADS)
Taki, Tsuyoshi; Hasegawa, Jun-ichi
1998-12-01
This paper proposes a basic feature for the quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region', is a kind of sphere of influence for each person in the group. The dominant region is defined as the region where the person can arrive earlier than any other person, and it can be formulated as a Voronoi region modified by replacing the distance function with a time function. This time function is calculated based on a computational model of the person's moving ability. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. Experiments using motion pictures of actual games suggest that the proposed feature is useful for the measurement and evaluation of group behavior in team sports. This basic feature may also be applied to other team ball games, such as American football, basketball, handball and water polo.
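A drastically simplified sketch of the dominant region follows. Here the time function is just straight-line distance divided by a constant top speed per player, whereas the paper's computational model of moving ability is richer; the player data, field names, and grid step are illustrative only:

```python
import math

def arrival_time(player, point):
    """Time for a player to reach a point, assuming straight-line motion
    at a constant top speed (a stand-in for the paper's motion model)."""
    (px, py), speed = player["pos"], player["speed"]
    x, y = point
    return math.hypot(x - px, y - py) / speed

def dominant_regions(players, width, height, step=1.0):
    """Assign each grid point to the player who can reach it first,
    i.e. a Voronoi partition under the time function above."""
    owner = {}
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            fastest = min(players, key=lambda p: arrival_time(p, (x, y)))
            owner[(x, y)] = fastest["name"]
            x += step
        y += step
    return owner
```

Summing the cells owned by each team's players per frame gives a simple quantitative measure of territorial control over the course of a game.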
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
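The interchangeability of the force of mortality and the hazard rate can be checked numerically for the simplest case mentioned, the exponential distribution, where S(t) = exp(-λt) and the hazard h(t) = -d/dt ln S(t) is the constant λ. A minimal sketch, with illustrative function names:

```python
import math

def survival_exponential(t, lam):
    """Survival function S(t) of an exponential failure-time distribution."""
    return math.exp(-lam * t)

def hazard(surv, t, lam, dt=1e-6):
    """Numerical hazard rate h(t) = -d/dt ln S(t), by forward difference."""
    return -(math.log(surv(t + dt, lam)) - math.log(surv(t, lam))) / dt
```

For the exponential case the hazard comes out flat at λ for every age, matching a constant force of mortality; plugging a Weibull or Gompertz survival function into the same `hazard()` would instead show age-dependent risk, as in a life table.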
Evaluating management risks using landscape trajectory analysis: a case study of California fisher
Craig M. Thompson; William J. Zielinski; Kathryn L. Purcell
2011-01-01
Ecosystem management requires an understanding of how landscapes vary in space and time, how this variation can be affected by management decisions or stochastic events, and the potential consequences for species. Landscape trajectory analysis, coupled with a basic knowledge of species habitat selection, offers a straightforward approach to ecological risk analysis and...
NASA Astrophysics Data System (ADS)
Liao, Zangyi
2017-12-01
Achieving regional equalization of basic public service supply among the provinces of China is an important objective for improving people's livelihoods. To measure the degree of non-equalization of basic public service supply, this paper takes infrastructure construction, basic education services, public employment services, public health services and social security services as first-level indices, combines them with 16 second-level indices to construct a performance evaluation system, and then uses the Theil index to evaluate provincial performance with panel data from 2000 to 2012.
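The Theil index referred to above has a standard closed form for a set of provincial values x_i with mean μ: T = (1/n) Σ (x_i/μ) ln(x_i/μ). A minimal sketch of the unweighted version (the paper may use a population-weighted or decomposed variant):

```python
import math

def theil_index(values):
    """Theil T inequality index: 0 for perfect equality, larger = more unequal.

    Each value must be positive (e.g. a province's service-supply score)."""
    n = len(values)
    mu = sum(values) / n
    return sum((x / mu) * math.log(x / mu) for x in values) / n
```

A property that suits this kind of regional study is that the Theil index decomposes additively into between-region and within-region inequality components.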
Kunnuji, Michael
2014-01-01
Research has shown that in countries such as Nigeria many urban dwellers live in a state of squalour and lack the basic necessities of food, clothing and shelter. The present study set out to examine the association between forms of basic deprivation--such as food deprivation, high occupancy ratio as a form of shelter deprivation, and inadequate clothing--and two sexual outcomes--timing of onset of penetrative sex and involvement in multiple sexual partnerships. The study used survey data from a sample of 480 girls resident in Iwaya community. A survival analysis of the timing of onset of sex and a regression model for involvement in multiple sexual partnerships reveal that among the forms of deprivation explored, food deprivation is the only significant predictor of the timing of onset of sex and involvement in multiple sexual partnerships. The study concludes that the sexual activities of poor out-of-school girls are partly explained by their desire to overcome food deprivation and recommends that government and non-governmental-organisation programmes working with young people should address the problem of basic deprivation among adolescent girls.
Linear analysis of a force reflective teleoperator
NASA Technical Reports Server (NTRS)
Biggers, Klaus B.; Jacobsen, Stephen C.; Davis, Clark C.
1989-01-01
Complex force reflective teleoperation systems are often very difficult to analyze due to the large number of components and control loops involved. One mode of a force reflective teleoperator is described. An analysis of the performance of the system based on a linear analysis of the general full order model is presented. Reduced order models are derived and correlated with the full order models. Basic effects of force feedback and position feedback are examined and the effects of time delays between the master and slave are studied. The results show that with symmetrical position-position control of teleoperators, a basic trade off must be made between the intersystem stiffness of the teleoperator, and the impedance felt by the operator in free space.
Deformation Monitoring and Analysis of the LSP Landslide Based on GBInSAR
NASA Astrophysics Data System (ADS)
Zhou, L.; Guo, J.; Yang, F.
2018-05-01
Monitoring and analyzing the deformation of a river landslide in an urban area, in order to master the deformation law of the landslide, is an important means of landslide safety assessment. This paper addresses the stability of the Liu Sha Peninsula landslide during the strengthening process that followed the landslide disaster. Continuous, high-precision deformation monitoring of the landslide was carried out with the GBInSAR technique, and two-dimensional deformation time-series images of the landslide body were retrieved by a time-series analysis method. The monitoring and analysis results show that the reinforcement belt on the landslide body was basically stable: the deformation of most PS points on the reinforcement belt was within 1 mm, and the deformation of most areas of the landslide body was within 4 mm, presenting obvious nonlinear changes. The GBInSAR technique can quickly and effectively obtain complete deformation information for a river landslide and the evolution process of its deformation.
Zhang, Yong; Huo, Meirong; Zhou, Jianping; Xie, Shaofei
2010-09-01
This study presents PKSolver, a freely available menu-driven add-in program for Microsoft Excel written in Visual Basic for Applications (VBA), for solving basic problems in pharmacokinetic (PK) and pharmacodynamic (PD) data analysis. The program provides a range of modules for PK and PD analysis, including noncompartmental analysis (NCA), compartmental analysis (CA), and pharmacodynamic modeling. Two special built-in modules, multiple absorption sites (MAS) and enterohepatic circulation (EHC), were developed for fitting the double-peak concentration-time profile based on the classical one-compartment model. In addition, twenty frequently used pharmacokinetic functions were encoded as a macro and can be directly accessed in an Excel spreadsheet. To evaluate the program, a detailed comparison of PK data modeling using PKSolver and the professional PK/PD software packages WinNonlin and Scientist was performed; the parameters estimated with PKSolver were satisfactory. In conclusion, PKSolver simplifies the PK and PD data analysis process, and its output can be generated in Microsoft Word in the form of an integrated report. The program provides pharmacokinetic researchers with a fast and easy-to-use tool for routine and basic PK and PD data analysis with a more user-friendly interface. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
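The classical one-compartment, first-order absorption model that the MAS and EHC modules build on has a closed-form concentration-time curve (the Bateman equation). A minimal sketch of that textbook model, with illustrative parameter values; it is not PKSolver's own code:

```python
import math

def conc_one_compartment(t, dose, f, vd, ka, ke):
    """Plasma concentration for one-compartment first-order absorption:
    C(t) = F*D*ka / (Vd*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)),
    valid for ka != ke."""
    coef = (f * dose * ka) / (vd * (ka - ke))
    return coef * (math.exp(-ke * t) - math.exp(-ka * t))
```

A double-peak profile of the kind the MAS module fits can be mimicked by summing two such terms with different absorption lags and rate constants.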
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
The aim was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online on top of the database software using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates export sheets that can be loaded directly into R, SAS, and SPSS. The air pollution and health impact monitoring information system thus implements online statistical analysis and can provide real-time analysis results to its users.
UGV Control Interoperability Profile (IOP), Version 0
2011-12-21
task or function associated with the ID (e.g. "select asset gear" and "switch between local and Zulu time display"). Category Provides a high...CTRL-Basic Status-2: view Zulu date and time in Date-Time-Group (DTG) format; CTRL-Basic Status-3: switch between local and Zulu time display; CTRL-Basic Status-4: view unique identifier/call sign for each asset
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Traffic Driven Analysis of Cellular and WiFi Networks
ERIC Educational Resources Information Center
Paul, Utpal Kumar
2012-01-01
Since the days when Internet traffic first proliferated, the measurement, monitoring and analysis of network traffic have been critical not only to a basic understanding of large networks, but also to improvements in resource management, traffic engineering and security. At the current time, traffic in wireless local and wide area networks is facing…
A Critical Review of Line Graphs in Behavior Analytic Journals
ERIC Educational Resources Information Center
Kubina, Richard M., Jr.; Kostewicz, Douglas E.; Brennan, Kaitlyn M.; King, Seth A.
2017-01-01
Visual displays such as graphs have played an instrumental role in psychology. One discipline relies almost exclusively on graphs in both applied and basic settings, behavior analysis. The most common graphic used in behavior analysis falls under the category of time series. The line graph represents the most frequently used display for visual…
Analysis of the mechanics and deformation characteristics of optical fiber acceleration sensor
NASA Astrophysics Data System (ADS)
Liu, Zong-kai; Bo, Yu-ming; Zhou, Ben-mou; Wang, Jun; Huang, Ya-dong
2016-10-01
The optical fiber sensor holds many advantages, such as smaller volume, lighter weight, higher sensitivity, and stronger anti-interference ability. It can be applied to oil exploration to improve exploration efficiency, since the underground petroleum distribution can be obtained by detecting and analyzing the echo signals. In this paper, the cantilever beam optical fiber sensor is investigated. Specifically, the finite element analysis method is applied to analyze numerically how the elongation of the optical fiber rail slot on the surface of the PC-material fiber winding plate changes with time and force under the action of a sinusoidal force. The analysis results show that, when the upper and lower mass blocks are under the action of a sinusoidal force, the cantilever beam optical fiber sensor structure basically deforms synchronously with the force, and the optical fiber elongation has a basically linear relationship with the sinusoidal force within the time ranges 0.2-0.4 and 0.6-0.8, which is beneficial for subsequent signal acquisition and data processing.
Background Information and User’s Guide for MIL-F-9490
1975-01-01
requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting
Analysis of E-marketplace Attributes: Assessing The NATO Logistics Stock Exchange
2008-01-01
order processing time Reduction of stock levels Reduction of payment processing time Reduction of excessive stocks Reduction of maverick buying...satisfaction 4,02 0,151 3. Reduction of order processing time 4,27 0,317 15. Reduction of stock levels 3,87 0,484 4. Reduction of payment processing time...information exchange with partners in the supply chain Efficiency Basic Reduction of order processing time Efficiency Important Reduction of
Spectral methods for time dependent problems
NASA Technical Reports Server (NTRS)
Tadmor, Eitan
1990-01-01
Spectral approximations for time dependent problems are reviewed. Some basic ingredients of the spectral Fourier and Chebyshev approximation theory are discussed, and a brief survey is made of hyperbolic and parabolic time dependent problems, which are dealt with by both the energy method and the related Fourier analysis. These ideas are combined in the study of the accuracy, stability, and convergence of the spectral Fourier approximation to time dependent problems.
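As an aside, the "spectral accuracy" that motivates such Fourier approximations can be illustrated with a short NumPy sketch; the grid size and test function are choices made here for illustration, not taken from the report:

```python
import numpy as np

# Fourier spectral differentiation of a smooth periodic function, the basic
# ingredient of spectral methods for time dependent problems. Grid size and
# test function are arbitrary illustrative choices.
N = 64
x = 2 * np.pi * np.arange(N) / N                   # periodic grid on [0, 2*pi)
u = np.sin(3 * x)                                  # smooth test function
k = np.fft.fftfreq(N, d=1.0 / N)                   # integer wavenumbers
du = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))  # spectral derivative

# For smooth periodic data the error decays faster than any power of 1/N;
# here the derivative of sin(3x) is recovered to machine precision.
err = float(np.max(np.abs(du - 3 * np.cos(3 * x))))
print(err)
```

The same differentiation-in-frequency-space idea underlies the Fourier approximations to hyperbolic and parabolic problems surveyed in the report.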
Hardware Model of a Shipboard Generator
2009-05-19
controller output PM motor power RM motor resistance Td derivative time constant Tf1 fuel valve time constant Tg1 governor time constant Tg2 governor...in speed, sending a response signal to the fuel valve that regulates gas turbine power. At this point, there is an inherent variation between the...basic response analysis [5]. Electrical Power Rotor Inertia Amplifiers Fuel Valve Turbine Dynamics Rotational Friction and Windage
43 CFR 2.19 - When may the bureau extend the basic time limit?
Code of Federal Regulations, 2014 CFR
2014-10-01
... 43 Public Lands: Interior 1 2014-10-01 2014-10-01 false When may the bureau extend the basic time... INFORMATION ACT; RECORDS AND TESTIMONY Timing of Responses to Requests § 2.19 When may the bureau extend the basic time limit? (a) The bureau may extend the basic time limit if unusual circumstances exist. Before...
Bornstein, E; Monteagudo, A; Santos, R; Strock, I; Tsymbal, T; Lenchner, E; Timor-Tritsch, I E
2010-07-01
To evaluate the feasibility and the processing time of offline analysis of three-dimensional (3D) brain volumes to perform a basic, as well as a detailed, targeted, fetal neurosonogram. 3D fetal brain volumes were obtained in 103 consecutive healthy fetuses that underwent routine anatomical survey at 20-23 postmenstrual weeks. Transabdominal gray-scale and power Doppler volumes of the fetal brain were acquired by one of three experienced sonographers (an average of seven volumes per fetus). Acquisition was first attempted in the sagittal and coronal planes. When the fetal position did not enable easy and rapid access to these planes, axial acquisition at the level of the biparietal diameter was performed. Offline analysis of each volume was performed by two of the authors in a blinded manner. A systematic technique of 'volume manipulation' was used to identify a list of 25 brain dimensions/structures comprising a complete basic evaluation, intracranial biometry and a detailed targeted fetal neurosonogram. The feasibility and reproducibility of obtaining diagnostic-quality images of the different structures was evaluated, and processing times were recorded, by the two examiners. Diagnostic-quality visualization was feasible in all of the 25 structures, with an excellent visualization rate (85-100%) reported in 18 structures, a good visualization rate (69-97%) reported in five structures and a low visualization rate (38-54%) reported in two structures, by the two examiners. An average of 4.3 and 5.4 volumes were used to complete the examination by the two examiners, with a mean processing time of 7.2 and 8.8 minutes, respectively. The overall agreement rate for diagnostic visualization of the different brain structures between the two examiners was 89.9%, with a kappa coefficient of 0.5 (P < 0.001). 
In experienced hands, offline analysis of 3D brain volumes is a reproducible modality that can identify all structures necessary to complete both a basic and a detailed second-trimester fetal neurosonogram. Copyright 2010 ISUOG. Published by John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Buscema, Massimo; Massini, Giulia; Sacco, Pier Luigi
2018-02-01
This paper offers the first systematic presentation of the topological approach to the analysis of epidemic and pseudo-epidemic spatial processes. We introduce the basic concepts and proofs, and test the approach on a diverse collection of case studies of historically documented epidemic and pseudo-epidemic processes. The approach is found to consistently provide reliable estimates of the structural features of epidemic processes, and to provide useful analytical insights and interpretations of fragmentary pseudo-epidemic processes. Although this analysis has to be regarded as preliminary, we find that the approach's basic tenets are strongly corroborated by this first test and warrant future research in this vein.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
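The three model classes named in the standard (linear, quadratic, exponential) can be sketched as follows; the data values below are invented for the example and are not from the standard:

```python
import numpy as np

# Fit linear, quadratic, and exponential trend models to a short time series
# and compare residual sums of squares. Data are made up for illustration.
t = np.arange(10, dtype=float)
y = np.array([2.1, 2.9, 4.2, 5.8, 8.1, 11.0, 15.2, 20.9, 28.8, 39.5])

lin = np.polyfit(t, y, 1)            # y ~ a*t + b
quad = np.polyfit(t, y, 2)           # y ~ a*t^2 + b*t + c
expo = np.polyfit(t, np.log(y), 1)   # log y ~ a*t + b  =>  y ~ e^b * e^(a*t)

def rss(pred):
    """Residual sum of squares, used to compare competing trend models."""
    return float(np.sum((y - pred) ** 2))

rss_lin = rss(np.polyval(lin, t))
rss_quad = rss(np.polyval(quad, t))
rss_exp = rss(np.exp(np.polyval(expo, t)))
print(rss_lin, rss_quad, rss_exp)
```

For these (deliberately exponential-looking) data the exponential model fits far better than the straight line, which is the kind of comparison the standard's model-fitting guidance supports.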
ERIC Educational Resources Information Center
Hannan, Michael T.
This technical document, part of a series of chapters described in SO 011 759, describes a basic model of panel analysis used in a study of the causes of institutional and structural change in nations. Panel analysis is defined as a record of state occupancy of a sample of units at two or more points in time; for example, voters disclose voting…
A comparison of experimental and calculated thin-shell leading-edge buckling due to thermal stresses
NASA Technical Reports Server (NTRS)
Jenkins, Jerald M.
1988-01-01
High-temperature thin-shell leading-edge buckling test data are analyzed using NASA structural analysis (NASTRAN) as a finite element tool for predicting thermal buckling characteristics. Buckling points are predicted for several combinations of edge boundary conditions. The problem of relating the appropriate plate area to the edge stress distribution and the stress gradient is addressed in terms of analysis assumptions. Local plasticity was found to occur on the specimen analyzed, and this tended to simplify the basic problem since it effectively equalized the stress gradient from loaded edge to loaded edge. The initial loading was found to be difficult to select for the buckling analysis because of the transient nature of thermal stress. Multiple initial model loadings are likely required for complicated thermal stress time histories before a pertinent finite element buckling analysis can be achieved. The basic mode shapes determined from experimentation were correctly identified from computation.
Kapp, Nikki; Barnes, William J; Richard, Tom L; Anderson, Charles T
2015-07-01
Lignin is a complex polyphenolic heteropolymer that is abundant in the secondary cell walls of plants and functions in growth and defence. It is also a major barrier to the deconstruction of plant biomass for bioenergy production, but the spatiotemporal details of how lignin is deposited in actively lignifying tissues and the precise relationships between wall lignification in different cell types and developmental events, such as flowering, are incompletely understood. Here, the lignin-detecting fluorogenic dye, Basic Fuchsin, was adapted to enable comparative fluorescence-based imaging of lignin in the basal internodes of three Brachypodium distachyon ecotypes that display divergent flowering times. It was found that the extent and intensity of Basic Fuchsin fluorescence increase over time in the Bd21-3 ecotype, that Basic Fuchsin staining is more widespread and intense in 4-week-old Bd21-3 and Adi-10 basal internodes than in Bd1-1 internodes, and that Basic Fuchsin staining reveals subcellular patterns of lignin in vascular and interfascicular fibre cell walls. Basic Fuchsin fluorescence did not correlate with lignin quantification by acetyl bromide analysis, indicating that whole-plant and subcellular lignin analyses provide distinct information about the extent and patterns of lignification in B. distachyon. Finally, it was found that flowering time correlated with a transient increase in total lignin, but did not correlate strongly with the patterning of stem lignification, suggesting that additional developmental pathways might regulate secondary wall formation in grasses. This study provides a new comparative tool for imaging lignin in plants and helps inform our views of how lignification proceeds in grasses. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.
Eisner, Emily; Drake, Richard; Lobban, Fiona; Bucci, Sandra; Emsley, Richard; Barrowclough, Christine
2018-02-01
Early signs interventions show promise but could be further developed. A recent review suggested that 'basic symptoms' should be added to conventional early signs to improve relapse prediction. This study builds on preliminary evidence that basic symptoms predict relapse and aimed to: 1. examine which phenomena participants report prior to relapse and how they describe them; 2. determine the best way of identifying pre-relapse basic symptoms; 3. assess current practice by comparing self- and casenote-reported pre-relapse experiences. Participants with non-affective psychosis were recruited from UK mental health services. In-depth interviews (n=23), verbal checklists of basic symptoms (n=23) and casenote extracts (n=208) were analysed using directed content analysis and non-parametric statistical tests. Three-quarters of interviewees reported basic symptoms and all reported conventional early signs and 'other' pre-relapse experiences. Interviewees provided rich descriptions of basic symptoms. Verbal checklist interviews asking specifically about basic symptoms identified these experiences more readily than open questions during in-depth interviews. Only 5% of casenotes recorded basic symptoms; interviewees were 16 times more likely to report basic symptoms than their casenotes did. The majority of interviewees self-reported pre-relapse basic symptoms when asked specifically about these experiences but very few casenotes reported these symptoms. Basic symptoms may be potent predictors of relapse that clinicians miss. A self-report measure would aid monitoring of basic symptoms in routine clinical practice and would facilitate a prospective investigation comparing basic symptoms and conventional early signs as predictors of relapse. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi
2017-07-01
Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: computer program for beginners; Epilogue; Bibliography; Index.
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1991-01-01
We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.
Nonstationary time series prediction combined with slow feature analysis
NASA Astrophysics Data System (ADS)
Wang, G.; Chen, X.
2015-07-01
Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique of obtaining the driving forces of a time series from the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic theory of the technique is to consider the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skills.
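A minimal linear version of the SFA step described above can be sketched as follows; this is a generic textbook formulation (whiten the inputs, then take the direction whose time derivative has least variance), not the authors' exact implementation:

```python
import numpy as np

def sfa_slowest(X):
    """Linear slow feature analysis sketch: return the slowest-varying
    unit-variance linear combination of the columns of X (shape (T, n))."""
    X = X - X.mean(axis=0)
    cov = X.T @ X / len(X)
    d, E = np.linalg.eigh(cov)
    Z = X @ (E / np.sqrt(d))        # whitened signals
    dZ = np.diff(Z, axis=0)         # finite-difference time derivatives
    dcov = dZ.T @ dZ / len(dZ)
    dd, dE = np.linalg.eigh(dcov)   # ascending eigenvalues
    return Z @ dE[:, 0]             # direction of smallest derivative variance

# Toy check: a slow drift mixed into two channels with a fast oscillation;
# SFA should recover the slow drift (up to sign and scale).
T = 1000
t = np.linspace(0, 2 * np.pi, T)
slow, fast = np.sin(t), np.sin(50 * t)
X = np.column_stack([slow + 0.5 * fast, slow - 0.5 * fast])
corr = abs(np.corrcoef(sfa_slowest(X), slow)[0, 1])
print(corr)  # close to 1
```

In the paper's setting the recovered slow signal plays the role of the external driving force that is then fed into the predictive model as an extra state variable.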
Mathematical model for HIV spreads control program with ART treatment
NASA Astrophysics Data System (ADS)
Maimunah; Aldila, Dipo
2018-03-01
In this article, using a deterministic approach with a seven-dimensional nonlinear ordinary differential equation model, we establish a mathematical model for the spread of HIV with an ART treatment intervention. In a simplified model with no ART treatment, the disease-free and endemic equilibrium points were established analytically, along with the basic reproduction number. The local stability criteria of the disease-free equilibrium and the existence criteria of the endemic equilibrium were analyzed. We find that the endemic equilibrium exists when the basic reproduction number is larger than one. From the sensitivity analysis of the basic reproduction number of the complete model (with ART treatment), we find that increasing the number of infected humans who follow the ART treatment program will reduce the basic reproduction number. We also simulate this result in a numerical experiment on the autonomous system to show how the treatment intervention reduces the infected population during the intervention time period.
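The threshold role of the basic reproduction number can be illustrated with a much simpler stand-in model; the plain SIR system below is an assumption for illustration only, not the paper's seven-dimensional HIV model, and its parameter values are arbitrary:

```python
# Minimal SIR system showing the threshold behavior of R0 = beta / gamma:
# below one the outbreak fizzles, above one a large epidemic occurs.
def attack_rate(beta, gamma, days=400, dt=0.1, i0=0.01):
    s, i = 1.0 - i0, i0
    for _ in range(int(days / dt)):        # forward Euler integration
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + ds * dt, i + di * dt
    return 1.0 - s                         # fraction ever infected

gamma = 0.1
low = attack_rate(beta=0.05, gamma=gamma)  # R0 = 0.5 < 1: outbreak dies out
high = attack_rate(beta=0.3, gamma=gamma)  # R0 = 3.0 > 1: large epidemic
print(low, high)
```

An intervention such as ART effectively lowers the transmission parameter, which is the mechanism by which the paper's sensitivity result pushes the basic reproduction number down.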
14 CFR 121.422 - Aircraft dispatchers: Initial and transition ground training.
Code of Federal Regulations, 2011 CFR
2011-01-01
... computations; (iv) Basic airplane performance dispatch requirements and procedures; (v) Flight planning including track selection, flight time analysis, and fuel requirements; and (vi) Emergency procedures. (3... procedures, and other subjects having a bearing on dispatcher duties and responsibilities; (ii) Flight...
14 CFR 121.422 - Aircraft dispatchers: Initial and transition ground training.
Code of Federal Regulations, 2012 CFR
2012-01-01
... computations; (iv) Basic airplane performance dispatch requirements and procedures; (v) Flight planning including track selection, flight time analysis, and fuel requirements; and (vi) Emergency procedures. (3... procedures, and other subjects having a bearing on dispatcher duties and responsibilities; (ii) Flight...
14 CFR 121.422 - Aircraft dispatchers: Initial and transition ground training.
Code of Federal Regulations, 2013 CFR
2013-01-01
... computations; (iv) Basic airplane performance dispatch requirements and procedures; (v) Flight planning including track selection, flight time analysis, and fuel requirements; and (vi) Emergency procedures. (3... procedures, and other subjects having a bearing on dispatcher duties and responsibilities; (ii) Flight...
14 CFR 121.422 - Aircraft dispatchers: Initial and transition ground training.
Code of Federal Regulations, 2010 CFR
2010-01-01
... computations; (iv) Basic airplane performance dispatch requirements and procedures; (v) Flight planning including track selection, flight time analysis, and fuel requirements; and (vi) Emergency procedures. (3... procedures, and other subjects having a bearing on dispatcher duties and responsibilities; (ii) Flight...
14 CFR 121.422 - Aircraft dispatchers: Initial and transition ground training.
Code of Federal Regulations, 2014 CFR
2014-01-01
... computations; (iv) Basic airplane performance dispatch requirements and procedures; (v) Flight planning including track selection, flight time analysis, and fuel requirements; and (vi) Emergency procedures. (3... procedures, and other subjects having a bearing on dispatcher duties and responsibilities; (ii) Flight...
Komatsu, Setsuko; Takasaki, Hironori
2009-07-01
Genes regulated by gibberellin (GA) during leaf sheath elongation in rice seedlings were identified using the transcriptome approach. mRNA from the basal regions of leaf sheaths treated with GA3 was analyzed by high-coverage gene expression profiling. 33,004 peaks were detected, and 30 transcripts showed significant changes in the presence of GA3. Among these, basic helix-loop-helix transcription factor (AK073385) was significantly upregulated. Quantitative PCR analysis confirmed that expression of AK073385 was controlled by GA3 in a time- and dose-dependent manner. Basic helix-loop-helix transcription factor (AK073385) is therefore involved in the regulation of gene expression by GA3.
Apollo 15 time and motion study
NASA Technical Reports Server (NTRS)
Kubis, J. F.; Elrod, J. T.; Rusnak, R.; Barnes, J. E.
1972-01-01
A time and motion study of Apollo 15 lunar surface activity led to examination of four distinct areas of crewmen activity. These areas are: an analysis of lunar mobility, a comparative analysis of tasks performed in 1-g training and lunar EVA, an analysis of the metabolic cost of two activities that are performed in several EVAs, and a fall/near-fall analysis. An analysis of mobility showed that the crewmen used three basic mobility patterns (modified walk, hop, side step) while on the lunar surface. These mobility patterns were utilized as adaptive modes to compensate for the uneven terrain and varied soil conditions that the crewmen encountered. A comparison of the time required to perform tasks at the final 1-g lunar EVA training sessions and the time required to perform the same task on the lunar surface indicates that, in almost all cases, it took significantly more time (on the order of 40%) to perform tasks on the moon. This increased time was observed even after extraneous factors (e.g., hardware difficulties) were factored out.
Okosun, Kazeem O; Makinde, Oluwole D; Takaidza, Isaac
2013-01-01
The aim of this paper is to analyze the recruitment effects of susceptible and infected individuals in order to assess the productivity of an organizational labor force in the presence of HIV/AIDS with preventive and HAART treatment measures in enhancing the workforce output. We consider constant controls as well as time-dependent controls. In the constant control case, we calculate the basic reproduction number and investigate the existence and stability of equilibria. The model is found to exhibit backward and Hopf bifurcations, implying that for the disease to be eradicated, the basic reproductive number must be below a critical value less than one. We also investigate, by calculating sensitivity indices, the sensitivity of the basic reproductive number to the model's parameters. In the time-dependent control case, we use Pontryagin's maximum principle to derive necessary conditions for the optimal control of the disease. Finally, numerical simulations are performed to illustrate the analytical results. The cost-effectiveness analysis results show that optimal efforts on recruitment (HIV screening of applicants, etc.) are not the most cost-effective strategy for enhancing productivity in the organizational labor force. Hence, to enhance employees' productivity, effective education programs and strict adherence to preventive measures should be promoted.
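The sensitivity indices mentioned above are commonly computed as normalized forward sensitivity indices, Upsilon_p = (dR0/dp) * (p / R0). A sketch of that calculation follows; the simple form R0 = beta / gamma is assumed here for illustration and is not the paper's expression:

```python
# Normalized forward sensitivity index of R0 with respect to a parameter,
# approximated by a finite difference. A value of +1 means a 1% increase in
# the parameter raises R0 by about 1%; -1 means it lowers R0 by about 1%.
def sensitivity_index(r0_func, params, name, h=1e-6):
    p = params[name]
    bumped = dict(params, **{name: p + h})
    dr0 = (r0_func(**bumped) - r0_func(**params)) / h   # finite difference
    return dr0 * p / r0_func(**params)

r0 = lambda beta, gamma: beta / gamma   # assumed illustrative form of R0
params = {"beta": 0.3, "gamma": 0.1}
s_beta = round(sensitivity_index(r0, params, "beta"), 4)
s_gamma = round(sensitivity_index(r0, params, "gamma"), 4)
print(s_beta, s_gamma)  # +1.0 and -1.0 for this simple R0
```

For more elaborate R0 expressions the same routine ranks parameters by how strongly they drive transmission, which is how such analyses identify the most effective control targets.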
NASA Astrophysics Data System (ADS)
Shahzad, Syed Jawad Hussain; Nor, Safwan Mohd; Kumar, Ronald Ravinesh; Mensi, Walid
2017-01-01
This study examines the interdependence and contagion among US industry-level credit markets. We use daily data for 11 industries from 17 December 2007 to 31 December 2014 for time-frequency analysis, namely wavelet squared coherence analysis. The empirical analysis reveals that the Basic Materials (Utilities) industry credit market has the highest (lowest) interdependence with other industries. The Basic Materials credit market passes a cyclical effect to all other industries. The 'shift-contagion' effect defined by Forbes and Rigobon (2002) is examined using elliptical and Archimedean copulas on the short-run decomposed series obtained through Variational Mode Decomposition (VMD). The contagion effects between US industry-level credit markets mainly occurred during the global financial crisis of 2007-08.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
A modeling analysis program for the JPL table mountain Io sodium cloud data
NASA Technical Reports Server (NTRS)
Smyth, William H.; Goldberg, Bruce A.
1988-01-01
Research in the third and final year of this project is divided into three main areas: (1) completion of data processing and calibration for 34 of the 1981 Region B/C images, selected from the massive JPL sodium cloud data set; (2) identification and examination of the basic features and observed changes in the morphological characteristics of the sodium cloud images; and (3) successful physical interpretation of these basic features and observed changes using the highly developed numerical sodium cloud model at AER. The modeling analysis has led to a number of definite conclusions regarding the local structure of Io's atmosphere, the gas escape mechanism at Io, and the presence of an east-west electric field and a System III longitudinal asymmetry in the plasma torus. Large scale stability, as well as some smaller scale time variability for both the sodium cloud and the structure of the plasma torus over a several year time period are also discussed.
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth; Kim, Hak
2014-01-01
An informative session on SRAM FPGA basics. The session presents a framework for fault injection techniques applied to Xilinx Field Programmable Gate Arrays (FPGAs), introduces an overlooked time component showing that fault injection is impractical as a stand-alone characterization tool for most real designs, and demonstrates procedures that benefit from fault injection error analysis.
Orthopedic resident work-shift analysis: are we making the best use of resident work hours?
Hamid, Kamran S; Nwachukwu, Benedict U; Hsu, Eugene; Edgerton, Colston A; Hobson, David R; Lang, Jason E
2014-01-01
Surgery programs have been tasked to meet rising demands in patient surgical care while simultaneously providing adequate resident training in the midst of increasing resident work-hour restrictions. The purpose of this study was to quantify orthopedic surgery resident workflow and identify areas needing improved resident efficiency. We hypothesized that residents spend a disproportionate amount of time involved in activities that do not relate directly to patient care or maximize resident education. We observed 4 orthopedic surgery residents on the orthopedic consult service at a major tertiary care center for 72 consecutive hours (6 consecutive shifts). We collected minute-by-minute data using predefined work-task criteria: direct new patient contact, direct existing patient contact, communications with other providers, documentation/administrative time, transit time, and basic human needs. A seventh category, termed standby, comprised the remaining, less-productive work. In a 720-minute shift, residents spent, on average: 191 minutes (26.5%) performing documentation/administrative duties, 167.0 minutes (23.2%) in direct contact with new patient consults, 129.6 minutes (17.1%) in communication with other providers regarding patients, 116.2 minutes (16.1%) in standby, 63.7 minutes (8.8%) in transit, 32.6 minutes (4.5%) with existing patients, and 20 minutes (2.7%) attending to basic human needs. Residents performed an additional 130 minutes of administrative work off duty. Secondary analysis revealed that residents were more likely to perform administrative work than to directly interact with existing patients (p = 0.006) or attend to basic human needs (p = 0.003). Orthopedic surgery residents spend a large proportion of their time performing documentation/administrative-type work, and their workday can be operationally optimized to minimize nonvalue-adding tasks.
Formal workflow analysis may aid program directors in systematic process improvements to better align resident skills with tasks. III. Published by Elsevier Inc.
Time series regression studies in environmental epidemiology.
Bhaskaran, Krishnan; Gasparrini, Antonio; Hajat, Shakoor; Smeeth, Liam; Armstrong, Ben
2013-08-01
Time series regression studies have been widely used in environmental epidemiology, notably in investigating the short-term associations between exposures such as air pollution, weather variables or pollen, and health outcomes such as mortality, myocardial infarction or disease-specific hospital admissions. Typically, for both exposure and outcome, data are available at regular time intervals (e.g. daily pollution levels and daily mortality counts) and the aim is to explore short-term associations between them. In this article, we describe the general features of time series data, and we outline the analysis process, beginning with descriptive analysis, then focusing on issues in time series regression that differ from other regression methods: modelling short-term fluctuations in the presence of seasonal and long-term patterns, dealing with time varying confounding factors and modelling delayed ('lagged') associations between exposure and outcome. We finish with advice on model checking and sensitivity analysis, and some common extensions to the basic model.
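The basic model described here, an outcome regressed on a lagged exposure while seasonal and long-term patterns are controlled, can be sketched with simulated data. This is a least-squares toy under invented data; the literature typically uses Poisson regression for daily counts:

```python
import numpy as np

# Simulate two years of daily data in which the outcome depends on
# yesterday's exposure plus an annual seasonal cycle, then recover the
# lag-1 exposure effect with harmonic terms controlling for season.
rng = np.random.default_rng(0)
n = 730
day = np.arange(n)
season = 10 * np.sin(2 * np.pi * day / 365.25)
pollution = 50 + 5 * rng.standard_normal(n)
lagged = np.roll(pollution, 1)                # exposure lagged one day
outcome = 100 + season + 0.3 * lagged + rng.standard_normal(n)

# Design matrix: intercept, lagged exposure, annual sine/cosine pair.
X = np.column_stack([
    np.ones(n),
    lagged,
    np.sin(2 * np.pi * day / 365.25),
    np.cos(2 * np.pi * day / 365.25),
])[1:]                                        # drop day 0 (np.roll wraps)
beta, *_ = np.linalg.lstsq(X, outcome[1:], rcond=None)
print(beta[1])  # estimated lag-1 effect, close to the true 0.3
```

Distributed-lag and time-varying-confounder extensions discussed in the article add further lag columns and smooth functions of time to the same design matrix.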
Biosensing Technologies for Mycobacterium tuberculosis Detection: Status and New Developments
Zhou, Lixia; He, Xiaoxiao; He, Dinggeng; Wang, Kemin; Qin, Dilan
2011-01-01
Biosensing technologies promise to improve Mycobacterium tuberculosis (M. tuberculosis) detection and management in clinical diagnosis, food analysis, bioprocess, and environmental monitoring. A variety of portable, rapid, and sensitive biosensors with immediate “on-the-spot” interpretation have been developed for M. tuberculosis detection based on different biological elements recognition systems and basic signal transducer principles. Here, we present a synopsis of current developments of biosensing technologies for M. tuberculosis detection, which are classified on the basis of basic signal transducer principles, including piezoelectric quartz crystal biosensors, electrochemical biosensors, and magnetoelastic biosensors. Special attention is paid to the methods for improving the framework and analytical parameters of the biosensors, including sensitivity and analysis time as well as automation of analysis procedures. Challenges and perspectives of biosensing technologies development for M. tuberculosis detection are also discussed in the final part of this paper. PMID:21437177
NASA Technical Reports Server (NTRS)
Miller, G. K., Jr.; Riley, D. R.
1978-01-01
The effect of secondary tasks in determining permissible time delays in visual-motion simulation of a pursuit tracking task was examined. A single subject, a single set of aircraft handling qualities, and a single motion condition were used in tracking a target aircraft that oscillates sinusoidally in altitude. In addition to the basic simulator delays, the results indicate that the permissible time delay is about 250 msec for either a tapping task, an adding task, or an audio task, and is approximately 125 msec less than when no secondary task is involved. The magnitudes of the primary task performance measures, however, differ only for the tapping task. A power spectral-density analysis basically confirms the result obtained by comparing the root-mean-square performance measures. For all three secondary tasks, the total pilot workload was quite high.
Broadening the trans-contextual model of motivation: A study with Spanish adolescents.
González-Cutre, D; Sicilia, Á; Beas-Jiménez, M; Hagger, M S
2014-08-01
The original trans-contextual model of motivation proposed that autonomy support from teachers develops students' autonomous motivation in physical education (PE), and that autonomous motivation is transferred from PE contexts to physical activity leisure-time contexts, and predicts attitudes, perceived behavioral control and subjective norms, and forming intentions to participate in future physical activity behavior. The purpose of this study was to test an extended trans-contextual model of motivation including autonomy support from peers and parents and basic psychological needs in a Spanish sample. School students (n = 400) aged between 12 and 18 years completed measures of perceived autonomy support from three sources, autonomous motivation and constructs from the theory of planned behavior at three different points in time and in two contexts, PE and leisure-time. A path analysis controlling for past physical activity behavior supported the main postulates of the model. Autonomous motivation in a PE context predicted autonomous motivation in a leisure-time physical activity context, perceived autonomy support from teachers predicted satisfaction of basic psychological needs in PE, and perceived autonomy support from peers and parents predicted need satisfaction in leisure-time. This study provides a cross-cultural replication of the trans-contextual model of motivation and broadens it to encompass basic psychological needs. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question.
The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values. None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The risk reduction ratio (RRR) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event.
Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The risk increase ratio (RIR) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
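The AND/OR-gate probability propagation and the RRR/RIR-style importance measures described above can be sketched for a toy three-event tree (the event names, probabilities, and tree shape are invented and far simpler than MSET's actual model; basic events are assumed independent, as in the abstract):

```python
from functools import reduce

def and_gate(probs):
    """All inputs must fail: product of the failure probabilities."""
    return reduce(lambda a, b: a * b, probs, 1.0)

def or_gate(probs):
    """Any input failing suffices: 1 minus the product of survival probabilities."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

def top_event(p):
    """Toy tree: TOP = OR(AND(e1, e2), e3)."""
    return or_gate([and_gate([p["e1"], p["e2"]]), p["e3"]])

base = {"e1": 0.2, "e2": 0.3, "e3": 0.1}   # illustrative failure probabilities
baseline = top_event(base)

def risk_with(p, event, value):
    """Recompute the top event with one basic event forced to a given value."""
    q = dict(p)
    q[event] = value
    return top_event(q)

# Risk-reduction analogue: improve one event to perfect performance (p = 0).
rrr = {e: baseline - risk_with(base, e, 0.0) for e in base}
# Risk-increase analogue: fail one event completely (p = 1).
rir = {e: risk_with(base, e, 1.0) - baseline for e in base}
```

Here e3 sits directly under the OR-gate, so it dominates both importance rankings, which is exactly the kind of structural insight the importance reports are meant to surface.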
Research on multi-user encrypted search scheme in cloud environment
NASA Astrophysics Data System (ADS)
Yu, Zonghua; Lin, Sui
2017-05-01
Aiming at the problems of existing multi-user encrypted search schemes in the cloud computing environment, a basic multi-user encrypted search scheme is first proposed and then extended with anonymous hierarchical management of authority. Compared with most existing schemes, this scheme protects not only keyword information but also user identity privacy; at the same time, data owners, rather than the cloud server, directly control user query permissions. In addition, special query-key generation rules enable hierarchical management of users' query permissions. The security analysis shows that the scheme is safe, and the performance analysis and experimental data show that it is practicable.
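The paper's construction is not reproduced here, but the general idea behind encrypted keyword search, namely matching opaque per-keyword tokens so the server never sees plaintext keywords, can be sketched as follows (the HMAC-token design, the key, and the index are illustrative assumptions, not the authors' scheme):

```python
import hmac
import hashlib

def token(key: bytes, keyword: str) -> str:
    """Deterministic search token: the server sees only this opaque value."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

owner_key = b"owner-secret"   # hypothetical query key held by the data owner

# The owner uploads encrypted documents together with keyword tokens.
index = {
    "doc1": {token(owner_key, w) for w in ["cloud", "search"]},
    "doc2": {token(owner_key, w) for w in ["privacy"]},
}

def search(trapdoor: str):
    """Server-side match: compares tokens only, learning nothing else."""
    return sorted(doc for doc, toks in index.items() if trapdoor in toks)

hits = search(token(owner_key, "cloud"))
```

Because only the data owner holds `owner_key`, handing out (or withholding) derived query keys is what lets the owner, not the cloud server, control who may search, which is the access-control property the abstract emphasises.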
ERIC Educational Resources Information Center
Carroll, John B.
Fifty-five recent studies of individual differences (IDs) in elementary cognitive tasks (ECTs) are reviewed. Twenty-five data sets are examined, analyzed, or reanalyzed by factor analysis. The following promising dimensions are identified: basic perceptual processes, reaction and movement times, mental comparison and recognition tasks, retrieval…
An Introduction to Fast Fourier Transforms through the Study of Oscillating Reactions.
ERIC Educational Resources Information Center
Eastman, M. P.; And Others
1986-01-01
Discusses an experiment designed to introduce students to the basic principles of the fast Fourier transform and Fourier smoothing through transformation of time-dependent optical absorption data from an oscillating reaction. Uses the Belousov-Zhabotinskii reaction. Describes the experimental setup and data analysis techniques.
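A sketch of the Fourier-smoothing step described, applied to a synthetic noisy oscillation rather than real Belousov-Zhabotinskii absorbance data (the sampling grid, noise level, and cutoff of 12 retained coefficients are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 512)
signal = np.sin(2 * np.pi * 0.5 * t)                 # slow "oscillating reaction"
noisy = signal + 0.3 * rng.standard_normal(t.size)   # measurement noise

def fourier_smooth(y, keep):
    """Transform, zero all but the lowest `keep` rfft bins, transform back."""
    spec = np.fft.rfft(y)
    spec[keep:] = 0.0
    return np.fft.irfft(spec, n=y.size)

smoothed = fourier_smooth(noisy, keep=12)
err_noisy = np.mean((noisy - signal) ** 2)
err_smooth = np.mean((smoothed - signal) ** 2)
```

The oscillation occupies only the lowest frequency bins, while the noise is spread across all of them, so truncating the spectrum removes most of the noise power; that is the entire principle of Fourier smoothing the experiment teaches.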
Ayres-de-Campos, Diogo; Rei, Mariana; Nunes, Inês; Sousa, Paulo; Bernardes, João
2017-01-01
SisPorto 4.0 is the most recent version of a program for the computer analysis of cardiotocographic (CTG) signals and ST events, which has been adapted to the 2015 International Federation of Gynaecology and Obstetrics (FIGO) guidelines for intrapartum foetal monitoring. This paper provides a detailed description of the analysis performed by the system, including the signal-processing algorithms involved in identification of basic CTG features and the resulting real-time alerts.
ERIC Educational Resources Information Center
Barefield, Robert
2006-01-01
Self-analysis is a basic component of artistic development. For the singer, self-analysis is equally important, but the steps for improvement may be less visible. As Richard Alderson has noted, a singer "hears his voice from the inside through the bony structure of the head rather than outside through the eardrum. We as singers are doomed to a…
Springer, Judy B; Lamborn, Susie D; Pollard, Diane M
2013-01-01
Drawing from self-determination theory, this study investigated adults' perceptions of the process of long-term maintenance of physical activity and how it may relate to their self-identity. Qualitative study included 22 in-depth interviews and participants' recorded personal reflective journals. Health/fitness facility in a Midwestern city. Purposeful sample of 12 adult (age range 29-73 years) members who had engaged in regular physical activity for at least 3 years. Data were collected on participants' perceptions of processes associated with physical activity maintenance. Grounded theory data analysis techniques were used to develop an understanding of participants' long-term physical activity adherence. RESULTS. Analysis revealed three themes organized around basic psychological need satisfaction: (1) Relatedness included receiving and giving support. (2) Competence included challenge and competition, managing weight, and strategies for health management. (3) Autonomy included confidence in the established routine, valuing fitness status, and feeling self-directed. The final theme of physically active self included the personal fit of an active lifestyle, identity as an active person, and attachment to physical activity as life enhancing. Our results suggest that long-term physical activity adherence may be strengthened by promotion of the individual's basic psychological need satisfaction. Adherence is most likely to occur when the value of participation becomes internalized over time as a component of the physically active self.
1978-12-31
Dielectric Discharge ... 3.2.1 Total Emitted Charge ... 3.2.2 Emission Time History ... taken to be a rise time of 10 ns and a fall time of 10 to 100 ns. In addition, a physical model of the discharge mechanism has been developed in which ... scale model of the P78-2, dubbed the SCATSAT, was constructed whose design was chosen to simulate the basic structure of the real satellite, including the
The horse-collar aurora - A frequent pattern of the aurora in quiet times
NASA Technical Reports Server (NTRS)
Hones, E. W., Jr.; Craven, J. D.; Frank, L. A.; Evans, D. S.; Newell, P. T.
1989-01-01
The frequent appearance of the 'horse-collar aurora' pattern in quiet-time DE 1 images is reported, presenting a two-hour image sequence that displays the basic features and shows that it sometimes evolves toward the theta configuration. There is some evidence for interplanetary magnetic field B(y) influence on the temporal development of the pattern. A preliminary statistical analysis finds the pattern appearing in one-third or more of the image sequences recorded during quiet times.
[Approach to the Development of Mind and Persona].
Sawaguchi, Toshiko
2018-01-01
To enable health specialists working in the regional health field to access medical specialists, the possibility of using a voice approach for dissociative identity disorder (DID) patients as a health assessment for medical access (HAMA) was investigated. The first step was to determine whether the plural personae in a single DID patient can be discriminated by voice analysis. Voices of DID patients, including those with different personae, were extracted from YouTube and were analysed using the software PRAAT with basic frequency, oral factors, chin factors and tongue factors. In addition, RAKUGO storyteller voices, produced artificially and dramatically, were analysed in the same manner. Quantitative and qualitative analyses were carried out, and a nested logistic regression and a nested generalized linear model were developed. The voices from different personae in one DID patient could be easily distinguished visually using the basic frequency curve, cluster analysis and factor analysis. In the canonical analysis, only Roy's maximum root was <0.01. In the nested generalized linear model, the model using a standard deviation (SD) indicator fitted best, and some other possibilities are shown here. In DID patients, a short transition time among plural personae can lead to risky situations such as suicide, so if the voice approach can show the time threshold of changes between the different personae, it would be useful as an access assessment in the form of a simple HAMA.
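The basic-frequency curve underlying such voice analysis can be sketched with a toy autocorrelation pitch estimator; this merely stands in for PRAAT's far more robust algorithm, and the 200 Hz test tone is synthetic:

```python
import math

def autocorr_f0(samples, rate, fmin=75.0, fmax=500.0):
    """Estimate fundamental frequency as the lag maximising autocorrelation."""
    lo, hi = int(rate / fmax), int(rate / fmin)   # lag range for the F0 band
    best_lag, best_r = lo, float("-inf")
    for lag in range(lo, hi + 1):
        r = sum(samples[i] * samples[i - lag] for i in range(lag, len(samples)))
        if r > best_r:
            best_r, best_lag = r, lag
    return rate / best_lag

rate = 8000                                       # Hz, hypothetical sampling rate
tone = [math.sin(2 * math.pi * 200 * n / rate) for n in range(800)]
f0 = autocorr_f0(tone, rate)
```

Run frame by frame over a recording, this kind of estimator produces the basic frequency contour that the study inspects for persona changes; real speech additionally needs voicing detection and octave-error handling.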
Nogami, Kentaro; Taniguchi, Shogo; Ichiyama, Tomoko
2016-01-01
The aim of this study was to investigate the correlation between basic life support skills in dentists who had completed the American Heart Association's Basic Life Support (BLS) Healthcare Provider qualification and time since course completion. Thirty-six dentists who had completed the 2005 BLS Healthcare Provider course participated in the study. We asked participants to perform 2 cycles of cardiopulmonary resuscitation on a mannequin and evaluated basic life support skills. Dentists who had previously completed the BLS Healthcare Provider course displayed prolonged reaction times, and the quality of their basic life support skills deteriorated rapidly. There were no correlations between basic life support skills and time since course completion. Our results suggest that basic life support skills deteriorate rapidly for dentists who have completed the BLS Healthcare Provider course. Newer guidelines stressing chest compressions over ventilation may help improve performance over time, allowing better cardiopulmonary resuscitation in dental office emergencies. Moreover, it may be effective to provide a more specialized version of the life support course to train the dentists, stressing issues that may be more likely to occur in the dental office.
Superposition-Based Analysis of First-Order Probabilistic Timed Automata
NASA Astrophysics Data System (ADS)
Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph
This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
Basic gait analysis based on continuous wave radar.
Zhang, Jun
2012-09-01
A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. The methods for extracting the gait parameters from the spectrogram are studied in depth and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared. The gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
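The time-frequency analysis at the heart of this method can be sketched as a bare-bones short-time Fourier transform; the input below is a synthetic two-tone signal that only mimics a micro-Doppler frequency step between gait phases, not real radar data:

```python
import numpy as np

def stft(x, frame=128, hop=64):
    """Magnitude spectrogram: windowed frames, rfft along each frame."""
    window = np.hanning(frame)
    frames = [x[i:i + frame] * window for i in range(0, len(x) - frame + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))   # (time, frequency)

rate = 1024                                    # Hz, hypothetical sampling rate
t = np.arange(2 * rate) / rate
# Tone steps from 50 Hz to 200 Hz halfway through, a crude stand-in for a
# micro-Doppler shift as limb velocity changes.
x = np.where(t < 1.0, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 200 * t))
spec = stft(x)

# Dominant frequency per frame; bin width is rate / frame = 8 Hz here.
dominant = spec.argmax(axis=1) * rate / 128
```

Tracking `dominant` (or richer spectrogram features) over time is exactly how gait parameters such as cadence and limb velocity are read off a micro-Doppler spectrogram.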
NASA Technical Reports Server (NTRS)
1976-01-01
Data analysis and supporting research in connection with the following objectives are discussed: (1) provide a precise and accurate geometric description of the earth's surface, (2) provide a precise and accurate mathematical description of the earth's gravitational field, and (3) determine time variations of the geometry of the ocean surface, the solid earth, the gravity field and other geophysical parameters.
Coupled dynamics analysis of wind energy systems
NASA Technical Reports Server (NTRS)
Hoffman, J. A.
1977-01-01
A qualitative description of all key elements of a complete wind energy system computer analysis code is presented. The analysis system addresses the coupled dynamics characteristics of wind energy systems, including the interactions of the rotor, tower, nacelle, power train, control system, and electrical network. The coupled dynamics are analyzed in both the frequency and time domain to provide the basic motions and loads data required for design, performance verification and operations analysis activities. Elements of the coupled analysis code were used to design and analyze candidate rotor articulation concepts. Fundamental results and conclusions derived from these studies are presented.
Modeling and simulation of count data.
Plan, E L
2014-08-13
Count data, or number of events per time interval, are discrete data arising from repeated time-to-event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
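The Poisson basics referred to above, the probability mass function and the equidispersion property (mean equal to variance), can be illustrated directly; the rate lambda = 4 and sample size are arbitrary:

```python
import math
import random

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson(lam) count."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def poisson_sample(lam, rng):
    """Inverse-CDF sampling of a Poisson count."""
    u, k, c = rng.random(), 0, poisson_pmf(0, lam)
    while c < u:
        k += 1
        c += poisson_pmf(k, lam)
    return k

lam = 4.0
rng = random.Random(42)
draws = [poisson_sample(lam, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Overdispersion, the first diagnostic the tutorial mentions, is precisely a sample variance well above the mean; it signals that a plain Poisson model is inadequate and a mixture such as the negative binomial should be considered.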
SNAP: A computer program for generating symbolic network functions
NASA Technical Reports Server (NTRS)
Lin, P. M.; Alderson, G. E.
1970-01-01
The computer program SNAP (symbolic network analysis program) generates symbolic network functions for networks containing R, L, and C type elements and all four types of controlled sources. The program is efficient with respect to program storage and execution time. A discussion of the basic algorithms is presented, together with user's and programmer's guides.
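SNAP's actual algorithms are not reproduced here, but the flavour of a symbolic network function can be sketched with exact rational-function arithmetic in the complex frequency s, applied to an illustrative RC voltage divider (the component values are hypothetical):

```python
from fractions import Fraction

def poly_mul(a, b):
    """Multiply polynomials given as coefficient lists, lowest power first."""
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [Fraction(0)] * (n - len(a))
    b = b + [Fraction(0)] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

class Rat:
    """Rational function num(s) / den(s) over exact coefficients."""
    def __init__(self, num, den):
        self.num = [Fraction(c) for c in num]
        self.den = [Fraction(c) for c in den]
    def __add__(self, o):
        return Rat(poly_add(poly_mul(self.num, o.den), poly_mul(o.num, self.den)),
                   poly_mul(self.den, o.den))
    def __truediv__(self, o):
        return Rat(poly_mul(self.num, o.den), poly_mul(self.den, o.num))

def rat_eval(r, s):
    num = sum(c * s ** i for i, c in enumerate(r.num))
    den = sum(c * s ** i for i, c in enumerate(r.den))
    return num / den

R, C = Fraction(1000), Fraction(1, 1000000)   # 1 kOhm, 1 uF (illustrative)
Z_R = Rat([R], [1])                            # impedance R
Z_C = Rat([1], [0, C])                         # impedance 1/(C s)
# Voltage divider: H(s) = Z_C / (Z_R + Z_C) = 1 / (1 + R C s)
H = Z_C / (Z_R + Z_C)
```

Keeping coefficients symbolic (here exact rationals) rather than evaluating numerically is what distinguishes a symbolic network function from ordinary AC analysis; SNAP does this for general RLC networks with all four controlled-source types.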
NASA Astrophysics Data System (ADS)
Ikeda, Fujio; Toyama, Shigehiro; Ishiduki, Souta; Seta, Hiroaki
2016-09-01
Maritime accidents of small ships continue to increase in number. One of the major factors is poor manoeuvrability of the Manual Hydraulic Steering Mechanism (MHSM) in common use. The manoeuvrability can be improved by using the Electronic Control Steering Mechanism (ECSM). This paper conducts stability analyses of a pleasure boat controlled by human models in view of path following on a target course, in order to establish design guidelines for the ECSM. First, to analyse the stability region, the research derives the linear approximated model in a planar global coordinate system. Then, several human models are assumed to develop closed-loop human-machine controlled systems. These human models include basic proportional, derivative, integral and time-delay actions. The stability analysis simulations for those human-machine systems are carried out. The results show that the stability region tends to spread as a ship's velocity increases in the case of the basic proportional human model. The derivative action and time-delay action of human models are effective in spreading the stability region in their respective ranges of frontal gazing points.
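The stability behaviour described, a proportional human model whose closed loop can destabilise as gain and reaction delay grow, can be sketched with a toy delayed-feedback simulation (the pure-integrator ship model, gains, delay, and discretisation are illustrative assumptions, not the paper's identified dynamics):

```python
def simulate(gain, delay_steps, dt=0.1, steps=200):
    """Heading error of an integrator 'ship' under delayed proportional control."""
    theta = [1.0]                                  # initial heading error (rad)
    for k in range(steps):
        # The human model acts on the error observed delay_steps samples ago.
        seen = theta[k - delay_steps] if k >= delay_steps else theta[0]
        theta.append(theta[k] - dt * gain * seen)  # theta' = -gain * delayed error
    return theta

# For a pure integrator with delay tau, the loop is stable iff gain * tau < pi/2.
stable = simulate(gain=1.0, delay_steps=5)    # gain * tau = 0.5  -> converges
unstable = simulate(gain=5.0, delay_steps=5)  # gain * tau = 2.5  -> oscillates and grows
```

The same mechanism explains the paper's findings: derivative action (anticipation) and a farther gazing point effectively offset the human's reaction delay, enlarging the stable gain region.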
Magnetoencephalography - a noninvasive brain imaging method with 1 ms time resolution
NASA Astrophysics Data System (ADS)
DelGratta, Cosimo; Pizzella, Vittorio; Tecchio, Franca; Luca Romani, Gian
2001-12-01
The basics of magnetoencephalography (MEG), i.e. the measurement and the analysis of the tiny magnetic fields generated outside the scalp by the working human brain, are reviewed. Three main topics are discussed: (1) the relationship between the magnetic field and its generators, including on one hand the neurophysiological basis and the physical theory of magnetic field generation, and on the other hand the techniques for the estimation of the sources from the magnetic field measurements; (2) the instrumental techniques and the laboratory practice of neuromagnetic field measurement and (3) the main applications of MEG in basic neurophysiology as well as in clinical neurology.
Real-Time Mapping Spectroscopy on the Ground, in the Air, and in Space
NASA Astrophysics Data System (ADS)
Thompson, D. R.; Allwood, A.; Chien, S.; Green, R. O.; Wettergreen, D. S.
2016-12-01
Real-time data interpretation can benefit both remote in situ exploration and remote sensing. Basic analyses at the sensor can monitor instrument performance and reveal invisible science phenomena in real time. This promotes situational awareness for remote robotic explorers or campaign decision makers, enabling adaptive data collection, reduced downlink requirements, and coordinated multi-instrument observations. Fast analysis is ideal for mapping spectrometers providing unambiguous, quantitative geophysical measurements. This presentation surveys recent computational advances in real-time spectroscopic analysis for Earth science and planetary exploration. Spectral analysis at the sensor enables new operations concepts that significantly improve science yield. Applications include real-time detection of fugitive greenhouse emissions by airborne monitoring, real-time cloud screening and mineralogical mapping by orbital spectrometers, and adaptive measurement by the PIXL instrument on the Mars 2020 rover. Copyright 2016 California Institute of Technology. All Rights Reserved. We acknowledge support of the US Government, NASA, the Earth Science Division and Terrestrial Ecology program.
The Data Analysis in Gravitational Wave Detection
NASA Astrophysics Data System (ADS)
Wang, Xiao-ge; Lebigot, Eric; Du, Zhi-hui; Cao, Jun-wei; Wang, Yun-yong; Zhang, Fan; Cai, Yong-zhi; Li, Mu-zi; Zhu, Zong-hong; Qian, Jin; Yin, Cong; Wang, Jian-bo; Zhao, Wen; Zhang, Yang; Blair, David; Ju, Li; Zhao, Chun-nong; Wen, Lin-qing
2017-01-01
Gravitational wave (GW) astronomy, based on GW detection, is a rising interdisciplinary field and a new window through which humanity can observe the universe. Following traditional astronomy, which uses electromagnetic waves as its detection means, it has quite important significance for studying the origin and evolution of the universe and for extending the field of astronomical research. The appearance of the laser interferometer GW detector has opened a new era of GW detection, and the processing and analysis of GW data have developed quickly around the world, providing a sharp weapon for GW astronomy. This paper systematically introduces the software tools commonly used for GW data analysis, and discusses in detail the basic methods used in GW data analysis, such as time-frequency analysis, composite analysis, pulsar timing analysis, matched filtering, templates, the χ2 test, and Monte Carlo simulation.
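One of the listed methods, the matched filter, can be sketched as correlating the data record against a known template at every offset and taking the peak; the chirp template, noise level, and injection point below are invented for illustration and bear no relation to real detector data:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4096, 256
t = np.arange(m) / 1024.0
# A short up-chirp (rising frequency), the characteristic compact-binary shape.
template = np.sin(2 * np.pi * (40 * t + 80 * t ** 2))

data = 0.2 * rng.standard_normal(n)            # white "detector noise"
inject_at = 1500
data[inject_at:inject_at + m] += template      # hidden signal

# Matched-filter output: correlation of data with the template at each offset.
scores = np.array([np.dot(data[i:i + m], template) for i in range(n - m + 1)])
found_at = int(scores.argmax())
```

Real pipelines additionally whiten the data, scan a bank of templates over source parameters, and apply the χ2 test to veto noise transients that mimic a high correlation peak.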
Calcerrada, Matías; González-Herráez, Miguel; Garcia-Ruiz, Carmen
2015-06-26
This manuscript describes the development of a capillary electrophoresis (CE) method for the detection of acid and basic dyes and its application to real samples, blue-pen-ink strokes on office paper. First, a capillary zone electrophoresis (CZE) method was developed for the separation of basic and acid dyes, by studying the separation medium (buffer nature, pH and relative amount of additive) and instrumental parameters (temperature, voltage and capillary dimensions). The method performance was evaluated in terms of selectivity, resolution (above 5 and 2 for acid dyes and basic dyes, respectively, except for two basic dye standards), LOD (lower than 0.4 mg/L) and precision as intraday and interday RSD values of peak migration times (lower than 0.6%). The developed method was then applied to 34 blue pens from different technologies (rollerball, ballpoint, markers) and with different ink composition (gel, water-based, oil-based). A microdestructive sample treatment, using a scalpel to scratch 0.3 mg of ink stroke, was performed. The entire electropherogram profile allowed the visual discrimination between different types of ink and brands, making a statistical treatment unnecessary. Discrimination of 100% was achieved between pen technologies, brands, and models, although non-reproducible zones in the electropherograms were found for blue gel pen samples. The two different batches of blue oil-based pens were also differentiated. Thus, this method provides a simple, microdestructive, and rapid analysis of different blue pen technologies which may complement the current analysis of questioned documents performed by forensic laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
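The figures of merit quoted above follow from standard formulas; a quick sketch with made-up peak data (the migration times and widths below are not the paper's measurements):

```python
def resolution(t1, w1, t2, w2):
    """Rs = 2*(t2 - t1) / (w1 + w2); times and baseline peak widths in minutes."""
    return 2.0 * (t2 - t1) / (w1 + w2)

def rsd_percent(values):
    """Relative standard deviation of repeated migration times, in percent."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / mean

rs = resolution(t1=6.2, w1=0.20, t2=7.1, w2=0.25)       # two adjacent dye peaks
rsd = rsd_percent([6.20, 6.22, 6.21, 6.19, 6.21])        # five repeat runs
```

An Rs above about 1.5 already means baseline separation, so resolutions above 5 and 2 as reported leave comfortable margin, and a migration-time RSD below 0.6% is what makes electropherogram profiles comparable across runs.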
Autonomic specificity of basic emotions: evidence from pattern classification and cluster analysis.
Stephens, Chad L; Christie, Israel C; Friedman, Bruce H
2010-07-01
Autonomic nervous system (ANS) specificity of emotion remains controversial in contemporary emotion research, and has received mixed support over decades of investigation. This study was designed to replicate and extend psychophysiological research, which has used multivariate pattern classification analysis (PCA) in support of ANS specificity. Forty-nine undergraduates (27 women) listened to emotion-inducing music and viewed affective films while a montage of ANS variables, including heart rate variability indices, peripheral vascular activity, systolic time intervals, and electrodermal activity, were recorded. Evidence for ANS discrimination of emotion was found via PCA with 44.6% of overall observations correctly classified into the predicted emotion conditions, using ANS variables (z=16.05, p<.001). Cluster analysis of these data indicated a lack of distinct clusters, which suggests that ANS responses to the stimuli were nomothetic and stimulus-specific rather than idiosyncratic and individual-specific. Collectively these results further confirm and extend support for the notion that basic emotions have distinct ANS signatures. Copyright © 2010 Elsevier B.V. All rights reserved.
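A minimal stand-in for the pattern-classification step, assigning ANS feature vectors to emotion labels by nearest centroid, with invented (heart rate, skin conductance) features; the study's actual analysis used many more autonomic variables and proper cross-validation:

```python
def centroid(rows):
    """Mean feature vector of a set of observations."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def nearest(x, centroids):
    """Label whose centroid is closest (squared Euclidean distance)."""
    dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b))
    return min(centroids, key=lambda label: dist(x, centroids[label]))

# Hypothetical (heart rate, skin conductance) patterns per induced emotion.
train = {
    "fear":    [[85, 8.1], [88, 7.9], [90, 8.4]],
    "sadness": [[62, 3.0], [60, 3.2], [64, 2.8]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}

test_cases = [([87, 8.0], "fear"), ([61, 3.1], "sadness")]
accuracy = sum(nearest(x, centroids) == y for x, y in test_cases) / len(test_cases)
```

Classification accuracy well above chance across participants, as the study reports (44.6% over several emotion categories), is the operational evidence that the emotions have distinct ANS signatures.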
Chosen Aspects of the Production of the Basic Map Using Uav Imagery
NASA Astrophysics Data System (ADS)
Kedzierski, M.; Fryskowska, A.; Wierzbicki, D.; Nerc, P.
2016-06-01
For several years there has been an increasing interest in the use of unmanned aerial vehicles in acquiring image data from a low altitude. Considering the cost-effectiveness of the flight time of UAVs vs. conventional airplanes, the use of the former is advantageous when generating large-scale accurate orthophotos. Through the development of UAV imagery, we can update large-scale basic maps. These maps are cartographic products which are used for registration, economic, and strategic planning. On the basis of these maps other cartographic products are derived, for example maps used for building planning. The article presents an assessment of the usefulness of orthophotos based on UAV imagery to upgrade the basic map. In the research a compact, non-metric camera, mounted on a fixed wing powered by an electric motor, was used. The tested area covered flat, agricultural and woodland terrains. Orthorectification processing and analysis were carried out with the INPHO UASMaster programme. Due to the effect of UAV instability on low-altitude imagery, the use of non-metric digital cameras and the low-accuracy GPS-INS sensors, the geometry of the images is visibly poorer compared to conventional digital aerial photos (large values of the phi and kappa angles). Therefore, typically, low-altitude images require large along- and across-track overlap - usually above 70%. As a result of the research, orthoimages were obtained with a resolution of 0.06 m and a horizontal accuracy of 0.10 m. Digitized basic maps were used as the reference data. The accuracy of the orthoimages vs. the basic maps was estimated based on the study and on the available reference sources. As a result, it was found that the geometric accuracy and interpretative advantages of the final orthoimages allow the updating of basic maps. It is estimated that such an update of basic maps based on UAV imagery reduces processing time by approx. 40%.
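The quoted 0.06 m resolution and 70% overlap can be connected by standard flight-planning arithmetic; the camera parameters below are hypothetical values chosen only to reproduce a 0.06 m ground sample distance, not the study's actual equipment:

```python
def ground_sample_distance(pixel_size_m, focal_length_m, altitude_m):
    """GSD = pixel size x flying height / focal length (all in metres)."""
    return pixel_size_m * altitude_m / focal_length_m

def exposure_base(image_height_px, gsd_m, overlap):
    """Ground distance flown between exposures for a given along-track overlap."""
    footprint = image_height_px * gsd_m        # along-track ground footprint, m
    return footprint * (1.0 - overlap)

# Hypothetical camera: 4.8 um pixels, 16 mm lens, flown at 200 m.
gsd = ground_sample_distance(pixel_size_m=4.8e-6, focal_length_m=0.016,
                             altitude_m=200.0)
base = exposure_base(image_height_px=3000, gsd_m=gsd, overlap=0.70)
```

The short exposure base that a 70%+ overlap implies is the practical cost of UAV instability: denser image blocks give the bundle adjustment enough tie points to compensate for the large phi and kappa angles.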
Measuring bioenergetics in T cells using a Seahorse Extracellular Flux Analyzer
van der Windt, Gerritje J.W.; Chang, Chih-Hao; Pearce, Erika L.
2016-01-01
This unit contains several protocols to determine the energy utilization of T cells in real-time using a Seahorse Extracellular Flux Analyzer (www.seahorsebio.com). The advantages to using this machine over traditional metabolic assays include the simultaneous measurement of glycolysis and mitochondrial respiration, in real-time, on relatively small numbers of cells, without any radioactivity. The Basic Protocol describes a standard mitochondrial stress test on the XFe96, which yields information about oxidative phosphorylation and glycolysis, two energy-generating pathways. The alternate protocols provide examples of adaptations to the Basic Protocol, including adjustments for the use of the XFe24. A protocol for real-time bioenergetic responses to T cell activation allows for the analysis of immediate metabolic changes after T cell receptor stimulation. Specific substrate utilization can be determined by the use of differential assay media, or the injection of drugs that specifically affect certain metabolic processes. Accurate cell numbers, purity, and viability are critical to obtain reliable results. PMID:27038461
Measuring Bioenergetics in T Cells Using a Seahorse Extracellular Flux Analyzer.
van der Windt, Gerritje J W; Chang, Chih-Hao; Pearce, Erika L
2016-04-01
This unit contains several protocols to determine the energy utilization of T cells in real-time using a Seahorse Extracellular Flux Analyzer (http://www.seahorsebio.com). The advantages to using this machine over traditional metabolic assays include the simultaneous measurement of glycolysis and mitochondrial respiration, in real-time, on relatively small numbers of cells, without any radioactivity. The Basic Protocol describes a standard mitochondrial stress test on the XFe96, which yields information about oxidative phosphorylation and glycolysis, two energy-generating pathways. The alternate protocols provide examples of adaptations to the Basic Protocol, including adjustments for the use of the XFe24. A protocol for real-time bioenergetic responses to T cell activation allows for the analysis of immediate metabolic changes after T cell receptor stimulation. Specific substrate utilization can be determined by the use of differential assay media, or the injection of drugs that specifically affect certain metabolic processes. Accurate cell numbers, purity, and viability are critical to obtain reliable results. Copyright © 2016 John Wiley & Sons, Inc.
Subversive Complicity and Basic Writing across the Curriculum
ERIC Educational Resources Information Center
Villanueva, Victor
2013-01-01
What follows is a simple assertion: time for basic writing to get out from under, a call for us to inculcate a Basic Writing Across the Curriculum. It is time yet again to move away from the concept that basic writers are in need of remedies, in part because all composition courses are in some sense remedial, and to a greater degree because the…
Auditory sequence analysis and phonological skill
Grube, Manon; Kumar, Sukhbinder; Cooper, Freya E.; Turton, Stuart; Griffiths, Timothy D.
2012-01-01
This work tests the relationship between auditory and phonological skill in a non-selected cohort of 238 school students (age 11) with the specific hypothesis that sound-sequence analysis would be more relevant to phonological skill than the analysis of basic, single sounds. Auditory processing was assessed across the domains of pitch, time and timbre; a combination of six standard tests of literacy and language ability was used to assess phonological skill. A significant correlation between general auditory and phonological skill was demonstrated, plus a significant, specific correlation between measures of phonological skill and the auditory analysis of short sequences in pitch and time. The data support a limited but significant link between auditory and phonological ability with a specific role for sound-sequence analysis, and provide a possible new focus for auditory training strategies to aid language development in early adolescence. PMID:22951739
GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winn, W.G.
The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.
NASA Astrophysics Data System (ADS)
Sun, Hong; Wu, Qian-zhong
2013-09-01
In order to improve the precision of an optical-electric tracking device, an improved MEMS-based design is proposed that addresses the tracking error and random drift of the gyroscope sensor. Following the principles of time series analysis of random sequences, an AR model of the gyro random error is established and the gyro output signals are repeatedly filtered with a Kalman filter. An ARM microcontroller drives the servo motor with a fuzzy PID full closed-loop control algorithm; lead-compensation and feed-forward links are added to reduce the response lag to angle inputs, since feed-forward makes the output follow the input closely and the lead-compensation link shortens the response to input signals, thereby reducing errors. A wireless video monitor module and remote monitoring software (Visual Basic 6.0) track the servo motor state in real time: the video module gathers video signals and sends them wirelessly to an upper computer, which displays the motor's running state in a Visual Basic 6.0 window. A detailed analysis of the main error sources is also given; by quantifying the errors contributed by the bandwidth and the gyro sensor, the proportion of each error in the total becomes more intuitive, which helps decrease the system error. Simulation and experimental results show that the system has good tracking characteristics and is valuable for engineering applications.
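The filtering approach described above can be sketched as a scalar Kalman filter driven by an AR(1) model of the gyro random error. The AR coefficient, noise variances, and simulated signal below are illustrative assumptions; the paper's actual model order and fitted parameters are not reproduced here.

```python
import random

# Minimal scalar Kalman filter for an AR(1) gyro drift model:
#   x_k = a * x_{k-1} + w_k   (process),  z_k = x_k + v_k   (measurement).
# The coefficient a and the noise variances Q, R are illustrative
# assumptions, not parameters from the paper.

def kalman_ar1(measurements, a=0.95, Q=1e-4, R=1e-2):
    x, P = 0.0, 1.0          # initial state estimate and covariance
    estimates = []
    for z in measurements:
        # Predict step using the AR(1) process model.
        x_pred = a * x
        P_pred = a * a * P + Q
        # Update step with the new gyro measurement.
        K = P_pred / (P_pred + R)        # Kalman gain
        x = x_pred + K * (z - x_pred)
        P = (1 - K) * P_pred
        estimates.append(x)
    return estimates

# Simulated noisy gyro output around a constant rate of 0.5.
random.seed(0)
noisy = [0.5 + random.gauss(0, 0.1) for _ in range(200)]
filtered = kalman_ar1(noisy)
print(round(filtered[-1], 2))
```

The filtered sequence is much smoother than the raw measurements; in the actual device this smoothed estimate, rather than the raw gyro output, feeds the servo control loop.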
CADDIS Volume 4. Data Analysis: Basic Principles & Issues
Use of inferential statistics in causal analysis, introduction to data independence and autocorrelation, methods for identifying and controlling confounding variables, and references for the Basic Principles section of Data Analysis.
HNP renumbering support in PMIPv6
NASA Astrophysics Data System (ADS)
Yan, Zhiwei; Geng, Guanggang; Lee, Xiaodong
2015-12-01
In the basic PMIPv6 (Proxy Mobile IPv6), the MN (Mobile Node) is assigned a 64-bit HNP (Home Network Prefix) during the initial attachment for the HoA (Home Address) configuration. During the movements of the MN, this prefix is assumed to remain unchanged, so upper-layer applications do not have to use a reconfigured HoA and the handover is transparent at the IP layer and above. However, the current protocol does not specify the operations needed for the MN to receive the new HNP in a timely manner and configure a new HoA when its HNP is renumbered. In this paper, this problem is discussed and a possible solution is proposed based on some simple extensions of the basic PMIPv6. Our analysis demonstrates that the proposed scheme can effectively discover HNP renumbering while keeping the signaling cost low compared with the basic PMIPv6.
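The HoA configuration step described above combines the 64-bit HNP with an interface identifier. A minimal sketch using the standard-library `ipaddress` module (the prefixes and identifier are illustrative values, not ones from the paper):

```python
import ipaddress

# Sketch of Home Address (HoA) configuration from a 64-bit Home Network
# Prefix (HNP), in the style of stateless address autoconfiguration.
# The prefixes and interface identifier are illustrative, not from the paper.

def configure_hoa(hnp: str, interface_id: int) -> ipaddress.IPv6Address:
    """Combine a /64 HNP with a 64-bit interface identifier."""
    net = ipaddress.IPv6Network(hnp)
    assert net.prefixlen == 64, "PMIPv6 assigns a 64-bit HNP"
    return net[interface_id]

old_hoa = configure_hoa("2001:db8:1:1::/64", 0x1)
# After HNP renumbering the MN must detect the new prefix and rebuild its HoA,
# which is exactly the step the basic protocol leaves unspecified:
new_hoa = configure_hoa("2001:db8:2:1::/64", 0x1)
print(old_hoa, new_hoa)
```

The sketch makes the paper's point concrete: a renumbered HNP yields a different HoA, so without an explicit notification mechanism the MN keeps using a stale address.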
The Learning of the Elderly and the Profile of the Adult Educator
ERIC Educational Resources Information Center
Requejo Osorio, Agustin
2008-01-01
This paper deals with specific characteristics of elders, bearing in mind both their cognitive and their non-cognitive aspects. Regarding their way of learning, the paper refers to basic principles for this group of people: active learning, situational analysis, their experience, awareness that they have--and need--specific time and rhythm for…
A Vernacular for Linear Latent Growth Models
ERIC Educational Resources Information Center
Hancock, Gregory R.; Choi, Jaehwa
2006-01-01
In its most basic form, latent growth modeling (latent curve analysis) allows an assessment of individuals' change in a measured variable X over time. For simple linear models, as with other growth models, parameter estimates associated with the a construct (amount of X at a chosen temporal reference point) and b construct (growth in X per unit…
Time Keeps on Ticking: The Experience of Clinical Judgment
ERIC Educational Resources Information Center
Spengler, Paul M.; White, Michael J.; Aegisdottir, Stefania; Maugherman, Alan S.
2009-01-01
The reactions by Ridley and Shaw-Ridley (EJ832451) and Lichtenberg (EJ832452) to the authors' meta-analysis on the effects of experience on judgment accuracy add positively to what is hoped will become an ever more focused discourse on this most basic question: How can mental health clinical decision making be improved? In this rejoinder, the…
ERIC Educational Resources Information Center
Cheng, Wei
2011-01-01
This paper deals with transeditors' innovative subjectivity in facilitating intercultural communication from both the journalistic and the translational perspectives. By applying the basic notions of Douglas Robinson's 'dialogical' mode to the analysis of the translated news carried by "The Global Times" that relates to the Summer…
The observation and coverage analysis of the moon-based ultraviolet telescope on CE-3 lander
NASA Astrophysics Data System (ADS)
wang, f.; wen, w.-b.; liu, d.-w.; geng, l.; zhang, x.-x.; zhao, s.
2017-09-01
Through analysis of all the observed images from the MUVT, it is found that in the celestial coordinate system the survey images are concentrated in a ring of 15 degrees width centered at latitude 65 degrees and longitude -90 degrees. The observation data analysis shows that the coverage of the northern area reaches 2263.8 square degrees, accounting for about 5.487% of the total area, so the observation target of the task has been completed. For the first time, the MUVT has carried out long-duration astronomical observations, accumulating abundant observational data for basic research on the evolution of stars, compact stars, and high-energy astrophysics.
NASA Astrophysics Data System (ADS)
Lamb, Derek A.
2016-10-01
While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.
Integrated tools for control-system analysis
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.
1989-01-01
The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
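One of the five evaluations, closed-loop eigenvalue analysis, can be sketched for a small state-feedback system as follows. The double-integrator plant and feedback gain below are illustrative assumptions, not part of the MATRIXx package.

```python
import cmath

# Sketch of closed-loop eigenvalue analysis for x' = (A - B*K) x with a
# 2-state, 1-input system. The plant and gain are illustrative assumptions.

def closed_loop_eigs_2x2(A, B, K):
    # Form the closed-loop matrix Acl = A - B*K.
    Acl = [[A[i][j] - B[i] * K[j] for j in range(2)] for i in range(2)]
    # Eigenvalues of a 2x2 matrix from its trace and determinant.
    tr = Acl[0][0] + Acl[1][1]
    det = Acl[0][0] * Acl[1][1] - Acl[0][1] * Acl[1][0]
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

# Double integrator stabilized by state feedback u = -K x.
A = [[0, 1], [0, 0]]
B = [0, 1]
K = [2, 3]          # chosen to place the poles at -1 and -2
eigs = closed_loop_eigs_2x2(A, B, K)
print(eigs)
```

Both eigenvalues have negative real parts, confirming closed-loop stability, which is the kind of check the package automates for larger systems.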
Visualizing time-related data in biology, a review
Secrier, Maria; Schneider, Reinhard
2014-01-01
Time is of the essence in biology as in so much else. For example, monitoring disease progression or the timing of developmental defects is important for the processes of drug discovery and therapy trials. Furthermore, an understanding of the basic dynamics of biological phenomena that are often strictly time regulated (e.g. circadian rhythms) is needed to make accurate inferences about the evolution of biological processes. Recent advances in technologies have enabled us to measure timing effects more accurately and in more detail. This has driven related advances in visualization and analysis tools that try to effectively exploit this data. Beyond timeline plots, notable attempts at more involved temporal interpretation have been made in recent years, but awareness of the available resources is still limited within the scientific community. Here, we review some advances in biological visualization of time-driven processes and consider how they aid data analysis and interpretation. PMID:23585583
Flow cytometry: basic principles and applications.
Adan, Aysun; Alizada, Günel; Kiraz, Yağmur; Baran, Yusuf; Nalbant, Ayten
2017-03-01
Flow cytometry is a sophisticated instrument measuring multiple physical characteristics of a single cell, such as size and granularity, simultaneously as the cell flows in suspension through a measuring device. Its operation depends on the light-scattering features of the cells under investigation, which may be derived from dyes or monoclonal antibodies targeting either extracellular molecules located on the surface or intracellular molecules inside the cell. This approach makes flow cytometry a powerful tool for detailed analysis of complex populations in a short period of time. This review covers the general principles and selected applications of flow cytometry such as immunophenotyping of peripheral blood cells, analysis of apoptosis, and detection of cytokines. Additionally, this report provides a basic understanding of flow cytometry technology essential for all users as well as the methods used to analyze and interpret the data. Moreover, recent progress in flow cytometry is discussed in order to give an opinion about the future importance of this technology.
An ultraviolet-visible spectrophotometer automation system. Part 3: Program documentation
NASA Astrophysics Data System (ADS)
Roth, G. S.; Teuschler, J. M.; Budde, W. L.
1982-07-01
The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generation, plot generation and data reduction for chlorophyll or color analysis. This system also has the capability to process manually entered data for the analysis of chlorophyll or color. For each program of the UVVIS system, this document contains a program description, flowchart, variable dictionary, code listing, and symbol cross-reference table. Also included are descriptions of file structures and of routines common to all automated analyses. The programs are written in Data General extended BASIC, Revision 4.3, under the RDOS operating system, Revision 6.2. The BASIC code has been enhanced for real-time data acquisition, which is accomplished by CALLs to assembly language subroutines. Two other related publications are 'An Ultraviolet-Visible Spectrophotometer Automation System - Part I Functional Specifications,' and 'An Ultraviolet-Visible Spectrophotometer Automation System - Part II User's Guide.'
Low-cost USB interface for operant research using Arduino and Visual Basic.
Escobar, Rogelio; Pérez-Herrera, Carlos A
2015-03-01
This note describes the design of a low-cost interface using Arduino microcontroller boards and Visual Basic programming for operant conditioning research. The board executes one program in the Arduino programming language that polls the state of the inputs and generates outputs in an operant chamber. This program communicates through a USB port with another program written in Visual Basic 2010 Express Edition running on a laptop, desktop, netbook computer, or even a tablet equipped with the Windows operating system. The Visual Basic program controls schedules of reinforcement and records real-time data. A single Arduino board can be used to control a total of 52 input/output lines, and multiple Arduino boards can be used to control multiple operant chambers. An external power supply and a series of micro relays are required to control 28-V DC devices commonly used in operant chambers. Instructions for downloading and using the programs to generate simple and concurrent schedules of reinforcement are provided. Testing suggests that the interface is reliable, accurate, and could serve as an inexpensive alternative to commercial equipment. © Society for the Experimental Analysis of Behavior.
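The schedule-control logic that the Visual Basic program implements can be sketched as a simple state machine. The fixed-ratio schedule, event names, and logging format below are illustrative assumptions written in Python; the actual VB/Arduino source is not reproduced here.

```python
import time

# Python sketch of the kind of schedule logic the Visual Basic program
# implements: a fixed-ratio (FR) schedule delivers reinforcement after
# every n responses, logging each event with a timestamp. The FR value
# and event names are illustrative assumptions.

class FixedRatioSchedule:
    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0
        self.log = []          # (timestamp, event) pairs, as real-time data

    def response(self):
        """Register one lever press; return True when reinforcement is due."""
        self.count += 1
        self.log.append((time.time(), "response"))
        if self.count >= self.ratio:
            self.count = 0
            self.log.append((time.time(), "reinforcer"))
            return True        # would pulse the feeder output line via Arduino
        return False

fr5 = FixedRatioSchedule(5)
reinforcers = sum(fr5.response() for _ in range(12))
print(reinforcers)  # 2 reinforcers earned in 12 responses on FR 5
```

In the real interface this loop runs on the PC side while the Arduino merely reports input transitions and drives the relays, which is what keeps the microcontroller program small.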
A prospective study of the motivational and health dynamics of Internet Gaming Disorder.
Weinstein, Netta; Przybylski, Andrew K; Murayama, Kou
2017-01-01
The American Psychiatric Association has identified Internet Gaming Disorder (IGD) as a potential psychiatric condition and called for research to investigate its etiology, stability, and impacts on health and behavior. The present study recruited 5,777 American adults and applied self-determination theory to examine how motivational factors influence, and are influenced by, IGD and health across a six month period. Following a preregistered analysis plan, results confirmed our hypotheses that IGD criteria are moderately stable and that they and basic psychological need satisfaction have a reciprocal relationship over time. Results also showed need satisfaction promoted health and served as a protective factor against IGD. Contrary to what was hypothesized, results provided no evidence directly linking IGD to health over time. Exploratory analyses suggested that IGD may have indirect effects on health by way of its impact on basic needs. Implications are discussed in terms of existing gaming addiction and motivational frameworks.
A prospective study of the motivational and health dynamics of Internet Gaming Disorder
Przybylski, Andrew K.; Murayama, Kou
2017-01-01
The American Psychiatric Association has identified Internet Gaming Disorder (IGD) as a potential psychiatric condition and called for research to investigate its etiology, stability, and impacts on health and behavior. The present study recruited 5,777 American adults and applied self-determination theory to examine how motivational factors influence, and are influenced by, IGD and health across a six month period. Following a preregistered analysis plan, results confirmed our hypotheses that IGD criteria are moderately stable and that they and basic psychological need satisfaction have a reciprocal relationship over time. Results also showed need satisfaction promoted health and served as a protective factor against IGD. Contrary to what was hypothesized, results provided no evidence directly linking IGD to health over time. Exploratory analyses suggested that IGD may have indirect effects on health by way of its impact on basic needs. Implications are discussed in terms of existing gaming addiction and motivational frameworks. PMID:28975056
de Carvalho, Helder Pereira; Huang, Jiguo; Zhao, Meixia; Liu, Gang; Yang, Xinyu; Dong, Lili; Liu, Xingjuan
2016-01-01
In this study, a response surface methodology (RSM) model was applied to optimize Basic Red 2 (BR2) removal using an electrocoagulation/eggshell (ES) coupling process in a batch system. A central composite design was used to evaluate the effects and interactions of process parameters including current density, reaction time, initial pH and ES dosage on the BR2 removal efficiency and energy consumption. The analysis of variance revealed high R(2) values (≥85%), indicating that the predictions of the RSM models are adequately applicable for both responses. The optimum conditions, under which a dye removal efficiency of 93.18% and an energy consumption of 0.840 kWh/kg were observed, were a current density of 11.40 mA/cm(2), a reaction time of 5 min and 3 s, an initial pH of 6.5, and an ES dosage of 10.91 g/L.
NASA Technical Reports Server (NTRS)
Allen, Robert J.
1988-01-01
An assembly language program using the Intel 80386 CPU and 80387 math co-processor chips was written to increase the speed of data gathering and processing, and to provide control of a scanning CW ring dye laser system. This laser system is used in high resolution (better than 0.001 cm-1) water vapor spectroscopy experiments. Laser beam power is sensed at the input and output of white cells and the output of a Fabry-Perot. The assembly language subroutine is called from Basic, acquires the data, and performs various calculations at rates more than 150 times faster than could be achieved in the higher level language. The width of the output control pulses generated in assembly language is 3 to 4 microsecs, as compared to 2 to 3.7 millisecs for those generated in Basic (about 500 to 1000 times faster). Included are a block diagram and brief description of the spectroscopy experiment, a flow diagram of the Basic and assembly language programs, listings of the programs, scope photographs of the computer-generated 5-volt pulses used for control and timing analysis, and representative water spectrum curves obtained using these programs.
Instrumental biosensors: new perspectives for the analysis of biomolecular interactions.
Nice, E C; Catimel, B
1999-04-01
The use of instrumental biosensors in basic research to measure biomolecular interactions in real time is increasing exponentially. Applications include protein-protein, protein-peptide, DNA-protein, DNA-DNA, and lipid-protein interactions. Such techniques have been applied to, for example, antibody-antigen, receptor-ligand, signal transduction, and nuclear receptor studies. This review outlines the principles of two of the most commonly used instruments and highlights specific operating parameters that will assist in optimising experimental design, data generation, and analysis.
NASA Technical Reports Server (NTRS)
1973-01-01
The mission requirements and conceptual design of manned earth observatory payloads for the 1980 time period are discussed. Projections of 1980 sensor technology and user data requirements were used to formulate typical basic criteria pertaining to experiments, sensor complements, and reference missions. The subjects discussed are: (1) mission selection and prioritization, (2) baseline mission analysis, (3) earth observation data handling and contingency plans, and (4) analysis of low cost mission definition and rationale.
ERIC Educational Resources Information Center
Fry, Richard
2016-01-01
Broad demographic shifts in marital status, educational attainment and employment have transformed the way young adults in the U.S. are living. This Pew Research Center analysis of census data highlights the implications of these changes for the most basic element of their lives -- where they call home. In 2014, for the first time in more than 130…
Reference clock parameters for digital communications systems applications
NASA Technical Reports Server (NTRS)
Kartaschoff, P.
1981-01-01
The basic parameters relevant to the design of network timing systems describe the random and systematic time departures of the system elements, i.e., master (or reference) clocks, transmission links, and other clocks controlled over the links. The quantitative relations between these parameters were established and illustrated by means of numerical examples based on available measured data. The examples were limited to a simple PLL control system but the analysis can eventually be applied to more sophisticated systems at the cost of increased computational effort.
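The simple PLL control system mentioned above can be sketched as a discrete-time proportional correction loop in which a slave clock repeatedly measures its time departure from the master and corrects a fraction of it. The loop gain and the constant master offset are illustrative assumptions, not parameters from the paper.

```python
# Sketch of a first-order phase-locked loop (PLL) disciplining a slave
# clock to a master reference, as in the network timing systems described
# above. The gain, the constant offset, and noise-free operation are
# illustrative assumptions.

def pll_track(master_offsets, gain=0.2):
    """Each step measures the phase error against the master and applies
    a proportional correction to the slave clock's offset."""
    slave = 0.0
    history = []
    for master in master_offsets:
        error = master - slave        # measured time departure
        slave += gain * error         # proportional correction
        history.append(slave)
    return history

# Master clock sits at a constant 1.0 microsecond offset; the slave
# converges toward it geometrically (residual shrinks by 0.8 per step).
track = pll_track([1.0] * 50)
print(round(track[-1], 6))
```

With random rather than constant departures, the same loop low-pass filters the master's noise, which is the trade-off between tracking speed and noise rejection that the reference clock parameters quantify.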
QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.
Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter
2015-07-01
Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
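The per-sample parallelism that QuickNGS exploits can be sketched with a generic worker pool: the same analysis is dispatched over parallel workers and the results are collected for downstream steps. The sample names and the placeholder analysis function below are illustrative; this is not QuickNGS code.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the workflow pattern QuickNGS automates: identical per-sample
# analyses run in parallel, then results are gathered for downstream
# processing. The sample names and trivial "analysis" are placeholders.

def analyze_sample(sample):
    # Stand-in for alignment, quantification, QC, etc.
    return sample, f"{sample}.results"

samples = [f"RNA-Seq_{i:02d}" for i in range(1, 11)]   # 10 samples, as in the paper

with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(analyze_sample, samples))

print(len(results))  # 10 completed sample analyses
```

A real pipeline would dispatch long-running external tools (typically via a cluster scheduler rather than threads), but the fan-out/collect shape is the same and is what keeps hands-on time to minutes.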
Basics in advanced life support: a role for download audit and metronomes.
Fletcher, David; Galloway, Robert; Chamberlain, Douglas; Pateman, Jane; Bryant, Geoffrey; Newcombe, Robert G
2008-08-01
An intention in 2003 to undertake a multicentre trial in the United Kingdom of compressions before and after defibrillation could not be realized because of concerns at the time in relation to informed consent. Instead, the new protocol was introduced in one ambulance service, ahead of the 2005 Guidelines, with greater emphasis on compressions. The results were monitored by analysis of electronic ECG downloads. Deficiencies in the standard of basic life support were identified but were not unique to our service. The introduction of metronomes and the provision of feedback to crews led to major improvements in performance. Our experience has implications for the emergency pre-hospital care of cardiac arrest.
Matsui, Kazuki; Tsume, Yasuhiro; Takeuchi, Susumu; Searls, Amanda; Amidon, Gordon L
2017-04-03
Weakly basic drugs exhibit a pH-dependent dissolution profile in the gastrointestinal (GI) tract, which makes it difficult to predict their oral absorption profile. The aim of this study was to investigate the utility of the gastrointestinal simulator (GIS), a novel in vivo predictive dissolution (iPD) methodology, in predicting the in vivo behavior of the weakly basic drug dipyridamole when coupled with in silico analysis. The GIS is a multicompartmental dissolution apparatus, which represents physiological gastric emptying in the fasted state. Kinetic parameters for drug dissolution and precipitation were optimized by fitting a curve to the dissolved drug amount-time profiles in the United States Pharmacopeia apparatus II and GIS. Optimized parameters were incorporated into mathematical equations to describe the mass transport kinetics of dipyridamole in the GI tract. By using this in silico model, the intraluminal drug concentration-time profile was simulated. The predicted profile of dipyridamole in the duodenal compartment adequately captured the observed data. In addition, the plasma concentration-time profile was also predicted using pharmacokinetic parameters following intravenous administration. On the basis of the comparison with observed data, the in silico approach coupled with the GIS successfully predicted in vivo pharmacokinetic profiles. Although further investigations are still required to generalize these findings, the results indicate that incorporating GIS data into mathematical equations improves the predictability of the in vivo behavior of weakly basic drugs like dipyridamole.
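The mass-transport bookkeeping described above can be sketched as a small time-stepping model: solid drug dissolves toward saturation (Noyes-Whitney form) while any supersaturated excess precipitates with first-order kinetics. All rate constants, the solubility, and the volume below are illustrative assumptions, not the fitted GIS parameters.

```python
# Sketch of dissolution/precipitation mass transport for a weakly basic
# drug. Noyes-Whitney-style dissolution slows as the solution approaches
# saturation; precipitation removes only supersaturated drug. All
# parameter values are illustrative assumptions, not fitted GIS values.

def simulate_dissolution(solid, k_diss, k_prec, Cs, volume, dt=0.01, t_end=60.0):
    conc = 0.0
    t = 0.0
    profile = []
    while t < t_end:
        dissolution = k_diss * solid * (Cs - conc)     # slows near saturation
        precipitation = k_prec * max(conc - Cs, 0.0)   # only if supersaturated
        solid = max(solid - dissolution * volume * dt, 0.0)
        conc += (dissolution - precipitation) * dt
        t += dt
        profile.append((t, conc))
    return profile

# Hypothetical run: 50 mg solid, solubility 5 (conc. units), 0.25 L.
profile = simulate_dissolution(solid=50.0, k_diss=0.02, k_prec=0.1,
                               Cs=5.0, volume=0.25)
final_conc = profile[-1][1]
print(round(final_conc, 2))
```

For a pH-dependent drug like dipyridamole, the solubility term Cs would itself change between gastric and intestinal compartments, which is what makes multicompartmental simulation necessary.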
Training of physicians for the twenty-first century: role of the basic sciences.
Grande, Joseph P
2009-09-01
Rapid changes in the healthcare environment and public dissatisfaction with the cost and quality of medical care have prompted a critical analysis of how physicians are trained in the United States. Accrediting agencies have catalyzed a transformation from a process based to a competency-based curriculum, both at the undergraduate and the graduate levels. The objective of this overview is to determine how these changes are likely to alter the role of basic science in medical education. Policy statements related to basic science education from the National Board of Medical Examiners (NBME), the Accreditation Council for Graduate Medical Education (ACGME), American Board of Medical Specialties (ABMS), and the Federation of State Medical Boards (FSMB) were reviewed and assessed for common themes. Three primary roles for the basic sciences in medical education are proposed: (1) basic science to support the development of clinical reasoning skills; (2) basic science to support a critical analysis of medical and surgical interventions ("evidence-based medicine"); and (3) basic and translational science to support analysis of processes to improve healthcare ("science of healthcare delivery"). With these roles in mind, several methods to incorporate basic sciences into the curriculum are suggested.
It's Not Over Yet: The Annual Report on the Economic Status of the Profession, 2010-11
ERIC Educational Resources Information Center
Curtis, John W.
2011-01-01
This paper presents the annual report of the American Association of University Professors on the economic status of the profession for 2010-2011. This analysis of the economic status of the faculty begins with results from this year's annual survey of full-time faculty compensation. Survey report table 1 presents the most basic results, while…
An Analysis of Shaker Education: The Life and Death of an Alternative Educational System, 1774-1950.
ERIC Educational Resources Information Center
Taylor, Frank G.; Roberts, Arthur D.
This study investigates the Shaker educational system, analyzes the development of Shaker schools, and examines the innovative practices that the Shakers used to ready children for the world of their time. Originating in England among illiterate working class people, the movement was established in New England in 1774. Basic characteristics of the…
Using Consumer Behavior and Decision Models to Aid Students in Choosing a Major.
ERIC Educational Resources Information Center
Kaynama, Shohreh A.; Smith, Louise W.
1996-01-01
A study found that using consumer behavior and decision models to guide students to a major can be useful and enjoyable for students. Students consider many of the basic parameters through multi-attribute and decision-analysis models, so time with professors, who were found to be the most influential group, can be used for more individual and…
Is the Elimination of Recess in School a Violation of a Child's Basic Human Rights?
ERIC Educational Resources Information Center
Dubroc, Alicia M.
2007-01-01
The elimination of recess in schools across the country is becoming a normal occurrence in many communities, large and small. In each study presented in this content analysis, we find that free time and unstructured play is indeed essential to a child's healthy cognitive development. Article 31 of the United Nations Convention on the Rights of…
The influence of essential oils on human attention. I: alertness.
Ilmberger, J; Heuberger, E; Mahrhofer, C; Dessovic, H; Kowarik, D; Buchbauer, G
2001-03-01
Scientific research on the effects of essential oils on human behavior lags behind the promises made by popular aromatherapy. Nearly all aspects of human behavior are closely linked to processes of attention, the basic level being that of alertness, which ranges from sleep to wakefulness. In our study we measured the influence of essential oils and components of essential oils [peppermint, jasmine, ylang-ylang, 1,8-cineole (in two different dosages) and menthol] on this core attentional function, which can be experimentally defined as speed of information processing. Substances were administered by inhalation; levels of alertness were assessed by measuring motor and reaction times in a reaction time paradigm. The performances of the six experimental groups receiving substances (n = 20 in four groups, n = 30 in two groups) were compared with those of corresponding control groups receiving water. Between-group analysis, i.e. comparisons between experimental groups and their respective control groups, mainly did not reach statistical significance. However, within-group analysis showed complex correlations between subjective evaluations of substances and objective performance, indicating that effects of essential oils or their components on basic forms of attentional behavior are mainly psychological.
Banno, Hayaki; Saiki, Jun
2015-03-01
Recent studies have sought to determine which levels of categories are processed first in visual scene categorization and have shown that the natural and man-made superordinate-level categories are understood faster than are basic-level categories. The current study examined the robustness of the superordinate-level advantage in a visual scene categorization task. A go/no-go categorization task was evaluated with response time distribution analysis using an ex-Gaussian template. A visual scene was categorized as either superordinate or basic level, and two basic-level categories forming a superordinate category were judged as either similar or dissimilar to each other. First, outdoor/indoor and natural/man-made groups were used as superordinate categories to investigate whether the advantage could be generalized beyond the natural/man-made boundary. Second, a set of images forming a superordinate category was manipulated. We predicted that decreasing image set similarity within the superordinate-level category would work against the speed advantage. We found that basic-level categorization was faster than outdoor/indoor categorization when the outdoor category comprised dissimilar basic-level categories. Our results indicate that the superordinate-level advantage in visual scene categorization is labile across different categories and category structures. © 2015 SAGE Publications.
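The ex-Gaussian template used in this kind of response time analysis models an RT distribution as a Gaussian stage convolved with an exponential stage. As a hedged illustration (the data, seed, and parameter values below are invented, not the study's), SciPy's `exponnorm` distribution, which is exactly the exponentially modified Gaussian, can fit such data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated reaction times (ms): a Gaussian component plus an exponential
# tail, the structure an ex-Gaussian assumes. All values are invented.
rts = rng.normal(400, 40, 1000) + rng.exponential(100, 1000)

# SciPy parameterizes the ex-Gaussian as exponnorm(K, loc, scale), where
# mu = loc, sigma = scale, and tau = K * scale.
K, loc, scale = stats.exponnorm.fit(rts, 2.5)  # 2.5 is a starting guess for K
mu, sigma, tau = loc, scale, K * scale
```

The fitted `mu`, `sigma`, and `tau` are the usual ex-Gaussian summary parameters reported in RT studies.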
Analysis of direct punch velocity in professional defence
NASA Astrophysics Data System (ADS)
Lapkova, Dora; Adamek, Milan
2016-06-01
This paper is focused on the analysis of a direct punch. Nowadays, professional defence is a basic part of the effective protection of people and property. There are many striking techniques, and the goal of this research was to analyze the direct punch. The analysis aimed to measure the velocity with the help of a high-speed camera (Olympus i-Speed 2) and then to find the dependence of this velocity on input parameters. Two pieces of software were used for data analysis: i-Speed Control Software and MINITAB. In total, 111 participants took part in this experiment. The results are presented in this paper, especially the dependence of mean velocity on time and the difference in velocity between genders.
Interrupted time series regression for the evaluation of public health interventions: a tutorial.
Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio
2017-02-01
Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
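The segmented regression model at the heart of the tutorial estimates a baseline level and trend plus a level change and a trend change at the intervention point. A minimal sketch on simulated monthly data (all numbers invented for illustration):

```python
import numpy as np

# Interrupted time series via segmented regression.
n_pre, n_post = 24, 24
time = np.arange(n_pre + n_post, dtype=float)
step = (time >= n_pre).astype(float)               # level-change indicator
ramp = np.where(time >= n_pre, time - n_pre, 0.0)  # time since intervention

rng = np.random.default_rng(1)
# True model: level 50, pre-trend +0.2/month, drop of 5 at the intervention,
# slope change of -0.3/month afterwards, plus noise.
y = 50 + 0.2 * time - 5.0 * step - 0.3 * ramp + rng.normal(0, 1, time.size)

X = np.column_stack([np.ones_like(time), time, step, ramp])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, pre_trend, level_change, trend_change = beta
```

A real analysis would also address the issues the tutorial lists, autocorrelation, over-dispersion, and seasonal trends, before interpreting the coefficients.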
Topochemical Analysis of Cell Wall Components by TOF-SIMS.
Aoki, Dan; Fukushima, Kazuhiko
2017-01-01
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is a developing analytical tool and a type of imaging mass spectrometry. TOF-SIMS provides mass spectral information with a lateral resolution on the order of submicrons, with widespread applicability. It is sometimes described as a surface analysis method that requires no sample pretreatment; however, several points need to be taken into account to fully exploit the capabilities of TOF-SIMS. In this chapter, we introduce methods for TOF-SIMS sample treatment, as well as basic knowledge for the analysis of TOF-SIMS spectral and image data from wood samples.
A reciprocal effects model of the temporal ordering of basic psychological needs and motivation.
Martinent, Guillaume; Guillet-Descas, Emma; Moiret, Sophie
2015-04-01
Using self-determination theory as the framework, we examined the temporal ordering between satisfaction and thwarting of basic psychological needs and motivation. We accomplished this goal by using a two-wave, 7-month partial least squares path modeling approach (PLS-PM) with a sample of 94 adolescent athletes (mean age = 15.96 years) in an intensive training setting. The PLS-PM results showed significant paths leading: (a) from T1 satisfaction of basic psychological need for competence to T2 identified regulation, (b) from T1 external regulation to T2 thwarting and satisfaction of basic psychological need for competence, and (c) from T1 amotivation to T2 satisfaction of basic psychological need for relatedness. Overall, our results suggest that the relationship between basic psychological need and motivation varied depending on the type of basic need and motivation assessed. Basic psychological need for competence predicted identified regulation over time, whereas amotivation and external regulation predicted basic psychological need for relatedness or competence over time.
feets: feATURE eXTRACTOR for tIME sERIES
NASA Astrophysics Data System (ADS)
Cabral, Juan; Sanchez, Bruno; Ramos, Felipe; Gurovich, Sebastián; Granitto, Pablo; VanderPlas, Jake
2018-06-01
feets characterizes and analyzes light-curves from astronomical photometric databases for modelling, classification, data cleaning, outlier detection and data analysis. It uses machine learning algorithms to determine the numerical descriptors that characterize and distinguish the different variability classes of light-curves; these range from basic statistical measures such as the mean or standard deviation to complex time-series characteristics such as the autocorrelation function. The library is not restricted to the astronomical field and could also be applied to any kind of time series. This project is a derivative work of FATS (ascl:1711.017).
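The range of descriptors mentioned, from basic statistics to autocorrelation, can be sketched in a few lines. This is a toy illustration of the kinds of features such a library computes, not the feets API, and the light curve below is synthetic:

```python
import numpy as np

def basic_features(mag):
    """A few simple light-curve descriptors: summary statistics plus a
    lag-1 autocorrelation."""
    mag = np.asarray(mag, dtype=float)
    d = mag - mag.mean()
    autocorr1 = (d[:-1] * d[1:]).sum() / (d * d).sum()
    return {
        "mean": mag.mean(),
        "std": mag.std(ddof=1),
        "amplitude": (mag.max() - mag.min()) / 2.0,
        "autocorr_lag1": autocorr1,
    }

# Synthetic periodic light curve with small photometric noise.
t = np.linspace(0, 10, 500)
rng = np.random.default_rng(2)
mag = 15.0 + 0.3 * np.sin(2 * np.pi * t) + rng.normal(0, 0.02, t.size)
feats = basic_features(mag)
```

In a real pipeline, feature vectors like `feats` would feed the machine learning classifiers the abstract describes.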
Assessment of Driver's Reaction Times in Diversified Research Environments
NASA Astrophysics Data System (ADS)
Guzek, Marek; Lozia, Zbigniew; Zdanowicz, Piotr; Jurecki, Rafał S.; Stańczyk, Tomasz L.; Pieniążek, Wiesław
2012-06-01
Reaction time is one of the basic parameters that characterize a driver, and it is very important in the analysis of accident situations in road traffic. This paper describes studies on reaction time evaluation conducted in three environments: on a typical device used in transport psychology labs (the so-called reflexometer), in a driving simulator (autoPW), and on a driving test track (the Kielce Test Track). In all environments, the tests were performed with the same group of drivers. The article presents the characteristics of the research in each environment and shows and compares exemplary results.
A January angular momentum balance in the OSU two-level atmospheric general circulation model
NASA Technical Reports Server (NTRS)
Kim, J.-W.; Grady, W.
1982-01-01
The present investigation is concerned with an analysis of the atmospheric angular momentum balance, based on the simulation data of the Oregon State University two-level atmospheric general circulation model (AGCM). An attempt is also made to gain an understanding of the involved processes. Preliminary results on the angular momentum and mass balance in the AGCM are shown. The basic equations are examined, and questions of turbulent momentum transfer are investigated. The methods of analysis are discussed, taking into account time-averaged balance equations, time and longitude-averaged balance equations, mean meridional circulation, the mean meridional balance of relative angular momentum, and standing and transient components of motion.
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinarian students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined by hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessments in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in a virtual reality simulator showed correlation with experience, or to the basic laparoscopic skills score. All metrics in augmented reality were significantly correlated with experience (time, instrument path, and economy of movement), except for the hand dominance metric. The basic laparoscopic skills score was correlated to all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct validity and concurrent validity for motion analysis metrics for an augmented reality system, whereas a virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
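Motion metrics such as instrument path length and economy of movement are typically derived from sampled instrument-tip positions. A minimal sketch follows; definitions vary between simulators, and the formulas below are one common convention, not necessarily those of the systems studied:

```python
import numpy as np

def path_length(points):
    """Total distance travelled by an instrument tip, from sampled
    3-D positions: the sum of segment-to-segment Euclidean distances."""
    points = np.asarray(points, dtype=float)
    return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

def economy_of_motion(points):
    """Straight-line distance from start to end divided by the actual
    path length (1.0 = perfectly direct movement)."""
    points = np.asarray(points, dtype=float)
    direct = np.linalg.norm(points[-1] - points[0])
    return direct / path_length(points)
```

For example, a tip that moves 3 units right and then 4 units up has a path length of 7 but a direct distance of 5, so its economy of motion is 5/7.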
Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1
NASA Astrophysics Data System (ADS)
Lee, F. C.; Mahmoud, M. F.; Yu, Y.
1980-04-01
The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, stable feedback control system, good dynamic transient response and adaptive compensation of the control loop for changes in open loop gain and output filter time constraints. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC Converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC Converter types using the constant on time, constant off time and constant frequency control laws.
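For context, the ideal steady-state conversion ratios of the three topologies the SCM targets are standard textbook results (continuous conduction mode, lossless elements assumed; this sketch is not part of the SCM handbook itself):

```python
def conversion_ratio(topology, duty):
    """Ideal steady-state output/input voltage ratio of a DC-DC converter
    in continuous conduction mode, as a function of duty cycle D."""
    if not 0 <= duty < 1:
        raise ValueError("duty cycle must be in [0, 1)")
    if topology == "buck":
        return duty                        # Vo/Vin = D
    if topology == "boost":
        return 1.0 / (1.0 - duty)          # Vo/Vin = 1/(1-D)
    if topology == "buck/boost":
        return -duty / (1.0 - duty)        # Vo/Vin = -D/(1-D), inverted polarity
    raise ValueError(f"unknown topology: {topology}")
```

At D = 0.5, for instance, the buck halves the input voltage, the boost doubles it, and the buck/boost inverts it at unity magnitude.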
A structural equation modeling analysis of students' understanding in basic mathematics
NASA Astrophysics Data System (ADS)
Oktavia, Rini; Arif, Salmawaty; Ferdhiana, Ridha; Yuni, Syarifah Meurah; Ihsan, Mahyus
2017-11-01
This research, in general, aims to identify incoming students' understanding and misconceptions of several basic concepts in mathematics. The participants of this study are the 2015 incoming students of the Faculty of Mathematics and Natural Science of Syiah Kuala University, Indonesia. Using an instrument that was developed based on anecdotal and empirical evidence of students' misconceptions, a survey involving 325 participants was administered, and several quantitative and qualitative analyses of the survey data were conducted. In this article, we discuss the confirmatory factor analysis, using Structural Equation Modeling (SEM), of factors that determine the new students' overall understanding of basic mathematics. The results showed that students' understanding of algebra, arithmetic, and geometry were significant predictors of their overall understanding of basic mathematics. This result supports the view that arithmetic and algebra are not the only predictors of students' understanding of basic mathematics.
Translational research in behavior analysis: historical traditions and imperative for the future.
Mace, F Charles; Critchfield, Thomas S
2010-05-01
"Pure basic" science can become detached from the natural world that it is supposed to explain. "Pure applied" work can become detached from fundamental processes that shape the world it is supposed to improve. Neither demands the intellectual support of a broad scholarly community or the material support of society. Translational research can do better by seeking innovation in theory or practice through the synthesis of basic and applied questions, literatures, and methods. Although translational thinking has always occurred in behavior analysis, progress often has been constrained by a functional separation of basic and applied communities. A review of translational traditions in behavior analysis suggests that innovation is most likely when individuals with basic and applied expertise collaborate. Such innovation may have to accelerate for behavior analysis to be taken seriously as a general-purpose science of behavior. We discuss the need for better coordination between the basic and applied sectors, and argue that such coordination compromises neither while benefiting both.
Changes in food and beverage environments after an urban corner store intervention.
Cavanaugh, Erica; Green, Sarah; Mallya, Giridhar; Tierney, Ann; Brensinger, Colleen; Glanz, Karen
2014-08-01
In response to the obesity epidemic, interventions to improve the food environment in corner stores have gained attention. This study evaluated the availability, quality, and price of foods in Philadelphia corner stores before and after a healthy corner store intervention with two levels of intervention intensity ("basic" and "conversion"). Observational measures of the food environment were completed in 2011 and again in 2012 in corner stores participating in the intervention, using the Nutrition Environment Measures Survey for Corner Stores (NEMS-CS). Main analyses included the 211 stores evaluated at both time-points. A time-by-treatment interaction analysis was used to evaluate the changes in NEMS-CS scores by intervention level over time. Availability of fresh fruit increased significantly in conversion stores over time. Specifically, there were significant increases in the availability of apples, oranges, grapes, and broccoli in conversion stores over time. Conversion stores showed a trend toward a significantly larger increase in the availability score compared to basic stores over time. Interventions aimed at increasing healthy food availability are associated with improvements in the availability of low-fat milk, fruits, and some vegetables, especially when infrastructure changes, such as refrigeration and shelving enhancements, are offered. Copyright © 2014 Elsevier Inc. All rights reserved.
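The time-by-treatment interaction used here is, in essence, a difference-in-differences contrast: the change in conversion stores minus the change in basic stores. With invented scores (not the study's data):

```python
# Hypothetical mean NEMS-CS availability scores by intervention arm and year.
# These numbers are invented for illustration only.
scores = {
    ("basic", 2011): 10.0, ("basic", 2012): 10.5,
    ("conversion", 2011): 10.2, ("conversion", 2012): 12.1,
}

change_basic = scores[("basic", 2012)] - scores[("basic", 2011)]
change_conv = scores[("conversion", 2012)] - scores[("conversion", 2011)]
# The time-by-treatment interaction is the difference in change between arms.
interaction = change_conv - change_basic
```

A positive interaction indicates that conversion stores improved more over time than basic stores; the study's actual analysis estimates this within a regression model rather than from raw means.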
Time-lapse microscopy and image analysis in basic and clinical embryo development research.
Wong, C; Chen, A A; Behr, B; Shen, S
2013-02-01
Mammalian preimplantation embryo development is a complex process in which the exact timing and sequence of events are as essential as the accurate execution of the events themselves. Time-lapse microscopy (TLM) is an ideal tool to study this process since the ability to capture images over time provides a combination of morphological, dynamic and quantitative information about developmental events. Here, we systematically review the application of TLM in basic and clinical embryo research. We identified all relevant preimplantation embryo TLM studies published in English up to May 2012 using PubMed and Google Scholar. We then analysed the technical challenges involved in embryo TLM studies and how these challenges may be overcome with technological innovations. Finally, we reviewed the different types of TLM embryo studies, with a special focus on how TLM can benefit clinical assisted reproduction. Although new parameters predictive of embryo development potential may be discovered and used clinically to potentially increase the success rate of IVF, adopting TLM to routine clinical practice will require innovations in both optics and image analysis. Combined with such innovations, TLM may provide embryologists and clinicians with an important tool for making critical decisions in assisted reproduction. In this review, we perform a literature search of all published early embryo development studies that used time-lapse microscopy (TLM). From the literature, we discuss the benefits of TLM over traditional time-point analysis, as well as the technical difficulties and solutions involved in implementing TLM for embryo studies. We further discuss research that has successfully derived non-invasive markers that may increase the success rate of assisted reproductive technologies, primarily IVF. Most notably, we extend our discussion to highlight important considerations for the practical use of TLM in research and clinical settings. 
Copyright © 2012 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...
Kimura, Kosei; Wada, Akira; Ueta, Masami; Ogata, Akihiko; Tanaka, Satoru; Sakai, Akiko; Yoshida, Hideji; Fushitani, Hideo; Miyamoto, Akiko; Fukushima, Masakazu; Uchiumi, Toshio; Tanigawa, Nobuhiko
2010-11-01
Many auxiliary functions of ribosomal proteins (r-proteins) have received considerable attention in recent years. However, human r-proteins have hardly been examined by proteomic analysis. In this study, we isolated ribosomal particles and subsequently compared the proteome of r-proteins between the DLD-1 human colon cancer cell line and its 5-fluorouracil (5-FU)-resistant sub-line, DLD-1/5-FU, using the radical-free and highly reducing method of two-dimensional polyacrylamide gel electrophoresis, which has a superior ability to separate basic proteins, and we discuss the role of r-proteins in 5-FU resistance. Densitometric analysis was performed to quantify modulated proteins, and protein spots showing significant changes were identified by employing matrix-assisted laser desorption/ionization time-of-flight/time-of-flight mass spectrometry. Three basic proteins (L15, L37 and prohibitin) which were significantly modulated between DLD-1 and DLD-1/5-FU were identified. Two proteins, L15 and L37, showed down-regulated expression in DLD-1/5-FU in comparison to DLD-1. Prohibitin, which is not an r-protein and is known to be localized in the mitochondria, showed up-regulated expression in DLD-1/5-FU. These 3 proteins may be related to 5-FU resistance.
PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems
NASA Astrophysics Data System (ADS)
Bono, Andrea; Badiali, Lucio
2005-02-01
Personal WaveLab 1.0 is intended as the starting point for an ex novo development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. Firstly, being itself a stand-alone application, it allows the user to perform "basic" digital or digitised seismic waveform analysis. Secondly, thanks to its architectural characteristics, it can serve as the basis for the development of more complex, feature-rich applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for Civil Protection purposes. This means that about 90 users tested the application for more than 1 year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyragongo Project, in Congo, and during the Stromboli emergency in the summer of 2002. The main appeals of the application package are: ease of use, object-oriented design, good computational speed, minimal need for disk space, and the complete absence of third-party developed components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. Microsoft Visual Basic 6 source code, installation package, test data sets and documentation are available at no cost.
Dai, Yuntao; Rozema, Evelien; Verpoorte, Robert; Choi, Young Hae
2016-02-19
Natural deep eutectic solvents (NADES) have attracted a great deal of attention in recent times as promising green media. They are generally composed of neutral, acidic or basic compounds that form liquids of high viscosity when mixed in certain molar ratios. Despite their potential, the viscosity and the acidic or basic nature of some ingredients may affect the extraction capacity and the stabilizing ability of the target compounds. To investigate these effects, extraction with a series of NADES was employed for the analysis of anthocyanins in flower petals of Catharanthus roseus, in combination with HPLC-DAD-based metabolic profiling. Along with the extraction yields of anthocyanins, their stability in NADES was also studied. Multivariate data analysis indicates that the lactic acid-glucose (LGH) and 1,2-propanediol-choline chloride (PCH) NADES present an extraction power for anthocyanins similar to that of conventional organic solvents. Furthermore, among the NADES employed, LGH exhibits a stabilizing capacity for cyanidins at least three times higher than that of acidified ethanol, which facilitates their extraction and analysis. Compared to conventional organic solvents, in addition to their reduced environmental impact, NADES proved to provide higher stability for anthocyanins, and therefore have great potential as alternatives to organic solvents in health-related areas such as food, pharmaceuticals and cosmetics. Copyright © 2016 Elsevier B.V. All rights reserved.
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.
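As a point of reference for what fast probability integration accelerates, the brute-force alternative is direct Monte Carlo sampling of a limit state. A toy sketch with an invented stress model and distributions (not the PSAM codes or data):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Invented limit state: bending stress sigma = M / Z with a random applied
# moment M and section modulus Z; failure when stress exceeds an allowable.
M = rng.normal(5.0e3, 0.5e3, n)      # applied moment, N*m
Z = rng.normal(2.0e-4, 1.0e-5, n)    # section modulus, m^3
stress = M / Z                       # Pa

allowable = 3.5e7                    # allowable stress, Pa
p_fail = np.mean(stress > allowable)
```

Direct sampling like this needs very many samples to resolve small failure probabilities, which is precisely the cost FPI-style approximations are designed to avoid.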
Djioua, Moussa; Plamondon, Réjean
2009-11-01
In this paper, we present a new analytical method for estimating the parameters of Delta-Lognormal functions and characterizing handwriting strokes. According to the Kinematic Theory of rapid human movements, these parameters contain information on both the motor commands and the timing properties of a neuromuscular system. The new algorithm, called XZERO, exploits relationships between the zero crossings of the first and second time derivatives of a lognormal function and its four basic parameters. The methodology is described and then evaluated under various testing conditions. The new tool allows a greater variety of stroke patterns to be processed automatically. Furthermore, for the first time, the extraction accuracy is quantified empirically, taking advantage of the exponential relationships that link the dispersion of the extraction errors with its signal-to-noise ratio. A new extraction system which combines this algorithm with two other previously published methods is also described and evaluated. This system provides researchers involved in various domains of pattern analysis and artificial intelligence with new tools for the basic study of single strokes as primitives for understanding rapid human movements.
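The lognormal speed profile central to the Kinematic Theory, and the zero crossings of the first and second time derivatives that XZERO exploits, can be reproduced numerically. The parameter values below are arbitrary, and this is an illustration of the underlying relationships, not the XZERO algorithm itself:

```python
import numpy as np

def lognormal_speed(t, D=1.0, t0=0.0, mu=-1.8, sigma=0.3):
    """Lognormal speed profile of a rapid stroke (t0 = stroke start time)."""
    tt = t - t0
    out = np.zeros_like(t)
    pos = tt > 0
    out[pos] = D / (sigma * np.sqrt(2 * np.pi) * tt[pos]) * np.exp(
        -((np.log(tt[pos]) - mu) ** 2) / (2 * sigma**2))
    return out

t = np.linspace(0.001, 1.0, 20_000)
v = lognormal_speed(t)
dv = np.gradient(v, t)     # first time derivative
d2v = np.gradient(dv, t)   # second time derivative

def zero_crossings(y, t):
    """Times where y changes sign between adjacent samples."""
    s = np.sign(y)
    return t[np.where(s[:-1] * s[1:] < 0)[0]]

t_peak = zero_crossings(dv, t)       # one crossing: the velocity maximum
t_inflect = zero_crossings(d2v, t)   # two crossings: the inflection points
```

The locations of these crossings are functions of the lognormal parameters, which is what makes them usable for parameter extraction.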
NASA Technical Reports Server (NTRS)
1971-01-01
Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.
Highest cited papers published in Neurology India: An analysis for the years 1993-2014.
Pandey, Paritosh; Subeikshanan, V; Madhugiri, Venkatesh S
2016-01-01
The highest cited papers published in a journal provide a snapshot of the clinical practice and research in that specialty and/or region. The aim of this study was to determine the highest cited papers published in Neurology India and analyze their attributes. This study was a citation analysis of all papers published in Neurology India since online archiving commenced in 1993. All papers published in Neurology India between the years 1993-2014 were listed. The number of times each paper had been cited up till the time of performing this study was determined by performing a Google Scholar search. Published papers were then ranked on the basis of total times cited since publication and the annual citation rate. Statistical Techniques: Simple counts and percentages were used to report most results. The mean citations received by papers in various categories were compared using the Student's t-test or a one-way analysis of variance, as appropriate. All analyses were carried out on SAS University Edition (SAS/STAT®, SAS Institute Inc, NC, USA) and graphs were generated on MS Excel 2016. The top papers on the total citations and annual citation rate rank lists pertained to basic neuroscience research. The highest cited paper overall had received 139 citations. About a quarter of the papers published had never been cited at all. The major themes represented were vascular diseases and infections. The highest cited papers reflect the diseases that are of major concern in India. Certain domains such as trauma, allied neurosciences, and basic neuroscience research were underrepresented.
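The two rankings used, total citations and annual citation rate, differ only in normalizing by years since publication. A trivial sketch with invented records (not the journal's actual data):

```python
# Hypothetical records: (title, year_published, total_citations).
papers = [
    ("Paper A", 1995, 139),
    ("Paper B", 2010, 60),
    ("Paper C", 2014, 11),
]

census_year = 2015  # year at which citations were counted

def annual_rate(paper):
    """Citations per year elapsed since publication."""
    _, year, cites = paper
    return cites / (census_year - year)

# Rank by annual citation rate; ranking by total citations would instead
# use the raw count and favor older papers.
ranked = sorted(papers, key=annual_rate, reverse=True)
```

Here the oldest paper has the most total citations but the lowest annual rate, which is why the two rank lists in such analyses can differ.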
Flightspeed Integral Image Analysis Toolkit
NASA Technical Reports Server (NTRS)
Thompson, David R.
2009-01-01
The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
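The integral image (summed-area table) the toolkit is built on can be sketched briefly: each entry stores the sum of all pixels above and to the left, so any rectangular region sum costs four lookups. This is a generic illustration of the data structure, not FIIAT's C implementation:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[r, c] = sum of img[:r, :c],
    so region sums need no special casing at the top/left edges."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def region_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) using four table lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
```

Once the table is built in a single pass, every rectangular sum, and hence many box-filter and texture computations, runs in constant time regardless of region size.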
ERIC Educational Resources Information Center
Kadane, Joseph B.; And Others
This paper offers a preliminary analysis of the effects of a semi-segregated school system on the IQ's of its students. The basic data consist of IQ scores for fourth, sixth, and eighth grades and associated environmental data obtained from their school records. A statistical model is developed to analyze longitudinal data when both process error…
Janna B. Custer; Dale J. Blahna
2000-01-01
Many studies have taken place that seek to gain an understanding of the influences upon attachment to special places. These studies have been largely qualitative in nature and have succeeded in identifying quantifiable variables that can be useful in measuring basic levels of attachment to special places, e.g., length of time that one has been associated with special...
Estimating the Cost of a Bachelor's Degree: An Institutional Cost Analysis.
ERIC Educational Resources Information Center
To, Duc-Le
The cost of a bachelor's degree was estimated and compared for different types of institutions. The objective was to develop a single index to show how much each type of institution spends on producing a bachelor's degree graduate, and to use trend data to show how these costs will change over time. The basic concept associated with the cost of a…
Development of wavelet analysis tools for turbulence
NASA Technical Reports Server (NTRS)
Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.
1992-01-01
Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
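The Haar wavelet is the simplest concrete example of the basic wavelet properties the abstract reviews. A minimal one-level decomposition and its inverse (an illustration only, not the project's software) can be sketched as:

```python
import numpy as np

def haar_step(signal):
    """One level of the orthonormal Haar transform: pairwise averages
    (approximation) and differences (detail), each scaled by 1/sqrt(2)."""
    x = np.asarray(signal, dtype=float)
    assert len(x) % 2 == 0, "need an even-length signal"
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_inverse_step(approx, detail):
    """Perfect reconstruction from one decomposition level."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x
```

Because the transform is orthonormal, energy is preserved across levels, which is one of the properties that makes wavelets attractive for turbulence analysis.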
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Approximation methods for stochastic petri nets
NASA Technical Reports Server (NTRS)
Jungnitz, Hauke Joerg
1992-01-01
Stochastic Marked Graphs are a concurrent, decision-free formalism provided with a powerful synchronization mechanism generalizing conventional Fork-Join Queueing Networks. In some particular cases the analysis of the throughput can be done analytically; otherwise the analysis suffers from the classical state explosion problem. Embedded in the divide-and-conquer paradigm, approximation techniques are introduced for the analysis of stochastic marked graphs and Macroplace/Macrotransition-nets (MPMT-nets), a new subclass introduced herein. MPMT-nets are a subclass of Petri nets that allow limited choice, concurrency, and sharing of resources. The modeling power of MPMT-nets is much larger than that of marked graphs; e.g., MPMT-nets can model manufacturing flow lines with unreliable machines and dataflow graphs where choice and synchronization occur. The basic idea leads to the notion of a cut to split the original net system into two subnets. The cuts lead to two aggregated net systems in which one of the subnets is reduced to a single transition; a further reduction leads to a basic skeleton. The generalization of the idea leads to multiple cuts, where single cuts can be applied recursively, leading to a hierarchical decomposition. Based on the decomposition, a response time approximation technique for the performance analysis is introduced. Also, delay equivalence, which has previously been introduced in the context of marked graphs by Woodside et al., Marie's method, and flow equivalent aggregation are applied to the aggregated net systems. The experimental results show that response time approximation converges quickly and shows reasonable accuracy in most cases. The convergence of Marie's method is slower, but its accuracy is generally better. Delay equivalence often fails to converge, while flow equivalent aggregation can lead to potentially bad results if the mean completion time depends strongly on the interarrival process.
Beyond linear methods of data analysis: time series analysis and its applications in renal research.
Gupta, Ashwani K; Udrea, Andreea
2013-01-01
Analysis of temporal trends in medicine is needed to understand normal physiology and to study the evolution of disease processes. It is also useful for monitoring response to drugs and interventions, and for accountability and tracking of health care resources. In this review, we discuss what makes time series analysis unique for the purposes of renal research and its limitations. We also introduce nonlinear time series analysis methods and provide examples where these have advantages over linear methods. We review areas where these computational methods have found applications in nephrology ranging from basic physiology to health services research. Some examples include noninvasive assessment of autonomic function in patients with chronic kidney disease, dialysis-dependent renal failure and renal transplantation. Time series models and analysis methods have been utilized in the characterization of mechanisms of renal autoregulation and to identify the interaction between different rhythms of nephron pressure flow regulation. They have also been used in the study of trends in health care delivery. Time series are everywhere in nephrology and analyzing them can lead to valuable knowledge discovery. The study of time trends of vital signs, laboratory parameters and the health status of patients is inherent to our everyday clinical practice, yet formal models and methods for time series analysis are not fully utilized. With this review, we hope to familiarize the reader with these techniques in order to assist in their proper use where appropriate.
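Sample entropy is one widely used nonlinear time-series measure of the kind this review discusses: it quantifies how unpredictable a signal is by comparing template matches of length m and m+1. A minimal sketch (not tied to any specific study in the review; the defaults m=2, r=0.2 are common conventions, not prescriptions):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -log of the ratio of (m+1)-length to m-length template
    matches within tolerance r * std (Chebyshev distance), self-matches excluded.
    Lower values indicate more regular (predictable) signals."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(order):
        templates = np.array([x[i:i + order] for i in range(len(x) - order + 1)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A strictly periodic series scores near zero, while white noise scores much higher, which is why such measures can separate regular from irregular physiological rhythms where linear statistics cannot.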
A new fractional wavelet transform
NASA Astrophysics Data System (ADS)
Dai, Hongzhe; Zheng, Zhibao; Wang, Wei
2017-03-01
The fractional Fourier transform (FRFT) is a potent tool for analyzing time-varying signals. However, it fails to localize the fractional Fourier domain (FRFD)-frequency contents, which is required in some applications. A novel fractional wavelet transform (FRWT) is proposed to solve this problem. It displays the time and FRFD-frequency information jointly in the time-FRFD-frequency plane. The definition, basic properties, inverse transform, and reproducing kernel of the proposed FRWT are considered. It is shown that an FRWT with proper order corresponds to the classical wavelet transform (WT). The multiresolution analysis (MRA) associated with the developed FRWT, together with the construction of orthogonal fractional wavelets, is also presented. Three applications are discussed: the analysis of signals with time-varying frequency content, FRFD spectrum estimation of signals involving noise, and the construction of the fractional Haar wavelet. Simulations verify the validity of the proposed FRWT.
Huart, Caroline; Legrain, Valéry; Hummel, Thomas; Rombaux, Philippe; Mouraux, André
2012-01-01
Background The recording of olfactory and trigeminal chemosensory event-related potentials (ERPs) has been proposed as an objective and non-invasive technique to study the cortical processing of odors in humans. Until now, the responses have been characterized mainly using across-trial averaging in the time domain. Unfortunately, chemosensory ERPs, in particular, olfactory ERPs, exhibit a relatively low signal-to-noise ratio. Hence, although the technique is increasingly used in basic research as well as in clinical practice to evaluate people suffering from olfactory disorders, its current clinical relevance remains very limited. Here, we used a time-frequency analysis based on the wavelet transform to reveal EEG responses that are not strictly phase-locked to onset of the chemosensory stimulus. We hypothesized that this approach would significantly enhance the signal-to-noise ratio of the EEG responses to chemosensory stimulation because, as compared to conventional time-domain averaging, (1) it is less sensitive to temporal jitter and (2) it can reveal non phase-locked EEG responses such as event-related synchronization and desynchronization. Methodology/Principal Findings EEG responses to selective trigeminal and olfactory stimulation were recorded in 11 normosmic subjects. A Morlet wavelet was used to characterize the elicited responses in the time-frequency domain. We found that this approach markedly improved the signal-to-noise ratio of the obtained EEG responses, in particular, following olfactory stimulation. Furthermore, the approach allowed characterizing non phase-locked components that could not be identified using conventional time-domain averaging. Conclusion/Significance By providing a more robust and complete view of how odors are represented in the human brain, our approach could constitute the basis for a robust tool to study olfaction, both for basic research and clinicians. PMID:22427997
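The time-frequency approach described, convolving the EEG with complex Morlet wavelets and taking power, can be sketched in a few lines. The following Python illustration is not the study's exact pipeline (the seven-cycle width and the normalization are my assumptions), but it shows how non-phase-locked power becomes visible in a time-frequency map:

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=7):
    """Time-frequency power via convolution with complex Morlet wavelets.
    Returns an array of shape (len(freqs), len(signal))."""
    signal = np.asarray(signal, dtype=float)
    power = np.empty((len(freqs), len(signal)))
    for k, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)            # Gaussian width in seconds
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet)**2))  # unit-energy normalization
        conv = np.convolve(signal, wavelet, mode="same")
        power[k] = np.abs(conv)**2                      # magnitude ignores phase
    return power
```

Because power discards phase, averaging these maps across trials preserves responses that jitter in latency from trial to trial, which is the key advantage over time-domain averaging noted in the abstract.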
5 CFR 551.401 - Basic principles.
Code of Federal Regulations, 2011 CFR
2011-01-01
... direction of the agency is “hours of work.” Such time includes: (1) Time during which an employee is required to be on duty; (2) Time during which an employee is suffered or permitted to work; and (3) Waiting... FAIR LABOR STANDARDS ACT Hours of Work General Provisions § 551.401 Basic principles. (a) All time...
NASA Astrophysics Data System (ADS)
Shahzad, Syed Jawad Hussain; Nor, Safwan Mohd; Mensi, Walid; Kumar, Ronald Ravinesh
2017-04-01
This study examines the power law properties of 11 US credit and stock markets at the industry level. We use multifractal detrended fluctuation analysis (MF-DFA) and multifractal detrended cross-correlation analysis (MF-DXA) to first investigate the relative efficiency of credit and stock markets and then evaluate the mutual interdependence between CDS-equity market pairs. The scaling exponents of the MF-DFA approach suggest that CDS markets are relatively more inefficient than their equity counterparts. However, Banks and Financial credit markets are relatively more efficient. Basic Materials (both CDS and equity indices) is the most inefficient sector of the US economy. The cross-correlation exponents obtained through MF-DXA also suggest that the relationship of the CDS and equity sectors within and across markets is multifractal for all pairs. Within the CDS market, Basic Materials is the most dependent sector, whereas equity market sectors can be divided into two distinct groups based on interdependence. The pair-wise dependence between Basic Materials sector CDSs and the equity index is also the highest. The degree of cross-correlation shows that the sectoral pairs of CDS and equity markets belong to a persistent cross-correlated series within selected time intervals.
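MF-DFA generalizes ordinary detrended fluctuation analysis by weighting window fluctuations with a moment order q. A minimal sketch of the monofractal special case (q = 2), not the authors' implementation, shows where the scaling exponent that grades market efficiency comes from:

```python
import numpy as np

def dfa(x, scales):
    """Ordinary (monofractal) DFA: integrate the series, remove a linear trend
    in each window of size s, and take the RMS fluctuation F(s); the slope of
    log F versus log s estimates the scaling exponent. MF-DFA generalizes this
    by taking q-th order moments of the window fluctuations."""
    profile = np.cumsum(x - np.mean(x))                 # integrated series
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)                # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t))**2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope  # ~0.5 for white noise, ~1.5 for a random walk
```

An exponent near 0.5 indicates an uncorrelated (efficient-market-like) series; persistent deviations from 0.5, and variation of the exponent with q, are what the MF-DFA scaling exponents in the study measure.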
A new method of differential structural analysis of gamma-family basic parameters
NASA Technical Reports Server (NTRS)
Melkumian, L. G.; Ter-Antonian, S. V.; Smorodin, Y. A.
1985-01-01
The maximum likelihood method is used for the first time to restore parameters of electron-photon cascades registered on X-ray films. The method permits one to carry out a structural analysis of the darkening spots of a gamma-quanta family independent of the degree to which the gamma quanta overlap, and to obtain the maximum admissible accuracies in estimating the energies of the gamma quanta composing a family. The parameter estimation accuracy depends only weakly on the values of the parameters themselves and exceeds by an order of magnitude the values obtained by integral methods.
1988-01-01
that basic terms such as physical objects, positions, etc., are used over and over again. We have built a library of such terms and have provided mechanisms... Goals of a Performance Estimator Assistant: As defined in [2], the long-term goal of a Performance Estimator Assistant (PEA) is to aid in the... characterization. (Figure 1: Current Paradigm.) Mid-term goals are: domain models for analysis, algorithm design analysis and advice, and real-time...
A comparative analysis of massed vs. distributed practice on basic math fact fluency growth rates.
Schutte, Greg M; Duhon, Gary J; Solomon, Benjamin G; Poncy, Brian C; Moore, Kathryn; Story, Bailey
2015-04-01
To best remediate academic deficiencies, educators need to not only identify empirically validated interventions but also be able to apply instructional modifications that result in more efficient student learning. The current study compared the effect of massed and distributed practice with an explicit timing intervention to evaluate the extent to which these modifications lead to increased math fact fluency on basic addition problems. Forty-eight third-grade students were placed into one of three groups with each of the groups completing four 1-min math explicit timing procedures each day across 19 days. Group one completed all four 1-min timings consecutively; group two completed two back-to-back 1-min timings in the morning and two back-to-back 1-min timings in the afternoon, and group three completed one, 1-min independent timing four times distributed across the day. Growth curve modeling was used to examine the progress throughout the course of the study. Results suggested that students in the distributed practice conditions, both four times per day and two times per day, showed significantly higher fluency growth rates than those practicing only once per day in a massed format. These results indicate that combining distributed practice with explicit timing procedures is a useful modification that enhances student learning without the addition of extra instructional time when targeting math fact fluency. Copyright © 2015 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hattaf, Khalid; Mahrouf, Marouane; Adnani, Jihad; Yousfi, Noura
2018-01-01
In this paper, we propose a stochastic delayed epidemic model with a specific functional response. The time delay represents the temporary immunity period, i.e., the time from recovery to becoming susceptible again. We first show that the proposed model is mathematically and biologically well-posed. Moreover, the extinction of the disease and the persistence in the mean are established in terms of a threshold value R0S, which is smaller than the basic reproduction number R0 of the corresponding deterministic system.
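The authors' model has a specific functional response and its own threshold R0S; purely as a generic illustration of the ingredients (a fixed immunity delay and multiplicative noise on transmission), a delayed SIRS-type system can be integrated with an Euler-Maruyama scheme. All parameter values below are placeholders, not values from the paper:

```python
import numpy as np

def simulate_delayed_sirs(beta=0.4, gamma=0.1, tau=20.0, sigma=0.02,
                          dt=0.1, t_end=400.0, seed=0):
    """Euler-Maruyama simulation of a delayed SIRS-type model: recovered
    individuals become susceptible again after a fixed delay tau (temporary
    immunity); sigma scales multiplicative noise on the transmission term."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    lag = int(tau / dt)
    S = np.empty(n); I = np.empty(n); R = np.empty(n)
    S[0], I[0], R[0] = 0.99, 0.01, 0.0
    recovery_flux = np.zeros(n)        # gamma*I recorded for the delayed return
    for k in range(n - 1):
        recovery_flux[k] = gamma * I[k]
        back = recovery_flux[k - lag] if k >= lag else 0.0   # zero history
        noise = sigma * S[k] * I[k] * np.sqrt(dt) * rng.standard_normal()
        new_inf = beta * S[k] * I[k] * dt + noise
        S[k + 1] = S[k] - new_inf + back * dt
        I[k + 1] = I[k] + new_inf - gamma * I[k] * dt
        R[k + 1] = R[k] + gamma * I[k] * dt - back * dt
    return S, I, R
```

Each flow appears once as a loss and once as a gain, so the total population is conserved exactly; in the deterministic limit (sigma = 0) the epidemic grows whenever beta * S > gamma, i.e., when the effective reproduction number exceeds one.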
ROLE OF TIMING IN ASSESSMENT OF NERVE REGENERATION
BRENNER, MICHAEL J.; MORADZADEH, ARASH; MYCKATYN, TERENCE M.; TUNG, THOMAS H. H.; MENDEZ, ALLEN B.; HUNTER, DANIEL A.; MACKINNON, SUSAN E.
2014-01-01
Small animal models are indispensable for research on nerve injury and reconstruction, but their superlative regenerative potential may confound experimental interpretation. This study investigated time-dependent neuroregenerative phenomena in rodents. Forty-six Lewis rats were randomized to three nerve allograft groups treated with 2 mg/(kg day) tacrolimus; 5 mg/(kg day) Cyclosporine A; or placebo injection. Nerves were subjected to histomorphometric and walking track analysis at serial time points. Tacrolimus increased fiber density, percent neural tissue, and nerve fiber count and accelerated functional recovery at 40 days, but these differences were undetectable by 70 days. Serial walking track analysis showed a similar pattern of recovery. A ‘blow-through’ effect is observed in rodents whereby an advancing nerve front overcomes an experimental defect given sufficient time, rendering experimental groups indistinguishable at late time points. Selection of validated time points and corroboration in higher animal models are essential prerequisites for the clinical application of basic research on nerve regeneration. PMID:18381659
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
ERIC Educational Resources Information Center
Jackson, James; Dixon, Mark R.
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device
NASA Astrophysics Data System (ADS)
Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.
2010-12-01
The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of a minute is crucial to reaching the mission goals. A specific device, the Basic Angle Monitoring (BAM), will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.
Shimon, Daphna; Feintuch, Akiva; Goldfarb, Daniella; Vega, Shimon
2014-04-14
To study the solid-state (1)H-DNP mechanism of the biradical TOTAPOL under static conditions, the frequency-swept DNP enhancement spectra of samples containing 20 mM and 5 mM TOTAPOL were measured as a function of MW irradiation time and temperature. We observed that under static DNP conditions the biradical TOTAPOL behaves similarly to the monoradical TEMPOL, in contrast to MAS DNP, where TOTAPOL is considerably more effective. As previously done for TEMPOL, the TOTAPOL DNP spectra were analyzed by taking a superposition of a basic SE-DNP lineshape and a basic CE-DNP lineshape with different amplitudes. The analysis of the steady-state DNP spectra showed that the SE was dominant in the 6-10 K range and the CE was dominant above 10 K. DNP spectra obtained as a function of MW irradiation time allowed resolving the individual SE and CE buildup times. At low temperatures the SE buildup time was faster than the CE buildup time, and at all temperatures the CE buildup time was close to the nuclear spin-lattice relaxation time, T1n. Polarization calculations involving nuclear spin diffusion for a model system of one electron and many nuclei suggested that the shortening of T1n with increasing temperature is the reason why the SE contribution to the overall enhancement was reduced.
Distinguishing Fast and Slow Processes in Accuracy - Response Time Data.
Coomans, Frederik; Hofman, Abe; Brinkhuis, Matthieu; van der Maas, Han L J; Maris, Gunter
2016-01-01
We investigate the relation between speed and accuracy within problem solving in its simplest non-trivial form. We consider tests with only two items and code the item responses in two binary variables: one indicating the response accuracy, and one indicating the response speed. Despite being a very basic setup, it enables us to study item pairs stemming from a broad range of domains such as basic arithmetic, first language learning, intelligence-related problems, and chess, with large numbers of observations for every pair of problems under consideration. We carry out a survey over a large number of such item pairs and compare three types of psychometric accuracy-response time models present in the literature: two 'one-process' models, the first of which models accuracy and response time as conditionally independent and the second of which models accuracy and response time as conditionally dependent, and a 'two-process' model which models accuracy contingent on response time. We find that the data clearly violates the restrictions imposed by both one-process models and requires additional complexity which is parsimoniously provided by the two-process model. We supplement our survey with an analysis of the erroneous responses for an example item pair and demonstrate that there are very significant differences between the types of errors in fast and slow responses.
Basic research needed for stimulating the development of behavioral technologies
Mace, F. Charles
1994-01-01
The costs of disconnection between the basic and applied sectors of behavior analysis are reviewed, and some solutions to these problems are proposed. Central to these solutions are collaborations between basic and applied behavioral scientists in programmatic research that addresses the behavioral basis and solution of human behavior problems. This kind of collaboration parallels the deliberate interactions between basic and applied researchers that have proven to be so profitable in other scientific fields, such as medicine. Basic research questions of particular relevance to the development of behavioral technologies are posed in the following areas: response allocation, resistance to change, countercontrol, formation and differentiation/discrimination of stimulus and response classes, analysis of low-rate behavior, and rule-governed behavior. Three interrelated strategies to build connections between the basic and applied analysis of behavior are identified: (a) the development of nonhuman animal models of human behavior problems using operations that parallel plausible human circumstances, (b) replication of the modeled relations with human subjects in the operant laboratory, and (c) tests of the generality of the model with actual human problems in natural settings. PMID:16812734
The detection and analysis of point processes in biological signals
NASA Technical Reports Server (NTRS)
Anderson, D. J.; Correia, M. J.
1977-01-01
A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
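A conventional threshold detector of the kind the introductory sections describe, followed by an inter-event interval computation, can be sketched in a few lines. This is a Python illustration of the general idea, not the authors' laboratory-computer implementation:

```python
import numpy as np

def detect_events(signal, threshold):
    """Indices where the signal crosses the threshold upward (rising edges),
    i.e. a conventional threshold event detector."""
    above = signal >= threshold
    rising = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return rising

def inter_event_intervals(event_indices, fs):
    """Inter-event intervals in seconds; their statistics characterize the
    underlying point process (e.g. a homogeneous Poisson process yields an
    exponential interval distribution)."""
    return np.diff(event_indices) / fs
```

The sampling rate fs sets the event-time resolution: intervals are quantized to 1/fs, which is exactly the resolution limit whose effect on the subsequent point-process analysis the paper considers.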
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
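SEDIDAT converts settling velocity to equivalent spherical diameter with a modified Gibbs equation. As a hedged stand-in, the simpler Stokes' law (valid only for fine grains at low Reynolds number, and not the equation the programs actually use) gives the flavor of the computation, along with the Phi-unit conversion:

```python
import math

def stokes_diameter(w, rho_particle=2650.0, rho_fluid=1000.0,
                    mu=1.0e-3, g=9.81):
    """Equivalent spherical diameter (m) from settling velocity w (m/s) by
    inverting Stokes' law, w = (rho_p - rho_f) * g * d**2 / (18 * mu).
    Defaults: quartz in water at ~20 C. Low-Reynolds-number grains only."""
    return math.sqrt(18.0 * mu * w / ((rho_particle - rho_fluid) * g))

def phi_from_diameter(d_m):
    """Phi units: phi = -log2(diameter in mm)."""
    return -math.log2(d_m * 1000.0)
```

The modified Gibbs equation the programs use extends this idea to coarser grains, where inertial drag makes the simple quadratic velocity-diameter relation of Stokes' law break down.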
ERIC Educational Resources Information Center
Jones, Tom; Di Salvo, Vince
A computerized content analysis of the "theory input" for a basic speech course was conducted. The questions to be answered were (1) What does the inexperienced basic speech student hold as a conceptual perspective of the "speech to inform" prior to his being subjected to a college speech class? and (2) How does that inexperienced student's…
ERIC Educational Resources Information Center
Madawaska School District, ME.
Project CAPABLE (Classroom Action Program: Aim: Basic Learning Effectiveness) is a classroom approach which integrates the basic learning skills with content. The goal of the project is to use basic learning skills to enhance the learning of content and at the same time use the content to teach basic learning skills. This manual illustrates how…
Publications - GMC 180 | Alaska Division of Geological & Geophysical Surveys
DGGS GMC 180 Publication Details. Title: Basic data for Apatite Fission Track analysis of cuttings (413'-12375'). Reference: Unknown, 1991, Basic data for Apatite Fission Track analysis of cuttings (413'-12375') from the
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Advances in fMRI Real-Time Neurofeedback.
Watanabe, Takeo; Sasaki, Yuka; Shibata, Kazuhisa; Kawato, Mitsuo
2017-12-01
Functional magnetic resonance imaging (fMRI) neurofeedback is a type of biofeedback in which real-time online fMRI signals are used to self-regulate brain function. Since its advent in 2003, significant progress has been made in fMRI neurofeedback techniques. Specifically, the use of implicit protocols, external rewards, multivariate analysis, and connectivity analysis has allowed neuroscientists to explore a possible causal involvement of modified brain activity in modified behavior. These techniques have also been integrated into groundbreaking new neurofeedback technologies, specifically decoded neurofeedback (DecNef) and functional connectivity-based neurofeedback (FCNef). By modulating neural activity and behavior, DecNef and FCNef have substantially advanced both basic and clinical research. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Effect of Sr Additive Amount and Holding Time on Microstructure of A390 Aluminum Alloy
NASA Astrophysics Data System (ADS)
Zhang, J. H.; Xing, S. M.; Han, Q. Y.; Guo, Q.; Wang, R. F.
2017-11-01
The microstructure of A390 alloy under different Sr additive amounts and holding times was studied by means of direct-reading spectrum analysis, energy spectrum analysis, optical microscopy, and electron microscopy. The results show that Sr gives a good modification of the eutectic Si, while it has a negative effect on primary silicon: Sr addition increases the size of the primary silicon. When the addition amount of Al-10Sr alloy is 0.6%, the modification of the eutectic silicon is optimal. Sr has a short incubation period and gives a fine modification at a holding time of 10 min, but Sr burning loss is more serious in small-furnace smelting, and the modification effect has basically disappeared after 100 min.
Jabor, A; Vlk, T; Boril, P
1996-04-15
We designed a simulation model for the assessment of the financial risks involved when a new diagnostic test is introduced in the laboratory. The model is based on a neural network consisting of ten neurons and assumes that input entities can have assigned appropriate uncertainty. Simulations are done on a 1-day interval basis. Risk analysis completes the model and the financial effects are evaluated for a selected time period. The basic output of the simulation consists of total expenses and income during the simulation time, net present value of the project at the end of simulation, total number of control samples during simulation, total number of patients evaluated and total number of used kits.
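The "net present value of the project at the end of simulation" amounts to discounting each simulated day's net cash flow (income minus expenses) back to the start. A minimal sketch (the daily-compounding convention is my assumption, not the paper's):

```python
def net_present_value(cash_flows, annual_rate, periods_per_year=365):
    """NPV of a sequence of per-period net cash flows (income - expenses),
    discounted at the given annual rate; period 0 is undiscounted."""
    r = annual_rate / periods_per_year
    return sum(cf / (1.0 + r) ** t for t, cf in enumerate(cash_flows))
```

With a zero rate the NPV reduces to the plain sum of cash flows; a positive rate shrinks the weight of income that arrives late in the simulation, which is what exposes the financial risk of a slowly adopted diagnostic test.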
Gomes, Crizian Saar; Matozinhos, Fernanda Penido; Mendes, Larissa Loures; Pessoa, Milene Cristine; Velasquez-Melendez, Gustavo
2016-01-01
Physical activity (PA) is highlighted as a strategy for health promotion and for avoiding chronic diseases. In addition to individual factors, characteristics of the environment in which people live may offer opportunities or barriers to adopting healthy habits, and this is related to PA practice among individuals. The aim of this study is to investigate the associations between neighborhood environment and leisure-time physical activity in adults. This is a cross-sectional study developed using the database of the Surveillance System for Risk and Protective Factors for Chronic Diseases by Telephone Survey (VIGITEL 2008/2010) of Belo Horizonte, Brazil. Individuals practicing at least 150 minutes of moderate-intensity PA or at least 75 minutes of vigorous-intensity PA throughout the week in leisure time were classified as active in leisure time. To characterize the built and social environment, we used georeferenced data on public and private places for physical activity, population density, residential density, homicide rate, and total income of the coverage area of the basic health units. The coverage area of the basic health units was used as the context unit. For data analysis, we used multilevel logistic regression. The study included 5779 adults, 58.77% female. There was variability in leisure-time physical activity between areas covered by the basic health units (median odds ratio = 1.30). After adjusting for individual characteristics, greater density of private places for physical activity (odds ratio, OR = 1.31; 95% confidence interval, 95% CI: 1.15 to 1.48) and a smaller homicide rate (OR = 0.82; 95% CI: 0.70 to 0.96) in the neighborhood increased physical activity in leisure time. The evidence of this study shows that the neighborhood environment may influence physical activity practice in leisure time and should be considered in future interventions and health promotion strategies. PMID:26915091
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M
2006-10-13
Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. 
The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
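The core calculation that meta-analysis packages such as MIX, STATA's metan, and CMA all perform can be sketched as a fixed-effect inverse-variance pooling. This is a generic illustration of the method, not MIX's actual Visual Basic code:

```python
import math

def fixed_effect_meta(estimates, variances):
    # Inverse-variance weighted pooled estimate (fixed-effect model):
    # each study is weighted by the reciprocal of its variance, and the
    # pooled standard error is 1/sqrt(sum of weights).
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval under a normal approximation
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

Two equally precise studies simply average; a more precise study pulls the pooled estimate toward itself.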
Lystrom, David J.
1972-01-01
Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
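A minimal sketch of the kind of automated range and rate-of-change screening such verification routines perform (the thresholds here are illustrative, not taken from the report; the report notes that automated checks catch errors larger than roughly 20-30 percent):

```python
def verify_streamflow(series, min_q=0.0, max_q=1e5, max_step_ratio=0.3):
    # Flag readings that are missing, outside a plausible range, or that
    # jump more than max_step_ratio (here ~30%) between successive
    # readings. Returns the indices of flagged values.
    flags = []
    prev = None
    for i, q in enumerate(series):
        if q is None or not (min_q <= q <= max_q):
            flags.append(i)
        elif prev not in (None, 0) and abs(q - prev) / prev > max_step_ratio:
            flags.append(i)
        if q is not None:
            prev = q
    return flags
```

A flagged value could then be replaced by a simulated discharge, as the routines described in the report allow.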
NASA Technical Reports Server (NTRS)
Manson, S. S.
1972-01-01
The strainrange partitioning concept divides the imposed strain into four basic ranges involving time-dependent and time-independent components. It is shown that some of the results presented at the symposium can be better correlated on the basis of this concept than by alternative methods. It is also suggested that methods of data generation and analysis can be helpfully guided by this approach. Potential applicability of the concept to the treatment of frequency and hold-time effects, environmental influence, crack initiation and growth, thermal fatigue, and code specifications are briefly considered. A required experimental program is outlined.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
2012-01-01
industries through hide damage, reductions in animal weight gains, or reduced production of animal products such as milk or eggs (Reviewed by Lehane...chiopterus (Meigen 1830) was abundant on sheep in southern England, although relatively uncommon in nearby light traps. Furthermore, attraction to or...Cross correlation maps: a tool for visualizing and modeling time lagged associations. Vector Borne Zoonotic Dis. 5: 267-275. Duehl, A. J., L. W
Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy
NASA Astrophysics Data System (ADS)
Dun, Xiaohong
2018-05-01
With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response, the system's power quality needs timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, combining the unique advantages of the three in signal processing: the time-frequency localization of the wavelet transform, the extraction of the basic modal characteristics of the data by singular value decomposition, and the quantification of the feature data by information entropy. Based on singular value decomposition theory, the wavelet coefficient matrix obtained after the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. The statistical properties of information entropy are then used to analyze the uncertainty of the singular value set, giving a definite measurement of the complexity of the original signal. Wavelet singular entropy thus has good application prospects in fault detection, classification and protection. Matlab simulation shows that applying wavelet singular entropy to harmonic analysis of the locomotive and traction power system is effective.
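The final entropy step can be sketched directly. A hedged illustration, assuming a two-scale wavelet coefficient matrix (so the singular values have a closed form via the 2x2 Gram matrix); a real implementation would use a full wavelet transform and a general SVD:

```python
import math

def singular_values_2xn(rows):
    # For a 2 x n matrix A, the eigenvalues of the 2x2 Gram matrix
    # G = A A^T are the squared singular values of A (closed form).
    r0, r1 = rows
    a = sum(x * x for x in r0)
    c = sum(x * x for x in r1)
    b = sum(x * y for x, y in zip(r0, r1))
    d = math.sqrt((a - c) ** 2 + 4 * b * b)
    return [math.sqrt(max((a + c + d) / 2, 0.0)),
            math.sqrt(max((a + c - d) / 2, 0.0))]

def wavelet_singular_entropy(coeff_rows):
    # Shannon entropy of the normalized singular-value spectrum:
    # low entropy -> one dominant mode (a clean signal); high entropy ->
    # energy spread across modes (a complex or distorted signal).
    sv = singular_values_2xn(coeff_rows)
    total = sum(sv)
    ps = [s / total for s in sv if s > 0]
    return -sum(p * math.log(p) for p in ps)
```

A rank-1 coefficient matrix (one underlying mode) gives entropy 0; two equal, independent modes give the maximum ln 2.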
Conducting high-value secondary dataset analysis: an introductory guide and resources.
Smith, Alexander K; Ayanian, John Z; Covinsky, Kenneth E; Landon, Bruce E; McCarthy, Ellen P; Wee, Christina C; Steinman, Michael A
2011-08-01
Secondary analyses of large datasets provide a mechanism for researchers to address high impact questions that would otherwise be prohibitively expensive and time-consuming to study. This paper presents a guide to assist investigators interested in conducting secondary data analysis, including advice on the process of successful secondary data analysis as well as a brief summary of high-value datasets and online resources for researchers, including the SGIM dataset compendium ( www.sgim.org/go/datasets ). The same basic research principles that apply to primary data analysis apply to secondary data analysis, including the development of a clear and clinically relevant research question, study sample, appropriate measures, and a thoughtful analytic approach. A real-world case description illustrates key steps: (1) define your research topic and question; (2) select a dataset; (3) get to know your dataset; and (4) structure your analysis and presentation of findings in a way that is clinically meaningful. Secondary dataset analysis is a well-established methodology. Secondary analysis is particularly valuable for junior investigators, who have limited time and resources to demonstrate expertise and productivity.
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
1991-01-01
The mathematical models of space very long base interferometry (VLBI) observables suitable for least squares covariance analysis were derived and estimatability problems inherent in the space VLBI system were explored, including a detailed rank defect analysis and sensitivity analysis. An important aim is to carry out a comparative analysis of the mathematical models of the ground-based VLBI and space VLBI observables in order to describe the background in detail. Computer programs were developed in order to check the relations, assess errors, and analyze sensitivity. In order to investigate the estimatability of different geodetic and geodynamic parameters from the space VLBI observables, the mathematical models for time delay and time delay rate observables of space VLBI were analytically derived along with the partial derivatives with respect to the parameters. Rank defect analysis was carried out both by analytical and numerical testing of linear dependencies between the columns of the normal matrix thus formed. Definite conclusions were formed about the rank defects in the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syring, R.P.; Grubb, R.L.
1979-09-30
This document reports on the following: (1) experimental determination of the response of 16 basic structural elements and 7 B-52 components to simulated nuclear overpressure environments (utilizing Sandia Corporation's Thunderpipe Shock Tube), (2) analysis of these test specimens utilizing the NOVA-2 computer program, and (3) correlation of test and analysis results.
Waterworks Operator Training Manual.
ERIC Educational Resources Information Center
Missouri Univ., Columbia. Instructional Materials Lab.
Sixteen self-study waterworks operators training modules are provided. Module titles are the following: basic mathematics, basic chemistry, analysis procedures, microbiology, basic electricity, hydraulics, chlorination, plant operation, surface water, ground water, pumps, cross connections, distribution systems, safety, public relations, and…
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
HYPERDATA - BASIC HYPERSONIC DATA AND EQUATIONS
NASA Technical Reports Server (NTRS)
Mackall, D.
1994-01-01
In an effort to place payloads into orbit at the lowest possible costs, the use of air-breathing space-planes, which reduces the need to carry the propulsion system oxidizer, has been examined. As this approach would require the space-plane to fly at hypersonic speeds for periods of time much greater than that required by rockets, many factors must be considered when analyzing its benefits. The Basic Hypersonic Data and Equations spreadsheet provides data gained from three analyses of a space-plane's performance. The equations used to perform the analyses are derived from Newton's second law of physics (i.e. force equals mass times acceleration); the derivation is included. The first analysis is a parametric study of some basic factors affecting the ability of a space-plane to reach orbit. This step calculates the fraction of fuel mass to the total mass of the space-plane at takeoff. The user is able to vary the altitude, the heating value of the fuel, the orbital gravity, and orbital velocity. The second analysis calculates the thickness of a spherical fuel tank, while assuming all of the mass of the vehicle went into the tank's shell. This provides a first order analysis of how much material results from a design where the fuel represents a large portion of the total vehicle mass. In this step, the user is allowed to vary the values for gross weight, material density, and fuel density. The third analysis produces a ratio of gallons of fuel per total mass for various aircraft. It shows that the volume of fuel required by the space-plane relative to the total mass is much larger for a liquid hydrogen space-plane than any other vehicle made. This program is a spreadsheet for use on Macintosh series computers running Microsoft Excel 3.0. The standard distribution medium for this package is a 3.5 inch 800K Macintosh format diskette. Documentation is included in the price of the program. Macintosh is a registered trademark of Apple Computer, Inc. 
Microsoft is a registered trademark of Microsoft Corporation.
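The first two analyses can be sketched in code. This is a reconstruction of the general form of such calculations, not the spreadsheet's actual formulas: the fuel-fraction relation below is the standard Tsiolkovsky form (an assumption about how the parametric study reduces), and the tank model is a thin spherical shell sized by the fuel volume:

```python
import math

def fuel_mass_fraction(delta_v, exhaust_velocity):
    # Fraction of takeoff mass that must be fuel to achieve delta_v,
    # assuming the rocket-equation form m_fuel/m0 = 1 - exp(-dv/ve).
    return 1.0 - math.exp(-delta_v / exhaust_velocity)

def spherical_tank_thickness(shell_mass, material_density,
                             fuel_mass, fuel_density):
    # Thin-shell thickness t = m_shell / (rho_material * 4*pi*r^2),
    # with the radius r fixed by the fuel volume V = (4/3)*pi*r^3.
    volume = fuel_mass / fuel_density          # m^3
    r = (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
    return shell_mass / (material_density * 4.0 * math.pi * r * r)
```

All input values are user-supplied parameters, mirroring the spreadsheet's user-variable cells (orbital velocity, fuel heating value, densities, gross weight).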
Basic BASIC; An Introduction to Computer Programming in BASIC Language.
ERIC Educational Resources Information Center
Coan, James S.
With the increasing availability of computer access through remote terminals and time sharing, more and more schools and colleges are able to introduce programing to substantial numbers of students. This book is an attempt to incorporate computer programming, using BASIC language, and the teaching of mathematics. The general approach of the book…
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation while the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems-analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
Evidence for a bimodal distribution in human communication.
Wu, Ye; Zhou, Changsong; Xiao, Jinghua; Kurths, Jürgen; Schellnhuber, Hans Joachim
2010-11-02
Interacting human activities underlie the patterns of many social, technological, and economic phenomena. Here we present clear empirical evidence from Short Message correspondence that observed human actions are the result of the interplay of three basic ingredients: Poisson initiation of tasks and decision making for task execution in individual humans as well as interaction among individuals. This interplay leads to new types of interevent time distribution, neither completely Poisson nor power-law, but a bimodal combination of them. We show that the events can be separated into independent bursts which are generated by frequent mutual interactions in short times following random initiations of communications in longer times by the individuals. We introduce a minimal model of two interacting priority queues incorporating the three basic ingredients which fits well the distributions using the parameters extracted from the empirical data. The model can also embrace a range of realistic social interacting systems such as e-mail and letter communications when taking the time scale of processing into account. Our findings provide insight into various human activities both at the individual and network level. Our analysis and modeling of bimodal activity in human communication from the viewpoint of the interplay between processes of different time scales is likely to shed light on bimodal phenomena in other complex systems, such as interevent times in earthquakes, rainfall, forest fire, and economic systems, etc.
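A toy simulation of the two-timescale mechanism described above: long, random (Poisson) gaps between initiations of communication, each followed by a short burst of rapid mutual replies. The rates and burst length below are illustrative, not fitted to the paper's Short Message data, and this is a caricature of the priority-queue model rather than the model itself:

```python
import random

def simulate_messages(n_initiations=200, burst_len=5,
                      rate_init=0.01, rate_burst=2.0, seed=1):
    # Generate event times: exponential gaps with mean 1/rate_init
    # between initiations, and much shorter exponential gaps with mean
    # 1/rate_burst inside each burst of replies. Returns the interevent
    # times, whose distribution mixes the two scales (bimodal).
    rng = random.Random(seed)
    t, events = 0.0, []
    for _ in range(n_initiations):
        t += rng.expovariate(rate_init)       # long gap to next initiation
        events.append(t)
        for _ in range(burst_len):
            t += rng.expovariate(rate_burst)  # short gaps within the burst
            events.append(t)
    return [b - a for a, b in zip(events, events[1:])]
```

A histogram of the returned gaps shows the two well-separated peaks, neither a single Poisson nor a pure power law.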
Digging Deeper: Professional Learning Can Go beyond the Basics to Reach Underserved Students
ERIC Educational Resources Information Center
Gleason, Sonia Caus
2010-01-01
Consistent, excellent teaching is the single greatest factor in improving student achievement over time. School leadership is the second. Excellent teaching and strong leadership require deliberate, ongoing professional learning. In working with high-poverty school systems over time, the following basics emerge: (1) time; (2) content; (3)…
Theodorsson-Norheim, E
1986-08-01
Multiple t tests at a fixed p level are frequently used to analyse biomedical data where analysis of variance followed by multiple comparisons, or adjustment of the p values according to Bonferroni, would be more appropriate. The Kruskal-Wallis test is a nonparametric 'analysis of variance' which may be used to compare several independent samples. The present program is written in an elementary subset of BASIC and will perform the Kruskal-Wallis test followed by multiple comparisons between the groups on practically any computer programmable in BASIC.
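The statistic such a program computes can be sketched in a few lines. This is a generic implementation of the Kruskal-Wallis H statistic with mid-ranks and the usual tie correction, not the BASIC listing itself:

```python
def kruskal_wallis_h(*groups):
    # Rank all observations together (mid-ranks for ties), then compare
    # the groups' rank sums: H = 12/(N(N+1)) * sum(R_i^2/n_i) - 3(N+1),
    # divided by the tie-correction factor 1 - sum(t^3 - t)/(N^3 - N).
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    ranks, ties, i = {}, 0, 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0   # mid-rank of the tied run
        t = j - i
        ties += t ** 3 - t
        i = j
    h = 0.0
    for g in groups:
        r = sum(ranks[x] for x in g)
        h += r * r / len(g)
    h = 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)
    return h / (1.0 - ties / float(n ** 3 - n)) if ties else h
```

H is then referred to a chi-square distribution with (number of groups - 1) degrees of freedom, followed by the pairwise multiple comparisons the abstract mentions.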
On the stability of equilibrium for a reformulated foreign trade model of three countries
NASA Astrophysics Data System (ADS)
Dassios, Ioannis K.; Kalogeropoulos, Grigoris
2014-06-01
In this paper, we study the stability of equilibrium for a foreign trade model consisting of three countries. As the gravity equation has proven an excellent tool of analysis, adequately stable over time and space all over the world, we extend the problem to three masses. We use the basic structure of the Heckscher-Ohlin-Samuelson model: national income equals consumption outlays plus investment plus exports minus imports. The proposed reformulation of the problem focuses on two basic concepts: (1) the delay inherent in our economic variables and (2) the interaction effect among the three economies involved. Stability and stabilizability conditions are investigated, while numerical examples provide further insight and better understanding. Finally, a generalization of the gravity equation is obtained for the model.
Label-Free Optofluidic Nanobiosensor Enables Real-Time Analysis of Single-Cell Cytokine Secretion.
Li, Xiaokang; Soler, Maria; Szydzik, Crispin; Khoshmanesh, Khashayar; Schmidt, Julien; Coukos, George; Mitchell, Arnan; Altug, Hatice
2018-06-01
Single-cell analysis of cytokine secretion is essential to understand the heterogeneity of cellular functionalities and develop novel therapies for multiple diseases. Unraveling the dynamic secretion process at single-cell resolution reveals the real-time functional status of individual cells. Fluorescent and colorimetric-based methodologies require tedious molecular labeling that brings inevitable interferences with cell integrity and compromises the temporal resolution. An innovative label-free optofluidic nanoplasmonic biosensor is introduced for single-cell analysis in real time. The nanobiosensor incorporates a novel design of a multifunctional microfluidic system with small volume microchamber and regulation channels for reliable monitoring of cytokine secretion from individual cells for hours. Different interleukin-2 secretion profiles are detected and distinguished from single lymphoma cells. The sensor configuration combined with optical spectroscopic imaging further allows us to determine the spatial single-cell secretion fingerprints in real time. This new biosensor system is anticipated to be a powerful tool to characterize single-cell signaling for basic and clinical research. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
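One of the basic tools mentioned, tracking mean and mean-square values of a running data stream, can be sketched with exponential weighting so that each new sample is folded in as it arrives (the smoothing factor is illustrative; the paper's specific window choices are not reproduced here):

```python
class RunningStats:
    # Exponentially weighted running mean and mean-square: new samples
    # discount older ones by the factor alpha, so the statistics can be
    # updated in real time, one sample at a time, with O(1) memory.
    def __init__(self, alpha=0.01):
        self.alpha = alpha
        self.mean = 0.0
        self.mean_sq = 0.0

    def update(self, x):
        a = self.alpha
        self.mean = (1 - a) * self.mean + a * x
        self.mean_sq = (1 - a) * self.mean_sq + a * x * x
        return self.mean, self.mean_sq
```

Sudden, persistent shifts in the tracked mean-square (i.e., signal power) of ambient vibrations are the kind of change such continuous monitoring is designed to flag.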
ERIC Educational Resources Information Center
Cadart-Ricard, Odette
The problem of meaning in cross-cultural situations, resulting from differing patterns of thought, requires comprehension of the basic rules or patterns of these thought systems. This comprehension can be sought through Vygotsky's unit of analysis, a unit being a product of analysis which, unlike elements, retains all the basic properties of the…
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
DaRosa, D A; Shuck, J M; Biester, T W; Folse, R
1993-01-01
This research sought to identify the strengths and weaknesses in residents' basic science knowledge and, second, to determine whether they progressively improve in their abilities to recall basic science information and clinical management facts, to analyze cause-effect relationships, and to solve clinical problems. Basic science knowledge was assessed by means of the results of the January 1990 American Board of Surgery In-Training/Surgical Basic Science Exam (IT/SBSE). Postgraduate year (PGY) 1 residents' scores were compared with those of PGY5 residents. Content related to a question was considered "known" if 67% or more of the residents in each of the two groups answered it correctly. Findings showed that 44% of the content tested by the basic science questions was unknown to new and graduating residents. The second research question required the 250 IT/SBSE questions to be classified into one of three levels of thinking abilities: recall, analysis, and inferential thinking. Profile analysis (split-plot analysis of variance) for each pair of resident levels indicated significant (P < 0.001) differences in performance on questions requiring factual recall, analysis, and inference between all levels except PGY3s and PGY4s. The results of this research enable program directors to evaluate strengths and weaknesses in residency training curricula and the cognitive development of residents.
Minimum Contradictions Physics and Propulsion via Superconducting Magnetic Field Trapping
NASA Astrophysics Data System (ADS)
Nassikas, A. A.
2010-01-01
All theories are based on Axioms which obviously are arbitrary; e.g. SRT, GRT, QM Axioms. Instead of manipulating the experience through a new set of Arbitrary Axioms it would be useful to search, through a basic tool that we have at our disposal i.e. Logic Analysis, for a set of privileged axioms. Physics theories, beyond their particular axioms, can be restated through the basic communication system as consisting of the Classical Logic, the Sufficient Reason Principle and the Anterior-Posterior Axiom. By means of a theorem this system can be proven as contradictory. The persistence in logic is the way for a set of privileged axioms to be found. This can be achieved on the basis of the Claim for Minimum Contradictions. Further axioms beyond the ones of the basic communications imply further contradictions. Thus, minimum contradictions can be achieved when things are described through anterior-posterior terms; due to existing contradictions through stochastic space-time, which is matter itself, described through a Ψ wave function and distributed, in a Hypothetical Measuring Field (HMF), through the density probability function P(r, t). On this basis, a space-time QM is obtained and this QM is a unified theory satisfying the requirements of quantum gravity. There are both mass-gravitational space-time (g) regarded as real and charge-electromagnetic (em) space-time that could be regarded as imaginary. In a closed system energy conversion-conservation and momentum action take place through photons, which can be regarded either as (g) or (em) space-time formation whose rest mass is equal to zero. Universe Evolution is described through the interaction of the gravitational (g) with the electromagnetic (em) space-time-matter field and not through any other entities. This methodology implies that there is no need for dark matter. An experiment is proposed relative to the (g)+(em) interaction based on Superconducting Magnetic Field Trapping to validate this approach.
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
Web-Writing in One Minute--and Beyond.
ERIC Educational Resources Information Center
Hughes, Kenneth
This paper describes how librarians can teach patrons the basics of hypertext markup language (HTML) so that patrons can publish their own homepages on the World Wide Web. With proper use of handouts and practice time afterwards, the three basics of HTML can be conveyed in only 60 seconds. The three basics are: the basic template of Web tags, used…
Discrete Deterministic and Stochastic Petri Nets
NASA Technical Reports Server (NTRS)
Zijal, Robert; Ciardo, Gianfranco
1996-01-01
Petri nets augmented with timing specifications have gained wide acceptance in the performance and reliability evaluation of complex systems exhibiting concurrency, synchronization, and conflicts. The state space of a time-extended Petri net is mapped onto its basic underlying stochastic process, which can be shown to be Markovian under the assumption of exponentially distributed firing times. Integrating exponentially and non-exponentially distributed timing is still one of the major problems for the analysis; it was first attacked for continuous-time Petri nets at the cost of structural or analytical restrictions. We propose a discrete deterministic and stochastic Petri net (DDSPN) formalism with no imposed structural or analytical restrictions, in which transitions can fire either in zero time or according to arbitrary firing times representable as the time to absorption in a finite absorbing discrete-time Markov chain (DTMC). Exponentially distributed firing times are then approximated arbitrarily well by geometric distributions, of which deterministic firing times are a special case. The underlying stochastic process of a DDSPN is then also a DTMC, from which the transient and stationary solutions can be obtained by standard techniques. A comprehensive algorithm and some state-space reduction techniques for the analysis of DDSPNs are presented, including the automatic detection of conflicts and confusions, which removes a major obstacle to the analysis of discrete-time models.
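The absorption-time machinery the abstract relies on can be illustrated directly: for a finite absorbing DTMC with transient-to-transient transition matrix Q, expected times to absorption follow from the fundamental matrix N = (I - Q)^-1. A minimal sketch with a made-up three-state Q (not a DDSPN from the paper):

```python
import numpy as np

# Hypothetical absorbing DTMC: states 0-2 transient, state 3 absorbing.
# Q holds only the transient-to-transient transition probabilities.
Q = np.array([
    [0.5, 0.3, 0.0],
    [0.0, 0.4, 0.4],
    [0.2, 0.0, 0.3],
])

# Fundamental matrix N = (I - Q)^-1; N[i, j] is the expected number of
# visits to transient state j, starting from i, before absorption.
N = np.linalg.inv(np.eye(3) - Q)

# Expected number of steps to absorption from each transient state.
t = N @ np.ones(3)
print(t)
```

The same quantity solves the linear system (I - Q) t = 1, which is how larger models are handled in practice without forming an explicit inverse.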
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM
2006-01-01
Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. 
Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via or . PMID:17038197
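MIX's internals are not described here, but the core computation any such meta-analysis package performs is standard inverse-variance pooling. A minimal fixed-effect sketch with invented effect sizes and standard errors (not data from the validation study):

```python
import numpy as np

# Illustrative effect sizes (e.g. log odds ratios) and standard errors
# from four hypothetical studies.
effects = np.array([0.10, 0.30, 0.25, 0.18])
se = np.array([0.12, 0.20, 0.15, 0.10])

# Fixed-effect model: weight each study by the inverse of its variance.
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

# 95% confidence interval under a normal approximation.
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(pooled, 4), round(ci[0], 4), round(ci[1], 4))
```

Random-effects models add a between-study variance term to the weights; the pooling logic is otherwise identical.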
Fundamentals of Digital Engineering: Designing for Reliability
NASA Technical Reports Server (NTRS)
Katz, R.; Day, John H. (Technical Monitor)
2001-01-01
The concept of designing for reliability is introduced, along with a brief overview of reliability, redundancy, and traditional methods of fault tolerance as applied to current logic devices. The fundamentals of advanced circuit design and analysis techniques are the primary focus. The introduction covers the definitions of key device parameters and how analysis is used to prove circuit correctness. Basic design techniques such as synchronous vs. asynchronous design, metastable-state resolution time and arbiter design, and finite state machine structure and implementation are reviewed. Advanced topics are explored, such as skew-tolerant circuit design; the use of triple-modular redundancy; circuit hazards, device transients, and preventative circuit design; lock-up states in finite state machines generated by logic synthesizers; device transient characteristics; radiation mitigation techniques; worst-case analysis; and the use of timing analyzers and simulators. Case studies and lessons learned from spaceflight designs are given as examples.
Liu, Bao; Fan, Xiaoming; Huo, Shengnan; Zhou, Lili; Wang, Jun; Zhang, Hui; Hu, Mei; Zhu, Jianhua
2011-12-01
A method was established to analyse overlapped chromatographic peaks based on chromatographic-spectral data from a diode-array ultraviolet detector. In the method, the three-dimensional data are first de-noised and normalized; second, the differences between, and clustering of, the spectra at different time points are calculated; the purity of the whole chromatographic peak is then analysed, and the region is sought in which the spectra at different time points are stable. Feature spectra are extracted from this spectrum-stable region as the basic foundation. Non-negative least squares is used to separate the overlapped peaks and obtain the flow curve corresponding to each feature spectrum. The resolved three-dimensional chromatographic-spectral peaks are then obtained by matrix operations combining the feature spectra with the flow curves. The results show that this method can separate the overlapped peaks.
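The non-negative least-squares step can be sketched with SciPy's `nnls`; the two feature spectra and the mixing proportions below are invented for illustration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import nnls

# Two hypothetical feature spectra (columns), sampled at 5 wavelengths.
S = np.array([
    [0.9, 0.1],
    [0.7, 0.3],
    [0.4, 0.6],
    [0.2, 0.8],
    [0.1, 0.9],
])

# Mixed spectrum observed at one time point of the overlapped peak:
# a noise-free 0.6/0.4 blend of the two components.
mixed = S @ np.array([0.6, 0.4])

# Non-negative least squares recovers the component contributions;
# repeating this at every time point yields the flow curves.
coef, residual = nnls(S, mixed)
print(coef)  # values close to [0.6 0.4]
```

Because contributions of chemical components cannot be negative, the non-negativity constraint is what makes this decomposition physically meaningful.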
Kim, Eunjin; Kang, Hyunook; Choi, Insung; Song, Jihyeon; Mok, Hyejung; Jung, Woong; Yeo, Woon-Seok
2018-05-09
Detection and quantitation of flavonoids are relatively difficult compared to those of other small-molecule analytes because flavonoids undergo rapid metabolic processes, resulting in their elimination from the body. Here, we report an efficient enrichment method for facilitating the analysis of vicinal-diol-containing flavonoid molecules using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. In our strategy, boronic-acid-functionalized polyacrylamide particles were used, where boronic acids bound to vicinal diols to form boronate monoesters at basic pH. This complex remained intact during the enrichment processes, and the vicinal-diol-containing flavonoids were easily separated by centrifugation and subsequent acidic treatments. The selectivity and limit of detection of our strategy were confirmed by mass spectrometry analysis, and the validity was assessed by performing the detection and quantitation of quercetin in mouse organs.
Webcam camera as a detector for a simple lab-on-chip time based approach.
Wongwilai, Wasin; Lapanantnoppakhun, Somchai; Grudpan, Supara; Grudpan, Kate
2010-05-15
A modification of a webcam camera for use as a small, low-cost detector was demonstrated with a simple lab-on-chip reactor. Real-time continuous monitoring of the reaction zone could be performed. Acid-base neutralization with phenolphthalein indicator was used as a model reaction. The fading of the indicator's pink color as the acidic solution diffused into the basic solution zone was recorded as the change in red, blue, and green color values (%RBG). The change was related to acid concentration. A low-cost, portable, semi-automated analysis system was achieved.
Taljaard, Monica; McKenzie, Joanne E; Ramsay, Craig R; Grimshaw, Jeremy M
2014-06-19
An interrupted time series design is a powerful quasi-experimental approach for evaluating effects of interventions introduced at a specific point in time. To utilize the strength of this design, a modification to standard regression analysis, such as segmented regression, is required. In segmented regression analysis, the change in intercept and/or slope from pre- to post-intervention is estimated and used to test causal hypotheses about the intervention. We illustrate segmented regression using data from a previously published study that evaluated the effectiveness of a collaborative intervention to improve quality in pre-hospital ambulance care for acute myocardial infarction (AMI) and stroke. In the original analysis, a standard regression model was used with time as a continuous variable. We contrast the results from this standard regression analysis with those from segmented regression analysis. We discuss the limitations of the former and advantages of the latter, as well as the challenges of using segmented regression in analysing complex quality improvement interventions. Based on the estimated change in intercept and slope from pre- to post-intervention using segmented regression, we found insufficient evidence of a statistically significant effect on quality of care for stroke, although potential clinically important effects for AMI cannot be ruled out. Segmented regression analysis is the recommended approach for analysing data from an interrupted time series study. Several modifications to the basic segmented regression analysis approach are available to deal with challenges arising in the evaluation of complex quality improvement interventions.
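A segmented regression of the kind described fits an intercept, a baseline slope, a level-change term, and a slope-change term in a single model. A minimal sketch on simulated monthly data (not the ambulance-care series), using ordinary least squares via NumPy:

```python
import numpy as np

# Simulated monthly quality indicator: baseline trend, then a level jump
# and a slope change after an intervention at month 12 (illustrative
# data, not the study's).
t = np.arange(24)
post = (t >= 12).astype(float)          # 1 after the intervention
t_after = np.where(t >= 12, t - 12, 0.0)  # time since intervention
rng = np.random.default_rng(0)
y = 50 + 0.5 * t + 5.0 * post + 1.0 * t_after + rng.normal(0, 0.5, 24)

# Segmented regression design matrix:
# intercept, baseline slope, change in level, change in slope.
X = np.column_stack([np.ones(24), t, post, t_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, b in zip(["intercept", "baseline slope",
                    "level change", "slope change"], beta):
    print(f"{name}: {b:.2f}")
```

The estimated level-change and slope-change coefficients are what carry the causal hypothesis test; a standard regression with time as a single continuous variable cannot separate these two effects.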
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the analysis techniques most used in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and their impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method: a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure that can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done; as a consequence, possible failure modes may not be identified. To address these shortcomings, a combination of FTA and FMEA is proposed.
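The RPN prioritization mentioned above can be stated in a few lines; the 1-10 rating scales are standard FMEA practice, while the failure modes and ratings below are hypothetical:

```python
# Risk Priority Number as used in FMEA: severity, occurrence, and
# detection are each rated on a 1-10 scale, so RPN spans 1-1000.
def rpn(severity: int, occurrence: int, detection: int) -> int:
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be on the 1-10 scale")
    return severity * occurrence * detection

# Rank hypothetical failure modes by RPN, highest risk first.
modes = {
    "connector corrosion": (7, 4, 6),
    "seal fatigue": (9, 2, 3),
}
ranked = sorted(modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
print(ranked[0][0])  # → connector corrosion (RPN 168 vs 54)
```

Note that the multiplicative form means a high-severity failure with low occurrence and good detectability can rank below a moderate one, which is a commonly cited criticism of RPN.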
NASA Technical Reports Server (NTRS)
Omalley, T. A.
1984-01-01
The use of the coupled cavity traveling wave tube for space communications has led to an increased interest in improving the efficiency of the basic interaction process in these devices through velocity resynchronization and other methods. A flexible, three dimensional, axially symmetric, large signal computer program was developed for use on the IBM 370 time sharing system. A users' manual for this program is included.
Vortex movement and magnetization of high Tc superconductors
NASA Technical Reports Server (NTRS)
Roytburd, A. L.; Turchinskaya, M. J.
1991-01-01
The basic characteristics of the thermally activated vortex mobility in Y1Ba2Cu3O7 are determined by measuring the kinetics of magnetization in two time regimes. Analysis of the kinetics of the approach to equilibrium yields the activation energy, while measurement of the logarithmic creep rate allows determination of the activated moment. It is shown that the movement of vortices can be regarded as a diffusion process.
Root Cause Analysis of Sexual Assault: Shifting from Early Detection to a Vaccine
2015-02-17
Women Bystander Behavior Basic Development /Personal Identity Sexuality Pornography Dating STDs/Communicable Diseases He Said/She Said - Mars vs Venus...trust over time and sharing of current life experiences allows for “ developmental coaching” (CCLD, “ Developing Leaders,” 21) in a safe environment...skill development . Just as root sources for cancer are mitigated by not smoking, minimal sun exposure, adopting healthy eating habits and alcohol use
A 3D Reconstruction Strategy of Vehicle Outline Based on Single-Pass Single-Polarization CSAR Data.
Leping Chen; Daoxiang An; Xiaotao Huang; Zhimin Zhou
2017-11-01
In the last few years, interest in circular synthetic aperture radar (CSAR) acquisitions has arisen as a consequence of the potential for 3D reconstructions over a 360° azimuth angle variation. In real-world scenarios, full 3D reconstruction of arbitrary targets needs multi-pass data, which makes the processing complex, costly, and time-consuming. In this paper, we propose a processing strategy for the 3D reconstruction of vehicles that avoids multi-pass data by introducing a priori information about the vehicle's shape. Moreover, the proposed strategy needs only single-pass, single-polarization CSAR data to perform the vehicle's 3D reconstruction, which makes the processing much more economical and efficient. First, an analysis of the distribution of attributed scattering centers from a vehicle facet model is presented; the results show that a smooth, continuous basic outline of the vehicle can be extracted from the peak curve of a noncoherently processed image. Second, the 3D location of the vehicle roofline is inferred from layover with empirical insets of the basic outline. Finally, the basic outline and roofline of the vehicle are used to estimate the vehicle's 3D information and constitute the vehicle's 3D outline. Processing results on simulated and measured data prove the correctness and effectiveness of the proposed strategy.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
ERIC Educational Resources Information Center
Comings, John; Sum, Andrew; Uvin, Johan
The role of adult education in sustaining economic growth and expanding opportunity in Massachusetts was explored. The analysis focused on the new basic skills needed for a new economy, groups lacking the new basic skills, the demand for adult basic education (ABE), funding for ABE, building basic skills through adult education, ABE's costs and…
Pattern Recognition for a Flight Dynamics Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; Hurtado, John E.
2011-01-01
The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques can generate large amounts of Monte Carlo data, but flight dynamics engineers lack the time and resources to analyze it all. The growing amount of data, combined with engineers' diminishing available time, motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently: they can search large data sets for specific patterns and highlight critical variables so analysts can focus their efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters and, most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
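The k-nearest-neighbor classification step can be sketched in a few lines; the two-parameter data and the failure region below are invented stand-ins for Monte Carlo dispersions, not the paper's flight-dynamics data:

```python
import numpy as np

# Toy Monte Carlo dispersions: two design parameters, with failures
# concentrated where both parameters are high (illustrative only).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = ((X[:, 0] > 0.7) & (X[:, 1] > 0.7)).astype(int)  # 1 = failed case

def knn_predict(X_train, y_train, x, k=5):
    """Majority vote among the k nearest training cases."""
    d = np.linalg.norm(X_train - np.asarray(x), axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return int(votes.sum() * 2 > k)

# A point deep in the failure region vs. one far from it.
print(knn_predict(X, y, [0.9, 0.9]), knn_predict(X, y, [0.2, 0.2]))
```

Sweeping such a classifier over the parameter space is one simple way to map out the parameter combinations that lead to failure, which is the kind of automated screening the abstract describes.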
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
Development of Time-Distance Helioseismology Data Analysis Pipeline for SDO/HMI
NASA Technical Reports Server (NTRS)
DuVall, T. L., Jr.; Zhao, J.; Couvidat, S.; Parchevsky, K. V.; Beck, J.; Kosovichev, A. G.; Scherrer, P. H.
2008-01-01
The Helioseismic and Magnetic Imager (HMI) of SDO will provide uninterrupted 4k x 4k-pixel Doppler-shift images of the Sun with an approximately 40-second cadence. These data will have a unique potential for advancing local helioseismic diagnostics of the Sun's interior structure and dynamics. They will help in understanding the basic mechanisms of solar activity and in developing predictive capabilities for NASA's Living with a Star program. Because of the tremendous amount of data, the HMI team is developing a data analysis pipeline that will provide maps of subsurface flows and sound-speed distributions inferred from the Doppler data by the time-distance technique. We discuss the development plan, methods, and algorithms, and present the status of the pipeline, testing results, and examples of the data products.
NASA Astrophysics Data System (ADS)
Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.
2018-02-01
While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolution in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. The analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness under alleviated constraints. Simulations applying the proposed formalism achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times, and maintained parallel efficiency.
A Fast-Time Simulation Tool for Analysis of Airport Arrival Traffic
NASA Technical Reports Server (NTRS)
Erzberger, Heinz; Meyn, Larry A.; Neuman, Frank
2004-01-01
The basic objective of arrival sequencing in air traffic control automation is to match traffic demand and airport capacity while minimizing delays. The performance of an automated arrival scheduling system, such as the Traffic Management Advisor developed by NASA for the FAA, can be studied by a fast-time simulation that does not involve running expensive and time-consuming real-time simulations. The fast-time simulation models runway configurations, the characteristics of arrival traffic, deviations from predicted arrival times, as well as the arrival sequencing and scheduling algorithm. This report reviews the development of the fast-time simulation method used originally by NASA in the design of the sequencing and scheduling algorithm for the Traffic Management Advisor. The utility of this method of simulation is demonstrated by examining the effect on delays of altering arrival schedules at a hub airport.
Modeling the transmission dynamics of Ebola virus disease in Liberia.
Xia, Zhi-Qiang; Wang, Shi-Fu; Li, Shen-Long; Huang, Liu-Yu; Zhang, Wen-Yi; Sun, Gui-Quan; Gai, Zhong-Tao; Jin, Zhen
2015-09-08
Ebola virus disease (EVD) has broken out many times in some zones since it was first identified in 1976. The 2014 EVD outbreak in West Africa is the largest ever, causing a large number of deaths, with Liberia the most seriously affected country during the outbreak period. Based on the data released by the World Health Organization and the actual transmission situation, we investigate the impact of different transmission routes on the EVD outbreak in Liberia and estimate the basic reproduction number R0 = 2.012 in the absence of effective control measures. Through sensitivity and uncertainty analysis, we reveal that the transmission coefficients of suspected and probable cases have the strongest correlations with the basic reproduction number. Furthermore, we study the influence of control measures (isolation and safe burial) on the EVD outbreak. We find that if combined control measures are taken, the basic reproduction number falls below one, and EVD in Liberia may thus be well contained. The results may provide new guidance for preventing and controlling the spread of the disease.
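For a multi-class model of this kind, the basic reproduction number is typically computed as the spectral radius of the next-generation matrix F V^-1. A minimal two-class sketch with made-up rates (not the paper's fitted model, which gives R0 = 2.012):

```python
import numpy as np

# Hypothetical model with two infectious classes (e.g. "suspected"
# progressing to "probable"); all rates below are invented.
beta = np.array([0.30, 0.45])  # transmission coefficients per class

# F: rate of new infections (all enter the first class).
F = np.array([[beta[0], beta[1]],
              [0.0,     0.0   ]])

# V: transitions out of and between infectious classes:
# class 1 is left at rate 0.40, of which 0.25 progresses to class 2;
# class 2 is left at rate 0.35.
V = np.array([[ 0.40, 0.00],
              [-0.25, 0.35]])

# R0 is the spectral radius of the next-generation matrix F V^-1.
K = F @ np.linalg.inv(V)
R0 = max(abs(np.linalg.eigvals(K)))
print(round(R0, 3))
```

Sensitivity of R0 to each transmission coefficient, as studied in the paper, can then be probed by perturbing the entries of F and recomputing the spectral radius.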
Tang, Rongnian; Chen, Xupeng; Li, Chuang
2018-05-01
Near-infrared spectroscopy is an efficient, low-cost technology with potential as an accurate method for detecting the nitrogen content of natural rubber leaves. The successive projections algorithm (SPA) is a widely used variable selection method for multivariate calibration, which uses projection operations to select a variable subset with minimum multi-collinearity. However, because the correlation between variables fluctuates, high collinearity may still exist among non-adjacent variables in the subset obtained by the basic SPA. Based on an analysis of the correlation matrix of the spectral data, this paper proposes a correlation-based SPA (CB-SPA) that applies the successive projections algorithm within regions of consistent correlation. The results show that CB-SPA selects variable subsets with more valuable variables and less multi-collinearity. Meanwhile, models established with the CB-SPA subset outperform those built on basic SPA subsets in predicting nitrogen content, in terms of both cross-validation and external prediction. Moreover, CB-SPA is more efficient: the time cost of its selection procedure is one-twelfth that of the basic SPA.
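The basic SPA projection loop is compact enough to sketch: each iteration projects the remaining variables orthogonal to the most recently selected one and keeps the variable with the largest residual norm. The spectra matrix below is random, not the rubber-leaf data:

```python
import numpy as np

def spa(X, k, start=0):
    """Basic successive projections algorithm: select k columns of X,
    greedily maximizing the norm orthogonal to those already chosen."""
    selected = [start]
    P = X.astype(float).copy()
    for _ in range(k - 1):
        v = P[:, selected[-1]]
        # Project every column onto the orthogonal complement of v.
        P = P - np.outer(v, v @ P) / (v @ v)
        norms = np.linalg.norm(P, axis=0)
        norms[selected] = -1.0  # never reselect a variable
        selected.append(int(np.argmax(norms)))
    return selected

# Synthetic "spectra": 30 samples x 10 wavelengths, where wavelength 3
# is a near-duplicate of wavelength 0 (strong collinearity).
rng = np.random.default_rng(2)
X = rng.normal(size=(30, 10))
X[:, 3] = X[:, 0] + 1e-3 * rng.normal(size=30)

picked = spa(X, 4, start=0)
print(picked)  # wavelength 3 should not appear
```

After column 0 is selected, the collinear column 3 has almost no orthogonal component left, so SPA skips it; this is exactly the multi-collinearity suppression the abstract refers to.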
NASA Technical Reports Server (NTRS)
Herman, J. R.; Hudson, R. D.; Serafino, G.
1990-01-01
Arguments are presented showing that the basic empirical model of the solar backscatter UV (SBUV) instrument degradation used by Cebula et al. (1988) in their analysis of the SBUV data is likely to lead to an incorrect estimate of the ozone trend. A correction factor is given as a function of time and altitude that brings the SBUV data into approximate agreement with the SAGE, SME, and Dobson network ozone trends. It is suggested that the currently archived SBUV ozone data should be used with caution for periods of analysis exceeding 1 yr, since it is likely that the yearly decreases contained in the archived data are too large.
NASA Astrophysics Data System (ADS)
Rahayu, D. V.
2017-02-01
This study was intended to characterize the basic teaching skills of Mathematics Department students of STKIP Garut in the Field Experience Program in the 2014/2015 academic year. It was a qualitative study using a descriptive analysis technique. The instrument was an observation sheet measuring basic mathematics teaching skills. The results showed that content mastery and explaining skill were in the average category; questioning skill, conducting-variations skill, and assessment skill were in the good category; and classroom management skill and motivation-giving skill were in the poor category. It can be concluded that the students' basic teaching skills were not optimal. It is recommended that the students receive instruction with appropriate strategies so that they can optimize their basic teaching skills.
Distinguishing Fast and Slow Processes in Accuracy - Response Time Data
Coomans, Frederik; Hofman, Abe; Brinkhuis, Matthieu; van der Maas, Han L. J.; Maris, Gunter
2016-01-01
We investigate the relation between speed and accuracy within problem solving in its simplest non-trivial form. We consider tests with only two items and code the item responses in two binary variables: one indicating the response accuracy, and one indicating the response speed. Despite being a very basic setup, it enables us to study item pairs stemming from a broad range of domains such as basic arithmetic, first language learning, intelligence-related problems, and chess, with large numbers of observations for every pair of problems under consideration. We carry out a survey over a large number of such item pairs and compare three types of psychometric accuracy-response time models present in the literature: two ‘one-process’ models, the first of which models accuracy and response time as conditionally independent and the second of which models accuracy and response time as conditionally dependent, and a ‘two-process’ model which models accuracy contingent on response time. We find that the data clearly violates the restrictions imposed by both one-process models and requires additional complexity which is parsimoniously provided by the two-process model. We supplement our survey with an analysis of the erroneous responses for an example item pair and demonstrate that there are very significant differences between the types of errors in fast and slow responses. PMID:27167518
NASA Astrophysics Data System (ADS)
Llewellyn-Jones, David; Good, Simon; Corlett, Gary
A PC-based analysis package has been developed for the dual purposes of, first, providing a 'quick-look' capability to research workers inspecting long time series of global satellite datasets of sea-surface temperature (SST) and, second, introducing students, whether undergraduates or advanced high-school students, to the characteristics of commonly used analysis techniques for large geophysical datasets from satellites. Students can also gain insight into the behaviour of some basic climate-related large-scale or global processes. The package gives students immediate access to up to 16 years of continuous global SST data, mainly from the Advanced Along-Track Scanning Radiometer, currently flying on ESA's Envisat satellite. The data are presented in the form of monthly averages, spatially averaged onto half-degree or one-sixth-degree longitude-latitude grids. Simple button-operated facilities allow defining and calculating box averages; producing time series of such averages; defining and displaying transects and their evolution over time; and examining anomalous behaviour by displaying the difference between observed values and values derived from climatological means. By using these facilities a student rapidly gains familiarity with such processes as annual variability, the El Niño effect, and major current systems such as the Gulf Stream, as well as other climatically important phenomena. In fact, the student gains immediate insight into the basic methods of examining geophysical data in a research context, without needing to acquire special analysis skills or go through the lengthy data retrieval and preparation procedures generally required as precursors to serious investigation in the research laboratory.
This software package, called the Leicester AATSR Global Analyser (LAGA), is written in a well-known and widely used analysis language, and it can be run using software that is readily available free of charge.
NASA Astrophysics Data System (ADS)
Fomin, Fedor V.
Preprocessing (data reduction or kernelization) as a strategy for coping with hard problems is used in almost every practical implementation. The history of preprocessing, such as applying reduction rules to simplify truth functions, can be traced back to the 1950s [6]. A natural question in this regard is how to measure the quality of preprocessing rules proposed for a specific problem. For a long time the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly is that if we start with an instance I of an NP-hard problem and can show that in polynomial time it can be replaced by an equivalent instance I' with |I'| < |I|, then iterating this size-decreasing replacement would solve the problem in polynomial time, which would imply P=NP in classical complexity.
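A concrete kernelization example of the kind this passage discusses is Buss's rule for Vertex Cover: any vertex of degree greater than the remaining budget k must belong to every size-k cover, and after exhaustive application a yes-instance has at most k² edges. A minimal sketch (illustrative, not taken from [6]):

```python
# Buss's kernelization for Vertex Cover: force high-degree vertices
# into the cover, then bound the size of the remaining instance.
def buss_kernel(edges, k):
    edges = set(map(frozenset, edges))
    forced = set()
    changed = True
    while changed:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k - len(forced):
                # v has degree above the remaining budget: every small
                # cover must contain it, so take it and drop its edges.
                forced.add(v)
                edges = {e for e in edges if v not in e}
                changed = True
                break
    if len(forced) > k or len(edges) > (k - len(forced)) ** 2:
        return None  # provably a no-instance
    return edges, k - len(forced)

# A star with 5 leaves plus one extra edge, budget k = 2: the hub has
# degree > 2, so it is forced into the cover and the star collapses.
star = [(0, i) for i in range(1, 6)] + [(6, 7)]
print(buss_kernel(star, 2))
```

The returned pair is the kernel: an equivalent instance whose size is bounded by a function of k alone, which is precisely the quality measure for preprocessing rules that the passage says was long missing.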
Busch, Hauke; Boerries, Melanie; Bao, Jie; Hanke, Sebastian T; Hiss, Manuel; Tiko, Theodhor; Rensing, Stefan A
2013-01-01
Transcription factors (TFs) often trigger developmental decisions, yet their transcripts are often only moderately regulated and thus not easily detected by conventional statistics on expression data. Here we present a method to identify such genes based on trajectory analysis of time-resolved transcriptome data. As a proof of principle, we have analysed apical stem cells of filamentous moss (P. patens) protonemata that develop from leaflets upon their detachment from the plant. Using our novel correlation analysis of the post-detachment transcriptome kinetics, we predict five out of 1,058 TFs to be involved in the signaling leading to the establishment of pluripotency. Among the predicted regulators is the basic helix-loop-helix TF PpRSL1, which we show to be involved in the establishment of apical stem cells in P. patens. Our methodology is expected to aid the analysis of key players in developmental decisions in complex plant and animal systems.
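The idea of ranking genes by how well their time-course tracks a reference trajectory can be illustrated with a simplified sketch. This is not the authors' actual pipeline; the reference trajectory, gene names, and expression values below are invented for demonstration:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Idealized "switch-on" reference trajectory after detachment
# (time points and expression values are made up for illustration).
reference = [0, 0, 1, 3, 5, 5]
profiles = {
    "TF_A": [0, 1, 1, 3, 4, 5],   # tracks the reference
    "TF_B": [5, 4, 3, 2, 1, 0],   # anti-correlated
    "TF_C": [2, 2, 2, 2, 2, 3],   # nearly flat
}
ranked = sorted(profiles, key=lambda g: pearson(profiles[g], reference),
                reverse=True)
```

Genes whose kinetics best match the reference rise to the top of the ranking, even if their absolute fold change is modest, which is the point of trajectory-based detection of moderately regulated TFs.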
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Standards for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements..., inspection, testing, and operating tasks that must be performed on a railroad's products. This includes the...
Incorporating Basic Optical Microscopy in the Instrumental Analysis Laboratory
ERIC Educational Resources Information Center
Flowers, Paul A.
2011-01-01
A simple and versatile approach to incorporating basic optical microscopy in the undergraduate instrumental analysis laboratory is described. Attaching a miniature CCD spectrometer to the video port of a standard compound microscope yields a visible microspectrophotometer suitable for student investigations of fundamental spectrometry concepts,…
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-01-01
Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367
The application of the geography census data in seismic hazard assessment
NASA Astrophysics Data System (ADS)
Yuan, Shen; Ying, Zhang
2017-04-01
Because the timeliness of the basic data in the Sichuan Province earthquake emergency database is limited, post-earthquake disaster assessment results can differ considerably from the actual damage. In 2015, Sichuan completed its first provincial geographical conditions census, covering topography, traffic networks, vegetation coverage, water areas, desert and bare ground, residents and facilities, geographical units, and geological hazards, as well as town planning, construction, and ecological restoration in the Lushan earthquake-stricken area. On this basis, combining existing basic geographic information data with high-resolution imagery, supplemented by remote sensing image interpretation and geological survey, we carried out statistical analysis and information extraction of the distribution and changes of earthquake-exposed elements, such as land cover, roads, and infrastructure, in Lushan County before 2013 and after 2015. At the same time, we achieved the transformation and updating of geographical conditions census data into earthquake emergency basic data by studying their data types, structures, and relationships. Finally, based on multi-source disaster information, including the changed exposure data and the coseismal displacement field of the Lushan magnitude-7.0 earthquake derived from the CORS network, intensity control points were obtained through information fusion. The seismic influence field was then corrected and the earthquake disaster reassessed through the Sichuan earthquake relief headquarters technology platform. Comparison of the new assessment result, the original assessment result, and the actual earthquake disaster loss shows that the revised evaluation is closer to the actual loss.
In the future, normalized updates from geographical conditions census data to earthquake emergency basic data can be realized, ensuring the timeliness of the earthquake emergency database while continually improving the accuracy of earthquake disaster assessment.
Mooney, Mark P; Cooper, Gregory M; Marazita, Mary L
2014-05-01
To celebrate the 50th year of the Cleft Palate-Craniofacial Journal we look back to where we started in 1964 and where we are now, and we speculate about directions for the future in a "Then and Now" editorial series. This editorial examines changing trends and perspectives in anatomical, basic science, and genetic studies published in this 50-year interval. In volume 1 there were 45 papers in total, seven (16%) of which were peer-reviewed basic science and genetic articles: four in anatomy, three in craniofacial biology, and none in genetics. In contrast, in volume 50, 47 (42%) of 113 articles were peer-reviewed basic science and genetic articles: 30 in anatomy, five in craniofacial biology, and 12 in genetics. Topical analysis of manuscripts published then and now reveals that similar topics in anatomy and craniofacial biology are still being researched today (e.g., phenotypic variability, optimal timing of surgery, presurgical orthopedics, bone grafting), whereas most of the more recent papers use advanced technology to address old questions. In contrast, genetic publications have clearly increased in frequency during the last 50 years, paralleling advances in the field during this time. However, all of us have noticed that the more "cutting-edge" papers in these areas are not being submitted for publication to the journal, but instead to discipline-specific journals. Concerted efforts are therefore indicated to attract and publish these cutting-edge papers in order to keep the Cleft Palate-Craniofacial Journal at the forefront of orofacial cleft and craniofacial anomaly research and to provide a valuable service to American Cleft Palate-Craniofacial Association members.
Petrova, Guenka; Clerfeuille, Fabrice; Vakrilova, Milena; Mitkov, Cvetomir; Poubanne, Yannick
2008-01-01
The objective of this work is to study the potential of the tetraclass model for evaluating changes over time in consumer satisfaction with pharmacy services. Methods: During the same four-month period in 2004 and in 2006, approximately 10 pharmacy consumers were questioned per working day. Every consumer evaluated 34 service elements on a 5-point semantic-differential scale. Correspondence analysis was used to categorize the services. Results: Most of the services were categorized as basic. For the age group up to 40 years, access to the pharmacy became a key element, and external aspects became a secondary element, in 2006. For patients who had used the pharmacy's services for more than 2 years, availability of a phone connection, quality of answers, and product prices moved from plus to secondary elements. The quality/price ratio moved from the basic to the key services, while visibility of prices and hygiene became basic elements instead of secondary ones. Over the two-year period, all service elements connected with the staff, such as availability, identification, appearance, confidence, dress, advice, technical competence, explanation, and time spent with clients, remained basic services. The confidentiality of the staff always remained a key element. Conclusion: Our study shows that the tetraclass model allows more informed managerial decisions in pharmacies and provides information about specific service areas and possible measures. If a simple statistical program for quick processing of the survey data were developed, the method would become applicable and affordable even for small pharmacies. PMID:25147588
Basic emotion profiles in healthy, chronic pain, depressed and PTSD individuals.
Finucane, Anne M; Dima, Alexandra; Ferreira, Nuno; Halvorsen, Marianne
2012-01-01
To compare self-reports of five basic emotions across four samples: healthy, chronic pain, depressed and post-traumatic stress disorder (PTSD), and to investigate the extent to which basic emotion reports discriminate between individuals in healthy or clinical groups. In total, 439 participants took part in this study: healthy (n = 131), chronic pain (n = 220), depressed (n = 24) and PTSD (n = 64). The participants completed the trait version of the Basic Emotion Scale. Basic emotion profiles were compared both within each group and between the healthy group and each of the three other groups. Discriminant analysis was used to assess the extent to which basic emotions can be used to classify the participants as belonging to the healthy group or one of the clinical groups. In the healthy group, happiness was experienced more than any other basic emotion. This was not found in the clinical groups. In comparison to the healthy participants, the chronic pain group experienced more fear, anger and sadness, the depressed group reported more sadness and the PTSD group experienced all of the negative emotions more frequently. Discriminant analysis revealed that happiness was the most important variable in determining whether an individual belonged to the healthy group or one of the clinical groups. Anger was found to further discriminate between depressed and chronic pain individuals. The findings demonstrate that basic emotion profile analysis can provide a useful foundation for the exploration of emotional experience both within and between healthy and clinical groups. Copyright © 2011 John Wiley & Sons, Ltd.
AN ANALYSIS OF THE BEHAVIORAL PROCESSES INVOLVED IN SELF-INSTRUCTION WITH TEACHING MACHINES.
ERIC Educational Resources Information Center
HOLLAND, JAMES G.; SKINNER, B.F.
This collection of papers constitutes the final report of a project devoted to an analysis of the behavioral processes underlying programed instruction. The papers are grouped under three headings: (1) "Programing Research," (2) "Basic Skills--Rationale and Procedure," and (3) "Basic Skills--Specific Skills." The…
Lifeline: A Tool for Logistics Professionals
2017-06-01
This proof of concept study is designed to provide a basic understanding of the Supply Corps community and a comparative analysis of the organizational…
Video and accelerometer-based motion analysis for automated surgical skills assessment.
Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan
2018-03-01
Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts' time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contains video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features, approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time-series data. The proposed features are compared to the existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform for surgical skills assessment. We report the average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1% and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can thus be achieved with high accuracy using the proposed entropy features. Such a system could significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
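The approximate-entropy feature mentioned above can be sketched directly from its standard definition (due to Pincus): ApEn(m, r) is the difference between the log-frequencies of matching patterns of length m and length m+1. A minimal pure-Python version; the parameter defaults and demonstration series are illustrative, not the paper's settings:

```python
import math
import random

def _phi(series, m, r):
    """Average log-frequency of length-m pattern matches (tolerance r)."""
    n = len(series) - m + 1
    windows = [series[i:i + m] for i in range(n)]
    counts = []
    for wi in windows:
        # Chebyshev distance; every window matches itself, so count >= 1
        c = sum(1 for wj in windows
                if max(abs(a - b) for a, b in zip(wi, wj)) <= r)
        counts.append(c / n)
    return sum(math.log(c) for c in counts) / n

def approximate_entropy(series, m=2, r=0.2):
    """ApEn(m, r): low values indicate a regular, predictable series."""
    return _phi(series, m, r) - _phi(series, m + 1, r)

regular = [0, 1] * 50                       # perfectly periodic
rng = random.Random(0)
irregular = [rng.random() for _ in range(100)]
```

A perfectly periodic series scores near zero, while an irregular one scores markedly higher, which is the regularity contrast these features exploit for skill assessment.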
5 CFR 300.103 - Basic requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Basic requirements. 300.103 Section 300.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Employment Practices § 300.103 Basic requirements. (a) Job analysis. Each employment practice of the Federal Government generally, and of...
Basic guidelines to introduce electric circuit simulation software in a general physics course
NASA Astrophysics Data System (ADS)
Moya, A. A.
2018-05-01
The introduction of electric circuit simulation software for undergraduate students in a general physics course is proposed in order to contribute to the constructive learning of electric circuit theory. This work focuses on the lab exercises based on dc, transient and ac analysis of electric circuits found in introductory physics courses, and shows how students can use the simulation software to do simple activities associated with a lab exercise itself and with related topics. By introducing electric circuit simulation programs in a general physics course as a brief activity complementing a lab exercise, students develop basic skills in using simulation software, improve their knowledge of the topology of electric circuits, and perceive that the technology contributes to their learning, all without reducing the time spent on the actual content of the course.
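As an example of the kind of simple transient-analysis check a student could run alongside a circuit simulator, here is a sketch comparing forward-Euler integration of an RC charging circuit with the analytic solution. The component values are arbitrary, chosen only for illustration:

```python
import math

# Illustrative RC charging circuit (component values are arbitrary).
V, R, C = 5.0, 1e3, 1e-6           # 5 V source, 1 kOhm, 1 uF
tau = R * C                         # time constant = 1 ms
dt, steps = tau / 1000, 5000        # integrate to t = 5*tau

v = 0.0
for _ in range(steps):
    # forward Euler step for dv/dt = (V - v) / (R*C)
    v += dt * (V - v) / tau

# analytic solution v(t) = V * (1 - exp(-t / tau)) at t = 5*tau
analytic = V * (1 - math.exp(-steps * dt / tau))
```

Comparing the two values lets a student see both the exponential charging law and the discretization error of the numerical method, the same quantities a simulator computes internally.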
NASA Astrophysics Data System (ADS)
Wciślik, Sylwia
This paper analyses the energy efficiency of a thermomodernization project using the example of three forest lodges located in the Świętokrzyski National Park. Currently, one of the basic requirements posed for buildings subjected to modernization is to reduce carbon dioxide emissions, in some cases by more than 80% in comparison with the original values. In order to fulfil such criteria, it is necessary to apply alternative solutions based on renewable energy sources. Due to the limited budget, low cubic capacity and location of the buildings, solar collectors with storage tanks and biomass boilers provide a rational option. For such a case, the emissions of basic pollutants such as CO2, SOx, NOx and particulates are obtained. The study also gives the results of calculations of the simple payback time (SPBT) of the investment for an exemplary forest lodge.
NASA Technical Reports Server (NTRS)
1972-01-01
Current research is reported on precise and accurate descriptions of the earth's surface and gravitational field and on time variations of geophysical parameters. A new computer program was written in connection with the adjustment of the BC-4 worldwide geometric satellite triangulation net. The possibility that an increment to accuracy could be transferred from a super-control net to the basic geodetic (first-order triangulation) was investigated. Coordinates of the NA9 solution were computed and were transformed to the NAD datum, based on GEOS 1 observations. Normal equations from observational data of several different systems and constraint equations were added and a single solution was obtained for the combined systems. Transformation parameters with constraints were determined, and the impact of computers on surveying and mapping is discussed.
NASA Astrophysics Data System (ADS)
Xiao, Yao; Chraibi, Mohcine; Qu, Yunchao; Tordeux, Antoine; Gao, Ziyou
2018-05-01
In a crowd, individuals make different motion choices such as "moving to the destination," "following another pedestrian," and "making a detour." For convenience, these three direction choices are respectively called the destination direction, the following direction, and the detour direction in this paper. Here, it is found that these featured direction choices can be inspired by the shape characteristics of the Voronoi diagram. To be specific, in the Voronoi cell of a pedestrian, the direction to a Voronoi node is regarded as a potential "detour" direction and the direction perpendicular to a Voronoi link is regarded as a potential "following" direction. A pedestrian generally has several alternative Voronoi nodes and Voronoi links in a Voronoi cell, and the optimal detour and following directions are determined by considering related factors such as deviation. Together with the destination direction, which points directly at the destination, these define the three basic direction choices in a Voronoi cell. In order to evaluate the Voronoi-diagram-based basic directions, empirical trajectory data from both uni- and bi-directional flow experiments are extracted. A time-series method considering the step frequency is used to reduce the swaying phenomena in the original trajectories, which might disturb the recognition of the actual forward direction. The deviations between the empirical velocity direction and the basic directions are investigated, and each velocity direction is classified into a basic direction or regarded as an inexplicable direction according to the deviations. The analysis results show that each basic direction could be a potential direction choice for a pedestrian. The combination of the three basic directions covers most empirical velocity direction choices in both uni- and bi-directional flow experiments.
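A toy illustration of the geometry involved: in 2-D, a Voronoi node of a point set is the circumcenter of a triple of mutually neighboring points, so a candidate "detour" direction points from a pedestrian toward such a circumcenter, while a "following" direction, perpendicular to the shared Voronoi link, points straight at a neighbor. The pedestrian and neighbor positions below are made up:

```python
import math

def circumcenter(p, a, b):
    """Circumcenter of triangle (p, a, b): a Voronoi node of {p, a, b}."""
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    d = 2 * (ax * by - ay * bx)
    ux = (by * (ax * ax + ay * ay) - ay * (bx * bx + by * by)) / d
    uy = (ax * (bx * bx + by * by) - bx * (ax * ax + ay * ay)) / d
    return (p[0] + ux, p[1] + uy)

def unit(v):
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

ped = (0.0, 0.0)                    # pedestrian (positions made up)
n1, n2 = (2.0, 0.0), (0.0, 2.0)     # two neighbours
# "following" direction: perpendicular to the shared Voronoi link,
# i.e. straight toward a neighbour
following = unit((n1[0] - ped[0], n1[1] - ped[1]))
# "detour" direction: toward the Voronoi node between the neighbours
cc = circumcenter(ped, n1, n2)
detour = unit((cc[0] - ped[0], cc[1] - ped[1]))
```

Here the Voronoi node (1, 1) is equidistant from all three points, so the detour direction bisects the gap between the two neighbours, matching the intuition of slipping between them.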
A geospatial evaluation of timely access to surgical care in seven countries
Banguti, Paulin; Chackungal, Smita; Chanthasiri, Traychit; Chao, Tiffany E; Dahn, Bernice; Derbew, Milliard; Dhar, Debashish; Esquivel, Micaela M; Evans, Faye; Hendel, Simon; LeBrun, Drake G; Notrica, Michelle; Saavedra-Pozo, Iracema; Shockley, Ross; Uribe-Leitz, Tarsicio; Vannavong, Boualy; McQueen, Kelly A; Spain, David A; Weiser, Thomas G
2017-01-01
Objective: To assess the consistent availability of basic surgical resources at selected facilities in seven countries. Methods: In 2010–2014, we used a situational analysis tool to collect data at district and regional hospitals in Bangladesh (n = 14), the Plurinational State of Bolivia (n = 18), Ethiopia (n = 19), Guatemala (n = 20), the Lao People's Democratic Republic (n = 12), Liberia (n = 12) and Rwanda (n = 25). Hospital sites were selected by pragmatic sampling. Data were geocoded and then analysed using an online data visualization platform. Each hospital's catchment population was defined as the people who could reach the hospital via a vehicle trip of no more than two hours. A hospital was only considered to show consistent availability of basic surgical resources if clean water, electricity, essential medications including intravenous fluids and at least one anaesthetic, analgesic and antibiotic, a functional pulse oximeter, a functional sterilizer, oxygen and providers accredited to perform surgery and anaesthesia were always available. Findings: Only 41 (34.2%) of the 120 study hospitals met the criteria for the provision of consistent basic surgical services. The combined catchments of the study hospitals in each study country varied between 3.3 million people in Liberia and 151.3 million people in Bangladesh. However, the combined catchments of the study hospitals in each study country that met the criteria for the provision of consistent basic surgical services were substantially smaller and varied between 1.3 million in Liberia and 79.2 million in Bangladesh. Conclusion: Many study facilities were deficient in the basic infrastructure necessary for providing basic surgical care on a consistent basis. PMID:28603310
Nazarov, Denis V; Zemtsova, Elena G; Solokhin, Alexandr Yu; Valiev, Ruslan Z; Smirnov, Vladimir M
2017-01-13
In this study, we present a detailed investigation of the influence of the etching medium (acidic or basic Piranha solutions) and the etching time on the morphology and surface relief of ultrafine-grained (UFG) and coarse-grained (CG) titanium. The surface relief and morphology have been studied by means of scanning electron microscopy (SEM), atomic force microscopy (AFM), and spectral ellipsometry. The composition of the samples has been determined by X-ray fluorescence analysis (XRF) and X-ray photoelectron spectroscopy (XPS). A significant difference in the etching behavior of UFG and CG titanium has been found: UFG titanium exhibits higher etching activity regardless of the etching medium, and the structures formed are more homogeneous. Varying the etching medium and time leads to micro-, nano-, or hierarchical micro/nanostructures on the surface. A significant difference has also been found between the surface compositions of UFG titanium etched in basic and in acidic Piranha solutions. Based on the experimental data, possible reasons and mechanisms for the formation of the nano- and microstructures are considered. The prospects of etched UFG titanium as a material for implants are discussed.
NASA Astrophysics Data System (ADS)
Pokorný, Jaroslav; Pavlíková, Milena; Medved, Igor; Pavlík, Zbyšek; Zahálková, Jana; Rovnaníková, Pavla; Černý, Robert
2016-06-01
Active silica-containing materials in the sub-micrometer size range are commonly used to modify the strength parameters and durability of cement-based composites. In addition, these materials also help accelerate cement hydration. In this paper, two types of diatomaceous earth are used as partial cement replacement in cement paste mixtures. For the raw binders, basic physical and chemical properties are studied. The chemical composition of the tested materials is determined using classical chemical analysis combined with the XRD method, which allows assessment of the amorphous SiO2 phase content. For all tested mixtures, initial and final setting times are measured. Basic physical and mechanical properties, namely bulk density, matrix density, total open porosity, and compressive and flexural strength, are measured on hardened paste samples cured for 28 days in water. The relationship between compressive strength and total open porosity is studied using several empirical models. The results obtained give evidence of the high pozzolanic activity of the tested diatomaceous earths. Their application leads to an increase in both initial and final setting times, a decrease in compressive strength, and an increase in flexural strength.
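One widely used empirical strength-porosity relation (possibly among the models tested, though the abstract does not name them) is the Ryshkewitch exponential model sigma = sigma0 * exp(-b * P), which becomes linear after taking logarithms and can therefore be fitted by ordinary least squares. A sketch with invented data, not the paper's measurements:

```python
import math

# Illustrative (made-up) data: total open porosity P (-) vs
# compressive strength sigma (MPa); not the measured values.
P = [0.20, 0.25, 0.30, 0.35, 0.40]
sigma = [60.0, 42.0, 30.0, 21.0, 15.0]

# Ryshkewitch model: sigma = sigma0 * exp(-b * P)
# => ln(sigma) = ln(sigma0) - b * P, a straight line in P
y = [math.log(s) for s in sigma]
n = len(P)
mp, my = sum(P) / n, sum(y) / n
slope = (sum((p - mp) * (v - my) for p, v in zip(P, y))
         / sum((p - mp) ** 2 for p in P))
b = -slope                          # porosity sensitivity
sigma0 = math.exp(my - slope * mp)  # extrapolated zero-porosity strength
```

The fitted pair (sigma0, b) summarizes the strength-porosity trade-off; a higher b means strength decays faster with increasing open porosity.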
Emery, A Ann; Heath, Nancy L; Mills, Devin J
2017-07-03
The present study applied self-determination theory to examine the onset, maintenance, and cessation of non-suicidal self-injury (NSSI) in adolescents. Specifically, the study examined the relationship between the basic psychological needs of autonomy, competence, and relatedness, and NSSI status. Participants were classified into the NSSI Maintain (n = 30), NSSI Start (n = 44), NSSI Stop (n = 21), or Control (n = 98) groups based on NSSI status over 2 time points within a 12-month period. Repeated measures multiple analysis of variance was employed. Satisfaction of the need for competence decreased over time in all adolescents. Adolescents who maintained NSSI behavior reported significantly lower levels of need satisfaction compared to adolescents reporting no history of NSSI engagement, and adolescents who began NSSI over the course of the study reported significantly lower levels of need satisfaction compared to those reporting no history of NSSI engagement. The findings suggest that need satisfaction varies as a function of NSSI status.
NASA Technical Reports Server (NTRS)
Schwind, R. G.; Allen, H. J.
1973-01-01
High-frequency surface pressure measurements were obtained from wind-tunnel tests over the Reynolds number range 1.2 × 10^6 to 6.2 × 10^6 on a rectangular wing of NACA 63-009 airfoil section. Measurements were also obtained with a wide selection of leading-edge serrations added to the basic airfoil. Under a two-dimensional laminar bubble very close to the leading edge of the basic airfoil there is a large spatial peak in rms pressure. Frequency analysis of the pressure signals in this region shows a large, high-frequency energy peak which is interpreted as an oscillation in the size and position of the bubble. The serrations divide the bubble into segments and reduce the peak rms pressures. A low-Reynolds-number flow visualization test on a hydrofoil in water was also conducted. A von Karman vortex street was found trailing from the rear of the foil. Its frequency is at a much lower Strouhal number than in the high-Reynolds-number experiment, and is related to the trailing-edge and boundary-layer thicknesses.
NASA Technical Reports Server (NTRS)
Omalley, T. A.; Connolly, D. J.
1977-01-01
The use of the coupled cavity traveling wave tube for space communications has led to an increased interest in improving the efficiency of the basic interaction process in these devices through velocity resynchronization and other methods. To analyze these methods, a flexible, large signal computer program for use on the IBM 360/67 time-sharing system has been developed. The present report is a users' manual for this program.
A Quantitative Analysis of Factors Affecting Retention of Female Aviators in U.S. Naval Aviation
2012-09-01
various reasons and is controlled by either US law or military regulations, and it equates to a basic quid pro quo scenario. The Navy offers the...Fitzgerald (2005) used time dependent modeling to investigate the effects of sexual harassment on turnover in the military. They discovered that females...exposed to sexual harassment were likely to leave either the job or organization to escape it, depending on the extent of the perceived threat
NASA Astrophysics Data System (ADS)
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
Changes in the Structure and Propagation of the MJO with Increasing CO2
NASA Technical Reports Server (NTRS)
Adames, Angel F.; Kim, Daehyun; Sobel, Adam H.; Del Genio, Anthony; Wu, Jingbo
2017-01-01
Changes in the Madden-Julian Oscillation (MJO) with increasing CO2 concentrations are examined using the Goddard Institute for Space Studies Global Climate Model (GCM). Four simulations performed with fixed CO2 concentrations of 0.5, 1, 2 and 4 times pre-industrial levels using the GCM coupled with a mixed layer ocean model are analyzed in terms of the basic state, rainfall and moisture variability, and the structure and propagation of the MJO. The GCM simulates basic state changes associated with increasing CO2 that are consistent with results from earlier studies: column water vapor increases at approximately 7.1% K(exp -1), precipitation also increases but at a lower rate (approximately 3% K(exp -1)), and column relative humidity shows little change. Moisture and rainfall variability intensify with warming. Total moisture and rainfall variability increases at a rate that is similar to that of the mean state change. The intensification is faster in the MJO-related anomalies than in the total anomalies, though the ratio of the MJO band variability to its westward counterpart increases at a much slower rate. On the basis of linear regression analysis and space-time spectral analysis, it is found that the MJO exhibits faster eastward propagation, faster westward energy dispersion, a larger zonal scale and a deeper vertical structure in warmer climates.
ERIC Educational Resources Information Center
Garza-Kling, Gina
2011-01-01
Traditionally, learning basic facts has focused on rote memorization of isolated facts, typically through the use of flash cards, repeated drilling, and timed testing. However, as many experienced teachers have seen, "drill alone does not develop mastery of single-digit combinations." In contrast, a fluency approach to learning basic addition…
5 CFR 551.401 - Basic principles.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Basic principles. 551.401 Section 551.401 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION UNDER THE FAIR LABOR STANDARDS ACT Hours of Work General Provisions § 551.401 Basic principles. (a) All time...
Basic Skills Applications in Occupational Investigation.
ERIC Educational Resources Information Center
Hendrix, Mary
This guide contains 50 lesson plans for learning activities that incorporate basic skills into content areas of career education, mathematics, science, social studies, communications, and productive work habits. Each lesson consists of a purpose, basic skills applications, approximate time required, materials needed, things for the teacher to do…
ERIC Educational Resources Information Center
Tuijnman, Albert C., Ed.; Kirsch, Irwin S., Ed.; Wagner, Daniel A., Ed.
This book contains 13 papers examining innovations in measuring adults' basic skills and analyzing adult literacy policy. The following papers are included: "Series Preface" (Daniel A. Wagner); "Foreword" (Torsten Husen); "Introduction" (Albert Tuijnman); "Adult Basic Skills: Policy Issues and a Research…
Code of Federal Regulations, 2010 CFR
2010-04-01
... the next, so that in any single year an employee may have a maximum of four weeks' vacation time. At... agreement providing for the deferral), the value of any unused vacation time from the prior year in excess... amount is the lesser of two times the basic annual limitation ($30,000) or the sum of the basic annual...
Code of Federal Regulations, 2011 CFR
2011-04-01
... the next, so that in any single year an employee may have a maximum of four weeks' vacation time. At... agreement providing for the deferral), the value of any unused vacation time from the prior year in excess... amount is the lesser of two times the basic annual limitation ($30,000) or the sum of the basic annual...
Arctic Sea Ice: Trends, Stability and Variability
NASA Astrophysics Data System (ADS)
Moon, Woosok
A stochastic Arctic sea-ice model is derived and analyzed in detail to interpret the recent decay and associated variability of Arctic sea-ice under changes in greenhouse gas forcing widely referred to as global warming. The approach begins from a deterministic model of the heat flux balance through the air/sea/ice system, which uses observed monthly-averaged heat fluxes to drive a time evolution of sea-ice thickness. This model reproduces the observed seasonal cycle of the ice cover, and it is to this model that stochastic noise, representing high-frequency variability, is introduced. The model takes the form of a single periodic non-autonomous stochastic ordinary differential equation. Following an introductory chapter, the two chapters that follow focus principally on the properties of the deterministic model in order to identify the main factors governing the stability of the ice cover. In chapter 2 the underlying time-dependent solutions to the deterministic model are analyzed for their stability. It is found that the response time-scale of the system to perturbations is dominated by the destabilizing sea-ice albedo feedback, which is operative in the summer, and the stabilizing long wave radiative cooling of the ice surface, which is operative in the winter. This basic competition is found throughout the thesis to define the governing dynamics of the system. In particular, as greenhouse gas forcing increases, the sea-ice albedo feedback becomes more effective at destabilizing the system. Thus, any projections of the future state of Arctic sea-ice will depend sensitively on the treatment of the ice-albedo feedback. This in turn implies that the treatment of a fractional ice cover, as the ice areal extent changes rapidly, must be handled with the utmost care. In chapter 3, the idea of a two-season model, with just winter and summer, is revisited. By breaking the seasonal cycle up in this manner one can simplify the interpretation of the basic dynamics. 
Whereas in the fully time-dependent seasonal model one finds a stable seasonal ice cover (vanishing in the summer but reappearing in the winter), in previous two-season models such a state could not be found. In this chapter the sufficient conditions for a stable seasonal ice cover are found, which amount to including a time variation of the shortwave radiance during summer. This provides a qualitative interpretation of the continuous and reversible shift from perennial to seasonally-varying states in the more complex deterministic model. In order to put the stochastic model into a realistic observational framework, in chapter 4 the analysis of daily satellite retrievals of ice albedo and ice extent is described. The basic statistics are examined, and a new method, multi-fractal temporally weighted detrended fluctuation analysis, is applied. Because the basic data are taken on daily time scales, the full fidelity of the retrieved data is accessed, revealing variability on time scales from days and weeks to seasonal and decadal. Importantly, the data show a white-noise structure on annual to biannual time scales, and this provides the basis for using a Wiener process for the noise in the stochastic Arctic sea-ice model. In chapter 5 a generalized perturbation analysis of a non-autonomous stochastic differential equation is developed and then applied to interpreting the variability of Arctic sea-ice as greenhouse gas forcing increases. The resulting analytic expressions for the statistical moments provide insight into the transient and memory-delay effects associated with the basic competition in the system: the ice-albedo feedback and long wave radiative stabilization, along with the asymmetry in the nonlinearity of the deterministic contributions to the model and the magnitude and structure of the stochastic noise. A systematic study of the impact of the noise structure, from additive to multiplicative, is undertaken in chapters 6 and 7. 
Finally, in chapter 8 the matter of including a fractional ice cover into a deterministic model is addressed. It is found that a simple but crucial mistake is made in one of the most widely used model schemes and this has a major impact given the important role of areal fraction in the ice-albedo feedback in such a model. The thesis is summarized in chapter 9.
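The model class at the heart of this thesis, a single periodic non-autonomous stochastic ODE driven by a Wiener process, can be illustrated with a minimal Euler-Maruyama integration. The drift term below is a toy relaxation toward a seasonal cycle, a hypothetical stand-in for the thesis's heat-flux balance, not its actual equations:

```python
import numpy as np

def euler_maruyama(f, sigma, x0, t_end, dt, rng):
    """Integrate dX = f(X, t) dt + sigma dW with the Euler-Maruyama scheme."""
    n = int(round(t_end / dt))
    x = np.empty(n + 1)
    x[0] = x0
    t = 0.0
    for k in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))  # Wiener increment, var = dt
        x[k + 1] = x[k] + f(x[k], t) * dt + sigma * dw
        t += dt
    return x

# Toy periodic non-autonomous drift: relaxation toward a seasonal cycle
# ("thick in winter, thin in summer"); illustrative only.
def drift(h, t):
    seasonal = 2.0 + np.cos(2 * np.pi * t)
    return -(h - seasonal)

rng = np.random.default_rng(0)
h = euler_maruyama(drift, sigma=0.1, x0=2.0, t_end=20.0, dt=0.01, rng=rng)
mean_late = h[1000:].mean()  # after transients, mean sits near the cycle mean 2.0
```

Reducing dt shrinks the discretization error; the periodic, time-dependent drift is what makes the equation non-autonomous, as in the thesis.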
Basic Science Training Program.
ERIC Educational Resources Information Center
Brummel, Clete
These six learning modules were developed for Lake Michigan College's Basic Science Training Program, a workshop to develop good study skills while reviewing basic science. The first module, which was designed to provide students with the necessary skills to study efficiently, covers the following topics: time management; an overview of a study…
Serwetnyk, Tara M; Filmore, Kristi; VonBacho, Stephanie; Cole, Robert; Miterko, Cindy; Smith, Caitlin; Smith, Charlene M
2015-01-01
Basic Life Support certification for nursing staff is achieved through various training methods. This study compared three American Heart Association training methods for nurses seeking Basic Life Support renewal: a traditional classroom approach and two online options. Findings indicate that online methods for Basic Life Support renewal deliver cost and time savings, while maintaining positive learning outcomes, satisfaction, and confidence level of participants.
Establishing the Content Validity of a Basic Computer Literacy Course.
ERIC Educational Resources Information Center
Clements, James; Carifio, James
1995-01-01
Content analysis of 13 textbooks and 2 Department of Education documents was conducted to ascertain common word processing, database, and spreadsheet software skills in order to determine which specific skills should be taught in a high school computer literacy course. Aspects of a basic computer course, created from this analysis, are described.…
Microeconomic Analysis with BASIC.
ERIC Educational Resources Information Center
Tom, C. F. Joseph
Computer programs written in BASIC for the study of microeconomic analysis with special emphasis in economic decisions on price, output, and profit of a business firm are described. A very brief overview of the content of each of the 28 computer programs comprising the course is provided; four of the programs are then discussed in greater detail.…
ERIC Educational Resources Information Center
Hamre, S.
The author discusses the need for severely handicapped students to acquire basic home living skills, reviews task analysis principles, and provides sample instructional programs. Listed are basic grooming, dressing, domestic maintenance, and cooking skills. A sample task analysis procedure is demonstrated for the skill of brushing teeth. Reported…
Miller, J.J.
1982-01-01
The spectral analysis and filter program package is written in the BASIC language for the HP-9845T desktop computer. The program's main purpose is to perform spectral analyses on digitized time-domain data. In addition, band-pass filtering of the data can be performed in the time domain. Various other processes, such as autocorrelation, can be performed on the time-domain data in order to precondition them for spectral analyses. The frequency-domain data can also be transformed back into the time domain if desired. Any data can be displayed on the CRT in graphic form using a variety of plot routines. A hard copy can be obtained immediately using the internal thermal printer. Data can also be displayed in tabular form on the CRT or internal thermal printer, or stored permanently on a mass storage device such as a tape or disk. A list of the processes performed, in the order in which they occurred, can be displayed at any time.
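The processing chain described, transform to the frequency domain, filter, and transform back, can be sketched in a few lines. The original package is in BASIC for the HP-9845T; the Python/NumPy version below is an illustrative re-creation of the idea, not the program itself:

```python
import numpy as np

def bandpass_spectrum(x, dt, f_lo, f_hi):
    """Band-pass filter a time series in the frequency domain and
    return the filtered signal plus its amplitude spectrum."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=dt)
    spec = np.fft.rfft(x)
    mask = (freqs >= f_lo) & (freqs <= f_hi)     # pass band
    spec_filtered = np.where(mask, spec, 0.0)
    x_filtered = np.fft.irfft(spec_filtered, n=n)  # back to time domain
    return x_filtered, freqs, np.abs(spec_filtered)

# Synthetic test signal: 5 Hz + 40 Hz components, sampled at 200 Hz for 2 s.
dt = 1.0 / 200.0
t = np.arange(0, 2, dt)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

# Keep only the 30-50 Hz band; the 5 Hz component should vanish.
y, freqs, amp = bandpass_spectrum(x, dt, 30.0, 50.0)
peak_freq = freqs[np.argmax(amp)]
```

The hard zero-out of out-of-band bins is the simplest possible filter; a production version would taper the band edges to avoid ringing on non-periodic data.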
Time-frequency analysis of phonocardiogram signals using wavelet transform: a comparative study.
Ergen, Burhan; Tatar, Yetkin; Gulcur, Halil Ozcan
2012-01-01
Analysis of phonocardiogram (PCG) signals provides a non-invasive means to determine the abnormalities caused by cardiovascular system pathology. In general, time-frequency representation (TFR) methods are used to study the PCG signal because it is one of the non-stationary bio-signals. The continuous wavelet transform (CWT) is especially suitable for the analysis of non-stationary signals and for obtaining the TFR, owing to its high resolution in both time and frequency, and it has recently become a favourite tool. It decomposes a signal in terms of elementary contributions called wavelets, which are shifted and dilated copies of a fixed mother wavelet function, and yields a joint TFR. Although the basic characteristics of the wavelets are similar, each type of wavelet produces a different TFR. In this study, eight of the best-known real wavelets are examined on typical PCG signals indicating heart abnormalities in order to determine the best wavelet for obtaining a reliable TFR. For this purpose, the wavelet energy and frequency spectrum estimations based on the CWT and the spectra of the chosen wavelets were compared with the energy distribution and the autoregressive frequency spectra in order to determine the most suitable wavelet. The results show that the Morlet wavelet is the most reliable wavelet for the time-frequency analysis of PCG signals.
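As a sketch of how a Morlet-based TFR separates non-stationary components, here is a minimal complex-Morlet CWT in Python/NumPy. This illustrates the general technique only; the study's own implementation and parameter choices are not given in the abstract:

```python
import numpy as np

def morlet_cwt(x, dt, freqs, w0=6.0):
    """Magnitude scalogram of x using a complex Morlet wavelet."""
    n = len(x)
    t = (np.arange(n) - n // 2) * dt           # wavelet support, centred at 0
    out = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)               # scale giving centre frequency f
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        out[i] = np.abs(np.convolve(x, psi, mode='same'))
    return out

# Non-stationary two-tone test signal (loosely PCG-like): 10 Hz, then 25 Hz.
fs = 200.0
t = np.arange(0, 2, 1 / fs)
x = np.where(t < 1.0, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 25 * t))

freqs = np.arange(5.0, 40.0, 1.0)
scalogram = morlet_cwt(x, 1 / fs, freqs)
f_early = freqs[np.argmax(scalogram[:, 50])]   # dominant frequency near t = 0.25 s
f_late = freqs[np.argmax(scalogram[:, 300])]   # dominant frequency near t = 1.5 s
```

The scalogram's row-wise maxima track the instantaneous frequency, which is exactly the property the study exploits when comparing mother wavelets on heart sounds.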
Analysis of anabolic steroids in hair: time courses in guinea pigs.
Shen, Min; Xiang, Ping; Yan, Hui; Shen, Baohua; Wang, Mengye
2009-09-01
Sensitive, specific, and reproducible methods for the quantitative determination of eight anabolic steroids in guinea pig hair have been developed using LC/MS/MS and GC/MS/MS. Methyltestosterone, stanozolol, methandienone, nandrolone, trenbolone, boldenone, methenolone and DHEA were administered intraperitoneally in guinea pigs. After the first injection, black hair segments were collected on shaved areas of skin. The analysis of these segments revealed the distribution of anabolic steroids in the guinea pig hair. The major components in hair are the parent anabolic steroids. The time courses of the concentrations of the steroids in hair (except methenolone, which does not deposit in hair) demonstrated that the peak concentrations were reached on days 2-4, except stanozolol, which peaked on day 10 after administration. The concentrations in hair appeared to be related to the physicochemical properties of the drug compound and to the dosage. These studies on the distribution of drugs in the hair shaft and on the time course of their concentration changes provide information relevant to the optimal time and method of collecting hair samples. Such studies also provide basic data that will be useful in the application of hair analysis in the control of doping and in the interpretation of results.
Analysis of space radiation data of semiconductor memories
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.; Brucker, G. J.; Stauffer, C. A.
1996-01-01
This article presents an analysis of radiation effects for several select device types and technologies aboard the Combined Release and Radiation Effects Satellite (CRRES) satellite. These space-flight measurements covered a period of about 14 months of mission lifetime. Single Event Upset (SEU) data of the investigated devices from the Microelectronics Package (MEP) were processed and analyzed. Valid upset measurements were determined by correcting for invalid readings, hard failures, missing data tapes (thus voids in data), and periods over which devices were disabled from interrogation. The basic resolution time of the measurement system was confirmed to be 2 s. Lessons learned, important findings, and recommendations are presented.
NASA Technical Reports Server (NTRS)
Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J,
2004-01-01
This report is a documentation of a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate for the mixing of a patch of high oxygen content media into the surrounding media which is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and mechanical shear levels generated are used to characterize the mixing process for different parameter values.
Evaluation of the Williams-type model for barley yields in North Dakota and Minnesota
NASA Technical Reports Server (NTRS)
Barnett, T. L. (Principal Investigator)
1981-01-01
The Williams-type yield model is based on multiple regression analysis of historical time series data at CRD level pooled to regional level (groups of similar CRDs). Basic variables considered in the analysis include USDA yield, monthly mean temperature, monthly precipitation, soil texture and topographic information, and variables derived from these. Technologic trend is represented by piecewise linear and/or quadratic functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test (1970-1979) demonstrate that biases are small and performance based on root mean square appears to be acceptable for the intended AgRISTARS large area applications. The model is objective, adequate, timely, simple, and not costly. It considers scientific knowledge on a broad scale but not in detail, and does not provide a good current measure of modeled yield reliability.
Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements
NASA Astrophysics Data System (ADS)
Krause, Marcin
2017-11-01
This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.
Advances in numerical and applied mathematics
NASA Technical Reports Server (NTRS)
South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)
1986-01-01
This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.
Giordano, P C; Plancke, A; Van Meir, C A; Janssen, C A H; Kok, P J M J; Van Rooijen-Nijdam, I H; Tanis, B C; van Huisseling, J C M; Versteegh, F G A
2006-08-01
We have offered, for the first time in The Netherlands, carrier diagnostics for hemoglobinopathies (HbP) to early pregnant women. The aim of this study was to establish whether carrier analysis would be welcomed by the public and feasible at the outpatient level. One hundred and thirty-nine randomly selected women were informed and offered basic carrier diagnostics at the first prenatal visit. Carrier diagnostics was accepted by 136 women (97.8%). The population consisted of 31% recent immigrants and 69% native Dutch. One carrier of HbS and one of beta-thalassemia were found, both in the group of recent immigrants. In both cases, partners were tested, excluding a couple at risk. In addition, five carriers of alpha(+)-thalassemia were diagnosed at the molecular level, one of them in the native Dutch population. Basic carrier analysis was done both at the Hospital Laboratory and at the Reference Laboratory. No discrepancies were found. This pilot study shows that (1) as predicted, the prevalence of risk-related HbP and of alpha(+)-thalassemia is high in the immigrant population; (2) the compliance with carrier analysis in both native Dutch and immigrants is virtually total; and (3) carrier diagnosis in early pregnancy and partner analysis in hospital laboratories is possible and is an effective tool for primary prevention of HbP in The Netherlands.
Automobile Engine: Basic Ignition Timing. Fordson Bilingual Demonstration Project.
ERIC Educational Resources Information Center
Vick, James E.
These two vocational instructional modules on basic automobile ignition timing and on engine operation, four-stroke cycle, are two of eight such modules designed to assist recently arrived Arab students, limited in English proficiency (LEP), in critical instructional areas in a comprehensive high school. Goal stated for this module is for the…
Stress-related and basic determinants of hair cortisol in humans: A meta-analysis.
Stalder, Tobias; Steudte-Schmiedgen, Susann; Alexander, Nina; Klucken, Tim; Vater, Annika; Wichmann, Susann; Kirschbaum, Clemens; Miller, Robert
2017-03-01
The analysis of hair cortisol concentrations (HCC) is a relatively new strategy to measure long-term cumulative cortisol levels, which is increasingly used in psychoneuroendocrinological research. Here, we conduct a first comprehensive meta-analysis of HCC research based on aggregated data from a total of 124 (sub)samples (66 independent studies; total N=10,289). We seek to answer two central questions: (i) Which covariates and basic features of HCC need to be considered in future research? (ii) What are the main determinants of HCC in terms of chronic stress exposure and mental health? Concerning basic characteristics, our findings identify several covariates to be considered (age, sex, hair washing frequency, hair treatment, oral contraceptive use), confirm a decline of HCC from the first to the second proximal 3 cm hair segment, and show positive associations between HCC and short-term salivary cortisol measures. Regarding chronic stress, we show that stress-exposed groups on the whole exhibit 22% increased HCC. This long-term cortisol hypersecretion emerges particularly when stress is still ongoing at the time of study (+43% HCC) but is not present in conditions of past/absent stress (-9% HCC, n.s.). We also report evidence for a 17% reduction in HCC in anxiety disorders, such as PTSD. Interestingly, no consistent associations with mood disorders and self-reports of perceived stress, depressiveness or social support are found. However, our findings reveal positive associations of HCC with stress-related anthropometric (body mass index, waist-to-hip ratio) and hemodynamic measures (systolic blood pressure). These meta-analytic results are discussed in the light of their practical implications and important areas for future inquiry are outlined. Copyright © 2017 Elsevier Ltd. All rights reserved.
Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie
2016-09-01
One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines the use of high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures to detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective manner for rapid phenotypic brightfield assessment of individual larva in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd.
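A composite index of this kind can be made concrete by combining per-larva morphometric features into a single score. Everything below is hypothetical, a sum of absolute z-scores against control larvae with made-up feature values; the published computational malformation index may be defined quite differently:

```python
import numpy as np

def malformation_index(features, control_mean, control_std):
    """Sum of absolute z-scores of morphometric features relative to controls.
    (Illustrative formula only; not the published index definition.)"""
    z = (features - control_mean) / control_std
    return np.abs(z).sum(axis=-1)

# Hypothetical features: [total_area, width, spine_length, length_width_ratio]
control_mean = np.array([1.00, 0.20, 3.50, 17.5])
control_std = np.array([0.05, 0.01, 0.10, 0.8])

normal_larva = np.array([1.02, 0.20, 3.45, 17.3])
deformed_larva = np.array([0.70, 0.28, 2.90, 10.4])

idx_normal = malformation_index(normal_larva, control_mean, control_std)
idx_deformed = malformation_index(deformed_larva, control_mean, control_std)
```

The point of such a score is that a single threshold can then flag dysmorphic larvae automatically, replacing most of the structured visual assessment.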
[The basic needs of the spouses of infarct patients in the acute phase of the treatment].
Takahashi, E I; da Silva, C A; Guerra, G M
1990-04-01
The purpose of this study was to identify the basic needs of the spouses of patients with myocardial infarction. Maslow's concept of basic needs was used as the conceptual framework. Analysis of the data showed the following needs affecting this population: safety, belongingness and love, and esteem.
Discovery and structural elucidation of the illegal azo dye Basic Red 46 in sumac spice.
Ruf, J; Walter, P; Kandler, H; Kaufmann, A
2012-01-01
An unknown red dye was discovered in a sumac spice sample during routine analysis for Sudan dyes. LC-DAD and LC-MS/MS did not reveal the identity of the red substance. Nevertheless, using LC-high-resolution MS and isotope ratio comparisons the structure was identified as Basic Red 46. The identity of the dye was further confirmed by comparison with a commercial hair-staining product and two textile dye formulations containing Basic Red 46. Analogous to the Sudan dyes, Basic Red 46 is an azo dye. However, some of the sample clean-up methodology utilised for the analysis of Sudan dyes in food prevents its successful detection. In contrast to the Sudan dyes, Basic Red 46 is a cation. Its cationic properties make it bind strongly to gel permeation columns and silica solid-phase extraction cartridges and prevent elution with standard eluents. This is the first report of Basic Red 46 in food. The structure elucidation of this compound as well as the disadvantages of analytical methods focusing on a narrow group of targeted analytes are discussed.
Bernard, Aaron W; Malone, Matthew; Kman, Nicholas E; Caterino, Jeffrey M; Khandelwal, Sorabh
2011-08-12
Professionalism development is influenced by the informal and hidden curriculum. The primary objective of this study was to better understand this experiential learning in the setting of the Emergency Department (ED). Secondarily, the study aimed to explore differences in the informal curriculum between Emergency Medicine (EM) and Internal Medicine (IM) clerkships. A thematic analysis was conducted on 377 professionalism narratives from medical students completing a required EM clerkship from July 2008 through May 2010. The narratives were analyzed using established thematic categories from prior research as well as basic descriptive characteristics. Chi-square analysis was used to compare the frequency of thematic categories to prior research in IM. Finally, emerging themes not fully appreciated in the established thematic categories were created using grounded theory. Observations involving interactions between attending physician and patient were most abundant. The narratives were coded as positive 198 times, negative 128 times, and hybrid 37 times. The two most abundant narrative themes involved manifesting respect (36.9%) and spending time (23.7%). Both of these themes were statistically more likely to be noted by students on EM clerkships compared to IM clerkships. Finally, one new theme regarding cynicism emerged during analysis. This analysis describes an informal curriculum that is diverse in themes. Student narratives suggest their clinical experiences to be influential on professionalism development. Medical students focus on different aspects of professionalism depending on clerkship specialty.
Fossil fuel and biomass burning effect on climate - Heating or cooling?
NASA Technical Reports Server (NTRS)
Kaufman, Yoram J.; Fraser, Robert S.; Mahoney, Robert L.
1991-01-01
The basic theory of the effect of pollution on cloud microphysics and its global implications is applied to compare the relative effect of a small increase in the consumption rate of oil, coal, or biomass burning on cooling and heating of the atmosphere. The characteristics of and evidence for the SO2 induced cooling effect are reviewed. This perturbation analysis approach permits linearization, therefore simplifying the analysis and reducing the number of uncertain parameters. For biomass burning the analysis is restricted to burning associated with deforestation. Predictions of the effect of an increase in oil or coal burning show that within the present conditions the cooling effect from oil and coal burning may range from 0.4 to 8 times the heating effect.
Finite element dynamic analysis on CDC STAR-100 computer
NASA Technical Reports Server (NTRS)
Noor, A. K.; Lambiotte, J. J., Jr.
1978-01-01
Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
[From hygiene to the building of a city: the state and sanitation in Rio de Janeiro].
Marques, E C
1995-01-01
The paper offers a historical analysis of the creation of the sector responsible for inaugurating and managing Rio de Janeiro's basic sanitary services, examining the period from the mid-nineteenth century, when sanitary issues were first posed, through the 1920s. The analysis centers on the relation between the service structure established by the state, on the one hand, and urban space, on the other, taking a particular look at the special interests involved in creation of this structure. Exploring the vast literature available on Rio de Janeiro's urban world at that time, the present text supplements this with an analysis focused above all on the architecting of Brazil's first policies on sanitation infrastructure.
Line-source excited impulsive EM field response of thin plasmonic metal films
NASA Astrophysics Data System (ADS)
Štumpf, Martin; Vandenbosch, Guy A. E.
2013-08-01
In this paper, reflection against and transmission through thin plasmonic metal films, basic building blocks of many plasmonic devices, are analytically investigated directly in the time domain for an impulsive electric and magnetic line-source excitation. The electromagnetic properties of thin metallic films are modeled via the Drude model. The problem is formulated with the help of approximate thin-sheet boundary conditions and the analysis is carried out using the Cagniard-DeHoop technique. Closed-form space-time expressions are found and discussed. The obtained time-domain analytical expressions reveal the existence of the phenomenon of transient oscillatory surface effects along a plasmonic metal thin sheet. Illustrative numerical examples of transmitted/reflected pulsed fields are provided.
On the Optimization of Aerospace Plane Ascent Trajectory
NASA Astrophysics Data System (ADS)
Al-Garni, Ahmed; Kassem, Ayman Hamdy
A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascent trajectory (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis is done on the hybrid technique to compare it with both basic genetic algorithms and particle swarm optimization with respect to convergence and execution time. Genetic algorithm optimization showed better execution time performance, while particle swarm optimization showed better convergence performance. The hybrid optimization technique, benefiting from both techniques, showed superior, robust performance, balancing convergence trends and execution time.
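The abstract above compares a hybrid against basic genetic algorithm and particle swarm optimization techniques. As a point of reference for the latter, a minimal global-best particle swarm optimizer can be sketched as below; this is not the authors' hybrid code, and all coefficients are conventional illustrative choices.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimize f over [lo, hi]^dim with a basic global-best PSO."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)   # simple test function
best, best_val = pso(sphere)
```

A genetic algorithm would replace the velocity update with selection, crossover, and mutation; the hybrid in the abstract presumably combines both kinds of update.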
Matsuyama, Ryota; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya
2018-01-01
Background A Rohingya refugee camp in Cox’s Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The previously immune fraction among refugees could not be explicitly estimated from background information, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. Methods A renewal process model was devised to estimate R0 and the ascertainment rate of cases, and the loss of susceptible individuals was modeled as one minus the sum of the initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of the initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. Results R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Discussion The estimated R0 was broadly consistent with a published estimate from endemic data, indicating that vaccination coverage of 86% must be achieved to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified. PMID:29629244
Matsuyama, Ryota; Akhmetzhanov, Andrei R; Endo, Akira; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya; Nishiura, Hiroshi
2018-01-01
A Rohingya refugee camp in Cox's Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The previously immune fraction among refugees could not be explicitly estimated from background information, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. A renewal process model was devised to estimate R0 and the ascertainment rate of cases, and the loss of susceptible individuals was modeled as one minus the sum of the initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of the initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. The estimated R0 was broadly consistent with a published estimate from endemic data, indicating that vaccination coverage of 86% must be achieved to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified.
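Both diphtheria records describe handling the uncertain initially immune fraction with Latin hypercube sampling. A one-dimensional sketch of the idea follows; the mapping from immune fraction p to R0 is a simple stand-in (not the authors' renewal-process model), and R_EFF and the sampling range of p are invented for illustration.

```python
import random

def latin_hypercube(n, seed=0):
    """n stratified U(0, 1) samples: exactly one per equal-width bin."""
    rng = random.Random(seed)
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)
    return samples

R_EFF = 4.0  # invented effective reproduction number among susceptibles

def r0_given_immune_fraction(p):
    # If a fraction p was immune at the outset, the same epidemic implies
    # a larger R0 in a fully susceptible population: R0 = R_eff / (1 - p).
    return R_EFF / (1.0 - p)

ps = [0.1 + 0.6 * u for u in latin_hypercube(1000)]   # p in [0.1, 0.7]
r0s = sorted(r0_given_immune_fraction(p) for p in ps)
median_r0 = r0s[len(r0s) // 2]
```

Compared with plain Monte Carlo, the stratification guarantees the uncertain range of p is covered evenly even with modest sample sizes, which is the property the authors exploit.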
Measuring the impact of air pollution on respiratory infection risk in China.
Tang, Sanyi; Yan, Qinling; Shi, Wei; Wang, Xia; Sun, Xiaodan; Yu, Pengbo; Wu, Jianhong; Xiao, Yanni
2018-01-01
China is now experiencing major public health challenges caused by air pollution. Few studies have quantified the dynamics of air pollution and its impact on the risk of respiratory infection. We conducted an integrated data analysis to quantify the association among the air quality index (AQI), meteorological variables and respiratory infection risk in Shaanxi province of China from November 15, 2010 to November 14, 2016. Our analysis illustrated a statistically significant positive correlation between the number of influenza-like illness (ILI) cases and AQI, and the respiratory infection risk increased progressively with increased AQI with a time lag of 0-3 days. We also developed mathematical models for the AQI trend and respiratory infection dynamics, incorporating AQI-dependent incidence and AQI-based behaviour change interventions. Our combined data and modelling analysis estimated the basic reproduction number for the respiratory infection during the study period to be 2.4076, higher than the basic reproduction number of the 2009 pandemic influenza in the same province. Our modelling-based simulations concluded that, in terms of respiratory infection risk reduction, persistent emission control under China's blue-sky programme is much more effective than substantial social-economic interventions implemented only during smog days. Copyright © 2017 Elsevier Ltd. All rights reserved.
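The reported 0-3 day lag between AQI and ILI cases suggests a lagged correlation analysis. A toy sketch of finding the best lag follows; the series are synthetic (not the study's data), constructed so that ILI roughly tracks AQI shifted by two days.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def lagged_correlation(aqi, ili, lag):
    """Correlate AQI on day t with ILI counts on day t + lag."""
    if lag == 0:
        return pearson(aqi, ili)
    return pearson(aqi[:-lag], ili[lag:])

# Synthetic series: ILI responds to AQI about two days later.
aqi = [50, 80, 120, 200, 150, 90, 60, 55, 70, 110]
ili = [5, 6, 8, 11, 15, 24, 18, 10, 7, 7]
best_lag = max(range(4), key=lambda k: lagged_correlation(aqi, ili, k))
```

A real analysis would use a distributed-lag regression with confounder adjustment rather than raw correlations, but the scan over candidate lags is the same idea.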
Index of time-of-travel studies of the US Geological Survey
Boning, Charles W.
1973-01-01
This index identifies locations on streams where the U. S. Geological Survey has investigated the time of travel of a highly soluble material moving through a reach of stream channel. This index provides information only on the location of studied stream reaches; it contains no basic data. It does contain, however, a list of references to published data and analytical reports on time of travel and a list of U.S. Geological Survey offices where basic time-of-travel data are on file.
NASA Astrophysics Data System (ADS)
Kaiser, C.; Roll, K.; Volk, W.
2017-09-01
In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which is a disadvantage especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each segment and assigning basic geometric shapes, the outline of the part is approximated. Based on this, hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of hemming process design will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.
Relevance of motion-related assessment metrics in laparoscopic surgery.
Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J
2013-06-01
Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation of basic psychomotor laparoscopic skills and their correlation with the different abilities they are intended to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight into the relevance of the results shown in this study.
Engineering Education: A Clear Decision
ERIC Educational Resources Information Center
Strimel, Greg J.; Grubbs, Michael E.; Wells, John G.
2017-01-01
The core subjects in P-12 education have a common key characteristic that makes them stable over time. That characteristic is a steady content. For example, in the sciences, the basics of biology remain the same--the cell is the basic building block around which organisms are defined, characterized, structured, etc. Similarly, the basics of…
Visual Basic Applications to Physics Teaching
ERIC Educational Resources Information Center
Chitu, Catalin; Inpuscatu, Razvan Constantin; Viziru, Marilena
2011-01-01
Derived from the BASIC language, VB (Visual Basic) is a programming language focused on the visual interface component. With graphics and functional components implemented, the programmer is able to build and use these components to achieve the desired application in a relatively short time. The VB language is a useful tool in physics teaching by creating…
Grating-assisted surface acoustic wave directional couplers
NASA Astrophysics Data System (ADS)
Golan, G.; Griffel, G.; Seidman, A.; Croitoru, N.
1991-07-01
Physical properties of novel grating-assisted Y directional couplers are examined using the coupled-mode theory. A general formalism for the analysis of the lateral perturbed directional coupler properties is presented. Explicit expressions for waveguide key parameters such as coupling length, grating period, and other structural characterizations, are obtained. The influence of other physical properties such as time and frequency response or cutoff conditions are also analyzed. A plane grating-assisted directional coupler is presented and examined as a basic component in the integrated acoustic technology.
1993-01-29
Bessel functions and Jacobi functions (cf. [2]). References [1] R. Askey & J. Wilson, Some basic hypergeometric orthogonal polynomials that generalize...1; 1] can be treated as a part of the general theory of T-systems (see [8] for that theory and [7] for some aspects of the Chebyshev polynomials theory...waves in elastic media. It has been known for some time that these multiplicities sometimes occur for topological reasons and are present generically, see
2012-01-01
1200 Session 3 – C2 Framework, OR Methods MOOs, MOEs, MOPs Development Case Study – 1300-1630 Session 4 – Findings...Objective 1: Understand the impact of the application of traditional operational research techniques to networked C2 systems. • Objective 2: Develop ...for the network. 3. Cost measures including cost and time to implement the solution (for example, a basic rule-of-thumb I use for development
NASA Technical Reports Server (NTRS)
Mcdougal, David S. (Editor); Wagner, H. Scott (Editor)
1990-01-01
FIRE (First ISCCP Regional Experiment) is a U.S. cloud-radiation program that seeks to address the issues of a basic understanding and parameterizations of cirrus and marine stratocumulus cloud systems and ISCCP data products. The papers describe research analysis of data collected at the 1986 Cirrus Intensive Field Observations (IFO), the 1987 Marine Stratocumulus IFO, and the Extended Time Observations. The papers are grouped into sessions on satellite studies, lidar/radiative properties/microphysical studies, radiative properties, thermodynamic and dynamic properties, case studies, and large scale environment and modeling studies.
Ship Track Cloud Analysis for the North Pacific Area
1988-09-01
referred to as CBD) have theorized that the impact of aerosols on the radiation budget, due to their interaction with clouds, may be several times...tracks in regions where they were known to exist. While this study was limited to a few test cases, it did prove the feasibility of developing a...on 13 July 1987 C. OBJECTIVES AND ORGANIZATION The goal of this thesis is to generalize the work of CBD. This effort will have two basic objectives
NASA Technical Reports Server (NTRS)
1974-01-01
Accomplishments in the continuing programs are reported. The data were obtained in support of the following broad objectives: (1) to provide a precise and accurate geometric description of the earth's surface; (2) to provide a precise and accurate mathematical description of the earth's gravitational field; and (3) to determine time variations of the geometry of the ocean surface, the solid earth, the gravity field, and other geophysical parameters.
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-01-01
Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-10-23
Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
Goals 2000: Overview and Analysis. CRS Report for Congress.
ERIC Educational Resources Information Center
Stedman, James B.
Goals 2000: Educate America Act (P.L. 103-227) authorizes a range of initiatives for federal support of education reform. Its basic strategy is that of systemic reform guided by sets of agreed-upon educational goals and standards at each level of governance. An overview and analysis of the Act's basic provisions and authorizations is provided.…
The Etymology of Basic Concepts in the Experimental Analysis of Behavior
ERIC Educational Resources Information Center
Dinsmoor, James A.
2004-01-01
The origins of many of the basic concepts used in the experimental analysis of behavior can be traced to Pavlov's (1927/1960) discussion of unconditional and conditional reflexes in the dog, but often with substantial changes in meaning (e.g., stimulus, response, and reinforcement). Other terms were added by Skinner (1938/1991) to describe his…
Confirmatory Factor Analysis of the TerraNova Comprehensive Tests of Basic Skills/5
ERIC Educational Resources Information Center
Stevens, Joseph J.; Zvoch, Keith
2007-01-01
Confirmatory factor analysis was used to explore the internal validity of scores on the TerraNova Comprehensive Tests of Basic Skills/5 using samples from a southwestern school district and standardization samples reported by the publisher. One of the strengths claimed for battery-type achievement tests is provision of reliable and valid samples…
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
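Two of the Part 1 topics listed above, descriptive statistics and the comparison of two populations, can be illustrated in a few lines. The data below are invented, and Welch's t statistic is just one common choice for comparing means when variances may differ; it is not necessarily the procedure the report teaches.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sample_variance(xs):
    """Unbiased sample variance (divides by n - 1)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_t(xs, ys):
    """t statistic for a difference in means, unequal variances allowed."""
    se = math.sqrt(sample_variance(xs) / len(xs)
                   + sample_variance(ys) / len(ys))
    return (mean(xs) - mean(ys)) / se

# Invented measurements from two populations (e.g. two production lots).
a = [5.1, 4.9, 5.3, 5.0, 5.2]
b = [4.6, 4.7, 4.5, 4.8, 4.6]
t = welch_t(a, b)   # large |t| suggests the population means differ
```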
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs place high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacking in formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called CAVEs (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We show a possibility analysis based on the EFTN model of the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
Liu, Ning; Lu, Xin; Yang, YuHan; Yao, Chen Xi; Ning, BaoMing; He, Dacheng; He, Lan; Ouyang, Jin
2015-10-01
A new approach for monitoring the binding affinity between drugs and alpha 1-acid glycoprotein in real time was developed, based on a drug-protein reaction followed by Venturi easy ambient sonic-spray ionization mass spectrometry determination of the free drug concentrations. A known basic drug, propranolol, was used to validate the newly built method. Binding constant values calculated by Venturi easy ambient sonic-spray ionization mass spectrometry were in good agreement with those from a traditional ultrafiltration method combined with high-performance liquid chromatography. Then six types of basic drugs were used as samples to conduct the real-time analysis. Upon injection of alpha 1-acid glycoprotein into the drug mixture, the ion chromatograms were extracted to show the changes in the free drug concentrations in real time. By observing the drop-out of the six drugs during the whole binding reaction, the binding affinities of the different drugs were distinguished. A volume shift validating experiment and an injection delay correcting experiment were also performed to eliminate extraneous factors and verify the reliability of the experiment. Therefore, the features of Venturi easy ambient sonic-spray ionization mass spectrometry (V-EASI-MS) and the experimental results indicate that this technique is likely to become a powerful tool for monitoring drug-AGP binding affinity in real time. Copyright © 2015 Elsevier B.V. All rights reserved.
Schepin, V O
2013-01-01
The article presents the results of a complex scientific analysis of the number and structure of physicians and paramedical personnel in the public and municipal health care system of the Russian Federation. The provision of the country's population, its federal okrugs and federation subjects with physicians and paramedical personnel of various specialties is analyzed too, including the ratio of physicians to paramedical personnel and the territorial differentiation of the provision of the population with basic medical personnel. The study results demonstrate that in 2012 the provision of the population (per 10,000 of population) with physicians increased from 43.9 to 44.7. At the same time, provision with paramedical personnel decreased from 92.3 to 90.8. Significant territorial disproportions in the provision of the population with medical personnel persist in the Russian Federation. The provision of the population with physicians and paramedical personnel is 4.3 times and 1.9 times higher, respectively, in cities than in rural areas. The differences between extreme indicators of the provision of the population of the Russian Federation with physicians and paramedical personnel are 2.9 and 2.4 times correspondingly. The differences between indicators of provision with physicians of clinical specialties are 2.6 times. The average ratio between physicians and paramedical personnel is 1:2.03. The structure of medical manpower only partially corresponds to the population's need for medical care. The materials substantiate the necessity to continue modernization, optimization and development of manpower support of the public health care system in the Russian Federation.
NASA Technical Reports Server (NTRS)
Achtemeier, Gary L.; Scott, Robert W.; Chen, J.
1991-01-01
A summary is presented of the progress toward the completion of a comprehensive diagnostic objective analysis system based upon the calculus of variations. The approach was first to develop the objective analysis subject to the constraints that the final product satisfies the five basic primitive equations for a dry inviscid atmosphere: the two nonlinear horizontal momentum equations, the continuity equation, the hydrostatic equation, and the thermodynamic equation. Then, having derived the basic model, the equations for moist atmospheric processes and the radiative transfer equation would be added to it.
Changes in Naval Aviation Basic Instrument Flight Training: An Analysis.
1985-12-01
position to the desired attitude in relation to the horizon [Refs. 4,5: pp.2,16-3]. C. BASIC INSTRUMENT FLIGHT TRAINING The objective of basic...were related to the treatment lecture: 1. Basic Air Work (BAW) 2. Partial Panel 3. Unusual Attitudes (full panel) 4. Initial Climb to Altitude (ICA) 5...of student aviators was compared. The modifications consisted of a lecture concentrating on the fundamentals of attitude instrument flight. One group
A stochastic bioburden model for spacecraft sterilization.
NASA Technical Reports Server (NTRS)
Roark, A. L.
1972-01-01
Development of a stochastic model of the probability distribution for the random variable representing the number of microorganisms on a surface as a function of time. The first basic principle associated with bioburden estimation is that viable particles are removed from surfaces. The second notion important to the analysis is that microorganisms in environments and on surfaces occur in clumps. The last basic principle relating to bioburden modeling is that viable particles are deposited on a surface. The bioburden on a spacecraft is determined by the amount and kind of control exercised on the spacecraft assembly location, the shedding characteristics of the individuals in the vicinity of the spacecraft, its orientation, the geographical location in which the assembly takes place, and the steps in the assembly procedure. The model presented has many of the features which are desirable for its use in the spacecraft sterilization programs currently being planned by NASA.
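The abstract names three ingredients: removal of viable particles from surfaces, deposition, and arrival in clumps. A toy stochastic simulation combining them follows; this is not Roark's actual model, and all rates are invented for illustration.

```python
import math
import random

def poisson(lam, rng):
    """Poisson-distributed sample via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def step(count, rng, deposit_rate=2.0, clump_size=5, removal_prob=0.1):
    """One time step: each particle is removed independently with
    probability removal_prob; new clumps arrive as a Poisson process."""
    survivors = sum(1 for _ in range(count) if rng.random() > removal_prob)
    return survivors + clump_size * poisson(deposit_rate, rng)

rng = random.Random(1)
burden, history = 0, []
for _ in range(2000):
    burden = step(burden, rng)
    history.append(burden)

# In expectation the burden settles near deposit_rate * clump_size /
# removal_prob = 100, fluctuating because deposition arrives in clumps.
steady = history[1500:]
```

The balance point comes from setting expected removals equal to expected depositions per step, which mirrors the abstract's point that bioburden is determined by the competing deposition and removal processes.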
Helical vortices: linear stability analysis and nonlinear dynamics
NASA Astrophysics Data System (ADS)
Selçuk, C.; Delbende, I.; Rossi, M.
2018-02-01
We numerically investigate, within the context of helical symmetry, the dynamics of a regular array of two or three helical vortices with or without a straight central hub vortex. The Navier-Stokes equations are linearised to study the instabilities of such basic states. For vortices with low pitches, an unstable mode is extracted which corresponds to a displacement mode and growth rates are found to compare well with results valid for an infinite row of point vortices or an infinite alley of vortex rings. For larger pitches, the system is stable with respect to helically symmetric perturbations. In the nonlinear regime, we follow the time-evolution of the above basic states when initially perturbed by the dominant instability mode. For two vortices, sequences of overtaking events, leapfrogging and eventually merging are observed. The transition between such behaviours occurs at a critical ratio involving the core size and the vortex-separation distance. Cases with three helical vortices are also presented.
Mathematical model for transmission of tuberculosis in badger population with vaccination
NASA Astrophysics Data System (ADS)
Tasmi, Aldila, D.; Soewono, E.; Nuraini, N.
2016-04-01
Badgers were first identified as carriers of bovine tuberculosis in England about 30 years ago. Bovine tuberculosis can be transmitted to other species through feces, saliva, and breath. The control of tuberculosis in badgers is necessary to reduce the spread of the disease to other species. Many actions have been taken by the government to tackle the disease, such as culling badgers with cyanide gas, but this destroys the natural balance and disrupts the badger population. An alternative way to eliminate tuberculosis within the badger population is vaccination. In this paper a model for the transmission of badger tuberculosis with vaccination is discussed. The existence of the endemic equilibrium, its stability and the basic reproduction ratio are derived analytically. Numerical simulations show that with a proper vaccination level, the basic reproduction ratio can be reduced significantly. Sensitivity analysis with respect to parameter variation is also presented numerically.
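The abstract's central quantities, the basic reproduction ratio and the vaccination level needed to reduce it, can be sketched in a generic SIR-style setting. This is not the paper's badger-specific model, and the parameter values are invented for illustration.

```python
# In a basic SIR-type framework with transmission rate beta and removal
# rate gamma, the basic reproduction ratio is R0 = beta / gamma, and
# vaccinating a fraction v of the population scales it to (1 - v) * R0.
# Parameter values below are invented.

def basic_reproduction_ratio(beta, gamma):
    return beta / gamma

def effective_ratio(r0, coverage):
    return (1.0 - coverage) * r0

def critical_coverage(r0):
    """Smallest vaccination fraction that drives the effective ratio to 1."""
    return max(0.0, 1.0 - 1.0 / r0)

r0 = basic_reproduction_ratio(beta=1.5, gamma=0.5)
print(r0, round(critical_coverage(r0), 3))   # 3.0 0.667
```

This is the standard herd-immunity threshold argument; the paper's conclusion that a proper vaccination level reduces the reproduction ratio significantly is the same mechanism in a more detailed model.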
The psychological science of addiction.
Gifford, Elizabeth; Humphreys, Keith
2007-03-01
To discuss the contributions and future course of the psychological science of addiction. The psychology of addiction includes a tremendous range of scientific activity, from the basic experimental laboratory through increasingly broad relational contexts, including patient-practitioner interactions, families, social networks, institutional settings, economics and culture. Some of the contributions discussed here include applications of behavioral principles, cognitive and behavioral neuroscience and the development and evaluation of addiction treatment. Psychology has at times been guilty of proliferating theories with relatively little pruning, and of overemphasizing intrapersonal explanations for human behavior. However, at its best, defined as the science of the individual in context, psychology is an integrated discipline using diverse methods well-suited to capture the multi-dimensional nature of addictive behavior. Psychology has a unique ability to integrate basic experimental and applied clinical science and to apply the knowledge gained from multiple levels of analysis to the pragmatic goal of reducing the prevalence of addiction.
Mariner, R.H.; Venezky, D.Y.; Hurwitz, S.
2006-01-01
Chemical and isotope data accumulated by two USGS Projects (led by I. Barnes and R. Mariner) over a period of about 40 years can now be found using a basic web search or through an image search. The data are primarily chemical and isotopic analyses of waters (thermal, mineral, or fresh) and associated gas (free and/or dissolved) collected from hot springs, mineral springs, cold springs, geothermal wells, fumaroles, and gas seeps. Additional information is available about the collection methods and analysis procedures. The chemical and isotope data are stored in a MySQL database and accessed via PHP from a basic search form. Data can also be accessed using an open-source GIS called WorldKit. Additional information is available about WorldKit, including the files used to set up the site.
Reduction and Smelting of Vanadium Titanomagnetite Metallized Pellets
NASA Astrophysics Data System (ADS)
Wang, Shuai; Chen, Mao; Guo, Yufeng; Jiang, Tao; Zhao, Baojun
2018-04-01
Reduction and smelting of vanadium titanomagnetite metallized pellets have been experimentally investigated in this study. Using high-temperature smelting, rapid quenching, and the electron probe x-ray microanalysis (EPMA) technique, the effects of basicity, reaction time, and graphite reductant amount were investigated. The vanadium content in the iron alloy increases with increasing basicity, reaction time, and graphite amount, whereas the FeO and V2O3 concentrations in the liquid phase decrease with increasing graphite amount and reaction time. Increasing the reaction time and reductant content promotes the reduction of titanium oxides, whereas this reduction can be suppressed by increasing the slag basicity. Titanium carbide (TiC) was not observed in any of the quenched samples under the present conditions. The experimental results are also compared with FactSage calculations in the present study.
Yamazaki, Yuka; Uka, Takanori; Marui, Eiji
2017-09-15
In Japan, the field of Basic Sciences encompasses clinical, academic, and translational research, as well as the teaching of medical sciences, with both an MD and PhD typically required. In this study, it was hypothesized that the characteristics of a Basic Sciences career path could offer the professional advancement and personal fulfillment that many female medical doctors would find advantageous. Moreover, encouraging interest in Basic Sciences could help stem shortages that Japan is experiencing in medical fields, as noted in the three principal contributing factors: premature resignation of female clinicians, an imbalance of female physicians engaged in research, and a shortage of medical doctors in the Basic Sciences. This study examines the professional and personal fulfillment expressed by Japanese female medical doctors who hold positions in Basic Sciences. Topics include career advancement, interest in medical research, and greater flexibility for parenting. A cross-sectional questionnaire survey was distributed at all 80 medical schools in Japan, directed to 228 female medical doctors whose academic rank was assistant professor or higher in departments of Basic Sciences in 2012. Chi-square tests and the binary logistic regression model were used to investigate the impact of parenthood on career satisfaction, academic rank, salary, etc. The survey response rate of female physicians in Basic Sciences was 54.0%. Regardless of parental status, one in three respondents cited research interest as their rationale for entering Basic Sciences, well over twice other motivations. A majority had clinical experience, with clinical duties maintained part-time by about half of respondents and particularly parents. Only one third expressed afterthoughts about relinquishing full-time clinical practice, with physicians who were parents expressing stronger regrets. 
Parental status had little effect on academic rank and income within the Basic Sciences. CONCLUSION: Scientific curiosity and a desire to improve community health are hallmarks of those choosing a challenging career in medicine. Therefore, it is unsurprising that interest in research is the primary motivation for a female medical doctor to choose a career in Basic Sciences. Additionally, as with many young professionals with families, female doctors seek balance in their professional and private lives. Although many expressed second thoughts about relinquishing a full-time clinical practice, mothers generally benefited from greater job flexibility, with little significant effect on career development and income as Basic Scientists.
Ring, P R; Bostick, J M
2000-04-01
A sensitive and selective high-performance liquid chromatography (HPLC) method was developed for the determination of zolpidem in human plasma. Zolpidem and the internal standard (trazodone) were extracted from human plasma that had been made basic. The basic sample was loaded onto a conditioned Bond Elut C18 cartridge, rinsed with water, and eluted with methanol. Forty microliters were then injected onto the LC system. Separation was achieved on a C18 column (150 x 4.6 mm, 5 microm) with a mobile phase composed of acetonitrile:50 mM potassium phosphate monobasic at pH 6.0 (4:6, v/v). Detection was by fluorescence, with excitation at 254 nm and emission at 400 nm. The retention times of zolpidem and the internal standard were approximately 4.7 and 5.3 min, respectively. The LC run time was 8 min. The assay was linear in the concentration range 1-400 ng/ml for zolpidem in human plasma. The analysis of quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated excellent precision with relative standard deviations (RSD) of 3.7, 4.6, and 3.0%, respectively (n = 18). The method was accurate, with all intraday (n = 6) and overall (n = 18) mean concentrations within 5.8% of nominal at all quality control sample concentrations. This method was also performed using a Gilson Aspec XL automated sample processor and autoinjector. The samples were manually fortified with internal standard and made basic. The Aspec then performed the solid phase extraction and made injections of the samples onto the LC system. Using the automated procedure for analysis, quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated acceptable precision with RSD values of 9.0, 4.9, and 5.1%, respectively (n = 12). The method was accurate, with all intracurve (n = 4) and overall (n = 12) mean values within 10.8% of nominal at all quality control sample concentrations.
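The precision figures quoted above are relative standard deviations. As a minimal sketch of how such an RSD is computed from quality-control replicates (the replicate values below are hypothetical, not the study's raw data):

```python
# Relative standard deviation (coefficient of variation) of QC replicates.
from statistics import mean, stdev

def rsd_percent(values):
    """RSD in percent: 100 * sample standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

qc_30 = [29.1, 30.4, 31.2, 28.8, 30.9, 29.6]  # hypothetical 30 ng/ml replicates
print(round(rsd_percent(qc_30), 1))
```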
Materials Discovery | Materials Science | NREL
The basic research projects develop applications using high-throughput combinatorial research methods, measurement methods, and specialized analysis algorithms.
[Analysis of Forensic Characteristics about 23 Family Homicide Cases].
Xie, X; Dong, X D
2016-08-01
To provide references for the forensic analysis of family homicide cases by analyzing the scene, injury, and individual circumstances of family homicide cases in a county. Data from 23 family homicide cases from 2004 to 2013 were collected. The basic situation of the individuals involved, the relationship between the deceased and the suspect, the cause of death, the motive, the location, time, and tools of the crime, and the behavior of the suspect after the crime were analyzed. The characteristics of the 23 family homicide cases showed that a spousal relationship was the most common relationship; passion killing was the most common motive; local materials were mostly used as tools for committing the crimes; most crimes were committed in residences; and most crimes occurred at night. The analysis of family homicide cases should be based on scene investigation and examination of the body, combined with investigation of the circumstances. Copyright© by the Editorial Department of Journal of Forensic Medicine
Frequency domain analysis of knock images
NASA Astrophysics Data System (ADS)
Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin
2014-12-01
High speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark triggered flame speed, the time when end gas auto-ignition occurs and the end gas flame speed after auto-ignition. This study presents a frequency domain analysis on the knock images recorded using a high speed camera with direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying fast Fourier transform (FFT) to three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
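The frequency-domain step described above (FFT of the high-pass-filtered colour components) can be illustrated with a minimal discrete Fourier transform on a synthetic oscillation trace; the 6 kHz mode and 48 kHz frame rate below are hypothetical stand-ins, not values from the paper:

```python
import cmath, math

def dft_magnitudes(x):
    """Magnitudes of the first half of the discrete Fourier transform."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

fs = 48000.0  # hypothetical high-speed camera frame rate, Hz
n = 480
# Synthetic high-pass-filtered luminosity trace: a single 6 kHz oscillation.
signal = [math.sin(2 * math.pi * 6000.0 * t / fs) for t in range(n)]
mags = dft_magnitudes(signal)
peak_bin = max(range(1, len(mags)), key=lambda k: mags[k])  # skip DC bin
print(peak_bin * fs / n)  # dominant oscillation frequency, Hz
```

The same peak-finding step, applied per colour channel and per pixel region, is what lets the resonant mode shapes be reconstructed from the amplitudes at the resonant frequencies.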
Rapid method for the quantification of hydroquinone concentration: chemiluminescent analysis.
Chen, Tung-Sheng; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Jong, Gwo-Ping; Wang, Hsueh-Fang; Shen, Chia-Yao; Padma, V Vijaya; Huang, Chih-Yang; Chang, Yen-Lin
2015-11-01
Topical hydroquinone serves as a skin whitener and is usually available in cosmetics or on prescription based on the hydroquinone concentration. Quantification of hydroquinone content therefore becomes an important issue in topical agents. High-performance liquid chromatography (HPLC) is the commonest method for determining hydroquinone content in topical agents, but this method is time-consuming and uses many solvents that can become an environmental issue. We report a rapid method for quantifying hydroquinone content by chemiluminescent analysis. Hydroquinone induces the production of hydrogen peroxide in the presence of basic compounds. Hydrogen peroxide induced by hydroquinone oxidized light-emitting materials such as lucigenin, resulted in the production of ultra-weak chemiluminescence that was detected by a chemiluminescence analyzer. The intensity of the chemiluminescence was found to be proportional to the hydroquinone concentration. We suggest that the rapid (measurement time, 60 s) and virtually solvent-free (solvent volume, <2 mL) chemiluminescent method described here for quantifying hydroquinone content may be an alternative to HPLC analysis. Copyright © 2015 John Wiley & Sons, Ltd.
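The proportionality between chemiluminescence intensity and hydroquinone concentration implies a simple linear calibration. A sketch with hypothetical intensity readings (the paper reports no raw calibration data):

```python
# Ordinary least-squares calibration line: intensity vs. concentration.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc = [0.5, 1.0, 2.0, 4.0]           # hydroquinone, % w/w (hypothetical)
intensity = [12.1, 24.3, 47.8, 96.0]  # chemiluminescence counts (hypothetical)
slope, intercept = fit_line(conc, intensity)
unknown = (55.0 - intercept) / slope  # back-calculated unknown concentration
print(round(unknown, 2))
```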
Discrete Mathematical Approaches to Graph-Based Traffic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Cowley, Wendy E.; Hogan, Emilie A.
2014-04-01
Modern cyber defense and analytics require general, formal models of cyber systems. Multi-scale network models are prime candidates for such formalisms, using discrete mathematical methods based in hierarchically-structured directed multigraphs which also include rich sets of labels. An exemplar of an application of such an approach is traffic analysis, that is, observing and analyzing connections between clients, servers, hosts, and actors within IP networks, over time, to identify characteristic or suspicious patterns. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. In this paper, we consider traffic analysis of NetFlow using both basic graph statistics and two new mathematical measures involving labeled degree distributions and time interval overlap measures. We do all of this over the VAST test data set of 96M synthetic NetFlow graph edges, against which we can identify characteristic patterns of simulated ground-truth network attacks.
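One of the basic graph statistics mentioned above is a degree distribution over flow edges. A minimal sketch on a tiny hypothetical edge list (not the VAST data set):

```python
# Degree statistics over NetFlow-style (source, destination) edges.
from collections import Counter

flows = [("10.0.0.1", "10.0.0.9"), ("10.0.0.1", "10.0.0.8"),
         ("10.0.0.2", "10.0.0.9"), ("10.0.0.3", "10.0.0.9"),
         ("10.0.0.1", "10.0.0.9")]  # repeated flow = parallel edge (multigraph)

out_degree = Counter(src for src, _ in flows)
in_degree = Counter(dst for _, dst in flows)
# Degree distribution: how many nodes have each out-degree value.
distribution = Counter(out_degree.values())
print(out_degree["10.0.0.1"], in_degree["10.0.0.9"], dict(distribution))
```

In a labeled multigraph setting, the same counting would be done per edge label (port, protocol), which is the idea behind the labeled degree distributions the paper introduces.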
The Philosopher's Stone: How Basic Skills Programs Fare in Troubled Financial Times
ERIC Educational Resources Information Center
Ray, Thomas P.
2012-01-01
This mixed methods study examined the relative position of basic skills programs with transfer and career technical programs in a large suburban community college in California during the three-year period of budget reductions from 2009-2010 through 2011-2012. The budget line dedicated to part-time or non-contract instruction was analyzed along…
Liang, Fu-Wen; Chan, Wenyaw; Chen, Ping-Jen; Zimmerman, Carissa; Waring, Stephen; Doody, Rachelle
2016-01-01
Some Alzheimer's disease (AD) patients die without ever developing impairment in basic activities of daily living (basic ADL), which may reflect slower disease progression or better compensatory mechanisms. Although impaired basic ADL is related to disease severity, it may exert an independent risk for death. This study examined the association between impaired basic ADL and survival of AD patients, and proposed a multistate approach for modeling the time to death for patients who demonstrate different patterns of progression of AD that do or do not include basic ADL impairment. 1029 patients with probable AD at the Baylor College of Medicine Alzheimer's Disease and Memory Disorders Center met the criteria for this study. Two complementary definitions were used to define development of basic ADL impairment using the Physical Self-Maintenance Scale score. A weighted Cox regression model, including a time-dependent covariate (development of basic ADL impairment), and a multistate survival model were applied to examine the effect of basic ADL impairment on survival. As expected, decreased ability to perform basic ADL at baseline, age at initial visit, years of education, and sex were all associated with significantly higher mortality risk. In those unimpaired at baseline, the development of basic ADL impairment was also associated with a much greater risk of death (hazard ratios 1.77-4.06) over and above the risk conferred by loss of MMSE points. A multistate Cox model, controlling for those other variables, quantified the substantive increase in hazard ratios for death conferred by the development of basic ADL impairment by the two definitions and can be applied to calculate the short-term risk of mortality in individual patients. The current study demonstrates that the presence of basic ADL impairment or the development of such impairment is an important predictor of death in AD patients, regardless of severity.
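The study uses weighted Cox and multistate models; as a far simpler, self-contained illustration of the underlying time-to-death estimation, here is a Kaplan-Meier survival estimate over hypothetical (time, event) records:

```python
def kaplan_meier(records):
    """records: (time, event) pairs, event=1 for death, 0 for censoring.
    Returns [(time, survival probability)] at each distinct death time."""
    s, curve = 1.0, []
    times = sorted({t for t, e in records if e == 1})
    for t in times:
        at_risk = sum(1 for ti, _ in records if ti >= t)
        deaths = sum(1 for ti, e in records if ti == t and e == 1)
        s *= 1.0 - deaths / at_risk
        curve.append((t, s))
    return curve

data = [(2, 1), (3, 0), (4, 1), (5, 1), (5, 0), (8, 1)]  # hypothetical years
print(kaplan_meier(data))
```

A Cox model goes further by relating the hazard underlying such a curve to covariates (here, basic ADL impairment as a time-dependent covariate).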
Research on Basic Design Education: An International Survey
ERIC Educational Resources Information Center
Boucharenc, C. G.
2006-01-01
This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…
Profiles of Learning. The Basic Skills Testing Program in New South Wales: 1989.
ERIC Educational Resources Information Center
Masters, Geofferey; And Others
This publication on the New South Wales' Basic Skills Testing Program (BSTP) describes the development of the program's tests, the analysis of students' results, and the communication of results to parents, teachers, and schools. In BSTP tests, basic skills are defined not as low-level, rudimentary survival skills, but as major areas of learning…
Problems in Choosing a Theory of Basic Writing: Toward a Rhetoric of Scholarly Discourse.
ERIC Educational Resources Information Center
Bizzell, Patricia
This paper discusses some of the problems faced in working with competing theories of basic writing and suggests its own kind of theoretical analysis of nonstandard writing. A brief overview of basic writing theories is presented, and the theories are categorized into two approaches: a traditional approach of teaching by prescription in an…
ERIC Educational Resources Information Center
Sidman, Murray
2011-01-01
I have written before about the importance of applied behavior analysis to basic researchers. That relationship is, however, reciprocal; it is also critical for practitioners to understand and even to participate in basic research. Although applied problems are rarely the same as those investigated in the laboratory, practitioners who understand…
NASA Technical Reports Server (NTRS)
Melick, H. C., Jr.; Ybarra, A. H.; Bencze, D. P.
1975-01-01
An inexpensive method is developed to determine the extreme values of instantaneous inlet distortion. This method also provides insight into the basic mechanics of unsteady inlet flow and the associated engine reaction. The analysis is based on fundamental fluid dynamics and statistical methods to provide an understanding of the turbulent inlet flow and quantitatively relate the rms level and power spectral density (PSD) function of the measured time variant total pressure fluctuations to the strength and size of the low pressure regions. The most probable extreme value of the instantaneous distortion is then synthesized from this information in conjunction with the steady state distortion. Results of the analysis show the extreme values to be dependent upon the steady state distortion, the measured turbulence rms level and PSD function, the time on point, and the engine response characteristics. Analytical projections of instantaneous distortion are presented and compared with data obtained by a conventional, highly time correlated, 40 probe instantaneous pressure measurement system.
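A standard statistical building block behind synthesizing extreme values from rms level and spectral content (not the authors' exact synthesis) is the expected peak of a Gaussian record, roughly σ√(2 ln N) with N the effective number of independent samples. A sketch with hypothetical values:

```python
import math

def expected_peak(rms, bandwidth_hz, time_on_point_s):
    """Expected extreme of a zero-mean Gaussian record of given rms level,
    observed for time_on_point_s with the given effective bandwidth."""
    n = max(2.0, bandwidth_hz * time_on_point_s)  # independent samples
    return rms * math.sqrt(2.0 * math.log(n))

rms = 0.02  # hypothetical rms total-pressure fluctuation (fraction of mean)
for t in (1.0, 10.0, 100.0):
    print(t, round(expected_peak(rms, bandwidth_hz=200.0, time_on_point_s=t), 4))
```

Note the weak (logarithmic) growth with time on point, consistent with the abstract's statement that extreme values depend on the time on point.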
Comparison of technologies for deorbiting spacecraft from low-earth-orbit at end of mission
NASA Astrophysics Data System (ADS)
Sánchez-Arriaga, G.; Sanmartín, J. R.; Lorenzini, E. C.
2017-09-01
An analytical comparison of four technologies for deorbiting spacecraft from Low-Earth-Orbit at end of mission is presented. Basic formulas based on simple physical models of key figures of merit for each device are found. Active devices - rockets and electrical thrusters - and passive technologies - drag augmentation devices and electrodynamic tethers - are considered. A basic figure of merit is the deorbit device-to-spacecraft mass ratio, which is, in general, a function of environmental variables, technology development parameters and deorbit time. For typical state-of-the-art values, equal deorbit time, middle inclination and initial altitude of 850 km, the analysis indicates that tethers are about one and two orders of magnitude lighter than active technologies and drag augmentation devices, respectively; a tether needs a few percent mass-ratio for a deorbit time of a couple of weeks. For high inclination, the performance drop of the tether system is moderate: mass ratio and deorbit time increase by factors of 2 and 4, respectively. Besides collision risk with other spacecraft and system mass considerations, which are the main driving factors for deorbit space technologies, the analysis addresses other important constraints, such as deorbit time, system scalability, manoeuvre capability, reliability, simplicity, attitude control requirements, and re-entry and multi-mission capability (deorbit and re-boost). The requirements and constraints are used to make a critical assessment of the four technologies as functions of spacecraft mass and initial orbit (altitude and inclination). Emphasis is placed on electrodynamic tethers, including the latest advances attained in the FP7/Space project BETs. The superiority of tape tethers as compared to round and multi-line tethers in terms of deorbit mission performance is highlighted, as well as the importance of an optimal geometry selection, i.e. 
tape length, width, and thickness, as function of spacecraft mass and initial orbit. Tether system configuration, deployment and dynamical issues, including a simple passive way to mitigate the well-known dynamical instability of electrodynamic tethers, are also discussed.
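For the rocket option, the deorbit device-to-spacecraft mass ratio admits a standard first-order estimate from the Tsiolkovsky rocket equation (a generic illustration, not the paper's formula); the delta-v and specific impulse values below are rough hypothetical numbers for an ~850 km start:

```python
import math

def propellant_mass_ratio(delta_v, exhaust_velocity):
    """Propellant mass needed for a deorbit burn, as a fraction of the
    spacecraft mass (Tsiolkovsky rocket equation: exp(dv/ve) - 1)."""
    return math.exp(delta_v / exhaust_velocity) - 1.0

delta_v = 220.0  # m/s, rough burn to drop perigee from ~850 km (hypothetical)
for isp in (220.0, 300.0):  # plausible chemical-thruster specific impulses, s
    ve = isp * 9.81         # effective exhaust velocity, m/s
    print(isp, round(100.0 * propellant_mass_ratio(delta_v, ve), 2), "%")
```

A roughly ten-percent propellant fraction for the rocket case is consistent with the abstract's claim that a tether, at a few percent mass ratio, is substantially lighter than active technologies.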
29 CFR 548.300 - Introductory statement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... AUTHORIZATION OF ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Authorized Basic Rates § 548... has determined that they are substantially equivalent to the straight-time average hourly earnings of...
29 CFR 548.300 - Introductory statement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... AUTHORIZATION OF ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Authorized Basic Rates § 548... has determined that they are substantially equivalent to the straight-time average hourly earnings of...
The AAPM/RSNA physics tutorial for residents. Basic physics of MR imaging: an introduction.
Hendrick, R E
1994-07-01
This article provides an introduction to the basic physical principles of magnetic resonance (MR) imaging. Essential basic concepts such as nuclear magnetism, tissue magnetization, precession, excitation, and tissue relaxation properties are presented. Hydrogen spin density and tissue relaxation times T1, T2, and T2* are explained. The basic elements of a planar MR pulse sequence are described: section selection during tissue excitation, phase encoding, and frequency encoding during signal measurement.
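The tissue parameters named above combine into image contrast through the basic spin-echo signal equation S = PD·(1 − e^(−TR/T1))·e^(−TE/T2). A sketch with approximate textbook-style tissue values, given only for illustration:

```python
import math

def spin_echo_signal(pd, t1, t2, tr, te):
    """Relative spin-echo signal from proton density and relaxation times."""
    return pd * (1.0 - math.exp(-tr / t1)) * math.exp(-te / t2)

tissues = {"white matter": (0.7, 600.0, 80.0),   # (PD, T1 ms, T2 ms), approximate
           "CSF": (1.0, 4000.0, 2000.0)}
# T1-weighted contrast: short TR, short TE -> white matter brighter than CSF.
for name, (pd, t1, t2) in tissues.items():
    print(name, round(spin_echo_signal(pd, t1, t2, tr=500.0, te=15.0), 3))
```

Swapping to a long TR and long TE reverses the contrast (T2 weighting), which is exactly the TR/TE trade-off the pulse-sequence discussion above is building toward.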
Laccase/mediator assisted degradation of triarylmethane dyes in a continuous membrane reactor.
Chhabra, Meenu; Mishra, Saroj; Sreekrishnan, Trichur Ramaswamy
2009-08-10
Laccase/mediator systems are important bioremediation agents as the rates of reactions can be enhanced in the presence of the mediators. The decolorization mechanism of two triarylmethane dyes, namely, Basic Green 4 and Acid Violet 17, is reported using Cyathus bulleri laccase. Basic Green 4 was decolorized through N-demethylation by laccase alone, while in mediator-assisted reactions, dye breakdown was initiated from oxidation of the carbinol form of the dye. Benzaldehyde and N,N-dimethyl aniline were the major end products. With Acid Violet 17, laccase carried out N-deethylation, and in mediator-assisted reactions, oxidation of the carbinol form of the dye occurred, resulting in formation of formyl benzene sulfonic acid, carboxy benzene sulfonic acid, and benzene sulfonic acid. Toxicity analysis revealed that Basic Green 4 was toxic and treatment with laccase/mediators resulted in 80-100% detoxification. The treatment of the textile dye solution using laccase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) was demonstrated in an enzyme membrane reactor. At a hydraulic retention time of 6 h, the process was operated for a period of 15 days with nearly 95% decolorization, a 10% reduction in flux, and 70% recovery of active ABTS.
Hofmann, David A; Burke, Michael J; Zohar, Dov
2017-03-01
Starting with initiatives dating back to the mid-1800s, we provide a high-level review of the key trends and developments in the application of applied psychology to the field of occupational safety. Factory laws, basic worker compensation, and research on accident proneness comprised much of the early work. Thus, early research and practice very much focused on the individual worker, the design of their work, and their basic protection. Gradually and over time, the focus began to navigate further into the organizational context. One of the early efforts to broaden beyond the individual worker was a significant focus on safety-related training during the middle of the 20th century. Toward the latter years of the 20th century and continuing the move from the individual worker to the broader organizational context, there was a significant increase in leadership and organizational climate (safety climate) research. Ultimately, this resulted in the development of a multilevel model of safety culture/climate. After discussing these trends, we identify key conclusions and opportunities for future research. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Harmonic analysis of traction power supply system based on wavelet decomposition
NASA Astrophysics Data System (ADS)
Dun, Xiaohong
2018-05-01
With the rapid development of high-speed railways and heavy-haul transport, AC-drive electric locomotives and EMUs operate on a large scale across the country, and the electrified railway has become the main harmonic source in China's power grid. This situation calls for timely monitoring, assessment, and mitigation of the power quality problems of electrified railways. The wavelet transform was developed on the basis of Fourier analysis; its basic idea comes from harmonic analysis and rests on a rigorous theoretical model. It inherits and develops the localization idea of the Gabor transform while overcoming disadvantages such as a fixed window and the lack of discrete orthogonality, which has made it a widely studied spectral analysis tool. Wavelet analysis takes progressively finer time-domain steps in the high-frequency part so as to focus on any detail of the signal being analyzed, thereby comprehensively analyzing the harmonics of the traction power supply system; the pyramid algorithm is used to increase the speed of the wavelet decomposition. MATLAB simulation shows that wavelet decomposition is effective for harmonic spectrum analysis of the traction power supply system.
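The pyramid algorithm mentioned above can be sketched with the simplest wavelet, the Haar wavelet: each level splits the signal into half-length approximation and detail coefficients. A minimal self-contained sketch on a synthetic signal (not the traction-network data):

```python
import math

def haar_step(x):
    """One pyramid level: orthonormal Haar approximation and detail."""
    approx = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return approx, detail

def haar_decompose(x, levels):
    details = []
    for _ in range(levels):
        x, d = haar_step(x)
        details.append(d)
    return x, details

# Synthetic two-tone signal standing in for a harmonic-rich traction waveform.
signal = [math.sin(2 * math.pi * 50 * t / 1600) +
          0.2 * math.sin(2 * math.pi * 400 * t / 1600) for t in range(16)]
approx, details = haar_decompose(signal, 2)
print(len(approx), [len(d) for d in details])
```

Because the transform is orthonormal, signal energy is preserved across levels, which is what makes the per-band coefficients usable for harmonic energy estimates.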
Mathematics Content Coverage and Student Learning in Kindergarten
Engel, Mimi; Claessens, Amy; Watts, Tyler; Farkas, George
2017-01-01
Analyzing data from two nationally representative kindergarten cohorts, we examine the mathematics content teachers cover in kindergarten. We expand upon prior research, finding that kindergarten teachers report emphasizing basic mathematics content. Although teachers reported increased coverage of advanced content between the 1998–99 and 2010–11 school years, they continued to place more emphasis on basic content. We find that time on advanced content is positively associated with student learning, whereas time on basic content has a negative association with learning. We argue that increased exposure to more advanced mathematics content could benefit the vast majority of kindergartners. PMID:29353913
Local Stability of AIDS Epidemic Model Through Treatment and Vertical Transmission with Time Delay
NASA Astrophysics Data System (ADS)
Novi W, Cascarilla; Lestari, Dwi
2016-02-01
This study aims to explain the stability of a model of the spread of AIDS through treatment and vertical transmission. A person infected with HIV needs time before developing AIDS; because this progression can be delayed, the resulting model is a model with time delay. The model takes the form of a nonlinear differential equation with time delay, SIPTA (susceptible-infected-pre AIDS-treatment-AIDS). Analysis of the SIPTA model yields the disease-free equilibrium point and the endemic equilibrium point. The disease-free equilibrium point, with and without time delay, is locally asymptotically stable if the basic reproduction number is less than one. The endemic equilibrium point is locally asymptotically stable if the time delay is less than the critical delay value, unstable if the time delay exceeds the critical value, and a bifurcation occurs if the time delay equals the critical value.
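Since the SIPTA equations are not given in the abstract, the threshold behaviour it describes (disease dies out when the basic reproduction number is below one) can be illustrated with a generic delayed SIR model integrated by the Euler method; all parameters are hypothetical:

```python
def simulate(beta, gamma, tau, dt=0.01, t_end=200.0):
    """Euler integration of a delayed SIR model: new infections at time t
    are driven by the infected fraction at time t - tau (hypothetical form)."""
    steps = int(t_end / dt)
    lag = int(tau / dt)
    s, i = [0.99], [0.01]  # susceptible / infected fractions
    for k in range(steps):
        i_lag = i[k - lag] if k >= lag else i[0]  # constant pre-history
        new_inf = beta * s[k] * i_lag
        s.append(s[k] - dt * new_inf)
        i.append(i[k] + dt * (new_inf - gamma * i[k]))
    return i

gamma = 0.2
i_sub = simulate(beta=0.1, gamma=gamma, tau=5.0)  # R0 = beta/gamma = 0.5 < 1
i_sup = simulate(beta=0.5, gamma=gamma, tau=5.0)  # R0 = 2.5 > 1
print(i_sub[-1] < 1e-4, max(i_sup) > max(i_sub))  # dies out vs. epidemic
```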
Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
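The gate arithmetic underlying such a fuzzy fault tree can be sketched with plain trapezoidal fuzzy numbers and the usual componentwise AND/OR approximations (the paper's intuitionistic trapezoidal formulation and expert weighting are more elaborate); the event probabilities below are hypothetical:

```python
def fuzzy_and(*events):
    """AND gate: componentwise product of trapezoidal fuzzy probabilities."""
    out = (1.0, 1.0, 1.0, 1.0)
    for e in events:
        out = tuple(x * y for x, y in zip(out, e))
    return out

def fuzzy_not(e):
    """Complement of a trapezoidal fuzzy probability (a, b, c, d)."""
    a, b, c, d = e
    return (1 - d, 1 - c, 1 - b, 1 - a)

def fuzzy_or(*events):
    """OR gate: 1 - product(1 - p_i), componentwise."""
    return fuzzy_not(fuzzy_and(*(fuzzy_not(e) for e in events)))

ignition = (0.01, 0.02, 0.03, 0.04)      # hypothetical basic events
dust_cloud = (0.05, 0.06, 0.08, 0.10)
ventilation_fail = (0.10, 0.15, 0.20, 0.25)
explosion = fuzzy_and(fuzzy_or(ignition, ventilation_fail), dust_cloud)
print(tuple(round(x, 5) for x in explosion))
```

The resulting trapezoid bounds the top-event (explosion) probability despite the lack of exact basic-event data, which is the point of the fuzzy treatment.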
CAI-BASIC: A Program to Teach the Programming Language BASIC.
ERIC Educational Resources Information Center
Barry, Thomas Anthony
A computer-assisted instruction (CAI) program was designed which fulfills the objectives of teaching a simple programming language, interpreting student responses, and executing and editing student programs. The CAI-BASIC program is written in FORTRAN IV and executes on IBM-2741 terminals while running under a time-sharing system on an IBM-360-70…
Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. L. Smith; S. T. Beck; S. T. Wood
2008-08-01
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons are provided that guide the user through the basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through development of a probabilistic hypothetical problem involving a vehicle accident, highlighting the program's most fundamental features. The advanced features section contains additional lessons that expand on fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview of the operation and capabilities of the SAPHIRE software.
NASA Technical Reports Server (NTRS)
Marlowe, M. B.; Moore, R. A.; Whetstone, W. D.
1979-01-01
User instructions are given for performing linear and nonlinear steady state and transient thermal analyses with SPAR thermal analysis processors TGEO, SSTA, and TRTA. It is assumed that the user is familiar with basic SPAR operations and basic heat transfer theory.
Particle circulation and solids transport in large bubbling fluidized beds. Progress report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Homsy, G.M.
1982-04-01
We have undertaken a theoretical study of the possibility of the formation of plumes or channeling when coal particles volatilize upon introduction to a fluidized bed, Fitzgerald (1980). We have completed the analysis of the basic state of uniform flow and are currently completing a stability analysis. We have modified the continuum equations of fluidization, Homsy et al. (1980), to include the source of gas due to volatilization, which we assume to be uniformly distributed spatially. Simplifying these equations and solving leads to the prediction of a basic state analogous to the state of uniform fluidization found when no source is present within the medium. We are currently completing a stability analysis of this basic state, which will give the critical volatilization rate above which this simple basic state is unstable. Because of the experimental evidence of Jewett and Lawless (1981), who observed regularly spaced plume-like instabilities upon drying a bed of saturated silica gel, we are considering two-dimensional periodic disturbances. The analysis is similar to that given by Homsy et al. (1980) and Medlin et al. (1974). We hope to determine the stability limits for this system shortly.
Barkoukis, Vassilis; Hagger, Martin S; Lambropoulos, George; Tsorbatzoudis, Haralambos
2010-12-01
The trans-contextual model (TCM) is an integrated model of motivation that aims to explain the processes by which agentic support for autonomous motivation in physical education promotes autonomous motivation and physical activity in a leisure-time context. It is proposed that perceived support for autonomous motivation in physical education is related to autonomous motivation in physical education and leisure-time contexts. Furthermore, relations between autonomous motivation and the immediate antecedents of intentions to engage in physical activity behaviour and actual behaviour are hypothesized. The purpose of the present study was to incorporate the constructs of basic psychological need satisfaction in the TCM to provide a more comprehensive explanation of motivation and demonstrate the robustness of the findings of previous tests of the model that have not incorporated these constructs. Participants were students (N = 274) from Greek secondary schools. They completed self-report measures of perceived autonomy support, autonomous motivation, and basic psychological need satisfaction in physical education. Follow-up measures of these variables were taken in a leisure-time context along with measures of attitudes, subjective norms, perceived behavioural control (PBC), and intentions from the theory of planned behaviour 1 week later. Self-reported physical activity behaviour was measured 4 weeks later. Results supported TCM hypotheses. Basic psychological need satisfaction variables uniquely predicted autonomous motivation in physical education and leisure time as well as the antecedents of intention, namely, attitudes and PBC. The basic psychological need satisfaction variables also mediated the effects of perceived autonomy support on autonomous motivation in physical education. Findings support the TCM and provide further information on the mechanisms in the model and integrated theories of motivation in physical education and leisure time.
New Student-Centered and Data-Based Approaches to Hydrology Education
NASA Astrophysics Data System (ADS)
Bloeschl, G.; Troch, P. A. A.; Sivapalan, M.
2014-12-01
Hydrology as a science has evolved over the last century. The knowledge base has significantly expanded, and the field must now meet the expectations of a science where the connections between the parts are just as important as the parts themselves. In this new environment, what should we teach, and how should we teach it? Given the limited time we have in an undergraduate (and even graduate) curriculum, what should we include, and what should we leave out? What new material and new methods are essential, compared with the textbooks? Past practices have assumed certain basics as being essential to undergraduate teaching. Depending on the professor's background, these include basic process descriptions (infiltration, runoff generation, evaporation, etc.) and basic techniques (unit hydrographs, flood frequency analysis, pumping tests). These are taught using idealized (textbook) examples and examined to test this basic competence. The main idea behind this "reductionist" approach to teaching is that the students will do the rest of the learning during practice and apprenticeship in their workplaces. Much of current hydrology teaching follows this paradigm, and the books provide the backdrop to this approach. Our view is that this approach is less than optimal, as it does not prepare students to face the new challenges of a changing world. It is our view that the basics of hydrologic science are not just a collection of individual processes and techniques, but process interactions and underlying concepts or principles, together with a collection of techniques that highlights them, combined with student-driven and data-based learning that enables students to see the manifestations of these process interactions and principles in action in real-world situations.
While the actual number of items that can be taught in the classroom by this approach in a limited period of time may be lower than in the traditional approach, it will help the students make connections between the understanding gained in this way in solving real world problems. We will illustrate the feasibility of the approach through key examples from our own teaching.
What is Basic Research? Insights from Historical Semantics.
Schauz, Désirée
2014-01-01
For some years now, the concept of basic research has been under attack. Yet although the significance of the concept is in doubt, basic research continues to be used as an analytical category in science studies. But what exactly is basic research? What is the difference between basic and applied research? This article seeks to answer these questions by applying historical semantics. I argue that the concept of basic research did not arise out of the tradition of pure science. On the contrary, this new concept emerged in the late 19th and early 20th centuries, a time when scientists were being confronted with rising expectations regarding the societal utility of science. Scientists used the concept in order to try to bridge the gap between the promise of utility and the uncertainty of scientific endeavour. Only after 1945, when United States science policy shaped the notion of basic research, did the concept revert to the older ideals of pure science. This revival of the purity discourse was caused by the specific historical situation in the US at that time: the need to reform federal research policy after the Second World War, the new dimension of ethical dilemmas in science and technology during the atomic era, and the tense political climate during the Cold War.
Optimization of Micro Metal Injection Molding By Using Grey Relational Grade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, M. H. I.; Precision Process Research Group, Dept. of Mechanical and Materials Engineering, Faculty of Engineering, Universiti Kebangsaan Malaysia; Muhamad, N.
2011-01-17
Micro metal injection molding (µMIM), a variant of the MIM process, is a promising method for producing near net-shape metallic micro components of complex geometry. In this paper, µMIM is applied to produce 316L stainless steel micro components. Because of the highly stringent characteristics of µMIM properties, the study emphasizes optimization of the process parameters, where the Taguchi method associated with Grey Relational Analysis (GRA) is implemented as a novel approach to investigating multiple performance characteristics. The basic idea of GRA is to find a grey relational grade (GRG) that can be used to convert the optimization from a multi-objective case (density and strength) to a single-objective case. After applying the "larger the better" criterion, results show that injection time (D) is the most significant parameter, followed by injection pressure (A), holding time (E), mold temperature (C), and injection temperature (B). Analysis of variance (ANOVA) is also employed to confirm the significance of each parameter involved in this study.
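The GRG computation named above follows a well-known recipe. The sketch below shows the conventional steps for larger-the-better responses (normalization, grey relational coefficients with distinguishing coefficient ζ = 0.5, then equal-weight averaging into a grade); the density and strength values are hypothetical, not data from the paper.

```python
# Hedged sketch of the grey relational grade (GRG) for larger-the-better
# responses (here: density and strength). zeta = 0.5 is the conventional
# distinguishing coefficient; the run data are hypothetical.

def normalize_larger_better(values):
    """Scale each response so the best observed value maps to 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def grey_relational_coefficients(normed, zeta=0.5):
    deviations = [abs(1.0 - x) for x in normed]   # distance to the ideal (=1)
    d_min, d_max = min(deviations), max(deviations)
    return [(d_min + zeta * d_max) / (d + zeta * d_max) for d in deviations]

def grey_relational_grade(coeff_per_response):
    """Equal-weight mean across responses; each run becomes one grade."""
    return [sum(run) / len(run) for run in zip(*coeff_per_response)]

density = [7.1, 7.4, 7.6, 7.3]     # hypothetical sintered densities (g/cm^3)
strength = [410, 450, 430, 470]    # hypothetical tensile strengths (MPa)
coeffs = [grey_relational_coefficients(normalize_larger_better(r))
          for r in (density, strength)]
grg = grey_relational_grade(coeffs)
best_run = max(range(len(grg)), key=grg.__getitem__)
print(grg, best_run)
```

The run with the highest GRG is taken as the best compromise across the multiple quality characteristics, which is how the multi-objective problem collapses to a single objective.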
Real-Time Mapping alert system; user's manual
Torres, L.A.
1996-01-01
The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field monitoring sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. These alert values can help keep water-resource specialists informed of current hydrologic conditions. The current alert status at monitoring sites is of critical importance during floods, hurricanes, and other extreme hydrologic events where quick analysis of the situation is needed. This manual provides instructions for using the Real-Time Mapping software, a series of computer programs developed by the U.S. Geological Survey for quick analysis of hydrologic conditions, and guides users through a basic interactive session. The software provides interactive graphics display and query of real-time information in a map-based, menu-driven environment.
Kutz, D F; Marzocchi, N; Fattori, P; Cavalcanti, S; Galletti, C
2005-06-01
A new method based on trinary logic is presented that checks the state of different control variables and synchronously records the physiological and behavioral data of behaving animals and humans. The basic information structure of the method is a time interval of defined maximum duration, called a time slice, during which the supervisor system periodically checks the status of a specific subset of input channels. An experimental condition is a sequence of time slices subsequently executed according to the final status of the previous time slice. The proposed method implements in its data structure the possibility to branch like an if-else cascade and to repeat parts of it recursively like a while loop. Therefore its data structure contains the most basic control structures of programming languages. The method was implemented using a real-time version of the LabVIEW programming environment to program and control our experimental setup. Using this supervision system, we synchronously record four analog data channels at 500 Hz (including eye movements) and the time stamps of up to six neurons at 100 kHz. The system reacts with a resolution within 1 ms to changes of state of digital input channels. The system is set to react to changes in eye position with a resolution within 4 ms. The time slices, experimental conditions, and data are handled by relational databases. This facilitates the construction of new experimental conditions and data analysis. The proposed implementation allows continuous recording without an inter-trial gap for data storage or task management. The implementation can be used to drive electrophysiological experiments of behaving animals and psychophysical studies with human subjects.
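The time-slice structure described above can be sketched in a few lines. The original system is LabVIEW-based; the following is only a language-neutral illustration of the idea, with hypothetical channel values and slice wiring: each slice polls its inputs until a trinary outcome (-1, 0, +1) is reached or its maximum duration expires, and the outcome selects the next slice, giving if-else branching and (by pointing back to an earlier slice) while-style loops.

```python
# Minimal sketch of the trinary time-slice control structure. Not the
# paper's LabVIEW implementation; slice wiring and inputs are hypothetical.
import time

class TimeSlice:
    def __init__(self, name, max_duration, check, next_by_outcome):
        self.name = name
        self.max_duration = max_duration        # seconds
        self.check = check                      # () -> -1 | 0 | +1
        self.next_by_outcome = next_by_outcome  # outcome -> next slice name

    def run(self):
        """Poll the input subset until a nonzero outcome or timeout,
        then return the name of the next slice to execute."""
        deadline = time.monotonic() + self.max_duration
        outcome = 0
        while time.monotonic() < deadline:
            outcome = self.check()
            if outcome != 0:                    # decision reached early
                break
        return self.next_by_outcome.get(outcome)

# hypothetical slice: wait for fixation (+1 -> "go", -1 or timeout -> "abort")
events = iter([0, 0, 1])
fixation = TimeSlice("fixation", 0.05, lambda: next(events, 0),
                     {1: "go", -1: "abort", 0: "abort"})
print(fixation.run())
```

Chaining such slices by name reproduces the branch-and-loop behavior the abstract attributes to the data structure itself.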
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were compared to determine their environmental impact. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior
2012-01-01
Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036
Matsubara, Hiroki; Enami, Miki; Hirose, Keiko; Kamikura, Takahisa; Nishi, Taiki; Takei, Yutaka; Inaba, Hideo
2015-04-01
To determine the effect of Japanese obligatory basic life support training for new driver's license applicants on their willingness to carry out basic life support. We distributed a questionnaire to 9,807 participants of basic life support courses in authorized driving schools from May 2007 to April 2008 after the release of the 2006 Japanese guidelines. The questionnaire explored the participants' willingness to perform basic life support in four hypothetical scenarios: cardiopulmonary resuscitation on one's own initiative; compression-only cardiopulmonary resuscitation following telephone cardiopulmonary resuscitation; early emergency call; and use of an automated external defibrillator. The questionnaire was given at the beginning of the basic life support course in the first 6-month term and at the end in the second 6-month term. The 9,011 fully completed answer sheets were analyzed. The training significantly increased the proportion of respondents willing to use an automated external defibrillator and to perform cardiopulmonary resuscitation on their own initiative in those with and without prior basic life support training experience. It significantly increased the proportion of respondents willing to carry out favorable actions in all four scenarios. In multiple logistic regression analysis, basic life support training and prior training experiences within 3 years were associated with the attitude. The analysis of reasons for unwillingness suggested that the training reduced the lack of confidence in their skill but did not attenuate the lack of confidence in detection of arrest or clinical judgment to initiate a basic life support action. Obligatory basic life support training should be carried out periodically and modified to ensure that participants gain confidence in judging and detecting cardiac arrest.
NASA Astrophysics Data System (ADS)
Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui
2016-06-01
The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. To study the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impacts on pipes is obtained by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown to be the major influence factors on energy transfer by sensitivity analysis of the finite element simulation.
A novel surveillance approach for disaster mental health
Gruebner, Oliver; Lowe, Sarah R.; Sykora, Martin; Shankardass, Ketan; Subramanian, S. V.; Galea, Sandro
2017-01-01
Background: Disasters have substantial consequences for population mental health. Social media data present an opportunity for mental health surveillance after disasters to help identify areas of mental health needs. We aimed to 1) identify specific basic emotions from Twitter for the greater New York City area during Hurricane Sandy, which made landfall on October 29, 2012, and to 2) detect and map spatial temporal clusters representing excess risk of these emotions. Methods: We applied an advanced sentiment analysis on 344,957 Twitter tweets in the study area over eleven days, from October 22 to November 1, 2012, to extract basic emotions, a space-time scan statistic (SaTScan) and a geographic information system (QGIS) to detect and map excess risk of these emotions. Results: Sadness and disgust were among the most prominent emotions identified. Furthermore, we noted 24 spatial clusters of excess risk of basic emotions over time: four for anger, one for confusion, three for disgust, five for fear, five for sadness, and six for surprise. Of these, anger, confusion, disgust, and fear clusters appeared pre disaster, a cluster of surprise was found peri disaster, and a cluster of sadness emerged post disaster. Conclusions: We proposed a novel syndromic surveillance approach for mental health based on social media data that may support conventional approaches by providing useful additional information in the context of disaster. We showed that excess risk of multiple basic emotions could be mapped in space and time as a step towards anticipating acute stress in the population and identifying community mental health need rapidly and efficiently in the aftermath of disaster. More studies are needed to better control for bias, identify associations with reliable and valid instruments measuring mental health, and to explore computational methods for continued model-fitting, causal relationships, and ongoing evaluation.
Our study may be a starting point also for more fully elaborated models that can either prospectively detect mental health risk using real-time social media data or detect excess risk of emotional reactions in areas that lack efficient infrastructure during and after disasters. As such, social media data may be used for mental health surveillance after large scale disasters to help identify areas of mental health needs and to guide us in our knowledge where we may most effectively intervene to reduce the mental health consequences of disasters. PMID:28723959
Ding, Hai-quan; Lu, Qi-peng
2012-01-01
"Digital agriculture," or "precision agriculture," is an important direction for modern agricultural technique. It combines modern information technology with traditional agriculture and has become a hotspot of international agricultural research in recent years. As a nondestructive, real-time, effective, and exact analysis technique by which precision agriculture could be carried out, near infrared spectroscopy (NIRS) has broad prospects in agrology and has gradually gained recognition. The present paper reviews the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should be based on portable NIR spectrographs in order to acquire qualitative or quantitative information from real-time measurements in the field. In addition, NIRS could be combined with space remote sensing to monitor, at a macroscopic scale, how crops are growing and what nutrition they need, radically changing the current state of the country's agriculture.
Profitability Analysis of Soybean Oil Processes.
Cheng, Ming-Hsun; Rosentrater, Kurt A
2017-10-07
Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production. PMID: 28991168
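The profitability benchmarks named above (NPV and payback time) are standard cash-flow calculations and can be sketched concisely. The capital cost, annual net cash flow, and discount rate below are illustrative assumptions, not figures from the article.

```python
# Sketch of NPV and simple payback time for a hypothetical oil plant.
# All dollar amounts and the 8% discount rate are made-up assumptions.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 investment (negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_years(cash_flows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0.0:
            return t
    return None  # never pays back within the horizon

flows = [-10e6] + [2.5e6] * 10   # $10M investment, $2.5M/yr net for 10 years
print(round(npv(0.08, flows)), payback_years(flows))
```

A positive NPV at the chosen discount rate together with a payback of a few years is the kind of "acceptable projection" the abstract refers to; capacity scales the annual cash flow and hence both benchmarks.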
Quantitative analysis of spatial variability of geotechnical parameters
NASA Astrophysics Data System (ADS)
Fang, Xing
2018-04-01
Geotechnical parameters are the basic parameters of geotechnical engineering design, yet they have strong regional characteristics. The spatial variability of geotechnical parameters has also been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 mechanical stratification layers. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion, and the SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from them the correlation structure of the geotechnical parameters is obtained.
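The pairwise correlation step described above reduces to computing Pearson correlation coefficients between parameter series across boreholes. A minimal sketch follows; the water-content and void-ratio values are hypothetical borehole readings, not data from the study.

```python
# Sketch of the correlation-coefficient calculation between two
# geotechnical parameters. The borehole values are hypothetical.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

water_content = [22.1, 24.5, 19.8, 26.3, 23.0]   # % (hypothetical)
void_ratio    = [0.62, 0.68, 0.57, 0.74, 0.65]   # (hypothetical)
print(round(pearson(water_content, void_ratio), 3))
```

Repeating this over all parameter pairs within each stratification layer yields the correlation matrix from which the correlation structure is read off.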
Measuring the quality-of-life effects of diagnostic and screening tests.
Swan, J Shannon; Miksad, Rebecca A
2009-08-01
Health-related quality of life (HRQL) is a central concept for understanding the outcomes of medical care. When used in cost-effectiveness analysis, HRQL is typically measured for conditions persisting over long time frames (years), and quality-adjusted life year (QALY) values are generated. Consequently, years are the basic unit of time for cost-effectiveness analysis results: dollars spent per QALY gained. However, shorter term components of health care may also affect HRQL, and there is increased interest in measuring and accounting for these events. In radiology, the short-term HRQL effects of screening and diagnostic testing may affect a test's cost-effectiveness, even though they may only last for days. The unique challenge in radiology HRQL assessment is to realistically tap into the testing and screening experience while remaining consistent with QALY theory. The authors review HRQL assessment and highlight methods developed to specifically address the short-term effects of radiologic screening and testing.
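The short-term HRQL accounting described above amounts to converting a temporary utility decrement into QALY units consistently with QALY theory. The sketch below shows that conversion; the utility value and duration are hypothetical, not estimates from the article.

```python
# Hedged sketch of folding a short-term testing experience into QALY
# terms: a temporary utility decrement over a few days becomes a small
# quality-adjusted life-year loss. Values are hypothetical.

def short_term_qaly_loss(utility_during, duration_days, baseline_utility=1.0):
    """QALY decrement = (baseline - state utility) x duration in years."""
    return (baseline_utility - utility_during) * duration_days / 365.0

# e.g. anxiety and discomfort around a screening exam: utility 0.90 for 3 days
loss = short_term_qaly_loss(0.90, 3)
print(round(loss, 5))
```

Even a loss this small can matter in a cost-effectiveness model when a screening test is applied to millions of people, which is why the article argues for measuring these short-term effects.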
An Analysis of the New 9-Year Basic Education Mathematics Curriculum in Nigeria
ERIC Educational Resources Information Center
Awofala, Adeneye O. A.
2012-01-01
The intention of this paper is to describe and reflect on the changes in the new 9-year basic education mathematics curriculum in Nigeria. The paper is divided into four major themes: history of curriculum development in mathematics education at the basic education level in Nigeria, the motivations for the revision of the primary and junior…
ERIC Educational Resources Information Center
Shihua, Peng; Rihui, Tan
2009-01-01
Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…
NASA Astrophysics Data System (ADS)
Tinoco, Hector A.; Ovalle, Alex M.; Vargas, Carlos A.; Cardona, María J.
2015-09-01
In the context of industrial engineering, predetermined time systems (PTS) play an important role in workplaces because inefficiencies are found in assembly processes that require manual manipulation. In this study, an approach is proposed to analyze time and motions in a manual process using a motion capture system embedded in a virtual environment. The motion capture system tracks passive IR markers located on the hands to record the position of each one. For this purpose, a real workplace is represented virtually by domains built from basic geometries. Motion capture data are combined with the virtual workplace to simulate the operations carried out in it, and a time and motion analysis is completed by means of an algorithm. To test the analysis methodology, a case study was intentionally designed that both used and violated the principles of motion economy. In the results, it was possible to observe where the hands never crossed as well as where both hands passed through the same place. In addition, the activities done in each zone were observed, and some known deficiencies in the layout of the workplace were identified by computational analysis. A frequency analysis of hand velocities revealed errors in the chosen assembly method, showing differences in the hand velocities. An opportunity is seen to quantify aspects that are not easily identified in a traditional time and motion analysis. The automated analysis is considered the main contribution of this study. In the industrial context, broad application is foreseen for monitoring the workplace to analyze repeatability, PTS, and the redistribution of workplace and labor activities using the proposed methodology.
A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes
Ma, Xin; Shen, Jianping
2017-01-01
The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094
Studying mixing in Non-Newtonian blue maize flour suspensions using color analysis.
Trujillo-de Santiago, Grissel; Rojas-de Gante, Cecilia; García-Lara, Silverio; Ballescá-Estrada, Adriana; Alvarez, Mario Moisés
2014-01-01
Non-Newtonian fluids occur in many relevant flow and mixing scenarios at the lab and industrial scale. The addition of acid or basic solutions to a non-Newtonian fluid is not an infrequent operation, particularly in biotechnology applications, where the pH of non-Newtonian culture broths is usually regulated using this strategy. We conducted mixing experiments in agitated vessels using non-Newtonian blue maize flour suspensions. Acid or basic pulses were injected to reveal mixing patterns and flow structures and to follow their time evolution. No foreign pH indicator was used, as blue maize flours naturally contain anthocyanins that act as a native, wide-spectrum pH indicator. We describe a novel method to quantitate mixedness and mixing evolution through Dynamic Color Analysis (DCA) in this system. Color readings corresponding to different times and locations within the mixing vessel were taken with a digital camera (or a colorimeter) and translated to the CIELab color scale. We use distances in the Lab space, a 3D color space, between a particular mixing state and the final mixing point to characterize segregation/mixing in the system. Blue maize suspensions represent an adequate and flexible model for studying mixing (and fluid mechanics in general) in non-Newtonian suspensions using acid/base tracer injections. Simple strategies based on the evaluation of color distances in the CIELab space (or other scales such as HSB) can be adapted to characterize mixedness and mixing evolution in experiments using blue maize suspensions.
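The distance-based idea can be sketched in a few lines: the Euclidean distance in Lab space (the CIE76 ΔE) between a local color reading and the fully mixed end point measures remaining segregation. The Lab triples below are invented, and normalizing against the initial state is one simple convention, not necessarily the authors' exact index:

```python
import numpy as np

def delta_e(lab1, lab2):
    """Euclidean distance (CIE76 delta-E) between two CIELab colors."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

def mixedness(lab_t, lab_0, lab_final):
    """0 at the initial (segregated) state, 1 at the fully mixed state."""
    return 1.0 - delta_e(lab_t, lab_final) / delta_e(lab_0, lab_final)

lab_0 = (60.0, 10.0, -40.0)      # hypothetical initial local color
lab_final = (50.0, 0.0, -10.0)   # hypothetical fully mixed color
lab_t = (55.0, 5.0, -25.0)       # a state halfway along in Lab space
print(mixedness(lab_t, lab_0, lab_final))  # → 0.5
```

Tracking this index per camera pixel over time gives the spatial mixing maps described in the abstract.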
Crystallization of Synthetic Blast Furnace Slags Pertaining to Heat Recovery
NASA Astrophysics Data System (ADS)
Esfahani, Shaghayegh
Heat recovery from blast furnace slags often conflicts with another requirement: generating amorphous slag for use in cement production. As both the rate and extent of heat recovery and the slag structure are determined by the cooling rate, a relation between the crystallization kinetics and the cooling conditions is highly desirable. In this study, CaO-SiO2-Al2O3-MgO (CSAM) slags with different basicities were studied by the single hot thermocouple technique (SHTT) during isothermal treatment and non-isothermal cooling. Their time-temperature-transformation (TTT) and continuous-cooling-transformation (CCT) diagrams were plotted and compared. Furthermore, kinetic parameters such as the Avrami exponent (n), the rate coefficient (K), and the effective activation energy of crystallization (EA) were obtained by analyzing data from in-situ observation of the glassy-to-crystalline transformation and image analysis. The nucleation and growth rates of the crystalline phases were also quantified as functions of time, temperature, and slag basicity. Together with observations of the crystallization front, these results helped establish the dominant crystallization mechanisms. In addition to the experimental work, a mathematical model that predicts the amount of crystallization during cooling was developed and validated. A second mathematical model that calculates the temperature history of the slag during cooling was coupled with the first, allowing the effects of parameters such as the slag/air ratio and granule size on heat recovery and the glass content of the slag to be studied.
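The Avrami parameters named here come from the JMAK form X(t) = 1 − exp(−K tⁿ), which is linear in double-log coordinates: ln(−ln(1 − X)) = ln K + n ln t. A minimal sketch of recovering n and K from isothermal data, with invented parameter values:

```python
import numpy as np

def avrami(t, K, n):
    """Crystallized fraction under JMAK kinetics: X = 1 - exp(-K t**n)."""
    return 1.0 - np.exp(-K * t**n)

# Synthetic isothermal data (K and n are invented for the sketch)
t = np.linspace(1.0, 100.0, 50)
X = avrami(t, K=1e-4, n=2.5)

# Fit the double-log form: slope = n, exp(intercept) = K
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - X)), 1)
print(slope, np.exp(intercept))  # → 2.5 and 1e-4 (up to rounding)
```

In practice X(t) would come from image analysis of the SHTT observations rather than from the closed form.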
Hu, Kaifeng; Ellinger, James J; Chylla, Roger A; Markley, John L
2011-12-15
Time-zero 2D (13)C HSQC (HSQC(0)) spectroscopy offers advantages over traditional 2D NMR for quantitative analysis of solutions containing a mixture of compounds because the signal intensities are directly proportional to the concentrations of the constituents. The HSQC(0) spectrum is derived from a series of spectra collected with increasing repetition times within the basic HSQC block by extrapolating the repetition time to zero. Here we present an alternative approach to data collection, gradient-selective time-zero (1)H-(13)C HSQC(0) in combination with fast maximum likelihood reconstruction (FMLR) data analysis and the use of two concentration references for absolute concentration determination. Gradient-selective data acquisition results in cleaner spectra, and NMR data can be acquired in both constant-time and non-constant-time mode. Semiautomatic data analysis is supported by the FMLR approach, which is used to deconvolute the spectra and extract peak volumes. The peak volumes obtained from this analysis are converted to absolute concentrations by reference to the peak volumes of two internal reference compounds of known concentration: DSS (4,4-dimethyl-4-silapentane-1-sulfonic acid) at the low concentration limit (which also serves as chemical shift reference) and MES (2-(N-morpholino)ethanesulfonic acid) at the high concentration limit. The linear relationship between peak volumes and concentration is better defined with two references than with one, and the measured absolute concentrations of individual compounds in the mixture are more accurate. We compare results from semiautomated gsHSQC(0) with those obtained by the original manual phase-cycled HSQC(0) approach. The new approach is suitable for automatic metabolite profiling by simultaneous quantification of multiple metabolites in a complex mixture.
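The time-zero extrapolation itself is a small calculation: each additional repetition of the HSQC block attenuates a peak by a constant factor, so the log of the peak volume is linear in the repetition number and the intercept gives the unattenuated volume. The volumes and attenuation factor below are invented for the sketch:

```python
import numpy as np

# Hypothetical peak volumes from spectra acquired with 1, 2 and 3
# repetitions of the basic HSQC block; here I_n = I_0 * f**n with
# I_0 = 1e5 and per-block attenuation f = 0.8.
n = np.array([1, 2, 3])
I = np.array([8.0e4, 6.4e4, 5.12e4])

slope, intercept = np.polyfit(n, np.log(I), 1)
I0 = np.exp(intercept)          # extrapolated time-zero (HSQC(0)) volume
print(I0)                       # → 1e5, proportional to concentration
```

With two internal references of known concentration, the line through their (I0, concentration) points converts any other compound's I0 into an absolute concentration.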
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
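As a rough illustration of the second CRAFT step, the snippet below models a synthetic FID as a sum of decaying sinusoids and recovers their amplitudes. For simplicity the frequencies and decay rates are taken as known and only the amplitudes are fit by linear least squares; CRAFT itself estimates all of these via Bayesian analysis, and every parameter value here is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 1.0, 1e-3)                      # 1 s at 1 kHz (arbitrary)
freqs, decays = [50.0, 120.0], [5.0, 8.0]        # Hz, 1/s (assumed known)
amps_true = [3.0, 1.5]

# Synthetic FID: two decaying cosinusoids plus a little noise
fid = sum(a * np.cos(2 * np.pi * f * t) * np.exp(-d * t)
          for a, f, d in zip(amps_true, freqs, decays))
fid += 0.01 * rng.standard_normal(t.size)

# With the sinusoid parameters fixed, the amplitudes -- the entries of a
# CRAFT-style frequency-amplitude table -- follow from least squares.
basis = np.column_stack([np.cos(2 * np.pi * f * t) * np.exp(-d * t)
                         for f, d in zip(freqs, decays)])
amps, *_ = np.linalg.lstsq(basis, fid, rcond=None)
print(amps)  # ≈ [3.0, 1.5]
```

The recovered amplitudes play the role of peak areas in a frequency-domain analysis, which is what makes the table directly usable for quantification.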
Ellenbogen, Michael I; Ma, Madeleine; Christensen, Nicholas P; Lee, Jungwha; O'Leary, Kevin J
2017-01-01
Studies have shown that the overutilization of laboratory tests ("labs") for hospitalized patients is common and can cause adverse health outcomes. Our objective was to compare the ordering tendencies for routine complete blood counts (CBC) and chemistry panels by internal medicine residents and hospitalists. This observational study included a survey of medicine residents and hospitalists and a retrospective analysis of labs ordering data. The retrospective data analysis comprised patients admitted to either the teaching service or nonteaching hospitalist service at a single hospital during 2014. The survey asked residents and hospitalists about their practices and preferences on labs ordering. The frequency and timing of one-time and daily CBC and basic chemistry panel ordering for teaching service and hospitalist patients were obtained from our data warehouse. The average numbers of CBCs per patient per day and chemistry panels per patient per day were calculated for both services, and multivariate regression was performed to control for patient characteristics. Forty-four of 120 (37%) residents and 41 of 53 (77%) hospitalists responded to the survey. Forty-four (100%) residents reported ordering a daily CBC and chemistry panel rather than one-time labs at patient admission compared with 22 (54%) hospitalists (P < 0.001). For CBCs, teaching service patients averaged 1.72/day and hospitalist service patients averaged 1.43/day (P < 0.001). For basic chemistry panels, teaching service patients averaged 1.96/day and hospitalist service patients averaged 1.78/day (P < 0.001). Results were similar in multivariate regression models adjusting for patient characteristics. Residents' self-reported and actual use of CBCs and chemistry panels is significantly higher than that of hospitalists in the same hospital. Our results reveal an opportunity for greater supervision and improved instruction of cost-conscious ordering practices.
Sarma, Sisira; Hajizadeh, Mohammad; Thind, Amardeep; Chan, Rick
2013-01-01
Objective: To describe the association between health information technology (HIT) adoption and family physicians' patient visit length in Canada after controlling for physician and practice characteristics. Method: HIT adoption is defined in terms of four types of HIT usage: no HIT use (NO), basic HIT use without electronic medical record system (HIT), basic HIT use with electronic medical record (EMR) and advanced HIT use (EMR + HIT). The outcome variable is the average time spent on a patient visit (visit length). The data for this study came from the 2007 and 2010 National Physician Surveys. A log-linear model was used to analyze our visit length outcome. Results: The average time worked per week was found to be in the neighbourhood of 36 hours in both 2007 and 2010, but users of EMR and EMR + HIT were undertaking fewer patient visits per week relative to NO users. Multivariable analysis showed that EMR and EMR + HIT were associated with longer average time spent per patient visit by about 7.7% (p<0.05) and 6.7% (p<0.01), respectively, compared to NO users in 2007. In 2010, EMR was not statistically significant and EMR + HIT was associated with a 4% (p<0.1) increased visit length. A variety of practice-related variables such as the mode of remuneration, work setting and interprofessional practice influenced visit length in the expected direction. Conclusion: Use of HIT is found to be associated with fewer patient visits and longer visit length among family physicians in Canada relative to NO users, but this association weakened in the multivariable analysis of 2010. PMID:23968677
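In a log-linear model of visit length, a coefficient β on an indicator such as EMR use translates to a percentage effect of 100·(e^β − 1). A one-line sketch of that conversion; the coefficient value below is hypothetical, chosen only so the result matches the 7.7% figure quoted above:

```python
import math

# log(visit_length) = ... + beta * EMR + ...  =>  percent effect of EMR:
def percent_effect(beta):
    return 100.0 * (math.exp(beta) - 1.0)

beta_emr = 0.0742  # hypothetical regression coefficient
print(round(percent_effect(beta_emr), 1))  # → 7.7 (% longer visits)
```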
Zhang, Dongdong; Liu, Xuejiao; Liu, Yu; Sun, Xizhuo; Wang, Bingyuan; Ren, Yongcheng; Zhao, Yang; Zhou, Junmei; Han, Chengyi; Yin, Lei; Zhao, Jingzhi; Shi, Yuanyuan; Zhang, Ming; Hu, Dongsheng
2017-10-01
Leisure-time physical activity (LTPA) has been suggested to reduce risk of metabolic syndrome (MetS). However, a quantitative comprehensive assessment of the dose-response association between LTPA and incident MetS has not been reported. We performed a meta-analysis of studies assessing the risk of MetS with LTPA. MEDLINE via PubMed and EMBase databases were searched for relevant articles published up to March 13, 2017. Random-effects models were used to estimate the summary relative risk (RR) of MetS with LTPA. Restricted cubic splines were used to model the dose-response association. We identified 16 articles (18 studies including 76,699 participants and 13,871 cases of MetS). We found a negative linear association between LTPA and incident MetS, with a reduction of 8% in MetS risk per 10 metabolic equivalent of task (MET) h/week increment. According to the restricted cubic splines model, risk of MetS was reduced 10% with LTPA performed according to the basic guideline-recommended level of 150 min of moderate PA (MPA) per week (10 MET h/week) versus inactivity (RR=0.90, 95% CI 0.86-0.94). It was reduced 20% and 53% with LTPA at twice (20 MET h/week) and seven times (70 MET h/week) the basic recommended level (RR=0.80, 95% CI 0.74-0.88 and 0.47, 95% CI 0.34-0.64, respectively). Our findings provide quantitative data suggesting that any amount of LTPA is better than none and that LTPA substantially exceeding the current LTPA guidelines is associated with an additional reduction in MetS risk. Copyright © 2017. Published by Elsevier Inc.
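For intuition, one can compare the headline "8% per 10 MET·h/week" figure, read as a constant multiplicative reduction per increment, against the spline estimates quoted above. This is a sketch of one possible reading, not the authors' spline model:

```python
def rr_linear(met_h_per_week, reduction_per_10=0.08):
    """RR of MetS assuming a constant 8% multiplicative risk reduction
    per 10 MET*h/week increment of LTPA."""
    return (1.0 - reduction_per_10) ** (met_h_per_week / 10.0)

for x in (10, 20, 70):
    print(x, round(rr_linear(x), 2))   # 0.92, 0.85, 0.56
```

The spline RRs (0.90, 0.80, 0.47) fall below these values at the higher doses, which is consistent with the abstract's point that activity well beyond the guideline level brings additional risk reduction.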
Fractals in the neurosciences, Part II: clinical applications and future perspectives.
Di Ieva, Antonio; Esteban, Francisco J; Grizzi, Fabio; Klonowski, Wlodzimierz; Martín-Landrove, Miguel
2015-02-01
It has been ascertained that the human brain is a complex system studied at multiple scales, from neurons and microcircuits to macronetworks. The brain is characterized by a hierarchical organization that gives rise to its highly topological and functional complexity. Over the last decades, fractal geometry has been shown as a universal tool for the analysis and quantification of the geometric complexity of natural objects, including the brain. The fractal dimension has been identified as a quantitative parameter for the evaluation of the roughness of neural structures, the estimation of time series, and the description of patterns, thus able to discriminate different states of the brain in its entire physiopathological spectrum. Fractal-based computational analyses have been applied to the neurosciences, particularly in the field of clinical neurosciences including neuroimaging and neuroradiology, neurology and neurosurgery, psychiatry and psychology, and neuro-oncology and neuropathology. After a review of the basic concepts of fractal analysis and its main applications to the basic neurosciences in part I of this series, here, we review the main applications of fractals to the clinical neurosciences for a holistic approach towards a fractal geometry model of the brain. © The Author(s) 2013.
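Behind most of these applications sits a single basic computation: estimating the fractal dimension, classically by box counting on a binarized image. A minimal sketch (box sizes are an arbitrary choice; the test image, a straight line, is picked because its expected dimension is exactly 1):

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    for s in sizes:
        h, w = img.shape
        # count boxes of side s containing at least one foreground pixel
        boxed = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(boxed.any(axis=(1, 3))))
    # slope of log N(s) vs log(1/s) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

img = np.zeros((256, 256), dtype=bool)
idx = np.arange(256)
img[idx, idx] = True                    # a diagonal line
print(box_counting_dimension(img))      # → 1.0 (dimension of a line)
```

Applied to segmented cortical boundaries, tumor margins, or time-series graphs, the same slope estimate yields the discriminating parameter discussed in the review.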
Ngugi, Elizabeth N; Benoit, Cecilia; Hallgrimsdottir, Helga; Jansson, Mikael; Roth, Eric A
2012-06-01
A basic ecological and epidemiological question is why some women enter into commercial sex work while other women in the same socio-economic environment never do. To address this question respondent driven sampling principles were adopted to recruit and collect data for 161 female sex workers and 159 same aged women who never engaged in commercial sex in Kibera, a large informal settlement in Nairobi, Kenya. Univariate analysis indicated that basic kinship measures, including number of family members seen during adolescence and at present, not having a male guardian while growing up, and earlier times of ending relationships with both male and female guardians were associated with commercial sex work in Kibera. Multivariate analysis via logistic regression modeling showed that not having a male guardian during childhood, low education attainment and a small number of family members seen at adolescence were all significant predictors of entering sex work. By far the most important predictor of entering sex work was not having any male guardian, e.g., father, uncle, older brother, etc. during childhood. Results are interpreted in light of the historic pattern of sub-Saharan African child fostering and their relevance for young women in Kibera today.
Vashist, Sandeep Kumar; Schneider, E. Marion; Luong, John H.T.
2014-01-01
Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood oxygen saturation, sleeping and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where the number of smartphone users is about 70% of 6.8 billion cell phone subscribers worldwide with limited access to basic healthcare service. The technology platform facilitates patient-doctor communication and the patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are very critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management. PMID:26852680
Vashist, Sandeep Kumar; Schneider, E Marion; Luong, John H T
2014-08-18
Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood glucose saturation, sleeping and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where the number of smartphone users is about 70% of 6.8 billion cell phone subscribers worldwide with limited access to basic healthcare service. The technology platform facilitates patient-doctor communication and the patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are very critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management.
ERIC Educational Resources Information Center
Dunbar, Laura
2014-01-01
This article is an introduction to video screen capture. Basic information about two software programs, QuickTime for Mac and BlueBerry Flashback Express for PC, is also discussed. Practical applications for video screen capture are given.
NASA Astrophysics Data System (ADS)
Mensi, Walid; Tiwari, Aviral Kumar; Yoon, Seong-Min
2017-04-01
This paper estimates the weak-form efficiency of Islamic stock markets using 10 sectoral stock indices (basic materials, consumer services, consumer goods, energy, financials, health care, industrials, technology, telecommunication, and utilities). The results based on the multifractal detrended fluctuation analysis (MF-DFA) approach show time-varying efficiency for the sectoral stock markets. Moreover, we find that they tend to show high efficiency in the long term but moderate efficiency in the short term, and that these markets become less efficient after the onset of the global financial crisis. These results have several significant implications in terms of asset allocation for investors dealing with Islamic markets.
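For q = 2, the MF-DFA machinery reduces to ordinary detrended fluctuation analysis, whose scaling exponent is the basic efficiency diagnostic: about 0.5 for an uncorrelated (weak-form efficient) return series. A compact sketch; the scales and the surrogate i.i.d. series are arbitrary choices, not the paper's data:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis (the q = 2 case of MF-DFA)."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        sq = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # local linear detrending
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
returns = rng.standard_normal(4096)          # surrogate i.i.d. returns
print(dfa_exponent(returns))                 # typically ≈ 0.5
```

The full MF-DFA repeats this for a range of q values; a spread of exponents across q signals the multifractality and time-varying efficiency reported for the sectoral indices.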
How to Combine ChIP with qPCR.
Asp, Patrik
2018-01-01
Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow for quick and easy analysis of virtually anything chromatin-related as long as there is an available antibody. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter will address these potential pitfalls by providing protocols and procedures that address the difficulties inherent in ChIP-qPCR assays.
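One of the calculations where these pitfalls bite is the percent-input normalization itself. A minimal sketch, assuming the common setup in which a saved input aliquot represents 1% of the chromatin; the Ct values are invented:

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.01):
    """Percent-input for a ChIP-qPCR target.

    ct_input is measured on a saved input aliquot (here 1% of the
    chromatin), so it is first adjusted to represent 100% input.
    """
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2.0 ** (ct_input_adj - ct_ip)

# Hypothetical triplicate means: IP Ct 26.0, 1% input Ct 24.0
print(round(percent_input(26.0, 24.0), 2))  # → 0.25 (% of input)
```

Note the assumption of 100% amplification efficiency (a factor of exactly 2 per cycle); one of the controls the chapter emphasizes is verifying this with a standard curve for each primer pair.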
Basic sanitation policy in Brazil: discussion of a path.
Sousa, Ana Cristina A de; Costa, Nilson do Rosário
2016-01-01
This article demonstrates that the position of dominance enjoyed by state sanitation companies dictates the public policy decision-making process for sanitation in Brazil. These companies' hegemony is explained here through the analysis of a path that generated political and economic incentives that have permitted its consolidation over time. Through the content analysis of the legislation proposed for the sector and the material produced by the stakeholders involved in the approval of new regulations for the sector in 2007, the study identifies the main sources of incentive introduced by the adoption of the National Sanitation Plan, which explain certain structural features of the current sanitation policy and its strong capacity to withstand the innovations proposed under democratic rule.
NASA Technical Reports Server (NTRS)
Schultink, G. (Principal Investigator)
1977-01-01
The author has identified the following significant results. A linear regression between percent nonvegetative land and the time variable was completed for the two sample areas. Sample area no. 1 showed an average vegetation loss of 1.901% per year, while the loss for sample area no. 2 amounted to 5.889% per year. Two basic reasons for the difference were assumed to play a role: the difference in access potential and the amount of already fragmented vegetation complexes in existence during the first year of the comparative analysis - 1970. Sample area no. 2 was located closer to potential access points and was more fragmented initially.
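The reported rates are the slopes of a straight-line fit of percent nonvegetative land against year. A minimal sketch with invented yearly observations (chosen to give a slope like that of sample area no. 2):

```python
import numpy as np

# Hypothetical yearly observations of percent nonvegetative land
years = np.array([1970, 1971, 1972, 1973, 1974])
pct_nonveg = np.array([20.0, 25.9, 31.8, 37.7, 43.6])

slope, intercept = np.polyfit(years, pct_nonveg, 1)
print(round(slope, 3))  # → 5.9 (% vegetation loss per year)
```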
Trépanier, Sarah-Geneviève; Fernet, Claude; Austin, Stéphanie
2015-01-01
Drawing on self-determination theory, this study proposes and tests a model investigating the role of basic psychological need satisfaction in relation to workplace bullying and employee functioning (burnout, work engagement, and turnover intention). For this study, data were collected at 2 time points, over a 12-month period, from a sample of 699 nurses. The results from cross-lagged analyses support the proposed model. Results show that workplace bullying thwarts the satisfaction of employees' basic psychological needs and fosters burnout 12 months later. In addition, when taking into account the cross-lagged effect of workplace bullying on employee functioning, basic need satisfaction fosters work engagement and hinders turnover intention over time. Implications for workplace bullying research and managerial practices are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.
5 CFR 839.1121 - What is the Actuarial Reduction for the Basic Employee Death Benefit (BEDB)?
Code of Federal Regulations, 2010 CFR
2010-01-01
... will be the amount of the BEDB divided by the present value factor for your age at the time of the ... If you ... (5 CFR § 839.1121, Administrative Personnel, OFFICE OF ..., Benefits).
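The arithmetic of the rule quoted above is a single division. A sketch with illustrative numbers only; the dollar amount and present value factor below are invented, not actual OPM figures:

```python
def annuity_reduction(bedb_owed, present_value_factor):
    """Actuarial reduction per the rule above: the BEDB amount divided by
    the present value factor for the person's age at the relevant time.
    Inputs here are purely illustrative."""
    return bedb_owed / present_value_factor

print(round(annuity_reduction(15000.0, 180.0), 2))  # → 83.33
```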
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
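A toy version of the pipeline makes the point about artifacts concrete: screen the interbeat intervals, then compute time-domain HRV measures on the cleaned series. The median-deviation screen below is a simple stand-in for ARTiiFACT's detection algorithm, and the IBI values are invented:

```python
import numpy as np

def hrv_summary(ibi_ms, tol=0.25):
    """Flag IBIs deviating more than tol from the median as artifacts,
    then compute SDNN and RMSSD on the cleaned series."""
    ibi = np.asarray(ibi_ms, dtype=float)
    med = np.median(ibi)
    clean = ibi[np.abs(ibi - med) <= tol * med]
    sdnn = np.std(clean, ddof=1)                     # overall variability
    rmssd = np.sqrt(np.mean(np.diff(clean) ** 2))    # beat-to-beat variability
    return sdnn, rmssd

# ~800 ms beats with mild variability and one spurious 1600 ms interval
ibi = [790, 810, 805, 1600, 795, 800, 815]
sdnn, rmssd = hrv_summary(ibi)
print(round(sdnn, 1), round(rmssd, 1))  # → 9.4 12.4
```

Running the same series without the screen inflates both measures dramatically, which is exactly why a single missed artifact makes HRV results unreliable.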
Goldenberg, Neil M; Steinberg, Benjamin E; Rutka, James T; Chen, Robert; Cabral, Val; Rosenblum, Norman D; Kapus, Andras; Lee, Warren L
2016-01-01
Physicians have traditionally been at the forefront of medical research, bringing clinical questions to the laboratory and returning with ideas for treatment. However, we have anecdotally observed a decline in the popularity of basic science research among trainees. We hypothesized that fewer resident physicians have been pursuing basic science research training over time. We examined records from residents in the Surgeon-Scientist and Clinician-Investigator programs at the University of Toronto (1987-2016). Research by residents was categorized independently by 2 raters as basic science, clinical epidemiology or education-related based on the title of the project, the name of the supervisor and PubMed searches. The study population was divided into quintiles of time, and the proportion pursuing basic science training in each quintile was calculated. Agreement between the raters was 100%; the categorization of the research topic remained unclear in 9 cases. The proportion of trainees pursuing basic science training dropped by 60% from 1987 to 2016 (p = 0.005). Significantly fewer residents in the Surgeon-Scientist and Clinician-Investigator Programs at the University of Toronto are pursuing training in the basic sciences as compared with previous years.
Impulse measurement using an Arduíno
NASA Astrophysics Data System (ADS)
Espindola, P. R.; Cena, C. R.; Alves, D. C. B.; Bozano, D. F.; Goncalves, A. M. B.
2018-05-01
In this paper, we propose a simple experimental apparatus that can measure the force variation over time to study the impulse-momentum theorem. In this proposal, a body attached to a rubber string falls freely from rest until it stretches and changes the linear momentum. During that process the force due to the tension on the rubber string is measured with a load cell by using an Arduíno board. We check the instrumental results with the basic concept of impulse, finding the area under the force versus time curve and comparing this with the linear momentum variation estimated from software analysis. The apparatus is presented as a simple and low cost alternative to mechanical physics laboratories.
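The impulse-momentum check described here is a numerical integration: the area under the measured F(t) curve should equal the change in linear momentum. A sketch with a synthetic half-sine pulse standing in for the load-cell data (amplitude, duration, and mass are invented):

```python
import numpy as np

# Synthetic force pulse, as the load cell might record during the stretch
t = np.linspace(0.0, 0.2, 2001)              # time, s
F = 50.0 * np.sin(np.pi * t / 0.2)           # force, N (half-sine pulse)

# Impulse = area under F(t), trapezoidal rule
impulse = np.sum(0.5 * (F[1:] + F[:-1]) * np.diff(t))

m = 0.5                                      # kg, hypothetical falling body
delta_v = impulse / m                        # predicted velocity change, m/s
print(round(impulse, 3), round(delta_v, 3))  # → 6.366 12.732
```

In the experiment, delta_v would instead come from the video/software analysis of the body's motion, and agreement between the two numbers is the test of the theorem.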
Random walker in temporally deforming higher-order potential forces observed in a financial crisis.
Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako
2009-11-01
Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of higher-order potential force in the markets. We show statistical significance of its existence by applying the information criterion. This time series analysis is expected to be applied widely for detecting a nonstationary symptom in random phenomena.
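A minimal sketch of this class of model: a random walker pulled toward a moving average of its own past positions (the quadratic potential), with an optional cubic force term corresponding to the higher-order (quartic) potential discussed here. All parameter values are invented, and this is an illustration of the model family, not the authors' fitted dynamics:

```python
import numpy as np

rng = np.random.default_rng(2)
N, window = 5000, 10        # steps; moving-average window
b, c = 0.2, 0.0             # set c > 0 to add the quartic-potential force

x = np.zeros(N)
for t in range(window, N - 1):
    M = x[t - window:t].mean()                       # center of the potential
    force = -b * (x[t] - M) - c * (x[t] - M) ** 3    # linear + cubic force
    x[t + 1] = x[t] + force + rng.standard_normal()  # random-walk noise

print(np.isfinite(x).all())  # → True
```

Fitting the observed restoring force as a function of displacement from the moving average, and testing via an information criterion whether the cubic term is warranted, mirrors the detection procedure described in the abstract.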
ERIC Educational Resources Information Center
Diseth, Age; Danielsen, Anne G.; Samdal, Oddrun
2012-01-01
Teachers' support of basic psychological needs, self-efficacy, achievement goals, life satisfaction and academic achievement level was measured in a sample of 240 secondary school students (8th and 10th grades). Correlation analysis showed significant positive relations between all of the variables, except for the relation between need support of…
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
A Pre-Award Validation Analysis was conducted in 1978-1979 to provide the federal government information about the accuracy of data provided by applicants for Basic Educational Opportunity Grants. New procedures involved: validation of selected applications by college financial aid officers using documentation such as Federal Income Tax forms;…
ERIC Educational Resources Information Center
Ashley, Richard D.
This report summarizes a project in which a number of new approaches were taken to improve learning in undergraduate basic music instruction for music majors. The basic viewpoint proposed was that music activities can be seen as skilled problem solving in the areas of aural analysis, visual analysis, and understanding of compositional processes.…
Integration and timing of basic and clinical sciences education.
Bandiera, Glen; Boucher, Andree; Neville, Alan; Kuper, Ayelet; Hodges, Brian
2013-05-01
Medical education has traditionally been compartmentalized into basic and clinical sciences, with the latter being viewed as the skillful application of the former. Over time, the relevance of basic sciences has become defined by their role in supporting clinical problem solving rather than being, of themselves, a defining knowledge base of physicians. As part of the national Future of Medical Education in Canada (FMEC MD) project, a comprehensive empirical environmental scan identified the timing and integration of basic sciences as a key pressing issue for medical education. Using the literature review, key informant interviews, stakeholder meetings, and subsequent consultation forums from the FMEC project, this paper details the empirical basis for focusing on the role of basic science, the evidentiary foundations for current practices, and the implications for medical education. Despite a dearth of definitive relevant studies, opinions about how best to integrate the sciences remain strong. Resource allocation, political power, educational philosophy, and the shift from a knowledge-based to a problem-solving profession all influence the debate. There was little disagreement that both sciences are important, that many traditional models emphasized deep understanding of limited basic science disciplines at the expense of other relevant content such as social sciences, or that teaching the sciences contemporaneously rather than sequentially has theoretical and practical merit. Innovations in integrated curriculum design have occurred internationally. Less clear are the appropriate balance of the sciences, the best integration model, and solutions to the political and practical challenges of integrated curricula. New curricula tend to emphasize integration, development of more diverse physician competencies, and preparation of physicians to adapt to evolving technology and patients' expectations. Refocusing the basic/clinical dichotomy to a foundational/applied model may yield benefits in training widely competent future physicians.
Dey, B.; Ratcliff, B.; Va’vra, J.
2017-02-16
In this article, we explore the angular resolution limits attainable in small FDIRC designs taking advantage of the new highly pixelated detectors that are now available. Since the basic FDIRC design concept attains its particle separation performance mostly in the angular domain as measured by two-dimensional pixels, this paper relies primarily on a pixel-based analysis, with additional chromatic corrections using the time domain, requiring single-photon timing resolution of only 100–200 ps. This approach differs from other modern DIRC design concepts such as TOP or TORCH detectors, whose separation performances rely more strongly on time-dependent analyses. In conclusion, we find excellent single photon resolution with a geometry where individual bars are coupled to a single plate, which is coupled in turn to a cylindrical lens focusing camera.
NASA Astrophysics Data System (ADS)
Szurgacz, Dawid
2018-01-01
The article discusses the basic functions of a powered roof support in a longwall unit. The support's function is to provide safety by protecting mine workings against uncontrolled rock falls. The research covers measures to shorten the time of roof-support shifting. The roof support is designed to transfer, under the hazard conditions of rock-mass tremors, dynamic loads caused by mining exploitation. The article presents preliminary results on reducing the time of unit advance to improve the extraction process and thus reduce operating costs. Stand tests showed that the flow through the 3/2-way valve cartridges can be increased; the level of fluid flowing through the cartridges is adequate to control the individual actuators.
Transient-state kinetic approach to mechanisms of enzymatic catalysis.
Fisher, Harvey F
2005-03-01
Transient-state kinetics by its inherent nature can potentially provide more directly observed detailed resolution of discrete events in the mechanistic time courses of enzyme-catalyzed reactions than its more widely used steady-state counterpart. The use of the transient-state approach, however, has been severely limited by the lack of any theoretically sound and applicable basis of interpreting the virtual cornucopia of time and signal-dependent phenomena that it provides. This Account describes the basic kinetic behavior of the transient state, critically examines some currently used analytic methods, discusses the application of a new and more soundly based "resolved component transient-state time-course method" to the L-glutamate-dehydrogenase reaction, and establishes new approaches for the analysis of both single- and multiple-step substituted transient-state kinetic isotope effects.
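The single-exponential transient at the heart of such analyses can be illustrated with a short sketch. The function below is a generic log-linear estimate of an observed rate constant, not the resolved-component method the Account describes; the rate constant and amplitudes used are illustrative.

```python
import numpy as np

def fit_transient_rate(t, signal, signal_inf):
    """Estimate k for a single-exponential transient A(t) = A_inf + (A0 - A_inf) e^{-k t}
    by log-linearizing and fitting the slope. A toy single-step analysis; real
    transient-state time courses are typically multi-exponential."""
    t = np.asarray(t, dtype=float)
    y = np.log(np.abs(np.asarray(signal, dtype=float) - signal_inf))
    slope, _intercept = np.polyfit(t, y, 1)
    return -slope

# Noiseless synthetic transient with k = 0.8 s^-1 (illustrative values)
t = np.linspace(0.0, 5.0, 20)
signal = 1.0 + 2.0 * np.exp(-0.8 * t)
k_obs = fit_transient_rate(t, signal, 1.0)
```

With noisy data one would weight the fit or use nonlinear least squares, but the log-linear form shows the basic idea.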
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraiskii, A V; Mironova, T V
2015-08-31
The results of the study of interdiffusion of two liquids, obtained using the holographic recording scheme with a nonstationary reference wave whose frequency varies linearly in space and time, are compared with the results of correlation processing of digital photographs made with a random background screen. The spatio-temporal behaviour of the signal in four basic representations ('space – temporal frequency', 'space – time', 'spatial frequency – temporal frequency' and 'spatial frequency – time') is found in the holographic experiment and calculated (in the appropriate coordinates) based on the background-oriented schlieren method. Practical coincidence of the results of the correlation analysis and the holographic double-exposure interferometry is demonstrated.
Dao, Dyda; Sodhi, Sukhmani; Tabasinejad, Rasam; Peterson, Devin; Ayeni, Olufemi R; Bhandari, Mohit; Farrokhyar, Forough
2015-08-01
Low serum 25-hydroxyvitamin D (25(OH)D) levels have been associated with stress fractures in various physically active populations such as the military. To examine the association between serum 25(OH)D levels and stress fractures in the military. Systematic review and meta-analysis. Relevant studies were identified through searching multiple databases and manually screening reference lists. Two reviewers independently selected the included studies by applying the eligibility criteria to the title, abstract, and/or full text of the articles yielded in the search. Two reviewers also independently conducted the methodological quality assessment and data extraction. A random-effects model was used to calculate the mean difference (MD) with 95% CI in serum 25(OH)D levels between stress fracture cases and controls. Nine observational studies on lower extremity stress fractures were eligible, and 1 was excluded due to inadequate data. A total of 2634 military personnel (age, 18-30 years; 44% male) with 761 cases (16% male) and 1873 controls (61% male) from 8 studies were included in the analysis. Three of the 8 studies measured serum 25(OH)D levels at the time of stress fracture diagnosis, and the 5 remaining studies measured serum 25(OH)D levels at the time of entry into basic training. The mean serum 25(OH)D level was lower in stress fracture cases than in controls at the time of entry into basic training (MD, -2.63 ng/mL; 95% CI, -5.80 to 0.54; P = .10; I(2) = 65%) and at the time of stress fracture diagnosis (MD, -2.26 ng/mL; 95% CI, -3.89 to -0.63; P = .007; I(2) = 42%). Despite the inherent limitations of the included studies, the study results suggest some association between low serum 25(OH)D levels and lower extremity stress fractures in military personnel. Given the rigorous training of military personnel, implementing strategies to ensure sufficient 25(OH)D levels may be beneficial for reducing the risk of stress fractures. © 2014 The Author(s).
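The random-effects pooling step described above can be sketched as a DerSimonian-Laird calculation of the mean difference with a 95% CI. The per-study mean differences and variances below are hypothetical, not the study's data.

```python
import math

def random_effects_md(effects, variances):
    """Pool per-study mean differences with a DerSimonian-Laird random-effects model."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study mean differences (ng/mL) and their variances
md, ci = random_effects_md([-2.1, -3.5, -1.2], [0.8, 1.1, 0.6])
```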
2014-01-01
Background Since the global standards for postgraduate medical education (PGME) were published in January 2003, they have gained worldwide attention. The current state of residency training programs in medical-school-affiliated hospitals throughout China was assessed in this study. Methods Based on the internationally recognized global standards for PGME, residents undergoing residency training at that time and the relevant residency training instructors and management personnel from 15 medical-school-affiliated hospitals throughout China were recruited and surveyed regarding the current state of residency training programs. A total of 938 questionnaire surveys were distributed between June 30, 2006 and July 30, 2006; of 892 surveys collected, 841 were valid. Results For six items, the total proportions of “basically meets standards” and “completely meets standards” were <70% for the basic standards. These items were identified in the fields of “training settings and educational resources”, “evaluation of training process”, and “trainees”. In all fields other than “continuous updates”, the average scores of the western regions were significantly lower than those of the eastern regions for both the basic and target standards. Specifically, the average scores for the basic standards on as many as 25 of the 38 items in the nine fields were significantly lower in the western regions. There were significant differences in the basic standards scores on 13 of the 38 items among trainees, instructors, and managers. Conclusions The residency training programs have achieved satisfactory outcomes in the hospitals affiliated with various medical schools in China. However, overall, the programs remain inadequate in certain areas. For the governments, organizations, and institutions responsible for PGME, such global standards for PGME are a very useful self-assessment tool and can help identify problems, promote reform, and ultimately standardize PGME. PMID:24885865
Kuhlmann, F E; Apffel, A; Fischer, S M; Goldberg, G; Goodley, P C
1995-12-01
Trifluoroacetic acid (TFA) and other volatile strong acids, used as modifiers in reversed-phase high-performance liquid chromatography, cause signal suppression for basic compounds when analyzed by electrospray ionization mass spectrometry (ESI-MS). Evidence is presented that signal suppression is caused by strong ion pairing between the TFA anion and the protonated cations of basic sample molecules. The ion-pairing process "masks" the protonated sample cations from the ESI-MS electric fields by rendering them "neutral." Weakly basic molecules are not suppressed by this process. The TFA signal suppression effect is independent of the well-known spray problem that electrospray has with highly aqueous solutions containing TFA; that problem is caused by the high conductivity and surface tension of aqueous TFA solutions. A practical method to enhance the signal for most basic analytes in the presence of signal-suppressing volatile strong acids has been developed. The method employs postcolumn addition of a solution of 75% propionic acid and 25% isopropanol to the column flow in a 1:2 ratio. Signal enhancement is typically 10-50 times for peptides and other small basic molecules. Thus, peptide maps that use ESI-MS for detection can be performed at lower levels, with conventional columns, without the need for capillary chromatography or reduced mass spectral resolution to achieve satisfactory sensitivity. The method may be used with similar results for heptafluorobutyric acid and hydrochloric acid. A mechanism for TFA signal suppression and for signal enhancement by the foregoing method is proposed.
Demodulator for binary-phase modulated signals having a variable clock rate
NASA Technical Reports Server (NTRS)
Wu, Ta Tzu (Inventor)
1976-01-01
Method and apparatus for demodulating binary-phase modulated signals recorded on a magnetic stripe on a card as the card is manually inserted into a card reader. Magnetic transitions are sensed as the card is read, and the time interval between immediately preceding basic transitions determines the duration of a data sampling pulse which detects the presence or absence of an intermediate transition pulse indicative of two respective logic states. The duration of the data sampling pulse is approximately 75 percent of the preceding interval between basic transitions, permitting tracking of succeeding time differences in basic transition intervals of up to approximately 25 percent.
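The timing scheme described (a sampling window at roughly 75 percent of the preceding basic interval, with an intermediate transition marking one logic state) can be sketched as a generic biphase/two-frequency decoder. This is an illustration of the general technique with made-up timestamps, not the patented circuit.

```python
def decode_f2f(times, first_cell=1.0):
    """Two-frequency (biphase) decode: a transition arriving inside the ~75%
    sampling window marks logic 1; its absence marks logic 0. `first_cell`
    seeds the interval tracker with an assumed duration for the cell
    preceding the first one."""
    bits, i, prev = [], 0, first_cell
    while i + 1 < len(times):
        window = 0.75 * prev                      # sampling pulse duration
        if times[i + 1] - times[i] < window and i + 2 < len(times):
            bits.append(1)                        # intermediate transition present
            prev = times[i + 2] - times[i]        # full bit-cell duration
            i += 2
        else:
            bits.append(0)                        # no intermediate transition
            prev = times[i + 1] - times[i]
            i += 1
    return bits

# Unit-length bit cells; the '1' cell carries an extra mid-cell transition
bits = decode_f2f([0.0, 1.0, 1.5, 2.0, 3.0])
```

Because each window is derived from the previous cell, the decoder tracks gradual speed changes, matching the hand-swipe tolerance the abstract describes.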
Fu, Shangxi; Liu, Xiao; Zhou, Li; Zhou, Meisheng; Wang, Liming
2017-08-01
The purpose of this study was to estimate the effects of a laparoscopic surgery course on laparoscopic operation skills after simulated training for medical students, using relatively objective data collected before and after the laparoscopic-simulator practice course for resident standardized trainees. Experiment 1: 20 resident standardized trainees with no experience in laparoscopic surgery were included in the inexperienced group and performed simulated cholecystectomy following simulator videos. Simulator data were collected (total operation time, path length, average speed of instrument movement, movement efficiency, number of perforations, time cautery was applied without appropriate contact with adhesions, and number of serious complications). Ten attending doctors were included in the experienced group and performed simulated cholecystectomy directly; their data were collected with the simulator, and the two groups were compared. Experiment 2: Participants from the inexperienced group were assigned to a basic group (receiving 8 items of basic operation training) or a special group (receiving 8 items of basic operation training and 4 items of specialized training), with 10 participants per group. After training reached the expected target, simulated cholecystectomy was performed and data were collected; the basic and special groups were compared, and the special group was then compared with the experienced group. Experiment 1 showed significant differences between the inexperienced group, whose participants performed simulated cholecystectomy only according to the instructors' teaching and the operation video, and the experienced group.
Experiment 2 suggested that total operation time, number of perforations, number of serious complications, number of non-cauterized bleedings, and time cautery was applied without appropriate contact with adhesions were all better in the special group than in the basic group; the other measures did not differ statistically between these groups. Comparing the special group with the experienced group, total operation time and time cautery was applied without appropriate contact with adhesions were better in the experienced group; the other measures did not differ statistically. Laparoscopic simulators are effective for surgical skills training. Basic courses mainly improve the operator's hand-eye coordination and perception of instrument insertion depth. Specialized training courses not only increase the operator's familiarity with the procedures but also reduce operation time and risk and improve safety.
String Mining in Bioinformatics
NASA Astrophysics Data System (ADS)
Abouelhoda, Mohamed; Ghanem, Moustafa
Sequence analysis is a major area in bioinformatics encompassing the methods and techniques for studying biological sequences, DNA, RNA, and proteins, on the linear structure level. The focus of this area is generally on the identification of intra- and inter-molecular similarities. Identifying intra-molecular similarities boils down to detecting repeated segments within a given sequence, while identifying inter-molecular similarities amounts to spotting common segments among two or more sequences. From a data mining point of view, sequence analysis is nothing but string or pattern mining specific to biological strings. For a long time, however, this point of view has not been explicitly embraced in either the data mining or the sequence analysis textbooks, which may be attributed to the co-evolution of the two apparently independent fields. In other words, although the word "data mining" is almost missing in the sequence analysis literature, its basic concepts have been implicitly applied. Interestingly, recent research in biological sequence analysis has introduced efficient solutions to many problems in data mining, such as querying and analyzing time series [49,53], extracting information from web pages [20], fighting spam mails [50], detecting plagiarism [22], and spotting duplications in software systems [14].
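Detecting repeated segments within a sequence, the intra-molecular case above, can be sketched with a toy repeat finder. Suffix trees and suffix arrays solve this in linear time; sorting suffixes and scanning adjacent pairs, as below, is a simple quadratic stand-in that only shows the idea.

```python
def longest_repeat(s):
    """Longest substring occurring at least twice in s.
    Adjacent entries in the sorted suffix list share the longest
    common prefixes, so scanning neighbors finds the longest repeat."""
    suffixes = sorted(s[i:] for i in range(len(s)))
    best = ""
    for a, b in zip(suffixes, suffixes[1:]):
        n = 0
        while n < min(len(a), len(b)) and a[n] == b[n]:
            n += 1                      # longest common prefix length
        if n > len(best):
            best = a[:n]
    return best

repeat = longest_repeat("GATTACAGATTA")   # "GATTA" occurs at positions 0 and 7
```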
Distributed collaborative response surface method for mechanical dynamic assembly reliability design
NASA Astrophysics Data System (ADS)
Bai, Guangchen; Fei, Chengwei
2013-11-01
Because of the randomness of many impact factors influencing the dynamic assembly relationship of complex machinery, the reliability analysis of the dynamic assembly relationship needs to account for this randomness from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, the mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Comparison of the DCRSM, the traditional response surface method (RSM), and the Monte Carlo method (MCM) shows that the DCRSM is not only able to accomplish computational tasks that are infeasible for the other methods when the number of simulations exceeds 100 000, but its computational precision is also basically consistent with the MCM and improved by 0.40%-4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is up to about 188 times that of the MCM and 55 times that of the RSM at 10 000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
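The quadratic response surface underlying RSM-style surrogates can be sketched as an ordinary least-squares fit of a full second-order polynomial. This is a generic illustration, not the paper's DCRSM; the sampled function and sample count are synthetic.

```python
import numpy as np

def fit_quadratic_rs(X, y):
    """Fit y ~ a0 + sum_i b_i x_i + sum_{i<=j} c_ij x_i x_j by least squares,
    returning a callable surrogate (the generic form of a quadratic response surface)."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surface(x):
        x = np.asarray(x, dtype=float)
        feats = [1.0] + list(x) + [x[i] * x[j] for i in range(d) for j in range(i, d)]
        return float(np.dot(coef, feats))

    return surface

# Recover a known quadratic from hypothetical samples
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = 2.0 + X[:, 0] - 3.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
model = fit_quadratic_rs(X, y)
```

Once fitted, the cheap surrogate replaces the expensive simulation inside the Monte Carlo loop, which is where the efficiency gain reported above comes from.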
Analyzing the dynamics of cell cycle processes from fixed samples through ergodic principles
Wheeler, Richard John
2015-01-01
Tools to analyze cyclical cellular processes, particularly the cell cycle, are of broad value for cell biology. Cell cycle synchronization and live-cell time-lapse observation are widely used to analyze these processes but are not available for many systems. Simple mathematical methods built on the ergodic principle are a well-established, widely applicable, and powerful alternative analysis approach, although they are less widely used. These methods extract data about the dynamics of a cyclical process from a single time-point “snapshot” of a population of cells progressing through the cycle asynchronously. Here, I demonstrate application of these simple mathematical methods to analysis of basic cyclical processes—cycles including a division event, cell populations undergoing unicellular aging, and cell cycles with multiple fission (schizogony)—as well as recent advances that allow detailed mapping of the cell cycle from continuously changing properties of the cell such as size and DNA content. This includes examples using existing data from mammalian, yeast, and unicellular eukaryotic parasite cell biology. Through the ongoing advances in high-throughput cell analysis by light microscopy, electron microscopy, and flow cytometry, these mathematical methods are becoming ever more important and are a powerful complementary method to traditional synchronization and time-lapse cell cycle analysis methods. PMID:26543196
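One classic ergodic mapping from a snapshot to dynamics: in an asynchronous, exponentially growing population, a phase that ends at division and contains a fraction f of the cells has duration t = T·log2(1 + f), where T is the doubling time. A minimal sketch of this standard relation (not the article's more detailed mappings):

```python
import math

def phase_duration(fraction, doubling_time):
    """Ergodic estimate of the duration of a terminal cell-cycle phase from
    the fraction of an asynchronous exponential population observed in it.
    Follows from the age distribution p(a) ~ 2^(-a/T): f = 2^(t/T) - 1."""
    return doubling_time * math.log2(1.0 + fraction)

# Sanity checks: the whole cycle (f = 1) has duration T; an empty phase has 0
full = phase_duration(1.0, 10.0)
empty = phase_duration(0.0, 24.0)
```

Note the younger-cell bias: a phase holding half the cells lasts less than half the cycle, because exponential growth over-represents recently divided cells.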
Fahimirad, Bahareh; Asghari, Alireza; Rajabi, Maryam
2017-05-15
In this work, the lanthanum oxide-aluminum oxide (La2O3-Al2O3) nanocomposite is introduced as an efficient photocatalyst for the photo-degradation of the dyes basic green 1 (BG1) and basic red 46 (BR46) in their binary aqueous solution under UV light irradiation. The properties of this catalyst are determined by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), Brunauer-Emmett-Teller (BET) analysis, and UV-visible spectrophotometry. The first-order derivative spectra are used for the simultaneous analysis of the dyes in their binary solution. The screening investigations indicate that five parameters, including the catalyst dosage, the concentrations of the dyes, the irradiation time, and the solution pH, have significant effects on the photo-degradation of the dyes. The effects of these variables, together with their interactions, on the photo-degradation of the dyes are studied using the Box-Behnken design (BBD). Under the optimum experimental conditions, obtained via the desirability function, the photo-catalytic activities of La2O3-Al2O3 and pure Al2O3 are also investigated. The results show an enhancement in the photo-catalytic activity when La2O3 nanoparticles are loaded on the surface of Al2O3 nanoparticles. Copyright © 2017 Elsevier B.V. All rights reserved.
The Relationship between Expertise in Sports, Visuospatial, and Basic Cognitive Skills
Heppe, Holger; Kohler, Axel; Fleddermann, Marie-Therese; Zentgraf, Karen
2016-01-01
Team sports place high demands on visuospatial and other cognitive skills. However, there is a lack of research on visuospatial skills of elite athletes and there are heterogeneous results on basic cognitive skills of this population. Therefore, this series of studies tested different cognitive skills in elite team sports athletes. In Experiment 1, elite athletes were compared to recreational athletes, but no differences were observed between the groups in choice response time (CRT) and mental rotation (MR). To see if differences could be observed when the tested groups had a greater difference in expertise and more representative stimuli, in Experiment 2, we tested CRT and MR of elite athletes who had higher level of expertise, and we also used three-dimensional human stimuli. Overall, we still found no differences in MR; however, elite athletes did have shorter CRTs. In Experiment 3, instead of testing MR, we compared elite athletes’ and recreational athletes’ basic cognitive skills, such as processing speed, letter readout speed, memory span, and sustained attention. We found that elite athletes only performed better in sustained attention. Building on this data, in a supplementary analysis (Experiment 4) we tested whether MR and CRTs are correlated with basic cognitive skills. Results show that processing speed is the best predictor for MR, whereas letter readout speed explains most of the variance in CRTs. Finally, we discuss these findings against the backdrop of expertise and offer implications for future studies on mental rotation. PMID:27378994
Study of basic computer competence among public health nurses in Taiwan.
Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling
2004-03-01
Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; possible score range, 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
Jackson, James; Dixon, Mark R
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078
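The recording logic described (frequency counts versus interval scoring) can be sketched as follows. This Python sketch illustrates the general idea rather than the authors' Visual Basic program; the class name and the 10-second interval length are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FASession:
    """Toy behavior recorder: frequency counts or partial-interval scoring."""
    mode: str = "frequency"            # "frequency" or "interval"
    interval_s: int = 10               # assumed interval length, seconds
    events: list = field(default_factory=list)

    def record(self, t):
        """Log one occurrence at t seconds into the session."""
        self.events.append(t)

    def summarize(self, session_len):
        if self.mode == "frequency":
            return len(self.events)    # simple count of occurrences
        # partial-interval scoring: an interval counts if any event fell in it
        n = session_len // self.interval_s
        scored = {int(t // self.interval_s) for t in self.events if t < session_len}
        return len(scored) / n         # proportion of intervals with behavior

session = FASession(mode="interval")
for t in (3, 5, 47):
    session.record(t)
score = session.summarize(60)          # 2 of 6 intervals contained behavior
```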
Mathematical modeling of transmission co-infection tuberculosis in HIV community
NASA Astrophysics Data System (ADS)
Lusiana, V.; Putra, P. S.; Nuraini, N.; Soewono, E.
2017-03-01
TB and HIV infection both deeply assault the immune system, since each can weaken the host immune response through mechanisms that are not yet fully understood. HIV co-infection is the strongest risk factor for progression from M. tuberculosis infection to active TB disease in HIV-positive individuals, while TB in turn accelerates the progression of HIV infection. In this paper we build a model of TB co-infection transmission in an HIV community, a dynamic system with ten compartments. The dynamic analysis covers the disease-free equilibrium, the endemic equilibrium, the basic reproduction ratio, stability analysis, and numerical simulation. The basic reproduction ratio is obtained as the spectral radius of the next-generation matrix of the model. Numerical simulations are built to justify the results of the analysis and to examine the changes in the population dynamics of each compartment. The sensitivity analysis indicates that the parameters affecting the population dynamics of TB in people with HIV infection are the rate of progression of individuals from the exposed TB class to active TB, the treatment rate of exposed TB individuals, the treatment rate of infectious (active TB) individuals, and the probability of transmission of TB infection from an infective to a susceptible per contact per unit time. We conclude that a growing number of infections caused by infectious TB in people with HIV can increase the spread of the disease and worsen endemic conditions.
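The spectral-radius computation mentioned above follows the standard next-generation-matrix recipe, R0 = ρ(F V⁻¹). The two-compartment matrices below are a generic SEI-style illustration with hypothetical rates, not the paper's ten-compartment model.

```python
import numpy as np

def basic_reproduction_ratio(F, V):
    """R0 as the spectral radius of the next-generation matrix F V^-1.
    F holds new-infection rates into the infected compartments;
    V holds transition/removal rates between them."""
    K = F @ np.linalg.inv(V)
    return max(abs(np.linalg.eigvals(K)))

# Toy exposed/infectious pair: infection rate beta = 0.4 into the exposed class,
# progression sigma = 0.2, recovery gamma = 0.1 (all values hypothetical)
F = np.array([[0.0, 0.4],
              [0.0, 0.0]])
V = np.array([[0.2, 0.0],
              [-0.2, 0.1]])
R0 = basic_reproduction_ratio(F, V)   # reduces to beta/gamma since all exposed progress
```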
Sebastian, Manu; Gantz, Marie G; Tobin, Thomas; Harkins, J Daniel; Bosken, Jeffrey M; Hughes, Charlie; Harrison, Lenn R; Bernard, William V; Richter, Dana L; Fitzgerald, Terrence D
2003-01-01
During 2001, central Kentucky experienced acute transient epidemics of early and late fetal losses, pericarditis, and unilateral endophthalmitis, collectively referred to as mare reproductive loss syndrome (MRLS). A toxicokinetic/statistical analysis of experimental and field MRLS data was conducted using accelerated failure time (AFT) analysis of abortions following administration of Eastern tent caterpillars (ETCs; 100 or 50 g/day or 100 g of irradiated caterpillars/day) to late-term pregnant mares. In addition, 2001 late-term fetal loss field data were used in the analysis. Experimental data were fitted by AFT analysis at a high (P <.0001) significance. Times to first abortion ("lag time") and abortion rates were dose dependent. Lag times decreased and abortion rates increased exponentially with dose. Calculated dose x response data curves allow interpretation of abortion data in terms of "intubated ETC equivalents." Analysis suggested that field exposure to ETCs in 2001 in central Kentucky commenced on approximately April 27, was initially equivalent to approximately 5 g of intubated ETCs/day, and increased to approximately 30 g/day at the outbreak peak. This analysis accounts for many aspects of the epidemiology, clinical presentations, and manifestations of MRLS. It allows quantitative interpretation of experimental and field MRLS data and has implications for the basic mechanisms underlying MRLS. The results support suggestions that MRLS is caused by exposure to or ingestion of ETCs. The results also show that high levels of ETC exposure produce intense, focused outbreaks of MRLS, closely linked in time and place to dispersing ETCs, as occurred in central Kentucky in 2001. With less intense exposure, lag time is longer and abortions tend to spread out over time and may occur out of phase with ETC exposure, obscuring both diagnosis of this syndrome and the role of the caterpillars.
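The dose dependence reported above (lag times shortening exponentially with ETC dose) fits the accelerated failure time picture of log event times linear in covariates. Below is a minimal uncensored log-linear sketch with synthetic numbers; the study's AFT analysis additionally handles censoring and distributional assumptions.

```python
import numpy as np

def fit_aft(dose, times):
    """Uncensored log-linear AFT sketch: log t = b0 + b1 * dose.
    A negative b1 means higher dose accelerates time-to-event."""
    b1, b0 = np.polyfit(dose, np.log(times), 1)
    return b0, b1

dose = np.array([10.0, 50.0, 100.0])      # hypothetical g ETC/day
lag = np.exp(2.0 - 0.02 * dose)           # synthetic dose-dependent lag times (days)
b0, b1 = fit_aft(dose, lag)
```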
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may help crop forecast, basic concepts of spectral signatures of vegetation, the methodology of the LANDSAT data utilization in agriculture, and the remote sensing program application of INPE (Institute for Space Research) in agriculture.
Sequential Testing: Basics and Benefits
1978-03-01
TARADCOM Technical Report No. 12325, Sequential Testing: Basics and Benefits. Contents: I. Introduction and Summary; II. Sequential Analysis; III. Mathematics of Sequential Testing; IV. ... The added benefit of reduced energy needs is inherent in this testing method. The text was originally released by the authors in 1972.
ERIC Educational Resources Information Center
Stallings, Jane A.
The Development and Demonstration project has trained interns to lead Effective Use of Time inservice workshops for secondary school teachers of basic reading and mathematical skills. These interns then returned to their home bases and trained teachers who could in turn train other teachers to use the Stallings Effective Use of Time methods. The…
[Analysis of variance of repeated data measured by water maze with SPSS].
Qiu, Hong; Jin, Guo-qin; Jin, Ru-feng; Zhao, Wei-kang
2007-01-01
To introduce a method for analyzing repeated data measured by water maze with SPSS 11.0, and to offer a reference statistical method for clinical and basic medicine researchers who use repeated-measures designs. The repeated-measures and multivariate analysis of variance (ANOVA) procedures of the general linear model in SPSS were used, with pairwise comparisons among groups and among measurement times. First, Mauchly's test of sphericity should be used to judge whether the repeatedly measured data are correlated. If any (P
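The within-subjects F statistic that SPSS reports for such designs can also be computed directly. A sketch for a single within-subjects factor (subjects × time points), with made-up data; it omits the sphericity corrections (Greenhouse-Geisser, Huynh-Feldt) that SPSS applies when Mauchly's test fails.

```python
import numpy as np

def rm_anova(data):
    """One-way repeated-measures ANOVA on an n-subjects x k-timepoints array.
    Partitions total SS into time, subject, and error components."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_time = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_error = ((data - grand) ** 2).sum() - ss_time - ss_subj
    df_time, df_error = k - 1, (k - 1) * (n - 1)
    F = (ss_time / df_time) / (ss_error / df_error)
    return F, df_time, df_error

# Three subjects measured at three time points (illustrative scores)
F, df1, df2 = rm_anova([[1, 2, 4], [2, 3, 3], [3, 5, 4]])
```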
Secondary Students' Understanding of Basic Ideas of Special Relativity
NASA Astrophysics Data System (ADS)
Dimitriadi, Kyriaki; Halkia, Krystallia
2012-11-01
A major topic that has marked 'modern physics' is the theory of special relativity (TSR). The present work focuses on the possibility of teaching the basic ideas of the TSR to students at the upper secondary level in such a way that they are able to understand and learn the ideas. Its aim is to investigate students' learning processes towards the two axioms of the theory (the principle of relativity and the invariance of the speed of light) and their consequences (the relativity of simultaneity, time dilation and length contraction). Based on an analysis of physics college textbooks, on a review of the relevant bibliography and on a pilot study, a teaching and learning sequence consisting of five sessions was developed. To collect the data, experimental interviews (the so-called teaching experiment) were used. The teaching experiment may be viewed as a Piagetian clinical interview that is deliberately employed as a teaching and learning situation. The sample consisted of 40 10th grade students (aged 15-16). The data were collected by taping and transcribing the 'interviews', as well as from two open-ended questionnaires filled out by each student, one before and the other after the sessions. Methods of qualitative content analysis were applied. The results show that upper secondary education students are able to cope with the basic ideas of the TSR, but there are some difficulties caused by the following student conceptions: (a) there is an absolute frame of reference, (b) objects have fixed properties and (c) the way events happen is independent of what the observers perceive.
Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications
Lourenço, Célia; Turner, Claire
2014-01-01
Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. Analysis of the concentrations of volatile organic compounds (VOCs) in breath with an acceptable accuracy is assessed by using analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037
Description of MSFC engineering photographic analysis
NASA Technical Reports Server (NTRS)
Earle, Jim; Williams, Frank
1988-01-01
Utilizing a background that includes development of basic launch and test photographic coverage and analysis procedures, the MSFC Photographic Evaluation Group has built a body of experience that enables it to satisfy MSFC's engineering photographic analysis needs effectively. By combining the reliable, proven techniques of the past with newer technical advances in computers and computer-related devices, the MSFC Photo Evaluation Group is positioned to continue providing photo and video analysis services center-wide and NASA-wide, to supply an improving photo analysis product for the photo evaluation needs of the future, and to set new standards in the state of the art of photo analysis of dynamic events.
NASA Technical Reports Server (NTRS)
Crosson, William L.; Smith, Eric A.
1992-01-01
The behavior of in situ measurements of surface fluxes obtained during FIFE 1987 is examined by using correlative and spectral techniques in order to assess the significance of fluctuations on various time scales, from subdiurnal up to synoptic, intraseasonal, and annual scales. The objectives of this analysis are: (1) to determine which temporal scales have a significant impact on areal averaged fluxes and (2) to design a procedure for filtering an extended flux time series that preserves the basic diurnal features and longer time scales while removing high frequency noise that cannot be attributed to site-induced variation. These objectives are accomplished through the use of a two-dimensional cross-time Fourier transform, which serves to separate processes inherently related to diurnal and subdiurnal variability from those which impact flux variations on the longer time scales. A filtering procedure is desirable before the measurements are utilized as input with an experimental biosphere model, to insure that model based intercomparisons at multiple sites are uncontaminated by input variance not related to true site behavior. Analysis of the spectral decomposition indicates that subdiurnal time scales having periods shorter than 6 hours have little site-to-site consistency and therefore little impact on areal integrated fluxes.
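The filtering goal described above (keep diurnal and longer scales, remove sub-6-hour noise) can be illustrated with a simple one-dimensional Fourier low-pass. This is a hedged sketch on synthetic data, not the authors' two-dimensional cross-time transform:

```python
import numpy as np

def filter_high_freq(series, dt_hours, cutoff_period_hours=6.0):
    """Zero out Fourier components with periods shorter than the cutoff,
    preserving the basic diurnal cycle and longer time scales."""
    n = len(series)
    spec = np.fft.rfft(series)
    freqs = np.fft.rfftfreq(n, d=dt_hours)            # cycles per hour
    spec[freqs > 1.0 / cutoff_period_hours] = 0.0     # drop high frequencies
    return np.fft.irfft(spec, n)

# Synthetic flux series: diurnal cycle plus a 2-hour oscillation,
# sampled half-hourly over ten days
t = np.arange(0, 24 * 10, 0.5)                        # hours
diurnal = 200 * np.sin(2 * np.pi * t / 24)
noise = 30 * np.sin(2 * np.pi * t / 2)                # sub-6-hour "noise"
smoothed = filter_high_freq(diurnal + noise, dt_hours=0.5)
```

After filtering, `smoothed` retains the diurnal component while the 2-hour oscillation is removed, mirroring the paper's finding that periods shorter than 6 hours contribute little site-consistent signal.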
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and electroencephalography (EEG) are noninvasive techniques for studying the electrophysiological activity of the human brain, and are thus well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows stimuli to be adjusted to the subject's responses, optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis application based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that addresses the clinical software requirements expected by certification authorities, while remaining extendable and freely accessible. We conclude that MNE Scan is a first step in creating device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.
Ryu, Stephen I.; Shenoy, Krishna V.; Cunningham, John P.; Churchland, Mark M.
2016-01-01
Cortical firing rates frequently display elaborate and heterogeneous temporal structure. One often wishes to compute quantitative summaries of such structure—a basic example is the frequency spectrum—and compare with model-based predictions. The advent of large-scale population recordings affords the opportunity to do so in new ways, with the hope of distinguishing between potential explanations for why responses vary with time. We introduce a method that assesses a basic but previously unexplored form of population-level structure: when data contain responses across multiple neurons, conditions, and times, they are naturally expressed as a third-order tensor. We examined tensor structure for multiple datasets from primary visual cortex (V1) and primary motor cortex (M1). All V1 datasets were ‘simplest’ (there were relatively few degrees of freedom) along the neuron mode, while all M1 datasets were simplest along the condition mode. These differences could not be inferred from surface-level response features. Formal considerations suggest why tensor structure might differ across modes. For idealized linear models, structure is simplest across the neuron mode when responses reflect external variables, and simplest across the condition mode when responses reflect population dynamics. This same pattern was present for existing models that seek to explain motor cortex responses. Critically, only dynamical models displayed tensor structure that agreed with the empirical M1 data. These results illustrate that tensor structure is a basic feature of the data. For M1 the tensor structure was compatible with only a subset of existing models. PMID:27814353
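The idea of a dataset being "simplest" along one mode of a third-order tensor can be illustrated by comparing the effective ranks of its mode unfoldings. A minimal numpy sketch on synthetic data (hypothetical sizes and latent structure, not the V1/M1 datasets):

```python
import numpy as np

def mode_unfold(tensor, mode):
    """Matricize a third-order tensor (neurons x conditions x times)
    along one mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_rank(tensor, mode, tol=1e-8):
    """Effective rank of the mode-n unfolding: fewer significant singular
    values means the data have fewer degrees of freedom along that mode."""
    s = np.linalg.svd(mode_unfold(tensor, mode), compute_uv=False)
    return int((s > tol * s[0]).sum())

# Toy data: 20 neurons, 8 conditions, 50 times, built from 3 latent patterns
rng = np.random.default_rng(1)
weights = rng.normal(size=(20, 3))               # neuron loadings
latents = rng.normal(size=(3, 8, 50))            # condition-by-time patterns
data = np.tensordot(weights, latents, axes=1)    # (20, 8, 50) tensor
print(mode_rank(data, 0), mode_rank(data, 1))
```

Because every neuron's response here is a mixture of only three latent patterns, the tensor is "simplest" along the neuron mode (rank 3), while the condition mode retains its full generic rank, analogous to the V1-versus-M1 contrast reported above.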
ERIC Educational Resources Information Center
Suveren-Erdogan, Ceren; Suveren, Sibel
2018-01-01
The aim of this study is to include basic posture exercises in the basic training of visually impaired individuals as a step toward learning more difficult movements, to guide instructors in making efficient progress in a short time, and to help a greater number of disabled individuals benefit from these studies. Method: 15…
Chopperla, Ramakrishna; Singh, Sonam; Mohanty, Sasmita; Reddy, Nanja; Padaria, Jasdeep C; Solanke, Amolkumar U
2017-10-01
Basic leucine zipper (bZIP) transcription factors comprise one of the largest gene families in plants. They play a key role in almost every aspect of plant growth and development and also in biotic and abiotic stress tolerance. In this study, we report isolation and characterization of EcbZIP17, a group B bZIP transcription factor from a climate smart cereal, finger millet (Eleusine coracana L.). The genomic sequence of EcbZIP17 is 2662 bp long encompassing two exons and one intron with ORF of 1722 bp and peptide length of 573 aa. This gene is homologous to AtbZIP17 (Arabidopsis), ZmbZIP17 (maize) and OsbZIP60 (rice), which play a key role in the endoplasmic reticulum (ER) stress pathway. In silico analysis confirmed the presence of basic leucine zipper (bZIP) and transmembrane (TM) domains in the EcbZIP17 protein. Allele mining of this gene in 16 different genotypes by Sanger sequencing revealed no variation in nucleotide sequence, including the 618 bp long intron. Expression analysis of EcbZIP17 under heat stress exhibited a similar pattern of expression in all the genotypes across time intervals, with highest upregulation after 4 h. The present study established the conserved nature of EcbZIP17 at the nucleotide and expression level.
Musile, Giacomo; Cenci, Lucia; Piletska, Elena; Gottardo, Rossella; Bossi, Alessandra M; Bortolotti, Federica
2018-07-27
The aim of the present work was to develop a novel in-house mixed-mode SPE sorbent to be used for the HPLC-Ion TrapMS determination of 16 basic drugs in urine. Using computational modelling, a virtual monomer library was screened, identifying three suitable functional monomers: methacrylic acid (MAA), itaconic acid (IA) and 2-acrylamide-2-methylpropane sulfonic acid (AMPSA). Three different sorbents were then synthesized based on these monomers, using trimethylolpropane trimethacrylate (TMPTMA) as cross-linker. The sorbent characterization analyses led to the selection of the AMPSA-based phase. Using this novel in-house sorbent, a SPE-HPLC-Ion TrapMS method for drug analysis in urine was validated, proving to be selective and accurate and showing a sensitivity adequate for toxicological urine analysis. The comparison of the in-house mixed-mode SPE sorbent with two analogous commercial mixed-mode SPE phases showed that the in-house sorbent was better not only in terms of process efficiency but also in terms of quality-price ratio. To the best of our knowledge, this is the first time that an in-house SPE procedure has been applied to the toxicological analysis of a complex matrix such as urine. Copyright © 2018 Elsevier B.V. All rights reserved.
Design of point-of-care (POC) microfluidic medical diagnostic devices
NASA Astrophysics Data System (ADS)
Leary, James F.
2018-02-01
Design of inexpensive, portable, hand-held microfluidic flow/image cytometry devices for initial medical diagnostics at the point of first patient contact by emergency medical personnel in the field requires careful attention to power and weight budgets to allow realistic portability as a hand-held, point-of-care medical diagnostics device. True portability also requires small micro-pumps for high-throughput capability. Weight/power requirements dictate use of super-bright LEDs and very small silicon photodiodes or nanophotonic sensors that can be powered by batteries. Signal-to-noise characteristics can be greatly improved by appropriately pulsing the LED excitation sources and sampling and subtracting noise in between excitation pulses. The requirements for basic computing, imaging, GPS and basic telecommunications can be simultaneously met by use of smartphone technologies, which become part of the overall device. Software for a user-interface system, limited real-time computing, real-time imaging, and offline data analysis can be accomplished through multi-platform software development systems that are well suited to a variety of currently available cellphone technologies, which already contain all of these capabilities. Microfluidic cytometry requires judicious use of small sample volumes and appropriate statistical sampling by microfluidic cytometry or imaging for adequate statistical significance to permit real-time (typically < 15 minutes) medical decisions for patients at the physician's office or real-time decision making in the field. One or two drops of blood obtained by pin-prick should be able to provide statistically meaningful results for use in making real-time medical decisions without the need for blood fractionation, which is not realistic in the field.
Tri-state oriented parallel processing system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenenbaum, J.; Wallach, Y.
1982-08-01
An alternating sequential/parallel system, MOPPS, was introduced a few years ago; although it satisfactorily solved a number of real-time problems, it has since been modified. The new system, TOPPS, is described and compared to MOPPS, and two applications are chosen to prove it superior. The advantage of having a third basic mode, the ring mode, is illustrated when solving sets of linear equations with band matrices. The advantage of having independent I/O for the slaves is illustrated for biomedical signal analysis. 11 references.
A new thermally immobilized fluorinated stationary phase for RP-HPLC.
Maldaner, Liane; Jardim, Isabel C S F
2010-02-01
A new fluorinated stationary phase was prepared through thermal immobilization of poly(methyl-3,3,3-trifluoropropylsiloxane) onto 5 microm Kromasil silica particles. The best conditions of immobilization time and temperature were determined through a central composite design and response surface methodologies. Physical-chemical characterization using solid-state (29)Si NMR measurements, infrared spectroscopy and elemental analysis showed that the immobilization process was effective to promote a coating of the support that corresponds to a monolayer of polymer. The stationary phase presents selectivity for positional isomers and good peak shape for basic compounds.
Scaling phenomena in fatigue and fracture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barenblatt, G.I.
2004-12-01
The general classification of scaling laws will be presented and the basic concepts of modern similarity analysis--intermediate asymptotics, complete and incomplete similarity--will be introduced and discussed. Examples of scaling laws corresponding to complete similarity will be given. The Paris scaling law in fatigue will be discussed as an instructive example of incomplete similarity. It will be emphasized that in the Paris law the powers are not material constants. Therefore, the evaluation of the lifetime of structures using data obtained from standard fatigue tests requires some precautions.
Study of ballistic mode comet Encke mission opportunities
NASA Technical Reports Server (NTRS)
Hollenbeck, G. R.; Vanpelt, J. M.
1974-01-01
An analysis was conducted of the space mission to intercept the comet Encke. The two basic types of flight geometry considered for the mission are described. The primary interactions between time-of-flight and performance characteristics are displayed. The representative spacecraft characteristics for the Titan 3/Centaur launch vehicle are tabulated. The navigation analyses for the two missions are developed to show: (1) assessment of the navigation feasibility of the missions, (2) determination of the total velocity budget for the trim maneuvers, and (3) evaluation of dispersions at comet encounter.
1993-07-09
real-time simulation capabilities, highly non-linear control devices, work space path planning, active control of machine flexibilities and reliability... P.M., "The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement," Journal of Experimental Psychology, Vol. 47, No. ... driven many research groups to the challenging problem of flexible systems with an increasing interaction with finite element methodologies. Basic
1989-01-01
Compressor Rear Frame (CRF), which exhibits extensive cracking of the forward flange. The 1988 Actuarial Function data shows CRF cracking as the number 2... Creep-Rupture Properties of Waspaloy Sheet to Sharp-Edged Notches in the Temperature Range of 1000°F-1400°F, Journal of Basic Engineering, Trans. ASME... Dependence of the Notch Sensitivity of Waspaloy at 1000°F-1400°F on the Gamma Prime Phase, Journal of Basic Engineering, Trans. ASME (in print at time of
Alternative Fuels Data Center: Hydrogen Basics
Hydrogen (H2) is an alternative fuel that can be produced from diverse domestic resources. Its appeal includes the potential for domestic production, its fast filling time, and the fuel cell's high efficiency.
Exploring the role of auditory analysis in atypical compared to typical language development.
Grube, Manon; Cooper, Freya E; Kumar, Sukhbinder; Kelly, Tom; Griffiths, Timothy D
2014-02-01
The relationship between auditory processing and language skills has been debated for decades. Previous findings have been inconsistent, both in typically developing and impaired subjects, including those with dyslexia or specific language impairment. Whether correlations between auditory and language skills are consistent between different populations has hardly been addressed at all. The present work presents an exploratory approach of testing for patterns of correlations in a range of measures of auditory processing. In a recent study, we reported findings from a large cohort of eleven-year-olds on a range of auditory measures, and the data supported a specific role for the processing of short sequences in pitch and time in typical language development. Here we tested whether a group of individuals with dyslexic traits (DT group; n = 28) from the same year group would show the same pattern of correlations between auditory and language skills as the typically developing group (TD group; n = 173). Regarding the raw scores, the DT group showed a significantly poorer performance on the language but not the auditory measures, including measures of pitch, time and rhythm, and timbre (modulation). In terms of correlations, there was a tendency toward decreased correlations between short-sequence processing and language skills, contrasted by a significant increase in correlation for basic, single-sound processing, in particular in the domain of modulation. The data support the notion that the fundamental relationship between auditory and language skills might differ in atypical compared to typical language development, with the implication that merging data or drawing inference between populations might be problematic. Further examination of the relationship between both basic sound feature analysis and music-like sound analysis and language skills in impaired populations might allow the development of appropriate training strategies.
These might include types of musical training to augment language skills via their common bases in sound sequence analysis. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
Rivera, Ana Leonor; Toledo-Roy, Juan C.; Ellis, Jason; Angelova, Maia
2017-01-01
Circadian rhythms become less dominant and less regular with chronic-degenerative disease, such that to accurately assess these pathological conditions it is important to quantify not only periodic characteristics but also more irregular aspects of the corresponding time series. Novel data-adaptive techniques, such as singular spectrum analysis (SSA), allow for the decomposition of experimental time series, in a model-free way, into a trend, quasiperiodic components and noise fluctuations. We compared SSA with the traditional techniques of cosinor analysis and intradaily variability using 1-week continuous actigraphy data in young adults with acute insomnia and healthy age-matched controls. The findings suggest a small but significant delay in circadian components in the subjects with acute insomnia, i.e. a larger acrophase, and alterations in the day-to-day variability of acrophase and amplitude. The power of the ultradian components follows a fractal 1/f power law for controls, whereas for those with acute insomnia this power law breaks down because of an increased variability at the 90-min time scale, reminiscent of Kleitman’s basic rest-activity (BRAC) cycles. This suggests that for healthy sleepers attention and activity can be sustained at whatever time scale is required by circumstances, whereas for those with acute insomnia this capacity may be impaired and these individuals need to rest or switch activities in order to stay focused. Traditional methods of circadian rhythm analysis are unable to detect the more subtle effects of day-to-day variability and ultradian rhythm fragmentation at the specific 90-min time scale. PMID:28753669
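The SSA decomposition described above (embedding into a trajectory matrix, SVD, and reconstruction of each component by diagonal averaging) can be sketched in a few lines. This is a generic textbook SSA on a synthetic series, not the authors' actigraphy pipeline:

```python
import numpy as np

def ssa_components(series, window, n_components):
    """Basic singular spectrum analysis: embed, decompose, reconstruct.
    Returns one reconstructed time series per retained eigentriple."""
    n = len(series)
    k = n - window + 1
    # Trajectory (Hankel) matrix of lagged copies of the series
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for r in range(n_components):
        Xr = s[r] * np.outer(U[:, r], Vt[r])     # rank-1 piece
        # Anti-diagonal averaging maps the matrix back to a time series
        comp = np.array([np.mean(Xr[::-1].diagonal(j - window + 1))
                         for j in range(n)])
        comps.append(comp)
    return comps

t = np.arange(200, dtype=float)
series = 0.01 * t + np.sin(2 * np.pi * t / 24)   # trend + daily rhythm
components = ssa_components(series, window=48, n_components=3)
```

The leading eigentriples capture the trend and the quasiperiodic rhythm; the remaining ones carry the irregular fluctuations whose structure the study quantifies.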
El-Awady, Mohamed; Belal, Fathalla; Pyell, Ute
2013-09-27
The analysis of hydrophobic basic analytes by micellar electrokinetic chromatography (MEKC) is usually challenging because of the tendency of these analytes to be adsorbed onto the inner capillary wall, in addition to the difficulty of separating these compounds as they exhibit extremely high retention factors. A robust and reliable method for the simultaneous determination of loratadine (LOR) and its major metabolite desloratadine (DSL) is developed based on cyclodextrin-modified micellar electrokinetic chromatography (CD-MEKC) with acidic sample matrix and basic background electrolyte (BGE). The influence of the sample matrix on the reachable focusing efficiency is studied. It is shown that the application of a low pH sample solution mitigates problems associated with the low solubility of the hydrophobic basic analytes in aqueous solution while having advantages with regard to on-line focusing. Moreover, the use of a basic BGE reduces the adsorption of these analytes in the separation compartment. The separation of the studied analytes is achieved in less than 7 min using a BGE consisting of 10 mmol L(-1) disodium tetraborate buffer, pH 9.30, containing 40 mmol L(-1) SDS and 20 mmol L(-1) hydroxypropyl-β-CD, while the sample solution is composed of 10 mmol L(-1) phosphoric acid, pH 2.15. A full validation study of the developed method based on the pharmacopeial guidelines is performed. The method is successfully applied to the analysis of the studied drugs in tablets without interference of tablet additives as well as the analysis of spiked human urine without any sample pretreatment. Furthermore, DSL can be detected as an impurity in LOR bulk powder at the stated pharmacopeial limit (0.1%, w/w). The selectivity of the developed method allows the analysis of LOR and DSL in combination with the co-formulated drug pseudoephedrine.
It is shown that in CD-MEKC with basic BGE, solute-wall interactions are effectively suppressed allowing the development of efficient and precise methods for the determination of hydrophobic basic analytes, whereas the use of a low pH sample solution has a positive impact on the attainable sweeping efficiency without compromising peak shape and resolution. Copyright © 2013 Elsevier B.V. All rights reserved.
Non-contact FBG sensing based steam turbine rotor dynamic balance vibration detection system
NASA Astrophysics Data System (ADS)
Li, Tianliang; Tan, Yuegang; Cai, Lin
2015-10-01
This paper proposes a non-contact vibration sensor based on fiber Bragg grating (FBG) sensing, applied to detect vibration on a steam turbine rotor dynamic balance experimental platform. The principle of the sensor is introduced, along with the experimental analysis; the performance of the non-contact FBG vibration sensor is analyzed in the experiment; in addition, a turbine rotor dynamic vibration detection system based on an eddy current displacement sensor and the non-contact FBG vibration sensor is built; finally, the signals from the two sensors are compared in both the time domain and the frequency domain. The comparative analysis of the experimental data shows that the vibration signal analysis of the non-contact FBG vibration sensor is basically the same as the result of the eddy current displacement sensor, verifying that the sensor can be used for non-contact measurement of steam turbine rotor dynamic balance vibration.
Automatic analysis of microscopic images of red blood cell aggregates
NASA Astrophysics Data System (ADS)
Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.
2015-06-01
Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions, such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive for adaptation as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).
Issues in Biomedical Research Data Management and Analysis: Needs and Barriers
Anderson, Nicholas R.; Lee, E. Sally; Brockenbrough, J. Scott; Minie, Mark E.; Fuller, Sherrilynne; Brinkley, James; Tarczy-Hornoch, Peter
2007-01-01
Objectives A. Identify the current state of data management needs of academic biomedical researchers. B. Explore their anticipated data management and analysis needs. C. Identify barriers to addressing those needs. Design A multimodal needs analysis was conducted using a combination of an online survey and in-depth one-on-one semi-structured interviews. Subjects were recruited via an e-mail list representing a wide range of academic biomedical researchers in the Pacific Northwest. Measurements The results from 286 survey respondents were used to provide triangulation of the qualitative analysis of data gathered from 15 semi-structured in-depth interviews. Results Three major themes were identified: 1) there continues to be widespread use of basic general-purpose applications for core data management; 2) there is broad perceived need for additional support in managing and analyzing large datasets; and 3) the barriers to acquiring currently available tools are most commonly related to financial burdens on small labs and unmet expectations of institutional support. Conclusion Themes identified in this study suggest that at least some common data management needs will best be served by improving access to basic level tools such that researchers can solve their own problems. Additionally, institutions and informaticians should focus on three components: 1) facilitate and encourage the use of modern data exchange models and standards, enabling researchers to leverage a common layer of interoperability and analysis; 2) improve the ability of researchers to maintain provenance of data and models as they evolve over time through tools and the leveraging of standards; and 3) develop and support information management service cores that could assist in these previous components while providing researchers with unique data analysis and information design support within a spectrum of informatics capabilities. PMID:17460139
[Quenched fluorescein: a reference dye for instrument response function of TCSPC].
Pan, Hai-feng; Ding, Jing-xin; Liang, Rong-rong; Tao, Zhan-dong; Liu, Meng-wei; Zhang, San-jun; Xu, Jian-hua
2014-08-01
Measuring the instrument response function (IRF) and fitting by reconvolution algorithms are routine ways to improve time resolution in fluorescence lifetime measurements. Iodide ions were successfully used to quench the fluorescence of fluorescein in this study. By systematically adding saturated NaI water solution to basic fluorescein solution, the lifetimes of fluorescein were reduced from 4 ns to 24 ps. The quenched lifetime of fluorescein obtained from the analysis of Time-Correlated Single Photon Counting (TCSPC) measurement agrees well with that from femtosecond frequency up-conversion measurement. In time-resolved excitation spectra measurements, the IRF should be measured at various detection wavelengths provided scattering materials are used. This study could not only reduce the complexity of IRF measurement but also avoid the color effect existing in the system. It should have wide applications in time-resolved fluorescence spectroscopy and fluorescence lifetime imaging.
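Reconvolution fitting pairs the measured IRF with a model decay and searches for the lifetime whose reconvolved model best matches the measured histogram. A minimal numpy sketch on synthetic data, using a simple grid search rather than the iterative nonlinear fitting used in practice (all numbers here are hypothetical):

```python
import numpy as np

def decay_model(t, tau, irf):
    """Exponential decay reconvolved with the measured IRF (discrete)."""
    decay = np.exp(-t / tau)
    model = np.convolve(irf, decay)[:len(t)]
    return model / model.max()

def fit_lifetime(t, signal, irf, taus):
    """Grid-search reconvolution fit: pick the lifetime whose reconvolved
    model minimizes the squared error against the measured histogram."""
    errors = [np.sum((decay_model(t, tau, irf) - signal) ** 2) for tau in taus]
    return taus[int(np.argmin(errors))]

# Synthetic TCSPC histogram: Gaussian IRF, 500 ps lifetime, 10 ps bins
t = np.arange(0, 5000.0, 10.0)                  # time axis in ps
irf = np.exp(-0.5 * ((t - 200) / 34.0) ** 2)    # ~80 ps FWHM IRF
signal = decay_model(t, 500.0, irf)             # "measured" decay curve
taus = np.arange(100.0, 1000.0, 10.0)           # candidate lifetimes
print(fit_lifetime(t, signal, irf, taus))
```

Because the model is convolved with the IRF before comparison, lifetimes well below the IRF width remain recoverable, which is what makes the 24 ps quenched-fluorescein standard useful for characterizing the instrument.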
Local feature saliency classifier for real-time intrusion monitoring
NASA Astrophysics Data System (ADS)
Buch, Norbert; Velastin, Sergio A.
2014-07-01
We propose a texture saliency classifier to detect people in a video frame by identifying salient texture regions. The image is classified into foreground and background in real time. No temporal image information is used during the classification. The system is used for the task of detecting people entering a sterile zone, which is a common scenario for visual surveillance. Testing is performed on the Imagery Library for Intelligent Detection Systems sterile zone benchmark dataset of the United Kingdom's Home Office. The basic classifier is extended by fusing its output with simple motion information, which significantly outperforms standard motion tracking. A lower detection time can be achieved by combining texture classification with Kalman filtering. The fusion approach running at 10 fps gives the highest result of F1=0.92 for the 24-h test dataset. The paper concludes with a detailed analysis of the computation time required for the different parts of the algorithm.
Lack of mutagens in deep-fat-fried foods obtained at the retail level.
Taylor, S L; Berg, C M; Shoptaugh, N H; Scott, V N
1982-04-01
The basic methylene chloride extract from 20 of 30 samples of foods fried in deep fat failed to elicit any mutagenic response that could be detected in the Salmonella typhimurium/mammalian microsome assay. The basic extracts of the remaining ten samples (all three chicken samples studied, two of the four potato-chip samples, one of four corn-chip samples, the sample of onion rings, two of six doughnuts, and one of three samples of french-fried potato) showed evidence of weak mutagenic activity. In these samples, amounts of the basic extract equivalent to 28.5-57 g of the original food sample were required to produce revertants at levels of 2.6-4.8 times the background level. Only two of the acidic methylene chloride extracts from the 30 samples exhibited mutagenic activity greater than 2.5 times the background reversion level, and in both cases (one corn-chip and one shrimp sample) the mutagenic response was quite weak. The basic extract of hamburgers fried in deep fat in a home-style fryer possessed higher levels of mutagenic activity (13 times the background reversion level). However, the mutagenic activity of deep-fried hamburgers is some four times lower than that of pan-fried hamburgers.
A content analysis of food advertising on Turkish television.
Akçil Ok, Mehtap; Ercan, Aydan; Kaya, Fatih Suleyman
2016-12-01
The aim of this study was to conduct a comprehensive content analysis of television (TV) food advertising and compare various food advertisements on free-to-air Turkish national TV channels by broadcast time (duration) and frequency over the period of a week (19-25 April 2012). TV food advertisements were the unit of content analysis in this study. Each advertisement identified as promoting a food product was analysed for content; non-food advertisements were not analysed, although they were counted as a proportion of the advertisements aired. We recorded all programmes for 4 h each per day (7 p.m.-11 p.m.), totalling 84 h. Five types of food-related advertisements were identified (basic foods, junk foods, meat products, beverages and fast food), along with six types of non-food advertisements. The Student t-test and ANOVA were used to compare the mean broadcast times of prime time advertising for the two groups. The mean broadcast times of food and non-food advertisements during prime time showed a statistically significant difference (p < 0.05). This difference is related to the prime time period 7 p.m.-8 p.m. being considered dinner time for most Turkish families. Additionally, the number and broadcast times of beverage advertisements increased during this time period, while the broadcast time per beverage advertisement decreased (20.8 s per advertisement). As a result, TV food advertising increased not only during dinner time but also in overall broadcast time (per advertisement). These findings may be useful for explaining how advertising can negatively influence food choices, thereby increasing public awareness of the need for health messages targeting obesity. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Preliminary basic performance analysis of the Cedar multiprocessor memory system
NASA Technical Reports Server (NTRS)
Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.
1991-01-01
Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.
Crystallization control for remediation of an FetO-rich CaO-SiO2-Al2O3-MgO EAF waste slag.
Jung, Sung Suk; Sohn, Il
2014-01-01
In this work, the crystallization behavior of synthesized FetO-rich electric arc furnace (EAF) waste slags with a basicity range of 0.7 to 1.08 was investigated. Crystal growth in the melts was observed in situ using a confocal laser scanning microscope, and delayed crystallization for higher-basicity samples was observed in the continuous cooling transformation and time temperature transformation diagrams. This result is likely due to the polymerization of the melt structure as a result of the increased number of network-forming FeO4 and AlO4 units, as suggested by Raman analysis. The complex incorporation of Al and Fe ions in the form of AlO4 and FeO4 tetrahedral units dominant in the melt structure at a higher basicity constrained the precipitation of a magnetic, nonstoichiometric, and Fe-rich MgAlFeO4 primary phase. The growth of this spinel phase during isothermal cooling at 1473 K caused a clear compositional separation between the primary and amorphous phases, allowing an efficient magnetic separation of Fe compounds from the slag for effective remediation and recycling of synthesized EAF waste slags for use in higher value-added ordinary Portland cement.
NASA Astrophysics Data System (ADS)
Miksovsky, J.; Raidl, A.
Time delay phase space reconstruction represents one of the useful tools of nonlinear time series analysis, enabling a number of applications. Its utilization requires the value of the time delay to be known, as well as the value of the embedding dimension. There are several methods for estimating both of these parameters. Typically, the time delay is computed first, followed by the embedding dimension. Our presented approach is slightly different: we reconstructed the phase space for various combinations of the mentioned parameters and used it for prediction by means of the nearest neighbours in the phase space. Then some measure of the prediction's success was computed (e.g., correlation or RMSE). The position of its global maximum (minimum) should indicate the suitable combination of time delay and embedding dimension. Several meteorological (particularly climatological) time series were used for the computations. We have also created an MS-Windows based program in order to implement this approach; its basic features will be presented as well.
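A minimal sketch of the procedure described above, assuming a plain Euclidean nearest-neighbour rule and RMSE as the success measure; the meteorological data is replaced by a synthetic sine series, and scanning (tau, m) for the smallest RMSE would then suggest a suitable parameter pair.

```python
import math

def embed(series, m, tau):
    """Delay-embed a scalar series into m-dimensional vectors."""
    n = len(series) - (m - 1) * tau
    return [tuple(series[i + j * tau] for j in range(m)) for i in range(n)]

def nn_prediction_rmse(series, m, tau):
    """One-step nearest-neighbour prediction error in the reconstructed space."""
    vecs = embed(series, m, tau)
    errs = []
    for i in range(1, len(vecs) - 1):
        # nearest neighbour of point i among all other points with a successor
        j = min((k for k in range(len(vecs) - 1) if k != i),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(vecs[k], vecs[i])))
        pred = vecs[j + 1][-1]          # neighbour's successor predicts the next value
        errs.append((pred - vecs[i + 1][-1]) ** 2)
    return math.sqrt(sum(errs) / len(errs))

# Noise-free test series; a good (tau, m) pair gives a near-zero prediction error.
data = [math.sin(0.3 * t) for t in range(200)]
print(round(nn_prediction_rmse(data, m=2, tau=5), 4))
```

Note that temporally adjacent points are not excluded from the neighbour search here; a production implementation would usually apply a Theiler window.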
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.
NASA Astrophysics Data System (ADS)
Kardas, Edyta; Brožova, Silvie; Pustějovská, Pavlína; Jursová, Simona
2017-12-01
In this paper an evaluation of the efficiency of machine use in a selected production company is presented. The OEE (Overall Equipment Effectiveness) method was used for the analysis. The selected company produces tapered roller bearings. The effectiveness analysis covered 17 automatic grinding lines working in the roller grinding department. The low level of machine efficiency was driven by problems with the availability of machines and devices. The causes of machine downtime on these lines were also analyzed. Three basic causes of downtime were identified: no kanban card, diamonding, and no operator. Ways to improve the use of these machines were suggested. The analysis takes into account actual results from the production process and covers a period of one calendar year.
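The OEE metric used above is the product of three rates. A minimal illustration follows; the shift figures are invented, since the paper's actual line data is not reproduced here.

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness = Availability x Performance x Quality."""
    return availability * performance * quality

planned_time = 480.0     # minutes in a shift (hypothetical)
downtime = 96.0          # minutes lost to no kanban card, diamonding, no operator
availability = (planned_time - downtime) / planned_time   # 0.80
performance = 0.90       # actual vs. ideal cycle rate (hypothetical)
quality = 0.95           # good parts / total parts (hypothetical)

print(f"OEE = {oee(availability, performance, quality):.3f}")   # OEE = 0.684
```

Because the three factors multiply, a modest loss in each (here 20%, 10% and 5%) compounds into an OEE well below any single rate, which is why availability problems of the kind identified above dominate the result.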
Free wake analysis of hover performance using a new influence coefficient method
NASA Technical Reports Server (NTRS)
Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho
1990-01-01
A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.
Monte Carlo investigation of thrust imbalance of solid rocket motor pairs
NASA Technical Reports Server (NTRS)
Sforzini, R. H.; Foster, W. A., Jr.
1976-01-01
The Monte Carlo method of statistical analysis is used to investigate the theoretical thrust imbalance of pairs of solid rocket motors (SRMs) firing in parallel. Sets of the significant variables are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs using a simplified, but comprehensive, model of the internal ballistics. The treatment of burning surface geometry allows for the variations in the ovality and alignment of the motor case and mandrel as well as those arising from differences in the basic size dimensions and propellant properties. The analysis is used to predict the thrust-time characteristics of 130 randomly selected pairs of Titan IIIC SRMs. A statistical comparison of the results with test data for 20 pairs shows the theory underpredicts the standard deviation in maximum thrust imbalance by 20% with variability in burning times matched within 2%. The range in thrust imbalance of Space Shuttle type SRM pairs is also estimated using applicable tolerances and variabilities and a correction factor based on the Titan IIIC analysis.
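The sampling step described above can be sketched as follows. The distributions, units and the one-line thrust model are hypothetical stand-ins for illustration only, not the comprehensive internal-ballistics model or the actual Titan IIIC tolerances.

```python
import random
import statistics

random.seed(1)  # reproducible sampling

def motor_thrust(burn_rate, geometry):
    """Grossly simplified thrust model: thrust ~ burn rate x geometry term."""
    return burn_rate * geometry

def sample_pair_imbalance():
    """Draw one randomly perturbed motor pair and return |thrust difference|."""
    thrusts = [motor_thrust(random.gauss(10.0, 0.15),   # burn rate (hypothetical)
                            random.gauss(5.0, 0.05))    # geometry term (hypothetical)
               for _ in range(2)]
    return abs(thrusts[0] - thrusts[1])

# 130 randomly selected pairs, matching the count used in the study
imbalances = [sample_pair_imbalance() for _ in range(130)]
print(f"mean |imbalance| = {statistics.mean(imbalances):.3f}  "
      f"sigma = {statistics.stdev(imbalances):.3f}")
```

The study's statistical comparison then amounts to comparing such sample statistics against the measured firings.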
Urban Land: Study of Surface Run-off Composition and Its Dynamics
NASA Astrophysics Data System (ADS)
Palagin, E. D.; Gridneva, M. A.; Bykova, P. G.
2017-11-01
The qualitative composition of urban land surface run-off is liable to significant variations. To study surface run-off dynamics, to examine its behaviour and to discover the reasons for these variations, it is appropriate to use the mathematical apparatus of time series analysis. A seasonal decomposition procedure, based on a multiplicative model, was applied to the time series of monthly dynamics with an annual cycle of seasonal variations. The results of the quantitative chemical analysis of surface wastewater of the 22nd Partsjezd outlet in Samara for the period 2004-2016 were used as basic data. As a result of the analysis, a seasonal pattern of variations in the composition of surface run-off in Samara was identified. Seasonal indices were defined for 15 wastewater quality indicators: BOD (full), suspended materials, mineralization, chlorides, sulphates, ammonium ion, nitrite anion, nitrate anion, phosphates (phosphorus), total iron, copper, zinc, aluminium, petroleum products, and synthetic (anion-active) surfactants. Based on the seasonal decomposition of the time series data, the contributions of the trend, seasonal and random components to the variability of the surface run-off indicators were estimated.
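A stdlib sketch of multiplicative seasonal decomposition of the kind applied above (ratio-to-centred-moving-average with an annual period of 12); the series below is synthetic, not the Samara wastewater record.

```python
import math
import statistics

def seasonal_indices(series, period=12):
    """Multiplicative seasonal indices via the ratio-to-moving-average method."""
    half = period // 2
    ratios = [[] for _ in range(period)]
    for t in range(half, len(series) - half):
        window = series[t - half:t + half + 1]
        # centred moving average for an even period: half-weight the end points
        cma = (sum(window) - 0.5 * (window[0] + window[-1])) / period
        ratios[t % period].append(series[t] / cma)
    idx = [statistics.mean(r) for r in ratios]
    norm = sum(idx) / period          # rescale so the indices average to 1
    return [v / norm for v in idx]

# Synthetic 4-year monthly series: linear trend times a sinusoidal annual pattern
season = [1.0 + 0.3 * math.sin(2.0 * math.pi * m / 12.0) for m in range(12)]
data = [(100.0 + t) * season[t % 12] for t in range(48)]
print([round(v, 2) for v in seasonal_indices(data)])
```

On this synthetic input the recovered indices closely match the sinusoidal pattern that generated the data, which is the property exploited when interpreting the indices as a seasonal signature.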
NASA Astrophysics Data System (ADS)
Korytárová, J.; Vaňková, L.
2017-10-01
This paper builds on the authors' previous research into the economic efficiency of transport infrastructure projects, evaluated by the economic efficiency ratios NPV, IRR and BCR. The values of these indicators and the subsequent outputs of the sensitivity analysis show extremely favourable values in some cases. The authors analysed these indicators down to the level of the input variables and examined which inputs have a larger share in these extreme values. The NCF for the calculation of the above-mentioned ratios is created by benefits that arise as the difference between the zero and investment options of the project (savings in travel and operating costs, savings in travel time costs, reduction in accident costs and savings in exogenous costs) as well as total agency costs. Savings in travel time costs, which contribute more than 70% of the overall utility of the projects, appear to be the most important benefit in the long-term horizon; this is why this benefit is emphasized. The outcome of the article shows how the particular basic variables contribute to the overall robustness of the economic efficiency of these projects.
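The three efficiency ratios named above can be illustrated with a small self-contained computation; the cash flows and discount rate below are invented, not taken from the evaluated projects.

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] occurring at year t (t = 0, 1, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

investment = -100.0                    # year-0 agency cost (hypothetical)
benefits = [30.0] * 5                  # yearly net benefits vs. the zero option
flows = [investment] + benefits
rate = 0.05                            # discount rate (hypothetical)

bcr = npv(rate, [0.0] + benefits) / -investment
print(f"NPV = {npv(rate, flows):.2f}, IRR = {irr(flows):.4f}, BCR = {bcr:.3f}")
```

A sensitivity analysis of the kind described above would perturb individual inputs (here, the benefit stream) and re-evaluate these three ratios to see which input drives the extreme values.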
Advanced data acquisition and display techniques for laser velocimetry
NASA Technical Reports Server (NTRS)
Kjelgaard, Scott O.; Weston, Robert P.
1991-01-01
The Basic Aerodynamics Research Tunnel (BART) has been equipped with state-of-the-art instrumentation for acquiring the data needed for code validation. This paper describes the three-component LDV and the workstation-based data-acquisition system (DAS) which has been developed for the BART. The DAS allows the use of automation and the quick integration of advanced instrumentation, while minimizing the software development time required between investigations. The paper also includes a description of a graphics software library developed to support the windowing environment of the DAS. The real-time displays generated using the graphics library help the researcher ensure the test is proceeding properly. The graphics library also supports the requirements of posttest data analysis. The use of the DAS and graphics libraries is illustrated by presenting examples of the real-time and postprocessing display graphics for LDV investigations.
Choodum, Aree; Parabun, Kaewalee; Klawach, Nantikan; Daeid, Niamh Nic; Kanatharana, Proespichaya; Wongniramaikul, Worawit
2014-02-01
The Simon presumptive color test was used in combination with the built-in digital camera on a mobile phone to detect methamphetamine. The real-time Red-Green-Blue (RGB) basic color data was obtained using an application installed on the mobile phone and the relationship profile between RGB intensity, including other calculated values, and the colourimetric product was investigated. A wide linear range (0.1-2.5 mg mL(-1)) and a low detection limit (0.0110±0.0001-0.044±0.002 mg mL(-1)) were achieved. The method also required a small sample size (20 μL). The results obtained from the analysis of illicit methamphetamine tablets were comparable to values obtained from gas chromatograph-flame ionization detector (GC-FID) analysis. Method validation indicated good intra- and inter-day precision (2.27-4.49%RSD and 2.65-5.62%RSD, respectively). The results suggest that this is a powerful real-time mobile method with the potential to be applied in field tests. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
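The calibration underlying the linear range above can be sketched as an ordinary least-squares fit of channel intensity against concentration, inverted for an unknown sample; the channel readings below are invented, not the paper's data.

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

conc = [0.1, 0.5, 1.0, 1.5, 2.0, 2.5]            # mg/mL, inside the linear range
blue = [24.0, 61.0, 108.0, 152.0, 199.0, 247.0]  # hypothetical channel readings

m, b = linfit(conc, blue)
unknown = 130.0                                   # reading for an unknown sample
print(f"estimated concentration = {(unknown - b) / m:.2f} mg/mL")
```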
Connected Text Reading and Differences in Text Reading Fluency in Adult Readers
Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke
2013-01-01
The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also in part due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers that differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177
Phylogenetic analysis reveals a scattered distribution of autumn colours
Archetti, Marco
2009-01-01
Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636
Medical University admission test: a confirmatory factor analysis of the results.
Luschin-Ebengreuth, Marion; Dimai, Hans P; Ithaler, Daniel; Neges, Heide M; Reibnegger, Gilbert
2016-05-01
The Graz Admission Test has been applied since the academic year 2006/2007. The validity of the Test was demonstrated by a significant improvement of study success and a significant reduction of dropout rate. The purpose of this study was a detailed analysis of the internal correlation structure of the various components of the Graz Admission Test. In particular, the question investigated was whether or not the various test parts constitute a suitable construct which might be designated as "Basic Knowledge in Natural Science." This study is an observational investigation, analyzing the results of the Graz Admission Test for the study of human medicine and dentistry. A total of 4741 applicants were included in the analysis. Principal component factor analysis (PCFA) as well as techniques from structural equation modeling, specifically confirmatory factor analysis (CFA), were employed to detect potential underlying latent variables governing the behavior of the measured variables. PCFA showed good clustering of the science test parts, including also text comprehension. A putative latent variable "Basic Knowledge in Natural Science," investigated by CFA, was indeed shown to govern the response behavior of the applicants in biology, chemistry, physics, and mathematics as well as text comprehension. The analysis of the correlation structure of the various test parts confirmed that the science test parts together with text comprehension constitute a satisfactory instrument for measuring a latent construct variable "Basic Knowledge in Natural Science." The present results suggest the fundamental importance of basic science knowledge for results obtained in the framework of the admission process for medical universities.
Population, Migration, and Arctic Community Change
NASA Astrophysics Data System (ADS)
Hamilton, L.; Wirsing, J.
2017-12-01
North American Arctic communities commonly show decadal trends in population growth, driven by natural increase but variably offset by net migration with year-to-year volatility. Migration rates themselves can be a social indicator, integrating a range of push and pull factors. Population and population change of Arctic communities are basic scale properties affecting the resources needed to achieve sustainability, and the adaptations that may be required for climate change (such as relocation from flood-threatened locations). We examine interannual changes 1990-2016 in population and net migration of 43 Alaska Arctic communities, some facing serious threats of flooding. Our Alaska analysis updates previous work with additional years of data. We also extend this demographic analysis for the first time to 25 towns and villages of Nunavut, Canada.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
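A minimal sketch of gate arithmetic with imprecise, interval-valued basic-event probabilities, in the spirit of (but much simpler than) the fuzzy/evidence treatment described above; the tree, the intervals and the event names are hypothetical, and independence is assumed rather than handled through a dependency coefficient.

```python
def and_gate(*events):
    """AND gate over independent events given as (lower, upper) probability bounds."""
    lo = hi = 1.0
    for l, h in events:
        lo *= l
        hi *= h
    return (lo, hi)

def or_gate(*events):
    """OR gate: 1 - product of complements, evaluated at each bound."""
    comp_lo = comp_hi = 1.0
    for l, h in events:
        comp_lo *= (1.0 - l)
        comp_hi *= (1.0 - h)
    return (1.0 - comp_lo, 1.0 - comp_hi)

# Imprecise basic-event probabilities as (lower, upper) intervals (hypothetical)
pump_fails = (0.01, 0.03)
valve_fails = (0.02, 0.05)
alarm_fails = (0.001, 0.004)

# Hypothetical top event: (pump OR valve fails) AND the alarm also fails
top_lo, top_hi = and_gate(or_gate(pump_fails, valve_fails), alarm_fails)
print(f"P(top) in [{top_lo:.2e}, {top_hi:.2e}]")
```

The interval on the top event propagates the partial ignorance in the inputs; fuzzy membership functions or Dempster-Shafer belief/plausibility pairs generalize the same idea.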
Learning basic surgical skills with mental imagery: using the simulation centre in the mind.
Sanders, Charles W; Sadoski, Mark; van Walsum, Kim; Bramson, Rachel; Wiprud, Robert; Fossum, Theresa W
2008-06-01
Although surgeons and athletes frequently use mental imagery in preparing to perform, mental imagery has not been extensively researched as a learning technique in medical education. A mental imagery rehearsal technique was experimentally compared with textbook study to determine the effects of each on the learning of basic surgical skills. Sixty-four Year 2 medical students were randomly assigned to 2 treatment groups in which they undertook either mental imagery or textbook study. Both groups received the usual skills course of didactic lectures, demonstrations, physical practice with pigs' feet and a live animal laboratory. One group received additional training in mental imagery and the other group was given textbook study. Performance was assessed at 3 different time-points using a reliable rating scale. Analysis of variance on student performance in live rabbit surgery revealed a significant interaction favouring the imagery group over the textbook study group. The mental imagery technique appeared to transfer learning from practice to actual surgery better than textbook study.
A random distribution reacting mixing layer model
NASA Technical Reports Server (NTRS)
Jones, Richard A.; Marek, C. John; Myrabo, Leik N.; Nagamatsu, Henry T.
1994-01-01
A methodology for simulation of molecular mixing, and the resulting velocity and temperature fields has been developed. The ideas are applied to the flow conditions present in the NASA Lewis Research Center Planar Reacting Shear Layer (PRSL) facility, and results compared to experimental data. A Gaussian transverse turbulent velocity distribution is used in conjunction with a linearly increasing time scale to describe the mixing of different regions of the flow. Equilibrium reaction calculations are then performed on the mix to arrive at a new species composition and temperature. Velocities are determined through summation of momentum contributions. The analysis indicates a combustion efficiency of the order of 80 percent for the reacting mixing layer, and a turbulent Schmidt number of 2/3. The success of the model is attributed to the simulation of large-scale transport of fluid. The favorable comparison shows that a relatively quick and simple PC calculation is capable of simulating the basic flow structure in the reacting and nonreacting shear layer present in the facility given basic assumptions about turbulence properties.
Real-time fringe pattern demodulation with a second-order digital phase-locked loop.
Gdeisat, M A; Burton, D R; Lalor, M J
2000-10-10
The use of a second-order digital phase-locked loop (DPLL) to demodulate fringe patterns is presented. The second-order DPLL has better tracking ability and more noise immunity than the first-order loop. Consequently, the second-order DPLL is capable of demodulating a wider range of fringe patterns than the first-order DPLL. A basic analysis of the first- and second-order loops is given, and a performance comparison between the first- and second-order DPLLs in analyzing fringe patterns is presented. The implementation of the second-order loop in real time on a commercial parallel image processing system is described. Fringe patterns are grabbed and processed, and the resultant phase maps are displayed concurrently.
A solid reactor core thermal model for nuclear thermal rockets
NASA Astrophysics Data System (ADS)
Rider, William J.; Cappiello, Michael W.; Liles, Dennis R.
1991-01-01
A Helium/Hydrogen Cooled Reactor Analysis (HERA) computer code has been developed. HERA has the ability to model arbitrary geometries in three dimensions, which allows the user to easily analyze reactor cores constructed of prismatic graphite elements. The code accounts for heat generation in the fuel, control rods, and other structures; conduction and radiation across gaps; convection to the coolant; and a variety of boundary conditions. The numerical solution scheme has been optimized for vector computers, making long transient analyses economical. Time integration is either explicit or implicit, which allows the use of the model to accurately calculate both short- or long-term transients with an efficient use of computer time. Both the basic spatial and temporal integration schemes have been benchmarked against analytical solutions.
A Computationally Efficient Method for Polyphonic Pitch Estimation
NASA Astrophysics Data System (ADS)
Zhou, Ruohua; Reiss, Joshua D.; Mattavelli, Marco; Zoia, Giorgio
2009-12-01
This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then the incorrect estimations are removed according to spectral irregularity and knowledge of the harmonic structures of the notes played on commonly used musical instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and the results demonstrate the high performance and computational efficiency of the approach.
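The harmonic-grouping step can be sketched as follows, with a plain magnitude spectrum standing in for the RTFI front end; the spectrum, note content and candidate grid are synthetic.

```python
def pitch_energy(spectrum, bin_hz, f0, n_harmonics=5):
    """Sum spectral energy at the first few integer harmonics of candidate f0."""
    total = 0.0
    for h in range(1, n_harmonics + 1):
        b = round(h * f0 / bin_hz)
        if b < len(spectrum):
            total += spectrum[b]
    return total

# Synthetic magnitude spectrum: two notes (220 Hz, 330 Hz) with 1/h harmonic decay
bin_hz = 10.0
spectrum = [0.0] * 200
for f0, amp in ((220.0, 1.0), (330.0, 0.8)):
    for h in range(1, 6):
        spectrum[round(h * f0 / bin_hz)] += amp / h

# Peak-pick the pitch energy spectrum over a 100-390 Hz candidate grid
scores = {c: pitch_energy(spectrum, bin_hz, c)
          for c in (100.0 + 10.0 * k for k in range(30))}
best = max(scores, key=scores.get)
print(best)   # 220.0 scores highest despite the subharmonic candidate at 110 Hz
```

The second stage described above would then prune spurious candidates (e.g. subharmonics) using spectral irregularity and known harmonic structures.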
A structurally oriented simulation system
NASA Technical Reports Server (NTRS)
Aran, Z.
1973-01-01
The computer program SOSS (Structurally Oriented Simulation System) is designed to be used as an experimental aid in the study of reliable systems. Basically, SOSS can simulate the structure and behavior of a discrete-time, finite-state, time-invariant system at various levels of structural definition. A general description of the program is given along with its modes of operation, command language of the basic system, future features to be incorporated in SOSS, and an example of usage.
Sulsky, Sandra I; Karlsson, Lee H; Bulzacchelli, Maria T; Luippold, Rose S; Rodriguez-Monguio, Rosa; Bulathsinhala, Lakmini; Hill, Owen T
2014-12-01
Training-related injury is a threat to military health and readiness. Prevalence of potential risk factors for training-related injury can change with U.S. Army recruitment goals and may influence basic combat training (BCT) injury rates. This article describes challenges of using administrative data to identify a trainee cohort and describes demographic and training characteristics across the five BCT locations. Data from the Total Army Injury and Health Outcomes Database were used to identify a U.S. Army-wide cohort of first-time trainees from January 1, 2002 to September 30, 2007 and describe its characteristics. The cohort includes 368,102 first-time trainees. The annual number starting BCT increased from 52,187 in 2002 to 68,808 in 2004. The proportion of males increased from 81.57% in 2003 to 83.84% in 2007. Mean (SD) age increased from 20.67 (3.55) years in 2002 to 20.94 (3.65) years in 2007. Mean (SD) body mass index increased from 24.53 (3.56) kg/m² in 2002 to 24.94 (3.84) kg/m² in 2006. Other characteristics fluctuated by year, including proportions of race/ethnicity, accession waivers, and confirmed graduates. Fluctuations in trainee characteristics warrant further analysis of potential influence on BCT injury rates. For research uses, careful acquisition of administrative data is needed. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.
Studying Mixing in Non-Newtonian Blue Maize Flour Suspensions Using Color Analysis
Trujillo-de Santiago, Grissel; Rojas-de Gante, Cecilia; García-Lara, Silverio; Ballescá-Estrada, Adriana; Alvarez, Mario Moisés
2014-01-01
Background Non-Newtonian fluids occur in many relevant flow and mixing scenarios at the lab and industrial scale. The addition of acid or basic solutions to a non-Newtonian fluid is not an infrequent operation, particularly in Biotechnology applications where the pH of Non-Newtonian culture broths is usually regulated using this strategy. Methodology and Findings We conducted mixing experiments in agitated vessels using Non-Newtonian blue maize flour suspensions. Acid or basic pulses were injected to reveal mixing patterns and flow structures and to follow their time evolution. No foreign pH indicator was used as blue maize flours naturally contain anthocyanins that act as a native, wide spectrum, pH indicator. We describe a novel method to quantitate mixedness and mixing evolution through Dynamic Color Analysis (DCA) in this system. Color readings corresponding to different times and locations within the mixing vessel were taken with a digital camera (or a colorimeter) and translated to the CIELab scale of colors. We use distances in the Lab space, a 3D color space, between a particular mixing state and the final mixing point to characterize segregation/mixing in the system. Conclusion and Relevance Blue maize suspensions represent an adequate and flexible model to study mixing (and fluid mechanics in general) in Non-Newtonian suspensions using acid/base tracer injections. Simple strategies based on the evaluation of color distances in the CIELab space (or other scales such as HSB) can be adapted to characterize mixedness and mixing evolution in experiments using blue maize suspensions. PMID:25401332
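The distance-in-Lab-space metric described above can be sketched as the CIE76 colour difference between a snapshot and the final mixed state; the Lab triplets below are invented, and the camera-RGB-to-CIELab conversion is assumed to have happened upstream.

```python
import math

def delta_e(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

final_mix = (52.0, -8.0, -32.0)        # fully mixed reference colour (invented)
snapshots = {                          # hypothetical Lab readings at time t (s)
    0: (65.0, 12.0, -10.0),
    30: (58.0, 1.0, -24.0),
    60: (53.0, -7.0, -31.0),
}
for t in sorted(snapshots):
    print(f"t = {t:2d} s  deltaE = {delta_e(snapshots[t], final_mix):6.2f}")
```

A shrinking deltaE over time indicates progress toward the fully mixed state, which is how segregation/mixedness is characterized in the Dynamic Color Analysis approach above.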
Gómez-Méndez, Raquel; Monte-Secades, Rafael; Ventura-Valcárcel, Pablo; Rabuñal-Rey, Ramón; Guerrero-Sande, Héctor; Chamorro-Fernández, Antonio J; Pértega-Díaz, Sonia
2017-12-20
There are no data on the incidence of admissions associated with alcohol withdrawal syndrome (AWS) or about its trend over time in Spain. To analyze the characteristics, incidence rates and trends over time of hospital admissions associated with AWS in Spanish public hospitals. Analysis from the Spanish public hospitals minimum basic data set of hospital admissions with AWS (CIE9-MC 291.81), alcohol withdrawal delirium (CIE9-MC 291.0) and alcohol withdrawal hallucinosis (CIE9-MC 291.3), from 1999 to 2010. We identified 56,395 admissions associated with AWS. Mean age was 50.9 years (SD 12.5) and 88% were male. The most frequent admission department was Internal Medicine (24.9%). The mean hospital stay was 12.6 days (SD 14.4) and mortality was 4.7%; 62.6% of cases developed AWS during an admission for another reason, mostly due to alcohol-related pathologies. Secondary diagnoses in patients hospitalized for AWS were related to alcohol consumption in more than half of the cases. The incidence rate of admissions associated with AWS in Spain remained stable from 1999 to 2010, with a small decline in the last 3 years of the period. The communities with the highest incidence were the Canary Islands, the Balearic Islands and Galicia. The incidence rate of admissions associated with AWS in Spanish public hospitals in the period 1999-2010 has remained stable with slight changes. There are differences in the incidence of AWS among the different autonomous communities. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Multichannel electrical stimulation of the auditory nerve in man. I. Basic psychophysics.
Shannon, R V
1983-08-01
Basic psychophysical measurements were obtained from three patients implanted with multichannel cochlear implants. This paper presents measurements from stimulation of a single channel at a time (either monopolar or bipolar). The shape of the threshold vs. frequency curve can be partially related to the membrane biophysics of the remaining spiral ganglion and/or dendrites. Nerve survival in the region of the electrode may produce some increase in the dynamic range on that electrode. Loudness was related to the stimulus amplitude by a power law with exponents between 1.6 and 3.4, depending on frequency. Intensity discrimination was better than for normal auditory stimulation, but not enough to offset the small dynamic range of electrical stimulation. Measures of temporal integration were comparable to those of normal-hearing listeners, indicating a central mechanism that is still intact in implant patients. No frequency analysis of the electrical signal was observed. Each electrode produced a unique pitch sensation, but these sensations were not simply related to the tonotopic position of the stimulated electrode. Pitch increased over more than 4 octaves (for one patient) as the frequency was increased from 100 to 300 Hz, but above 300 Hz no pitch change was observed. Possibly the major limitation of single-channel cochlear implants is the 1-2 ms integration time (probably due to the capacitive properties of the nerve membrane, which acts as a low-pass filter at 100 Hz). Another limitation of electrical stimulation is that there is no spectral analysis of the electrical waveform, so that the temporal waveform alone determines the effective stimulus.
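The loudness power law reported above can be made concrete with a short sketch. The exponent used here is an arbitrary value inside the reported 1.6-3.4 range and the scale factor is a placeholder, not a fitted parameter:

```python
def loudness(amplitude, k=1.0, exponent=2.5):
    """Stevens-style power law: perceived loudness grows as amplitude raised
    to an exponent. 2.5 is an illustrative value within the 1.6-3.4 range
    reported for electrical stimulation; k is an arbitrary scale factor."""
    return k * amplitude ** exponent

# Doubling the stimulus amplitude multiplies perceived loudness by 2**exponent:
ratio = loudness(2.0) / loudness(1.0)
```

With an exponent above 1, small amplitude steps produce large loudness steps, which is one way to see why the usable dynamic range of electrical stimulation is so narrow.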
Ciesielczyk, Filip; Bartczak, Przemysław; Zdarta, Jakub; Jesionowski, Teofil
2017-12-15
A comparative analysis was performed concerning the removal of two different organic dyes from model aqueous solution using an inorganic oxide adsorbent. The key element of the study concerns evaluation of the influence of the dyes' structure and their acid-base character on the efficiency of the adsorption process. The selection of sorbent material for this research - an MgO-SiO2 oxide system synthesized via a modified sol-gel route - is also not without significance. The relatively high porous structure parameters of this material (A_BET = 642 m²/g, V_p = 1.11 mL and S_p = 9.8 nm) are a result of the proposed methodology for its synthesis. Both organic dyes (C.I. Acid Blue 29 and C.I. Basic Blue 9) were subjected to typical batch adsorption tests, including investigation of such process parameters as time, initial adsorbate concentration, adsorbent dose, pH and temperature. An attempt was also made to estimate the sorption capacity of the oxide material with respect to the analyzed organic dyes. To achieve the objectives of the research - determining the efficiency of adsorption - it was important to perform a thorough physicochemical analysis of the adsorbents (e.g. FTIR, elemental analysis and porous structure parameters). The results confirmed the significantly higher affinity of the basic dye to the oxide adsorbents compared with the acidic dye. The regeneration tests, which indirectly determine the nature of the adsorbent/adsorbate interactions, provide further evidence for this finding. On this basis, a probable mechanism of dye adsorption on the MgO-SiO2 oxide adsorbent was proposed. Copyright © 2017 Elsevier Ltd. All rights reserved.
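Sorption capacity from batch tests of this kind is often estimated by fitting an isotherm model. The abstract does not state which model the authors used, so the following is only an illustrative sketch of a linearized Langmuir fit on synthetic data, with made-up q_max and K values:

```python
def langmuir_qmax(conc, uptake):
    """Estimate the Langmuir maximum sorption capacity q_max (mg/g) by
    least squares on the linearized form C/q = C/q_max + 1/(K*q_max):
    the slope of C/q versus C equals 1/q_max."""
    y = [c / q for c, q in zip(conc, uptake)]
    n = len(conc)
    mx, my = sum(conc) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(conc, y))
             / sum((xi - mx) ** 2 for xi in conc))
    return 1.0 / slope

# Synthetic equilibrium data generated from q_max = 150 mg/g, K = 0.05 L/mg:
C = [10, 25, 50, 100, 200, 400]           # equilibrium concentrations (mg/L)
q = [150 * 0.05 * c / (1 + 0.05 * c) for c in C]  # equilibrium uptakes (mg/g)
q_max_est = langmuir_qmax(C, q)           # recovers ~150 mg/g
```

On real batch data the fit will not be exact, and a Freundlich or other isotherm may describe the system better; the linearized Langmuir form is shown only because it is the most common first check.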
Program to analyze aquifer test data and check for validity with the Jacob method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, M.S.
1993-01-01
The Jacob straight-line method of aquifer analysis deals with the late-time, small-radius portion of the Theis type curve, which plots as a straight line when the drawdown data are plotted on an arithmetic scale and the time data on a logarithmic (base 10) scale. Correct analysis with the Jacob method normally assumes that (1) the data lie on a straight line, (2) the value of the dimensionless time factor is less than 0.01, and (3) the site's hydrogeology conforms to the method's assumptions and limiting conditions. Items 1 and 2 are usually considered for the Jacob method, but item 3 is often ignored, which can lead to incorrect calculations of aquifer parameters. A BASIC computer program was developed to analyze aquifer test data with the Jacob method and to test the validity of its use. Aquifer test data are entered into the program and manipulated so that the slope and time intercept of the straight line drawn through the data (excluding early-time and late-time data) can be used to calculate transmissivity and storage coefficient. Late-time data are excluded to eliminate the effects of positive and negative boundaries. The time-drawdown data are then converted into dimensionless units to determine if the Jacob method's assumptions are valid for the hydrogeologic conditions under which the test was conducted.
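The core calculation the program performs can be sketched in a few lines (shown here in Python rather than the original BASIC); the input values in the usage line are hypothetical, and SI units are assumed throughout:

```python
import math

def jacob_parameters(Q, delta_s, t0, r):
    """Cooper-Jacob straight-line solution.
    Q        pumping rate (m^3/s)
    delta_s  drawdown change per log10 cycle of time (m), i.e. the fitted slope
    t0       time intercept where the fitted line crosses zero drawdown (s)
    r        distance from the pumping well to the observation well (m)
    Returns (transmissivity T in m^2/s, storage coefficient S)."""
    T = 2.3 * Q / (4 * math.pi * delta_s)
    S = 2.25 * T * t0 / r ** 2
    return T, S

def jacob_valid(T, S, t, r):
    """Assumption (2) above: the dimensionless time factor u = r^2*S / (4*T*t)
    must be below 0.01 for the straight-line approximation to hold at time t."""
    return r ** 2 * S / (4 * T * t) < 0.01

# Hypothetical test: 0.01 m^3/s pumping rate, 0.5 m drawdown per log cycle,
# zero-drawdown intercept at 100 s, observation well 30 m away.
T, S = jacob_parameters(Q=0.01, delta_s=0.5, t0=100, r=30)
```

Checking `jacob_valid` at each measurement time is the dimensionless-units step the abstract describes: points with u >= 0.01 should not be used in the straight-line fit.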
Maguire, Mandy J; Abel, Alyson D
2013-10-01
EEG is a primary method for studying temporally precise neuronal processes across the lifespan. Most of this work focuses on event-related potentials (ERPs); however, using time-locked time-frequency analysis to decompose the EEG signal can identify and distinguish multiple changes in the brain oscillations underlying cognition (Bastiaansen et al., 2010). Further, this measure is thought to reflect changes in inter-neuronal communication more directly than ERPs (Nunez and Srinivasan, 2006). Although time-frequency analysis has elucidated cognitive processes in adults, applying it to cognitive development is still rare. Here, we review the basics of neuronal oscillations, some of what they reveal about adult cognitive function, and the little that is known about children. We focus on language because it develops early and engages complex cortical networks. Additionally, because time-frequency analysis of the EEG related to adult language comprehension has been incredibly informative, using similar methods with children will shed new light on current theories of language development and increase our understanding of how neural processes change over the lifespan. Our goal is to emphasize the power of this methodology and encourage its use throughout developmental cognitive neuroscience. Copyright © 2013 Elsevier Ltd. All rights reserved.
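As a minimal illustration of what time-frequency decomposition adds over a raw trace (and not the specific wavelet methods of the studies cited), the power of a signal at one frequency can be tracked across sliding windows. The signal and parameters below are synthetic:

```python
import cmath
import math

def tf_power(signal, fs, freq, win, hop):
    """Power of `signal` (a list of samples at rate `fs` Hz) at `freq` Hz in
    sliding windows of `win` samples advanced by `hop` samples -- a
    single-frequency, rectangular-window short-time Fourier transform."""
    powers = []
    for start in range(0, len(signal) - win + 1, hop):
        acc = sum(signal[start + n] * cmath.exp(-2j * math.pi * freq * (start + n) / fs)
                  for n in range(win))
        powers.append(abs(acc) ** 2)
    return powers

# A 10 Hz "oscillation" that switches off halfway through a 2 s epoch at 100 Hz:
fs = 100
x = [math.sin(2 * math.pi * 10 * n / fs) for n in range(100)] + [0.0] * 100
p = tf_power(x, fs, 10, win=50, hop=50)  # large in the first two windows, ~0 after
```

An ERP average of many such epochs would blur this on/off structure, whereas the windowed power trace localizes when the 10 Hz oscillation is present; real EEG analyses use tapered or wavelet windows across many frequencies, but the principle is the same.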
Toxicodynetics: A new discipline in clinical toxicology.
Baud, F J; Houzé, P; Villa, A; Borron, S W; Carli, P
2016-05-01
Of the different disciplines that encompass pharmacology and toxicology, none is specifically dedicated to the description and analysis of the time-course of relevant toxic effects in both experimental and clinical studies. The lack of a discipline devoted to this major field of toxicology results in misconceptions and even errors by clinicians. We review the basic disciplines that encompass pharmacology and toxicology, and compare them with descriptions of the time-course of effects in conditions in which toxicological analysis was not performed or was supported by only limited analytical evidence. Review of the literature clearly shows how misleading the current extrapolation of toxicokinetic data to the description of the time-course of toxic effects is. A new discipline, entitled toxicodynetics, should be developed, aiming at a more systematic description of the time-course of effects in acute human and experimental poisonings. Toxicodynetics might help emergency physicians in risk assessment when facing a poisoning and contribute to a better assessment of the quality control of data collected by poison control centres. Toxicodynetics would also allow a quantitative approach to the clinical effects resulting from drug-drug interactions. Copyright © 2016. Published by Elsevier Masson SAS.
Kane, Lesley A; Yung, Christina K; Agnetti, Giulio; Neverova, Irina; Van Eyk, Jennifer E
2006-11-01
Separation of basic proteins with 2-DE presents technical challenges involving protein precipitation, load limitations, and streaking. Cardiac mitochondria are enriched in basic proteins and difficult to resolve by 2-DE. We investigated two methods, cup loading and paper-bridge loading, for loading this subproteome onto basic-range (pH 6-11) gels. Paper-bridge loading consistently produced improved resolution of both analytical and preparative protein loads. A unique benefit of this technique is that proteins retained in the paper bridge after loading basic gels can be reloaded onto lower pH gradients (pH 4-7), allowing valued samples to be analyzed on multiple pH ranges.
Pan, Long; Yao, Enjian; Yang, Yang
2016-12-01
With the rapid development of urbanization and motorization in China, traffic-related air pollution has become a major component of air pollution that constantly jeopardizes public health. This study proposes an integrated framework for estimating the concentration of traffic-related air pollution from real-time traffic and basic meteorological information, and for further evaluating the impact of traffic-related air pollution. First, based on vehicle emission factor models sensitive to traffic status, traffic emissions are calculated according to the real-time link-based average traffic speed, traffic volume, and vehicular fleet composition. Then, based on differences in meteorological conditions, traffic pollution sources are divided into line sources and point sources, and corresponding methods to determine their dynamic affecting areas are proposed. Subsequently, with basic meteorological data, a Gaussian dispersion model and a puff integration model are applied, respectively, to estimate the concentration of traffic-related air pollution. Finally, the proposed estimating framework is applied to calculate the distribution of CO concentration in the main area of Beijing, and the population exposure is also calculated to evaluate the impact of traffic-related air pollution on public health. Results show that there is a correlation between traffic indicators (i.e., traffic speed and traffic intensity) of the affecting area and the traffic-related CO concentration of the target grid, which indicates that the methods for determining the affecting areas are reliable. Furthermore, the reliability of the proposed estimating framework is verified by comparing the predicted and the observed ambient CO concentrations.
In addition, results also show that the traffic-related CO concentration is higher in morning and evening peak hours, and has a heavier impact on public health within the Fourth Ring Road of Beijing due to higher population density and higher CO concentration under calm wind condition in this area. Copyright © 2016 Elsevier Ltd. All rights reserved.
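A point-source Gaussian dispersion estimate of the kind applied in such a framework can be sketched as follows. This is a generic textbook plume formula, not the authors' implementation, and the dispersion coefficients are supplied directly here rather than derived from atmospheric stability classes as a full model would do:

```python
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) downwind of a point
    source, with ground reflection.
    Q                 emission rate (g/s)
    u                 wind speed (m/s)
    y, z              crosswind and vertical receptor coordinates (m)
    H                 effective source height (m)
    sigma_y, sigma_z  lateral/vertical dispersion coefficients (m), assumed
                      given here instead of computed from stability class."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration for a hypothetical 50 g/s CO source
# at 10 m effective height in a 3 m/s wind:
c0 = plume_concentration(Q=50, u=3, y=0, z=0, H=10, sigma_y=30, sigma_z=15)
```

Summing such contributions over all point sources in a receptor's affecting area, and integrating the analogous line-source expression along each road link, yields the grid concentrations that are then combined with population density for the exposure estimate.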
ERIC Educational Resources Information Center
Maoyuan, Pan
2007-01-01
Research on the issues of higher education has been going on for a long time. However, higher education pedagogy as an independent discipline has been present in China for only about ten years. The structure of a discipline cannot consist merely of a compilation of the issues under research but must also include its basic theories and a system of…