NASA Astrophysics Data System (ADS)
Pradhan, Moumita; Pradhan, Dinesh; Bandyopadhyay, G.
2010-10-01
Fuzzy systems have demonstrated their ability to solve many kinds of problems across application domains, and there is increasing interest in applying fuzzy concepts to improve the tasks of a system. Here a case study of a thermal power plant is considered. The existing time estimates represent the time to complete tasks; applying a fuzzy linear approach shows that, at each confidence level, less time is needed to complete them, and a shorter schedule in turn lowers cost. The objective of this paper is to show how a system becomes more efficient when a fuzzy linear approach is applied, optimizing the time estimates so that all tasks fit appropriate schedules. For the case study, the optimistic time (to), pessimistic time (tp), and most likely time (tm) are taken from data collected at the thermal power plant. These estimates yield the expected time (te), which represents the time to complete a particular task when all contingencies are considered. Using the project evaluation and review technique (PERT) and the critical path method (CPM), the critical path duration (CPD) of the project is calculated; it indicates a fifty percent probability that the total set of tasks can be completed in fifty days. Using the critical path duration and the standard deviation along the critical path, the total project completion time follows from the normal distribution. Applying the trapezoidal rule to the four time estimates (to, tm, tp, te), we calculate the defuzzified value of the time estimates. For the fuzzy range, four confidence levels are considered: 0.4, 0.6, 0.8, and 1. Our study shows that time estimates at confidence levels between 0.4 and 0.8 give better results than the other levels.
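The three-point PERT formulas and a centroid-style defuzzification of a trapezoidal fuzzy number can be sketched as follows. The task times are hypothetical, and the centroid rule shown is one common defuzzification choice, not necessarily the exact trapezoidal-rule computation used in the paper:

```python
def pert_expected_time(t_o, t_m, t_p):
    """Classic PERT beta estimate: te = (to + 4*tm + tp) / 6."""
    return (t_o + 4 * t_m + t_p) / 6

def pert_std_dev(t_o, t_p):
    """PERT standard deviation of a task: (tp - to) / 6."""
    return (t_p - t_o) / 6

def trapezoid_centroid(a, b, c, d):
    """Centroid defuzzification of a trapezoidal fuzzy number a <= b <= c <= d."""
    return ((c**2 + d**2 + c*d) - (a**2 + b**2 + a*b)) / (3 * ((c + d) - (a + b)))

# Hypothetical task times in days: optimistic, most likely, pessimistic.
t_o, t_m, t_p = 4, 6, 10
t_e = pert_expected_time(t_o, t_m, t_p)          # about 6.33 days
crisp = trapezoid_centroid(t_o, t_m, t_e, t_p)   # defuzzified time estimate
```

Per-task standard deviations along the critical path combine in quadrature, which is what makes the normal-distribution completion estimate possible.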
77 FR 21577 - Agency Information Collection Activities: Lien Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... technology; and (e) the annual cost burden to respondents or record keepers from the collection of... with a change to the burden hours as a result of changing the estimated response time for completing... Number of Annual Responses per Respondent: 1. Estimated Time per Response: 15 minutes. Estimated Total...
The validity of birth and pregnancy histories in rural Bangladesh.
Espeut, Donna; Becker, Stan
2015-08-28
Maternity histories provide a means of estimating fertility and mortality from surveys. The present analysis compares two types of maternity histories-birth histories and pregnancy histories-in three respects: (1) completeness of live birth and infant death reporting; (2) accuracy of the time placement of live births and infant deaths; and (3) the degree to which reported versus actual total fertility measures differ. The analysis covers a 15-year time span and is based on two data sources from Matlab, Bangladesh: the 1994 Matlab Demographic and Health Survey and, as gold standard, the vital events data from Matlab's Demographic Surveillance System. Both histories are near perfect in live-birth completeness; however, pregnancy histories do better in the completeness and time accuracy of deaths during the first year of life. Birth or pregnancy histories can be used for fertility estimation, but pregnancy histories are advised for estimating infant mortality.
Short Vigilance Tasks are Hard Work Even If Time Flies
2016-10-21
maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...actual time. Upon completion of the task, participants filled out questionnaires related to the hedonic and temporal evaluation of the task.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... information technology; and (e) the annual costs burden to respondents or record keepers from the collection... estimated time to complete ESTA or Form I-94W. There are no proposed changes to Form I-94. Type of Review... Responses: 4,387,550. Estimated Time per Response: 8 minutes. Estimated Burden Hours: 583,544. Estimated...
Biau, David Jean; Porcher, Raphael; Roren, Alexandra; Babinet, Antoine; Rosencher, Nadia; Chevret, Sylvie; Poiraudeau, Serge; Anract, Philippe
2015-08-01
The purpose of this study was to evaluate pre-operative education versus no education, and mini-invasive surgery versus standard surgery, on the time to reach complete independence. We conducted a four-arm randomized controlled trial of 209 patients. The primary outcome criterion was the time to reach complete functional independence. Secondary outcomes included the operative time, the estimated total blood loss, the pain level, the dose of morphine, and the time to discharge. There was no significant effect of either education (HR: 1.1; P = 0.77) or mini-invasive surgery (HR: 1.0; 95 %; P = 0.96) on the time to reach complete independence. Mini-invasive surgery significantly reduced the total estimated blood loss (P = 0.0035) and the dose of morphine necessary for titration in recovery (P = 0.035). Neither pre-operative education nor mini-invasive surgery reduces the time to reach complete functional independence. Mini-invasive surgery significantly reduces blood loss and the need for morphine consumption.
2016-10-25
TIME TO COMPLETE THE OBSTACLE COURSE PORTION OF THE LOAD EFFECTS ASSESSMENT PROGRAM (LEAP) by K. Blake Mitchell, Jessica M. Batty, Megan E...Approved OMB No. 0704-0188. Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for...2014 – June 2015. 4. TITLE AND SUBTITLE: RELIABILITY ANALYSIS OF TIME TO COMPLETE THE OBSTACLE COURSE PORTION OF THE LOAD EFFECTS ASSESSMENT PROGRAM
Time estimation predicts mathematical intelligence.
Kramer, Peter; Bressan, Paola; Grassi, Massimo
2011-01-01
Performing mental subtractions affects time (duration) estimates, and making time estimates disrupts mental subtractions. This interaction has been attributed to the concurrent involvement of time estimation and arithmetic with general intelligence and working memory. Given the extant evidence of a relationship between time and number, here we test the stronger hypothesis that time estimation correlates specifically with mathematical intelligence, and not with general intelligence or working-memory capacity. Participants performed a (prospective) time estimation experiment, completed several subtests of the WAIS intelligence test, and self-rated their mathematical skill. For five different durations, we found that time estimation correlated with both arithmetic ability and self-rated mathematical skill. Controlling for non-mathematical intelligence (including working memory capacity) did not change the results. Conversely, correlations between time estimation and non-mathematical intelligence either were nonsignificant, or disappeared after controlling for mathematical intelligence. We conclude that time estimation specifically predicts mathematical intelligence. On the basis of the relevant literature, we furthermore conclude that the relationship between time estimation and mathematical intelligence is likely due to a common reliance on spatial ability.
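The control analysis described above amounts to a partial correlation: correlate the residuals of the two variables after regressing each on the covariate. A minimal stdlib sketch, with fabricated illustrative scores (not the study's data), where z is a shared ability driving both x and y:

```python
def mean(xs):
    return sum(xs) / len(xs)

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def residuals(y, z):
    # Residuals of a simple least-squares regression of y on z.
    mz, my = mean(z), mean(y)
    beta = sum((a - mz) * (b - my) for a, b in zip(z, y)) / sum((a - mz) ** 2 for a in z)
    alpha = my - beta * mz
    return [b - (alpha + beta * a) for a, b in zip(z, y)]

def partial_r(x, y, z):
    # Correlation between x and y after controlling for z.
    return pearson_r(residuals(x, z), residuals(y, z))

# Fabricated scores: both x and y track the shared factor z.
z = [1, 2, 3, 4, 5, 6, 7, 8]
x = [1.1, 2.0, 2.9, 4.2, 5.1, 5.8, 7.2, 7.9]
y = [0.8, 2.2, 3.1, 3.9, 5.2, 6.1, 6.8, 8.1]
```

Here the raw correlation of x and y is high, but the partial correlation after removing z shrinks, which is the logic the study uses to isolate mathematical from general intelligence.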
ERIC Educational Resources Information Center
Calcagno, Juan Carlos; Crosta, Peter; Bailey, Thomas; Jenkins, Davis
2007-01-01
Research has consistently shown that older students--those who enter college for the first time at age 25 or older--are less likely to complete a degree or certificate. The authors estimate a single-risk discrete-time hazard model using transcript data on a cohort of first-time community college students in Florida to compare the educational…
Validation of a Formula for Assigning Continuing Education Credit to Printed Home Study Courses
Hanson, Alan L.
2007-01-01
Objectives To reevaluate and validate the use of a formula for calculating the amount of continuing education credit to be awarded for printed home study courses. Methods Ten home study courses were selected for inclusion in a study to validate the formula, which is based on the number of words, number of final examination questions, and estimated difficulty level of the course. The amount of estimated credit calculated using the a priori formula was compared to the average amount of time required to complete each article based on pharmacists' self-reporting. Results A strong positive relationship between the amount of time required to complete the home study courses based on the a priori calculation and the times reported by pharmacists completing the 10 courses was found (p < 0.001). The correlation accounted for 86.2% of the total variability in the average pharmacist reported completion times (p < 0.001). Conclusions The formula offers an efficient and accurate means of determining the amount of continuing education credit that should be assigned to printed home study courses. PMID:19503705
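The article summary above names the formula's inputs (word count, number of examination questions, estimated difficulty) but not its coefficients. A purely hypothetical sketch of a formula with that shape, with every coefficient invented for illustration:

```python
def estimated_credit_hours(word_count, num_questions, difficulty):
    """Hypothetical credit estimate: reading time at a difficulty-adjusted
    reading speed, plus a fixed allowance per examination question.
    All coefficients here are invented for illustration."""
    words_per_minute = 120 / difficulty   # difficulty, e.g., 1.0 (easy) to 2.0 (hard)
    reading_minutes = word_count / words_per_minute
    exam_minutes = 2.0 * num_questions
    return (reading_minutes + exam_minutes) / 60
```

Validation then consists of regressing pharmacists' self-reported completion times on the a priori estimate, as the study did.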
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
... prospective accredited agency to complete the form. Total Burden Hours: 190. Estimated Cost (Operation and...)). This program ensures that information is in the desired format, reporting burden (time and costs) is...; The accuracy of OSHA's estimate of the burden (time and costs) of the information collection...
Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.
2016-01-01
Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events was predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than non-flying phobics. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined. PMID:27557054
Tang, Xianyan; Geater, Alan; McNeil, Edward; Zhou, Hongxia; Deng, Qiuyun; Dong, Aihu
2017-07-01
Large-scale outbreaks of measles occurred in 2013 and 2014 in rural Guangxi, a region in Southwest China with high coverage for measles-containing vaccine (MCV). This study aimed to estimate the timely vaccination coverage, the timely-and-complete vaccination coverage, and the median delay period for MCV among children aged 18-54 months in rural Guangxi. Based on quartiles of measles incidence during 2011-2013, a stratified three-stage cluster survey was conducted from June through August 2015. Using weighted estimation and finite population correction, vaccination coverage and 95% confidence intervals (CIs) were calculated. Weighted Kaplan-Meier analyses were used to estimate the median delay periods for the first (MCV1) and second (MCV2) doses of the vaccine. A total of 1216 children were surveyed. The timely vaccination coverage rate was 58.4% (95% CI, 54.9%-62.0%) for MCV1, and 76.9% (95% CI, 73.6%-80.0%) for MCV2. The timely-and-complete vaccination coverage rate was 47.4% (95% CI, 44.0%-51.0%). The median delay period was 32 (95% CI, 27-38) days for MCV1, and 159 (95% CI, 118-195) days for MCV2. The timeliness and completeness of measles vaccination was low, and the median delay period was long among children in rural Guangxi. Incorporating the timeliness and completeness into official routine vaccination coverage statistics may help appraise the coverage of vaccination in China. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
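The median delay reported above comes from weighted Kaplan-Meier analysis. A simplified, unweighted product-limit sketch with hypothetical delay data (no survey weights or finite-population correction) shows the idea:

```python
def km_median(times, events):
    """Median from an unweighted Kaplan-Meier survivor curve.
    times: delay in days; events: 1 = vaccinated (event), 0 = censored.
    Returns the earliest time at which S(t) falls to 0.5 or below."""
    at_risk = len(times)
    s = 1.0
    for t, e in sorted(zip(times, events)):
        if e:
            s *= (at_risk - 1) / at_risk
            if s <= 0.5:
                return t
        at_risk -= 1
    return None  # median not reached within follow-up

# Hypothetical delays (days past the scheduled MCV1 date).
delays = [5, 10, 20, 30, 40]
vaccinated = [1, 0, 1, 1, 1]   # the second child was lost to follow-up
```

Censored children still contribute to the risk set until they drop out, which is why the median here differs from a naive median of observed delays.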
Borghese, Michael M; Janssen, Ian
2018-03-22
Children participate in four main types of physical activity: organized sport, active travel, outdoor active play, and curriculum-based physical activity. The objective of this study was to develop a valid approach that can be used to concurrently measure time spent in each of these types of physical activity. Two samples (sample 1: n = 50; sample 2: n = 83) of children aged 10-13 wore an accelerometer and a GPS watch continuously over 7 days. They also completed a log where they recorded the start and end times of organized sport sessions. Sample 1 also completed an outdoor time log where they recorded the times they went outdoors and a description of the outdoor activity. Sample 2 also completed a curriculum log where they recorded times they participated in physical activity (e.g., physical education) during class time. We describe the development of a measurement approach that can be used to concurrently assess the time children spend participating in specific types of physical activity. The approach uses a combination of data from accelerometers, GPS, and activity logs and relies on merging and then processing these data using several manual (e.g., data checks and cleaning) and automated (e.g., algorithms) procedures. In the new measurement approach time spent in organized sport is estimated using the activity log. Time spent in active travel is estimated using an existing algorithm that uses GPS data. Time spent in outdoor active play is estimated using an algorithm (with a sensitivity and specificity of 85%) that was developed using data collected in sample 1 and which uses all of the data sources. Time spent in curriculum-based physical activity is estimated using an algorithm (with a sensitivity of 78% and specificity of 92%) that was developed using data collected in sample 2 and which uses accelerometer data collected during class time. 
There was evidence of excellent intra- and inter-rater reliability of the estimates for all of these types of physical activity when the manual steps were duplicated. This novel measurement approach can be used to estimate the time that children participate in different types of physical activity.
Classes of Split-Plot Response Surface Designs for Equivalent Estimation
NASA Technical Reports Server (NTRS)
Parker, Peter A.; Kowalski, Scott M.; Vining, G. Geoffrey
2006-01-01
When planning an experimental investigation, we are frequently faced with factors that are difficult or time consuming to manipulate, thereby making complete randomization impractical. A split-plot structure differentiates between the experimental units associated with these hard-to-change factors and others that are relatively easy-to-change and provides an efficient strategy that integrates the restrictions imposed by the experimental apparatus. Several industrial and scientific examples are presented to illustrate design considerations encountered in the restricted randomization context. In this paper, we propose classes of split-plot response designs that provide an intuitive and natural extension from the completely randomized context. For these designs, the ordinary least squares estimates of the model are equivalent to the generalized least squares estimates. This property provides best linear unbiased estimators and simplifies model estimation. The design conditions that allow for equivalent estimation are presented enabling design construction strategies to transform completely randomized Box-Behnken, equiradial, and small composite designs into a split-plot structure.
The United States Marine Corps in Cyberspace: Every Marine a Cyber Warrior
2008-01-01
information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information.
Castel, Guillaume; Tordo, Noël; Plyusnin, Alexander
2017-04-02
Because of the great variability of their reservoir hosts, hantaviruses are excellent models for evaluating the dynamics of virus-host co-evolution. Intriguing questions remain about the timescale of the diversification events that shaped this evolution. In this paper we attempt a first estimate of the timing of hantavirus diversification, based on the thirty-five available complete genomes representing five major groups of hantaviruses and the assumption of co-speciation of hantaviruses with their respective mammal hosts. Phylogenetic analyses were used to estimate the main diversification points during hantavirus evolution in mammals, while host diversification was mostly estimated from independent calibrators taken from fossil records. Our results support an earlier hypothesis of co-speciation of known hantaviruses with their respective mammal hosts, and hence a common ancestor for all hantaviruses carried by placental mammals. Copyright © 2017 Elsevier B.V. All rights reserved.
F-35 Risk during Department of Defense Financial Crisis
2013-03-01
information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...program, an approach intended to save time and money by launching construction at an early stage and at the same time the aircraft was put through
2016-12-06
direction and speed based on cost minimization and best estimated time of arrival (ETA). Sometimes, ships are forced to travel 43 Lehigh Technical...the allowable time to complete the travel . Another important aspect, addressed in the case study, is to investigate the optimal routing of aged...The public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
Weisgerber, Michael; Danduran, Michael; Meurer, John; Hartmann, Kathryn; Berger, Stuart; Flores, Glenn
2009-07-01
To evaluate Cooper 12-minute run/walk test (CT12) as a one-time estimate of cardiorespiratory fitness and marker of fitness change compared with treadmill fitness testing in young children with persistent asthma. A cohort of urban children with asthma participated in the asthma and exercise program and a subset completed pre- and postintervention fitness testing. Treadmill fitness testing was conducted by an exercise physiologist in the fitness laboratory at an academic children's hospital. CT12 was conducted in a college recreation center gymnasium. Forty-five urban children with persistent asthma aged 7 to 14 years participated in exercise interventions. A subset of 19 children completed pre- and postintervention exercise testing. Participants completed a 9-week exercise program where they participated in either swimming or golf 3 days a week for 1 hour. A subset of participants completed fitness testing by 2 methods before and after program completion. CT12 results (meters), maximal oxygen consumption (VO2max, mL·kg⁻¹·min⁻¹), and treadmill exercise time (minutes). CT12 and VO2max were moderately correlated (preintervention: 0.55, P = 0.003; postintervention: 0.48, P = 0.04) as one-time measures of fitness. Correlations of the tests as markers of change over time were poor and nonsignificant. In children with asthma, CT12 is a reasonable one-time estimate of fitness but a poor marker of fitness change over time.
A Scalable, Reconfigurable, and Dependable Time-Triggered Architecture
2003-07-01
burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this
Wielgosz, Andreas; Robinson, Christopher; Mao, Yang; Jiang, Ying; Campbell, Norm R C; Muthuri, Stella; Morrison, Howard
2016-06-01
The standard for population-based surveillance of dietary sodium intake is 24-hour urine testing; however, this may be affected by incomplete urine collection. The impact of different indirect methods of assessing completeness of collection on estimated sodium ingestion has not been established. The authors enlisted 507 participants from an existing community study in 2009 to collect 24-hour urine samples. Several methods of assessing completeness of urine collection were tested. Mean sodium intake varied between 3648 mg/24 h and 7210 mg/24 h depending on the method used. Excluding urine samples collected for longer or shorter than 24 hours increased the estimated urine sodium excretion, even when corrections for the variation in timed collections were applied. Until an accurate method of indirectly assessing completeness of urine collection is identified, the gold standard of administering para-aminobenzoic acid is recommended. Efforts to ensure participants collect complete urine samples are also warranted. ©2015 Wiley Periodicals, Inc.
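Scaling a timed collection to a 24-hour equivalent, plus a crude completeness screen, can be sketched as follows. The thresholds are illustrative only, not the study's exclusion criteria; the mmol-to-mg conversion uses 1 mmol Na ≈ 23 mg:

```python
def sodium_mg_per_24h(sodium_mmol, duration_h):
    """Scale measured urinary sodium to a 24-hour equivalent and
    convert mmol to mg (1 mmol Na is about 23 mg)."""
    return sodium_mmol * (24.0 / duration_h) * 23.0

def plausibly_complete(duration_h, volume_ml, tol_h=2.0):
    """Crude completeness screen: collection duration near 24 h and a
    non-trivial urine volume. Thresholds are illustrative only."""
    return abs(duration_h - 24.0) <= tol_h and volume_ml >= 500

daily_sodium = sodium_mg_per_24h(150, 22.5)   # a 22.5-hour collection
```

As the abstract notes, such indirect corrections shift the estimate substantially, which is why para-aminobenzoic acid recovery remains the recommended check.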
Remission in systemic lupus erythematosus: durable remission is rare.
Wilhelm, Theresa R; Magder, Laurence S; Petri, Michelle
2017-03-01
Remission is the ultimate goal in systemic lupus erythematosus (SLE). In this study, we applied four definitions of remission agreed on by an international collaboration (Definitions of Remission in SLE, DORIS) to a large clinical cohort to estimate rates and predictors of remission. We applied the DORIS definitions of Clinical Remission, Complete Remission (requiring negative serologies), Clinical Remission on treatment (ROT) and Complete ROT. 2307 patients entered the cohort from 1987 to 2014 and were seen at least quarterly. Patients not in remission at cohort entry were followed prospectively. We used the Kaplan-Meier approach to estimate the time to remission and the time from remission to relapse. Cox regression was used to identify baseline factors associated with time to remission, adjusting for baseline disease activity and baseline treatment. The median time to remission was 8.7, 11.0, 1.8 and 3.1 years for Clinical Remission, Complete Remission, Clinical ROT and Complete ROT, respectively. High baseline treatment was the major predictor of a longer time to remission, followed by high baseline activity. The median duration of remission for all definitions was 3 months. African-American ethnicity, baseline low C3 and baseline haematological activity were associated with longer time to remission for all definitions. Baseline anti-dsDNA and baseline low C4 were associated with longer time to Complete Remission and Complete ROT. Baseline low C4 was also negatively associated with Clinical Remission. Our results provide further insights into the frequency and duration of remission in SLE and call attention to the major role of baseline activity and baseline treatment in predicting remission. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Estimation of Time-Varying Pilot Model Parameters
NASA Technical Reports Server (NTRS)
Zaal, Peter M. T.; Sweet, Barbara T.
2011-01-01
Human control behavior is rarely completely stationary over time due to fatigue or loss of attention. In addition, there are many control tasks for which human operators need to adapt their control strategy to vehicle dynamics that vary in time. In previous studies on the identification of time-varying pilot control behavior wavelets were used to estimate the time-varying frequency response functions. However, the estimation of time-varying pilot model parameters was not considered. Estimating these parameters can be a valuable tool for the quantification of different aspects of human time-varying manual control. This paper presents two methods for the estimation of time-varying pilot model parameters, a two-step method using wavelets and a windowed maximum likelihood estimation method. The methods are evaluated using simulations of a closed-loop control task with time-varying pilot equalization and vehicle dynamics. Simulations are performed with and without remnant. Both methods give accurate results when no pilot remnant is present. The wavelet transform is very sensitive to measurement noise, resulting in inaccurate parameter estimates when considerable pilot remnant is present. Maximum likelihood estimation is less sensitive to pilot remnant, but cannot detect fast changes in pilot control behavior.
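A toy version of windowed estimation: recover a time-varying gain by least squares over a sliding window. This is a stand-in for the paper's windowed maximum likelihood over a full pilot model, run on synthetic noise-free data:

```python
def windowed_gain(u, y, window):
    """Sliding-window least-squares estimate of a time-varying gain k
    in y[n] = k[n] * u[n]. A toy stand-in for windowed maximum
    likelihood estimation over a full pilot model."""
    est = []
    for n in range(len(u) - window + 1):
        uu = sum(a * a for a in u[n:n + window])
        uy = sum(a * b for a, b in zip(u[n:n + window], y[n:n + window]))
        est.append(uy / uu)
    return est

# Synthetic, noise-free data: input u and a gain ramping from 1 to 2.
u = [(-1) ** n * (1 + 0.1 * n) for n in range(20)]
k = [1 + n / 19 for n in range(20)]
y = [k[n] * u[n] for n in range(20)]
estimates = windowed_gain(u, y, 5)   # tracks the ramp, smoothed by the window
```

The window length is the same trade-off the paper discusses: long windows suppress remnant noise but cannot detect fast changes in control behavior.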
Meng, Yu; Li, Gang; Gao, Yaozong; Lin, Weili; Shen, Dinggang
2016-11-01
Longitudinal neuroimaging analysis of the dynamic brain development in infants has received increasing attention recently. Many studies expect a complete longitudinal dataset in order to accurately chart the brain developmental trajectories. However, in practice, a large portion of subjects in longitudinal studies often have missing data at certain time points, due to various reasons such as the absence of scan or poor image quality. To make better use of these incomplete longitudinal data, in this paper, we propose a novel machine learning-based method to estimate the subject-specific, vertex-wise cortical morphological attributes at the missing time points in longitudinal infant studies. Specifically, we develop a customized regression forest, named dynamically assembled regression forest (DARF), as the core regression tool. DARF ensures the spatial smoothness of the estimated maps for vertex-wise cortical morphological attributes and also greatly reduces the computational cost. By employing a pairwise estimation followed by a joint refinement, our method is able to fully exploit the available information from both subjects with complete scans and subjects with missing scans for estimation of the missing cortical attribute maps. The proposed method has been applied to estimating the dynamic cortical thickness maps at missing time points in an incomplete longitudinal infant dataset, which includes 31 healthy infant subjects, each having up to five time points in the first postnatal year. The experimental results indicate that our proposed framework can accurately estimate the subject-specific vertex-wise cortical thickness maps at missing time points, with the average error less than 0.23 mm. Hum Brain Mapp 37:4129-4147, 2016. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
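The pairwise-estimation idea, stripped to its simplest form: predict a subject's missing time point from donor subjects with complete trajectories, weighted by similarity at the shared observed time points. This is an illustrative averaging scheme with hypothetical numbers, not the paper's dynamically assembled regression forest (DARF):

```python
def estimate_missing(donors, target, missing_t):
    """Predict target's value at missing_t from donors with complete
    trajectories, weighting each donor by inverse distance over the
    time points the target did attend."""
    shared = list(target)
    weights, values = [], []
    for d in donors:
        dist = sum((d[t] - target[t]) ** 2 for t in shared) ** 0.5
        w = 1.0 / (dist + 1e-6)    # small epsilon avoids division by zero
        weights.append(w)
        values.append(d[missing_t])
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical trajectories (time point -> attribute value).
donors = [{0: 1.0, 1: 2.0, 2: 3.0},
          {0: 1.1, 1: 2.1, 2: 3.1},
          {0: 5.0, 1: 6.0, 2: 7.0}]
target = {0: 1.05, 1: 2.05}   # time point 2 is missing
```

The estimate lands near the two similar donors and largely ignores the dissimilar one, which is the intuition behind exploiting complete-scan subjects to fill in incomplete ones.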
ERIC Educational Resources Information Center
Levy, Dan; Duncan, Greg J.
This study assessed the impact of family childhood income on completed years of schooling using fixed effects techniques to eliminate biases associated with omission of unmeasured family characteristics. It also examined the importance of timing of family income, estimating models that related years of completed schooling to average levels of…
Parameter Estimates in Differential Equation Models for Chemical Kinetics
ERIC Educational Resources Information Center
Winkel, Brian
2011-01-01
We discuss the need for devoting time in differential equations courses to modelling and the completion of the modelling process with efforts to estimate the parameters in the models using data. We estimate the parameters present in several differential equation models of chemical reactions of order n, where n = 0, 1, 2, and apply more general…
2006-10-01
high probability for success. Estimated Time to Complete: 31 May 2007. 4. Support and Upgrade of Armed Forces-CARES to integrate Chaplin ...Excellence (ORCEN) is to provide a small, full- time analytical capability to both the Academy and the United States Army and the Department of...complete significant research projects in this time as they usually require little train-up as they are exposed to many military and academic
Pathfinder. Volume 9, Number 2, March/April 2011
2011-03-01
vides audio, video, desktop sharing and chat.” The platform offers a real-time, Web-based presentation tool to create information and general...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or
Finite-error metrological bounds on multiparameter Hamiltonian estimation
NASA Astrophysics Data System (ADS)
Kura, Naoto; Ueda, Masahito
2018-01-01
Estimation of multiple parameters in an unknown Hamiltonian is investigated. We present upper and lower bounds on the time required to complete the estimation within a prescribed error tolerance δ. The lower bound is given on the basis of the Cramér-Rao inequality, where the quantum Fisher information is bounded by the squared evolution time. The upper bound is obtained by an explicit construction of estimation procedures. By comparing the cases with different numbers of Hamiltonian channels, we also find that the few-channel procedure with adaptive feedback and the many-channel procedure with entanglement are equivalent in the sense that they require the same amount of time resource up to a constant factor.
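For the single-parameter case, the lower-bound argument has the following schematic form (generic notation, a sketch rather than the paper's multiparameter treatment): the quantum Cramér-Rao inequality bounds the error by the quantum Fisher information F, and F is in turn bounded by the squared evolution time, giving a minimum time to reach tolerance δ:

```latex
\delta \;\ge\; \frac{1}{\sqrt{F(\theta)}},
\qquad
F(\theta) \;\le\; \frac{4t^{2}}{\hbar^{2}}\,\bigl\lVert \partial_{\theta}H \bigr\rVert^{2}
\quad\Longrightarrow\quad
t \;\ge\; \frac{\hbar}{2\,\delta\,\bigl\lVert \partial_{\theta}H \bigr\rVert}.
```

The 1/δ scaling of the required time is the point of contact with the paper's upper bound, which matches it up to a constant factor.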
2017-01-26
Naval Research Laboratory Washington, DC 20375-5320 NRL/MR/5514--17-9692 High Resolution Bathymetry Estimation Improvement with Single Image Super...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate
Hu, Yu; Chen, Yaping
2017-07-11
Vaccination coverage in Zhejiang province, east China, is evaluated through repeated coverage surveys. The Zhejiang provincial immunization information system (ZJIIS) was established in 2004 with links to all immunization clinics. ZJIIS has become an alternative to quickly assess the vaccination coverage. To assess the current completeness and accuracy on the vaccination coverage derived from ZJIIS, we compared the estimates from ZJIIS with the estimates from the most recent provincial coverage survey in 2014, which combined interview data with verified data from ZJIIS. Of the enrolled 2772 children in the 2014 provincial survey, the proportions of children with vaccination cards and registered in ZJIIS were 94.0% and 87.4%, respectively. Coverage estimates from ZJIIS were systematically higher than the corresponding estimates obtained through the survey, with a mean difference of 4.5%. Of the vaccination doses registered in ZJIIS, 16.7% differed from the date recorded in the corresponding vaccination cards. Under-registration in ZJIIS significantly influenced the coverage estimates derived from ZJIIS. Therefore, periodic coverage surveys currently provide more complete and reliable results than the estimates based on ZJIIS alone. However, further improvement of the completeness and accuracy of ZJIIS will likely allow more reliable and timely estimates in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-11
... to the form are to allow applicants to pay the transfer tax by credit or debit card, and combine... amount of time estimated for an average respondent to respond: It is estimated that 9,662 respondents will take an average of approximately 1.69 hours to complete. (6) An estimate of the total burden (in...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-11
... and local law. The changes to the form are to allow the applicant to pay the transfer tax by credit or...) An estimate of the total number of respondents and the amount of time estimated for an average respondent to respond: It is estimated that 65,085 respondents will take an average of 1.68 hours to complete...
Asynchronous machine rotor speed estimation using a tabulated numerical approach
NASA Astrophysics Data System (ADS)
Nguyen, Huu Phuc; De Miras, Jérôme; Charara, Ali; Eltabach, Mario; Bonnet, Stéphane
2017-12-01
This paper proposes a new method to estimate the rotor speed of the asynchronous machine by looking at the estimation problem as a nonlinear optimal control problem. The behavior of the nonlinear plant model is approximated off-line as a prediction map using a numerical one-step time discretization obtained from simulations. At each time-step, the speed of the induction machine is selected to satisfy the dynamic fitting problem between the plant output and the predicted output, leading the estimator to adopt the plant's dynamical behavior. Thanks to the limitation of the prediction horizon to a single time-step, the execution time of the algorithm can be completely bounded. It can thus easily be implemented and embedded into a real-time system to observe the speed of the real induction motor. Simulation results show the performance and robustness of the proposed estimator.
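The one-step prediction-map idea can be sketched as follows. The scalar first-order plant and the grid resolution are illustrative stand-ins, not the induction-machine model of the paper: predictions are tabulated over a grid of candidate speeds, and at each step the speed whose prediction best fits the measured output is selected.

```python
import numpy as np

# Toy stand-in for the plant: a first-order model whose one-step output
# depends on the unknown parameter w (playing the role of the rotor speed).
def plant_step(y, w, u, dt=0.01):
    return y + dt * (-w * y + u)

# Off-line "prediction map": a grid of candidate speeds to tabulate over.
speed_grid = np.linspace(0.0, 50.0, 501)

def estimate_speed(y_prev, u_prev, y_meas):
    """Select the candidate speed whose one-step prediction best fits y_meas."""
    preds = np.array([plant_step(y_prev, w, u_prev) for w in speed_grid])
    return speed_grid[np.argmin(np.abs(preds - y_meas))]

# Usage: generate one plant transition at an unknown true speed, then recover it.
w_true, y0, u0 = 17.3, 1.0, 0.5
y1 = plant_step(y0, w_true, u0)
w_hat = estimate_speed(y0, u0, y1)
```

Because the horizon is a single step, the per-sample work is bounded by the grid size, which is what makes a real-time implementation straightforward.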
Adaptive Training in an Unmanned Aerial Vehicle: Examination of Several Candidate Real-time Metrics
2010-01-01
for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services
Impossible Certainty: Cost Risk Analysis for Air Force Systems
2006-01-01
the estimated cost of weapon systems, which typically take many years to acquire and remain in operation for a long time. To make those estimates... times, uncertain, undefined, or unknown when estimates are prepared. New system development may involve further uncertainty due to unproven or...risk (a system requiring more money to complete than was forecasted) and operational risk (a vital capability becoming unaffordable as the program
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Alan
2011-01-01
Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.
Cross-bispectrum computation and variance estimation
NASA Technical Reports Server (NTRS)
Lii, K. S.; Helland, K. N.
1981-01-01
A method for the estimation of cross-bispectra of discrete real time series is developed. The asymptotic variance properties of the bispectrum are reviewed, and a method for the direct estimation of bispectral variance is given. The symmetry properties are described which minimize the computations necessary to obtain a complete estimate of the cross-bispectrum in the right-half-plane. A procedure is given for computing the cross-bispectrum by subdividing the domain into rectangular averaging regions which help reduce the variance of the estimates and allow easy application of the symmetry relationships to minimize the computational effort. As an example of the procedure, the cross-bispectrum of a numerically generated, exponentially distributed time series is computed and compared with theory.
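A minimal direct estimator of the kind described, assuming segment averaging over rectangular blocks with no tapering or variance smoothing; the symmetry check in the usage illustrates one of the relations used to restrict computation to a reduced region.

```python
import numpy as np

def cross_bispectrum(x, y, z, nseg=8):
    """Direct segment-averaged cross-bispectrum estimate B(f1, f2).

    Splits the series into nseg blocks, FFTs each block, and averages
    the triple product X(f1) * Y(f2) * conj(Z(f1 + f2)) over blocks;
    averaging over segments reduces the variance of the estimate.
    Minimal sketch: no tapering, detrending, or symmetry-region folding.
    """
    n = len(x) // nseg
    idx = np.add.outer(np.arange(n), np.arange(n)) % n   # f1 + f2 (mod n)
    B = np.zeros((n, n), dtype=complex)
    for k in range(nseg):
        s = slice(k * n, (k + 1) * n)
        X, Y, Z = np.fft.fft(x[s]), np.fft.fft(y[s]), np.fft.fft(z[s])
        B += np.outer(X, Y) * np.conj(Z[idx])
    return B / nseg

# Usage: the auto-bispectrum (x = y = z) of a real series is symmetric
# in (f1, f2), one of the symmetry relations mentioned in the abstract.
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
B = cross_bispectrum(x, x, x)
```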
Regression analysis of longitudinal data with correlated censoring and observation times.
Li, Yang; He, Xin; Wang, Haiying; Sun, Jianguo
2016-07-01
Longitudinal data occur in many fields, such as medical follow-up studies that involve repeated measurements. For their analysis, most existing approaches assume that the observation or follow-up times are independent of the response process, either completely or given some covariates. In practice, it is apparent that this may not be true. In this paper, we present a joint analysis approach that allows for possible mutual correlations, which can be characterized by time-dependent random effects. Estimating equations are developed for the parameter estimation and the resulting estimators are shown to be consistent and asymptotically normal. The finite sample performance of the proposed estimators is assessed through a simulation study and an illustrative example from a skin cancer study is provided.
Gap filling strategies and error in estimating annual soil respiration
USDA-ARS?s Scientific Manuscript database
Soil respiration (Rsoil) is one of the largest CO2 fluxes in the global carbon (C) cycle. Estimation of annual Rsoil requires extrapolation of survey measurements or gap-filling of automated records to produce a complete time series. While many gap-filling methodologies have been employed, there is ...
Characteristics of people living in Italy after a cancer diagnosis in 2010 and projections to 2020.
Guzzinati, Stefano; Virdone, Saverio; De Angelis, Roberta; Panato, Chiara; Buzzoni, Carlotta; Capocaccia, Riccardo; Francisci, Silvia; Gigli, Anna; Zorzi, Manuel; Tagliabue, Giovanna; Serraino, Diego; Falcini, Fabio; Casella, Claudia; Russo, Antonio Giampiero; Stracci, Fabrizio; Caruso, Bianca; Michiara, Maria; Caiazzo, Anna Luisa; Castaing, Marine; Ferretti, Stefano; Mangone, Lucia; Rudisi, Giuseppa; Sensi, Flavio; Mazzoleni, Guido; Pannozzo, Fabio; Tumino, Rosario; Fusco, Mario; Ricci, Paolo; Gola, Gemma; Giacomin, Adriano; Tisano, Francesco; Candela, Giuseppa; Fanetti, Anna Clara; Pala, Filomena; Sardo, Antonella Sutera; Rugge, Massimo; Botta, Laura; Maso, Luigino Dal
2018-02-09
Estimates of cancer prevalence are widely based on limited duration, often including patients living after a cancer diagnosis made in the previous 5 years and less frequently on complete prevalence (i.e., including all patients regardless of the time elapsed since diagnosis). This study aims to provide estimates of complete cancer prevalence in Italy by sex, age, and time since diagnosis for all cancers combined, and for selected cancer types. Projections were made up to 2020, overall and by time since diagnosis. Data were from 27 Italian population-based cancer registries, covering 32% of the Italian population, able to provide at least 7 years of registration as of December 2009 and follow-up of vital status as of December 2013. The data were used to compute the limited-duration prevalence, in order to estimate the complete prevalence by means of the COMPREV software. In 2010, 2,637,975 persons were estimated to live in Italy after a cancer diagnosis, 1.2 million men and 1.4 million women, or 4.6% of the Italian population. A quarter of male prevalent cases had prostate cancer (n = 305,044), while 42% of prevalent women had breast cancer (n = 604,841). More than 1.5 million people (2.7% of Italians) were alive since 5 or more years after diagnosis and 20% since ≥15 years. It is projected that, in 2020 in Italy, there will be 3.6 million prevalent cancer cases (+ 37% vs 2010). The largest 10-year increases are foreseen for prostate (+ 85%) and for thyroid cancers (+ 79%), and for long-term survivors diagnosed since 20 or more years (+ 45%). Among the population aged ≥75 years, 22% will have had a previous cancer diagnosis. The number of persons living after a cancer diagnosis is estimated to rise by approximately 3% per year in Italy.
The availability of detailed estimates and projections of the complete prevalence is intended to help implement guidelines aimed at enhancing the long-term follow-up of cancer survivors and at addressing their rehabilitation needs.
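The quoted growth figures can be checked directly from the reported totals (2,637,975 prevalent cases in 2010, 3.6 million projected for 2020):

```python
# Checking the abstract's growth figures against its reported totals.
prev_2010 = 2_637_975
prev_2020 = 3_600_000   # projected

total_growth = prev_2020 / prev_2010 - 1               # the "+37% vs 2010"
annual_rate = (prev_2020 / prev_2010) ** (1 / 10) - 1  # compound annual rate, ~3%
```

The compound annual rate works out to about 3.2% per year, consistent with the "approximately 3% per year" stated in the abstract.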
Duchêne, Sebastián; Archer, Frederick I.; Vilstrup, Julia; Caballero, Susana; Morin, Phillip A.
2011-01-01
The availability of mitochondrial genome sequences is growing as a result of recent technological advances in molecular biology. In phylogenetic analyses, the complete mitogenome is increasingly becoming the marker of choice, usually providing better phylogenetic resolution and precision relative to traditional markers such as cytochrome b (CYTB) and the control region (CR). In some cases, the differences in phylogenetic estimates between mitogenomic and single-gene markers have yielded incongruent conclusions. By comparing phylogenetic estimates made from different genes, we identified the most informative mitochondrial regions and evaluated the minimum amount of data necessary to reproduce the same results as the mitogenome. We compared results among individual genes and the mitogenome for recently published complete mitogenome datasets of selected delphinids (Delphinidae) and killer whales (genus Orcinus). Using Bayesian phylogenetic methods, we investigated differences in estimation of topologies, divergence dates, and clock-like behavior among genes for both datasets. Although the most informative regions were not the same for each taxonomic group (COX1, CYTB, ND3 and ATP6 for Orcinus, and ND1, COX1 and ND4 for Delphinidae), in both cases they were equivalent to less than a quarter of the complete mitogenome. This suggests that gene information content can vary among groups, but can be adequately represented by a portion of the complete sequence. Although our results indicate that complete mitogenomes provide the highest phylogenetic resolution and most precise date estimates, a minimum amount of data can be selected using our approach when the complete sequence is unavailable. Studies based on single genes can benefit from the addition of a few more mitochondrial markers, producing topologies and date estimates similar to those obtained using the entire mitogenome. PMID:22073275
NASA Astrophysics Data System (ADS)
Strano, Salvatore; Terzo, Mario
2018-05-01
The dynamics of railway vehicles is strongly influenced by the interaction between the wheel and the rail. This kind of contact is affected by several conditioning factors such as vehicle speed, wear and adhesion level and, moreover, it is nonlinear. As a consequence, the modelling and observation of this kind of phenomenon are complex tasks but, at the same time, they constitute a fundamental step for the estimation of the adhesion level or for vehicle condition monitoring. This paper presents a novel technique for the real-time estimation of the wheel-rail contact forces, based on an estimator design model that takes into account the nonlinearities of the interaction by means of a fitting model designed to reproduce the contact mechanics over a wide range of slip and to be easily integrated into a complete model-based estimator for railway vehicles.
Stone, Anne C; Battistuzzi, Fabia U; Kubatko, Laura S; Perry, George H; Trudeau, Evan; Lin, Hsiuman; Kumar, Sudhir
2010-10-27
Here, we report the sequencing and analysis of eight complete mitochondrial genomes of chimpanzees (Pan troglodytes) from each of the three established subspecies (P. t. troglodytes, P. t. schweinfurthii and P. t. verus) and the proposed fourth subspecies (P. t. ellioti). Our population genetic analyses are consistent with neutral patterns of evolution that have been shaped by demography. The high levels of mtDNA diversity in western chimpanzees are unlike those seen at nuclear loci, which may reflect a demographic history of greater female to male effective population sizes possibly owing to the characteristics of the founding population. By using relaxed-clock methods, we have inferred a timetree of chimpanzee species and subspecies. The absolute divergence times vary based on the methods and calibration used, but relative divergence times show extensive uniformity. Overall, mtDNA produces consistently older times than those known from nuclear markers, a discrepancy that is reduced significantly by explicitly accounting for chimpanzee population structures in time estimation. Assuming the human-chimpanzee split to be between 7 and 5 Ma, chimpanzee time estimates are 2.1-1.5, 1.1-0.76 and 0.25-0.18 Ma for the chimpanzee/bonobo, western/(eastern + central) and eastern/central chimpanzee divergences, respectively.
Sheng, Li; Wang, Zidong; Zou, Lei; Alsaadi, Fuad E
2017-10-01
In this paper, the event-based finite-horizon H∞ state estimation problem is investigated for a class of discrete time-varying stochastic dynamical networks with state- and disturbance-dependent noises [also called (x,v)-dependent noises]. An event-triggered scheme is proposed to decrease the frequency of the data transmission between the sensors and the estimator, where the signal is transmitted only when certain conditions are satisfied. The purpose of the problem addressed is to design a time-varying state estimator in order to estimate the network states through available output measurements. By employing the completing-the-square technique and the stochastic analysis approach, sufficient conditions are established to ensure that the error dynamics of the state estimation satisfies a prescribed H∞ performance constraint over a finite horizon. The desired estimator parameters can be designed via solving coupled backward recursive Riccati difference equations. Finally, a numerical example is exploited to demonstrate the effectiveness of the developed state estimation scheme.
Pointwise nonparametric maximum likelihood estimator of stochastically ordered survivor functions
Park, Yongseok; Taylor, Jeremy M. G.; Kalbfleisch, John D.
2012-01-01
In this paper, we consider estimation of survivor functions from groups of observations with right-censored data when the groups are subject to a stochastic ordering constraint. Many methods and algorithms have been proposed to estimate distribution functions under such restrictions, but none have completely satisfactory properties when the observations are censored. We propose a pointwise constrained nonparametric maximum likelihood estimator, which is defined at each time t by the estimates of the survivor functions subject to constraints applied at time t only. We also propose an efficient method to obtain the estimator. The estimator of each constrained survivor function is shown to be nonincreasing in t, and its consistency and asymptotic distribution are established. A simulation study suggests better small and large sample properties than for alternative estimators. An example using prostate cancer data illustrates the method. PMID:23843661
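A sketch of the pointwise viewpoint in an assumed two-group setting. The ordering step below is a crude pointwise swap, NOT the constrained nonparametric maximum likelihood estimator of the paper (which re-solves a constrained likelihood at each t); it only illustrates how a constraint applied at each time point separately can be framed.

```python
import numpy as np

def kaplan_meier(times, events, grid):
    """Right-censored Kaplan-Meier survival curve evaluated on a time grid."""
    order = np.argsort(times)
    t = np.asarray(times, float)[order]
    d = np.asarray(events, int)[order]
    n = len(t)
    surv, step_t, step_s = 1.0, [0.0], [1.0]
    for i in range(n):
        if d[i]:                            # event (1), not censoring (0)
            surv *= 1.0 - 1.0 / (n - i)     # n - i subjects still at risk
            step_t.append(t[i])
            step_s.append(surv)
    idx = np.searchsorted(step_t, grid, side="right") - 1
    return np.asarray(step_s)[idx]          # right-continuous step function

# Crude pointwise enforcement of the ordering S1(t) >= S2(t):
# at each t, swap the two estimates wherever they violate the constraint.
def enforce_order(s1, s2):
    return np.maximum(s1, s2), np.minimum(s1, s2)
```

The resulting curves satisfy the stochastic ordering at every grid point, though the paper's estimator has better likelihood-based properties.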
NASA Technical Reports Server (NTRS)
Deutschmann, Julie; Bar-Itzhack, Itzhack
1997-01-01
Traditionally satellite attitude and trajectory have been estimated with completely separate systems, using different measurement data. The estimation of both trajectory and attitude for low earth orbit satellites has been successfully demonstrated in ground software using magnetometer and gyroscope data. Since the earth's magnetic field is a function of time and position, and since time is known quite precisely, the differences between the computed and measured magnetic field components, as measured by the magnetometers throughout the entire spacecraft orbit, are a function of both the spacecraft trajectory and attitude errors. Therefore, these errors can be used to estimate both trajectory and attitude. This work further tests the single augmented Extended Kalman Filter (EKF) which simultaneously and autonomously estimates spacecraft trajectory and attitude with data from the Rossi X-Ray Timing Explorer (RXTE) magnetometer and gyro-measured body rates. In addition, gyro biases are added to the state and the filter's ability to estimate them is presented.
Glossary Defense Acquisition Acronyms and Terms
1991-09-01
of work to complete a job or part of a project. Actual Cost A cost sustained in fact, on the basis of costs incurred, as... of a project which shows the activities to be completed and the time to complete them is represented by horizontal lines drawn in proportion to the ...recorded for the total estimated obligations for a program or project in the initial year of funding. (For distinction, see Full
Data transfer using complete bipartite graph
NASA Astrophysics Data System (ADS)
Chandrasekaran, V. M.; Praba, B.; Manimaran, A.; Kailash, G.
2017-11-01
Data transfer rate is an estimate of the amount of information sent between two points in a network in a given time period, and it is a highly significant quantity in the present world. There are many ways of passing messages; some of them work through encryption and decryption using a complete bipartite graph. In this paper, we recommend a method for communicating messages through encryption based on a complete bipartite graph.
Fung, Janice Wing Mei; Lim, Sandra Bee Lay; Zheng, Huili; Ho, William Ying Tat; Lee, Bee Guat; Chow, Khuan Yew; Lee, Hin Peng
2016-08-01
To provide a comprehensive evaluation of the quality of the data at the Singapore Cancer Registry (SCR), quantitative and semi-quantitative methods were used to assess the comparability, completeness, accuracy and timeliness of data for the period of 1968-2013, with focus on the period 2008-2012. The SCR coding and classification systems follow international standards. The overall completeness was estimated at 98.1% using the flow method and 97.5% using the capture-recapture method, for the period of 2008-2012. For the same period, 91.9% of the cases were morphologically verified (site-specific range: 40.4-100%) with 1.1% DCO cases. The under-reporting in 2011 and 2012 due to timely publication was estimated at 0.03% and 0.51%, respectively. This review shows that the processes in place at the SCR yield data that are internationally comparable, relatively complete, valid, and timely, allowing for greater confidence in the use of quality data in the areas of cancer prevention, treatment and control.
A statistical analysis of flank eruptions on Etna volcano
NASA Astrophysics Data System (ADS)
Mulargia, Francesco; Tinti, Stefano; Boschi, Enzo
1985-02-01
A singularly complete record exists for the eruptive activity of Etna volcano. The time series of occurrence of flank eruptions in the period 1600-1980, in which the record is presumably complete, is found to follow a stationary Poisson process. A revision of the available data shows that eruption durations are rather well correlated with the estimates of the volume of lava flows. This implies that the magnitude of an eruption can be defined directly by its duration. Extreme value statistics are then applied to the time series, using duration as a dependent variable. The probability of occurrence of a very long (300 days) eruption is greater than 50% only in time intervals of the order of 50 years. The correlation found between duration and total output also allows estimation of the probability of occurrence of a major event which exceeds a given duration and total flow of lava. The composite probabilities do not differ considerably from the pure ones. Paralleling a well established application to seismic events, extreme value theory can be profitably used in volcanic risk estimates, provided that appropriate account is also taken of all other variables.
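The kind of probability quoted above can be sketched with a thinned-Poisson argument: eruptions whose duration exceeds d themselves form a Poisson process of rate λ·P(D ≥ d). The rate and mean duration below are assumed illustrative values, not the fitted Etna parameters.

```python
import math

# Hypothetical illustration of the thinning argument: flank eruptions as a
# Poisson process (rate lam per year) with an exponential duration tail.
lam = 0.6         # assumed eruptions per year (illustrative)
mean_dur = 80.0   # assumed mean eruption duration in days (illustrative)

def p_long_eruption(d_days, T_years):
    q = math.exp(-d_days / mean_dur)           # P(duration >= d) under the assumption
    return 1.0 - math.exp(-lam * q * T_years)  # thinned-Poisson exceedance probability
```

With these toy parameters, a 300-day eruption reaches roughly even odds only over a window of about 50 years, the same order of magnitude as the abstract's statement.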
Testing for Seed Quality in Southern Oaks
F.T. Bonner
1984-01-01
Expressions of germination rate, such as peak value (PV) or mean germination time (MGT), provide good estimates of acorn quality, but test completion requires a minimum of 3 weeks. For more rapid estimates, tetrazolium staining is recommended. Some seed test results were significantly correlated with nursery germination of cherrybark and water oaks, but not with...
2017-01-01
The State Energy Data System (SEDS) is the U.S. Energy Information Administration's (EIA) source for comprehensive state energy statistics. Included are estimates of energy production, consumption, prices, and expenditures broken down by energy source and sector. Production and consumption estimates begin with the year 1960 while price and expenditure estimates begin with 1970. The multidimensional completeness of SEDS allows users to make comparisons across states, energy sources, sectors, and over time.
Development of Parallel Architectures for Sensor Array Processing. Volume 1
1993-08-01
required for the DOA estimation [1-7]. The Multiple Signal Classification (MUSIC) [1] and the Estimation of Signal Parameters by Rotational...manifold and the estimated subspace. Although MUSIC is a high resolution algorithm, it has several drawbacks including the fact that complete knowledge of...thoroughly, the MUSIC algorithm was selected to develop special purpose hardware for real time computation. A summary of the MUSIC algorithm is as follows
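A minimal sketch of the MUSIC pseudospectrum for a uniform linear array, using a toy single-source ideal covariance; the array geometry, spacing, and noise floor are assumptions for illustration, not the report's hardware configuration.

```python
import numpy as np

def music_spectrum(R, n_sources, angles_deg, spacing=0.5):
    """MUSIC pseudospectrum for a uniform linear array.

    R: sensor covariance matrix (m x m); spacing in wavelengths.
    Peaks over angles_deg indicate estimated directions of arrival.
    """
    m = R.shape[0]
    _, eigvecs = np.linalg.eigh(R)        # eigenvalues in ascending order
    En = eigvecs[:, : m - n_sources]      # noise-subspace eigenvectors
    spec = np.empty(len(angles_deg))
    for i, th in enumerate(np.deg2rad(angles_deg)):
        a = np.exp(-2j * np.pi * spacing * np.arange(m) * np.sin(th))
        denom = np.real(a.conj() @ En @ En.conj().T @ a)
        spec[i] = 1.0 / (denom + 1e-12)   # guard against exact zeros at true DOA
    return spec

# Usage (toy): a single source at 20 degrees on an 8-sensor array,
# using the ideal covariance rather than sample data.
m = 8
a_true = np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(np.deg2rad(20.0)))
R = np.outer(a_true, a_true.conj()) + 0.01 * np.eye(m)  # signal + noise floor
grid = np.arange(-90, 91)
doa = grid[np.argmax(music_spectrum(R, 1, grid))]
```

The eigendecomposition dominates the cost, which is one reason the report targets special purpose hardware for real-time use.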
Fine tuning GPS clock estimation in the MCS
NASA Technical Reports Server (NTRS)
Hutsell, Steven T.
1995-01-01
With the completion of a 24-satellite operational constellation, GPS is fast approaching the critical milestone of Full Operational Capability (FOC). Although GPS is well capable of providing the timing accuracy and stability figures required by system specifications, the GPS community will continue to strive for further improvements in performance. The GPS Master Control Station (MCS) recently demonstrated that such timing improvements are possible by fine tuning its composite clock and, hence, Kalman Filter state estimation, providing a small improvement to user accuracy.
Pennington, Audrey Flak; Strickland, Matthew J.; Klein, Mitchel; Zhai, Xinxin; Russell, Armistead G.; Hansen, Craig; Darrow, Lyndsey A.
2018-01-01
Prenatal air pollution exposure is frequently estimated using maternal residential location at the time of delivery as a proxy for residence during pregnancy. We describe residential mobility during pregnancy among 19,951 children from the Kaiser Air Pollution and Pediatric Asthma Study, quantify measurement error in spatially-resolved estimates of prenatal exposure to mobile source fine particulate matter (PM2.5) due to ignoring this mobility, and simulate the impact of this error on estimates of epidemiologic associations. Two exposure estimates were compared, one calculated using complete residential histories during pregnancy (weighted average based on time spent at each address) and the second calculated using only residence at birth. Estimates were computed using annual averages of primary PM2.5 from traffic emissions modeled using a research line-source dispersion model (RLINE) at 250 meter resolution. In this cohort, 18.6% of children were born to mothers who moved at least once during pregnancy. Mobile source PM2.5 exposure estimates calculated using complete residential histories during pregnancy and only residence at birth were highly correlated (rS>0.9). Simulations indicated that ignoring residential mobility resulted in modest bias of epidemiologic associations toward the null, but varied by maternal characteristics and prenatal exposure windows of interest (ranging from −2% to −10% bias). PMID:27966666
NASA Astrophysics Data System (ADS)
Li, Lu; Narayanan, Ramakrishnan; Miller, Steve; Shen, Feimo; Barqawi, Al B.; Crawford, E. David; Suri, Jasjit S.
2008-02-01
Real-time knowledge of the capsule volume of an organ provides a valuable clinical tool for 3D biopsy applications. It is challenging to estimate this capsule volume in real time due to the presence of speckles, shadow artifacts, partial volume effects and patient motion during image scans, which are all inherent in medical ultrasound imaging. The volumetric ultrasound prostate images are sliced in a rotational manner every three degrees. The automated segmentation method employs a shape model, obtained from training data, to delineate the middle slices of the volumetric prostate images. Then a "DDC" algorithm is applied to the rest of the images with the initial contour obtained. The volume of the prostate is estimated from the segmentation results. Our database consists of 36 prostate volumes acquired on a Philips ultrasound machine using a side-fire transrectal ultrasound (TRUS) probe. We compare our automated method with the semi-automated approach. The mean volumes using the semi-automated and completely automated techniques were 35.16 cc and 34.86 cc, with errors of 7.3% and 7.6%, respectively, compared to the volume obtained from the human-estimated boundary (ideal boundary). The overall system, which was developed using Microsoft Visual C++, is real-time and accurate.
Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.
Goldhaber, M K; Staub, S L; Tokuhata, G K
1983-07-01
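The actuarial life-table calculation behind such incidence estimates can be sketched as follows. This is a generic textbook version, not the authors' code, and the interval data in the test are invented for illustration; withdrawals (losses to follow-up) are conventionally counted as at risk for half an interval.

```python
def life_table_incidence(intervals):
    """Cumulative incidence of an event by the actuarial life-table method.

    intervals: list of (events, withdrawals, n_at_start) tuples, one per
    follow-up interval (e.g. gestational week). Withdrawals are counted
    as at risk for half the interval.
    """
    surviving = 1.0
    for events, withdrawals, n_at_start in intervals:
        at_risk = n_at_start - withdrawals / 2.0
        q = events / at_risk              # conditional probability of the event
        surviving *= (1.0 - q)
    return 1.0 - surviving
```

The cumulative incidence is one minus the product of the per-interval "survival" probabilities, which is what allows incomplete follow-up to be handled without bias.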
75 FR 65039 - Submission for Review: Program Services Evaluation Surveys, OMB Control No. 3206-NEW
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-21
... will be completed in the next 3 years. The time estimate varies from 1 minute to 40 minutes to complete... 75 FR 35092 allowing for a 60-day public comment period. No comments were received for this... Office of Management and Budget is particularly interested in comments that: 1. Evaluate whether the...
Kendall, W.L.; Nichols, J.D.; North, P.M.; Nichols, J.D.
1995-01-01
The use of the Cormack-Jolly-Seber model under a standard sampling scheme of one sample per time period, when the Jolly-Seber assumption that all emigration is permanent does not hold, leads to the confounding of temporary emigration probabilities with capture probabilities. This biases the estimates of capture probability when temporary emigration is a completely random process, and both capture and survival probabilities when there is a temporary trap response in temporary emigration, or it is Markovian. The use of secondary capture samples over a shorter interval within each period, during which the population is assumed to be closed (Pollock's robust design), provides a second source of information on capture probabilities. This solves the confounding problem, and thus temporary emigration probabilities can be estimated. This process can be accomplished in an ad hoc fashion for completely random temporary emigration and to some extent in the temporary trap response case, but modelling the complete sampling process provides more flexibility and permits direct estimation of variances. For the case of Markovian temporary emigration, a full likelihood is required.
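The ad hoc estimator for the completely random case can be sketched as below. This is a simplified reading of the approach, labeled as an assumption: under the robust design, the open-population (CJS) model estimates an apparent capture probability (1 − γ)·p, while closed-population models fitted to the secondary samples estimate p itself, so γ can be solved for directly. The inputs are illustrative point estimates, not values from any real dataset.

```python
def random_temporary_emigration(p_open, p_closed):
    """Ad hoc estimate of completely random temporary emigration (gamma).

    p_open: apparent capture probability from the open (CJS) model,
            which estimates (1 - gamma) * p.
    p_closed: capture probability from closed models on the secondary
              samples, which estimates p itself.
    """
    return 1.0 - p_open / p_closed
```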
Political Revolution And Social Communication Technologies
2017-12-01
this collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data...sources, gathering and maintaining the data needed, and completing and reviewing the collection of information . Send comments regarding this burden...estimate or any other aspect of this collection of information , including suggestions for reducing this burden, to Washington headquarters Services
2013-12-01
PROPERTY APPRAISAL PROVIDES CONTROL, INSURANCE BASIS, AND VALUE ESTIMATE.
ERIC Educational Resources Information Center
THOMSON, JACK
A COMPLETE PROPERTY APPRAISAL SERVES AS A BASIS FOR CONTROL, INSURANCE AND VALUE ESTIMATE. A PROFESSIONAL APPRAISAL FIRM SHOULD PERFORM THIS FUNCTION BECAUSE (1) IT IS FAMILIAR WITH PROPER METHODS, (2) IT CAN PREPARE THE REPORT WITH MINIMUM CONFUSION AND INTERRUPTION OF THE COLLEGE OPERATION, (3) USE OF ITS PRICING LIBRARY REDUCES TIME NEEDED AND…
Perceived Difficulty of a Motor-Skill Task as a Function of Training.
ERIC Educational Resources Information Center
Bratfisch, Oswald; And Others
A simple device called a "wire labyrinth" was used in an experiment involving learning of a two-hand motor task. The Ss were asked, after completing each of 7 successive trials, to give their estimates of perceived (subjective) difficulty of the task. For this purpose, the psychophysical method of magnitude estimation was used. Time was…
ERIC Educational Resources Information Center
Storek, Josephine; Furnham, Adrian
2013-01-01
Over 120 participants completed three timed intelligence tests, a self-estimated Domain Masculine Intelligence (DMIQ) scale, and a mindset "beliefs about intelligence" measure (Dweck, 2012) to examine correlates of the Hubris-Humility Effect (HHE), which shows that males believe they are more intelligent than females. As predicted, males gave…
Special Operations Forces Interagency Counterterrorism Reference Manual
2009-03-01
...Presidential Review Directives and Presidential Decision Directives (Clinton administration) and National Security Study Directives and National
USDA-ARS?s Scientific Manuscript database
The scale mismatch between remotely sensed observations and the state variables simulated by crop growth models decreases the reliability of crop yield estimates. To overcome this problem, we used a two-phase data assimilation approach: first, we generated a complete leaf area index (LAI) time series by combin...
Sasebo, A Case Study in Optimizing Official Vehicles
2012-12-01
...that were estimated to reduce federal budget deficits by a total of at least $2.1 trillion over the 2012–2021 period…At least another $1.2 trillion
Estimation of modal parameters using bilinear joint time frequency distributions
NASA Astrophysics Data System (ADS)
Roshan-Ghias, A.; Shamsollahi, M. B.; Mobed, M.; Behzad, M.
2007-07-01
In this paper, a new method is proposed for modal parameter estimation using time-frequency representations. The smoothed pseudo Wigner-Ville distribution, a member of Cohen's class of distributions, is used to decouple vibration modes completely so that each mode can be studied separately. This distribution reduces the cross-terms that are troublesome in the Wigner-Ville distribution while retaining its resolution. The method was applied to highly damped systems, and the results were superior to those obtained with other conventional methods.
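For illustration, a plain discrete Wigner-Ville distribution (without the time and frequency smoothing windows that make it the SPWVD used in the paper) can be computed as below. It is a minimal sketch: because the kernel uses lag 2m, a tone at normalized frequency f concentrates at FFT bin 2·f·N, and the cross-terms the paper's smoothing suppresses are still present here.

```python
import numpy as np

def wigner_ville(x):
    """Plain discrete Wigner-Ville distribution of a complex analytic
    signal x; returns an (N_freq, N_time) array. Since the kernel is
    x[n+m] * conj(x[n-m]) (lag 2m), a tone at normalized frequency f
    peaks at FFT bin round(2 * f * N)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        L = min(n, N - 1 - n)                 # largest symmetric lag at time n
        r = np.zeros(N, dtype=complex)
        for m in range(-L, L + 1):
            r[m % N] = x[n + m] * np.conj(x[n - m])
        W[:, n] = np.fft.fft(r).real          # FFT over the lag variable
    return W
```

For a modal-analysis signal (a sum of decaying sinusoids), each mode shows up as a ridge whose decay along the time axis reflects its damping.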
Estimation of duration and mental workload at differing times of day by males and females
NASA Technical Reports Server (NTRS)
Hancock, P. A.; Rodenburg, G. J.; Mathews, W. D.; Vercruyssen, M.
1988-01-01
Two experiments are reported which investigated whether male and female operator duration estimation and subjective workload followed conventional circadian fluctuation. In the first experiment, twenty-four subjects performed a filled time-estimation task in a constant blacked-out, noise-reduced environment at 0800, 1200, 1600, and 2000 h. In the second experiment, twelve subjects performed an unfilled time-estimation task in similar conditions at 0900, 1400, and 1900 h. At the termination of all experimental sessions, participants completed the NASA TLX workload assessment questionnaire as a measure of perceived mental workload. Results indicated that while physiological response followed an expected pattern, estimations of duration and subjective perception of workload showed no significant effects for time-of-day. In each of the experiments, however, there were significant differences in durational estimates and mental workload response depending upon the gender of the participant. Results are taken to support the assertion that subjective workload is responsive largely to task-related factors and indicate the important differences that may be expected due to operator gender.
Reirradiation of tumors in cats and dogs.
Turrel, J M; Théon, A P
1988-08-15
Fifty-one cats and dogs with tumor recurrence after irradiation were treated with a second course of radiotherapy, using either teletherapy or brachytherapy. Eighty-six percent of the tumors had partial or complete response at 2 months after reirradiation. Tumor response was significantly (P = 0.041) affected when the interval between the 2 courses of irradiation was greater than 5 months. The estimated local tumor control rate was 38% at 1 year after reirradiation. Of all the factors examined, complete response at 2 months, reirradiation field size less than or equal to 10 cm2, and reirradiation dose greater than 40 gray emerged as predictors of local tumor control. The estimated overall survival rate was 47% at 2 years. Tumor location had a significant (P = 0.001) influence on overall survival; animals with cutaneous tumors had the longest survival times, and those with oral tumors had the shortest survival times. The other significant (P = 0.001) factor affecting overall survival time was the field size of the reirradiated site. The estimated survival rate was 41% at 1 year after reirradiation. Favorable prognostic indicators were complete response at 2 months and location of tumor; animals with skin tumors had a favorable prognosis. The acute effects of reirradiation on normal tissues were acceptable, but 12% of the animals had severe delayed complications. Significant risk of complications after reirradiation was associated with squamous cell carcinoma (P = 0.015) and reirradiated field size greater than 30 cm2 (P = 0.056). When the interval between irradiations was greater than 5 months, the risk of complications was significantly (P = 0.022) lower.(ABSTRACT TRUNCATED AT 250 WORDS)
Humphries, Laura S; Shenaq, Deana S; Teven, Chad M; Park, Julie E; Song, David H
2018-01-01
We hypothesized that reusable, on-site specialty instrument trays available to plastic surgery residents in the emergency department (ED) for bedside procedures are more cost-effective than disposable on-site kits and remote reusable operating room (OR) instruments at our institution. We completed a cost-effectiveness analysis comparing the use of disposable on-site kits and remote OR trays with a hypothetical, custom, reusable tray for ED procedures completed by plastic surgery residents. Material costs of existing OR trays were used to estimate the purchasing and use-cost of a custom on-site tray for the same procedures. The 'consult time' cost per procedure was estimated from procedure time and resident salary. Sixteen bedside procedures were completed over a 4.5-month period. A mean of 2.14 disposable kits was used per procedure. Mean consultation time was 1.66 hours. Procedures that used OR trays took three times as long as procedures that used on-site kits (4 vs. 1.1 hours). Necessary additional instruments were unavailable for 75% of procedures. The mean cost of using disposable kits and OR trays was $115.03/procedure versus an estimated $26.67/procedure for a custom tray, yielding savings of $88.36/procedure. The purchase of a single custom tray ($1,421.55) would be recouped after 2.3 weeks at 1 procedure/day. Purchasing 4 trays has projected annual cost-savings of $26,565.20. The purchase of specialized procedure trays will yield valuable time and cost savings while providing quality patient care. Improving time efficiency will help achieve the Accreditation Council for Graduate Medical Education (ACGME) goals of maintaining resident well-being and developing quality-improvement competency.
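The break-even arithmetic in this abstract is internally consistent and can be verified from the quoted figures alone:

```python
# Figures quoted in the abstract (a consistency check, not new data).
tray_cost = 1421.55                       # one custom reusable tray ($)
per_procedure_savings = 115.03 - 26.67    # disposable/OR cost minus custom-tray cost

break_even_procedures = tray_cost / per_procedure_savings   # ~16.1 procedures
break_even_weeks = break_even_procedures / 7                # ~2.3 weeks at 1/day

# Annual projection for four trays at one procedure per day:
annual_savings = per_procedure_savings * 365 - 4 * tray_cost  # ~$26,565.20
```

The quoted $26,565.20 annual figure corresponds exactly to 365 procedures/year of per-procedure savings minus the purchase price of four trays.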
49 CFR 571.126 - Standard No. 126; Electronic stability control systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
... series uses counterclockwise steering, and the other series uses clockwise steering. The maximum time... rate and to estimate its side slip or side slip derivative with respect to time; (4) That has a means... after completion of the sine with dwell steering input (time T0 + 1 in Figure 1) must not exceed 35...
Edouard, Pascal; Junge, Astrid; Kiss-Polauf, Marianna; Ramirez, Christophe; Sousa, Monica; Timpka, Toomas; Branco, Pedro
2018-03-01
The quality of epidemiological injury data depends on the reliability of reporting to an injury surveillance system. Ascertaining whether all physicians/physiotherapists report the same information for the same injury case is of major interest in determining data validity. The aim of this study was therefore to analyse data-collection reliability through analysis of inter-rater reliability. Cross-sectional survey. During the 2016 European Athletics Advanced Athletics Medicine Course in Amsterdam, all national medical teams were asked to complete seven virtual case reports on a standardised injury report form using the same definitions and classifications of injuries as the international athletics championships injury surveillance protocol. The completeness of data and the Fleiss' kappa coefficients for inter-rater reliability were calculated for: sex, age, event, circumstance, location, type, assumed cause and estimated time-loss. Forty-one team physicians and physiotherapists of national medical teams participated in the study (response rate 89.1%). Data completeness was 96.9%. The Fleiss' kappa coefficients were almost perfect for sex (k=1), injury location (k=0.991), event (k=0.953), circumstance (k=0.942) and age (k=0.870); moderate for type (k=0.507); fair for assumed cause (k=0.394); and poor for estimated time-loss (k=0.155). The injury surveillance system used during international athletics championships provided reliable data for "sex", "location", "event", "circumstance", and "age". More caution should be taken for "assumed cause" and "type", and even more for "estimated time-loss". This injury surveillance system displays satisfactory data quality (reliable data and high data completeness) and can thus be recommended as a tool for collecting epidemiological information on injuries during international athletics championships. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
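Fleiss' kappa, the agreement statistic used for those coefficients, can be computed as in this sketch. This is a standard textbook implementation with toy data, not the study's ratings: per-item agreement is averaged and corrected for the agreement expected by chance from the marginal category proportions.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for inter-rater agreement.

    counts: (n_items, n_categories) array; counts[i, j] is the number of
    the n raters who assigned item i to category j (same n every item).
    """
    counts = np.asarray(counts, dtype=float)
    N, _ = counts.shape
    n = counts[0].sum()                        # raters per item
    p_j = counts.sum(axis=0) / (N * n)         # marginal category proportions
    P_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))  # per-item agreement
    P_bar, P_e = P_i.mean(), np.sum(p_j**2)    # observed vs chance agreement
    return (P_bar - P_e) / (1 - P_e)
```

Perfect agreement yields kappa = 1; values can be negative when raters agree less often than chance would predict.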
A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.
Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R
2017-07-01
The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly in repeat and incidental/emergency situations, so there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper details the standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples and co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and activity estimated by alpha spectrometry using a 236Pu tracer for recovery. Ten routine urine samples of radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
Why are they late? Timing abilities and executive control among students with learning disabilities.
Grinblat, Nufar; Rosenblum, Sara
2016-12-01
While a deficient ability to perform daily tasks on time has been reported among students with learning disabilities (LD), the underlying mechanism behind their 'being late' is still unclear. This study aimed to evaluate the organization in time, time estimation abilities, actual performance time pertaining to specific daily activities, and executive functions of students with LD in comparison to those of controls, and to assess the relationships between these domains within each group. The participants were 27 students with LD, aged 20-30, and 32 gender- and age-matched controls who completed the Time Organization and Participation Scale (TOPS) and the Behavioral Rating Inventory of Executive Function-Adult version (BRIEF-A). In addition, their ability to estimate the time needed to complete the task of preparing a cup of coffee as well as their actual performance time were evaluated. The results indicated that in comparison to controls, students with LD showed significantly inferior organization in time (TOPS) and executive function abilities (BRIEF-A). Furthermore, their time estimation abilities were significantly inferior and they required significantly more time to prepare a cup of coffee. Regression analysis identified the variables that predicted organization in time and task performance time in each group. The theoretical and clinical significance of the results is discussed. What this paper adds: This study examines the underlying mechanism of the phenomenon of being late among students with LD. Following a recent call for using ecologically valid assessments, the functional daily ability of students with LD to prepare a cup of coffee and to organize time was investigated. Furthermore, their time estimation and executive control abilities were examined as a possible underlying mechanism for their lateness.
Although previous studies have indicated executive control deficits among students with LD, to our knowledge, this is the first analysis of the relationships between their executive control and time estimation deficits and their influence upon their daily function and organization in time abilities. Our findings demonstrate that students with LD need more time in order to execute simple daily activities, such as preparing a cup of coffee. Deficient working memory, retrospective time estimation ability and inhibition predicted their performance time and organization in time abilities. Therefore, this paper sheds light on the mechanism behind daily performance in time among students with LD and emphasizes the need for future development of focused intervention programs to meet their unique needs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Automated Guidance from Physiological Sensing to Reduce Thermal-Work Strain Levels on a Novel Task
USDA-ARS?s Scientific Manuscript database
This experiment demonstrated that automated pace guidance generated from real-time physiological monitoring allowed the least stressful completion of a timed (60-minute limit) 5-mile treadmill exercise. An optimal pacing policy was estimated from a Markov decision process that balanced the goals of the...
How EIA Estimates Natural Gas Production
2004-01-01
The Energy Information Administration (EIA) publishes monthly and annual estimates of natural gas production in the United States. The estimates are based on data EIA collects from gas-producing states and data collected by the U.S. Minerals Management Service (MMS) in the Department of the Interior. The states and MMS collect this information from producers of natural gas for various reasons, most often for revenue purposes. Because the information is not sufficiently complete or timely for inclusion in EIA's Natural Gas Monthly (NGM), EIA has developed estimation methodologies to generate the monthly production estimates that are described in this document.
Determining Source Attenuation History to Support Closure by Natural Attenuation
2013-11-01
Microstructure Analyses of Detonation Diamond Nanoparticles
2012-05-01
Alaska Native Parkinson’s Disease Registry
2008-11-01
Communication Breakdown: DHS Operations During a Cyber Attack
2010-12-01
...Presidential Directive, Malware, National Exercise, Quadrennial Homeland Security Review, Trusted Internet Connections, Zero-Day Exploits
2011-11-30
modulates or controls the state of Y). The process of identifying these relationships is analogous to what statisticians do during exploratory data...
Development of an Air-Deployable Ocean Profiler
2009-01-01
select the most appropriate technology for each component; sanity check that the selected technologies can meet the design goals; and detailed...
Adaptive Campaigning Applied: Australian Army Operations in Iraq and Afghanistan
2011-05-01
Optimizing Human Input in Social Network Analysis
2018-01-23
The Reality Of The Homeland Security Enterprise Information Sharing Environment
2017-12-01
THE HOMELAND SECURITY ENTERPRISE INFORMATION SHARING ENVIRONMENT by Michael E. Brown, December 2017. Thesis Advisors: Erik Dahl, Robert...
Microscopic approaches to quantum nonequilibriumthermodynamics and information
2018-02-09
Two Invariants of Human-Swarm Interaction
2018-01-16
Coupling Considerations in Assembly Language. Revision 1
2018-02-13
Inclusion of Disaster Resiliency in City/Neighborhood Comprehensive Plans
2017-09-01
Unitary Transformations in 3 D Vector Representation of Qutrit States
2018-03-12
Vinod K Mishra, Computational and Information Sciences Directorate, ARL
ERIC Educational Resources Information Center
Bahi, Saïd; Higgins, Devin; Staley, Patrick
2015-01-01
Individual-level data for the entire cohort of undergraduate mathematics students at a relatively small US public university were used to estimate the risk that a student will switch majors before degree completion. The data set covers the period from 1999 to 2006. Survival tables and logistic models were estimated and used to discuss…
Joint Direct Attack Munition (JDAM)
2013-12-01
Activity Recognition for Agent Teams
2007-07-01
Uncertainty in Artificial Intelligence (UAI), 1994. [47] S. Intille and A. Bobick. Visual tracking using closed-worlds. Technical Report 294, MIT Media Lab...
The Effects of Physical Impairment on Shooting Performance
2012-08-01
Anthropometry: Anthropometric data were collected from each participant. Summary anthropometric statistics are shown in table 1.
Subsampling program for the estimation of fish impingement
NASA Astrophysics Data System (ADS)
Beauchamp, John J.; Kumar, K. D.
1984-11-01
Federal regulations require operators of nuclear and coal-fired power-generating stations to estimate the number of fish impinged on intake screens. During winter months, impingement may range into the hundreds of thousands for certain species, making it impossible to count all intake screens completely. We present graphs for determining the appropriate "optimal" subsample that must be obtained to estimate the total number impinged. Since the number of fish impinged tends to change drastically within a short time period, the subsample size is determined based on the most recent data. This allows for the changing nature of the species-age composition of the impinged fish. These graphs can also be used for subsampling fish catches in an aquatic system when the size of the catch is too large to sample completely.
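The estimation step those subsampling graphs support — expanding a subsample of per-screen counts to a total with a finite-population-corrected standard error — can be sketched as follows. Simple random sampling of screens is assumed, and the counts in the test are invented; the paper's graphs choose the subsample size n, which this sketch takes as given.

```python
import math

def total_with_ci(counts, n_screens, z=1.96):
    """Estimate total impingement across n_screens intake screens from a
    simple random subsample of per-screen counts, with a normal-theory
    confidence interval using the finite population correction."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)   # sample variance
    total = n_screens * mean                               # expansion estimator
    # Standard error of the estimated total, corrected for sampling
    # a large fraction (n / n_screens) of the population of screens.
    se = n_screens * math.sqrt((var / n) * (1 - n / n_screens))
    return total, (total - z * se, total + z * se)
```

Because the fish counts change rapidly, both `mean` and `var` would be refit from the most recent data, which is exactly why the required n changes over time.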
Khor, Y H; Tolson, J; Churchward, T; Rochford, P; Worsnop, C
2015-08-01
Home polysomnography (PSG) is an alternative method for diagnosis of obstructive sleep apnoea (OSA). Some type 3 and 4 PSG devices do not monitor sleep and so rely on the patient's estimation of total sleep time (TST). The aim was to compare patients' subjective sleep duration estimates with objective measures in patients who underwent type 2 PSG for probable OSA. A prospective clinical audit was performed of 536 consecutive patients of one of the authors between 2006 and 2013. A standard questionnaire was completed by the patients the morning after the home PSG to record the time of lights being turned off and the estimated times of sleep onset and offset. PSG was scored based on the guidelines of the American Academy of Sleep Medicine. Median estimated sleep latency (SL) was 20 min compared with 10 min for measured SL (P < 0.0001). There was also a significant difference between the estimated and measured sleep offset times (median difference = -1 min, P = 0.01). Estimated TST was significantly shorter than measured TST (median difference = -18.5 min, P = 0.002). No factors were identified that affected patients' accuracy of sleep perception. Only 2% of patients had a change in their diagnosis of OSA based on the calculated apnoea-hypopnoea index. Overall, estimated TST in patients with probable OSA was significantly shorter than measured, with significant individual variability. Collectively, inaccurate sleep time estimation did not result in a significant difference in the diagnosis of OSA. © 2015 Royal Australasian College of Physicians.
Evolution of Modern Birds Revealed by Mitogenomics: Timing the Radiation and Origin of Major Orders
Pacheco, M. Andreína; Battistuzzi, Fabia U.; Lentino, Miguel; Aguilar, Roberto F.; Kumar, Sudhir; Escalante, Ananias A.
2011-01-01
Mitochondrial (mt) genes and genomes are among the major sources of data for evolutionary studies in birds. This places mitogenomic studies in birds at the core of intense debates in avian evolutionary biology. Indeed, complete mt genomes are actively being used to unveil the phylogenetic relationships among major orders, whereas single genes (e.g., cytochrome c oxidase I [COX1]) are considered standard for species identification and defining species boundaries (DNA barcoding). In this investigation, we study the time of origin and evolutionary relationships among Neoaves orders using complete mt genomes. First, we were able to resolve polytomies previously observed at the deep nodes of the Neoaves phylogeny by analyzing 80 mt genomes, including 17 new sequences reported in this investigation. As an example, we found evidence indicating that columbiforms and charadriiforms are sister groups. Overall, our analyses indicate that, by improving the taxonomic sampling, complete mt genomes can resolve the evolutionary relationships among major bird groups. Second, we used our phylogenetic hypotheses to estimate the time of origin of major avian orders as a way to test whether their diversification took place prior to the Cretaceous/Tertiary (K/T) boundary. Such timetrees were estimated using several molecular dating approaches and conservative calibration points. Whereas we found time estimates slightly younger than those reported by others, most of the major orders originated prior to the K/T boundary. Finally, we used our timetrees to estimate the rate of evolution of each mt gene. We found great variation in the mutation rates among mt genes and within different bird groups. COX1 was the gene with the least variation among Neoaves orders and the least rate heterogeneity across lineages. Such findings support the choice of COX1 among mt genes as a target for developing DNA barcoding approaches in birds. PMID:21242529
76 FR 45799 - Agency Information Collection Activities; Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... the 2005 survey so that the results from it can be used as a baseline for a time-series analysis. ... 15 minutes to complete the pretest, the same time as that needed for the actual survey. The revised estimate further takes into account the presumed added time required to respond to questions unique to the...
Gallistel, C.R; King, Adam Philip; Gottlieb, Daniel; Balci, Fuat; Papachristos, Efstathios B; Szalecki, Matthew; Carbone, Kimberly S
2007-01-01
Experimentally naive mice matched the proportions of their temporal investments (visit durations) in two feeding hoppers to the proportions of the food income (pellets per unit session time) derived from them in three experiments that varied the coupling between the behavioral investment and food income, from no coupling to strict coupling. Matching was observed from the outset; it did not improve with training. When the numbers of pellets received were proportional to time invested, investment was unstable, swinging abruptly from sustained, almost complete investment in one hopper, to sustained, almost complete investment in the other—in the absence of appropriate local fluctuations in returns (pellets obtained per time invested). The abruptness of the swings strongly constrains possible models. We suggest that matching reflects an innate (unconditioned) program that matches the ratio of expected visit durations to the ratio between the current estimates of expected incomes. A model that processes the income stream looking for changes in the income and generates discontinuous income estimates when a change is detected is shown to account for salient features of the data. PMID:17465311
ERIC Educational Resources Information Center
VanLith, Clinten David
It has been estimated that completed suicides in the United States leave behind 750,000 survivors every year. In many cases, individuals who completed suicide had been seeing a mental health practitioner, who must then face the turmoil of losing a client to suicide. This paper reviews the literature on the frequency, impact, and recovery of…
Strategic Methodologies in Public Health Cost Analyses.
Whittington, Melanie; Atherly, Adam; VanRaemdonck, Lisa; Lampe, Sarah
The National Research Agenda for Public Health Services and Systems Research states the need for research to determine the cost of delivering public health services in order to assist the public health system in communicating financial needs to decision makers, partners, and health reform leaders. The objective of this analysis is to compare 2 cost estimation methodologies, public health manager estimates of employee time spent and activity logs completed by public health workers, to understand to what degree manager surveys could be used in lieu of more time-consuming and burdensome activity logs. Employees recorded their time spent on communicable disease surveillance for a 2-week period using an activity log. Managers then estimated the time spent by each employee on a manager survey. Robust and ordinary least-squares regressions were used to measure the agreement between the time estimated by the manager and the time recorded by the employee. The 2 outcomes for this study were the time recorded by the employee on the activity log and the time estimated by the manager on the manager survey. This study was conducted in local health departments in Colorado. Forty-one Colorado local health departments (82%) agreed to participate. Seven of the 8 models showed that managers underestimate their employees' time, especially for activities on which an employee spent little time. Manager surveys can best estimate time for time-intensive activities, such as total time spent on a core service or broad public health activity, yet are less precise when estimating discrete activities. When Public Health Services and Systems Research researchers and health departments conduct studies to determine the cost of public health services, there are many situations in which managers can closely approximate the time required and produce a relatively precise approximation of cost with less time investment by practitioners.
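The agreement analysis described above can be sketched as a simple ordinary least-squares regression of manager estimates on employee-logged hours. This is an illustration only; the hours below are invented, not the Colorado data, and the study also used robust regression.

```python
# Toy agreement analysis: regress manager-estimated hours on employee-logged
# hours with ordinary least squares. All numbers are invented for illustration.
logged = [2.0, 4.0, 8.0, 16.0, 30.0]    # activity-log hours per employee
managed = [1.0, 2.5, 6.5, 14.0, 28.0]   # matching manager-survey estimates

n = len(logged)
xbar = sum(logged) / n
ybar = sum(managed) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(logged, managed)) / \
        sum((x - xbar) ** 2 for x in logged)
intercept = ybar - slope * xbar

# A slope near 1 with a negative intercept matches the paper's pattern:
# managers underestimate, most severely for low-time activities.
print(round(slope, 2), round(intercept, 2))
```

Perfect agreement would give slope 1 and intercept 0; departures from that line quantify the manager bias.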
Fallout Deposition in the Marshall Islands from Bikini and Enewetak Nuclear Weapons Tests
Beck, Harold L.; Bouville, André; Moroz, Brian E.; Simon, Steven L.
2009-01-01
Deposition densities (Bq m(-2)) of all important dose-contributing radionuclides occurring in nuclear weapons testing fallout from tests conducted at Bikini and Enewetak Atolls (1946-1958) have been estimated on a test-specific basis for all 31 atolls and separate reef islands of the Marshall Islands. A complete review of various historical and contemporary data, as well as meteorological analysis, was used to make judgments regarding which tests deposited fallout in the Marshall Islands and to estimate fallout deposition density. Our analysis suggested that only 20 of the 66 nuclear tests conducted in or near the Marshall Islands resulted in substantial fallout deposition on any of the 25 inhabited atolls. This analysis was confirmed by the fact that the sum of our estimates of 137Cs deposition from these 20 tests at each atoll is in good agreement with the total 137Cs deposited as estimated from contemporary soil sample analyses. The monitoring data and meteorological analyses were used to quantitatively estimate the deposition density of 63 activation and fission products for each nuclear test, plus the cumulative deposition of 239+240Pu at each atoll. Estimates of the degree of fractionation of fallout from each test at each atoll, as well as of the fallout transit times from the test sites to the atolls, were used in this analysis. The estimates of radionuclide deposition density, fractionation, and transit times reported here are the most complete available anywhere and are suitable for estimations of both external and internal dose to representative persons as described in companion papers. PMID:20622548
Fallout deposition in the Marshall Islands from Bikini and Enewetak nuclear weapons tests.
Beck, Harold L; Bouville, André; Moroz, Brian E; Simon, Steven L
2010-08-01
Deposition densities (Bq m(-2)) of all important dose-contributing radionuclides occurring in nuclear weapons testing fallout from tests conducted at Bikini and Enewetak Atolls (1946-1958) have been estimated on a test-specific basis for 32 atolls and separate reef islands of the Marshall Islands. A complete review of various historical and contemporary data, as well as meteorological analysis, was used to make judgments regarding which tests deposited fallout in the Marshall Islands and to estimate fallout deposition density. Our analysis suggested that only 20 of the 66 nuclear tests conducted in or near the Marshall Islands resulted in substantial fallout deposition on any of the 23 inhabited atolls. This analysis was confirmed by the fact that the sum of our estimates of 137Cs deposition from these 20 tests at each atoll is in good agreement with the total 137Cs deposited as estimated from contemporary soil sample analyses. The monitoring data and meteorological analyses were used to quantitatively estimate the deposition density of 63 activation and fission products for each nuclear test, plus the cumulative deposition of 239+240Pu at each atoll. Estimates of the degree of fractionation of fallout from each test at each atoll, as well as of the fallout transit times from the test sites to the atolls were used in this analysis. The estimates of radionuclide deposition density, fractionation, and transit times reported here are the most complete available anywhere and are suitable for estimations of both external and internal dose to representative persons as described in companion papers.
The duration perception of loading applications in smartphone: Effects of different loading types.
Zhao, Wenguo; Ge, Yan; Qu, Weina; Zhang, Kan; Sun, Xianghong
2017-11-01
The loading time of a smartphone application is an important issue that affects user satisfaction. This study evaluated the effects of a black loading screen (BLS) and an animation loading screen (ALS), shown during application loading, on users' duration perception and satisfaction. A total of 43 volunteers were enrolled. They were asked to complete several tasks by clicking the icons of applications such as the camera or messaging; the loading duration of each application was manipulated. The participants were asked to estimate the duration and to evaluate the loading speed and their satisfaction. The results showed that estimated duration increased and satisfaction with the loading period declined as loading time increased. Compared with the BLS, the ALS prolonged the estimated duration and lowered the evaluations of speed and satisfaction. We also discuss the tendencies and key inflection points of the curves relating estimated duration, speed evaluation, and satisfaction to loading time. Copyright © 2017 Elsevier Ltd. All rights reserved.
SNR-based queue observations at CFHT
NASA Astrophysics Data System (ADS)
Devost, Daniel; Moutou, Claire; Manset, Nadine; Mahoney, Billy; Burdullis, Todd; Cuillandre, Jean-Charles; Racine, René
2016-07-01
In an effort to optimize night-time use of the exquisite weather on Maunakea, CFHT has equipped its dome with vents and is now moving its Queued Scheduled Observing (QSO)-based operations toward signal-to-noise ratio (SNR) observing. In this new mode, individual exposure times for a science program are estimated using a model that takes measurements of the current weather conditions as input, and the program is considered complete when the depth required by its scientific requirements is reached. These changes allow CFHT to make better use of the excellent seeing conditions provided by Maunakea, completing programs in less time than was allocated to them.
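The scaling behind SNR-driven exposure estimation can be sketched under a simple background-limited assumption (SNR grows as the square root of exposure time). This is an illustration of the principle only, not CFHT's actual QSO model; the function name, reference values, and throughput factor are all hypothetical.

```python
# Minimal sketch of SNR-based exposure scaling, assuming the background-limited
# regime where SNR is proportional to sqrt(t). Not CFHT's real weather model.
def exposure_for_snr(t_ref, snr_ref, snr_target, throughput=1.0):
    """Scale a reference exposure t_ref (achieving snr_ref) to reach snr_target.

    throughput < 1 models worse-than-reference transparency/seeing, which
    lengthens the required exposure.
    """
    return t_ref * (snr_target / snr_ref) ** 2 / throughput

# Doubling the target SNR costs 4x the time; 80% throughput costs another 1.25x.
t = exposure_for_snr(t_ref=60.0, snr_ref=10.0, snr_target=20.0, throughput=0.8)
print(t)  # 300.0
```

Under this model a program is simply observed until the accumulated depth meets its requirement, rather than for a fixed allocation.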
Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.
Goldhaber, M K; Staub, S L; Tokuhata, G K
1983-01-01
A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
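The life-table calculation behind incidence estimates like these can be sketched with the standard actuarial method, in which pregnancies censored during an interval count as at risk for half that interval. All counts below are invented for illustration; they are not the TMI registry data.

```python
# Toy actuarial life table for cumulative fetal-loss incidence, conditional on
# pregnancies surviving 4 completed weeks. Counts are invented, not TMI data.
# Each interval: (start week, losses in interval, withdrawn/censored in interval).
intervals = [(5, 8, 10), (9, 6, 12), (13, 4, 14)]  # hypothetical 4-week bins
at_risk = 200.0  # pregnancies entering the table at week 4

surv = 1.0
for _, losses, withdrawn in intervals:
    # Standard actuarial correction: censored cases count for half an interval.
    effective = at_risk - withdrawn / 2.0
    surv *= 1.0 - losses / effective
    at_risk -= losses + withdrawn

cumulative_incidence = 1.0 - surv
print(round(cumulative_incidence, 3))
```

The product of interval survival probabilities handles incomplete follow-up, which is why the method suits a cohort registered mid-pregnancy and followed for a fixed year.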
ERIC Educational Resources Information Center
Cramp, Anita G.; Bray, Steven R.
2009-01-01
The purpose of this study was to examine women's leisure time physical activity (LTPA) before pregnancy, during pregnancy, and through the first 7 months postnatal. Pre- and postnatal women (n = 309) completed the 12-month Modifiable Activity Questionnaire and demographic information. Multilevel modeling was used to estimate a growth curve…
78 FR 72125 - Information Collection Request; Submission for OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... Applicant at one of the following times in the medical review process: (1) After the Applicant completes the... Form (a) Estimated number of Applicants/physicians: 100/100. (b) Frequency of response: one time. (c... Collection: When an Applicant reports that he or she is currently receiving allergy shot treatments, Peace...
Contract-Based Integration of Cyber-Physical Analyses
2014-10-14
Conference on Embedded Software. Report Documentation Page, Form Approved OMB No. 0704-0188: Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information.
Mems-Based Waste Vibration and Acoustic Energy Harvesters
2014-12-01
Coupled Ocean-Atmosphere Dynamics and Predictability of MJO’s
2012-09-30
Previous studies analyzed ocean color satellite data and suggested the primary mechanism of surface chlorophyll modulation by the MJO...
ERIC Educational Resources Information Center
Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.
2010-01-01
Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…
An Evaluation of Shipyard Practices and Their Correlation to Ship Costs
2017-12-01
Characterizing Candidate Oncogenes at 8q21 in Breast Cancer
2008-03-01
Nonparametric Conditional Estimation
1987-02-01
...the data because the statistician has complete control over the method. It is especially reasonable when there is a bona fide loss function. For example, the sample mean is m(Fn). Most calculations that statisticians perform on a set of data can be expressed as statistical functionals...
Defence Science and Technology Strategy. Science and Technology for a Secure Canada
2006-12-01
Spectral Analysis for DIAL and Lidar Detection of TATP
2008-08-13
Far Infrared Photonic Crystals Operating in the Reststrahl Region
2007-08-20
Overcoming Resistance to Trastuzumab in HER2-Amplified Breast Cancers
2011-08-01
Second-Order Active NLO Chromophores for DNA Based Electro-Optics Materials
2010-09-21
Post-Remediation Evaluation of EVO Treatment: How Can We Improve Performance
2017-11-15
Attribution In Influence: Relative Power And The Use Of Attribution
2017-12-01
Direct Thermodynamic Measurements of the Energetics of Information Processing
2017-08-08
Role of the U.S. Government in the Cybersecurity of Private Entities
2017-12-01
Workshop on Information Engines at the Frontiers of Nanoscale Thermodynamics
2017-11-01
Optimizing Sparse Representations of Kinetic Distributions via Information Theory
2017-07-31
How The Democratization Of Technology Enhances Intelligence-Led Policing And Serves The Community
2017-12-01
Navy And Marine Corps IT/IS Acquisition: A Way Forward
2017-12-01
Propagation of Statistical Noise Through a Two-Qubit Maximum Likelihood Tomography
2018-04-01
Daniel E Jones, Brian T Kirby, and Michael Brodsky, Computational and Information Sciences Directorate, ARL.
Rand National Security Division Annual Report 2006
2007-01-01
...this opportunity, they need to control enough personnel and material resources to secure and supply at least the capital. Marshaling Resources to Meet...
DoD Software Intensive Systems Development: A Hit and Miss Process
2015-05-01
Fumiaki Funahashi; Jennifer L. Parke
2017-01-01
Soil solarization has been shown to be an effective tool for managing Phytophthora spp. in surface soils, but the minimum time required to achieve local eradication under variable weather conditions remains unknown. A mathematical model could help predict the effectiveness of solarization at different sites and soil depths....
Epigenetic Regulation of microRNA Expression: Targeting the Triple-Negative Breast Cancer Phenotype
2011-10-01
The Effect of Modified Eye Position on Shooting Performance
2011-04-01
...participants was 20/20, with one participant aided by corrective contact lenses. 3.3 Anthropometry: Anthropometric data were collected from each...
Human Systems Integration (HSI) in Acquisition. HSI Domain Guide
2009-08-01
...job simulation that includes posture data, force parameters, and anthropometry. Output includes the percentage of men and women who have the strength...
NASA Astrophysics Data System (ADS)
Mainhagu, J.; Brusseau, M. L.
2016-09-01
The mass of contaminant present at a site, particularly in the source zones, is one of the key parameters for assessing the risk posed by contaminated sites, and for setting and evaluating remediation goals and objectives. This quantity is rarely known and is challenging to estimate accurately. This work investigated the efficacy of fitting mass-depletion functions to temporal contaminant mass discharge (CMD) data as a means of estimating initial mass. Two common mass-depletion functions, exponential and power functions, were applied to historic soil vapor extraction (SVE) CMD data collected from 11 contaminated sites for which the SVE operations are considered to be at or close to essentially complete mass removal. The functions were applied to the entire available data set for each site, as well as to the early-time data (the initial 1/3 of the data available). Additionally, a complete differential-time analysis was conducted. The latter two analyses were conducted to investigate the impact of limited data on method performance, given that the primary mode of application would be to use the method during the early stages of a remediation effort. The estimated initial masses were compared to the total masses removed for the SVE operations. The mass estimates obtained from application to the full data sets were reasonably similar to the measured masses removed for both functions (13 and 15% mean error). The use of the early-time data resulted in a minimally higher variation for the exponential function (17%) but a much higher error (51%) for the power function. These results suggest that the method can produce reasonable estimates of initial mass useful for planning and assessing remediation efforts.
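The exponential-function case can be illustrated with a small sketch: if the contaminant mass discharge rate follows r(t) = r0 * exp(-k t), the initial mass is the integral r0/k, and both parameters can be fit by ordinary least squares on log-rates. The data values below are invented for illustration, not from the 11 study sites.

```python
import math

# Hypothetical CMD record: (time in days, measured discharge rate in kg/day).
# Values are illustrative only, generated near r0 = 10, k = 0.01.
data = [(0, 10.0), (30, 7.4), (60, 5.5), (90, 4.1), (120, 3.0)]

# Linearize r(t) = r0 * exp(-k t)  ->  ln r = ln r0 - k t, then fit by OLS.
ts = [t for t, _ in data]
ys = [math.log(r) for _, r in data]
n = len(data)
tbar = sum(ts) / n
ybar = sum(ys) / n
k = -sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) / \
    sum((t - tbar) ** 2 for t in ts)
r0 = math.exp(ybar + k * tbar)

# For the exponential model, initial mass = integral of r(t) from 0 to infinity.
m0 = r0 / k
print(round(m0, 1))
```

Applied to only the early-time portion of a record, the same fit gives the kind of in-progress initial-mass estimate the paper evaluates; the power-function case requires a different (and, per the results, less robust) extrapolation.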
Theoretical framework of the causes of construction time and cost overruns
NASA Astrophysics Data System (ADS)
Ullah, K.; Abdullah, A. H.; Nagapan, S.; Suhoo, S.; Khan, M. S.
2017-11-01
Any construction practitioner's fundamental goal is to complete projects within the estimated duration and budget and to the expected quality targets. However, time and cost overruns are a regular and universal phenomenon in construction projects, and projects in Malaysia are no exception. To accomplish successful completion of construction projects on schedule and within the planned cost, various factors must be given serious attention so that issues such as time and cost overrun can be addressed. This paper aims to construct a framework of the causes of time overrun and cost overrun in construction projects in Malaysia. Based on the relevant literature, causative factors of time and cost overruns in Malaysian construction projects are summarized and a theoretical framework of their causes is constructed. The developed framework, grounded in the existing literature, will assist construction practitioners in planning efficient approaches for achieving successful completion of projects.
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561
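As a toy illustration of the series-completion idea (not the paper's actual multi-stage OLS and mixed-effects pipeline), a per-country trend fitted to log income can fill missing years; the country and growth rate below are invented:

```python
import numpy as np

def complete_series(years, gdp):
    """Fill gaps (np.nan) in one country's GDP-per-capita series with an
    OLS trend fitted to log income, a stand-in for the paper's models."""
    gdp = np.asarray(gdp, dtype=float)
    known = ~np.isnan(gdp)
    slope, intercept = np.polyfit(years[known], np.log(gdp[known]), 1)
    filled = gdp.copy()
    filled[~known] = np.exp(slope * years[~known] + intercept)
    return filled

years = np.arange(1950, 1960)
# Synthetic country growing ~5%/yr, with four missing years
gdp = np.array([1000.0, np.nan, 1102.5, np.nan, 1215.51,
                1276.28, np.nan, 1407.1, 1477.46, np.nan])
filled = complete_series(years, gdp)
```

Observed years are left untouched; only the gaps are predicted from the fitted trend.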
Bhaskaran, Krishnan; Forbes, Harriet J; Douglas, Ian; Leon, David A; Smeeth, Liam
2013-01-01
Objectives To assess the completeness and representativeness of body mass index (BMI) data in the Clinical Practice Research Datalink (CPRD), and determine an optimal strategy for their use. Design Descriptive study. Setting Electronic healthcare records from primary care. Participants A random sample of one million patients aged ≥16 years from the UK CPRD primary care database. Primary and secondary outcome measures BMI completeness in CPRD was evaluated by age, sex and calendar period. CPRD-based summary BMI statistics for each calendar year (2003–2010) were age-standardised and sex-standardised and compared with equivalent statistics from the Health Survey for England (HSE). Results BMI completeness increased over calendar time from 37% in 1990–1994 to 77% in 2005–2011, was higher among females and increased with age. When BMI at specific time points was assigned based on the most recent record, calendar-year-specific mean BMI statistics underestimated equivalent HSE statistics by 0.75–1.1 kg/m2. Restriction to those with a recent (≤3 years) BMI resulted in mean BMI estimates closer to HSE (≤0.28 kg/m2 underestimation), but excluded up to 47% of patients. An alternative strategy of imputing up-to-date BMI based on modelled changes in BMI over time since the last available record also led to mean BMI estimates that were close to HSE (≤0.37 kg/m2 underestimation). Conclusions Completeness of BMI in CPRD increased over time and varied by age and sex. At a given point in time, a large proportion of the most recent BMIs are unlikely to reflect current BMI; consequent BMI misclassification might be reduced by employing model-based imputation of current BMI. PMID:24038008
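The model-based imputation strategy can be caricatured as projecting the most recent BMI record forward by a modelled mean annual change. The 0.1 kg/m² per year figure below is purely illustrative, not the CPRD-estimated value:

```python
# Illustrative only: project the last recorded BMI forward by a modelled
# mean annual change (0.1 kg/m^2/yr is a made-up figure, not CPRD's).
def impute_current_bmi(last_bmi, years_since_record, annual_change=0.1):
    return last_bmi + annual_change * years_since_record

# Patient last measured 5 years ago at 27.0 kg/m^2
bmi_now = impute_current_bmi(27.0, 5)
```

In the study the change model is itself estimated from the data; this sketch only shows where such a model plugs in.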
Attitude and Trajectory Estimation Using Earth Magnetic Field Data
NASA Technical Reports Server (NTRS)
Deutschmann, Julie; Bar-Itzhack, Itzhack Y.
1996-01-01
The magnetometer has long been a reliable, inexpensive sensor used in spacecraft momentum management and attitude estimation. Recent studies show an increased accuracy potential for magnetometer-only attitude estimation systems. Since the Earth's magnetic field is a function of time and position, and since time is known quite precisely, the differences between the computed and measured magnetic field components, as measured by the magnetometers throughout the entire spacecraft orbit, are a function of both the spacecraft trajectory and attitude errors. Therefore, these errors can be used to estimate both trajectory and attitude. Traditionally, satellite attitude and trajectory have been estimated with completely separate systems, using different measurement data. Recently, trajectory estimation for low Earth orbit satellites was successfully demonstrated in ground software using only magnetometer data. This work proposes a single augmented extended Kalman filter to simultaneously and autonomously estimate both spacecraft trajectory and attitude with data from a magnetometer and either dynamically determined rates or gyro-measured body rates.
Lopez-Iturri, Peio; de Miguel-Bilbao, Silvia; Aguirre, Erik; Azpilicueta, Leire; Falcone, Francisco; Ramos, Victoria
2015-01-01
The electromagnetic field leakage levels of nonionizing radiation from a microwave oven have been estimated within a complex indoor scenario. By employing a hybrid simulation technique, based on coupling full wave simulation with an in-house developed deterministic 3D ray launching code, estimations of the observed electric field values can be obtained for the complete indoor scenario. The microwave oven can be modeled as a time- and frequency-dependent radiating source, in which leakage, basically from the microwave oven door, is propagated along the complete indoor scenario interacting with all of the elements present in it. This method can be of aid in order to assess the impact of such devices on expected exposure levels, allowing adequate minimization strategies such as optimal location to be applied. PMID:25705676
Relationship Between Reported and Measured Sleep Times
Silva, Graciela E.; Goodwin, James L.; Sherrill, Duane L.; Arnold, Jean L.; Bootzin, Richard R.; Smith, Terry; Walsleben, Joyce A.; Baldwin, Carol M.; Quan, Stuart F.
2007-01-01
Study Objective: Subjective and objective assessments of sleep may be discrepant due to sleep misperception and measurement effects, the latter of which may change the quality and quantity of a person's usual sleep. This study compared sleep times from polysomnography (PSG) with self-reports of habitual sleep and sleep estimated on the morning after a PSG in adults. Design: Total sleep time and sleep onset latency obtained from unattended home PSGs were compared to sleep times obtained from a questionnaire completed before the PSG and a Morning Survey completed the morning after the PSG. Participants: A total of 2,113 subjects who were ≥ 40 years of age were included in this analysis. Measures and Results: Subjects were 53% female, 75% Caucasian, and 38% obese. The mean habitual sleep time (HABTST), morning estimated sleep time (AMTST), and PSG total sleep times (PSGTST) were 422 min, 379 min, and 363 min, respectively. The mean habitual sleep onset latency, morning estimated sleep onset latency, and PSG sleep onset latency were 17.0 min, 21.8 min, and 16.9 min, respectively. Models adjusting for related demographic factors showed that HABTST and AMTST differ significantly from PSGTST by 61 and 18 minutes, respectively. Obese and higher educated people reported less sleep time than their counterparts. Similarly, small but significant differences were seen for sleep latency. Conclusions: In a community population, self-reported total sleep times and sleep latencies are overestimated even on the morning following overnight PSG. Citation: Silva GE; Goodwin JL; Sherrill DL; Arnold JL; Bootzin RR; Smith T; Walsleben JA; Baldwin CM; Quan SF. Relationship between reported and measured sleep times: the sleep heart health study (SHHS). J Clin Sleep Med 2007;3(6):622-630. PMID:17993045
An Embedded Device for Real-Time Noninvasive Intracranial Pressure Estimation.
Matthews, Jonathan M; Fanelli, Andrea; Heldt, Thomas
2018-01-01
The monitoring of intracranial pressure (ICP) is indicated for diagnosing and guiding therapy in many neurological conditions. Current monitoring methods, however, are highly invasive, limiting their use to the most critically ill patients only. Our goal is to develop and test an embedded device that performs all necessary mathematical operations in real-time for noninvasive ICP (nICP) estimation based on a previously developed model-based approach that uses cerebral blood flow velocity (CBFV) and arterial blood pressure (ABP) waveforms. The nICP estimation algorithm, along with the required preprocessing steps, was implemented on an NXP LPC4337 microcontroller unit (MCU). A prototype device using the MCU was also developed, complete with display, recording functionality, and peripheral interfaces for ABP and CBFV monitoring hardware. The device produces an estimate of mean ICP once per minute and performs the necessary computations in 410 ms, on average. Real-time nICP estimates differed from the original batch-mode MATLAB implementation of the estimation algorithm by 0.63 mmHg (root-mean-square error). We have demonstrated that real-time nICP estimation is possible on a microprocessor platform, which offers the advantages of low cost, small size, and product modularity over a general-purpose computer. These attributes take a step toward the goal of real-time nICP estimation at the patient's bedside in a variety of clinical settings.
Amanatidou, Elisavet; Samiotis, Georgios; Trikoilidou, Eleni; Pekridis, George; Taousanidis, Nikolaos
2015-02-01
Zero net sludge growth can be achieved by complete retention of solids in activated sludge wastewater treatment, especially for high-strength, biodegradable wastewaters. When the solids retention time is increased, MLSS and MLVSS concentrations reach a plateau phase and observed growth yield values tend to zero (Yobs ≈ 0). In this work, in order to evaluate sedimentation problems arising from high MLSS concentrations and complete sludge retention operational conditions, two identical innovative slaughterhouse wastewater treatment plants were studied. Measurements of wastewater quality characteristics and treatment plant operational conditions, sludge microscopic analysis, and state point analysis were conducted. Results have shown that low COD/nitrogen ratios increase sludge bulking and flotation phenomena due to accidental denitrification in clarifiers. A high return activated sludge rate is essential in complete retention systems, as it reduces sludge condensation and hydraulic retention time in the clarifiers. Under certain operational conditions, sludge loading rates can greatly exceed literature limit values. The presented methodology is a useful tool for estimating sedimentation problems encountered in activated sludge wastewater treatment plants with complete solids retention. Copyright © 2014 Elsevier Ltd. All rights reserved.
Do missing data influence the accuracy of divergence-time estimation with BEAST?
Zheng, Yuchi; Wiens, John J
2015-04-01
Time-calibrated phylogenies have become essential to evolutionary biology. A recurrent and unresolved question for dating analyses is whether genes with missing data cells should be included or excluded. This issue is particularly unclear for the most widely used dating method, the uncorrelated lognormal approach implemented in BEAST. Here, we test the robustness of this method to missing data. We compare divergence-time estimates from a nearly complete dataset (20 nuclear genes for 32 species of squamate reptiles) to those from subsampled matrices, including those with 5 or 2 complete loci only and those with 5 or 8 incomplete loci added. In general, missing data had little impact on estimated dates (mean error of ∼5Myr per node or less, given an overall age of ∼220Myr in squamates), even when 80% of sampled genes had 75% missing data. Mean errors were somewhat higher when all genes were 75% incomplete (∼17Myr). However, errors increased dramatically when only 2 of 9 fossil calibration points were included (∼40Myr), regardless of missing data. Overall, missing data (and even numbers of genes sampled) may have only minor impacts on the accuracy of divergence dating with BEAST, relative to the dramatic effects of fossil calibrations. Copyright © 2015 Elsevier Inc. All rights reserved.
Assessment of Voting Assistance Programs for Calendar Year 2011
2012-03-30
We reviewed the Service IG reports and certain supporting data, as needed; met with senior IG representatives from the Army, Navy, Air Force
Assessment of the Accountability of Night Vision Devices Provided to the Security Forces of Iraq
2009-03-17
data in this project. The qualitative data consisted of individual interviews, direct observation, and written documents. Quantitative data
2015-04-30
from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...
Launch and Recovery System Literature Review
2010-12-01
2013-06-01
Control Information Exchange Data Model (JC3IEDM). The Coalition Battle Management Language (CBML) being developed by the Simulation Interoperability
The RADAR Test Methodology: Evaluating a Multi-Task Machine Learning System with Humans in the Loop
2006-10-01
Deformation Mechanisms and High Strain Rate Properties of Magnesium (Mg) and Mg Alloys
2012-08-01
conference in June 2010 (2). A comprehensive historical review of the U.S. military applications of Mg alloys has recently been published (3
Radiative Transfer in Submerged Macrophyte Canopies
2001-09-30
Social and Cognitive Functioning as Risk Factors for Suicide: A Historical-Prospective Cohort Study
2011-04-01
2010-04-30
previous and current complex SW development efforts, the program offices will have a source of objective lessons learned and metrics that can be applied
Operative Therapy and the Growth of Breast Cancer Micrometastases: Cause and Effect
2006-08-01
Factors Impacting Intra-District Collaboration: A Field Study in a Midwest Police Department
2018-03-01
Blind, Deaf, and Dumb: We Must Be Prepared to Fight for Information
2017-05-25
A Monograph by LTC Stephen M. Johnson, United States Army...
China’s War by Other Means: Unveiling China’s Quest for Information Dominance
2017-06-09
A thesis presented to the Faculty of the U.S...
Agent And Component Object Framework For Concept Design Modeling Of Mobile Cyber Physical Systems
2018-03-01
2018-03-01
Scalable Matrix Algorithms for Interactive Analytics of Very Large Informatics Graphs
2017-06-14
information networks. Depending on the situation, these larger networks may not fit on a single machine. Although we considered traditional matrix and graph...
V-22 Osprey Joint Services Advanced Vertical Lift Aircraft (V-22)
2013-12-01
2004-04-20
EUROPE (Leson, 1991). Chemical Operations Coffee Roasting Composting Facilities Chemical Storage Coca Roasting Landfill Gas Extraction Film Coating...
A Case Study in Transnational Crime: Ukraine and Modern Slavery
2007-06-01
remained unable to appropriate resources or plan efficiently. The full extent of the decline remains unknown, because statistics were manipulated to hide...
Learning to Leave. The Preeminence of Disengagement in US Military Strategy
2008-05-01
Information center cataloging data: Brown, R. Greg. Learning to leave: the preeminence of disengagement in US military strategy / R. Greg Brown.
2013-11-15
features and designed a classifier that achieves up to 95% classification accuracy on classifying the occupancy with indoor footstep data. MDL-based...
2008-02-29
A Qualitative Analysis of the Navy’s HSI Billet Structure
2008-06-01
subspecialty code. The research results support the hypothesis that the work requirements of the July 2007 data set of 4600P-coded billets (billets
Human Systems Integration (HSI) in Acquisition. Acquisition Phase Guide
2009-08-01
available Concept of Operations (CONOPS) and other available data 1.1 Select and review Baseline Comparison System(s) (BCS) documentation 1.2 Assess
Applicability of Human Simulation for Enhancing Operations of Dismounted Soldiers
2010-10-01
consisted of a data capturing phase, in which field-trials at a German MOUT training facility were observed, a subsequent data-processing including
Aerodynamic Parameters of High Performance Aircraft Estimated from Wind Tunnel and Flight Test Data
NASA Technical Reports Server (NTRS)
Klein, Vladislav; Murphy, Patrick C.
1998-01-01
A concept of system identification applied to high performance aircraft is introduced, followed by a discussion of the identification methodology. Special emphasis is given to model postulation using time-invariant and time-dependent aerodynamic parameters, model structure determination, and parameter estimation using ordinary least squares and mixed estimation methods. At the same time, problems of data collinearity detection and its assessment are discussed. These parts of the methodology are demonstrated in examples using flight data of the X-29A and X-31A aircraft. In the third example, wind tunnel oscillatory data of the F-16XL model are used. A strong dependence of these data on frequency led to the development of models with unsteady aerodynamic terms in the form of indicial functions. The paper is completed by concluding remarks.
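Ordinary least squares parameter estimation of the kind described above amounts to a linear regression of measured aerodynamic coefficients on the recorded states. The sketch below uses a made-up linear pitching-moment model with synthetic "flight" data; the coefficient values are illustrative, not X-29A or X-31A results:

```python
import numpy as np

# Hypothetical linear pitching-moment model: Cm = Cm0 + Cma*alpha + Cmq*qhat
# (coefficients are illustrative, not flight-test values).
rng = np.random.default_rng(0)
alpha = rng.uniform(-0.2, 0.2, 200)    # angle of attack, rad
qhat = rng.uniform(-0.05, 0.05, 200)   # nondimensional pitch rate
true = np.array([0.02, -0.6, -8.0])    # Cm0, Cma, Cmq
Cm = true[0] + true[1] * alpha + true[2] * qhat

# Ordinary least squares: solve X @ theta ~= Cm for theta
X = np.column_stack([np.ones_like(alpha), alpha, qhat])
theta, *_ = np.linalg.lstsq(X, Cm, rcond=None)
```

With noise-free data the regression recovers the postulated coefficients exactly; the collinearity problems the paper discusses arise when regressors such as alpha and qhat become correlated in real maneuvers.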
White, Sonia L J; Szűcs, Dénes
2012-01-04
The objective of this study was to scrutinize number line estimation behaviors displayed by children in mathematics classrooms during the first three years of schooling. We extend existing research by not only mapping potential logarithmic-linear shifts but also provide a new perspective by studying in detail the estimation strategies of individual target digits within a number range familiar to children. Typically developing children (n = 67) from Years 1-3 completed a number-to-position numerical estimation task (0-20 number line). Estimation behaviors were first analyzed via logarithmic and linear regression modeling. Subsequently, using an analysis of variance we compared the estimation accuracy of each digit, thus identifying target digits that were estimated with the assistance of arithmetic strategy. Our results further confirm a developmental logarithmic-linear shift when utilizing regression modeling; however, uniquely we have identified that children employ variable strategies when completing numerical estimation, with levels of strategy advancing with development. In terms of the existing cognitive research, this strategy factor highlights the limitations of any regression modeling approach, or alternatively, it could underpin the developmental time course of the logarithmic-linear shift. Future studies need to systematically investigate this relationship and also consider the implications for educational practice.
2012-01-01
Background The objective of this study was to scrutinize number line estimation behaviors displayed by children in mathematics classrooms during the first three years of schooling. We extend existing research by not only mapping potential logarithmic-linear shifts but also provide a new perspective by studying in detail the estimation strategies of individual target digits within a number range familiar to children. Methods Typically developing children (n = 67) from Years 1-3 completed a number-to-position numerical estimation task (0-20 number line). Estimation behaviors were first analyzed via logarithmic and linear regression modeling. Subsequently, using an analysis of variance we compared the estimation accuracy of each digit, thus identifying target digits that were estimated with the assistance of arithmetic strategy. Results Our results further confirm a developmental logarithmic-linear shift when utilizing regression modeling; however, uniquely we have identified that children employ variable strategies when completing numerical estimation, with levels of strategy advancing with development. Conclusion In terms of the existing cognitive research, this strategy factor highlights the limitations of any regression modeling approach, or alternatively, it could underpin the developmental time course of the logarithmic-linear shift. Future studies need to systematically investigate this relationship and also consider the implications for educational practice. PMID:22217191
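The logarithmic-versus-linear regression comparison used in these analyses can be sketched as follows. The two synthetic response patterns stand in for compressed (log-like) and accurate (linear) placements and are not the study's data; 0 is omitted from the targets so the log model is defined:

```python
import numpy as np

def r_squared(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

def fit_models(targets, estimates):
    """Return (R^2 of linear fit, R^2 of logarithmic fit) for
    number-to-position estimates."""
    lin = np.polyval(np.polyfit(targets, estimates, 1), targets)
    log = np.polyval(np.polyfit(np.log(targets), estimates, 1), np.log(targets))
    return r_squared(estimates, lin), r_squared(estimates, log)

targets = np.arange(1.0, 21.0)               # digits on a 0-20 line (0 omitted)
young = 20 * np.log(targets) / np.log(21.0)  # compressed, log-like placements
older = targets.copy()                       # accurate, linear placements
lin_y, log_y = fit_models(targets, young)
lin_o, log_o = fit_models(targets, older)
```

Whichever model yields the higher R² is taken as the better description of that child's estimation pattern, which is exactly the comparison the regression-modeling stage performs.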
Reporting bias in completed epilepsy intervention trials: A cross-sectional analysis.
Rayi, Appaji; Thompson, Stephanie; Gloss, David; Malhotra, Konark
2018-03-30
To explore the evidence of reporting bias among completed epilepsy intervention trials (EITs) and the compliance of applicable EITs with the Food and Drug Administration Amendments Act (FDAAA). We included consecutive EITs registered as completed on ClinicalTrials.gov from 2008 to 2015. Descriptive data were collected, including study type, study phase, funding source, primary completion date, and result reporting date. Time to result reporting was analyzed using Kaplan-Meier estimates for two time periods (2008-2011 and 2012-2015). PubMed, Web of Science, and Google Scholar databases were manually searched for publication details. Overall, 95/126 EITs (75%) reported their results, while the remaining 31/126 (25%) did not. Time to reporting was significantly lower for trials completed during 2012-2015 (16.5 months; 95% CI: 13.60-19.40; p = .002; Cohen's d = 0.68) than for trials completed during 2008-2011 (25.9 months; 95% CI: 21.56-30.22). 72/126 trials were conducted in at least one U.S. center. 56/72 (78%) of the trials met the FDAAA criteria, while only 19/56 (34%) reported within the mandated one-year time frame. The lack of reporting of nearly one-quarter of completed epilepsy intervention trials suggests the existence of reporting bias. As such, it should be considered an important criterion for determining risk of bias in epilepsy systematic reviews. Copyright © 2018 Elsevier B.V. All rights reserved.
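A Kaplan-Meier estimate of time to result reporting, with unreported trials treated as censored at their last follow-up, can be computed directly. The five-trial example below is invented for illustration, not drawn from the study:

```python
import numpy as np

def kaplan_meier(times, reported):
    """Kaplan-Meier estimate of the probability a trial remains
    unreported beyond each event time; unreported trials are censored."""
    times = np.asarray(times, dtype=float)
    reported = np.asarray(reported, dtype=int)
    s = 1.0
    event_times, surv = [], []
    for t in np.unique(times[reported == 1]):
        at_risk = np.sum(times >= t)                      # still in follow-up at t
        events = np.sum((times == t) & (reported == 1))   # reports at t
        s *= 1.0 - events / at_risk
        event_times.append(t)
        surv.append(s)
    return event_times, surv

# Invented example: reports at 6, 12, 12 months; two trials censored at 24, 36
t, s = kaplan_meier([6, 12, 12, 24, 36], [1, 1, 1, 0, 0])
```

With one report among five at 6 months and two among the remaining four at 12 months, the curve steps from 1.0 to 0.8 and then to 0.4.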
NASA Astrophysics Data System (ADS)
Umehara, Hiroaki; Okada, Masato; Naruse, Yasushi
2018-03-01
The estimation of angular time series data is a widespread issue relating to various situations involving rotational motion and moving objects. There are two kinds of problem settings: the estimation of wrapped angles, which are principal values in a circular coordinate system (e.g., the direction of an object), and the estimation of unwrapped angles in an unbounded coordinate system, such as for the positioning and tracking of moving objects measured by the signal-wave phase. Wrapped angles have been estimated in previous studies by sequential Bayesian filtering; however, the hyperparameters that control the properties of the estimation model were given a priori. The present study establishes a procedure for estimating the hyperparameters from the observed angle data alone, working entirely within the framework of Bayesian inference as maximum likelihood estimation. Moreover, the filter model is modified to estimate the unwrapped angles. We prove that, in the absence of noise, our model reduces to the existing algorithm of Itoh's unwrapping transform, and we confirm numerically that our model extends unwrapping estimation from Itoh's transform to the noisy case.
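Itoh's unwrapping transform, to which the proposed model reduces in the noiseless case, is simple to state: wrap each first difference of the observed phase back into (-π, π] and integrate. A minimal sketch:

```python
import math

# Itoh's 1-D phase unwrapping: integrate the wrapped first differences.
# Recovers the true phase exactly when it changes by less than pi
# between samples and there is no noise.
def itoh_unwrap(wrapped):
    def wrap(d):
        # map a difference into (-pi, pi]
        return (d + math.pi) % (2 * math.pi) - math.pi
    out = [wrapped[0]]
    for prev, cur in zip(wrapped, wrapped[1:]):
        out.append(out[-1] + wrap(cur - prev))
    return out

# A noiseless ramp exceeding 2*pi: wrapping then unwrapping recovers it.
true_phase = [0.4 * k for k in range(30)]
wrapped = [math.atan2(math.sin(p), math.cos(p)) for p in true_phase]
recovered = itoh_unwrap(wrapped)
print(max(abs(a - b) for a, b in zip(recovered, true_phase)))
```

With noisy observations this integration accumulates errors, which is exactly the regime the paper's Bayesian filter addresses; `numpy.unwrap` implements the same transform.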
Delivery of Modular Lethality via a Parent-Child Concept
2015-02-01
time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the ... downrange distance to the target, is the time of flight, is the distance of the thruster force from the body center of gravity, and is ... velocity and time of flight can be estimated or measured in flight. These values can be collected in a term, , and the 2 components of lateral
2015-05-18
... five times the speed of sound. For reference, the SR-71 Blackbird, the fastest manned airbreathing aircraft, typically flew at three times the speed of sound
Human Problem Solving: The Complete Model of the Traveling Salesman Problem
2009-08-31
Digital program for solving the linear stochastic optimal control and estimation problem
NASA Technical Reports Server (NTRS)
Geyser, L. C.; Lehtinen, B.
1975-01-01
A computer program is described which solves the linear stochastic optimal control and estimation (LSOCE) problem by using a time-domain formulation. The LSOCE problem is defined as that of designing controls for a linear time-invariant system which is disturbed by white noise in such a way as to minimize a performance index which is quadratic in state and control variables. The LSOCE problem and solution are outlined; brief descriptions are given of the solution algorithms, and complete descriptions of each subroutine, including usage information and digital listings, are provided. A test case is included, as well as information on the IBM 7090-7094 DCS time and storage requirements.
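As a toy illustration of the regulator half of an LSOCE-type design, the sketch below iterates the discrete-time Riccati recursion for a scalar system. The original program handles the matrix, continuous-time formulation, so this is only an analogy, and every number is an assumed placeholder.

```python
# Sketch: steady-state LQR gain for a scalar discrete-time system
#   x[k+1] = a*x[k] + b*u[k] + w[k],  cost = sum(q*x^2 + r*u^2),
# found by iterating the Riccati recursion to a fixed point. This is an
# illustrative stand-in for the Riccati step of an LSOCE-type solver,
# not the program's continuous-time matrix algorithm.
def scalar_dlqr(a, b, q, r, iters=500):
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * p * b)   # feedback gain
        p = q + a * p * (a - b * k)         # Riccati update
    return k, p

k, p = scalar_dlqr(a=1.2, b=1.0, q=1.0, r=1.0)
print(k, p)   # closed-loop pole a - b*k should satisfy |a - b*k| < 1
```

Here the open-loop system (a = 1.2) is unstable, and the computed gain stabilizes it; for matrix problems one would use, e.g., `scipy.linalg.solve_discrete_are`.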
The Martian: Examining Human Physical Judgments across Virtual Gravity Fields.
Ye, Tian; Qi, Siyuan; Kubricht, James; Zhu, Yixin; Lu, Hongjing; Zhu, Song-Chun
2017-04-01
This paper examines how humans adapt to novel physical situations with unknown gravitational acceleration in immersive virtual environments. We designed four virtual reality experiments with different tasks for participants to complete: strike a ball to hit a target, trigger a ball to hit a target, predict the landing location of a projectile, and estimate the flight duration of a projectile. The first two experiments compared human behavior in the virtual environment with real-world performance reported in the literature. The last two experiments aimed to test the human ability to adapt to novel gravity fields by measuring their performance in trajectory prediction and time estimation tasks. The experiment results show that: 1) based on brief observation of a projectile's initial trajectory, humans are accurate at predicting the landing location even under novel gravity fields, and 2) humans' time estimation in a familiar earth environment fluctuates around the ground truth flight duration, although the time estimation in unknown gravity fields indicates a bias toward earth's gravity.
Completeness of pedigree and family cancer history for ovarian cancer patients.
Son, Yedong; Lim, Myong Cheol; Seo, Sang Soo; Kang, Sokbom; Park, Sang Yoon
2014-10-01
To investigate the completeness of pedigrees, and the number of pedigree analyses needed to obtain an acceptable familial history, in Korean women with ovarian cancer. Interviews were conducted with 50 ovarian cancer patients to obtain familial history three times over 6 weeks. The completeness of the pedigree was estimated in terms of familial history of disease (cancer), health status (healthy living, disease, and death), and age at onset of disease and death. Pedigree completeness was 79.3%, 85.1%, and 85.6% at the 1st, 2nd, and 3rd interviews, and the time for pedigree analysis was 34.3, 10.8, and 3.1 minutes, respectively. The factors limiting pedigree analysis were as follows: out of contact with their relatives (38%), no living ancestors who know the family history (34%), dispersed family members because of the Korean War (16%), unknown cause of death (12%), reluctance to ask about the medical history of relatives (10%), and concealment of their ovarian cancer (10%). The percentages of cancers revealed in 1st-degree (2%) and 2nd-degree (8%) relatives increased across the surveys, especially colorectal cancer related to Lynch syndrome (4%). Based on this first study, analyzing the pedigree at least twice is acceptable for Korean women with ovarian cancer. Pedigree completeness increased, while the time to take a family history decreased, over the three surveys.
Health-Related Quality-of-Life Findings for the Prostate Cancer Prevention Trial
2012-01-01
Background The Prostate Cancer Prevention Trial (PCPT)—a randomized placebo-controlled study of the efficacy of finasteride in preventing prostate cancer—offered the opportunity to prospectively study effects of finasteride and other covariates on the health-related quality of life of participants in a multiyear trial. Methods We assessed three health-related quality-of-life domains (measured with the Health Survey Short Form–36: Physical Functioning, Mental Health, and Vitality scales) via questionnaires completed by PCPT participants at enrollment (3 months before randomization), at 6 months after randomization, and annually for 7 years. Covariate data obtained at enrollment from patient-completed questionnaires were included in our model. Mixed-effects model analyses and a cross-sectional presentation at three time points began at 6 months after randomization. All statistical tests were two-sided. Results For the physical function outcome (n = 16 077), neither the finasteride main effect nor the finasteride interaction with time was statistically significant. The effects of finasteride on physical function were minor and accounted for less than a 1-point difference over time in Physical Functioning scores (mixed-effect estimate = 0.07, 95% confidence interval [CI] = −0.28 to 0.42, P = .71). Comorbidities such as congestive heart failure (estimate = −5.64, 95% CI = −7.96 to −3.32, P < .001), leg pain (estimate = −2.57, 95% CI = −3.04 to −2.10, P < .001), and diabetes (estimate = −1.31, 95% CI = −2.04 to −0.57, P < .001) had statistically significant negative effects on physical function, as did current smoking (estimate = −2.34, 95% CI = −2.97 to −1.71, P < .001) and time on study (estimate = −1.20, 95% CI = −1.36 to −1.03, P < .001). Finasteride did not have a statistically significant effect on the other two dependent variables, mental health and vitality, either in the mixed-effects analyses or in the cross-sectional analysis at any of the three time points.
Conclusion Finasteride did not negatively affect SF–36 Physical Functioning, Mental Health, or Vitality scores. PMID:22972968
Boyd, Matt; Baker, Michael G; Mansoor, Osman D; Kvizhinadze, Giorgi; Wilson, Nick
2017-01-01
Countries are well advised to prepare for future pandemic risks (e.g., pandemic influenza, novel emerging agents, or synthetic bioweapons). These preparations do not typically include planning for complete border closure. Even though border closure may not be instituted in time, and can fail, there might still be a plausible chance of success for well-organized island nations. The aim was to estimate the costs and benefits of complete border closure in response to new pandemic threats, at an initial proof-of-concept level. New Zealand was used as a case study for an island country. An Excel spreadsheet model was developed to estimate costs and benefits. Case-study-specific epidemiological data were sourced from past influenza pandemics. Country-specific healthcare cost data, valuation of life, and lost tourism revenue were imputed (with lost trade also included in scenario analyses). For a new pandemic equivalent to the 1918 influenza pandemic (albeit with half the mortality rate, "Scenario A"), it was estimated that successful border closure for 26 weeks provided a net societal benefit (e.g., of NZ$11.0 billion, USD$7.3 billion). Even in the face of a complete end to trade, a net benefit was estimated for scenarios where the mortality rate was high (e.g., at 10 times the mortality impact of "Scenario A", or 2.75% of the country's population dying), giving a net benefit of NZ$54 billion (USD$36 billion). But for some other pandemic scenarios where trade ceased, border closure resulted in a net negative societal value (e.g., for "Scenario A" times three with 26 weeks of border closure, but not with only 12 weeks of closure, when it would still be beneficial). This proof-of-concept work indicates that a more detailed cost-benefit analysis of border closure in very severe pandemic situations for some island nations is probably warranted, as this course of action might sometimes be worthwhile from a societal perspective.
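The spreadsheet model's net-benefit logic can be sketched as benefits (deaths averted, valued monetarily, plus averted healthcare costs) minus costs (lost tourism and, in some scenarios, lost trade). All numbers below are illustrative placeholders, not the study's inputs:

```python
# Back-of-envelope net-benefit sketch following the abstract's structure.
# Every parameter value is an assumed placeholder (NZ$ billions),
# not data from the study.
def net_benefit(deaths_averted, value_per_life_b,
                health_costs_averted_b, lost_tourism_b, lost_trade_b=0.0):
    benefits = deaths_averted * value_per_life_b + health_costs_averted_b
    costs = lost_tourism_b + lost_trade_b
    return benefits - costs

# e.g. 30,000 deaths averted valued at NZ$4.5m each = NZ$135b of benefit
nb = net_benefit(deaths_averted=30_000,
                 value_per_life_b=4.5e-3,       # NZ$4.5m, in billions
                 health_costs_averted_b=1.0,
                 lost_tourism_b=10.0,
                 lost_trade_b=90.0)             # complete end to trade
print(f"net societal benefit: NZ${nb:.1f} billion")
```

The sign of the result flips as mortality (deaths averted) falls or trade losses rise, which is the qualitative behavior the scenarios describe.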
Determination of the Time-Space Magnetic Correlation Functions in the Solar Wind
NASA Astrophysics Data System (ADS)
Weygand, J. M.; Matthaeus, W. H.; Kivelson, M.; Dasso, S.
2013-12-01
Magnetic field data from many different intervals and 7 different solar wind spacecraft are employed to estimate the scale-dependent time decorrelation function in the interplanetary magnetic field in both the slow and fast solar wind. This estimation requires correlations varying with both space and time lags. The two-point correlation function with no time lag is determined by correlating time series data from multiple spacecraft separated in space and, for complete coverage of length scales, relies on many intervals with different spacecraft spatial separations. In addition, we employ single-spacecraft time-lagged correlations and two-spacecraft time-lagged correlations to access different spatial and temporal correlation data. Combining these data sets gives estimates of the scale-dependent time decorrelation function, which in principle tells us how rapidly time decorrelation occurs at a given wavelength. For static fields the scale-dependent time decorrelation function is trivially unity, but in turbulence the nonlinear cascade process induces time decorrelation at a given length scale that occurs more rapidly with decreasing scale. The scale-dependent time decorrelation function is valuable input to theories as well as various applications such as scattering, transport, and the study of predictability. It is also a fundamental element of formal turbulence theory. Our results are an extension of the Eulerian correlation functions estimated in Matthaeus et al. [2010] and Weygand et al. [2012; 2013].
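The single-spacecraft building block, a normalized time-lagged correlation between two series, can be sketched as follows. The sine-plus-noise data are synthetic stand-ins for magnetic-field components, and the delayed copy mimics a second, displaced measurement:

```python
import math, random

# Normalized (Pearson) time-lagged correlation between two time series,
# the single-spacecraft ingredient mentioned in the abstract.
# Data below are synthetic, not spacecraft measurements.
def lagged_correlation(x, y, lag):
    """Pearson correlation of x[t] with y[t + lag]."""
    n = len(x) - lag
    xs, ys = x[:n], y[lag:lag + n]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    return cov / (sx * sy)

random.seed(1)
x = [math.sin(0.1 * t) + 0.1 * random.gauss(0, 1) for t in range(500)]
y = [0.0] * 5 + x[:-5]                 # x delayed by 5 samples
c0 = lagged_correlation(x, y, 0)
c5 = lagged_correlation(x, y, 5)
print(f"lag 0: {c0:.3f}, lag 5: {c5:.3f}")   # correlation peaks at the true delay
```

Scanning the lag and repeating across spacecraft separations is, in spirit, how a space-and-time correlation surface is assembled from such building blocks.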
Quentin, Wilm; Neubauer, Simone; Leidl, Reiner; König, Hans-Helmut
2007-01-01
This paper reviews the international literature that employed time-series analysis to evaluate the effects of advertising bans on aggregate consumption of cigarettes or tobacco. A systematic search of the literature was conducted. Three groups of studies were defined, representing analyses of advertising bans in the U.S.A., in other countries, and in 22 OECD countries. The estimated effects of advertising bans and their significance were analysed. 24 studies were identified. They used a wide array of explanatory variables, models, estimating methods, and data sources. 18 studies found a negative effect of an advertising ban on aggregate consumption, but only ten of these found a significant effect. Two studies using data from 22 OECD countries suggested that partial bans would have little or no influence on aggregate consumption, whereas complete bans would significantly reduce consumption. The results imply that advertising bans have a negative, but sometimes only narrow, impact on consumption, and that complete bans can be expected to be more effective. Because of the methodological limitations of analysing the effects of advertising bans with time-series approaches, different approaches should also be used in the future.
Gerhardsson, Lars; Balogh, Istvan; Hambert, Per-Arne; Hjortsberg, Ulf; Karlsson, Jan-Erik
2005-01-01
The aim of the present study was to compare the development of vibration white fingers (VWF) in workers in relation to different ways of estimating exposure, and their relationship to the standard ISO 5349, annex A. Nineteen male workers exposed to vibration from grinding machines completed a questionnaire followed by a structured interview, including questions regarding their estimated hand-held vibration exposure. Neurophysiological tests such as fractionated nerve conduction velocity in the hands and arms, vibrotactile perception thresholds, and temperature thresholds were determined. The workers' subjective estimate of their mean daily exposure time to vibrating tools was 192 min (range 18-480 min). The mean exposure time calculated from the consumption of grinding wheels was 42 min (range 18-60 min), approximately a four-fold overestimation (Wilcoxon's signed ranks test, p<0.001). Thus, objective measurements of the exposure time related to the standard ISO 5349, in this case based on the consumption of grinding wheels, will in most cases give a better basis for adequate risk assessment than self-assessed exposure.
2013-10-01
... variants which explain much more than a small amount of risk for prostate cancer among a small population of men. Even less progress has been made
2003-04-01
generally considered to be passive data. Instead, the genetic material should be capable of being algorithmic information, that is, program code or...
Identifying Enterprise Leverage Points in Defense Acquisition Program Performance
2009-09-01
Biomarker Discovery in Gulf War Veterans: Development of a War Illness Diagnostic Panel
2014-10-17
Design and Analysis of Low Frequency Communication System in Persian Gulf
2008-09-01
Nonlinear Oscillations of Microscale Piezoelectric Resonators and Resonator Arrays
2006-06-30
Development of a Tetrathioether (S4) Bifunctional Chelate System for Rh-105
2013-07-01
sarA as a Target for the Treatment and Prevention of Staphylococcal Biofilm-Associated Infection
2015-02-01
M.S., Compadre, C.M. 2011. Sesquiterpene lactones from Gynoxys verrucosa and their anti-MRSA activity. Journal of Ethnopharmacology, 137:1055-1059. ...
Robotic extended pyelolithotomy for treatment of renal calculi: a feasibility study.
Badani, Ketan K; Hemal, Ashok K; Fumo, Michael; Kaul, Sanjeev; Shrivastava, Alok; Rajendram, Arumuga Kumar; Yusoff, Noor Ashani; Sundram, Murali; Woo, Susan; Peabody, James O; Mohamed, Sahabudin Raja; Menon, Mani
2006-06-01
Percutaneous nephrolithotomy (PCNL) remains the treatment of choice for staghorn renal calculi. Many reports suggest that laparoscopy can be an alternative treatment for large renal stones. We wished to evaluate the role and feasibility of robotic extended pyelolithotomy (REP) for treatment of staghorn calculi. Thirteen patients underwent REP for treatment of staghorn calculi over a 12-day period. Twelve patients had partial staghorn stones and one had a complete staghorn stone. All patients had pre-operative and post-operative imaging, including KUB and computed tomography. All procedures were completed robotically without conversion to laparoscopy or open surgery. Mean operative time was 158 min and mean robotic console time was 108 min. Complete stone removal was accomplished in all patients except the one with a complete staghorn calculus. Estimated blood loss was 100 cc, and no patient required post-operative transfusion. REP is an effective treatment alternative to PCNL in some patients with staghorn calculi. However, patients with complete staghorn stones are not suitable candidates for this particular technique.
Brenner, Hermann; Jansen, Lina
2016-02-01
Monitoring cancer survival is a key task of cancer registries, but timely disclosure of progress in long-term survival remains a challenge. We introduce and evaluate a novel method, denoted "boomerang method," for deriving more up-to-date estimates of long-term survival. We applied three established methods (cohort, complete, and period analysis) and the boomerang method to derive up-to-date 10-year relative survival of patients diagnosed with common solid cancers and hematological malignancies in the United States. Using the Surveillance, Epidemiology and End Results 9 database, we compared the most up-to-date age-specific estimates that might have been obtained with the database including patients diagnosed up to 2001 with 10-year survival later observed for patients diagnosed in 1997-2001. For cancers with little or no increase in survival over time, the various estimates of 10-year relative survival potentially available by the end of 2001 were generally rather similar. For malignancies with strongly increasing survival over time, including breast and prostate cancer and all hematological malignancies, the boomerang method provided estimates that were closest to later observed 10-year relative survival in 23 of the 34 groups assessed. The boomerang method can substantially improve up-to-dateness of long-term cancer survival estimates in times of ongoing improvement in prognosis. Copyright © 2016 Elsevier Inc. All rights reserved.
Heading Toward Launch with the Integrated Multi-Satellite Retrievals for GPM (IMERG)
NASA Technical Reports Server (NTRS)
Huffman, George J.; Bolvin, David T.; Nelkin, Eric J.; Adler, Robert F.
2012-01-01
The Day-1 algorithm for computing combined precipitation estimates in GPM is the Integrated Multi-satellitE Retrievals for GPM (IMERG). We plan for the period of record to encompass both the TRMM and GPM eras, and the coverage to extend to fully global as experience is gained in the difficult high-latitude environment. IMERG is being developed as a unified U.S. algorithm that takes advantage of strengths in the three groups that are contributing expertise: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA), which addresses inter-satellite calibration of precipitation estimates and monthly scale combination of satellite and gauge analyses; 2) the CPC Morphing algorithm with Kalman Filtering (KF-CMORPH), which provides quality-weighted time interpolation of precipitation patterns following cloud motion; and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS), which provides a neural-network-based scheme for generating microwave-calibrated precipitation estimates from geosynchronous infrared brightness temperatures. In this talk we summarize the major building blocks and important design issues driven by user needs and practical data issues. One concept being pioneered by the IMERG team is that the code system should produce estimates for the same time period but at different latencies to support the requirements of different groups of users. Another user requirement is that all these runs must be reprocessed as new IMERG versions are introduced. IMERG's status at meeting time will be summarized, and the processing scenario in the transition from TRMM to GPM will be laid out. Initially, IMERG will be run with TRMM-based calibration, and then a conversion to a GPM-based calibration will be employed after the GPM sensor products are validated. A complete reprocessing will be computed, which will complete the transition from TMPA.
Shahar, Suzana; Abdul Manaf, Zahara; Mohd Nordin, Nor Azlin; Susetyowati, Susetyowati
2017-01-01
Although nutritional screening and dietary monitoring in clinical settings are important, studies on related user satisfaction and cost benefit are still lacking. This study aimed to: (1) elucidate the cost of implementing a newly developed dietary monitoring tool, the Pictorial Dietary Assessment Tool (PDAT); and (2) investigate the accuracy of estimation and the satisfaction of healthcare staff after the use of the PDAT. A cross-over intervention study was conducted among 132 hospitalized patients with diabetes. Cost and time for the implementation of the PDAT in comparison to the modified Comstock method were estimated using the activity-based costing approach. Accuracy was expressed as the percentages of energy and protein obtained by both methods that were within 15% and 30%, respectively, of those obtained by food weighing. Satisfaction of healthcare staff was measured using a standardized questionnaire. Time to complete the food intake recording of patients using the PDAT (2.31 ± 0.70 min) was shorter than when modified Comstock (3.53 ± 1.27 min) was used (p < 0.001). Overall cost per patient was slightly higher for the PDAT (USD 0.27 ± 0.02) than for modified Comstock (USD 0.26 ± 0.04; p < 0.05). The accuracy of energy intake estimated by modified Comstock was 10% lower than that of the PDAT. There was poorer accuracy of protein intake estimated by modified Comstock (<40%) compared to that estimated by the PDAT (>71%) (p < 0.05). Mean user satisfaction of healthcare staff was significantly higher for the PDAT than for modified Comstock (p < 0.05). The PDAT requires a shorter time to complete and was rated better than modified Comstock. PMID:29283401
Kim, Moon H.; Morlock, Scott E.; Arihood, Leslie D.; Kiesler, James L.
2011-01-01
Near-real-time and forecast flood-inundation mapping products resulted from a pilot study for an 11-mile reach of the White River in Indianapolis. The study was done by the U.S. Geological Survey (USGS), Indiana Silver Jackets hazard mitigation taskforce members, the National Weather Service (NWS), the Polis Center, and Indiana University, in cooperation with the City of Indianapolis, the Indianapolis Museum of Art, the Indiana Department of Homeland Security, and the Indiana Department of Natural Resources, Division of Water. The pilot project showed that it is technically feasible to create a flood-inundation map library by means of a two-dimensional hydraulic model, use a map from the library to quickly complete a moderately detailed local flood-loss estimate, and automatically run the hydraulic model during a flood event to provide the maps and flood-damage information through a Web graphical user interface. A library of static digital flood-inundation maps was created by means of a calibrated two-dimensional hydraulic model. Estimated water-surface elevations were developed for a range of river stages referenced to a USGS streamgage and NWS flood forecast point colocated within the study reach. These maps were made available through the Internet in several formats, including geographic information system, Keyhole Markup Language, and Portable Document Format. A flood-loss estimate was completed for part of the study reach by using one of the flood-inundation maps from the static library. The Federal Emergency Management Agency natural disaster-loss estimation program HAZUS-MH, in conjunction with local building information, was used to complete a level 2 analysis of flood-loss estimation. 
A Service-Oriented Architecture-based dynamic flood-inundation application was developed and was designed to start automatically during a flood, obtain near real-time and forecast data (from the colocated USGS streamgage and NWS flood forecast point within the study reach), run the two-dimensional hydraulic model, and produce flood-inundation maps. The application used local building data and depth-damage curves to estimate flood losses based on the maps, and it served inundation maps and flood-loss estimates through a Web-based graphical user interface.
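The depth-damage-curve step at the heart of such a flood-loss estimate is piecewise-linear interpolation of percent damage against inundation depth, applied to a building's value. The curve points and building value below are illustrative assumptions, not HAZUS-MH values:

```python
# Sketch of the depth-damage-curve step used in flood-loss estimation:
# interpolate percent damage from flood depth, then apply it to a
# building's value. The curve and the $250k value are illustrative
# placeholders, not HAZUS-MH data.
def percent_damage(depth_ft, curve):
    """Piecewise-linear interpolation; curve = sorted (depth, pct) pairs."""
    if depth_ft <= curve[0][0]:
        return curve[0][1]
    for (d0, p0), (d1, p1) in zip(curve, curve[1:]):
        if depth_ft <= d1:
            return p0 + (p1 - p0) * (depth_ft - d0) / (d1 - d0)
    return curve[-1][1]          # clamp beyond the last point

CURVE = [(0, 0.0), (1, 10.0), (3, 25.0), (6, 45.0), (10, 60.0)]
loss = percent_damage(4.5, CURVE) / 100 * 250_000   # assumed $250k structure
print(f"estimated loss: ${loss:,.0f}")
```

In a level 2 analysis this lookup is repeated per building, with depths read from the inundation map at each building footprint, and the losses summed.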
24 CFR Appendix C to Part 3500 - Instructions for Completing Good Faith Estimate (GFE) Form
Code of Federal Regulations, 2010 CFR
2010-04-01
... address, telephone number, and email address, if any, on the top of the form, along with the applicant's... originator must provide the borrower with a written list of settlement services providers at the time of the...
Impact of drug shortages on U.S. health systems.
Kaakeh, Rola; Sweet, Burgunda V; Reilly, Cynthia; Bush, Colleen; DeLoach, Sherry; Higgins, Barb; Clark, Angela M; Stevenson, James
2011-10-01
A study was performed to quantify the personnel resources required to manage drug shortages, define the impact of drug shortages on health systems nationwide, and assess the adequacy of the information resources available to manage drug shortages. An online survey was sent to the 1322 members of the American Society of Health-System Pharmacists who were identified as directors of pharmacy. Survey recipients were asked to identify which of the 30 most recent drug shortages listed affected their health system, to identify actions taken to manage the shortage, and to rate the impact of each shortage. Employees responsible for completing predefined tasks were identified, and the average time spent by each type of employee completing these tasks was estimated. Labor costs associated with managing shortages were calculated. A total of 353 respondents completed the survey, yielding a response rate of 27%. Pharmacists and pharmacy technicians spent more time managing drug shortages than did physicians and nurses. There was a significant association between the time spent managing shortages and the size of the institution, the number of shortages managed, and the institution's level of automation. Overall, 70% of the respondents felt that the information resources available to manage drug shortages were not good. The labor costs associated with managing shortages in the United States are an estimated $216 million annually. A survey of directors of pharmacy revealed that the labor costs and time required to manage drug shortages are significant and that the information currently available to manage drug shortages is considered suboptimal.
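The national labor-cost figure rests on simple arithmetic: hours per shortage by role, times wages, times shortages per year, times institutions. The sketch below uses placeholder figures throughout, not the survey's data, so its total will not match the reported $216 million:

```python
# Hedged sketch of the labor-cost arithmetic behind a national estimate.
# Every figure here is an assumed placeholder, not survey data.
hours_per_shortage = {"pharmacist": 8.0, "technician": 6.0,
                      "physician": 1.0, "nurse": 1.5}
wage_per_hour = {"pharmacist": 60.0, "technician": 20.0,
                 "physician": 120.0, "nurse": 35.0}
shortages_per_year = 150        # assumed shortages handled per site
n_institutions = 5000           # assumed number of health systems

cost_per_shortage = sum(hours_per_shortage[role] * wage_per_hour[role]
                        for role in hours_per_shortage)
national_cost = cost_per_shortage * shortages_per_year * n_institutions
print(f"assumed national labor cost: ${national_cost / 1e6:.0f} million/year")
```

The structure mirrors the survey's method (role-specific time estimates scaled up to the national level); only the inputs are invented.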
2005-07-26
Audit Report Cost-to-Complete Estimates and Financial Reporting for the Management of the Iraq Relief and Reconstruction...Complete Estimates and Financial Reporting for the Management of the Iraq Relief and Reconstruction Fund 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c...RECONSTRUCTION MANAGEMENT OFFICE DIRECTOR, PROJECT AND CONTRACTING OFFICE SUBJECT: Cost-to-Complete Estimates and Financial Reporting for the Management of
Gachara, George; Symekher, Samuel; Otieno, Michael; Magana, Japheth; Opot, Benjamin; Bulimo, Wallace
2016-06-01
An influenza pandemic caused by a novel influenza virus A(H1N1)pdm09 spread worldwide in 2009 and is estimated to have caused between 151,700 and 575,400 deaths globally. While whole-genome data on a new virus enable deeper insight into its pathogenesis, epidemiology, and drug sensitivities, relatively few complete genetic sequences are available for this virus from African countries. We describe herein the full genome analysis of influenza A(H1N1)pdm09 viruses isolated in Kenya between June 2009 and August 2010. A total of 40 influenza A(H1N1)pdm09 viruses isolated during the pandemic were selected. The segments from each isolate were amplified and directly sequenced. The resulting sequences of individual gene segments were concatenated and used for subsequent analysis. These were used to infer phylogenetic relationships, to reconstruct the time to the most recent common ancestor and the time of introduction into the country, to estimate rates of substitution, and to estimate a time-resolved phylogeny. The Kenyan complete genome sequences clustered with globally distributed clade 2 and clade 7 sequences, but local clade 2 viruses did not circulate beyond the introductory foci while clade 7 viruses disseminated countrywide. The time to the most recent common ancestor was estimated between April and June 2009, and distinct clusters circulated during the pandemic. The complete genome had an estimated rate of nucleotide substitution of 4.9×10^-3 substitutions/site/year, and greater diversity was observed in surface-expressed proteins. We show that two clades of influenza A(H1N1)pdm09 virus were introduced into Kenya from the UK and that the pandemic was sustained as a result of importations. Several closely related but distinct clusters co-circulated locally during the peak pandemic phase, but only one cluster dominated in the late phase of the pandemic, suggesting that it possessed greater adaptability. Copyright © 2016 Elsevier B.V. All rights reserved.
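The reported substitution rate and TMRCA come from time-resolved phylogenetic inference; the core idea can be illustrated far more crudely with a root-to-tip regression, in which divergence from the root is regressed on sampling date, the slope estimating the rate and the x-intercept the TMRCA. The sketch below uses invented sampling dates and divergences, not the paper's data:

```python
# Root-to-tip regression sketch: slope ~ substitution rate (subs/site/year),
# x-intercept ~ time of the most recent common ancestor (TMRCA).
# All dates and divergences below are hypothetical illustration values.

def rate_and_tmrca(dates, divergences):
    n = len(dates)
    mx = sum(dates) / n
    my = sum(divergences) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(dates, divergences))
    sxx = sum((x - mx) ** 2 for x in dates)
    slope = sxy / sxx                # estimated substitution rate
    intercept = my - slope * mx
    tmrca = -intercept / slope       # date where divergence extrapolates to zero
    return slope, tmrca

dates = [2009.5, 2009.8, 2010.0, 2010.3, 2010.6]   # decimal sampling dates
divs = [0.0010, 0.0025, 0.0035, 0.0050, 0.0065]    # root-to-tip divergences
rate, tmrca = rate_and_tmrca(dates, divs)
print(round(rate, 4), round(tmrca, 2))
```

Real analyses (e.g. Bayesian time-resolved phylogenies) account for phylogenetic correlation between tips, which this least-squares sketch ignores.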
Houston, Natalie A.; Braun, Christopher L.
2004-01-01
This report describes the collection, analyses, and distribution of hydraulic-conductivity data obtained from slug tests completed in the alluvial aquifer underlying Air Force Plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas, during October 2002 and August 2003 and summarizes previously available hydraulic-conductivity data. The U.S. Geological Survey, in cooperation with the U.S. Air Force, completed 30 slug tests in October 2002 and August 2003 to obtain estimates of horizontal hydraulic conductivity to use as initial values in a ground-water-flow model for the site. The tests were done by placing a polyvinyl-chloride slug of known volume beneath the water level in selected wells, removing the slug, and measuring the resulting water-level recovery over time. The water levels were measured with a pressure transducer and recorded with a data logger. Hydraulic-conductivity values were estimated from an analytical relation between the instantaneous displacement of water in a well bore and the resulting rate of head change. Although nearly two-thirds of the tested wells recovered 90 percent of their slug-induced head change in less than 2 minutes, 90-percent recovery times ranged from 3 seconds to 35 minutes. The estimates of hydraulic conductivity range from 0.2 to 200 feet per day. Eighty-three percent of the estimates are between 1 and 100 feet per day.
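The "analytical relation between the instantaneous displacement of water in a well bore and the resulting rate of head change" is commonly the Hvorslev formulation; the sketch below assumes that method (for a well screen with Le/R > 8) with hypothetical well geometry, not the actual parameters from the report:

```python
import math

def hvorslev_k(r_casing, screen_length, r_screen, t_37):
    """Hvorslev slug-test estimate of horizontal hydraulic conductivity.

    r_casing      -- casing radius (ft)
    screen_length -- screen length Le (ft); assumes Le/R > 8
    r_screen      -- effective screen radius R (ft)
    t_37          -- basic time lag (days): time for the head to recover
                     to 37% of the initial slug-induced displacement
    """
    return (r_casing ** 2 * math.log(screen_length / r_screen)) / (
        2.0 * screen_length * t_37
    )

# Hypothetical 2-inch well, 10 ft screen, 37%-recovery time of 2 minutes:
k = hvorslev_k(r_casing=0.083, screen_length=10.0, r_screen=0.083, t_37=2.0 / 1440.0)
print(round(k, 2), "ft/day")
```

With these made-up inputs the estimate falls inside the 0.2-200 ft/day range the report documents for the site.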
Development of Army Job Knowledge Tests for Three Air Force Specialties
1989-05-01
the SMEs who assisted in development of the AGE JKT had been in supervisory-type positions for some years, they were not always familiar with current ... provided an opportunity to obtain an accurate estimate of the amount of time required to complete the test. Following completion of the incumbent review
Productivity associated with visual status of computer users.
Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W
2004-01-01
The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
NASA Astrophysics Data System (ADS)
Brodersen, R. W.
1984-04-01
A scaled version of the RISC II chip has been fabricated and tested, and these new chips have a cycle time that would let them outperform a VAX 11/780 by about a factor of two on compiled integer C programs. The architectural work on a RISC chip designed for a Smalltalk implementation has been completed. This chip, called SOAR (Smalltalk On a RISC), should run programs 4-15 times faster than the Xerox 1100 (Dolphin), a TTL minicomputer, and about as fast as the Xerox 1132 (Dorado), a $100,000 ECL minicomputer. The 1983 VLSI tools tape has been converted for use under the latest UNIX release (4.2). The Magic (formerly called Caddy) layout system will be a unified set of highly automated tools that cover all aspects of the layout process, including stretching, compaction, tiling and routing. A multiple window package and design rule checker for this system have just been completed, and compaction and stretching are partially implemented. New slope-based timing models for the Crystal timing analyzer are now fully implemented and in regular use. In an accuracy test using a dozen critical paths from the RISC II processor and cache chips, it was found that Crystal's estimates were within 5-10% of SPICE's estimates, while being a factor of 10,000 times faster.
Turnbull, Alison E; O'Connor, Cristi L; Lau, Bryan; Halpern, Scott D; Needham, Dale M
2015-07-29
Survey response rates among physicians are declining, and determining an appropriate level of compensation to motivate participation poses a major challenge. To estimate the effect of permitting intensive care physicians to select their preferred level of compensation for completing a short Web-based survey on physician (1) response rate, (2) survey completion rate, (3) time to response, and (4) time spent completing the survey. A total of 1850 US intensivists from an existing database were randomized to receive a survey invitation email with or without an Amazon.com incentive available to the first 100 respondents. The incentive could be instantly redeemed for an amount chosen by the respondent, up to a maximum of US $50. The overall response rate was 35.90% (630/1755). Among the 35.4% (111/314) of eligible participants choosing the incentive, 80.2% (89/111) selected the maximum value. Among intensivists offered an incentive, the response was 6.0% higher (95% CI 1.5-10.5, P=.01), survey completion was marginally greater (807/859, 94.0% vs 892/991, 90.0%; P=.06), and the median number of days to survey response was shorter (0.8, interquartile range [IQR] 0.2-14.4 vs 6.6, IQR 0.3-22.3; P=.001), with no difference in time spent completing the survey. Permitting intensive care physicians to determine compensation level for completing a short Web-based survey modestly increased response rate and substantially decreased response time without decreasing the time spent on survey completion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu
This paper intends to reveal the ability of the linear interpolation method to predict missing values in solar radiation time series. A reliable dataset requires a complete observed time series: the absence of radiation data alters the long-term variation of measured solar radiation values, and such gaps increase the chance of biased results in modelling and validation. Completeness of the observed dataset is therefore important for data analysis. Gaps and unreliable records in time series of solar radiation data are widespread and remain a persistent problem, yet relatively little research has focused on estimating missing values in solar radiation datasets.
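A minimal sketch of the gap-filling approach described above, linearly interpolating across interior runs of missing observations (the data values are hypothetical):

```python
def fill_missing(series):
    """Linearly interpolate interior runs of None in a time series."""
    filled = list(series)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            j = i
            while j < n and filled[j] is None:
                j += 1
            # Interpolate only interior gaps; leading/trailing gaps
            # have no bracketing values and are left as None here.
            if i > 0 and j < n:
                left, right = filled[i - 1], filled[j]
                span = j - (i - 1)
                for k in range(i, j):
                    filled[k] = left + (right - left) * (k - (i - 1)) / span
            i = j
        else:
            i += 1
    return filled

# Hourly solar radiation (W/m^2) with two missing observations:
obs = [120.0, 180.0, None, None, 300.0, 260.0]
print(fill_missing(obs))  # [120.0, 180.0, 220.0, 260.0, 300.0, 260.0]
```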
Development of a Tetrathioether (S4) Bifunctional Chelate System for Rh-105
2012-07-01
hour per response, including the time for reviewing instructions, searching existing data sources , gathering and maintaining the data needed, and...completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of
Acute effects of caffeine on several operant behaviors in rhesus monkeys.
Buffalo, E A; Gillam, M P; Allen, R R; Paule, M G
1993-11-01
The acute effects of 1,3,7-trimethylxanthine (caffeine) were assessed using an operant test battery (OTB) of complex food-reinforced tasks that are thought to depend upon relatively specific brain functions, such as motivation to work for food (progressive ratio, PR), learning (incremental repeated acquisition, IRA), color and position discrimination (conditioned position responding, CPR), time estimation (temporal response differentiation, TRD), and short-term memory and attention (delayed matching-to-sample, DMTS). Endpoints included response rates (RR), accuracies (ACC), and percent task completed (PTC). Caffeine sulfate (0.175-20.0 mg/kg, IV), given 15 min pretesting, produced significant dose-dependent decreases in TRD percent task completed and accuracy at doses > or = 5.6 mg/kg. Caffeine produced no systematic effects on either DMTS or PR responding, but low doses tended to enhance performance in both IRA and CPR tasks. Thus, in monkeys, performance of an operant task designed to model time estimation is more sensitive to the disruptive effects of caffeine than is performance of the other tasks in the OTB.
Yang, Yong-Qiang; Li, Xue-Bo; Shao, Ru-Yue; Lyu, Zhou; Li, Hong-Wei; Li, Gen-Ping; Xu, Lyu-Zi; Wan, Li-Hua
2016-09-01
The characteristic life stages of infesting blowflies (Calliphoridae) such as Chrysomya megacephala (Fabricius) are powerful evidence for estimating the death time of a corpse, but an established reference of developmental times for local blowfly species is required. We determined the developmental rates of C. megacephala from southwest China at seven constant temperatures (16-34°C). Isomegalen and isomorphen diagrams were constructed based on the larval length and time for each developmental event (first ecdysis, second ecdysis, wandering, pupariation, and eclosion), at each temperature. A thermal summation model was constructed by estimating the developmental threshold temperature D0 and the thermal summation constant K. The thermal summation model indicated that, for complete development from egg hatching to eclosion, D0 = 9.07 ± 0.54°C and K = 3991.07 ± 187.26 h °C. This reference can increase the accuracy of estimations of postmortem intervals in China by predicting the growth of C. megacephala. © 2016 American Academy of Forensic Sciences.
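The thermal summation model implies that development time is inversely proportional to the temperature excess above the threshold, t = K/(T − D0); a small sketch using the constants reported above (central values only, ignoring the stated uncertainties):

```python
# Thermal summation constants from the abstract, for complete development
# from egg hatching to eclosion of C. megacephala:
D0 = 9.07    # developmental threshold temperature (degC)
K = 3991.07  # thermal summation constant (h*degC)

def hours_to_eclosion(temp_c):
    """Predicted development time (hours) at a constant rearing temperature."""
    if temp_c <= D0:
        raise ValueError("no development at or below the threshold D0")
    return K / (temp_c - D0)

print(round(hours_to_eclosion(25.0), 1))  # ~250.5 h at a constant 25 degC
```

In forensic practice the same accumulation is applied to fluctuating temperature records by summing degree-hours above D0 until K is reached.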
Baudouin, Alexia; Isingrini, Michel; Vanneste, Sandrine
2018-01-25
Age-related differences in time estimation were examined by comparing the temporal performance of young, young-old, and old-old adults, in relation to two major theories of cognitive aging: executive decline and cognitive slowing. We tested the hypothesis that processing speed and executive function are differentially involved in timing depending on the temporal task used. We also tested the assumption of greater age-related effects in time estimation in old-old participants. Participants performed two standard temporal tasks: duration production and duration reproduction. They also completed tests measuring executive function and processing speed. Findings supported the view that executive function is the best mediator of reproduction performance and inversely that processing speed is the best mediator of production performance. They also showed that young-old participants provide relatively accurate temporal judgments compared to old-old participants. These findings are discussed in terms of compensation mechanisms in aging.
2015-01-01
Data SIGAR JANUARY 2015 Report Documentation Page Form ApprovedOMB No. 0704-0188 Public reporting burden for the collection of information is...estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data...needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this
2013-09-30
is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and...maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of... downscaling future projection simulations. APPROACH To address the scientific objectives, we plan to develop, implement, and validate a new
2014-01-01
entry and review procedures; (2) explain the various database components; (3) outline included datafields and datasets; and (4) document the...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources...gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or
Transmission overhaul estimates for partial and full replacement at repair
NASA Technical Reports Server (NTRS)
Savage, M.; Lewicki, D. G.
1991-01-01
Timely transmission overhauls raise in-flight service reliability above the calculated design reliabilities of the individual aircraft transmission components. Although necessary for aircraft safety, transmission overhauls contribute significantly to aircraft expense. Predictions of a transmission's maintenance needs at the design stage should enable the development of more cost-effective and reliable transmissions in the future. The frequency of overhaul is estimated, along with the number of transmissions or components needed to support the overhaul schedule. Two methods based on the two-parameter Weibull statistical distribution for component life are used to estimate the time between transmission overhauls. These methods predict transmission lives for maintenance schedules which repair the transmission with a complete system replacement or repair only failed components of the transmission. An example illustrates the methods.
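As a hedged illustration of the Weibull-based reasoning (a sketch of the general survival relation, not the authors' exact procedure), the two-parameter Weibull survival function R(t) = exp(−(t/η)^β) can be inverted to give the operating time at which a component still meets a target reliability; the parameter values below are hypothetical:

```python
import math

def time_at_reliability(eta, beta, reliability):
    """Operating time at which a two-parameter Weibull component still has
    the given survival probability:
        R(t) = exp(-(t/eta)**beta)  =>  t = eta * (-ln R)**(1/beta)
    eta is the characteristic life, beta the shape parameter.
    """
    return eta * (-math.log(reliability)) ** (1.0 / beta)

# Hypothetical gear: characteristic life 5000 h, shape parameter 2.5.
# Overhaul interval that preserves 90% reliability:
print(round(time_at_reliability(5000.0, 2.5, 0.90), 1))  # ~2032.5 h
```

For a system of components in series, the overhaul interval would be set by the product of the component survival functions rather than by any single component.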
Near-Earth-object survey progress and population of small near-Earth asteroids
NASA Astrophysics Data System (ADS)
Harris, A.
2014-07-01
Estimating the total population vs. size of NEAs and the completion of surveys is the same thing, since the total population is just the number discovered divided by the estimated completion. I review the method of completion estimation based on the ratio of re-detected objects to total detections (known plus new discoveries). The method is quite general and can be used for population estimations of all sorts, from wildlife to various classes of solar system bodies. Since 2001, I have been making estimates of population and survey progress approximately every two years. Plotted below, left, is my latest estimate, including NEA discoveries up to August, 2012. I plan to present an update at the meeting. All asteroids of a given size are not equally easy to detect because of specific orbital geometries. Thus a model of the orbital distribution is necessary, and computer simulations using those orbits are needed to establish the relation between the raw re-detection ratio and the actual completion fraction. This can be done for any sub-group population, allowing one to estimate the population of a subgroup and the expected current completion. Once a reliable survey computer model has been developed and "calibrated" with respect to actual survey re-detections versus size, it can be extrapolated to smaller sizes to estimate completion even at very small sizes where re-detections are rare or even zero. I have recently investigated the subgroup of extremely low encounter velocity NEAs, the class of interest for the Asteroid Redirect Mission (ARM), recently proposed by NASA. I found that asteroids of diameter ~10 m with encounter velocity with the Earth lower than 2.5 km/sec are detected by current surveys nearly 1,000 times more efficiently than the general background of NEAs of that size. Thus the current completion of these slow relative velocity objects may be around 1%, compared to 10^{-6} for that size of object in the general velocity distribution.
Current surveys are nowhere near complete, but there may be fewer such objects than have been suggested. This conclusion is reinforced by the fact that at least a couple of such discovered objects are known to be not real asteroids but spent rocket bodies in heliocentric orbit, of which there are only of the order of a hundred. Brown et al. (Nature 503, 238-241, 2013, below right, green squares are a re-plot of my blue circles on left plot) recently suggested that the population of small NEAs in the size range from roughly 5 to 50 meters in diameter may have been substantially under-estimated. To be sure, the greatest uncertainty in population estimates is in that range, since there are very few bolide events to use for estimation, and the surveys are extremely incomplete in that size range, so a factor of 3 or so discrepancy is not significant. However, the population estimated from surveys carried to still smaller sizes, where the bolide frequency becomes more secure, disagrees with the bolide estimate by even less than a factor of 3 and in fact intersects it at about 3 m diameter. On the other hand, the shallow-sloping size-frequency distribution derived from the sparse large bolide data diverges badly from the survey estimates, in sizes where the survey estimates become ever-increasingly reliable, even by 100-200 m diameter. It appears that the bolide data provide a good "anchor" for the population in the size range up to about 5 m diameter, but above that one might do better just connecting that population with a straight line (on a log-log plot) to the survey-determined population at larger sizes, 50-100 m diameter or so.
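In its rawest form, the re-detection method described above is a capture-recapture style ratio: completion is approximated by the fraction of detections that are re-detections, and population by the discovered count divided by completion. The toy sketch below uses invented numbers and deliberately omits the orbit-dependent detectability correction the author stresses is necessary:

```python
def estimated_population(known_before, detections, redetections):
    """Raw re-detection estimate of survey completion and total population.

    completion ~ redetections / detections
    population ~ total number discovered / completion

    A real survey analysis must correct this raw ratio for orbit-dependent
    detectability via simulation, as described in the abstract; all numbers
    here are hypothetical.
    """
    completion = redetections / detections
    total_discovered = known_before + (detections - redetections)
    return completion, total_discovered / completion

completion, population = estimated_population(
    known_before=900, detections=200, redetections=150
)
print(completion, round(population))  # 0.75 1267
```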
NASA Astrophysics Data System (ADS)
Ling, Khoo Mei; Ghaffar, Mazlan Abd.
2014-09-01
This study examines the movement of a food item and the estimation of gastric emptying time using X-radiography techniques in the clownfish (Amphiprion ocellaris) fed in captivity. Fishes were voluntarily fed to satiation after being deprived of food for 72 hours, using pellets treated with barium sulphate (BaSO4). The movement of the food item was monitored at different times after feeding. A total of 36 hours was needed for the food items to be evacuated completely from the stomach. Results on the modeling of meal satiation were also discussed. The relationship between satiation meal size and body weight was allometric, with a power value of 1.28.
Nilsson, Maria A; Härlid, Anna; Kullberg, Morgan; Janke, Axel
2010-05-01
The native rodents are the most species-rich placental mammal group on the Australian continent. Fossils of native Australian rodents belonging to the group Conilurini are known from Northern Australia at 4.5 Ma. These fossil assemblages already display a rich diversity of rodents, but the exact timing of their arrival on the Australian continent is not yet established. The complete mitochondrial genomes of two native Australian rodents, Leggadina lakedownensis (Lakeland Downs mouse) and Pseudomys chapmani (Western Pebble-mound mouse), were sequenced to investigate their evolutionary history. The molecular data were used for studying the phylogenetic position and divergence times of the Australian rodents, using 12 calibration points and various methods. Phylogenetic analyses place the native Australian rodents as the sister group to the genus Mus. The Mus-Conilurini calibration point (7.3-11.0 Ma) is highly critical for estimating rodent divergence times, while the influence of the different algorithms on estimating divergence times is negligible. The influence of the data type was investigated, indicating that amino acid data are more likely to reflect the correct divergence times than nucleotide sequences. The study of problems related to estimating divergence times in fast-evolving lineages such as rodents emphasizes that the choice of data and calibration points is critical. Furthermore, it is essential to include accurate calibration points for fast-evolving groups, because the divergence times can otherwise be estimated to be significantly older. The divergence times of the Australian rodents are highly congruent and are estimated at 6.5-7.2 Ma, a date that is compatible with their fossil record.
España-Romero, Vanesa; Golubic, Rajna; Martin, Kathryn R; Hardy, Rebecca; Ekelund, Ulf; Kuh, Diana; Wareham, Nicholas J; Cooper, Rachel; Brage, Soren
2014-01-01
To compare physical activity (PA) subcomponents from EPIC Physical Activity Questionnaire (EPAQ2) and combined heart rate and movement sensing in older adults. Participants aged 60-64y from the MRC National Survey of Health and Development in Great Britain completed EPAQ2, which assesses self-report PA in 4 domains (leisure time, occupation, transportation and domestic life) during the past year and wore a combined sensor for 5 consecutive days. Estimates of PA energy expenditure (PAEE), sedentary behaviour, light (LPA) and moderate-to-vigorous PA (MVPA) were obtained from EPAQ2 and combined sensing and compared. Complete data were available in 1689 participants (52% women). EPAQ2 estimates of PAEE and MVPA were higher than objective estimates and sedentary time and LPA estimates were lower [bias (95% limits of agreement) in men and women were 32.3 (-61.5 to 122.6) and 29.0 (-39.2 to 94.6) kJ/kg/day for PAEE; -4.6 (-10.6 to 1.3) and -6.0 (-10.9 to -1.0) h/day for sedentary time; -171.8 (-454.5 to 110.8) and -60.4 (-367.5 to 246.6) min/day for LPA; 91.1 (-159.5 to 341.8) and 55.4 (-117.2 to 228.0) min/day for MVPA]. There were significant positive correlations between all self-reported and objectively assessed PA subcomponents (rho= 0.12 to 0.36); the strongest were observed for MVPA (rho = 0.30 men; rho = 0.36 women) and PAEE (rho = 0.26 men; rho = 0.25 women). EPAQ2 produces higher estimates of PAEE and MVPA and lower estimates of sedentary and LPA than objective assessment. However, both methodologies rank individuals similarly, suggesting that EPAQ2 may be used in etiological studies in this population.
Thissen, David; Liu, Yang; Magnus, Brooke; Quinn, Hally; Gipson, Debbie S; Dampier, Carlton; Huang, I-Chan; Hinds, Pamela S; Selewski, David T; Reeve, Bryce B; Gross, Heather E; DeWalt, Darren A
2016-01-01
To assess minimally important differences (MIDs) for several pediatric self-report item banks from the National Institutes of Health Patient-Reported Outcomes Measurement Information System® (PROMIS®). We presented vignettes comprising sets of two completed PROMIS questionnaires and asked judges to declare whether the individual completing those questionnaires had an important change or not. We enrolled judges (including adolescents, parents, and clinicians) who responded to 24 vignettes (six for each domain of depression, pain interference, fatigue, and mobility). We used item response theory to model responses to the vignettes across different judges and estimated MID as the point at which 50% of the judges would declare an important change. We enrolled 246 judges (78 adolescents, 85 parents, and 83 clinicians). The MID estimated with clinician data was about 2 points on the PROMIS T-score scale, and the MID estimated with adolescent and parent data was about 3 points on that same scale. The MIDs enhance the value of PROMIS pediatric measures in clinical research studies to identify meaningful changes in health status over time.
78 FR 54729 - Reports, Forms, and Record Keeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... information: Title--NHTSA Distracted Driving Survey Project. Type of Request--Revision of previously approved... region, age, and gender. The National Survey on Distracted Driving Attitudes and Behaviors (NSDDAB) will... driving behaviors. The estimated average amount of time to complete the survey is 20 minutes. This...
Robust Ambiguity Estimation for an Automated Analysis of the Intensive Sessions
NASA Astrophysics Data System (ADS)
Kareinen, Niko; Hobiger, Thomas; Haas, Rüdiger
2016-12-01
Very Long Baseline Interferometry (VLBI) is a unique space-geodetic technique that can directly determine the Earth's phase of rotation, namely UT1. The daily estimates of the difference between UT1 and Coordinated Universal Time (UTC) are computed from one-hour long VLBI Intensive sessions. These sessions are essential for providing timely UT1 estimates for satellite navigation systems. To produce timely UT1 estimates, efforts have been made to completely automate the analysis of VLBI Intensive sessions. This requires automated processing of X- and S-band group delays. These data often contain an unknown number of integer ambiguities in the observed group delays. In an automated analysis with the c5++ software the standard approach in resolving the ambiguities is to perform a simplified parameter estimation using a least-squares adjustment (L2-norm minimization). We implement the robust L1-norm with an alternative estimation method in c5++. The implemented method is used to automatically estimate the ambiguities in VLBI Intensive sessions for the Kokee-Wettzell baseline. The results are compared to an analysis setup where the ambiguity estimation is computed using the L2-norm. Additionally, we investigate three alternative weighting strategies for the ambiguity estimation. The results show that in automated analysis the L1-norm resolves ambiguities better than the L2-norm. The use of the L1-norm leads to a significantly higher number of good quality UT1-UTC estimates with each of the three weighting strategies.
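The contrast between L2- and L1-norm estimation can be illustrated with a toy one-parameter problem: estimating a constant offset from observations that contain one large outlier, of the kind an unresolved integer ambiguity jump would produce. This is only a schematic of the norm difference, not of the c5++ implementation; iteratively reweighted least squares (IRLS) is used here to approximate the L1 solution:

```python
def l2_estimate(obs):
    """Least-squares (L2) estimate of a constant offset: the mean."""
    return sum(obs) / len(obs)

def l1_estimate(obs, iters=50, eps=1e-9):
    """Approximate L1 estimate via iteratively reweighted least squares.

    Each residual is down-weighted by its magnitude, so the solution
    converges toward the median and resists outliers.
    """
    x = l2_estimate(obs)
    for _ in range(iters):
        weights = [1.0 / max(abs(o - x), eps) for o in obs]
        x = sum(w * o for w, o in zip(weights, obs)) / sum(weights)
    return x

# Residual delays clustered near 0.3 with one ambiguity-like jump of +10:
obs = [0.31, 0.29, 0.30, 0.32, 10.30]
print(round(l2_estimate(obs), 2), round(l1_estimate(obs), 2))  # 2.3 0.31
```

The mean is dragged far from the cluster by the single jump, while the L1 estimate stays with the bulk of the data, which is the behavior exploited for robust ambiguity resolution.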
Simons, Hannah R; Unger, Zoe D; Lopez, Priscilla M; Kohn, Julia E
2015-12-01
We estimated human papillomavirus (HPV) vaccine series completion and examined predictors of completion among adolescents and young adults in a large family planning network. Our retrospective cohort study of vaccine completion within 12 months and time to completion used electronic health record data from 119 Planned Parenthood health centers in 11 US states for 9648 patients who initiated HPV vaccination between January 2011 and January 2013. Among vaccine initiators, 29% completed the series within 12 months. Patients who were male, younger than 22 years, or non-Hispanic Black or who had public insurance were less likely to complete within 12 months and completed more slowly than their counterparts. Gender appeared to modify the effect of public versus private insurance on completion (adjusted hazard ratio = 0.76 for women and 0.95 for men; relative excess risk due to interaction = 0.41; 95% confidence interval = 0.09, 0.73). Completion was low yet similar to previous studies conducted in safety net settings.
Borah, Rohit; Brown, Andrew W; Capers, Patrice L; Kaiser, Kathryn A
2017-01-01
Objectives To summarise logistical aspects of recently completed systematic reviews that were registered in the International Prospective Register of Systematic Reviews (PROSPERO) registry to quantify the time and resources required to complete such projects. Design Meta-analysis. Data sources and study selection All of the 195 registered and completed reviews (status from the PROSPERO registry) with associated publications at the time of our search (1 July 2014). Data extraction All authors extracted data using registry entries and publication information related to the data sources used, the number of initially retrieved citations, the final number of included studies, the time between registration date to publication date and number of authors involved for completion of each publication. Information related to funding and geographical location was also recorded when reported. Results The mean estimated time to complete the project and publish the review was 67.3 weeks (IQR=42). The number of studies found in the literature searches ranged from 27 to 92 020; the mean yield rate of included studies was 2.94% (IQR=2.5); and the mean number of authors per review was 5, SD=3. Funded reviews took significantly longer to complete and publish (mean=42 vs 26 weeks) and involved more authors and team members (mean=6.8 vs 4.8 people) than those that did not report funding (both p<0.001). Conclusions Systematic reviews presently take much time and require large amounts of human resources. In the light of the ever-increasing volume of published studies, application of existing computing and informatics technology should be applied to decrease this time and resource burden. We discuss recently published guidelines that provide a framework to make finding and accessing relevant literature less burdensome. PMID:28242767
Improving Project Management with Simulation and Completion Distribution Functions
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2004-01-01
Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station.
Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. First, project stakeholders improved their awareness of project completion risk. Second, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
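The repeated-simulation idea behind a completion-time distribution function can be illustrated with a minimal sketch. The example below is a hypothetical stand-in for PAST, not the actual tool: it assumes an invented serial three-activity project, draws activity durations from triangular(optimistic, most likely, pessimistic) distributions, and builds an empirical completion-time distribution.

```python
import random

# Hypothetical three-activity serial project; durations in days follow
# triangular(optimistic, most-likely, pessimistic) distributions, a
# simple stand-in for the stochastic duration inputs described above.
ACTIVITIES = [(4, 6, 10), (8, 10, 16), (5, 7, 12)]

def simulate_completion(n_runs=10000, seed=42):
    """Run the project many times to build an empirical completion-time
    distribution, mirroring the repeated-simulation idea in PAST."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        # Serial network: total duration is the sum of activity draws.
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in ACTIVITIES))
    return sorted(totals)

totals = simulate_completion()
p50 = totals[len(totals) // 2]        # median completion time
p90 = totals[int(0.9 * len(totals))]  # 90th-percentile completion time
```

Percentiles read off this empirical distribution (e.g. p50 and p90) are the kind of completion-time risk figures the abstract describes stakeholders visualizing.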
An evaluation of rapid methods for monitoring vegetation characteristics of wetland bird habitat
Tavernia, Brian G.; Lyons, James E.; Loges, Brian W.; Wilson, Andrew; Collazo, Jaime A.; Runge, Michael C.
2016-01-01
Wetland managers benefit from monitoring data of sufficient precision and accuracy to assess wildlife habitat conditions and to evaluate and learn from past management decisions. For large-scale monitoring programs focused on waterbirds (waterfowl, wading birds, secretive marsh birds, and shorebirds), precision and accuracy of habitat measurements must be balanced with fiscal and logistic constraints. We evaluated a set of protocols for rapid, visual estimates of key waterbird habitat characteristics made from the wetland perimeter against estimates from (1) plots sampled within wetlands, and (2) cover maps made from aerial photographs. Estimated percent cover of annuals and perennials using a perimeter-based protocol fell within 10% of plot-based estimates, and percent cover estimates for seven vegetation height classes were within 20% of plot-based estimates. Perimeter-based estimates of total emergent vegetation cover did not differ significantly from cover map estimates. Post-hoc analyses revealed evidence for observer effects in estimates of annual and perennial covers and vegetation height. Median time required to complete perimeter-based methods was less than 7% of the time needed for intensive plot-based methods. Our results show that rapid, perimeter-based assessments, which increase sample size and efficiency, provide vegetation estimates comparable to more intensive methods.
Estimating the biophysical properties of neurons with intracellular calcium dynamics.
Ye, Jingxin; Rozdeba, Paul J; Morone, Uriel I; Daou, Arij; Abarbanel, Henry D I
2014-06-01
We investigate the dynamics of a conductance-based neuron model coupled to a model of intracellular calcium uptake and release by the endoplasmic reticulum. The intracellular calcium dynamics occur on a time scale that is orders of magnitude slower than voltage spiking behavior. Coupling these mechanisms sets the stage for the appearance of chaotic dynamics, which we observe within certain ranges of model parameter values. We then explore the question of whether one can, using observed voltage data alone, estimate the states and parameters of the voltage plus calcium (V+Ca) dynamics model. We find the answer is negative. Indeed, we show that voltage plus another observed quantity must be known to allow the estimation to be accurate. We show that observing both the voltage time course V(t) and the intracellular Ca time course will permit accurate estimation, and from the estimated model state, accurate prediction after observations are completed. This sets the stage for how one will be able to use a more detailed model of V+Ca dynamics in neuron activity in the analysis of experimental data on individual neurons as well as functional networks in which the nodes (neurons) have these biophysical properties.
Validity of a Self-Report Recall Tool for Estimating Sedentary Behavior in Adults.
Gomersall, Sjaan R; Pavey, Toby G; Clark, Bronwyn K; Jasman, Adib; Brown, Wendy J
2015-11-01
Sedentary behavior is continuing to emerge as an important target for health promotion. The purpose of this study was to determine the validity of a self-report use of time recall tool, the Multimedia Activity Recall for Children and Adults (MARCA), in estimating time spent sitting/lying, compared with a device-based measure. Fifty-eight participants (48% female, [mean ± standard deviation] 28 ± 7.4 years of age, 23.9 ± 3.05 kg/m²) wore an activPAL device for 24 h and the following day completed the MARCA. Pearson correlation coefficients (r) were used to analyze convergent validity of the adult MARCA compared with activPAL estimates of total sitting/lying time. Agreement was examined using Bland-Altman plots. According to activPAL estimates, participants spent 10.4 hr/day [standard deviation (SD) = 2.06] sitting or lying down while awake. The correlation between MARCA and activPAL estimates of total sit/lie time was r = .77 (95% confidence interval = 0.64-0.86; P < .001). Bland-Altman analyses revealed a mean bias of +0.59 hr/day with moderately wide limits of agreement (-2.35 hr to +3.53 hr/day). This study found moderate to strong agreement between the adult MARCA and the activPAL, suggesting that the MARCA is an appropriate tool for the measurement of time spent sitting or lying down in an adult population.
2010-08-23
typhoon. Part I: Satellite data analyses. J. Atmos. Sci., 63, 1377–1389. ——, ——, X. Ge, B. Wang, and M. Peng, 2003: Satellite data analysis and numerical...
Stability region maximization by decomposition-aggregation method. [Skylab stability
NASA Technical Reports Server (NTRS)
Siljak, D. D.; Cuk, S. M.
1974-01-01
The aim of this work is to improve the estimates of the stability regions by formulating and resolving a proper maximization problem. The solution of the problem provides the best estimate of the maximal value of the structural parameter and at the same time yields the optimum comparison system, which can be used to determine the degree of stability of the Skylab. The analysis procedure is completely computerized, resulting in a flexible and powerful tool for stability considerations of large-scale linear as well as nonlinear systems.
2006-07-01
Western Diet compared with representative medium and high fat diets (Ghibaudi et al., Obesity Research, 10(9), pp 956-963): Sucrose 29.1, 25.2, 17; Maltodextrin 8.5, 6.5, 10
Spectral Unmixing Analysis of Time Series Landsat 8 Images
NASA Astrophysics Data System (ADS)
Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.
2018-05-01
Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, each endmember is updated as the mean value of its "purified" pixels, i.e., the residual of each mixed pixel after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than a "separate unmixing" approach.
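The NNLS abundance-estimation step can be illustrated with a minimal sketch. The example below is not the paper's code: it solves nonnegative least squares for a single hypothetical pixel (3 bands, 2 endmembers, made-up spectra) by projected gradient descent rather than the active-set NNLS solvers typically used in practice.

```python
def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def nnls_projected_gradient(E, y, steps=2000, lr=0.5):
    """Nonnegative least squares by projected gradient descent:
    minimise ||y - E a||^2 subject to a >= 0."""
    Et = [list(col) for col in zip(*E)]   # transpose of E
    a = [0.0] * len(E[0])
    for _ in range(steps):
        residual = [yi - pi for yi, pi in zip(y, matvec(E, a))]  # y - E a
        grad = [-2.0 * g for g in matvec(Et, residual)]          # gradient
        a = [max(0.0, ai - lr * gi) for ai, gi in zip(a, grad)]  # project onto a >= 0
    return a

# Hypothetical 3-band pixel mixed from two endmember spectra
# (columns of E); the true abundances are 0.7 and 0.3.
E = [[0.9, 0.1],
     [0.5, 0.5],
     [0.1, 0.9]]
y = [0.66, 0.50, 0.34]
abundances = nnls_projected_gradient(E, y)
```

In the full algorithm this per-pixel solve would run inside the iterative loop, alternating with the "purified pixel" endmember update described above.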
Judge, Lawrence W; Burke, Jeanmarie R
2010-06-01
To determine the effects of training sessions, involving high-resistance, low-repetition bench press exercise, on strength recovery patterns, as a function of gender and training background. The subjects were 12 athletes (6 males and 6 females) and age-matched college students of both genders (4 males and 4 females). The subjects completed a 3-wk resistance training program involving a bench press exercise, 3 d/wk, to become familiar with the testing procedure. After the completion of the resistance training program, the subjects, on three consecutive weeks, participated in two testing sessions per week, baseline session and recovery session. During the testing sessions, subjects performed five sets of the bench press exercise at 50% to 100% of perceived five repetition maximum (5-RM). Following the weekly baseline sessions, subjects rested during a 4-, 24-, or 48-h recovery period. Strength measurements were estimates of one repetition maximum (1-RM), using equivalent percentages for the number of repetitions completed by the subject at the perceived 5-RM effort of the bench press exercise. The full-factorial ANOVA model revealed a Gender by Recovery Period by Testing Session interaction effect, F(2, 32) = 10.65; P < .05. Among male subjects, decreases in estimated 1-RM were detected at the 4- and 24-h recovery times. There were no differences in muscle strength among the female subjects, regardless of recovery time. For bench press exercises, using different recovery times of 48 h for males and 4 h for females may optimize strength development as a function of gender.
78 FR 71712 - Request for Comments of a Previously Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... during the operations by the Armed forces of the United States in World War II, Korea, Vietnam, and...: 550. Estimated Completion Time per Response: 1 hour. Frequency of Collection: Annually. ADDRESSES... 49 CFR 1:48. Dated: November 21, 2013. Julie P. Agarwal, Secretary, Maritime Administration. [FR Doc...
76 FR 20386 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-12
... series of online investor forms. Investors may access these forms through the SEC Center for Complaints... depends on the number of investors who use the forms each year and the estimated time it takes to complete... techniques or other forms of information technology. Consideration will be given to comments and suggestions...
Retirement Patterns from Career Employment
ERIC Educational Resources Information Center
Cahill, Kevin E.; Giandrea, Michael D.; Quinn, Joseph F.
2006-01-01
Purpose: This article investigates how older Americans leave their career jobs and estimates the extent of intermediate labor force activity (bridge jobs) between full-time work on a career job and complete labor-force withdrawal. Design and Methods: Using data from the Health and Retirement Study, we explored the work histories and retirement…
Scharfenberg, Janna; Schaper, Katharina; Krummenauer, Frank
2014-01-01
The German "Dr med" plays a specific role in doctoral thesis settings since students may start the underlying doctoral project during their studies at medical school. If a Medical Faculty principally encourages this approach, then it should support the students in performing the respective projects as efficiently as possible. Consequently, it must be ensured that students are able to implement and complete a doctoral project in parallel to their studies. As a characteristic efficiency feature of these "Dr med" initiatives, the proportion of doctoral projects successfully completed shortly after graduating from medical school is proposed and illustrated. The proposed characteristic can be estimated by the time period between the state examination (date of completion of the qualifying medical examination) and the doctoral examination. Completion of the doctoral project "during their medical studies" was then characterised by a doctoral examination no later than 12 months after the qualifying medical state examination. To illustrate the estimation and interpretation of this characteristic, it was retrospectively estimated on the basis of the full sample of all doctorates successfully completed between July 2009 and June 2012 at the Department of Human Medicine at the Faculty of Health of the University of Witten/Herdecke. During the period of investigation defined, a total number of 56 doctoral examinations were documented, 30% of which were completed within 12 months after the qualifying medical state examination (95% confidence interval 19 to 44%). The median duration between state and doctoral examination was 27 months. The proportion of doctoral projects completed parallel to the medical studies increased during the investigation period from 14% in the first year (July 2009 till June 2010) to 40% in the third year (July 2011 till June 2012).
Only about a third of all "Dr med" projects at the Witten/Herdecke Faculty of Health were completed during or close to the qualifying medical studies. This proportion, however, increased after the introduction of a curriculum on research methodology and practice in 2010; prospective longitudinal studies will have to clarify whether this is causal or mere chronological coincidence. In summary, the proposed method for determining the process efficiency of a medical faculty's "Dr med" programme has proven to be both feasible and informative. Copyright © 2014. Published by Elsevier GmbH.
Parsons, T J; Thomas, C; Power, C
2009-08-01
To investigate patterns of, and associations between, physical activity at work and in leisure time, television viewing and computer use. 4531 men and 4594 women with complete plausible data, age 44-45 years, participating in the 1958 British birth cohort study. Physical activity, television viewing and computer use (hours/week) were estimated using a self-complete questionnaire and intensity (MET hours/week) derived for physical activity. Relationships were investigated using linear regression and chi-squared tests. From a target sample of 11,971, 9223 provided information on physical activity, of whom 75% and 47% provided complete and plausible activity data on work and leisure time activity respectively. Men and women spent a median of 40.2 and 34.2 h/week, respectively, in work activity, and 8.3 and 5.8 h/week in leisure activity. Half of all participants watched television for ≥2 h/day, and half used a computer for <1 h/day. Longer work hours were not associated with a shorter duration of leisure activity, but were associated with a shorter duration of computer use (men only). In men, higher work MET hours were associated with higher leisure-time MET hours, and shorter durations of television viewing and computer use. Watching more television was related to fewer hours or MET hours of leisure activity, as was longer computer use in men. Longer computer use was related to more hours (or MET hours) in leisure activities in women. Physical activity levels at work and in leisure time in mid-adulthood are low. Television viewing (and computer use in men) may compete with leisure activity for time, whereas longer duration of work hours is less influential. To change active and sedentary behaviours, better understanding of barriers and motivators is needed.
Levine, Andrew J; Martin, Eileen; Sacktor, Ned; Munro, Cynthia; Becker, James
2017-06-01
Prevalence estimates of HIV-associated neurocognitive disorders (HAND) may be inflated. Estimates are determined via cohort studies in which participants may apply suboptimal effort on neurocognitive testing, thereby inflating estimates. Additionally, fluctuating HAND severity over time may be related to inconsistent effort. To address these hypotheses, we characterized effort in the Multicenter AIDS Cohort Study. After neurocognitive testing, 935 participants (525 HIV- and 410 HIV+) completed the visual analog effort scale (VAES), rating their effort from 0% to 100%. Those with <100% then indicated the reason(s) for suboptimal effort. K-means cluster analysis established 3 groups: high (mean = 97%), moderate (79%), and low effort (51%). Rates of HAND and other characteristics were compared between the groups. Linear regression examined the predictors of VAES score. Data from 57 participants who completed the VAES at 2 visits were analyzed to characterize the longitudinal relationship between effort and HAND severity. Fifty-two percent of participants reported suboptimal effort (<100%), with no difference between serostatus groups. Common reasons included "tired" (43%) and "distracted" (36%). The lowest effort group had greater asymptomatic neurocognitive impairment and minor neurocognitive disorder diagnosis (25% and 33%) as compared with the moderate (23% and 15%) and the high (12% and 9%) effort groups. Predictors of suboptimal effort were self-reported memory impairment, African American race, and cocaine use. Change in effort between baseline and follow-up correlated with change in HAND severity. Suboptimal effort seems to inflate estimated HAND prevalence and explain fluctuation of severity over time. A simple modification of study protocols to optimize effort is indicated by the results.
Central Visual Prosthesis With Interface at the Lateral Geniculate Nucleus
2017-07-01
...currently used in the field to implant Deep Brain Stimulation electrodes. We thus limited ourselves to using a '4 French' size split sheath...this program. At this time, several approaches for realizing the complete system have been evaluated. Initially, a very simple mechanical mockup
Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore
2017-01-01
A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. Aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date. PMID:28777820
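Kaplan-Meier survival estimation, used in the abstract above to handle trials still unpublished at last follow-up (right-censoring), is simple to sketch. The numbers below are invented for illustration and are not the study's data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  `times` are months from trial
    completion to publication; `events` marks publication (1) versus
    censoring (0, i.e. still unpublished at last follow-up).  Returns
    (time, survival probability) pairs at each publication time."""
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)   # number still at risk
        if d > 0:
            survival *= 1.0 - d / n             # KM product-limit step
            curve.append((t, survival))
    return curve

# Invented months-to-publication for eight trials; three censored.
times  = [9, 12, 20, 20, 31, 36, 60, 60]
events = [1,  1,  1,  1,  1,  0,  0,  0]
curve = kaplan_meier(times, events)
```

Here `curve` gives the estimated probability of remaining unpublished beyond each observed publication time; censored trials contribute to the risk set without triggering a survival drop.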
NASA Astrophysics Data System (ADS)
Latorre, Borja; Peña-Sancho, Carolina; Angulo-Jaramillo, Rafaël; Moret-Fernández, David
2015-04-01
Measurement of soil hydraulic properties is of paramount importance in fields such as agronomy, hydrology or soil science. Based on an analysis of the Haverkamp et al. (1994) model, the aim of this paper is to present a technique to estimate the soil hydraulic properties (sorptivity, S, and hydraulic conductivity, K) from full-time cumulative infiltration curves. The method (NSH) was validated by means of 12 synthetic infiltration curves generated with HYDRUS-3D from known soil hydraulic properties. The K values used to simulate the synthetic curves were compared to those estimated with the proposed method. A procedure to identify and remove the effect of the contact sand layer on the cumulative infiltration curve was also developed. A sensitivity analysis was performed using the water level measurement as the uncertainty source. Finally, the procedure was evaluated using different infiltration times and data noise. Since a good correlation (R² = 0.98) was obtained between the K used in HYDRUS-3D to model the infiltration curves and that estimated by the NSH method, it can be concluded that this technique is robust enough to estimate the soil hydraulic conductivity from complete infiltration curves. The numerical procedure to detect and remove the influence of the contact sand layer on the K and S estimates proved robust and efficient. An effect of infiltration-curve noise on the K estimate was observed, whose uncertainty increased with increasing noise. Finally, the results showed that infiltration time was an important factor in estimating K: lower values of K, or smaller uncertainty, required longer infiltration times.
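Fitting hydraulic parameters to a cumulative infiltration curve can be sketched with a simpler stand-in model. The example below is not the paper's NSH method: it fits the two-term Philip approximation I(t) = S·sqrt(t) + A·t (S approximates sorptivity; A is related to K) to a synthetic, noise-free curve by closed-form linear least squares.

```python
import math

def fit_philip(times, infiltration):
    """Closed-form least-squares fit of the two-term Philip equation
    I(t) = S*sqrt(t) + A*t, a simpler stand-in for the Haverkamp model
    used in the paper.  Solves the 2x2 normal equations for the basis
    functions sqrt(t) and t."""
    s11 = sum(times)                       # sum of sqrt(t)*sqrt(t)
    s12 = sum(t ** 1.5 for t in times)     # sum of sqrt(t)*t
    s22 = sum(t * t for t in times)        # sum of t*t
    b1 = sum(math.sqrt(t) * i for t, i in zip(times, infiltration))
    b2 = sum(t * i for t, i in zip(times, infiltration))
    det = s11 * s22 - s12 * s12
    S = (b1 * s22 - b2 * s12) / det        # Cramer's rule, first unknown
    A = (s11 * b2 - s12 * b1) / det        # Cramer's rule, second unknown
    return S, A

# Synthetic, noise-free curve with known S = 0.05 and A = 0.001
# (illustrative values only).
times = [float(t) for t in range(1, 301)]
depth = [0.05 * math.sqrt(t) + 0.001 * t for t in times]
S, A = fit_philip(times, depth)
```

With noiseless data the fit recovers the generating parameters exactly; adding noise to `depth` reproduces, in miniature, the K-uncertainty behaviour the abstract reports.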
Short-term international migration trends in England and Wales from 2004 to 2009.
Whitworth, Simon; Loukas, Konstantinos; McGregor, Ian
2011-01-01
Short-term migration estimates for England and Wales are the latest addition to the Office for National Statistics (ONS) migration statistics. This article discusses definitions of short-term migration and the methodology that is used to produce the estimates. Some of the estimates and the changes in the estimates over time are then discussed. The article includes previously unpublished short-term migration statistics and therefore helps to give a more complete picture of the size and characteristics of short-term international migration for England and Wales than has previously been possible. ONS have identified a clear user requirement for short-term migration estimates at local authority (LA) level. Consequently, attention is also paid to the progress that has been made and future work that is planned to distribute England and Wales short-term migration estimates to LA level.
Availability analysis of an HTGR fuel recycle facility. Summary report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharmahd, J.N.
1979-11-01
An availability analysis of reprocessing systems in a high-temperature gas-cooled reactor (HTGR) fuel recycle facility was completed. This report summarizes work done to date to define and determine reprocessing system availability for a previously planned HTGR recycle reference facility (HRRF). Schedules and procedures for further work during reprocessing development and for HRRF design and construction are proposed in this report. Probable failure rates, transfer times, and repair times are estimated for major system components. Unscheduled down times are summarized.
State of charge estimation in Ni-MH rechargeable batteries
NASA Astrophysics Data System (ADS)
Milocco, R. H.; Castro, B. E.
In this work we estimate the state of charge (SOC) of Ni-MH rechargeable batteries using the Kalman filter based on a simplified electrochemical model. First, we derive the complete electrochemical model of the battery, which includes diffusional processes and kinetic reactions in both the Ni and MH electrodes. The full model is then reduced to a cascade of two parts, a linear time-invariant dynamical sub-model followed by a static nonlinearity. Both parts are identified using the current and potential measured at the terminals of the battery with a simple 1-D minimization procedure. The inverse of the static nonlinearity together with a Kalman filter recasts SOC estimation as a linear estimation problem. Experimental results with commercial batteries are provided to illustrate the estimation procedure and to show its performance.
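The filtering idea can be illustrated with a deliberately simplified scalar sketch. This is not the paper's identified model: it assumes a coulomb-counting process model and an invented linear OCV(SOC) measurement map (the V0 and K_OCV values below are made up) in place of the identified cascade and static nonlinearity.

```python
def kalman_soc(currents, voltages, dt=1.0, capacity=3600.0,
               soc0=0.5, p0=0.1, q=1e-6, r=1e-3):
    """Scalar Kalman filter for battery state of charge (SOC).
    Process model: coulomb counting, soc += i*dt/capacity.
    Measurement model: an assumed linear OCV map v = V0 + K_OCV*soc,
    standing in for the inverted static nonlinearity (illustrative
    values, not identified from any real cell)."""
    V0, K_OCV = 3.0, 1.0
    soc, p = soc0, p0
    estimates = []
    for i, v in zip(currents, voltages):
        # Predict: integrate current, grow the error covariance.
        soc += i * dt / capacity
        p += q
        # Update: correct the prediction with the voltage reading.
        innovation = v - (V0 + K_OCV * soc)
        k = p * K_OCV / (K_OCV * p * K_OCV + r)   # Kalman gain
        soc += k * innovation
        p *= 1.0 - k * K_OCV
        estimates.append(soc)
    return estimates

# Constant 1 A discharge for 600 s from a true SOC of 0.8; the filter
# starts from a deliberately wrong initial guess of 0.6.
true_soc = [0.8 - (t + 1) / 3600.0 for t in range(600)]
voltages = [3.0 + s for s in true_soc]
est = kalman_soc([-1.0] * 600, voltages, soc0=0.6)
```

Even from a wrong initial SOC, the voltage corrections pull the estimate onto the true trajectory within a few steps, which is the practical appeal of the Kalman approach over open-loop coulomb counting.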
Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials
Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.
2014-01-01
Background Identifying efficacious interventions for the prevention and treatment of human diseases depends on the efficient development and implementation of controlled clinical trials. Essential to reducing the time and burden of completing the clinical trial lifecycle is determining which aspects take the longest, delay other stages, and may lead to better resource utilization without diminishing scientific quality, safety, or the protection of human subjects. Purpose In this study we modeled time-to-event data to explore relationships between clinical trial protocol development and implementation times, as well as identify potential correlates of prolonged development and implementation. Methods We obtained time interval and participant accrual data from 111 interventional clinical trials initiated between 2006 and 2011 by NIH’s HIV/AIDS Clinical Trials Networks. We determined the time (in days) required to complete defined phases of clinical trial protocol development and implementation. Kaplan-Meier estimates were used to assess the rates at which protocols reached specified terminal events, stratified by study purpose (therapeutic, prevention) and phase group (pilot/phase I, phase II, and phase III/IV). We also examined several potential correlates of prolonged development and implementation intervals. Results Even though phase grouping did not determine development or implementation times of either therapeutic or prevention studies, overall we observed wide variation in protocol development times. Moreover, we detected a trend toward phase III/IV therapeutic protocols exhibiting longer development (median 2.5 years) and implementation times (>3 years). We also found that protocols exceeding the median number of days for completing the development interval had significantly longer implementation. Limitations The use of a relatively small set of protocols may have limited our ability to detect differences across phase groupings.
Some timing effects present for a specific study phase may have been masked by combining protocols into phase groupings. Presence of informative censoring, such as withdrawal of some protocols from development if they began showing signs of lost interest among investigators, complicates interpretation of Kaplan-Meier estimates. Because this study constitutes a retrospective examination over an extended period of time, it does not allow for the precise identification of relative factors impacting timing. Conclusions Delays not only increase the time and cost to complete clinical trials, but they also diminish their usefulness by failing to answer research questions in time. We believe that research analyzing the time spent traversing defined intervals across the clinical trial protocol development and implementation continuum can stimulate business process analyses and reengineering efforts that could lead to reductions in the time from clinical trial concept to results, thereby accelerating progress in clinical research. PMID:24980279
A comparative simulation study of AR(1) estimators in short time series.
Krone, Tanja; Albers, Casper J; Timmerman, Marieke E
2017-01-01
Various estimators of the autoregressive model exist. We compare their performance in estimating the autocorrelation in short time series. In Study 1, under correct model specification, we compare the frequentist r1 estimator, C-statistic, ordinary least squares estimator (OLS) and maximum likelihood estimator (MLE), and a Bayesian method, considering flat (Bf) and symmetrized reference (Bsr) priors. In a completely crossed experimental design we vary the length of the time series (i.e., T = 10, 25, 40, 50 and 100) and the autocorrelation (from -0.90 to 0.90 in steps of 0.10). The results show the lowest bias for Bsr and the lowest variability for r1. The power in different conditions is highest for Bsr and OLS. For T = 10, the absolute performance of all methods is poor, as expected. In Study 2, we study the robustness of the methods under misspecification by generating the data according to an ARMA(1,1) model, but still analysing the data with an AR(1) model. We use the two methods with the lowest bias for this study, i.e., Bsr and MLE. The bias gets larger as the non-modelled moving average parameter becomes larger. Both the variability and the power depend on the non-modelled parameter. The differences between the two estimation methods are negligible for all measures.
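Two of the frequentist estimators compared in the study, the lag-1 sample autocorrelation (r1) and the OLS slope of x_t on x_(t-1), can be sketched as follows. This is a minimal illustration, not the study's code: the series length, the noise model, and the true autocorrelation of 0.5 are assumptions chosen for the example.

```python
import numpy as np

def r1_estimate(x):
    # Lag-1 sample autocorrelation: lag-1 cross-products over squares about the mean
    xc = x - x.mean()
    return np.sum(xc[:-1] * xc[1:]) / np.sum(xc ** 2)

def ols_ar1(x):
    # OLS slope of x_t regressed on x_{t-1}, with an intercept
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    return beta[1]

# Simulate a short AR(1) series with true autocorrelation 0.5 (illustrative values)
rng = np.random.default_rng(0)
phi, T = 0.5, 100
x = np.empty(T)
x[0] = rng.standard_normal()
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.standard_normal()
```

For such a series both estimators recover the true value only up to sampling error of roughly 1/sqrt(T), which is why the study's T = 10 condition performs poorly for every method.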
Uhlig, Constantin E; Seitz, Berthold; Eter, Nicole; Promesberger, Julia; Busse, Holger
2014-01-01
To evaluate the relative efficiencies of five Internet-based digital and three paper-based scientific surveys and to estimate the costs for different-sized cohorts. Invitations to participate in a survey were distributed via e-mail to employees of two university hospitals (E1 and E2) and to members of a medical association (E3), as a link placed in a special text on the municipal homepage regularly read by the administrative employees of two cities (H1 and H2), and paper-based to workers at an automobile enterprise (P1) and college (P2) and senior (P3) students. The main parameters analyzed included the numbers of invited and actual participants, and the time and cost to complete the survey. Statistical analysis was descriptive, except for the Kruskal-Wallis H test, which was used to compare the three recruitment methods. Cost efficiencies were compared and extrapolated to different-sized cohorts. The ratios of completely answered questionnaires to distributed questionnaires were between 81.5% (E1) and 97.4% (P2). Between 6.4% (P1) and 57.0% (P2) of the invited participants completely answered the questionnaires. The costs per completely answered questionnaire were $0.57-$1.41 (E1-3), $1.70 and $0.80 for H1 and H2, respectively, and $3.36-$4.21 (P1-3). Based on our results, electronic surveys with 10, 20, 30, or 42 questions would be estimated to be most cost (and time) efficient if more than 101.6-225.9 (128.2-391.7), 139.8-229.2 (93.8-193.6), 165.8-230.6 (68.7-115.7), or 188.2-231.5 (44.4-72.7) participants were required, respectively. The study efficiency depended on the technical modalities of the survey methods and the engagement of the participants. Given our study design, our results suggest that for similar projects, which will certainly require more than two to three hundred participants, the most efficient way of conducting a questionnaire-based survey is likely via the Internet with a digital questionnaire, specifically via a centralized e-mail.
Kanagaratnam, Sathananthan; Schluter, Philip J
2012-06-01
To report robust and contemporary estimates of permanent teeth emergence ages in children of Māori, Pasifika, Chinese, Indian and European ethnic origin in the Auckland region. A stratified, two-stage cross-sectional study. Strata were defined by school decile status. Schools defined the first-stage sampling unit, and students the second stage. Invitations and consent forms were distributed to eligible participants at school for completion at home. Participants were examined at school-based clinics or in a mobile clinic. PARTICIPANTS/MATERIALS, AND METHODS: Children aged between 5 and 13 years enrolled within the Auckland Regional Dental Service. Schools were randomly selected and then all students within selected schools were invited to participate. Eligible participants completing a consent form had an additional tooth assessment that complemented their routine dental examination. A generalised gamma failure-time model was employed to estimate permanent tooth eruption ages. Visually based assessment of permanent tooth emergence. Overall, 3,466 children participated. Differences in median permanent tooth emergence ages were seen among ethnic groups and sexes (P ≤ 0.01). Pasifika children had earlier median eruption time than sex-matched Māori children, who (in turn) were more advanced than sex-matched European children. Median eruption age occurred earlier in girls than boys for all permanent teeth. Despite known demographic, geographic and ethnic differences, estimates of permanent teeth emergence timing widely used in New Zealand are based on historical overseas populations. The presented estimates provide new standards and may be more appropriate for dental therapists and dentists when assessing permanent teeth emergence in New Zealand children.
Does telephone scheduling assistance increase mammography screening adherence?
Payton, Colleen A; Sarfaty, Mona; Beckett, Shirley; Campos, Carmen; Hilbert, Kathleen
2015-11-01
The 2 objectives were: 1) describe the use of a patient navigation process utilized to promote adherence to mammography screening within a primary care practice, and 2) determine the result of the navigation process and estimate the time required to increase mammography screening with this approach in a commercially insured patient population enrolled in a health maintenance organization. An evaluation of a nonrandomized practice improvement intervention. Women eligible for mammography (n = 298) who did not respond to 2 reminder letters were contacted via telephone by a navigator who offered scheduling assistance for mammography screening. The patient navigator scheduled appointments, documented the number of calls, and confirmed completed mammograms in the electronic health record, as well as estimated the time for calls and chart review. Of the 188 participants reached by phone, 112 (59%) scheduled appointments using the patient navigator, 35 (19%) scheduled their own appointments independently prior to the call, and 41 (22%) declined. As a result of the telephone intervention, 78 of the 188 women reached (41%) received a mammogram; also, all 35 women who had independently scheduled a mammogram received one. Chart documentation confirmed that 113 (38%) of the cohort of 298 women completed a mammogram. The estimated time burden for the entire project was 55 hours and 33 minutes, including calling patients, scheduling appointments, and chart review. A patient navigator can increase mammography adherence in a previously nonadherent population by making the screening appointment while the patient is on the phone.
Global Precipitation at One-Degree Daily Resolution From Multi-Satellite Observations
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Morrissey, Mark M.; Curtis, Scott; Joyce, Robert; McGavock, Brad; Susskind, Joel
2000-01-01
The One-Degree Daily (1DD) technique is described for producing globally complete daily estimates of precipitation on a 1 deg x 1 deg lat/long grid from currently available observational data. Where possible (40 deg N-40 deg S), the Threshold-Matched Precipitation Index (TMPI) provides precipitation estimates in which the 3-hourly infrared brightness temperatures (IR T_b) are thresholded and all "cold" pixels are given a single precipitation rate. This approach is an adaptation of the Geostationary Operational Environmental Satellite (GOES) Precipitation Index (GPI), but for the TMPI the IR T_b threshold and conditional rain rate are set locally by month from Special Sensor Microwave/Imager (SSM/I)-based precipitation frequency and the Global Precipitation Climatology Project (GPCP) satellite-gauge (SG) combined monthly precipitation estimate, respectively. At higher latitudes the 1DD features a rescaled daily Television Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS) precipitation. The frequency of rain days in the TOVS is scaled down to match that in the TMPI at the data boundaries, and the resulting non-zero TOVS values are scaled locally to sum to the SG (which is a globally complete monthly product). The time series of the daily 1DD global images shows good continuity in time and across the data boundaries. Various examples are shown to illustrate uses. Validation for individual grid-box values shows a very high root-mean-square error, but it improves quickly when users perform time/space averaging according to their own requirements.
Song, Wei; Cho, Kyungeun; Um, Kyhyun; Won, Chee Sun; Sim, Sungdae
2012-01-01
Mobile robot operators must make rapid decisions based on information about the robot’s surrounding environment. This means that terrain modeling and photorealistic visualization are required for the remote operation of mobile robots. We have produced a voxel map and textured mesh from the 2D and 3D datasets collected by a robot’s array of sensors, but some upper parts of objects are beyond the sensors’ measurements and these parts are missing in the terrain reconstruction result. This result is an incomplete terrain model. To solve this problem, we present a new ground segmentation method to detect non-ground data in the reconstructed voxel map. Our method uses height histograms to estimate the ground height range, and a Gibbs-Markov random field model to refine the segmentation results. To reconstruct a complete terrain model of the 3D environment, we develop a 3D boundary estimation method for non-ground objects. We apply a boundary detection technique to the 2D image, before estimating and refining the actual height values of the non-ground vertices in the reconstructed textured mesh. Our proposed methods were tested in an outdoor environment in which trees and buildings were not completely sensed. Our results show that the time required for ground segmentation is faster than that for data sensing, which is necessary for a real-time approach. In addition, those parts of objects that were not sensed are accurately recovered to retrieve their real-world appearances. PMID:23235454
Boyd, Matt; Baker, Michael G.; Mansoor, Osman D.; Kvizhinadze, Giorgi; Wilson, Nick
2017-01-01
Background Countries are well advised to prepare for future pandemic risks (e.g., pandemic influenza, novel emerging agents or synthetic bioweapons). These preparations do not typically include planning for complete border closure. Even though border closure may not be instituted in time, and can fail, there might still be plausible chances of success for well-organized island nations. Objective To estimate costs and benefits of complete border closure in response to new pandemic threats, at an initial proof-of-concept level. New Zealand was used as a case study for an island country. Methods An Excel spreadsheet model was developed to estimate costs and benefits. Case-study specific epidemiological data was sourced from past influenza pandemics. Country-specific healthcare cost data, valuation of life, and lost tourism revenue were imputed (with lost trade also in scenario analyses). Results For a new pandemic equivalent to the 1918 influenza pandemic (albeit with half the mortality rate, “Scenario A”), it was estimated that successful border closure for 26 weeks provided a net societal benefit (e.g., of NZ$11.0 billion, USD$7.3 billion). Even in the face of a complete end to trade, a net benefit was estimated for scenarios where the mortality rate was high (e.g., at 10 times the mortality impact of “Scenario A”, or 2.75% of the country’s population dying) giving a net benefit of NZ$54 billion (USD$36 billion). But for some other pandemic scenarios where trade ceased, border closure resulted in a net negative societal value (e.g., for “Scenario A” times three for 26 weeks of border closure–but not for only 12 weeks of closure when it would still be beneficial). Conclusions This “proof-of-concept” work indicates that more detailed cost-benefit analysis of border closure in very severe pandemic situations for some island nations is probably warranted, as this course of action might sometimes be worthwhile from a societal perspective. PMID:28622344
Brooks, Hannah L; Pontefract, Sarah K; Hodson, James; Blackwell, Nicholas; Hughes, Elizabeth; Marriott, John F; Coleman, Jamie J
2016-05-03
Technology-Enhanced Learning (TEL) can be used to educate Foundation Programme trainee (F1 and F2) doctors. Despite the advantages of TEL, learning behaviours may be exhibited that are not desired by system developers or educators. The aim of this evaluation was to investigate how learner behaviours (e.g. time spent on task) were affected by temporal (e.g. time of year), module (e.g. word count), and individual (e.g. knowledge) factors for 16 mandatory TEL modules related to prescribing and therapeutics. Data were extracted from the SCRIPT e-Learning platform for first year Foundation trainee (F1) doctors in the Health Education England's West Midland region from 1st August 2013 to 5th August 2014. Generalised Estimating Equation models were used to examine the relationship between time taken to complete modules, date modules were completed, pre- and post-test scores, and module factors. Over the time period examined, 688 F1 doctors interacted with the 16 compulsory modules 10,255 times. The geometric mean time taken to complete a module was 28.9 min (95% Confidence Interval: 28.4-29.5) and 1,075 (10.5%) modules were completed in less than 10 min. In February and June (prior to F1 progression reviews), peaks occurred in the number of modules completed and troughs in the time taken. The greatest number of modules was completed, and the most time was spent on learning, on Sundays. More time was taken by those doctors with greater pre-test scores and those with larger improvements in test scores. Foundation trainees are exhibiting unintended learning behaviours in this TEL environment, which may be attributed to several factors. These findings can help guide future developments of this TEL programme and the integration of other TEL programmes into curricula by raising awareness of potential behavioural issues that may arise.
State estimation improves prospects for ocean research
NASA Astrophysics Data System (ADS)
Stammer, Detlef; Wunsch, C.; Fukumori, I.; Marshall, J.
Rigorous global ocean state estimation methods can now be used to produce dynamically consistent time-varying model/data syntheses, the results of which are being used to study a variety of important scientific problems. Figure 1 shows a schematic of a complete ocean observing and synthesis system that includes global observations and state-of-the-art ocean general circulation models (OGCM) run on modern computer platforms. A global observing system is described in detail in Smith and Koblinsky [2001], and the present status of ocean modeling and anticipated improvements are addressed by Griffies et al. [2001]. Here, the focus is on the third component of state estimation: the synthesis of the observations and a model into a unified, dynamically consistent estimate.
Sensorless position estimator applied to nonlinear IPMC model
NASA Astrophysics Data System (ADS)
Bernat, Jakub; Kolota, Jakub
2016-11-01
This paper addresses the issue of estimating position for an ionic polymer metal composite (IPMC), a type of electroactive polymer (EAP). The key step is the construction of a sensorless mode of operation using only current feedback. This work takes into account nonlinearities caused by electrochemical effects in the material. Using a recent observer design technique, the authors obtained both a Lyapunov-function-based estimation law and a sliding mode observer. To accomplish the observer design, the IPMC model was identified through a series of experiments. The research comprises time domain measurements. The identification process was completed by means of geometric scaling of three test samples. In the proposed design, the estimated position accurately tracks the polymer position, which is illustrated by the experiments.
Estimating the value of life and injury for pedestrians using a stated preference framework.
Niroomand, Naghmeh; Jenkins, Glenn P
2017-09-01
The incidence of pedestrian death over the period 2010 to 2014 per 1,000,000 in North Cyprus is about 2.5 times that of the EU, with 10.5 times more pedestrian road injuries than deaths. With the prospect of North Cyprus entering the EU, many investments need to be undertaken to improve road safety in order to reach EU benchmarks. We conducted a stated choice experiment to identify the preferences and tradeoffs of pedestrians in North Cyprus for improved walking times, pedestrian costs, and safety. The choice of route was examined using mixed logit models to obtain the marginal utilities associated with each attribute of the routes that consumers chose. These were used to estimate the individuals' willingness to pay (WTP) to save walking time and to avoid pedestrian fatalities and injuries. We then used the results to obtain community-wide estimates of the value of a statistical life (VSL) saved, the value of an injury (VI) prevented, and the value per hour of walking time saved. The estimate of the VSL was €699,434 and the estimate of VI was €20,077. These values are consistent, after adjusting for differences in incomes, with the median results of similar studies done for EU countries. The estimated value of time to pedestrians is €7.20 per person hour. The ratio of deaths to injuries is much higher for pedestrians than for road accidents, and this is completely consistent with the higher estimated WTP to avoid a pedestrian accident than to avoid a car accident. The value of time of €7.20 is quite high relative to the wages earned. Findings provide a set of information on the VRR for fatalities and injuries and the value of pedestrian time that is critical for conducting ex ante appraisals of investments to improve pedestrian safety. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
24 CFR Appendix C to Part 3500 - Instructions for Completing Good Faith Estimate (GFE) Form
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 5 2012-04-01 2012-04-01 false Instructions for Completing Good Faith Estimate (GFE) Form C Appendix C to Part 3500 Housing and Urban Development Regulations Relating.... 3500, App. C Appendix C to Part 3500—Instructions for Completing Good Faith Estimate (GFE) Form The...
Chong, Ka Chun; Zee, Benny Chung Ying; Wang, Maggie Haitian
2018-04-10
In an influenza pandemic, arrival times of cases are a proxy of the epidemic size and disease transmissibility. Because of intense surveillance of travelers from infected countries, detection is more rapid and complete than through local surveillance. Travel information can therefore provide a more reliable estimation of transmission parameters. We developed an Approximate Bayesian Computation algorithm to estimate the basic reproduction number (R0), in addition to the reporting rate and the unobserved epidemic start time, utilizing travel and routine surveillance data in an influenza pandemic. A simulation was conducted to assess the sampling uncertainty. The estimation approach was further applied to the 2009 influenza A/H1N1 pandemic in Mexico as a case study. In the simulations, we showed that the estimation approach was valid and reliable in different simulation settings. We also found estimates of R0 and the reporting rate to be 1.37 (95% Credible Interval [CI]: 1.26-1.42) and 4.9% (95% CI: 0.1%-18%), respectively, in the 2009 influenza pandemic in Mexico, which were robust to variations in the fixed parameters. The estimated R0 was consistent with that in the literature. This method is useful for officials to obtain reliable estimates of disease transmissibility for strategic planning. We suggest that improvements to the flow of reporting for confirmed cases among patients arriving at different countries are required. Copyright © 2018 Elsevier Ltd. All rights reserved.
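The paper's method jointly estimates R0, the reporting rate, and the epidemic start time from travel and surveillance data; the sketch below shows only the core Approximate Bayesian Computation rejection idea on a toy branching-process epidemic. The prior range, tolerance, generation count, and observation model are all illustrative assumptions, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_cases(r0, generations=6, seed_cases=1):
    # Toy branching-process epidemic: each case infects Poisson(r0) new cases
    total, current = seed_cases, seed_cases
    for _ in range(generations):
        current = rng.poisson(r0 * current)
        total += current
    return total

observed = simulate_cases(1.4)      # stand-in for observed surveillance counts
tol = max(2.0, 0.1 * observed)      # acceptance tolerance

# ABC rejection: keep prior draws whose simulated epidemic is close to the observed one
accepted = []
while len(accepted) < 200:
    r0 = rng.uniform(1.0, 2.0)      # assumed prior on R0
    if abs(simulate_cases(r0) - observed) <= tol:
        accepted.append(r0)

posterior_mean = float(np.mean(accepted))
```

The accepted draws approximate the posterior of R0 given the summary statistic; the paper's version conditions on richer summaries (arrival times of exported cases) and additional unknown parameters.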
2008-01-01
for the first time in stars other than the Sun. The complete extent of each helmet streamer above the stellar surface is about 24 R∗, which implies... Heliospheric Observatory (SOHO) satellite (Schwenn 2006; Suess & Nerney 2004; Vourlidas 2006). Following the analogy with the Sun, Massi & collaborators
Zee, Jarcy; Xie, Sharon X.
2015-01-01
Summary When a true survival endpoint cannot be assessed for some subjects, an alternative endpoint that measures the true endpoint with error may be collected instead; this often occurs when obtaining the true endpoint is too invasive or costly. We develop an estimated likelihood function for the situation where we have uncertain endpoints for all participants and true endpoints for only a subset of participants. We propose a nonparametric maximum estimated likelihood estimator of the discrete survival function of time to the true endpoint. We show that the proposed estimator is consistent and asymptotically normal. We demonstrate through extensive simulations that the proposed estimator has little bias compared to the naïve Kaplan-Meier survival function estimator, which uses only uncertain endpoints, and that it is more efficient under moderate missingness than the complete-case Kaplan-Meier survival function estimator, which uses only available true endpoints. Finally, we apply the proposed method to a dataset from the Alzheimer's Disease Neuroimaging Initiative to estimate the risk of developing Alzheimer's disease. PMID:25916510
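For context, the complete-case Kaplan-Meier comparator mentioned above reduces, for discrete event times, to a product over the distinct observed event times. A minimal sketch (this is the standard Kaplan-Meier estimator, not the authors' estimated-likelihood estimator; the data are made up):

```python
import numpy as np

def km_discrete(times, events):
    # Kaplan-Meier estimate over the distinct observed event times
    # times: follow-up time per subject; events: 1 = event observed, 0 = censored
    times, events = np.asarray(times), np.asarray(events)
    surv, S = {}, 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                # subjects still under observation at t
        d = np.sum((times == t) & (events == 1))    # events occurring at t
        S *= 1.0 - d / at_risk                      # multiply in the conditional survival
        surv[int(t)] = S
    return surv

est = km_discrete([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1])
```

Each factor 1 - d/at_risk is the conditional probability of surviving past time t, which is why censored subjects contribute to the risk sets but not to the event counts.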
Phadnis, Milind A.; Shireman, Theresa I.; Wetmore, James B.; Rigler, Sally K.; Zhou, Xinhua; Spertus, John A.; Ellerbeck, Edward F.; Mahnken, Jonathan D.
2014-01-01
In a population of chronic dialysis patients with an extensive burden of cardiovascular disease, estimation of the effectiveness of cardioprotective medication in literature is based on calculation of a hazard ratio comparing hazard of mortality for two groups (with or without drug exposure) measured at a single point in time or through the cumulative metric of proportion of days covered (PDC) on medication. Though both approaches can be modeled in a time-dependent manner using a Cox regression model, we propose a more complete time-dependent metric for evaluating cardioprotective medication efficacy. We consider that drug effectiveness is potentially the result of interactions between three time-dependent covariate measures, current drug usage status (ON versus OFF), proportion of cumulative exposure to drug at a given point in time, and the patient’s switching behavior between taking and not taking the medication. We show that modeling of all three of these time-dependent measures illustrates more clearly how varying patterns of drug exposure affect drug effectiveness, which could remain obscured when modeled by the more standard single time-dependent covariate approaches. We propose that understanding the nature and directionality of these interactions will help the biopharmaceutical industry in better estimating drug efficacy. PMID:25343005
A Model for Teacher Effects from Longitudinal Data without Assuming Vertical Scaling
ERIC Educational Resources Information Center
Mariano, Louis T.; McCaffrey, Daniel F.; Lockwood, J. R.
2010-01-01
There is an increasing interest in using longitudinal measures of student achievement to estimate individual teacher effects. Current multivariate models assume each teacher has a single effect on student outcomes that persists undiminished to all future test administrations (complete persistence [CP]) or can diminish with time but remains…
Are Middle Schools More Effective
ERIC Educational Resources Information Center
Bedard, Kelly; Do, Chan
2005-01-01
While nearly half of all school districts have adopted middle schools, there is little quantitative evidence of the efficacy of this educational structure. We estimate the impact of moving from a junior high school system, where students stay in elementary school longer, to a middle school system on on-time high school completion. This is a…
International Disability Educational Alliance (IDEAnet)
2009-03-01
A Human Factors Engineering Assessment of the Buffalo Mine Protection Clearance Vehicle Roof Hatch
2007-10-01
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... paperwork requirements for the USGS Earthquake Report. We may not conduct or sponsor and a person is not.... SUPPLEMENTARY INFORMATION: OMB Control Number: 1028-0048. Title: USGS Earthquake Report. Type of Request...: Voluntary. Frequency of Collection: On occasion, after an earthquake. Estimated Completion Time: 6 minutes...
2011-01-01
Background Several regression models have been proposed for estimation of isometric joint torque using surface electromyography (SEMG) signals. Common issues related to torque estimation models are degradation of model accuracy with the passage of time, electrode displacement, and alteration of limb posture. This work compares the performance of the most commonly used regression models under these circumstances, in order to assist researchers with identifying the most appropriate model for a specific biomedical application. Methods Eleven healthy volunteers participated in this study. A custom-built rig, equipped with a torque sensor, was used to measure isometric torque as each volunteer flexed and extended his wrist. SEMG signals from eight forearm muscles, in addition to wrist joint torque data, were gathered during the experiment. Additional data were gathered one hour and twenty-four hours following the completion of the first data-gathering session, for the purpose of evaluating the effects of the passage of time and electrode displacement on the accuracy of the models. Acquired SEMG signals were filtered, rectified, normalized and then fed to the models for training. Results It was shown that mean adjusted coefficient of determination (Ra2) values decreased by 20%-35% for the different models after one hour, while altering arm posture decreased mean Ra2 values by 64%-74%. Conclusions Model estimation accuracy drops significantly with the passage of time, electrode displacement, and alteration of limb posture. Therefore model retraining is crucial for preserving estimation accuracy. Data resampling can significantly reduce model training time without losing estimation accuracy. Among the models compared, the ordinary least squares linear regression model (OLS) was shown to have high isometric torque estimation accuracy combined with very short training times. PMID:21943179
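A minimal sketch of the baseline OLS mapping from rectified, normalized SEMG amplitudes to joint torque, together with the adjusted coefficient of determination used above as the accuracy measure. The synthetic features, channel weights, and noise level are illustrative assumptions, not the study's recordings.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical preprocessed SEMG features: 8 channels, 200 samples
emg = np.abs(rng.standard_normal((200, 8)))      # rectified, normalized amplitudes
w_true = rng.uniform(0.5, 1.5, 8)                # illustrative channel weights
torque = emg @ w_true + 0.05 * rng.standard_normal(200)

# OLS fit with an intercept column
X = np.column_stack([np.ones(len(emg)), emg])
beta, *_ = np.linalg.lstsq(X, torque, rcond=None)
pred = X @ beta

# Adjusted coefficient of determination (the Ra2 accuracy measure)
ss_res = np.sum((torque - pred) ** 2)
ss_tot = np.sum((torque - torque.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
n, p = X.shape
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p)
```

Retraining, as the study recommends, amounts to refitting beta on freshly gathered (emg, torque) pairs; with a closed-form least-squares solve this is why OLS training times are very short.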
Ran, Bin; Song, Li; Cheng, Yang; Tan, Huachun
2016-01-01
Traffic state estimation from the floating car system is a challenging problem. The low penetration rate and random distribution mean that the available floating car samples usually cover only part of the space and time points of the road network. To obtain a wide range of traffic states from the floating car system, many methods have been proposed to estimate the traffic state for the uncovered links. However, these methods cannot provide the traffic state of the entire road network. In this paper, traffic state estimation is transformed into a missing data imputation problem, and a tensor completion framework is proposed to estimate the missing traffic states. A tensor is constructed to model the traffic state, in which observed entries are directly derived from the floating car system and unobserved traffic states are modeled as missing entries of the constructed tensor. The constructed traffic state tensor can represent spatial and temporal correlations of traffic data and encode the multi-way properties of the traffic state. The advantage of the proposed approach is that it can fully mine and utilize the multi-dimensional inherent correlations of the traffic state. We tested the proposed approach on a well-calibrated simulation network. Experimental results demonstrated that the proposed approach yields reliable traffic state estimation from very sparse floating car data, particularly when the floating car penetration rate is below 1%. PMID:27448326
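The paper formulates the imputation as tensor completion; a simplified two-way (matrix) analogue, hard-impute via iterated truncated SVD, conveys the core idea of filling unobserved entries from low-rank structure. The synthetic rank-1 "link speed" matrix, the missingness pattern, and the rank choice are illustrative assumptions, not the paper's algorithm or data.

```python
import numpy as np

def complete_low_rank(M, mask, rank=1, iters=200):
    # Hard-impute: alternately re-project onto a rank-r SVD and restore observed entries
    X = np.where(mask, M, M[mask].mean())        # initialize missing entries with observed mean
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        X = np.where(mask, M, low_rank)          # keep observed entries, impute the rest
    return X

# Synthetic rank-1 "link speed" matrix (8 links x 10 time slices), ~30% entries missing
rng = np.random.default_rng(2)
true = np.outer(rng.uniform(20, 60, 8), rng.uniform(0.8, 1.2, 10))
mask = rng.random(true.shape) > 0.3              # True = observed by a floating car
est = complete_low_rank(true, mask, rank=1)
max_err = np.abs(est - true)[~mask].max()
```

The same fixed-point idea extends to three-way (link x day x time-of-day) tensors by replacing the truncated SVD with a low-rank tensor decomposition, which is what lets the framework exploit multi-way correlations at very low penetration rates.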
An audit strategy for progression-free survival
Dodd, Lori E.; Korn, Edward L.; Freidlin, Boris; Gray, Robert; Bhattacharya, Suman
2010-01-01
Summary In randomized clinical trials, the use of potentially subjective endpoints has led to frequent use of blinded independent central review (BICR) and event adjudication committees to reduce possible bias in treatment effect estimators based on local evaluations (LE). In oncology trials, progression-free survival (PFS) is one such endpoint. PFS requires image interpretation to determine whether a patient’s cancer has progressed, and BICR has been advocated to reduce the potential for endpoints to be biased by knowledge of treatment assignment. There is current debate, however, about the value of such reviews with time-to-event outcomes like PFS. We propose a BICR audit strategy as an alternative to a complete-case BICR to provide assurance of the presence of a treatment effect. We develop an auxiliary-variable estimator of the log-hazard ratio that is more efficient than simply using the audited (i.e., sampled) BICR data for estimation. Our estimator incorporates information from the LE on all the cases and the audited BICR cases, and is an asymptotically unbiased estimator of the log-hazard ratio from BICR. The estimator offers considerable efficiency gains that improve as the correlation between LE and BICR increases. A two-stage auditing strategy is also proposed and evaluated through simulation studies. The method is applied retrospectively to a large oncology trial that had a complete-case BICR, showing the potential for efficiency improvements. PMID:21210772
Ran, Bin; Song, Li; Zhang, Jian; Cheng, Yang; Tan, Huachun
2016-01-01
Traffic state estimation from a floating car system is a challenging problem. Because of the low penetration rate and random distribution of probe vehicles, the available floating-car samples usually cover only part of the space-time points of the road network. To obtain a wide-ranging traffic state from the floating car system, many methods have been proposed to estimate the traffic state for uncovered links. However, these methods cannot provide the traffic state of the entire road network. In this paper, traffic state estimation is recast as a missing-data imputation problem, and a tensor completion framework is proposed to estimate the missing traffic states. A tensor is constructed to model the traffic state, in which observed entries are derived directly from the floating car system and unobserved traffic states are modeled as missing entries of the constructed tensor. The constructed traffic state tensor can represent the spatial and temporal correlations of traffic data and encode the multi-way properties of the traffic state. The advantage of the proposed approach is that it can fully mine and utilize the multi-dimensional inherent correlations of the traffic state. We tested the proposed approach on a well-calibrated simulation network. Experimental results demonstrated that the proposed approach yields reliable traffic state estimates from very sparse floating car data, particularly when the floating-car penetration rate is below 1%.
Individual-Based Completion Rates for Apprentices. Technical Paper
ERIC Educational Resources Information Center
Karmel, Tom
2011-01-01
Low completion rates for apprentices and trainees have received considerable attention recently and it has been argued that NCVER seriously understates completion rates. In this paper Tom Karmel uses NCVER data on recommencements to estimate individual-based completion rates. It is estimated that around one-quarter of trade apprentices swap…
NASA Astrophysics Data System (ADS)
Yu, Wenwu; Cao, Jinde
2007-09-01
Parameter identification of dynamical systems from time series has received increasing interest due to its wide applications in secure communication, pattern recognition, neural networks, and so on. Given the driving system, parameters can be estimated from the time series by using an adaptive control algorithm. Recently, it has been reported that for some stable systems the parameters are difficult to identify [Li et al., Phys. Lett. A 333, 269-270 (2004); Remark 5 in Yu and Cao, Physica A 375, 467-482 (2007); and Li et al., Chaos 17, 038101 (2007)]. In this paper, the question of whether parameters can be identified from time series is briefly discussed. Through detailed analyses, we examine why the parameters of stable systems can hardly be estimated. Some interesting examples are given to verify the proposed analysis.
Mitogenomic analysis of the genus Panthera.
Wei, Lei; Wu, Xiaobing; Zhu, Lixin; Jiang, Zhigang
2011-10-01
The complete sequences of the mitochondrial DNA genomes of Panthera tigris, Panthera pardus, and Panthera uncia were determined using the polymerase chain reaction method. The lengths of the complete mitochondrial DNA sequences of the three species were 16990, 16964, and 16773 bp, respectively. Each of the three mitochondrial DNA genomes included 13 protein-coding genes, 22 tRNA, two rRNA, one O(L)R, and one control region. The structures of the genomes were highly similar to those of Felis catus, Acinonyx jubatus, and Neofelis nebulosa. The phylogenies of the genus Panthera were inferred from two combined mitochondrial sequence data sets and the complete mitochondrial genome sequences, by MP (maximum parsimony), ML (maximum likelihood), and Bayesian analysis. The results showed that Panthera was composed of Panthera leo, P. uncia, P. pardus, Panthera onca, P. tigris, and N. nebulosa, which was included as the most basal member. The phylogeny within the Panthera genus was N. nebulosa (P. tigris (P. onca (P. pardus, (P. leo, P. uncia)))). The divergence times for the Panthera genus were estimated based on the ML branch lengths and four well-established calibration points. The results showed that at about 11.3 MYA the Panthera genus separated from other felid species and then evolved into the several species of the genus. In detail, N. nebulosa was estimated to have diverged about 8.66 MYA, P. tigris about 6.55 MYA, P. uncia about 4.63 MYA, and P. pardus about 4.35 MYA. All these estimated times were older than those estimated from the fossil records. The divergence event, evolutionary process, speciation, and distribution pattern of P. uncia, a species endemic to central Asia with core habitats on the Qinghai-Tibetan Plateau and surrounding highlands, mostly correlated with the geological tectonic events and intensive climate shifts that happened at 8, 3.6, 2.5, and 1.7 MYA on the plateau during the late Cenozoic period.
Rafia, Rachid; Dodd, Peter J; Brennan, Alan; Meier, Petra S; Hope, Vivian D; Ncube, Fortune; Byford, Sarah; Tie, Hiong; Metrebian, Nicola; Hellier, Jennifer; Weaver, Tim; Strang, John
2016-09-01
To determine whether the provision of contingency management using financial incentives to improve hepatitis B vaccine completion in people who inject drugs entering community treatment represents a cost-effective use of health-care resources. A probabilistic cost-effectiveness analysis was conducted, using a decision tree to estimate the short-term clinical and health-care cost impact of the vaccination strategies, followed by a Markov process to evaluate the long-term clinical consequences and costs associated with hepatitis B infection. Data on vaccination attendance were taken from a UK cluster randomized trial. Two contingency management options were examined in the trial: fixed versus escalating-schedule financial incentives. Outcome measures were lifetime health-care costs and quality-adjusted life years discounted at 3.5% annually, and incremental cost-effectiveness ratios. The resulting estimate for the incremental lifetime health-care cost of the contingency management strategy versus usual care was £21.86 [95% confidence interval (CI) = -£12.20 to 39.86] per person offered the incentive. For 1000 people offered the incentive, the incremental number of hepatitis B infections avoided over their lifetime was estimated at 19 (95% CI = 8-30). The probabilistic incremental cost per quality-adjusted life-year gained of the contingency management programme was estimated to be £6738 (95% CI = £6297-7172), with an 89% probability of being considered cost-effective at a threshold of £20 000 per quality-adjusted life year gained (97.60% at £30 000). Using financial incentives to increase hepatitis B vaccination completion in people who inject drugs could be a cost-effective use of health-care resources in the UK as long as the incidence remains above 1.2%. © 2016 Society for the Study of Addiction.
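The headline numbers can be reproduced in a back-of-envelope probabilistic sketch. The distributions below are invented; only the £21.86 incremental cost and £6738 ICER point estimates echo the abstract, and the net-monetary-benefit calculation is the standard one.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-person draws from the probabilistic analysis.
n = 10_000
d_cost = rng.normal(21.86, 13.0, n)           # incremental cost, GBP
d_qaly = rng.normal(21.86 / 6738, 0.0005, n)  # incremental QALYs (invented spread)

threshold = 20_000                    # willingness-to-pay per QALY, GBP
nmb = threshold * d_qaly - d_cost     # net monetary benefit per draw
prob_ce = float(np.mean(nmb > 0))     # probability cost-effective at threshold
icer = d_cost.mean() / d_qaly.mean()  # incremental cost-effectiveness ratio

print(round(icer), round(prob_ce, 2))
```

The intervention is deemed cost-effective when the ICER falls below the threshold, equivalently when the expected net monetary benefit is positive.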
Optimal design and use of retry in fault tolerant real-time computer systems
NASA Technical Reports Server (NTRS)
Lee, Y. H.; Shin, K. G.
1983-01-01
A new method to determine an optimal retry policy and to use retry for fault characterization is presented. An optimal retry policy for a given fault characteristic, which determines the maximum allowable retry durations so as to minimize the total task completion time, was derived. Combined fault characterization and retry decision, in which the fault characteristics are estimated simultaneously with the determination of the optimal retry policy, was then carried out. Two solution approaches were developed, one based on point estimation and the other on Bayes sequential decision. Maximum likelihood estimators are used in the first approach, and backward induction for testing hypotheses in the second. Numerical examples are presented in which all the durations associated with faults have monotone hazard functions, e.g., exponential, Weibull and gamma distributions; these are standard distributions commonly used for modeling faults.
NASA Technical Reports Server (NTRS)
Wierwille, W. W.; Rahimi, M.; Casali, J. G.
1985-01-01
As aircraft and other systems become more automated, a shift is occurring in human operator participation in these systems. This shift is away from manual control and toward activities that tap the higher mental functioning of human operators. Therefore, an experiment was performed in a moving-base flight simulator to assess mediational (cognitive) workload measurement. Specifically, 16 workload estimation techniques were evaluated as to their sensitivity and intrusion in a flight task emphasizing mediational behavior. Task loading, using navigation problems presented on a display, was treated as an independent variable, and workload-measure values were treated as dependent variables. Results indicate that two mediational task measures, two rating scale measures, time estimation, and two eye behavior measures were reliably sensitive to mediational loading. The time estimation measure did, however, intrude on mediational task performance. Several of the remaining measures were completely insensitive to mediational load.
Students' Accuracy of Measurement Estimation: Context, Units, and Logical Thinking
ERIC Educational Resources Information Center
Jones, M. Gail; Gardner, Grant E.; Taylor, Amy R.; Forrester, Jennifer H.; Andre, Thomas
2012-01-01
This study examined students' accuracy of measurement estimation for linear distances, different units of measure, task context, and the relationship between accuracy estimation and logical thinking. Middle school students completed a series of tasks that included estimating the length of various objects in different contexts and completed a test…
Quantum Parameter Estimation: From Experimental Design to Constructive Algorithm
NASA Astrophysics Data System (ADS)
Yang, Le; Chen, Xi; Zhang, Ming; Dai, Hong-Yi
2017-11-01
In this paper we design the following two-step scheme to estimate the model parameter ω0 of a quantum system: first, we utilize the Fisher information with respect to an intermediate variable v = cos(ω0 t) to determine an optimal initial state and to seek optimal parameters of the POVM measurement operators; second, we explore how to estimate ω0 from v by choosing t when a priori knowledge of ω0 is available. Our optimal initial state achieves the maximum quantum Fisher information. The formulation of the optimal time t is obtained and a complete algorithm for parameter estimation is presented. We further explore how the lower bound of the estimation deviation depends on the a priori information of the model. Supported by the National Natural Science Foundation of China under Grant Nos. 61273202, 61673389, and 61134008
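The second step, recovering ω0 from the intermediate variable v = cos(ω0 t), can be illustrated numerically. By the delta method, Var(ω̂0) ≈ σv²/(t² sin²(ω0 t)), so among times keeping ω0 t inside (0, π), a t with larger t·|sin(ω0 t)| gives a tighter estimate. All values below are invented for illustration and are not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

omega0 = 1.0    # "true" parameter, used only to simulate measurements
sigma_v = 0.01  # noise level on the measured intermediate variable v

def estimate(t, n=20_000):
    """Invert noisy v = cos(omega0*t); valid while omega0*t stays in (0, pi)."""
    v = np.clip(np.cos(omega0 * t) + rng.normal(0.0, sigma_v, n), -1.0, 1.0)
    return np.arccos(v) / t

# Delta method predicts std ~ sigma_v / (t * |sin(omega0*t)|):
# the longer measurement time below should yield the smaller spread.
for t in (0.3, 1.5):
    print(t, round(estimate(t).std(), 5))
```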
NASA Astrophysics Data System (ADS)
Noda, A.; Saito, T.; Fukuyama, E.
2017-12-01
In southwest Japan, great thrust earthquakes occurred on the plate interface along the Nankai trough with a recurrence time of about 100 yr. Most studies estimated slip deficits on the seismogenic zone from interseismic GNSS velocity data assuming elastic slip-response functions (e.g. Loveless and Meade, 2016; Yokota et al., 2016). The observed surface velocities, however, include effects of viscoelastic relaxation in the asthenosphere caused by slip history of seismic cycles on the plate interface. Following Noda et al. (2013, GJI), the interseismic surface velocities due to seismic cycle can be represented by the superposition of (1) completely relaxed viscoelastic response to steady slip rate over the whole plate interface, (2) completely relaxed viscoelastic response to steady slip deficit rate in the seismogenic zone, and (3) surface velocity due to viscoelastic stress relaxation after the last interplate earthquake. Subtracting calculated velocities due to steady slip (1) from velocity data observed after the postseismic stress relaxation (3) decays sufficiently, we can formulate an inverse problem of estimating slip deficit rates from the residual velocities using completely relaxed slip-response functions. In an elastic (lithosphere) - viscoelastic (asthenosphere) layered half-space, the completely relaxed responses do not depend on the viscosity of asthenosphere, but depend on the thickness of lithosphere. In this study, we investigate the effects of structure model on the estimation of slip deficit rate distribution. First, we analyze GNSS daily coordinate data (GEONET F3 Solution, GSI), and obtain surface velocity data for overlapped periods of 6 yr (1996-2002, 1999-2005, 2002-2008, 2005-2011). There is no significant temporal change in the velocity data, which suggests that postseismic stress relaxations after the 1944 Tonankai and the 1946 Nankai earthquakes decayed sufficiently. 
Next, we estimate slip deficit rate distribution from velocity data from 2005 to 2011 together with seafloor geodetic data (Yokota et al., 2016). There is a significant difference between the results using elastic and completely relaxed responses. While the result using elastic responses shows high slip-deficit rate zone in coastal regions, they are located trenchward if using completely relaxed responses.
The Blake geomagnetic excursion recorded in a radiometrically dated speleothem
NASA Astrophysics Data System (ADS)
Osete, María-Luisa; Martín-Chivelet, Javier; Rossi, Carlos; Edwards, R. Lawrence; Egli, Ramon; Muñoz-García, M. Belén; Wang, Xianfeng; Pavón-Carrasco, F. Javier; Heller, Friedrich
2012-11-01
One of the most important developments in geomagnetism has been the recognition of polarity excursions of the Earth's magnetic field. Accurate timing of the excursions is a key point for understanding the geodynamo process and for magnetostratigraphic correlation. One of the best-known excursions is the Blake geomagnetic episode, which occurred during marine isotope stage MIS 5, but its morphology and age remain controversial. Here we show, for the first time, the Blake excursion recorded in a stalagmite which was dated using uranium-series disequilibrium techniques. The characteristic remanent magnetisation is carried by fine-grained magnetite. The event is documented by two reversed intervals (B1 and B2). The age of the event is estimated to be between 116.5±0.7 kyr BP and 112.0±1.9 kyr BP, slightly younger (∼3-4 kyr) than recent estimations from sedimentary records dated by astronomical tuning. Low values of relative palaeointensity during the Blake episode are estimated, but a relative maximum in the palaeofield intensity coeval with the complete reversal during the B2 interval was observed. The duration of the Blake geomagnetic excursion is about 4.5 kyr, half the duration of single excursions and slightly longer than the estimated diffusion time for the inner core (∼3 kyr).
Woolley, Thomas E; Belmonte-Beitia, Juan; Calvo, Gabriel F; Hopewell, John W; Gaffney, Eamonn A; Jones, Bleddyn
2018-06-01
To estimate, from experimental data, the retreatment radiation 'tolerances' of the spinal cord at different times after initial treatment. A model was developed to show the relationship between the biological effective doses (BEDs) for two separate courses of treatment, with the BED of each course being expressed as a percentage of the designated 'retreatment tolerance' BED value, denoted [Formula: see text] and [Formula: see text]. The primate data of Ang et al. (2001) were used to determine the fitted parameters. However, based on rodent data, recovery was assumed to commence 70 days after the first course was complete, and with a non-linear relationship to the magnitude of the initial BED (BEDinit). The model, taking into account the above processes, provides estimates of the retreatment tolerance dose after different times. Extrapolations from the experimental data can provide conservative estimates for the clinic, with a lower acceptable myelopathy incidence. Care must be taken to convert the predicted [Formula: see text] value into a formal BED value and then a practical dose fractionation schedule. Used with caution, the proposed model allows estimation of retreatment doses with elapsed times ranging from 70 days up to three years after the initial course of treatment.
Seasonal Variability in Global Eddy Diffusion and the Effect on Thermospheric Neutral Density
NASA Astrophysics Data System (ADS)
Pilinski, M.; Crowley, G.
2014-12-01
We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time between January 2004 and January 2008 were estimated from residuals of neutral density measurements made by the CHallenging Minisatellite Payload (CHAMP) and simulations made using the Thermosphere Ionosphere Mesosphere Electrodynamics - Global Circulation Model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy-diffusivity models. The eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the RMS difference between the TIME-GCM model and density data from a variety of satellites is reduced by an average of 5%. This result indicates that global thermospheric density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates how eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are some limitations of this method, which are discussed, including that the latitude dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion consistent with diffusion observations made by other techniques.
Seasonal variability in global eddy diffusion and the effect on neutral density
NASA Astrophysics Data System (ADS)
Pilinski, M. D.; Crowley, G.
2015-04-01
We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time were estimated from residuals of neutral density measurements made by the Challenging Minisatellite Payload (CHAMP) and simulations made using the thermosphere-ionosphere-mesosphere electrodynamics global circulation model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy diffusivity models. Eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the root-mean-square sum for the TIME-GCM model is reduced by an average of 5% when compared to density data from a variety of satellites, indicating that the fidelity of global density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates that eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are limitations to this method, which are discussed, including that the latitude dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion which is also consistent with diffusion observations made by other techniques.
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Feng, E-mail: fwang@unu.edu; Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft; Huisman, Jaco
2013-11-15
Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, and it encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared with the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.
Merante, Serena; Ferretti, Virginia; Elena, Chiara; Calvello, Celeste; Rocca, Barbara; Zappatore, Rita; Cavigliano, Paola; Orlandi, Ester
2017-01-01
Imatinib is a cornerstone of treatment of chronic myeloid leukemia. It remains unclear whether transient treatment discontinuation or dose changes affect outcome, and this approach has not yet been approved for use outside clinical trials. We conducted a retrospective single-institution observational study to evaluate factors affecting response in 'real-life' clinical practice in 138 chronic myeloid leukemia patients in chronic phase treated with imatinib. We used a novel longitudinal data analytical model, with a generalized estimating equation model, to study BCR-ABL variation according to continuous standard dose, change in dose or discontinuation; BCR-ABL transcript levels were recorded. Treatment history was subdivided into time periods during which treatment was given at a constant dosage (483 time periods in total). Complete molecular and cytogenetic responses were observed in 154 (32%) and 358 (74%) time periods, respectively. After adjusting for the length of the time period, no association between dose and complete cytogenetic response rate was observed. There was a significantly lower complete molecular response rate after time periods at a high imatinib dosage. This statistical approach can identify individual patient variation in longitudinal data collected over time and suggests that changes in dose or discontinuation of therapy could be considered in patients with appropriate biological characteristics.
España-Romero, Vanesa; Golubic, Rajna; Martin, Kathryn R.; Hardy, Rebecca; Ekelund, Ulf; Kuh, Diana; Wareham, Nicholas J.; Cooper, Rachel; Brage, Soren
2014-01-01
Objectives To compare physical activity (PA) subcomponents from EPIC Physical Activity Questionnaire (EPAQ2) and combined heart rate and movement sensing in older adults. Methods Participants aged 60–64y from the MRC National Survey of Health and Development in Great Britain completed EPAQ2, which assesses self-report PA in 4 domains (leisure time, occupation, transportation and domestic life) during the past year and wore a combined sensor for 5 consecutive days. Estimates of PA energy expenditure (PAEE), sedentary behaviour, light (LPA) and moderate-to-vigorous PA (MVPA) were obtained from EPAQ2 and combined sensing and compared. Complete data were available in 1689 participants (52% women). Results EPAQ2 estimates of PAEE and MVPA were higher than objective estimates and sedentary time and LPA estimates were lower [bias (95% limits of agreement) in men and women were 32.3 (−61.5 to 122.6) and 29.0 (−39.2 to 94.6) kJ/kg/day for PAEE; −4.6 (−10.6 to 1.3) and −6.0 (−10.9 to −1.0) h/day for sedentary time; −171.8 (−454.5 to 110.8) and −60.4 (−367.5 to 246.6) min/day for LPA; 91.1 (−159.5 to 341.8) and 55.4 (−117.2 to 228.0) min/day for MVPA]. There were significant positive correlations between all self-reported and objectively assessed PA subcomponents (rho = 0.12 to 0.36); the strongest were observed for MVPA (rho = 0.30 men; rho = 0.36 women) and PAEE (rho = 0.26 men; rho = 0.25 women). Conclusion EPAQ2 produces higher estimates of PAEE and MVPA and lower estimates of sedentary and LPA than objective assessment. However, both methodologies rank individuals similarly, suggesting that EPAQ2 may be used in etiological studies in this population. PMID:24516543
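The bias and 95% limits of agreement quoted in this abstract are standard Bland-Altman quantities, and the rho values are Spearman correlations; both can be sketched on synthetic paired data (all numbers below are invented, not the study's).

```python
import numpy as np

rng = np.random.default_rng(4)

# Paired MVPA measurements (min/day): objective monitor vs self-report,
# with self-report over-reporting on average (entirely synthetic).
objective = rng.gamma(shape=2.0, scale=30.0, size=1000)
self_report = objective + rng.normal(70.0, 120.0, size=1000)

# Bland-Altman mean bias and 95% limits of agreement.
diff = self_report - objective
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
loa_low, loa_high = bias - half_width, bias + half_width

def spearman(a, b):
    """Spearman rho as the Pearson correlation of ranks (no ties expected here)."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

print(round(bias, 1), round(loa_low, 1), round(loa_high, 1),
      round(spearman(self_report, objective), 2))
```

A positive bias with a positive rank correlation mirrors the paper's conclusion: the questionnaire overestimates activity yet still ranks individuals similarly to the objective method.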
Prediction of Acute Mountain Sickness Using a Blood Based Test
2017-01-01
...develop acute mountain sickness (AMS) when they travel to high altitudes. OVERALL PROJECT SUMMARY: Following program reviews over the last 24 months we...research. That work is ongoing and will continue for remainder of the time we work on this grant. We completed the request for a no cost extension. And we
Enhancing e-waste estimates: improving data quality by multivariate Input-Output Analysis.
Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter
2013-11-01
Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, and it encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input-Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared with the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
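The sales-stock-lifespan identity at the heart of the Input-Output Analysis can be sketched as a discrete convolution: waste generated in year t is past sales weighted by the probability of discard at each age. The sales series and lifespan profile below are invented; the paper's point about time-varying lifespans would make the profile depend on the sale year.

```python
import numpy as np

# Hypothetical annual sales of one appliance type (units sold per year).
sales = np.array([100, 120, 150, 180, 200, 230, 250, 260,
                  270, 280, 290, 300, 310], dtype=float)

# Discrete lifespan profile: probability a unit is discarded k years after sale
# (a Weibull-like shape, normalized to sum to 1; zero discards at age 0).
k = np.arange(16)
lifespan = k**1.5 * np.exp(-((k / 6.0) ** 2))
lifespan /= lifespan.sum()

# Input-Output identity: waste(t) = sum over sale years y of sales(y) * P(age t - y).
waste = np.convolve(sales, lifespan)[: len(sales)]
stock = sales.cumsum() - waste.cumsum()  # units put on market but not yet discarded

print(waste.round(1)[:5], round(stock[-1], 1))
```

The mass balance sales = stock + cumulative waste holds by construction, which is the consistency check the multivariate method exploits to consolidate noisy data.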
Time distortion when users at-risk for social media addiction engage in non-social media tasks.
Turel, Ofir; Brevers, Damien; Bechara, Antoine
2018-02-01
There is a growing concern over the addictiveness of Social Media use. Additional representative indicators of impaired control are needed in order to distinguish presumed social media addiction from normal use. (1) To examine the existence of time distortion during non-social media use tasks that involve social media cues among those who may be considered at-risk for social media addiction. (2) To examine the usefulness of this distortion for at-risk vs. low/no-risk classification. We used a task that prevented Facebook use and invoked Facebook reflections (survey on self-control strategies) and subsequently measured estimated vs. actual task completion time. We captured the level of addiction using the Bergen Facebook Addiction Scale in the survey, and we used a common cutoff criterion to classify people as at-risk vs. low/no-risk of Facebook addiction. The at-risk group presented significant upward time estimate bias and the low/no-risk group presented significant downward time estimate bias. The bias was positively correlated with Facebook addiction scores. It was efficacious, especially when combined with self-reported estimates of extent of Facebook use, in classifying people to the two categories. Our study points to a novel, easy to obtain, and useful marker of at-risk for social media addiction, which may be considered for inclusion in diagnosis tools and procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.
Array distribution in data-parallel programs
NASA Technical Reports Server (NTRS)
Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert; Sheffler, Thomas J.
1994-01-01
We consider distribution at compile time of the array data in a distributed-memory implementation of a data-parallel program written in a language like Fortran 90. We allow dynamic redistribution of data and define a heuristic algorithmic framework that chooses distribution parameters to minimize an estimate of program completion time. We represent the program as an alignment-distribution graph. We propose a divide-and-conquer algorithm for distribution that initially assigns a common distribution to each node of the graph and successively refines this assignment, taking computation, realignment, and redistribution costs into account. We explain how to estimate the effect of distribution on computation cost and how to choose a candidate set of distributions. We present the results of an implementation of our algorithms on several test problems.
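The cost trade-off this abstract describes, computation cost per candidate distribution plus redistribution cost when the distribution changes between phases, can be sketched as a shortest-path dynamic program over a linear phase sequence (the paper's alignment-distribution graph is more general; all costs below are invented):

```python
# comp[phase][d]: estimated computation cost of each phase under distribution d.
comp = [
    [4, 9],
    [8, 2],
    [4, 9],
]
# redist[p][d]: cost of redistributing the array from distribution p to d.
redist = [
    [0, 3],
    [3, 0],
]

# Dynamic program: best[d] = cheapest way to run the phases so far, ending in d.
best = list(comp[0])
for phase in range(1, len(comp)):
    best = [
        comp[phase][d] + min(best[p] + redist[p][d] for p in range(len(best)))
        for d in range(len(redist))
    ]

print(min(best))  # minimum estimated completion time → 16
```

Here staying in distribution 0 throughout (4+8+4) ties with switching for the cheap middle phase (4+3+2+3+4); the divide-and-conquer refinement in the paper prunes such candidate sets on a full graph rather than a chain.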
National health expenditures, 1990
Levit, Katharine R.; Lazenby, Helen C.; Cowan, Cathy A.; Letsch, Suzanne W.
1991-01-01
During 1990, health expenditures as a share of gross national product rose to 12.2 percent, up from 11.6 percent in 1989. This dramatic increase is the second largest increase in the past three decades. The national health expenditure estimates presented in this article document rapidly rising health care costs and provide a context for understanding the health care financing crisis facing the Nation today. The 1990 national health expenditures incorporate the most recently available data. They differ from historical estimates presented in the preceding article. The length of time and complicated process of producing projections required use of 1989 national health expenditures—data available prior to the completion of the 1990 estimates presented here. PMID:10114934
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fishman, S., E-mail: fishman@physics.technion.ac.il; Soffer, A., E-mail: soffer@math.rutgers.edu
2016-07-15
We employ the recently developed multi-time scale averaging method to study the large time behavior of slowly changing (in time) Hamiltonians. We treat some known cases in a new way, such as the Zener problem, and we give another proof of the adiabatic theorem in the gapless case. We prove a new uniform ergodic theorem for slowly changing unitary operators. This theorem is then used to derive the adiabatic theorem, do the scattering theory for such Hamiltonians, and prove some classical propagation estimates and asymptotic completeness.
2017-12-01
...increases, TtC decreases. The increased travel time allows for more analysis to be completed on-board instead of through PMA and the parallel...
Borah, Rohit; Brown, Andrew W; Capers, Patrice L; Kaiser, Kathryn A
2017-02-27
To summarise logistical aspects of recently completed systematic reviews that were registered in the International Prospective Register of Systematic Reviews (PROSPERO) registry to quantify the time and resources required to complete such projects. Meta-analysis. All of the 195 registered and completed reviews (status from the PROSPERO registry) with associated publications at the time of our search (1 July 2014). All authors extracted data using registry entries and publication information related to the data sources used, the number of initially retrieved citations, the final number of included studies, the time between registration date to publication date and number of authors involved for completion of each publication. Information related to funding and geographical location was also recorded when reported. The mean estimated time to complete the project and publish the review was 67.3 weeks (IQR=42). The number of studies found in the literature searches ranged from 27 to 92 020; the mean yield rate of included studies was 2.94% (IQR=2.5); and the mean number of authors per review was 5, SD=3. Funded reviews took significantly longer to complete and publish (mean=42 vs 26 weeks) and involved more authors and team members (mean=6.8 vs 4.8 people) than those that did not report funding (both p<0.001). Systematic reviews presently take much time and require large amounts of human resources. In the light of the ever-increasing volume of published studies, application of existing computing and informatics technology should be applied to decrease this time and resource burden. We discuss recently published guidelines that provide a framework to make finding and accessing relevant literature less burdensome. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Verdin, Andrew; Funk, Christopher C.; Rajagopalan, Balaji; Kleiber, William
2016-01-01
Robust estimates of precipitation in space and time are important for efficient natural resource management and for mitigating natural hazards. This is particularly true in regions with developing infrastructure and regions that are frequently exposed to extreme events. Gauge observations of rainfall are sparse but capture the precipitation process with high fidelity. Due to its high resolution and complete spatial coverage, satellite-derived rainfall data are an attractive alternative in data-sparse regions and are often used to support hydrometeorological early warning systems. Satellite-derived precipitation data, however, tend to underrepresent extreme precipitation events. Thus, it is often desirable to blend spatially extensive satellite-derived rainfall estimates with high-fidelity rain gauge observations to obtain more accurate precipitation estimates. In this research, we use two different methods, namely, ordinary kriging and k-nearest neighbor local polynomials, to blend rain gauge observations with the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates in data-sparse Central America and Colombia. The utility of these methods in producing blended precipitation estimates at pentadal (five-day) and monthly time scales is demonstrated. We find that these blending methods significantly improve the satellite-derived estimates and are competitive in their ability to capture extreme precipitation.
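The blending idea can be sketched with a much simpler stand-in for the kriging and nearest-neighbour local-polynomial methods named above: interpolate the gauge-minus-satellite residuals to each grid cell with inverse-distance weights over the k nearest gauges, then add the interpolated correction to the satellite field. All data below are synthetic.

```python
import numpy as np

def blend(sat_grid, grid_xy, gauge_xy, gauge_val, sat_at_gauge, k=3):
    """Residual-based blend: satellite field plus inverse-distance-weighted
    interpolation of gauge-minus-satellite residuals (toy stand-in for
    ordinary kriging / k-NN local polynomials)."""
    resid = gauge_val - sat_at_gauge          # high-fidelity corrections
    out = sat_grid.astype(float).copy()
    for i, p in enumerate(grid_xy):
        d = np.linalg.norm(gauge_xy - p, axis=1)
        nearest = np.argsort(d)[:k]
        w = 1.0 / np.maximum(d[nearest], 1e-9)
        out[i] += np.dot(w, resid[nearest]) / w.sum()
    return out

# Tiny synthetic check: the satellite underestimates everywhere by 2 mm.
gauge_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
gauge_val = np.array([10.0, 12.0, 11.0])
sat_at_gauge = gauge_val - 2.0
grid_xy = np.array([[0.5, 0.5]])
sat_grid = np.array([9.0])
blended = blend(sat_grid, grid_xy, gauge_xy, gauge_val, sat_at_gauge)
```

Because every residual is +2 mm, the blended cell recovers the gauge-consistent value (11 mm) regardless of the weights.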
Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Acosta, Luz María; Morales-Ruan, Ma Del Carmen; Méndez-Gómez Humarán, Ignacio; Robles-Villaseñor, Mara Nadiezhda; Hernández-Ávila, Mauricio
2017-01-01
To assess the effect of the Education in Nutrition and Food Assistance components of the SaludArte program in participating schools during 2013-2015. A three-cohort comparative study was used, with two types of follow-up panel structures, a complete panel and a continuous-time panel, comprising a total of 1620 schoolchildren from 144 schools. Information on food intake, feeding behaviors, food preservation and hygiene, physical activity (PA) and anthropometry was registered. To establish effect estimates, a difference-in-differences method combined with propensity score matching was carried out; as an alternative procedure, logistic-multinomial and logistic regression models were also used. Estimated effects attributable to the program were as follows: an increase in personal hygiene (p=0.045), an increase in nutrition knowledge (p=0.003), an increase in PA (p=0.002 for 2013-2014; p=0.032 for 2015) and an increase in fiber intake (p=0.064). Sugar intake, contrary to expectations, showed a significant increase (p=0.012 continuous time; p=0.037 complete panel). SaludArte shows positive effects on some components, as expected. However, in order to institutionalize the SaludArte program, it is necessary to consider these lessons learned, give the program permanence and promote it in the schools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basher, A.M.H.
Poor control of steam generator water level in a nuclear power plant may lead to frequent nuclear reactor shutdowns. These shutdowns are more common at low power, where the plant exhibits strong non-minimum-phase characteristics and flow measurements are unreliable in many instances. There is a need to investigate this problem and systematically design a controller for water level regulation. This work is concerned with the study and design of a suitable controller for a U-tube steam generator (UTSG) of a pressurized water reactor (PWR), which has time-varying dynamics. The controller should be suitable for water level control of the UTSG without manual operation, from start-up to full-load transient conditions. Some preliminary simulation results are presented that demonstrate the effectiveness of the proposed controller. The complete control algorithm includes components such as robust output tracking and simultaneous adaptive estimation of both the system parameters and the state variables. Not all of these components are complete at present, owing to time constraints. A robust tracking component of the controller for water level control is developed, and its robustness to parameter variations is demonstrated in this study. The results appear encouraging, but they are only preliminary. Additional work is warranted to resolve other issues such as robust adaptive estimation.
Organizing the National Guard to Provide Effective Domestic Operations
2011-12-01
NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS Approved for public release; distribution is unlimited ORGANIZING THE ...
76 FR 54812 - Proposed Collections; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
..., serves as a certification document for various RRB employer reporting forms (Forms BA-3, BA-4, Form BA-6a... minutes when used as a certification and recapitulation form. Submission of Form BA-3, BA-4, and G-440 is... RRB estimates the completion time for BA-11 information as follows: 5 hours for BA-11 responses...
Whole Sky Imaging of Clouds in the Visible and IR for Starfire Optical Range
2007-07-31
Scientific Visualization of Landscapes and Landforms
2012-01-01
...on high resolution elevation data readily available in laboratory and mobile environments.
Non-Chromate Aluminum Pretreatments
2012-03-01
2) Potassium permanganate, seal: polyacrylic acid, polypropylene glycol, fatty acid esters. Two solution (coating and seal), elevated temp...
38 CFR 21.8070 - Basic duration of a vocational training program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... vocational rehabilitation, the CP or VRC will estimate the time the child needs to complete a vocational... training period the eligible child needs, the CP or VRC must determine that: (1) The proposed vocational.... In calculating the proposed program's length, the CP or VRC will follow the procedures in § 21.8074(a...
78 FR 55256 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... adverse effects, the Commission may bring suit in the U.S. District Court for the District of Columbia to... Form FMC-150, Information Form for Agreements Between or Among Ocean Common Carriers, is estimated to be 8.4 person-hours per response. The average time for completing Form FMC-151, Monitoring Report for...
Resensitizing Resistant Bacteria to Antibiotics
2011-04-01
...(Fig. 2). These data suggest the pentaglycine crossbridge must be partially exposed, an assertion which is supported by evidence that anti...
Two-pass imputation algorithm for missing value estimation in gene expression time series.
Tsiporkova, Elena; Boeva, Veselka
2007-10-01
Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects for each gene expression profile with missing values a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms, in particular for datasets with a higher level of missing entries, the neighborhood-wise and the position-wise algorithms. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former algorithm has appeared superior to the latter one. Motivated by these findings, indicating clearly the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. 
The software also provides for a choice between three different initial rough imputation methods.
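The core idea of DTW-based imputation can be sketched in a few lines. This is an illustration of the principle, not the published DTWimpute code: measure similarity between profiles with Dynamic Time Warping over the observed positions, then fill the missing entries from the most similar candidate profile (roughly the position-wise variant; the data below are made up).

```python
import math

def dtw(a, b):
    """Classic O(n*m) Dynamic Time Warping distance between two sequences."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def impute(profile, candidates):
    """Fill None entries of `profile` from the DTW-nearest complete candidate."""
    observed = [i for i, v in enumerate(profile) if v is not None]
    obs_vals = [profile[i] for i in observed]
    # Compare only over the observed positions of the incomplete profile.
    best = min(candidates, key=lambda c: dtw(obs_vals, [c[i] for i in observed]))
    return [v if v is not None else best[i] for i, v in enumerate(profile)]

profile = [1.0, None, 3.0, 4.0]
candidates = [[1.1, 2.0, 3.1, 3.9], [8.0, 7.0, 6.0, 5.0]]
filled = impute(profile, candidates)
```

The first candidate tracks the incomplete profile closely, so its value at the missing position is used; the neighborhood-wise and two-pass variants described above refine this by pooling several candidates and iterating.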
Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.
Omer, Travis; Intes, Xavier; Hahn, Juergen
2015-01-01
Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
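A greedy version of D-optimal time-point selection can be sketched as follows. This is a hedged illustration, not the paper's exact procedure: rows of the sensitivity matrix S are the derivatives of the model with respect to the parameters at each candidate time, and points are added one at a time to maximize det(SᵀS); the bi-exponential decay and the lifetimes (0.5 and 2.5 ns) are hypothetical.

```python
import numpy as np

def d_optimal_points(S, n_points, ridge=1e-12):
    """Greedy D-optimal row selection: repeatedly add the candidate time
    point whose sensitivity row most increases det(S_sub^T S_sub)."""
    chosen, remaining = [], list(range(S.shape[0]))
    eye = ridge * np.eye(S.shape[1])   # tiny ridge so early determinants are nonzero
    for _ in range(n_points):
        def score(t):
            rows = S[chosen + [t]]
            return np.linalg.det(rows.T @ rows + eye)
        t_best = max(remaining, key=score)
        chosen.append(t_best)
        remaining.remove(t_best)
    return sorted(chosen)

# Hypothetical bi-exponential FLIM decay: sensitivities of the signal with
# respect to the two fractional populations, at 90 candidate gate times.
t = np.linspace(0.05, 9.0, 90)
S = np.column_stack([np.exp(-t / 0.5), np.exp(-t / 2.5)])
picked = d_optimal_points(S, 10)
```

Greedy selection is a common heuristic for D-optimality; exact subset optimization is combinatorial, which is why sensitivity-based criteria like this are attractive for trimming 90 gates down to 10.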
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, Christopher C.; Roberge, Aki; Mandell, Avi
ExoEarth yield is a critical science metric for future exoplanet imaging missions. Here we estimate exoEarth candidate yield using single visit completeness for a variety of mission design and astrophysical parameters. We review the methods used in previous yield calculations and show that the method choice can significantly impact yield estimates as well as how the yield responds to mission parameters. We introduce a method, called Altruistic Yield Optimization, that optimizes the target list and exposure times to maximize mission yield, adapts maximally to changes in mission parameters, and increases exoEarth candidate yield by up to 100% compared to previous methods. We use Altruistic Yield Optimization to estimate exoEarth candidate yield for a large suite of mission and astrophysical parameters using single visit completeness. We find that exoEarth candidate yield is most sensitive to telescope diameter, followed by coronagraph inner working angle, followed by coronagraph contrast, and finally coronagraph contrast noise floor. We find a surprisingly weak dependence of exoEarth candidate yield on exozodi level. Additionally, we provide a quantitative approach to defining a yield goal for future exoEarth-imaging missions.
NASA Astrophysics Data System (ADS)
Lv, Yongfeng; Na, Jing; Yang, Qinmin; Wu, Xing; Guo, Yu
2016-01-01
An online adaptive optimal control is proposed for continuous-time nonlinear systems with completely unknown dynamics, which is achieved by developing a novel identifier-critic-based approximate dynamic programming algorithm with a dual neural network (NN) approximation structure. First, an adaptive NN identifier is designed to obviate the requirement of complete knowledge of system dynamics, and a critic NN is employed to approximate the optimal value function. Then, the optimal control law is computed based on the information from the identifier NN and the critic NN, so that the actor NN is not needed. In particular, a novel adaptive law design method with the parameter estimation error is proposed to online update the weights of both identifier NN and critic NN simultaneously, which converge to small neighbourhoods around their ideal values. The closed-loop system stability and the convergence to small vicinity around the optimal solution are all proved by means of the Lyapunov theory. The proposed adaptation algorithm is also improved to achieve finite-time convergence of the NN weights. Finally, simulation results are provided to exemplify the efficacy of the proposed methods.
Belger, Mark; Haro, Josep Maria; Reed, Catherine; Happich, Michael; Kahle-Wrobleski, Kristin; Argimon, Josep Maria; Bruno, Giuseppe; Dodel, Richard; Jones, Roy W; Vellas, Bruno; Wimo, Anders
2016-07-18
Missing data are a common problem in prospective studies with a long follow-up, and the volume, pattern and reasons for missing data may be relevant when estimating the cost of illness. We aimed to evaluate the effects of different methods for dealing with missing longitudinal cost data and for costing caregiver time on total societal costs in Alzheimer's disease (AD). GERAS is an 18-month observational study of costs associated with AD. Total societal costs included patient health and social care costs, and caregiver health and informal care costs. Missing data were classified as missing completely at random (MCAR), missing at random (MAR) or missing not at random (MNAR). Simulation datasets were generated from baseline data with 10-40 % missing total cost data for each missing data mechanism. Datasets were also simulated to reflect the missing cost data pattern at 18 months using MAR and MNAR assumptions. Naïve and multiple imputation (MI) methods were applied to each dataset and results compared with complete GERAS 18-month cost data. Opportunity and replacement cost approaches were used for caregiver time, which was costed with and without supervision included and with time for working caregivers only being costed. Total costs were available for 99.4 % of 1497 patients at baseline. For MCAR datasets, naïve methods performed as well as MI methods. For MAR, MI methods performed better than naïve methods. All imputation approaches were poor for MNAR data. For all approaches, percentage bias increased with missing data volume. For datasets reflecting 18-month patterns, a combination of imputation methods provided more accurate cost estimates (e.g. bias: -1 % vs -6 % for single MI method), although different approaches to costing caregiver time had a greater impact on estimated costs (29-43 % increase over base case estimate). 
Methods used to impute missing cost data in AD will impact on accuracy of cost estimates although varying approaches to costing informal caregiver time has the greatest impact on total costs. Tailoring imputation methods to the reason for missing data will further our understanding of the best analytical approach for studies involving cost outcomes.
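The MCAR/MAR distinction above can be made concrete with a toy simulation (synthetic data, not GERAS): when missingness depends on an observed covariate (MAR), the complete-case mean is biased, while imputing from a regression on that covariate recovers the true mean. This illustrates the general mechanism only, not the study's multiple-imputation models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
severity = rng.normal(0.0, 1.0, n)                 # observed covariate
cost = 100.0 + 40.0 * severity + rng.normal(0.0, 5.0, n)  # true mean = 100

# MAR: high-severity patients are more likely to have missing cost data.
p_miss = np.where(severity > 0, 0.5, 0.1)
missing = rng.random(n) < p_miss

cc_mean = cost[~missing].mean()                    # naive complete-case estimate

# Single regression imputation using the observed cases.
slope, intercept = np.polyfit(severity[~missing], cost[~missing], 1)
imputed = np.where(missing, intercept + slope * severity, cost)
imp_mean = imputed.mean()
```

The complete-case mean underestimates the true cost because expensive (high-severity) patients are missing more often; the regression imputation removes most of that bias. Proper multiple imputation additionally propagates the imputation uncertainty, which a single imputation like this does not.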
Efficient Learning of Continuous-Time Hidden Markov Models for Disease Progression
Liu, Yu-Ying; Li, Shuang; Li, Fuxin; Song, Le; Rehg, James M.
2016-01-01
The Continuous-Time Hidden Markov Model (CT-HMM) is an attractive approach to modeling disease progression due to its ability to describe noisy observations arriving irregularly in time. However, the lack of an efficient parameter learning algorithm for CT-HMM restricts its use to very small models or requires unrealistic constraints on the state transitions. In this paper, we present the first complete characterization of efficient EM-based learning methods for CT-HMM models. We demonstrate that the learning problem consists of two challenges: the estimation of posterior state probabilities and the computation of end-state conditioned statistics. We solve the first challenge by reformulating the estimation problem in terms of an equivalent discrete time-inhomogeneous hidden Markov model. The second challenge is addressed by adapting three approaches from the continuous time Markov chain literature to the CT-HMM domain. We demonstrate the use of CT-HMMs with more than 100 states to visualize and predict disease progression using a glaucoma dataset and an Alzheimer’s disease dataset. PMID:27019571
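The reformulation step described above rests on a standard fact worth making explicit: a continuous-time Markov chain with rate (intensity) matrix Q has transition probabilities P(Δt) = expm(Q·Δt), so irregular gaps between visits map to an equivalent discrete time-inhomogeneous chain with one P per gap. The 3-state Q below is a made-up toy progression model, not a fitted CT-HMM.

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-0.2, 0.2, 0.0],   # toy 3-state progression model
              [0.0, -0.1, 0.1],
              [0.0, 0.0, 0.0]])   # state 3 absorbing

gaps = [0.5, 2.0, 7.5]            # irregular intervals between observations
P = [expm(Q * dt) for dt in gaps] # one transition matrix per gap

# Each P[k] is a proper stochastic matrix: rows sum to 1.
row_sums = [p.sum(axis=1) for p in P]
```

Forward-backward recursions can then be run on this sequence of per-gap transition matrices exactly as for an ordinary HMM, which is the first of the two challenges the paper addresses.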
Aidara, Adjaratou W; Pitts, Nigel; Markowska, Neda; Bourgeois, Denis
2011-12-01
The FDI World Dental Federation is engaged in a global consultation process to assess the potential challenges and impacts of the introduction of a preventive model to existing systems for caries management. The aims of this study were to evaluate the quality of dental disease data collected with the International Caries Detection and Assessment System (ICDAS) index and dentists' perceptions with regard to the collection of data using the 'European Global Oral Health Indicators Development' (EGOHID) survey methods, and to estimate the mean time required for completion of the dental records according to the practitioners' perceptions. The data - 2877 clinical examinations and 2877 individual assessments - were collected in 2008 using a network of 146 sentinel dentists in eight European countries. A clinical survey was completed for each participant and the dentist gave a detailed assessment of each patient investigated. This study shows that practitioners' perceptions have an impact on the mean time required to complete the dental record. Mistakes originate from dentists' attempts to simplify the completion of many boxes. This results in a larger number of missing data than of error codes. These missing data have an effect on the time required for information collection. The quality of the data collected will allow the establishment of recommendations based on this method. © 2011 FDI World Dental Federation.
NASA Astrophysics Data System (ADS)
Brandt, C.; Thakur, S. C.; Tynan, G. R.
2016-04-01
Complexities of flow patterns in the azimuthal cross-section of a cylindrical magnetized helicon plasma and the corresponding plasma dynamics are investigated by means of a novel scheme for time delay estimation velocimetry. The advantage of this introduced method is the capability of calculating the time-averaged 2D velocity fields of propagating wave-like structures and patterns in complex spatiotemporal data. It is able to distinguish and visualize the details of simultaneously present superimposed entangled dynamics and it can be applied to fluid-like systems exhibiting frequently repeating patterns (e.g., waves in plasmas, waves in fluids, dynamics in planetary atmospheres, etc.). The velocity calculations are based on time delay estimation obtained from cross-phase analysis of time series. Each velocity vector is unambiguously calculated from three time series measured at three different non-collinear spatial points. This method, when applied to fast imaging, has been crucial to understand the rich plasma dynamics in the azimuthal cross-section of a cylindrical linear magnetized helicon plasma. The capabilities and the limitations of this velocimetry method are discussed and demonstrated for two completely different plasma regimes, i.e., for quasi-coherent wave dynamics and for complex broadband wave dynamics involving simultaneously present multiple instabilities.
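The building block of such cross-phase time delay estimation can be sketched for a single pair of signals (the full velocimetry scheme, which combines three non-collinear points into a velocity vector, is not reproduced here): for signals dominated by one propagating mode, the phase of the cross-spectrum at the mode frequency, divided by 2πf, gives the time delay. The frequencies and delay below are arbitrary test values.

```python
import numpy as np

fs = 10000.0                                  # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
f0, true_delay = 50.0, 2.0e-3                 # 50 Hz mode, 2 ms propagation delay
x = np.sin(2 * np.pi * f0 * t)                # signal at point A
y = np.sin(2 * np.pi * f0 * (t - true_delay)) # same mode arriving later at point B

X, Y = np.fft.rfft(x), np.fft.rfft(y)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
k = np.argmax(np.abs(X))                      # dominant mode bin
cross_phase = np.angle(X[k] * np.conj(Y[k]))  # phase of the cross-spectrum
delay = cross_phase / (2 * np.pi * freqs[k])  # recovered time delay, s
```

With three such pairwise delays from non-collinear points, the local phase-front velocity of the structure is over-determined, which is what makes the vector field calculation unambiguous.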
Cohn, Timothy A.
2005-01-01
This paper presents an adjusted maximum likelihood estimator (AMLE) that can be used to estimate fluvial transport of contaminants, like phosphorus, that are subject to censoring because of analytical detection limits. The AMLE is a generalization of the widely accepted minimum variance unbiased estimator (MVUE), and Monte Carlo experiments confirm that it shares essentially all of the MVUE's desirable properties, including high efficiency and negligible bias. In particular, the AMLE exhibits substantially less bias than alternative censored‐data estimators such as the MLE (Tobit) or the MLE followed by a jackknife. As with the MLE and the MVUE the AMLE comes close to achieving the theoretical Frechet‐Cramér‐Rao bounds on its variance. This paper also presents a statistical framework, applicable to both censored and complete data, for understanding and estimating the components of uncertainty associated with load estimates. This can serve to lower the cost and improve the efficiency of both traditional and real‐time water quality monitoring.
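The censoring mechanism the AMLE addresses can be illustrated with a plain Tobit-style censored MLE (Cohn's adjustment itself is not reproduced here): concentrations below the detection limit DL contribute P(X < DL) to the likelihood instead of a density value. Data, detection limit, and parameter values below are synthetic.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
true_mu, true_sigma, DL = 1.0, 0.5, 2.0
x = rng.lognormal(true_mu, true_sigma, 5000)   # "true" concentrations
observed = x[x >= DL]                          # quantified values
n_censored = np.sum(x < DL)                    # reported only as "< DL"

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                  # keep sigma positive
    # Density term for observed values, CDF term for censored ones.
    ll = stats.norm.logpdf(np.log(observed), mu, sigma).sum()
    ll += n_censored * stats.norm.logcdf(np.log(DL), mu, sigma)
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Simply discarding the censored values (or substituting DL/2) biases the estimates; the MLE above is consistent, and the paper's AMLE further corrects its finite-sample bias toward the MVUE's behavior.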
NASA Astrophysics Data System (ADS)
Hashim, Marina; Abidin, Diana Atiqah Zainal; Das, Simon K.; Ghaffar, Mazlan Abd.
2014-09-01
The present study was conducted to investigate the food consumption pattern and gastric emptying time, using an X-radiography technique, in the scat fish Scatophagus argus fed to satiation under laboratory conditions. Prior to the feeding experiment, the stomach volumes of fish of various sizes were examined using freshly prepared stomachs ligatured at the tip of a burette, with the maximum amount of distilled water held in the stomach measured (ml). Stomach volume is correlated with maximum food intake (Smax), and maximum stomach distension can be estimated by the allometric model volume = 0.0000089W^2.93. Gastric emptying time was estimated using a qualitative X-radiography technique, in which fish of various sizes fed to satiation were examined at different times since feeding. All experimental fish were fed to satiation using radio-opaque barium sulphate (BaSO4) paste injected into wet shrimp in proportion to body weight. BaSO4 was found suitable for tracking the movement of feed/prey in the stomach over time, so the gastric emptying time of scat fish could be estimated. Qualitative X-radiography observation of gastric motility showed that fish (200 g) fed a maximum satiation meal (circa 11 g) completely emptied their stomachs within 30-36 hours. The results of the present study provide the first baseline information on the stomach volume and gastric emptying of scat fish in captivity.
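The allometric model quoted above, volume = 0.0000089·W^2.93, can be evaluated directly (taking W in grams and volume in ml, as the abstract implies):

```python
def stomach_volume_ml(weight_g):
    """Allometric maximum stomach distension from the abstract:
    volume (ml) = 0.0000089 * W**2.93, W in grams."""
    return 0.0000089 * weight_g ** 2.93

v200 = stomach_volume_ml(200.0)   # predicted distension for a 200 g scat
```

For a 200 g fish the model predicts a maximum distension of roughly 49 ml, well above the circa 11 g satiation meal observed, consistent with satiation being limited by factors other than sheer stomach capacity.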
NASA Astrophysics Data System (ADS)
Gurney, K. R.; Chandrasekaran, V.; Mendoza, D. L.; Geethakumar, S.
2010-12-01
The Vulcan Project has estimated United States fossil fuel CO2 emissions at the hourly time scale and at spatial scales below the county level for the year 2002. Vulcan is built from a wide variety of observational data streams including regulated air pollutant emissions reporting, traffic monitoring, energy statistics, and US census data. In addition to these data sets, Vulcan relies on a series of modeling assumptions and constructs to interpolate in space and time and to transform non-CO2 reporting into an estimate of CO2 combustion emissions. The recent version 2.0 of the Vulcan inventory has produced advances in a number of categories, with particular emphasis on improved temporal structure. Onroad transportation emissions now draw on roughly 5000 automated traffic count monitors, allowing for much improved diurnal and weekly time structure in our onroad transportation emissions. Though the inventory shows excellent agreement with independent national-level CO2 emissions estimates, uncertainty quantification has been a challenging task given the large number of data sources and numerous modeling assumptions. However, we have now accomplished a complete uncertainty estimate across all the Vulcan economic sectors and will present uncertainty estimates as a function of space, time, sector and fuel. We find that, like the underlying distribution of CO2 emissions themselves, the uncertainty is strongly lognormal, with high uncertainty associated with a relatively small number of locations. These locations are typically reliant upon coal combustion as the dominant CO2 source. We will also compare and contrast Vulcan fossil fuel CO2 emissions estimates against estimates built from DOE fuel-based surveys at the state level. We conclude that much of the difference between the Vulcan inventory and DOE statistics is due not to biased estimation but to mechanistic differences in supply versus demand and combustion in space/time.
Racimo, Allison R; Talathi, Nakul S; Zelenski, Nicole A; Wells, Lawrence; Shah, Apurva S
2018-05-02
Price transparency allows patients to make value-based health care decisions and is particularly important for individuals who are uninsured or enrolled in high-deductible health care plans. The availability of consumer prices for children undergoing orthopaedic surgery has not been previously investigated. We aimed to determine the availability of price estimates from hospitals in the United States for an archetypal pediatric orthopaedic surgical procedure (closed reduction and percutaneous pinning of a distal radius fracture) and identify variations in price estimates across hospitals. This prospective investigation utilized a scripted telephone call to obtain price estimates from 50 "top-ranked hospitals" for pediatric orthopaedics and 1 "non-top-ranked hospital" from each state and the District of Columbia. Price estimates were requested using a standardized script, in which an investigator posed as the mother of a child with a displaced distal radius fracture that needed closed reduction and pinning. Price estimates (complete or partial) were recorded for each hospital. The number of calls and the duration of time required to obtain the pricing information were also recorded. Variation was assessed, and hospitals were compared on the basis of ranking, teaching status, and region. Less than half (44%) of the 101 hospitals provided a complete price estimate. The mean price estimate for top-ranked hospitals ($17,813; range, $2742 to $49,063) was 50% higher than the price estimate for non-top-ranked hospitals ($11,866; range, $3623 to $22,967) (P=0.020). Differences in price estimates were attributable to differences in hospital fees (P=0.003), not surgeon fees. Top-ranked hospitals required more calls than non-top-ranked hospitals (4.4±2.9 vs. 2.8±2.3 calls, P=0.003). A longer duration of time was required to obtain price estimates from top-ranked hospitals than from non-top-ranked hospitals (8.2±9.4 vs. 4.1±5.1 d, P=0.024). 
Price estimates for pediatric orthopaedic procedures are difficult to obtain. Top-ranked hospitals are more expensive and less likely to provide price information than non-top-ranked hospitals, with price differences primarily caused by variation in hospital fees, not surgeon fees. Level II-economic and decision analyses.
Phase III : GIS for the Appalachian Development Highway System 2007 cost to complete estimate
DOT National Transportation Integrated Search
2008-02-01
The proposed research will create an ADHS GIS for integrating and disseminating GIS and transportation data that will increase the accuracy and efficiency associated with completing the 2007 ADHS Cost to Complete Estimate. This project will create ap...
Harnden, Laura M; Tomberlin, Jeffery K
2016-09-01
The black soldier fly, Hermetia illucens, is recognised for its use in a forensic context as a means for estimating the time of colonisation and potentially the postmortem interval of decomposing remains. However, few data exist on this species outside of its use in waste management. This study offers a preliminary assessment of the development, and subsequent validation, of H. illucens. Larvae of H. illucens were reared at three temperatures (24.9°C, 27.6°C and 32.2°C) at 55% RH on beef loin muscle, pork loin muscle and a grain-based diet (control). Each of the temperatures and diets was found to significantly (P<0.05) affect all stages of immature growth except for pupation time. Overall, those reared on the pork diet required on average ≈23.1% and ≈139.7% more degree hours to complete larval development than those reared on the beef and grain-based diets, respectively. Larvae reared at 27.6°C and 32.2°C required on average ≈8.7% more degree hours to complete development and had a final larval weight ≈30% greater than larvae reared at 24.9°C. The validity of the laboratory larval length and weight data sets was assessed via estimating the age of field-reared larvae. Grain-diet data lacked accuracy when used to estimate larval age in comparison to estimates made with beef and pork-diet data, which were able to predict larval age for ≈55.6% and ≈88.9% of sampling points, respectively, when length and weight data were used in conjunction. Field-reared larval sizes exceeded the maximum observed under laboratory conditions in almost half of the samples, which reduced estimate accuracy. Future research should develop additional criteria for identifying development of each specific instar, which may aid in improving the accuracy and precision of larval age estimates for this species.
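Degree-hour accumulation of the kind used above to compare development across rearing temperatures can be sketched as follows; the 15.0 °C base (threshold) temperature is a hypothetical placeholder, not a value reported in the study.

```python
def accumulated_degree_hours(hourly_temps_c, base_temp_c=15.0):
    """Sum the degrees above a developmental threshold, hour by hour.
    Hours at or below the threshold contribute nothing."""
    return sum(max(t - base_temp_c, 0.0) for t in hourly_temps_c)

# Example: 48 hours at a constant 27.6 degC rearing temperature.
adh = accumulated_degree_hours([27.6] * 48)
print(round(adh, 1))  # 604.8
```

Comparing accumulated degree hours rather than clock time is what allows development on different diets to be compared across the three rearing temperatures.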
2012-01-01
Background Nearly all HIV infections in children worldwide are acquired through mother-to-child transmission (MTCT) during pregnancy, labour, delivery or breastfeeding. The objective of our study was to estimate the number and rate of new HIV diagnoses in children less than 13 years of age in mainland France from 2003–2006. Methods We performed a capture-recapture analysis based on three sources of information: the mandatory HIV case reporting (DOVIH), the French Perinatal Cohort (ANRS-EPF) and a laboratory-based surveillance of HIV (LaboVIH). The missing values of a variable of heterogeneous catchability were estimated through multiple imputation. Log-linear modelling provided estimates of the number of new HIV infections in children, taking into account dependencies between sources and variables of heterogeneous catchability. Results The three sources observed 216 new HIV diagnoses after record-linkage. The number of new HIV diagnoses in children was estimated at 387 (95%CI [271–503]) from 2003–2006, among whom 60% were born abroad. The estimated rate of new HIV diagnoses in children in mainland France was 9.1 per million in 2006 and was 38 times higher in children born abroad than in those born in France. The estimated completeness of the three sources combined was 55.8% (95% CI [42.9 – 79.7]) and varied according to the source; the completeness of DOVIH (28.4%) and ANRS-EPF (26.1%) were lower than that of LaboVIH (33.3%). Conclusion Our study provided, for the first time, an estimated annual rate of new HIV diagnoses in children under 13 years old in mainland France. A more systematic HIV screening of pregnant women that is repeated during pregnancy among women likely to engage in risky behaviour is needed to optimise the prevention of MTCT. HIV screening for children who migrate from countries with high HIV prevalence to France could be recommended to facilitate early diagnosis and treatment. PMID:23050554
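The capture-recapture idea can be illustrated with the simpler two-source Chapman estimator; the study itself fitted a three-source log-linear model with multiple imputation, and the counts below are hypothetical, not the French surveillance figures.

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased two-source capture-recapture estimate
    of total population size: n1 and n2 are cases found by each source,
    m is the number of cases found by both (via record linkage)."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical overlap between two surveillance sources:
print(round(chapman_estimate(110, 90, 60)))  # 165
```

The completeness of a source is then its observed count divided by the estimated total, analogous to the 55.8% combined completeness reported above.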
Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa
2017-02-01
Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload expressed in time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematical functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shatenstein, B.; Kosatsky, T.; Nadon, S.
1999-02-01
A two-season exercise was undertaken in 29 high-level sportfish consumers to evaluate the reliability and accuracy of study instruments. Fishers were invited to participate after completing the main study interview (Time 1) in fall 1995 or winter 1996. Over a 4-week period, they provided a nonconsecutive 7-day weighed food record, kept a fish consumption calendar, and responded to a shortened version of the Time 1 instrument at the end of this period (Time 2). A second blood sample (at Time 2) was analyzed for whole blood mercury (Hg) and the omega-3 fatty acids eicosapentanoic acid (EPA) and docosahexaenoic acid (DHA) in plasma and erythrocytes. Identical questions were compared in the Time 1 and Time 2 instruments. Reported sportfish consumption assessed by the different instruments was subjected to nutrient analysis. Three estimates of exposure to the target substances were derived from the dietary intake estimates and correlated with their respective Time 2 plasma (EPA, DHA) or blood (Hg) values, and with a kinetically derived interval-specific plasma/blood uptake value. Remarkable similarities were observed for the data derived from like questions in the Time 1 and 2 questionnaires in both seasons. However, frank discrepancies between some portion size estimates and measured values may signal cause for concern.
Classifying with confidence from incomplete information.
Parrish, Nathan; Anderson, Hyrum S.; Gupta, Maya R.; ...
2013-12-01
For this paper, we consider the problem of classifying a test sample given incomplete information. This problem arises naturally when data about a test sample is collected over time, or when costs must be incurred to compute the classification features. For example, in a distributed sensor network only a fraction of the sensors may have reported measurements at a certain time, and additional time, power, and bandwidth is needed to collect the complete data to classify. A practical goal is to assign a class label as soon as enough data is available to make a good decision. We formalize this goal through the notion of reliability—the probability that a label assigned given incomplete data would be the same as the label assigned given the complete data, and we propose a method to classify incomplete data only if some reliability threshold is met. Our approach models the complete data as a random variable whose distribution is dependent on the current incomplete data and the (complete) training data. The method differs from standard imputation strategies in that our focus is on determining the reliability of the classification decision, rather than just the class label. We show that the method provides useful reliability estimates of the correctness of the imputed class labels on a set of experiments on time-series data sets, where the goal is to classify the time-series as early as possible while still guaranteeing that the reliability threshold is met.
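A minimal sketch of the reliability idea under stated assumptions: a user-supplied classifier and a sampler for the missing features stand in for the paper's model of the complete data's conditional distribution, which this toy example does not implement.

```python
import random

def classify_if_reliable(clf, x_obs, missing_idx, sampler,
                         threshold=0.9, n_draws=200):
    """Classify incomplete data only when the label is stable.
    Draws plausible completions of the missing features, classifies
    each, and returns the majority label only if its vote share (a
    Monte Carlo reliability estimate) meets the threshold; otherwise
    defers by returning None."""
    votes = {}
    for _ in range(n_draws):
        x = list(x_obs)
        for i in missing_idx:
            x[i] = sampler()            # impute one completion
        label = clf(x)
        votes[label] = votes.get(label, 0) + 1
    best, count = max(votes.items(), key=lambda kv: kv[1])
    return best if count / n_draws >= threshold else None

random.seed(0)
clf = lambda x: int(sum(x) > 0)         # toy classifier: sign of the feature sum
# The observed features dominate the missing one, so the label is reliable:
print(classify_if_reliable(clf, [2.0, 3.0, None], [2],
                           lambda: random.gauss(0.0, 0.1)))  # 1
```

When the observed features are uninformative, the vote splits and the method defers, mirroring the "classify as early as possible, but only once reliable" goal.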
Point-Process Models of Social Network Interactions: Parameter Estimation and Missing Data Recovery
2014-08-01
treating them as zero will have a de minimis impact on the results, but avoiding computing them (and computing with them) saves tremendous time. Set a... test the methods on simulated time series on artificial social networks, including some toy networks and some meant to resemble IkeNet. We conclude...the section by discussing the results in detail. In each of our tests we begin with a complete data set, whether it is real (IkeNet) or simulated. Then
NASA Technical Reports Server (NTRS)
Mullins, N. E.; Dao, N. C.; Martin, T. V.; Goad, C. C.; Boulware, N. L.; Chin, M. M.
1972-01-01
A computer program for executive control routine for orbit integration of artificial satellites is presented. At the beginning of each arc, the program initializes the required constants as well as the variational partials at epoch. If epoch needs to be reset to a previous time, the program negates the stepsize and calls for integration backward to the desired time. After backward integration is completed, the program resets the stepsize to the proper positive quantity.
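The negated-stepsize trick generalizes to any fixed-step scheme; a minimal sketch with classical RK4 (not the program's actual integrator), where a target time earlier than the current epoch simply yields a negative stepsize:

```python
def integrate(f, y, t, t_target, n_steps):
    """Fixed-step classical RK4 for y' = f(t, y) from t to t_target.
    When t_target < t the stepsize h comes out negative and the
    integration marches backward in time."""
    h = (t_target - t) / n_steps
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Backward integration of y' = y from t=1 (y=e) to t=0 recovers y=1.
print(round(integrate(lambda t, y: y, 2.718281828459045, 1.0, 0.0, 20), 6))  # 1.0
```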
Effect of Sampling Schedule on Pharmacokinetic Parameter Estimates of Promethazine in Astronauts
NASA Technical Reports Server (NTRS)
Boyd, Jason L.; Wang, Zuwei; Putcha, Lakshmi
2005-01-01
Six astronauts on the Shuttle Transport System (STS) participated in an investigation on the pharmacokinetics of promethazine (PMZ), a medication used for the treatment of space motion sickness (SMS) during flight. Each crewmember completed the protocol once during flight and repeated it thirty days after return to Earth. Saliva samples were collected at scheduled times for 72 h after PMZ administration; more frequent samples were collected on the ground than during flight owing to schedule constraints in flight. PMZ concentrations in saliva were determined by a liquid chromatographic/mass spectrometric (LC-MS) assay, and pharmacokinetic parameters (PKPs) were calculated using the actual flight and ground-based data sets and using a ground sampling schedule time-matched to that during flight. Volume of distribution (V(sub c)) and clearance (Cl(sub s)) decreased during flight compared to those from the time-matched ground data set; however, Cl(sub s) and V(sub c) estimates were higher for all subjects when partial ground data sets were used for analysis. Area under the curve (AUC) normalized by administered dose was similar for flight and partial ground data; however, AUC was significantly lower using time-matched sampling compared with the full ground data set. Half-life (t(sub 1/2)) was longest during flight, shorter with the matched sampling schedule on the ground, and shortest when the complete ground data set was used. Maximum concentration (C(sub max)) and time to C(sub max) (t(sub max)), parameters of drug absorption, showed a similar trend: C(sub max) was lowest and t(sub max) longest during flight, intermediate with time-matched ground data, and highest and shortest, respectively, with the full ground data.
Effect of sampling schedule on pharmacokinetic parameter estimates of promethazine in astronauts
NASA Astrophysics Data System (ADS)
Boyd, Jason L.; Wang, Zuwei; Putcha, Lakshmi
2005-08-01
Six astronauts on the Shuttle Transport System (STS) participated in an investigation on the pharmacokinetics of promethazine (PMZ), a medication used for the treatment of space motion sickness (SMS) during flight. Each crewmember completed the protocol once during flight and repeated it thirty days after return to Earth. Saliva samples were collected at scheduled times for 72 h after PMZ administration; more frequent samples were collected on the ground than during flight owing to schedule constraints in flight. PMZ concentrations in saliva were determined by a liquid chromatographic/mass spectrometric (LC-MS) assay, and pharmacokinetic parameters (PKPs) were calculated using the actual flight and ground-based data sets and using a ground sampling schedule time-matched to that during flight. Volume of distribution (Vc) and clearance (Cls) decreased during flight compared to those from the time-matched ground data set; however, Cls and Vc estimates were higher for all subjects when partial ground data sets were used for analysis. Area under the curve (AUC) normalized by administered dose was similar for flight and partial ground data; however, AUC was significantly lower using time-matched sampling compared with the full ground data set. Half-life (t1/2) was longest during flight, shorter with the matched sampling schedule on the ground, and shortest when the complete ground data set was used. Maximum concentration (Cmax) and time to Cmax (tmax), parameters of drug absorption, showed a similar trend: Cmax was lowest and tmax longest during flight, intermediate with time-matched ground data, and highest and shortest, respectively, with the full ground data.
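The AUC and half-life parameters named above follow standard noncompartmental formulas; a sketch with fabricated saliva concentrations (illustrative only, not the study's data):

```python
import math

def auc_trapezoid(times, conc):
    """Area under the concentration-time curve, linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

def half_life(times, conc, n_terminal=3):
    """Terminal half-life from a log-linear least-squares fit of the
    last n_terminal samples: t_half = ln(2) / -slope."""
    t = times[-n_terminal:]
    c = [math.log(x) for x in conc[-n_terminal:]]
    n = len(t)
    slope = (n * sum(ti * ci for ti, ci in zip(t, c)) - sum(t) * sum(c)) / \
            (n * sum(ti ** 2 for ti in t) - sum(t) ** 2)
    return math.log(2) / -slope

# Fabricated concentrations (ng/mL) at sampled hours post-dose:
times = [0.5, 1, 2, 4, 8, 24, 48, 72]
conc = [5.0, 8.0, 6.0, 4.0, 2.0, 0.5, 0.125, 0.03125]
print(round(auc_trapezoid(times, conc), 3), round(half_life(times, conc), 1))  # 61.625 12.0
```

Dropping sample points, as the flight schedule forced, changes both numbers; that is exactly the sampling-schedule effect the study quantifies.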
NASA Astrophysics Data System (ADS)
Chin, Siu A.
2014-03-01
The sign problem in PIMC simulations of non-relativistic fermions increases in severity with the number of fermions and the number of beads (or time-slices) of the simulation. A large number of beads is usually needed, because the conventional primitive propagator is only second-order and the usual thermodynamic energy-estimator converges very slowly from below with the total imaginary time. The Hamiltonian energy-estimator, while more complicated to evaluate, is a variational upper-bound and converges much faster with the total imaginary time, thereby requiring fewer beads. This work shows that when the Hamiltonian estimator is used in conjunction with fourth-order propagators with optimizable parameters, the ground state energies of 2D parabolic quantum-dots with approximately 10 completely polarized electrons can be obtained with only 3-5 beads, before the onset of severe sign problems. This work was made possible by NPRP GRANT #5-674-1-114 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the author.
Robust double gain unscented Kalman filter for small satellite attitude estimation
NASA Astrophysics Data System (ADS)
Cao, Lu; Yang, Weiwei; Li, Hengnian; Zhang, Zhidong; Shi, Jianjun
2017-08-01
Limited by the low precision of small satellite sensors, high-performance estimation theory remains a popular research topic for attitude estimation. The Kalman filter (KF) and its extensions have been widely applied to satellite attitude estimation and have achieved plenty of results. However, most existing methods make use only of the current time-step's a priori measurement residuals to complete the measurement update and state estimation, ignoring the extraction and utilization of the previous time-step's a posteriori measurement residuals. In addition, uncertain model errors always exist in the attitude dynamic system, which imposes higher performance requirements on the classical KF for the attitude estimation problem. Therefore, a novel robust double gain unscented Kalman filter (RDG-UKF) is presented in this paper to satisfy the above requirements for small satellite attitude estimation with low-precision sensors. It is assumed that the system state estimation errors can be exhibited in the measurement residual; therefore, the new method derives a second Kalman gain Kk2 to make full use of the previous time-step's measurement residual and improve the utilization efficiency of the measurement data. Moreover, the sequence orthogonal principle and the unscented transform (UT) strategy are introduced to enhance the robustness and performance of the novel Kalman filter in order to reduce the influence of existing uncertain model errors. Numerical simulations show that the proposed RDG-UKF is more effective and robust than the classical unscented Kalman filter (UKF) in dealing with model errors and low-precision sensors for small satellite attitude estimation.
Estimation and Projection of Prevalence of Colorectal Cancer in Iran, 2015-2020.
Vardanjani, Hossein Molavi; Haghdoost, AliAkbar; Bagheri-Lankarani, Kamran; Hadipour, Maryam
2018-01-01
Population aging and an increasingly prevalent westernized lifestyle would be expected to result in a markedly rising burden of colorectal cancer (CRC) in future years. The aim of this study is to estimate the limited-time prevalence of CRC in Iran between 2015 and 2020. Aggregated CRC incidence data were extracted from the Iranian national cancer registry (IR.NCR) reports for 2003-2009 and from the GLOBOCAN-2012 database for 2012. Incidence trends were analyzed by age group, gender, and histopathologic and topographic subtype to estimate annual percentage changes. Incidence was projected for 2020. Prevalence was estimated by applying an adapted version of a previously introduced equation that estimates limited-time prevalence from incidence and survival data. Monte Carlo sensitivity analyses were applied to estimate 95% uncertainty levels (ULs). In each scenario, incidence, survival, annual percentage changes, and completeness of case ascertainment at IR.NCR were replaced under pre-assumed distributions. The estimated numbers of CRC patients within 1, 2-3, and 4-5 years of diagnosis in 2015 were 13676 (95% UL: 10051-18807), 20964 (15835-28268), and 14485 (11188-19293), respectively. The estimated 5-year prevalence for 2020 (99463; 75150-134744) was 2.03 times that for 2015. The highest 5-year prevalence was estimated at ages 55-59 for females and 75+ for males. Adenocarcinoma (41376; 31227-55898) was the most prevalent histologic subtype. The most prevalent tumor location was the colon (30822; 23262-41638). A substantial growth in the prevalence of CRC survivors is highly expected in future years in Iran. Establishment of specialized institutes is highly recommended to provide medical and especially social support for Iranian CRC survivors.
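Limited-time prevalence from incidence and survival can be sketched as an incidence-survival convolution; the function below is a generic simplification of that idea, and the inputs are invented, not the Iranian registry figures.

```python
def limited_time_prevalence(annual_incidence, survival):
    """N-year limited-time prevalence at the end of the period:
    cases diagnosed in each year, weighted by the probability of
    still being alive at the index date. survival[k] is P(alive k
    years after diagnosis)."""
    n = len(annual_incidence)
    return sum(annual_incidence[year] * survival[n - 1 - year]
               for year in range(n))

# Invented inputs for a 5-year window:
incidence = [7000, 7400, 7800, 8200, 8600]   # diagnoses per calendar year
survival = [0.95, 0.75, 0.65, 0.58, 0.53]    # P(alive k years after diagnosis)
print(round(limited_time_prevalence(incidence, survival)))  # 27392
```

In the study, Monte Carlo draws over the incidence, survival, and case-ascertainment inputs around a calculation like this yield the reported uncertainty levels.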
Brotherton, Julia M L; Piers, Leonard S; Vaughan, Loretta
2016-04-01
Background Adult Australian women aged 18 to 26 years were offered human papillomavirus (HPV) vaccine in a mass catch-up campaign between 2007 and 2009. Not all doses administered were notified to Australia's HPV vaccine register and not all young women commenced or completed the vaccine course. We surveyed vaccine age-eligible women as part of the Victorian Population Health Survey 2011-2012, a population based telephone survey, to ascertain self-reported vaccine uptake and reasons for non-vaccination or non-completion of vaccination among young women resident in the state of Victoria, Australia. Among 956 women surveyed, 62.3 per cent (57.8-66.6%) had been vaccinated against HPV and coverage with three doses was estimated at 53.7 per cent (49.1-58.2%). These estimates are higher than register-based estimates for the same cohort, which were 57.8 per cent and 37.2 per cent respectively. A lack of awareness about needing three doses and simply forgetting, rather than fear or experience of side effects, were the most common reasons for failure to complete all three doses. Among women who were not vaccinated, the most frequent reasons were not knowing the vaccine was available, perceiving they were too old to benefit, or not being resident in Australia at the time. It is likely that at least half of Victoria's young women were vaccinated during the catch-up program. This high level of coverage is likely to explain the marked reductions in HPV infection, genital warts and cervical disease already observed in young women in Victoria.
Adjustment to time of use pricing: Persistence of habits or change
NASA Astrophysics Data System (ADS)
Rebello, Derrick Michael
1999-11-01
Generally, the dynamics related to residential electricity consumption under TOU rates have not been analyzed completely. A habit persistence model is proposed to account for the dynamics that may be present as a result of recurring habits or lack of information about the effects of shifting load across TOU periods. In addition, the presence of attrition bias necessitated a two-step estimation approach: the decision to remain in the program was modeled in the first step, while demand for electricity was estimated in the second step. Results show that own-price effects and habit persistence have the most significant effects on the model. The habit effects, while small in absolute terms, are significant. Elasticity estimates show that electricity consumption is inelastic during all periods of the day. Estimates of the long-run elasticities were nearly identical to short-run estimates, showing little or no adjustment across time. Cross-price elasticities indicate a willingness to substitute consumption across periods, implying that TOU goods are weak substitutes. The most significant substitution occurs during the period of 5:00 PM to 9:00 PM, when most individuals are likely to be home and active.
Halliday, Drew W R; Stawski, Robert S; MacDonald, Stuart W S
2017-02-01
Response time inconsistency (RTI) in cognitive performance predicts deleterious health outcomes in late life; however, RTI estimates are often confounded by additional influences (e.g., individual differences in learning). Finger tapping is a basic sensorimotor measure largely independent of higher-order cognition that may circumvent such confounds of RTI estimates. We examined the within-person coupling of finger-tapping mean and RTI with working memory, and the moderation of these associations by cognitive status. A total of 262 older adults were recruited and classified as controls, cognitively-impaired-not-demented (CIND) unstable, or CIND stable. Participants completed finger-tapping and working-memory tasks during multiple weekly assessments, repeated annually for 4 years. Within-person coupling estimates from multilevel models indicated that on occasions when RTI was greater, working-memory response latency was slower for the CIND-stable, but not for the CIND-unstable or control, individuals. The finger-tapping task shows potential for minimizing confounds on RTI estimates and for yielding RTI estimates sensitive to central nervous system function and cognitive status.
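RTI is commonly operationalized as an intraindividual standard deviation of trial-level latencies. The sketch below omits the residualization of systematic trends (e.g., practice effects) that full analyses apply before computing the deviation, and the latencies are invented for illustration.

```python
import statistics

def rti(latencies_ms):
    """Response time inconsistency as the within-person standard
    deviation of trial latencies from a single assessment."""
    return statistics.stdev(latencies_ms)

# Invented tapping latencies (ms) for one person on two occasions:
stable = [210, 215, 212, 208, 214, 211]
variable = [180, 250, 205, 290, 175, 240]
print(round(rti(stable), 1), round(rti(variable), 1))  # 2.6 44.7
```

Identical means can hide very different trial-to-trial variability, which is why RTI carries information beyond mean response time.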
Cratering time scales for the Galilean satellites
NASA Technical Reports Server (NTRS)
Shoemaker, E. M.; Wolfe, R. F.
1982-01-01
An attempt is made to estimate the present cratering rate for each Galilean satellite within the correct order of magnitude and to extend the cratering rates back into the geologic past on the basis of evidence from the earth-moon system. For collisions with long and short period comets, the magnitudes and size distributions of the comet nuclei, the distribution of their perihelion distances, and the completeness of discovery are addressed. The diameters and masses of cometary nuclei are assessed, as are crater diameters and cratering rates. The dynamical relations between long period and short period comets are discussed, and the population of Jupiter-crossing asteroids is assessed. Estimated present cratering rates on the Galilean satellites are compared and variations of cratering rate with time are considered. Finally, the consistency of derived cratering time scales with the cratering record of the icy Galilean satellites is discussed.
van Tuinen, Marcel; Torres, Christopher R.
2015-01-01
Uncertainty in divergence time estimation is frequently studied from many angles but rarely from the perspective of phylogenetic node age. If appropriate molecular models and fossil priors are used, a multi-locus, partitioned analysis is expected to equally minimize error in accuracy and precision across all nodes of a given phylogeny. In contrast, if available models fail to completely account for rate heterogeneity, substitution saturation and incompleteness of the fossil record, uncertainty in divergence time estimation may increase with node age. While many studies have stressed this concern with regard to deep nodes in the Tree of Life, the inference that molecular divergence time estimation of shallow nodes is less sensitive to erroneous model choice has not been tested explicitly in a Bayesian framework. Given the availability of divergence time estimation methods that permit fossil priors across any phylogenetic node and the present increase in efficient, cheap collection of species-level genomic data, insight is needed into the performance of divergence time estimation of shallow (<10 MY) nodes. Here, we performed multiple sensitivity analyses in a multi-locus data set of aquatic birds with six fossil constraints. Comparison across divergence time analyses that varied taxon and locus sampling, number and position of fossil constraints, and shape of the prior distribution yielded several insights. Deviation from node ages obtained from a reference analysis was generally highest for the shallowest nodes but was determined more by the temporal placement than the number of fossil constraints. Calibration with only the shallowest nodes significantly underestimated the aquatic bird fossil record, indicating the presence of saturation. Although joint calibration with all six priors yielded ages most consistent with the fossil record, ages of shallow nodes were overestimated. This bias was found in both mtDNA and nDNA regions. 
Thus, divergence time estimation of shallow nodes may suffer from bias and low precision, even when appropriate fossil priors and best available substitution models are chosen. Much care must be taken to address the possible ramifications of substitution saturation across the entire Tree of Life. PMID:26106406
NASA Astrophysics Data System (ADS)
Koshimura, S.; Hino, R.; Ohta, Y.; Kobayashi, H.; Musa, A.; Murashima, Y.
2014-12-01
Using modern computing power and advanced sensor networks, a project is underway to establish a new system of real-time tsunami inundation forecasting, damage estimation and mapping to enhance society's resilience in the aftermath of a major tsunami disaster. The system consists of a fusion of real-time crustal deformation monitoring and fault model estimation by Ohta et al. (2012), high-performance real-time tsunami propagation/inundation modeling with NEC's vector supercomputer SX-ACE, damage/loss estimation models (Koshimura et al., 2013), and geo-informatics. After a major (near-field) earthquake is triggered, the first response of the system is to identify the tsunami source model by applying the RAPiD algorithm (Ohta et al., 2012) to observed RTK-GPS time series at GEONET sites in Japan. Based on performance with the data obtained during the 2011 Tohoku event, we assume less than 10 minutes as the acquisition time of the source model. Given the tsunami source, the system runs a tsunami propagation and inundation model, optimized on the vector supercomputer SX-ACE, to obtain time series of tsunami height at offshore/coastal tide gauges and to determine tsunami travel and arrival times, the extent of the inundation zone, and the maximum flow depth distribution. The implemented tsunami numerical model is based on the non-linear shallow-water equations discretized by the finite difference method. The merged bathymetry and topography grids are prepared at 10 m resolution to better estimate tsunami inland penetration. Given the maximum flow depth distribution, the system performs GIS analysis to determine the numbers of exposed people and structures using census data, then estimates the numbers of potential deaths and damaged structures by applying tsunami fragility curves (Koshimura et al., 2013). Once the tsunami source model is determined, the system is designed to complete the estimation within 10 minutes. 
The results are disseminated as mapping products to responders and stakeholders, e.g. national and regional municipalities, to be utilized in their emergency response activities. In 2014, the system was verified through case studies of the 2011 Tohoku event and potential earthquake scenarios along the Nankai Trough with regard to its capability and robustness.
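The fragility-curve step can be sketched as a lognormal damage-probability function of maximum flow depth; the median depth and log-standard deviation below are illustrative placeholders, not the fitted parameters of Koshimura et al. (2013).

```python
import math

def damage_probability(flow_depth_m, median_m=2.0, beta=0.6):
    """Lognormal fragility curve: probability of structural damage
    given the modeled maximum tsunami flow depth at a building."""
    if flow_depth_m <= 0:
        return 0.0
    z = (math.log(flow_depth_m) - math.log(median_m)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Expected number of damaged structures among four exposed buildings:
depths = [0.5, 1.0, 2.0, 4.0]
print(round(sum(damage_probability(d) for d in depths), 2))  # 1.51
```

Summing these probabilities over every exposed structure in the inundation zone is what converts the flow depth map into the damage counts the system disseminates.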
Uhlig, Constantin E.; Seitz, Berthold; Eter, Nicole; Promesberger, Julia; Busse, Holger
2014-01-01
Aims To evaluate the relative efficiencies of five Internet-based digital and three paper-based scientific surveys and to estimate the costs for different-sized cohorts. Methods Invitations to participate in a survey were distributed via e-mail to employees of two university hospitals (E1 and E2) and to members of a medical association (E3), as a link placed in a special text on the municipal homepage regularly read by the administrative employees of two cities (H1 and H2), and paper-based to workers at an automobile enterprise (P1) and to college (P2) and senior (P3) students. The main parameters analyzed included the numbers of invited and actual participants, and the time and cost to complete the survey. Statistical analysis was descriptive, except for the Kruskal-Wallis H-test, which was used to compare the three recruitment methods. Cost efficiencies were compared and extrapolated to different-sized cohorts. Results The ratios of completely answered questionnaires to distributed questionnaires were between 81.5% (E1) and 97.4% (P2). Between 6.4% (P1) and 57.0% (P2) of the invited participants completely answered the questionnaires. The costs per completely answered questionnaire were $0.57–$1.41 (E1–3), $1.70 and $0.80 for H1 and H2, respectively, and $3.36–$4.21 (P1–3). Based on our results, electronic surveys with 10, 20, 30, or 42 questions would be estimated to be most cost (and time) efficient if more than 101.6–225.9 (128.2–391.7), 139.8–229.2 (93.8–193.6), 165.8–230.6 (68.7–115.7), or 188.2–231.5 (44.4–72.7) participants were required, respectively. Conclusions The study efficiency depended on the technical modalities of the survey methods and the engagement of the participants. 
Based on our study design, our results suggest that for similar projects, which will likely require more than two to three hundred participants, the most efficient way of conducting a questionnaire-based survey is via the Internet with a digital questionnaire, specifically via a centralized e-mail. PMID:25313672
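The cost-efficiency comparison above reduces to a fixed-plus-marginal-cost break-even calculation: a survey mode with a higher setup cost but a lower cost per completed questionnaire becomes cheaper once the cohort passes a threshold size. A minimal sketch, with hypothetical cost parameters rather than the study's measured figures:

```python
# Illustrative break-even calculation for survey cost efficiency.
# Cost numbers are invented: an electronic survey is assumed to have a
# higher fixed setup cost but a lower marginal cost per completed
# questionnaire than a paper survey.

def total_cost(fixed, per_response, n):
    """Total cost of collecting n completed questionnaires."""
    return fixed + per_response * n

def break_even(fixed_e, per_e, fixed_p, per_p):
    """Smallest cohort size at which the electronic survey is cheaper.

    Assumes per_e < per_p, so a crossover exists.
    """
    n = 1
    while total_cost(fixed_e, per_e, n) >= total_cost(fixed_p, per_p, n):
        n += 1
    return n

# Hypothetical: electronic $150 setup + $0.80/response,
# paper $20 setup + $3.50/response.
n_star = break_even(150.0, 0.80, 20.0, 3.50)
print(n_star)  # → 49
```

With these invented numbers the electronic mode wins beyond 49 participants; the thresholds reported above (roughly 100-230 participants, depending on questionnaire length) come from the same kind of comparison using the study's measured costs.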
Effectiveness of Condition-Based Maintenance in Army Aviation
2009-06-12
Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this... increase in efficiency in dollars spent per operational flight hour, the data set was too small to draw major conclusions. Recommendations for
Towards a (Preliminary) Theory of Cyberpower
2008-06-01
In the 2006 Quadrennial Defense Review, a request was made to have the Center for Technology and National Security Policy (CTNSP), National Defense
Trauma-Informed Guilt Reduction (TrIGR) Intervention
2017-10-01
AWARD NUMBER: W81XWH-15-1-0331 TITLE: Trauma-Informed Guilt Reduction (TrIGR) Intervention PRINCIPAL INVESTIGATOR: Christy Capone, PhD
Intelligent Command and Control Demonstration Setup and Presentation Instructions
2017-12-01
Intelligent Command and Control Demonstration Setup and Presentation Instructions, by Laurel C Sadler and Somiya Metu, Computational and Information Sciences...
Development of Information Assurance Protocol for Low Bandwidth Nanosatellite Communications
2017-09-01
INFORMATION ASSURANCE PROTOCOL FOR LOW BANDWIDTH NANOSATELLITE COMMUNICATIONS, by Cervando A. Banuelos II, September 2017. Thesis Advisor...
Coexistence of Named Data Networking (NDN) and Software-Defined Networking (SDN)
2017-09-01
Coexistence of Named Data Networking (NDN) and Software-Defined Networking (SDN), by Vinod Mishra, Computational and Information Sciences Directorate, ARL...
REVEAL: Receiver Exploiting Variability in Estimated Acoustic Levels
2013-08-07
...source, particularly in shallow water. Several structures have been or are being investigated. In a shallow water, passive sonar context, the characteristics of received signals are dynamic and variable in time and space, so a statistical approach is necessary. WORK COMPLETED: In a shallow water waveguide, where the distance
Complexity Theory and Network Centric Warfare
2003-09-01
realms of the unknown. Defence thinkers everywhere are searching forward for the science and alchemy that will deliver operational success. CCRP
The Influence of Age at Degree Completion on College Wage Premiums
ERIC Educational Resources Information Center
Taniguchi, Hiromi
2005-01-01
Although studies have shown a significant wage gain associated with the possession of a college degree, few have considered at what age the degree was received to estimate this college wage premium. Given the recent increase in the enrollment of older students, this study examines how the size of the premium is affected by college timing while…
The Nexus between the Above-Average Effect and Cooperative Learning in the Classroom
ERIC Educational Resources Information Center
Breneiser, Jennifer E.; Monetti, David M.; Adams, Katharine S.
2012-01-01
The present study examines the above-average effect (Chambers & Windschitl, 2004; Moore & Small, 2007) in assessments of task performance. Participants completed self-estimates of performance and group estimates of performance, before and after completing a task. Participants completed a task individually and in groups. Groups were…
Dual Quaternions as Constraints in 4D-DPM Models for Pose Estimation.
Martinez-Berti, Enrique; Sánchez-Salmerón, Antonio-José; Ricolfe-Viala, Carlos
2017-08-19
The goal of this research work is to improve the accuracy of human pose estimation using the Deformation Part Model (DPM) without increasing computational complexity. First, the proposed method seeks to improve pose estimation accuracy by adding the depth channel to DPM, which was formerly defined based only on red-green-blue (RGB) channels, in order to obtain a four-dimensional DPM (4D-DPM). In addition, computational complexity can be controlled by reducing the number of joints modeled, yielding a reduced 4D-DPM. Finally, complete solutions are obtained by solving for the omitted joints using inverse kinematics models. In this context, the main goal of this paper is to analyze the effect on pose estimation timing cost when using dual quaternions to solve the inverse kinematics.
A Twenty-Year Survey of Novae in M31
NASA Astrophysics Data System (ADS)
Crayton, Hannah; Rector, Travis A.; Walentosky, Matthew J.; Shafter, Allen W.; Lauber, Stephanie; Pilachowski, Catherine A.; RBSE Nova Search Team
2018-06-01
Numerous surveys of M31 in search of extragalactic novae have been completed over the last century, with a total of more than 1000 having been discovered during this time. From these surveys it has been estimated that the number of novae that occur in M31 is approximately 65 yr⁻¹ (Darnley et al. 2006). A fraction of these are recurrent novae that recur on timescales of years to decades (Shafter et al. 2015). From 1997 to 2017 we completed observations of M31 with the KPNO/WIYN 0.9-meter telescope, which offers a wide field of view suitable for surveying nearly all of the bulge and much of the disk of M31. Observations were completed in Hα so as to better detect novae in the bulge of the galaxy, where most novae reside. Our survey achieves a limiting absolute magnitude per epoch of MHα ∼ 7.5 mag, which prior M31 nova surveys in Hα (e.g., Ciardullo et al. 1987; Shafter & Irby 2001) have shown to be sufficiently deep to detect a typical nova several months after eruption. By completing nearly all of the observations with the same telescope, cameras, and filters we were able to obtain a remarkably consistent dataset. Our survey offers several benefits as compared to prior surveys. Nearly 200 epochs of observations were completed during the survey period. Observations were typically completed on a monthly basis, although on several occasions we completed weekly and nightly observations to search for novae with faster decay rates. Thus we were sensitive to most of the novae that erupted in M31 during the survey period. Over twenty years we detected 316 novae. Our survey found 85% of the novae in M31 that were reported by other surveys completed during the same time range and in the same survey area as ours (Pietsch et al. 2007). We also discovered 39 novae that were not found by other surveys. We present the complete catalog of novae from our survey, along with example light curves.
Among other uses, our catalog will help improve estimates of the nova rate in M31. We also identify 72 standard stars within the survey area that will be useful for future surveys.
How Stable is Happiness? Using the STARTS Model to Estimate the Stability of Life Satisfaction.
Lucas, Richard E; Donnellan, M Brent
2007-10-01
A common interpretation of existing subjective well-being research is that long-term levels of well-being are almost completely stable. However, few studies have estimated stability and change using appropriate statistical models that can precisely address this question. The STARTS model (Kenny & Zautra, 2001) was used to analyze life satisfaction data from two nationally representative panel studies. Results show that 34-38% of the variance in observed scores is trait variance that does not change. An additional 29-34% can be accounted for by an autoregressive trait that is only moderately stable over time. Thus, although life satisfaction is moderately stable over long periods of time, there is also an appreciable degree of instability that might depend on contextual circumstances.
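The STARTS decomposition described above partitions observed score variance into a stable trait, an autoregressive trait, and an occasion-specific state, which together determine how test-retest correlations decay with lag. A small illustrative sketch follows; the variance shares loosely match the ranges reported above, but the 1-year autoregressive coefficient of 0.9 is an assumption for illustration, not an estimate from the paper:

```python
# Sketch of the STARTS decomposition (Kenny & Zautra, 2001): observed
# variance = stable trait (ST) + autoregressive trait (ART) + state (S).
# The ST component contributes a constant to every test-retest
# correlation; the ART contribution decays geometrically with lag; the
# state component contributes nothing across occasions.

def implied_correlation(st, art, s, stability, lag_years):
    """Model-implied test-retest correlation after a given lag.

    st, art, s: variance proportions (must sum to 1).
    stability: 1-year autoregressive coefficient of the ART component.
    """
    assert abs(st + art + s - 1.0) < 1e-9
    return st + art * stability ** lag_years

# Illustrative: ~36% stable trait, ~32% autoregressive trait,
# ~32% occasion-specific state, hypothetical AR coefficient 0.9.
for lag in (1, 5, 10):
    print(lag, round(implied_correlation(0.36, 0.32, 0.32, 0.9, lag), 3))
```

The long-run floor of the correlation is the stable-trait share, which is why even "moderately stable" life satisfaction retains an appreciable unchanging component.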
Italian cancer figures, report 2014: Prevalence and cure of cancer in Italy.
2014-01-01
This Report estimates the total number of people still alive in 2010 after a cancer diagnosis in Italy, regardless of the time since diagnosis, and projects these estimates to 2015. This study also aims to estimate the number of already cured cancer patients, whose mortality rates have become indistinguishable from those of the general population of the same age and sex. The study took advantage of the information from the AIRTUM database, which included 29 Cancer Registries (covering 21 million people, 35% of the Italian population). A total of 1,624,533 cancer cases diagnosed between 1976 and 2009 contributed to the study. For each registry, the observed prevalence was calculated. Prevalence for lengths of time exceeding the maximum duration of registration, and hence the complete prevalence, were derived by applying an estimated correction factor, the completeness index. This index was estimated by means of statistical regression models using cancer incidence and survival data available in registries with 18 years of observation or more. For 50 types or combinations of neoplasms, complete prevalence was estimated at 1.1.2010 as an absolute number and as a proportion per 100,000 inhabitants by sex, age group, area of residence, and years since diagnosis. Projections of complete prevalence for 1.1.2015 were computed under the assumption of a linear trend of the complete prevalence observed until 2010.
Validated mixture cure models were used to estimate: the cure fraction, that is, the proportion of patients who, starting from the time of diagnosis, are expected to reach the same mortality rate of the general population; the conditional relative survival (CRS), that is, the cumulative probability of surviving some additional years, given that patients already survived a certain number of years; the time to cure, that is, the number of years necessary so that conditional survival in the following five years (5-year CRS) exceeds the conventional threshold of 95% (i.e., mortality rates in cancer patients become indistinguishable from those of the general population); and the proportion of patients already cured, i.e., people alive for a number of years exceeding the time to cure. As of 1.1.2010, it was estimated that 2,587,347 people were alive after a cancer diagnosis, corresponding to 4.4% of the Italian population. A relevant geographical heterogeneity emerged, with a prevalence above 5% in northern registries and below 4% in southern areas. Men were 45% of the total (1,154,289) and women 55% (1,433,058). In the population aged 75 years or more, the proportions of prevalent cases were 20% in males and 13% in females, and 11% between 60 and 74 years of age in both sexes. Nearly 600,000 Italian women were alive after a breast cancer diagnosis (41% of all women with this neoplasm), followed by women with cancers of the colon rectum (12%), corpus uteri (7%), and thyroid (6%). In men, 26% of prevalent cases (295,624) were patients with prostate cancer, and 16% had either bladder or colon rectum cancer. The projections for 1.1.2015 are of three million (3,036,741) people alive after a cancer diagnosis, 4.9% of the Italian population, with a 20% increase for males and 15% for females compared to 2010. The cure fractions were heterogeneous according to cancer type and age.
Estimates obtained as the sum of cure fractions for all cancer types showed that more than 60% of patients diagnosed below the age of 45 years will reach the same mortality rate of the general population. This proportion decreased with increasing age and was <30% for cancers diagnosed after the age of 74 years. It was observed that 60% of all prevalent cases (1,543,531 people, or 2.6% of the overall Italian population) had been diagnosed >5 years earlier (long-term survivors). Time to cure (5-year CRS >95%) was reached in <10 years by patients with cancers of the stomach, colon rectum, pancreas, corpus and cervix uteri, brain, and Hodgkin lymphoma. Mortality rates similar to the ones reported by the general population were reached after approximately 20 years for breast and prostate cancer patients. Five-year CRS remained <95% for >25 years after cancer diagnosis in patients with liver and larynx cancers, non-Hodgkin lymphoma, myeloma, and leukaemia. Time to cure was reached by 27% (20% in men and 33% in women) of all people living after a cancer diagnosis, defined as already cured. The study showed a steady increase over time (nearly +3% per year) of prevalent cases in Italy. A quarter of Italian cancer patients alive in 2010 can be considered as already cured. The AIRTUM Report 2014 describes characteristics of cancer patients and former patients for 50 cancer types or combinations by sex and age. This detailed information supports the conduct of studies aimed at expanding the current knowledge on the quality of life of these patients during and after the active phase of treatments (prevalence according to health status), on the long-term effects of treatments (in particular for paediatric patients), on the cost profile of cancer patients, and on rare tumours. All these observations have a high potential impact on health planning, clinical practice, and, most of all, patients' perspective.
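The "time to cure" criterion used above can be made concrete: cure is declared at the first year t where the 5-year conditional relative survival, CRS(t) = RS(t + 5) / RS(t), exceeds 95%. A minimal sketch; the relative-survival curve below is invented for illustration and is not AIRTUM data or code:

```python
# Sketch of the time-to-cure criterion: find the first year t at which
# the 5-year conditional relative survival RS(t+5)/RS(t) exceeds 95%,
# i.e. patients who survived t years have near-population mortality
# over the next five years.

def time_to_cure(rel_survival, window=5, threshold=0.95):
    """First index t with RS(t+window)/RS(t) > threshold, else None.

    rel_survival: cumulative relative survival by whole years since
    diagnosis, starting at rel_survival[0] == 1.0.
    """
    for t in range(len(rel_survival) - window):
        if rel_survival[t + window] / rel_survival[t] > threshold:
            return t
    return None

# Hypothetical cumulative relative survival by year since diagnosis:
rs = [1.00, 0.80, 0.72, 0.68, 0.66, 0.65,
      0.645, 0.643, 0.642, 0.641, 0.640, 0.640]
print(time_to_cure(rs))  # → 4
```

For this made-up curve, excess mortality is concentrated in the first few years, so cure is declared at year 4; for real cancers the report finds values from under 10 years (e.g., stomach) to over 25 (e.g., liver).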
Temporal processing dysfunction in schizophrenia.
Carroll, Christine A; Boggs, Jennifer; O'Donnell, Brian F; Shekhar, Anantha; Hetrick, William P
2008-07-01
Schizophrenia may be associated with a fundamental disturbance in the temporal coordination of information processing in the brain, leading to classic symptoms of schizophrenia such as thought disorder and disorganized and contextually inappropriate behavior. Despite the growing interest and centrality of time-dependent conceptualizations of the pathophysiology of schizophrenia, there remains a paucity of research directly examining overt timing performance in the disorder. Accordingly, the present study investigated timing in schizophrenia using a well-established task of time perception. Twenty-three individuals with schizophrenia and 22 non-psychiatric control participants completed a temporal bisection task, which required participants to make temporal judgments about auditory and visually presented durations ranging from 300 to 600 ms. Both schizophrenia and control groups displayed greater visual compared to auditory timing variability, with no difference between groups in the visual modality. However, individuals with schizophrenia exhibited less temporal precision than controls in the perception of auditory durations. These findings correlated with parameter estimates obtained from a quantitative model of time estimation, and provide evidence of a fundamental deficit in temporal auditory precision in schizophrenia.
Kenny, Sarah J; Palacios-Derflingher, Luz; Whittaker, Jackie L; Emery, Carolyn A
2018-03-01
Study Design Cohort study. Background Multiple operational definitions of injury exist in dance research. The influence that these different injury definitions have on epidemiological estimations of injury burden among dancers warrants investigation. Objective To describe the influence of injury definition on injury prevalence, incidence, and severity in preprofessional ballet and contemporary dancers. Methods Dancers registered in full-time preprofessional ballet (n = 85; 77 female; median age, 15 years; range, 11-19 years) and contemporary (n = 60; 58 female; median age, 19 years; range, 17-30 years) training completed weekly online questionnaires (modified Oslo Sports Trauma Research Centre questionnaire on health problems) using 3 injury definitions: (1) time loss (unable to complete 1 or more classes/rehearsals/performances for 1 or more days beyond onset), (2) medical attention, and (3) any complaint. Physical therapists completed injury report forms to capture dance-related medical attention and time-loss injuries. Percent agreement between injury registration methods was estimated. Injury prevalence (seasonal proportion of dancers injured), incidence rates (count of new injuries per 1000 dance-exposure hours), and severity (total days lost) were examined across each definition, registration method, and dance style. Results Questionnaire response rate was 99%. Agreement between registration methods ranged between 59% (time loss) and 74% (injury location). Depending on definition, registration, and dance style, injury prevalence ranged between 9.4% (95% confidence interval [CI]: 4.1%, 17.7%; time loss) and 82.4% (95% CI: 72.5%, 89.8%; any complaint), incidence rates between 0.1 (95% CI: 0.03, 0.2; time loss) and 4.9 (95% CI: 4.1, 5.8; any complaint) injuries per 1000 dance-hours, and days lost between 111 and 588 days. Conclusion Time-loss and medical-attention injury definitions underestimate the injury burden in preprofessional dancers. 
Accordingly, injury surveillance methodologies should consider more inclusive injury definitions. J Orthop Sports Phys Ther 2018;48(3):185-193. Epub 13 Dec 2017. doi:10.2519/jospt.2018.7542 Level of Evidence Symptom prevalence study, level 1b.
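The incidence rates above are counts of new injuries per 1000 dance-exposure hours with 95% confidence intervals. A minimal sketch of that calculation, using a normal approximation to the Poisson count; the injury and exposure numbers are hypothetical, not the study's data:

```python
import math

# Illustrative injury incidence rate per 1000 exposure-hours with an
# approximate 95% CI (normal approximation to a Poisson count).
# Counts below are invented for illustration.

def incidence_rate(injuries, exposure_hours, per=1000.0):
    """New injuries per `per` exposure-hours."""
    return injuries / exposure_hours * per

def poisson_ci(injuries, exposure_hours, per=1000.0, z=1.96):
    """Approximate 95% CI for the rate; SE of a Poisson count is sqrt(n)."""
    rate = incidence_rate(injuries, exposure_hours, per)
    se = math.sqrt(injuries) / exposure_hours * per
    return max(rate - z * se, 0.0), rate + z * se

# Hypothetical: 120 "any complaint" injuries over 25,000 dance-hours.
rate = incidence_rate(120, 25_000)
lo, hi = poisson_ci(120, 25_000)
print(round(rate, 2), round(lo, 2), round(hi, 2))  # → 4.8 3.94 5.66
```

The same exposure denominator under different injury definitions produces the fifty-fold spread in rates (0.1 vs 4.9 per 1000 hours) reported above, which is the paper's central point about definition choice.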
Modeling of in-use stability for tablets and powders in bottles.
Waterman, Kenneth C; Chen, Lili; Waterman, Philip; MacDonald, Bruce C; Monahan, Andrew P; Scrivens, Garry
2016-10-01
A model is presented for determining how long an active pharmaceutical ingredient in tablets/powders will remain within its specification limits during an in-use period; that is, when a heat-induction sealed bottle is opened for fixed time periods and tablets are removed at fixed time points. This model combines the Accelerated Stability Assessment Program, which determines the impact of relative humidity (RH) on degradation rates, with calculations of the RH as a function of time for the dosage forms under in-use conditions. These calculations, in a conservative approach, assume that the air inside bottles with broached heat-induction seals completely exchanges with the external environment during periods when the bottle remains open. The solid dosages are assumed to sorb water at estimable rates during these openings. When bottles are capped, the moisture vapor transmission rate can be estimated to determine the changing RH inside the bottles between opening events. The impact of silica gel desiccants can also be included in the modeling.
NASA Technical Reports Server (NTRS)
Matthews, Bryan L.; Srivastava, Ashok N.
2010-01-01
Prior to the launch of STS-119, NASA had completed a study of an issue in the flow control valve (FCV) in the Main Propulsion System of the Space Shuttle using an adaptive learning method known as Virtual Sensors. Virtual Sensors are a class of algorithms that estimate the value of a time series given other, potentially nonlinearly correlated sensor readings. In the case presented here, the Virtual Sensors algorithm is based on an ensemble learning approach and takes sensor readings and control signals as input to estimate the pressure in a subsystem of the Main Propulsion System. Our results indicate that this method can detect faults in the FCV at the time when they occur. We use the standard deviation of the predictions of the ensemble as a measure of uncertainty in the estimate. This uncertainty estimate was crucial to understanding the nature and magnitude of transient characteristics during startup of the engine. This paper overviews the Virtual Sensors algorithm, discusses results on a comprehensive set of Shuttle missions, and describes the architecture necessary for deploying such algorithms in a real-time, closed-loop system or a human-in-the-loop monitoring system. These results were presented at a Flight Readiness Review of the Space Shuttle in early 2009.
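The ensemble idea behind a virtual sensor can be sketched in a few lines: several regressors each predict the target quantity from correlated sensor readings; the ensemble mean is the estimate and the spread across members is the uncertainty. The models and inputs below are illustrative stand-ins, not NASA's actual implementation:

```python
import statistics

# Minimal sketch of an ensemble "virtual sensor": each member model
# maps observable sensor readings to an estimate of an unmeasured
# quantity (here, a subsystem pressure). The ensemble mean is the
# estimate; the standard deviation across members is the uncertainty.

def virtual_sensor(models, inputs):
    """Return (estimate, uncertainty) for one input vector."""
    predictions = [m(inputs) for m in models]
    return statistics.mean(predictions), statistics.stdev(predictions)

# Hypothetical linear ensemble members (real members would be trained
# nonlinear regressors fit on historical flight data):
models = [
    lambda x: 2.0 * x[0] + 0.5 * x[1],
    lambda x: 2.1 * x[0] + 0.4 * x[1],
    lambda x: 1.9 * x[0] + 0.6 * x[1],
]
estimate, uncertainty = virtual_sensor(models, (10.0, 4.0))
print(round(estimate, 2), round(uncertainty, 3))
```

When the members disagree sharply, the large uncertainty flags inputs unlike the training regime, which is how the approach characterizes transients such as engine startup.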
Different approaches to valuing the lost productivity of patients with migraine.
Lofland, J H; Locklear, J C; Frick, K D
2001-01-01
To calculate and compare the human capital approach (HCA) and friction cost approach (FCA) methods for estimating the cost of lost productivity of migraineurs after the initiation of sumatriptan, from a US societal perspective. Secondary, retrospective analysis of a prospective observational study. A mixed-model managed care organisation in western Pennsylvania, USA. Patients with migraine using sumatriptan therapy. Patient-reported questionnaires collected at baseline and at 3 and 6 months after initiation of sumatriptan therapy. The cost of lost productivity was estimated with the HCA and FCA methods. Of the 178 patients who completed the study, 51% were full-time employees, 13% were part-time, 18% were not working and 17% changed work status. Twenty-four percent reported a clerical or administrative position. From the HCA, the estimated total cost of lost productivity for the 6 months following the initiation of sumatriptan was $US117,905 (1996 values). From the FCA, the six-month estimated total cost of lost productivity ranged from $US28,329 to $US117,905 (1996 values). This was the first study to retrospectively estimate the lost productivity of patients with migraine using the FCA methodology. Our results demonstrate that, depending on the assumptions employed, the FCA can yield lost productivity estimates that vary greatly as a percentage of the HCA estimate. Prospective investigations are needed to better determine the components and the nature of lost productivity for chronic episodic diseases such as migraine headache.
Travel Times, Streamflow Velocities, and Dispersion Rates in the Yellowstone River, Montana
McCarthy, Peter M.
2009-01-01
The Yellowstone River is a vital natural resource to the residents of southeastern Montana and is a primary source of water for irrigation and recreation and the primary source of municipal water for several cities. The Yellowstone River valley is the primary east-west transportation corridor through southern Montana. This complex of infrastructure makes the Yellowstone River especially vulnerable to accidental spills from various sources such as tanker cars and trucks. In 2008, the U.S. Geological Survey (USGS), in cooperation with the Montana Department of Environmental Quality, initiated a dye-tracer study to determine instream travel times, streamflow velocities, and dispersion rates for the Yellowstone River from Lockwood to Glendive, Montana. The purpose of this report is to describe the results of this study and summarize data collected at each of the measurement sites between Lockwood and Glendive. This report also compares the results of this study to estimated travel times from a transport model developed by the USGS for a previous study. For this study, Rhodamine WT dye was injected at four locations in late September and early October 2008 during reasonably steady streamflow conditions. Streamflows ranged from 3,490 to 3,770 cubic feet per second upstream from the confluence of the Bighorn River and ranged from 6,520 to 7,570 cubic feet per second downstream from the confluence of the Bighorn River. Mean velocities were calculated for each subreach between measurement sites for the leading edge, peak concentration, centroid, and trailing edge at 10 percent of the peak concentration. Calculated velocities for the centroid of the dye plume for subreaches that were completely laterally mixed ranged from 1.83 to 3.18 ft/s within the study reach from Lockwood Bridge to Glendive Bridge. The mean of the completely mixed centroid velocity for the entire study reach, excluding the subreach between Forsyth Bridge and Cartersville Dam, was 2.80 ft/s. 
Longitudinal dispersion rates of the dye plume for this study ranged from 0.06 ft/s for the subreach upstream from Forsyth Bridge to 2.25 ft/s for the subreach upstream from Calypso Bridge for subreaches where the dye was completely laterally mixed. A relation was determined between travel time of the peak concentration and time for the dye plume to pass a site (duration). This relation can be used to estimate when the receding concentration of a potential contaminant reaches 10 percent of its peak concentration for accidental spills into the Yellowstone River. Data from this dye-tracer study were used to evaluate velocity and concentration estimates from a transport model developed as part of an earlier USGS study. Comparison of the estimated and calculated velocities for the study reach indicates that the transport model estimates the velocities of the Yellowstone River between Huntley Bridge and Glendive Bridge with reasonable accuracy. Velocities of the peak concentration of the dye plume calculated for this study averaged 10 percent faster than the most probable velocities and averaged 12 percent slower than the maximum probable velocities estimated from the transport model. Peak Rhodamine WT dye concentrations were consistently lower than the transport model estimates except for the most upstream subreach of each dye injection. The most upstream subreach of each dye injection is expected to have a higher concentration because of incomplete lateral mixing. Lower measured peak concentrations for all other sites were expected because Rhodamine WT dye deteriorates when exposed to sunlight and will sorb onto the streambanks and stream bottom. Velocity-streamflow relations developed by using routine streamflow measurements at USGS gaging stations and the transport model can be used to estimate mean streamflow velocities throughout a range of streamflows.
The variation in these velocity-streamflow relations emphasizes the uncertainty in estimating the mean streamflow velocity.
W5″ Test: A simple method for measuring mean power output in the bench press exercise.
Tous-Fajardo, Julio; Moras, Gerard; Rodríguez-Jiménez, Sergio; Gonzalo-Skok, Oliver; Busquets, Albert; Mujika, Iñigo
2016-11-01
The aims of the present study were to assess the validity and reliability of a novel simple test [Five Seconds Power Test (W5″ Test)] for estimating the mean power output during the bench press exercise at different loads, and its sensitivity to detect training-induced changes. Thirty trained young men completed as many repetitions as possible in a time of ≈5 s at 25%, 45%, 65% and 85% of one-repetition maximum (1RM) in two test sessions separated by four days. The number of repetitions, linear displacement of the bar and time needed to complete the test were recorded by two independent testers, and a linear encoder was used as the criterion measure. For each load, the mean power output was calculated in the W5″ Test as mechanical work per unit time and compared with that obtained from the linear encoder. Subsequently, 20 additional subjects (10 training group vs. 10 control group) were assessed before and after completing a seven-week training programme designed to improve maximal power. Results showed that both assessment methods correlated highly in estimating mean power output at different loads (r range: 0.86-0.94; p < .01) and detecting training-induced changes (R²: 0.78). Good to excellent intra-tester (intraclass correlation coefficient (ICC) range: 0.81-0.97) and excellent inter-tester (ICC range: 0.96-0.99; coefficient of variation range: 2.4-4.1%) reliability was found for all loads. The W5″ Test was shown to be a valid, reliable and sensitive method for measuring mean power output during the bench press exercise in subjects with previous resistance training experience.
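The W5″ Test computes mean power as mechanical work divided by elapsed time, where work is the load lifted through the recorded bar displacement over the counted repetitions. A minimal sketch of that calculation; the trial values are illustrative, not data from the study:

```python
# Sketch of the mean-power calculation described above:
# mean power (W) = mechanical work (J) / time (s), with
# work = load * g * bar displacement per repetition * repetitions.
# This assumes displacement_m is the lifting distance per repetition.

G = 9.81  # gravitational acceleration, m/s^2

def mean_power(load_kg, displacement_m, repetitions, time_s):
    """Mean power output: total lifting work divided by elapsed time."""
    work_joules = load_kg * G * displacement_m * repetitions
    return work_joules / time_s

# Hypothetical trial: 60 kg bar, 0.40 m displacement, 6 reps in 5.2 s.
print(round(mean_power(60.0, 0.40, 6, 5.2), 1))  # → 271.7
```

Because only a rep count, a displacement, and a stopwatch time are needed, the test can be scored without a linear encoder, which is the practical appeal validated above.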
Quantification of spheno-occipital synchondrosis fusion in a contemporary Malaysian population.
Hisham, Salina; Flavel, Ambika; Abdullah, Nurliza; Noor, Mohamad Helmee Mohamad; Franklin, Daniel
2018-03-01
Timing of fusion of the spheno-occipital synchondrosis (SOS) is correlated with age. Previous research, however, has demonstrated variation in the timing of closure among different global populations. The present study aims to quantify the timing of SOS fusion in Malaysian individuals as visualised in multi-detector computed tomography (CT) scans and to thereafter formulate age estimation models based on fusion status. Anonymised cranial CT scans of 336 males and 164 females, aged 5-25 years, were acquired from the National Institute of Forensic Medicine, Hospital Kuala Lumpur and Department of Diagnostic Imaging, Hospital Sultanah Aminah. The scans were received in DICOM format and reconstructed into three-dimensional images using OsiriX. The SOS is scored as open, fusing endocranially, fusing ectocranially or completely fused. Statistical analyses are performed using IBM SPSS Statistics version 24. Transition analysis (Nphases2) is then utilised to calculate age ranges for each stage. To assess the reliability of an observation, intra- and inter-observer agreement is quantified using Fleiss Kappa and was found to be excellent (κ=0.785-0.907 and 0.812). The mean (SD) age for complete fusion is 20.84 (2.84) years in males and 19.78 (3.35) years in females. Transition ages between Stages 0 and 1, 1 and 2, and 2 and 3 in males are 12.52, 13.98 and 15.52 years, respectively (SD 1.37); in females, the corresponding data are 10.47, 12.26 and 13.80 years (SD 1.72). Complete fusion of the SOS was observed in all individuals above the age of 18 years. SOS fusion status provides upper and lower age boundaries for forensic age estimation in the Malaysian sample. Copyright © 2018 Elsevier B.V. All rights reserved.
Integrating O/S models during conceptual design, part 3
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1994-01-01
Space vehicles, such as the Space Shuttle, require intensive ground support prior to, during, and after each mission. Maintenance is a significant part of that ground support. All space vehicles require scheduled maintenance to ensure operability and performance. In addition, components of any vehicle are not one-hundred percent reliable, so they exhibit random failures. Once detected, a failure initiates unscheduled maintenance on the vehicle. Maintenance decreases the number of missions which can be completed by keeping vehicles out of service, so that the time between the completion of one mission and the start of the next is increased. Maintenance also requires resources such as people, facilities, tooling, and spare parts. Assessing the mission capability and resource requirements of any new space vehicle, in addition to performance specification, is necessary to predict the life cycle cost and success of the vehicle. Maintenance and logistics support has been modeled by computer simulation to estimate mission capability and resource requirements for evaluation of proposed space vehicles. The simulation was written with Simulation Language for Alternative Modeling II (SLAM II) for execution on a personal computer. For either one or a fleet of space vehicles, the model simulates the preflight maintenance checks, the mission and return to earth, and the post flight maintenance in preparation to be sent back into space. The model enables prediction of the number of missions possible and vehicle turn-time (the time between completion of one mission and the start of the next) given estimated values for component reliability and maintainability. The model also facilitates study of the manpower and vehicle requirements for the proposed vehicle to meet its desired mission rate. This is the 3rd part of a 3 part technical report.
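The core of such a simulation is sampling random component failures per mission and adding unscheduled repair time to the scheduled turn-time. The toy Monte Carlo below illustrates only that idea; all rates and times are invented, and the report's actual model was a SLAM II discrete-event simulation, not this simplified sampler:

```python
import random

# Toy Monte Carlo sketch of vehicle turn-time: each component may fail
# on a mission with some probability, and each failure adds unscheduled
# repair hours on top of fixed scheduled maintenance. All numbers are
# hypothetical, for illustration only.

def simulate_turn_time(n_missions, scheduled_hours, components, rng):
    """Average hours between mission completion and next launch."""
    total = 0.0
    for _ in range(n_missions):
        turn = scheduled_hours
        for fail_prob, repair_hours in components:
            if rng.random() < fail_prob:
                turn += repair_hours
        total += turn
    return total / n_missions

rng = random.Random(42)
# (failure probability per mission, repair hours if it fails):
components = [(0.10, 40.0), (0.05, 120.0), (0.02, 300.0)]
avg = simulate_turn_time(10_000, 200.0, components, rng)
# Analytic expectation: 200 + 0.10*40 + 0.05*120 + 0.02*300 = 216 hours.
print(round(avg, 1))
```

Dividing available calendar time by the average turn-time then yields the achievable mission rate, the quantity the report's model was built to predict.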
Elastic and viscoelastic calculations of stresses in sedimentary basins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
This study presents a method for estimating the stress state within reservoirs at depth using a time-history approach for both elastic and viscoelastic rock behavior. Two features of this model are particularly significant for stress calculations. The first is the time-history approach, where we assume that the present in situ stress is a result of the entire history of the rock mass, rather than due only to the present conditions. The model can incorporate: (1) changes in pore pressure due to gas generation; (2) temperature gradients and local thermal episodes; (3) consolidation and diagenesis through time-varying material properties; and (4) varying tectonic episodes. The second feature is the use of a new viscoelastic model. Rather than assume a form of the relaxation function, a complete viscoelastic solution is obtained from the elastic solution through the viscoelastic correspondence principle. Simple rate models are then applied to obtain the final rock behavior. Example calculations for some simple cases are presented that show the contribution of individual stress or strain components. Finally, a complete example of the stress history of rocks in the Piceance basin is attempted. This calculation compares favorably with present-day stress data in this location. This model serves as a predictor for natural fracture genesis, and expected rock fracturing from the model is compared with actual fractures observed in this region. These results show that most current estimates of in situ stress at depth do not incorporate all of the important mechanisms, and a more complete formulation, such as this study, is required for acceptable stress calculations. The method presented here is general and is applicable to any basin having a relatively simple geologic history. 25 refs., 18 figs.
Error Analyses of the North Alabama Lightning Mapping Array (LMA)
NASA Technical Reports Server (NTRS)
Koshak, W. J.; Solokiewicz, R. J.; Blakeslee, R. J.; Goodman, S. J.; Christian, H. J.; Hall, J. M.; Bailey, J. C.; Krider, E. P.; Bateman, M. G.; Boccippio, D. J.
2003-01-01
Two approaches are used to characterize how accurately the North Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and in time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA-MSFC and that is similar, but not identical to, the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared Curvature Matrix theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., multiple sources per retrieval attempt) are neglected. The detailed spatial distributions of retrieval errors are provided. Given that the two methods are completely independent of one another, it is shown that they provide remarkably similar results, except that the chi-squared theory produces larger altitude error estimates than the (more realistic) Monte Carlo simulation.
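The comparison of the two error-analysis approaches can be illustrated on a deliberately simplified one-dimensional analogue (not the LMA retrieval algorithm itself): for a single station ranging a source via r = c·t, the curvature-matrix (Fisher) prediction for the range error, σ_r = c·σ_t, can be checked against a Monte Carlo estimate. The source range is an illustrative assumption; only the 50 ns timing error comes from the abstract.

```python
import random
import statistics

C = 2.99792458e8   # speed of light, m/s
SIGMA_T = 50e-9    # 50 ns rms timing error, as assumed in the abstract
R_TRUE = 100e3     # true source range, m (illustrative)

# Analytic ("curvature matrix" / Fisher) prediction: for r = c*t the range
# error is simply the timing error scaled by c.
sigma_r_analytic = C * SIGMA_T                     # ~15 m

# Monte Carlo: perturb the arrival time, re-estimate the range, and measure
# the spread of the retrievals.
rng = random.Random(7)
retrievals = [C * (R_TRUE / C + rng.gauss(0.0, SIGMA_T))
              for _ in range(50_000)]
sigma_r_mc = statistics.stdev(retrievals)

print(f"analytic: {sigma_r_analytic:.2f} m, Monte Carlo: {sigma_r_mc:.2f} m")
```

In this linear toy case the two estimates agree exactly in expectation; the interesting discrepancies reported in the abstract arise only in the nonlinear multi-station geometry, where the curvature expansion is an approximation.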
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2016-08-01
This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.
Scott, Michael L.; Reynolds, Elizabeth W.
2007-01-01
Compared to 5-m by 20-m tree quadrats, belt transects were shown to provide similar estimates of stand structure (stem density and stand basal area) in less than 30 percent of the time. Further, for the streams sampled, there were no statistically significant differences in stem density and basal area estimates between 10-m and 20-m belt transects and the smaller belts took approximately half the time to sample. There was, however, high variance associated with estimates of stand structure for infrequently occurring stems, such as large, relict or legacy riparian trees. Legacy riparian trees occurred in limited numbers at all sites sampled. A reach-scale population census of these trees indicated that the 10-m belt transects tended to underestimate both stem density and basal area for these riparian forest elements and that a complete reach-scale census of legacy trees averaged less than one hour per site.
Downs, Andrew; Van Hoomissen, Jacqueline; Lafrenz, Andrew; Julka, Deana L
2014-01-01
To determine the level of moderate-vigorous-intensity physical activity (MVPA) assessed via self-report and accelerometer in the college population, and to examine intrapersonal and contextual variables associated with physical activity (PA). Participants were 77 college students at a university in the northwest sampled between January 2011 and December 2011. Participants completed a validated self-report measure of PA and measures of athletic identity and benefits and barriers to exercise. Participants' PA levels were assessed for 2 weeks via accelerometry. Participants' estimations of their time spent engaged in MVPA were significantly higher when measured via self-report versus accelerometry. Stronger athletic identity, perceived social benefits and barriers, and time-effort barriers were related to PA levels. Estimation of college students' level of PA may require interpretation of data from different measurement methods, as self-report and accelerometry generate different estimations of PA in college students who may be even less active than previously believed.
Allowances for evolving coastal flood risk under uncertain local sea-level rise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buchanan, Maya K.; Kopp, Robert E.; Oppenheimer, Michael
Estimates of future flood hazards made under the assumption of stationary mean sea level are biased low due to sea-level rise (SLR). However, adjustments to flood return levels made assuming fixed increases of sea level are also inadequate when applied to sea level that is rising over time at an uncertain rate. SLR allowances—the height adjustment from historic flood levels that maintains, under uncertainty, the annual expected probability of flooding—are typically estimated independently of individual decision-makers' preferences, such as time horizon, risk tolerance, and confidence in SLR projections. We provide a framework of SLR allowances that employs complete probability distributions of local SLR and a range of user-defined flood risk management preferences. Given non-stationary and uncertain sea-level rise, these metrics provide estimates of flood protection heights and offsets for different planning horizons in coastal areas. In conclusion, we illustrate the calculation of various allowance types for a set of long-duration tide gauges along U.S. coastlines.
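The allowance concept above — the raise in protection height that preserves the annual expected probability of flooding under uncertain SLR — can be sketched numerically. The sketch below assumes a Gumbel flood-frequency curve and a normal SLR projection; the scale parameter and SLR distribution are illustrative assumptions, not values from the paper.

```python
import math
import random

LAMBDA = 0.1   # assumed Gumbel scale of the local flood-frequency curve, m

# Hypothetical local SLR projection for one planning horizon: normal with
# mean 0.5 m and sd 0.2 m (illustrative numbers, not from the paper).
rng = random.Random(3)
slr_samples = [rng.gauss(0.5, 0.2) for _ in range(200_000)]

# Sample-based allowance: the raise A that keeps the expected annual number
# of exceedances equal to the historical one under uncertain SLR:
#   E[exp((SLR - A)/lambda)] = 1  =>  A = lambda * ln E[exp(SLR/lambda)]
mean_exp = sum(math.exp(s / LAMBDA) for s in slr_samples) / len(slr_samples)
allowance = LAMBDA * math.log(mean_exp)

# For normally distributed SLR there is a closed form: A = mu + sigma^2/(2*lambda)
closed_form = 0.5 + 0.2 ** 2 / (2 * LAMBDA)
print(f"allowance: {allowance:.3f} m (closed form: {closed_form:.3f} m)")
```

Note the allowance exceeds the mean SLR: the uncertainty itself adds a premium, which is the central point of allowance-based planning.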
Gaskins, J T; Daniels, M J
2016-01-02
The estimation of the covariance matrix is a key concern in the analysis of longitudinal data. When data consists of multiple groups, it is often assumed the covariance matrices are either equal across groups or are completely distinct. We seek methodology to allow borrowing of strength across potentially similar groups to improve estimation. To that end, we introduce a covariance partition prior which proposes a partition of the groups at each measurement time. Groups in the same set of the partition share dependence parameters for the distribution of the current measurement given the preceding ones, and the sequence of partitions is modeled as a Markov chain to encourage similar structure at nearby measurement times. This approach additionally encourages a lower-dimensional structure of the covariance matrices by shrinking the parameters of the Cholesky decomposition toward zero. We demonstrate the performance of our model through two simulation studies and the analysis of data from a depression study. This article includes Supplementary Material available online.
System IDentification Programs for AirCraft (SIDPAC)
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2002-01-01
A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
Surface-water/ground-water interaction along reaches of the Snake River and Henrys Fork, Idaho
Hortness, Jon E.; Vidmar, Peter
2005-01-01
Declining water levels in the eastern Snake River Plain aquifer and decreases in spring discharges from the aquifer to the Snake River have spurred studies to improve understanding of the surface-water/ground-water interaction on the plain. This study was done to estimate streamflow gains and losses along specific reaches of the Snake River and Henrys Fork and to compare changes in gain and loss estimates to changes in ground-water levels over time. Data collected during this study will be used to enhance the conceptual model of the hydrologic system and to refine computer models of ground-water flow and surface-water/ground-water interactions. Estimates of streamflow gains and losses along specific subreaches of the Snake River and Henrys Fork, based on the results of five seepage studies completed during 2001–02, varied greatly across the study area, ranging from a loss estimate of 606 ft3/s in a subreach of the upper Snake River near Heise to a gain estimate of 3,450 ft3/s in a subreach of the Snake River that includes Thousand Springs. Some variations over time also were apparent in specific subreaches. Surface spring flow accounted for much of the inflow to subreaches having large gain estimates. Several subreaches alternately gained and lost streamflow during the study. Changes in estimates of streamflow gains and losses along some of the subreaches were compared with changes in water levels, measured at three different times during 2001–02, in adjacent wells. In some instances, a strong relation between changes in estimates of gains or losses and changes in ground-water levels was apparent.
Updating histological data on crown initiation and crown completion ages in southern Africans.
Reid, Donald J; Guatelli-Steinberg, Debbie
2017-04-01
To update histological data on crown initiation and completion ages in southern Africans. To evaluate implications of these data for studies that: (a) rely on these data to time linear enamel hypoplasias (LEHs), or, (b) use these data for comparison to fossil hominins. Initiation ages were calculated on 67 histological sections from southern Africans, with sample sizes ranging from one to 11 per tooth type. Crown completion ages for southern Africans were calculated in two ways. First, actual derived initiation ages were added to crown formation times for each histological section to obtain direct information on the crown completion ages of individuals. Second, average initiation ages from this study were added to average crown formation times of southern Africans from previous studies by Reid and coworkers that were based on larger samples. For earlier-initiating tooth types (all anterior teeth and first molars), there is little difference in ages of initiation and crown completion between this and previous studies. Differences increase as a function of initiation age, such that the greatest differences between this and previous studies for both initiation and crown completion ages are for the second and third molars. This study documents variation in initiation ages, particularly for later-initiating tooth types. It upholds the use of previously published histological aging charts for LEHs on anterior teeth. However, this study finds that ages of crown initiation and completion in second and third molars for this southern African sample are earlier than previously estimated. These earlier ages reduce differences between modern humans and fossil hominins for these developmental events in second and third molars. © 2017 Wiley Periodicals, Inc.
Li, H; Liu, J; Xiong, L; Zhang, H; Zhou, H; Yin, H; Jing, W; Li, J; Shi, Q; Wang, Y; Liu, J; Nie, L
2017-05-01
The softshell turtles (Trionychidae) are one of the most widely distributed reptile groups in the world, and fossils have been found on all continents except Antarctica. The phylogenetic relationships among members of this group have been previously studied; however, disagreements regarding its taxonomy, its phylogeography and divergence times are still poorly understood as well. Here, we present a comprehensive mitogenomic study of softshell turtles. We sequenced the complete mitochondrial genomes of 10 softshell turtles, in addition to the GenBank sequence of Dogania subplana, Lissemys punctata, Trionyx triunguis, which cover all extant genera within Trionychidae except for Cyclanorbis and Cycloderma. These data were combined with other mitogenomes of turtles for phylogenetic analyses. Divergence time calibration and ancestral reconstruction were calculated using BEAST and RASP software, respectively. Our phylogenetic analyses indicate that Trionychidae is the sister taxon of Carettochelyidae, and support the monophyly of Trionychinae and Cyclanorbinae, which is consistent with morphological data and molecular analysis. Our phylogenetic analyses have established a sister taxon relationship between the Asian Rafetus and the Asian Palea + Pelodiscus + Dogania + Nilssonia + Amyda, whereas a previous study grouped the Asian Rafetus with the American Apalone. The results of divergence time estimates and area ancestral reconstruction show that extant Trionychidae originated in Asia at around 108 million years ago (MA), and radiations mainly occurred during two warm periods, namely Late Cretaceous-Early Eocene and Oligocene. By combining the estimated divergence time and the reconstructed ancestral area of softshell turtles, we determined that the dispersal of softshell turtles out of Asia may have taken three routes. 
Furthermore, the times of dispersal seem to be in agreement with the time of the India-Asia collision and opening of the Bering Strait, which provide evidence for the accuracy of our estimation of divergence time. Overall, the mitogenomes of this group were used to explore the origin and dispersal route of Trionychidae and have provided new insights on the evolution of this group. © 2017 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2017 European Society For Evolutionary Biology.
2011-01-01
Background Charging for tuberculosis (TB) treatment could reduce completion rates, particularly in the poor. We identified and synthesised studies that measure costs of TB treatment, estimates of adherence and the potential impact of charging on treatment completion in China. Methods Inclusion criteria were primary research studies, including surveys and studies using qualitative methods, conducted in mainland China. We searched MEDLINE, PUBMED, EMBASE, Science Direct, HEED, CNKI to June 2010; and web pages of relevant Chinese and international organisations. Cost estimates were extracted, transformed, and expressed in absolute values and as a percentage of household income. Results Low income patients, defined at household or district level, pay a total of US$ 149 to 724 (RMB 1241 to 5228) for medical costs for a treatment course; as a percentage of annual household income, estimates range from 42% to 119%. One national survey showed 73% of TB patients at the time of the survey had interrupted or suspended treatment, and estimates from 9 smaller more recent studies showed that the proportion of patients at the time of the survey who had run out of drugs or were not taking them ranged from 3 to 25%. Synthesis of surveys and qualitative research indicate that cost is the most cited reason for default. Conclusions Despite a policy of free drug treatment for TB in China, health services charge all income groups, and costs are high. Adherence measured in cross sectional surveys is often low, and the cumulative failure to adhere is likely to be much higher. These findings may be relevant to those concerned with the development and spread of multi-drug resistant TB. New strategies need to take this into account and ensure patient adherence. PMID:21615930
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strand, S.E.; Grafstroem, G.; Kontestabile, E.
A risk of extravasation exists in all injection procedures. For radiopharmaceuticals, the absorbed dose at the injection site can be high because of high activity concentrations. In radionuclide therapy (RNT), this can cause deterministic effects such as tissue necrosis. To estimate the risk of extravasation, we studied various injection techniques at two nuclear medicine clinics. The frequency and magnitude of extravasations were studied in randomly selected patients. Clinic A used peripheral venous catheters (PVC), and clinic B used direct injections with injection needles (IN). At clinic A 203 patients were investigated and at clinic B 90. All of these patients were injected with either 99mTc-DTPA, 99mTc-MAA, 99mTc-MDP or pertechnetate. Both arms were imaged with a scintillation camera as soon as possible after the injection. In the case of an extravasation, the retention time at the injection site was determined with multiple imaging, together with volume estimates. The results for PVC-injected patients showed one complete extravasation. We also found that in 8% of these patients the remaining activity at the injection site was up to 2%. For the IN-injected patients there was none with complete extravasation. However, in 33% of these patients the remaining activity was up to 18%. The locally absorbed doses in these diagnostically investigated patients were estimated with the MIRD formalism to be up to 0.1 Sv (10 rem). Transforming these results to RNT, the absorbed doses can be up to 1000 times higher. In addition to the calculated absorbed doses, radionuclides localizing to the cell nucleus could enhance the effects.
Horai, S; Hayasaka, K; Kondo, R; Tsugane, K; Takahata, N
1995-01-01
We analyzed the complete mitochondrial DNA (mtDNA) sequences of three humans (African, European, and Japanese), three African apes (common and pygmy chimpanzees, and gorilla), and one orangutan in an attempt to estimate most accurately the substitution rates and divergence times of hominoid mtDNAs. Nonsynonymous substitutions and substitutions in RNA genes have accumulated with an approximately clock-like regularity. From these substitutions and under the assumption that the orangutan and African apes diverged 13 million years ago, we obtained a divergence time for humans and chimpanzees of 4.9 million years. This divergence time permitted calibration of the synonymous substitution rate (3.89 x 10(-8)/site per year). To obtain the substitution rate in the displacement (D)-loop region, we compared the three human mtDNAs and measured the relative abundance of substitutions in the D-loop region and at synonymous sites. The estimated substitution rate in the D-loop region was 7.00 x 10(-8)/site per year. Using both synonymous and D-loop substitutions, we inferred the age of the last common ancestor of the human mtDNAs as 143,000 +/- 18,000 years. The shallow ancestry of human mtDNAs, together with the observation that the African sequence is the most diverged among humans, strongly supports the recent African origin of modern humans, Homo sapiens sapiens. PMID:7530363
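The molecular-clock arithmetic in the abstract follows directly from the rates and calibration it reports; the 2% mean human pairwise D-loop divergence used below is an illustrative assumption chosen to show how the 143,000-year estimate arises, not a figure quoted in the abstract.

```python
# Molecular-clock arithmetic: divergence d accumulates on two lineages, so
# d = 2 * rate * T; inverting gives rate = d / (2*T) for calibration and
# T = d / (2*rate) for dating.

T_HUMAN_CHIMP = 4.9e6     # human-chimpanzee divergence time from the study, yr
SYN_RATE = 3.89e-8        # synonymous substitution rate, /site/yr
D_LOOP_RATE = 7.00e-8     # D-loop substitution rate, /site/yr

# Implied human-chimp synonymous divergence (back-calculated for illustration):
syn_divergence = 2 * SYN_RATE * T_HUMAN_CHIMP

# Dating the mtDNA ancestor with an assumed 2% mean pairwise D-loop divergence:
tmrca = 0.02 / (2 * D_LOOP_RATE)

print(f"implied synonymous divergence: {syn_divergence:.3f} /site")
print(f"mtDNA TMRCA: {tmrca:,.0f} years")   # ~143,000 yr, consistent with the abstract
```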
Regression to fuzziness method for estimation of remaining useful life in power plant components
NASA Astrophysics Data System (ADS)
Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.
2014-10-01
Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data is not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify the effectiveness of the methodology, it was benchmarked against the data-based simple linear regression model used for predictions, which was shown to perform equally to or worse than the presented methodology.
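A minimal sketch of the regression step described above: fit degradation versus time and extrapolate to an expert-defined failure threshold. The fuzzy-set modelling of operator experience is reduced here to a single triangular membership grading how "critical" the current reading is; the vibration readings, threshold, and membership bounds are all illustrative assumptions, not data from the paper.

```python
def linfit(ts, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return slope, my - slope * mt

def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy membership on [lo, hi] peaking at `peak`."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x <= peak else (hi - x) / (hi - peak)

# Hypothetical turbine-vibration readings (mm/s) over operating hours.
hours = [0, 500, 1000, 1500, 2000]
vib = [2.0, 2.6, 3.1, 3.7, 4.2]
FAILURE_THRESHOLD = 7.0      # assumed failure point for this parameter

slope, intercept = linfit(hours, vib)
t_fail = (FAILURE_THRESHOLD - intercept) / slope   # extrapolated failure time
rul = t_fail - hours[-1]                           # remaining useful life, h
criticality = tri_membership(vib[-1], 3.0, 7.0, 9.0)
print(f"RUL ≈ {rul:.0f} h, criticality grade {criticality:.2f}")
```

In the full methodology the fuzzy sets do more than grade a single reading: they encode the expert's value ranges that the regression is fused with, which is where the reported sensitivity to fuzzy-set selection comes from.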
NASA Astrophysics Data System (ADS)
Di Giacomo, Domenico; Bondár, István; Storchak, Dmitry A.; Engdahl, E. Robert; Bormann, Peter; Harris, James
2015-02-01
This paper outlines the re-computation and compilation of the magnitudes now contained in the final ISC-GEM Reference Global Instrumental Earthquake Catalogue (1900-2009). The catalogue is available via the ISC website (http://www.isc.ac.uk/iscgem/). The available re-computed MS and mb provided an ideal basis for deriving new conversion relationships to moment magnitude MW. Therefore, rather than using previously published regression models, we derived new empirical relationships using both generalized orthogonal linear and exponential non-linear models to obtain MW proxies from MS and mb. The new models were tested against true values of MW, and the newly derived exponential models were then preferred to the linear ones in computing MW proxies. For the final magnitude composition of the ISC-GEM catalogue, we preferred directly measured MW values as published by the Global CMT project for the period 1976-2009 (plus intermediate-depth earthquakes between 1962 and 1975). In addition, over 1000 publications have been examined to obtain direct seismic moment M0 and, therefore, also MW estimates for 967 large earthquakes during 1900-1978 (Lee and Engdahl, 2015) by various alternative methods to the current GCMT procedure. In all other instances we computed MW proxy values by converting our re-computed MS and mb values into MW, using the newly derived non-linear regression models. The final magnitude composition is an improvement in terms of magnitude homogeneity compared to previous catalogues. The magnitude completeness is not homogeneous over the 110 years covered by the ISC-GEM catalogue. Therefore, seismicity rate estimates may be strongly affected without a careful time window selection. In particular, the ISC-GEM catalogue appears to be complete down to MW 5.6 starting from 1964, whereas for the early instrumental period the completeness varies from ∼7.5 to 6.2. 
Further time and resources would be necessary to homogenize the magnitude of completeness over the entire catalogue length.
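The generalized orthogonal linear regression mentioned above has, in its simplest equal-error-variance form (total least squares), a closed-form solution. The sketch below shows that form; the MS-MW pairs are made up for illustration, and the catalogue work itself both generalized this to unequal variances and ultimately preferred exponential models.

```python
import math

def orthogonal_fit(xs, ys):
    """Total least squares (orthogonal regression) slope and intercept,
    assuming equal error variance on both variables -- a simplification of
    the generalized orthogonal regression used for the catalogue."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# Illustrative (made-up) MS vs MW pairs scattered around MW = 0.67*MS + 2.1:
ms = [5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0]
mw = [5.46, 5.77, 6.13, 6.44, 6.82, 7.11, 7.48]
slope, intercept = orthogonal_fit(ms, mw)
print(f"MW proxy: MW ≈ {slope:.2f} * MS + {intercept:.2f}")
```

Unlike ordinary least squares, this fit treats errors in both magnitudes symmetrically, which matters because MS and mb carry measurement uncertainty just as MW does.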
Automated ambiguity estimation for VLBI Intensive sessions using L1-norm
NASA Astrophysics Data System (ADS)
Kareinen, Niko; Hobiger, Thomas; Haas, Rüdiger
2016-12-01
Very Long Baseline Interferometry (VLBI) is a space-geodetic technique that is uniquely capable of direct observation of the angle of the Earth's rotation about the Celestial Intermediate Pole (CIP) axis, namely UT1. The daily estimates of the difference between UT1 and Coordinated Universal Time (UTC) provided by the 1-h long VLBI Intensive sessions are essential in providing timely UT1 estimates for satellite navigation systems and orbit determination. In order to produce timely UT1 estimates, efforts have been made to completely automate the analysis of VLBI Intensive sessions. This involves the automatic processing of X- and S-band group delays. These data contain an unknown number of integer ambiguities in the observed group delays. They are introduced as a side-effect of the bandwidth synthesis technique, which is used to combine correlator results from the narrow channels that span the individual bands. In an automated analysis with the c5++ software the standard approach in resolving the ambiguities is to perform a simplified parameter estimation using a least-squares adjustment (L2-norm minimisation). We implement L1-norm as an alternative estimation method in c5++. The implemented method is used to automatically estimate the ambiguities in VLBI Intensive sessions on the Kokee-Wettzell baseline. The results are compared to an analysis set-up where the ambiguity estimation is computed using the L2-norm. For both methods three different weighting strategies for the ambiguity estimation are assessed. The results show that the L1-norm is better at automatically resolving the ambiguities than the L2-norm. The use of the L1-norm leads to a significantly higher number of good quality UT1-UTC estimates with each of the three weighting strategies. The increase in the number of sessions is approximately 5% for each weighting strategy. This is accompanied by smaller post-fit residuals in the final UT1-UTC estimation step.
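The robustness of the L1-norm to unresolved ambiguity jumps can be seen in the simplest possible case: estimating a single offset parameter, where least squares reduces to the mean and least absolute deviations to the median. The residuals below are illustrative numbers, not c5++ output.

```python
import statistics

# Group-delay residuals (ns) on a baseline; a few observations carry an
# unresolved ambiguity jump of about +100 ns (illustrative numbers).
residuals = [0.2, -0.4, 0.1, 0.3, -0.2, 0.0, 100.1, 99.8, -0.1, 0.2]

# For a single offset parameter the two norms have closed-form solutions:
l2_estimate = statistics.mean(residuals)    # least squares -> mean
l1_estimate = statistics.median(residuals)  # least absolute deviations -> median

print(f"L2 offset: {l2_estimate:.2f} ns, L1 offset: {l1_estimate:.2f} ns")
```

The L1 estimate stays near zero despite the ambiguity outliers while the L2 estimate is dragged far off, which is the intuition behind the better automated ambiguity resolution reported above; the multi-parameter L1 problem in the actual analysis requires an iterative solver rather than a median.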
Regression analysis for bivariate gap time with missing first gap time data.
Huang, Chia-Hui; Chen, Yi-Hau
2017-01-01
We consider ordered bivariate gap time while data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and gap time between HIV and AIDS diagnoses. Besides, the association between the two gap times is also of interest. Accordingly, in the data analysis we are faced with two-fold complexity, namely data on the first gap time is completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap time under the complexity of the data. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application of data from the HIV and AIDS study mentioned above is reported for illustration.
Three-Minute All-Out Test in Swimming.
Tsai, Ming-Chang; Thomas, Scott G
2017-01-01
To validate the 3-minute all-out exercise test (3MT) protocol against the traditional critical-speed (CS) model (CSM) in front-crawl swimming. Ten healthy swimmers or triathletes (mean ± SD age 35.2 ± 10.5 y, height 176.5 ± 5.4 cm, body mass 69.6 ± 8.2 kg) completed 5 tests (3MT, 100m, 200m, 400m, 800m) over 2 wk on separate days. Traditional CS and anaerobic distance capacity (D') were determined for each of the 3 traditional CSMs (linear distance-time, LIN; linear speed/time, INV; nonlinear time-speed, NLIN) from the 4 set-distance time trials. For the 3MT, CS was determined as the mean speed during the final 30 s of the test and D' was estimated as the power-time integral above the CS. Our results indicated no significant difference between the CS estimates determined from the traditional CSM and 3MT except for the INV model (P = .0311). Correlations between traditional CSMs and 3MT were high (r = .95, P < .01). However, D' differed and post hoc analysis indicated that D' estimated from 3MT was significantly lower than LIN (P = .0052) and NLIN (P < .0001). Correlations were weak (r < .55, P > .1). In addition, Bland-Altman plots between the traditional CSMs and 3MT CS estimates showed scattered points above and below the zero line, suggesting there is no consistent bias of one approach versus the other. The 3MT is a valid protocol for swimming to estimate CS. The demonstrated concurrent validity of the 3MT may allow more widespread use of CSMs to evaluate participants and responses to training.
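The 3MT definitions above (CS as the mean speed over the final 30 s, D' as the speed-time integral above CS) can be computed directly from a speed trace. The 1 Hz trace below is synthetic and illustrative, not data from the study.

```python
# Sketch of the 3MT computation on a synthetic 1 Hz speed trace that decays
# from an initial all-out speed to a steady end-test speed.

DT = 1.0   # sampling interval, s
speeds = [1.60 - 0.20 * min(t, 120) / 120 for t in range(180)]  # m/s

# CS: mean speed over the final 30 s of the 3-minute test.
cs = sum(speeds[-30:]) / 30

# D': integral of (speed - CS) over the portion of the test above CS, in m.
d_prime = sum((v - cs) * DT for v in speeds if v > cs)

print(f"CS = {cs:.2f} m/s, D' = {d_prime:.1f} m")
```

With real data the same two lines apply at whatever sampling rate the speed (or power) was recorded; only DT changes.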
Zhang, Adah S.; Ostrom, Quinn T.; Kruchko, Carol; Rogers, Lisa; Peereboom, David M.
2017-01-01
Abstract Background. Complete prevalence proportions illustrate the burden of disease in a population. This study estimates the 2010 complete prevalence of malignant primary brain tumors overall and by Central Brain Tumor Registry of the United States (CBTRUS) histology groups, and compares the brain tumor prevalence estimates to the complete prevalence of other common cancers as determined by the Surveillance, Epidemiology, and End Results Program (SEER) by age at prevalence (2010): children (0–14 y), adolescent and young adult (AYA) (15–39 y), and adult (40+ y). Methods. Complete prevalence proportions were estimated using a novel regression method extended from the Completeness Index Method, which combines survival and incidence data from multiple sources. In this study, two datasets, CBTRUS and SEER, were used to calculate the complete prevalence estimates of interest. Results. Complete prevalence for malignant primary brain tumors was 47.59/100000 population (22.31, 48.49, and 57.75/100000 for the child, AYA, and adult populations). The most prevalent cancers by age were childhood leukemia (36.65/100000), AYA melanoma of the skin (66.21/100000), and adult female breast cancer (1949.00/100000). The most prevalent CBTRUS histologies were pilocytic astrocytoma in children and AYA (6.82/100000, 5.92/100000) and glioblastoma in adults (12.76/100000). Conclusions. The relative impact of malignant primary brain tumors is higher among children than any other age group; they emerge as the second most prevalent cancer among children. Complete prevalence estimates for primary malignant brain tumors fill a gap in overall cancer knowledge, providing critical information for public health and health care planning, including treatment, decision making, funding, and advocacy programs. PMID:28039365
Moseley, H N; Lee, W; Arrowsmith, C H; Krishna, N R
1997-05-06
We report a quantitative analysis of the 13C-edited intermolecular transferred NOESY (inter-TrNOESY) spectrum of the trp-repressor/operator complex (trp-rep/op) with [ul-13C/15N]-L-tryptophan corepressor using a computer program implementing complete relaxation and conformational exchange matrix (CORCEMA) methodology [Moseley et al. (1995) J. Magn. Reson. 108B, 243-261]. Using complete mixing time curves of three inter-TrNOESY peaks between the tryptophan and the Trp-rep/op, this self-consistent analysis determined the correlation time of the bound species (tauB = 13.5 ns) and the exchange off-rate (k(off) = 3.6 s(-1)) of the corepressor. In addition, the analysis estimated the correlation time of the free species (tauF approximately 0.15 ns). Also, we demonstrate the sensitivity of these inter-TrNOESY peaks to several factors including the k(off) and orientation of the tryptophan corepressor within the binding site. The analysis indicates that the crystal structure orientation for the corepressor is compatible with the solution NMR data.
Global Project Management: Graduate Course
2006-01-01
They take advantage of the fact that the other party assumes the counterparty is acting in good faith and telling the truth. One common trick is...
DOD Manufacturing Arsenals: Actions Needed to Identify and Sustain Critical Capabilities
2015-11-01
to each develop their own unique method. A senior OSD official described the resulting process as unsound. Each manufacturing arsenal declared what...
Rapid and Iterative Estimation of Predictions of High School Graduation and Other Milestones
ERIC Educational Resources Information Center
Porter, Kristin E.; Balu, Rekha; Gunton, Brad; Pestronk, Jefferson; Cohen, Allison
2016-01-01
With the advent of data systems that allow for frequent or even real-time student data updates, and recognition that high school students often can move from being on-track to graduation to off-track in a matter of weeks, indicator analysis alone may not provide a complete picture to guide school leaders' actions. The authors of this paper suggest…
Naval Sea Systems Command On Watch 2010
2010-01-01
surface targets, such as zodiacs and fast patrol boats found in the littoral environment. As for future capabilities and goals for the program...
Estimating Noise in the Hydrogen Epoch of Reionization Array
NASA Astrophysics Data System (ADS)
Englund Mathieu, Philip; HERA Team
2017-01-01
The Hydrogen Epoch of Reionization Array (HERA) is a radio telescope dedicated to observing large scale structure during and prior to the epoch of reionization. Once completed, HERA will have unprecedented sensitivity to the 21-cm signal from hydrogen reionization. This poster will present time- and frequency-subtraction methods and results from a preliminary analysis of the noise characteristics of the nineteen-element pathfinder array.
Comprehensive Energy and Water Master Plan, Redstone Arsenal
2009-01-01
Facility Energy Decision System (FEDS) analysis completed by the Pacific Northwest National Laboratory (PNNL). This model presents a clear picture of...steam options analysis conducted by PNNL, giving priority to strategies that maximize the use of waste for...
Federal Research and Development Funding: FY2012
2011-03-29
...advisor, John Holdren, have raised concerns about the potential harm of a "boom-bust" approach to federal R&D funding (i.e., rapid growth in federal R...
2017-12-01
MANAGEMENT: MAXIMIZING THE INFLUENCE OF EXTERNAL SPONSORS OVER AFFILIATED ARMED GROUPS, by Anders C. Hamlin, December 2017. Thesis Co...
Mechanisms of Dynamic Deformation and Dynamic Failure in Aluminum Nitride
2012-06-01
Trauma Informed Guilt Reduction (TrIGR) Intervention
2017-10-01
AWARD NUMBER: W81XWH-15-1-0330. TITLE: Trauma-Informed Guilt Reduction (TrIGR) Intervention. PRINCIPAL INVESTIGATOR: Sonya Norman, PhD. CONTRACTING...
SWCC Prediction: Seep/W Add-In Functions
2017-11-01
acquire this information is to investigate from which soil data set the predictive method was derived. ERDC/GSL SR-17-4...
Fake News, Conspiracy Theories, and Lies: An Information Laundering Model for Homeland Security
2018-03-01
THEORIES, AND LIES: AN INFORMATION LAUNDERING MODEL FOR HOMELAND SECURITY, by Samantha M. Korta, March 2018. Co-Advisors: Rodrigo Nieto...
Hacking the Silos: Eliminating Information Barriers Between Public Health and Law Enforcement
2018-03-01
ELIMINATING INFORMATION BARRIERS BETWEEN PUBLIC HEALTH AND LAW ENFORCEMENT, by Cody L. Minks, March 2018. Thesis Advisor: Anke Richter...
Computer Directed Training System (CDTS), User’s Manual
1983-07-01
lessons, together with an estimate of the time required for completion. a. BSC010. This lesson in BASIC (Beginners All Purpose Symbolic Instruction Code)...Figure A2-1. Training Systems Manager and Training Monitors Responsibility Flowchart...training at the site. Therefore, the TSM must be knowledgeable in the various tasks required. Figure A2-1 illustrates the position in the flowchart. These...
Justification of Estimates for Fiscal Year 1983 Submitted to Congress.
1982-02-01
hierarchies to aid software production; completion of the components of an adaptive suspension vehicle, including a storage energy unit, hydraulics, laser...and corrosion (long storage times), and radiation-induced breakdown. Solid-lubricated main engine bearings for cruise missile engines would offer...environments will cause "soft errors" (computational and memory storage errors) in advanced microelectronic circuits. Research on high-speed, low-power...
2013-04-01
...diagnostics indicate the need for immediate action (Sheridan, 1970, 1980). Consequently, vigilance has a critical impact in a wide range of automated...activating system) needed for continued alertness. Consequently, lethargy increases in observers and signal detection is reduced. However, recent findings...
Contrast Analysis for Side-Looking Sonar
2013-09-30
bound for shadow depth that can be used to validate modeling tools such as SWAT (Shallow Water Acoustics Toolkit). • Adaptive Postprocessing: Tune image...
PISCES The Commander’s Tool for an Effective Exit Strategy
2003-05-16
...international "police-like" capability, including training in SWAT tactics and non-lethal weapons. This force would specialize in Preventive Intervention...
Time to Improve U.S. Defense Structure for the Western Hemisphere
2009-01-01
...diverse as the United States, Bolivia, and Saint Kitts and Nevis. Classical military threats that characterized the bipolar world do not...strategy is in the offing, seeking strategic relationships with France, Russia, and other extraregional actors. The United States needs to consider...
Shallow Water Reverberation Measurement and Prediction
1994-06-01
The three-dimensional Hamiltonian Acoustic Ray-tracing Program for the Ocean (HARPO) was used as the primary propagation modeling tool. The temporal signal processing consisted of a short-time Fourier transform spectral estimation method applied to data from a single hydrophone...summarizes the work completed and discusses lessons learned. Advice regarding future work to refine the present study will be provided.
Influence of the Pulse Duration in the Anthropomorphic Test Device (ATD) Lower-Leg Loading Mechanics
2015-08-01
...its specialists for supporting the numerical analysis, experiment, data processing, and various discussions that gave me valuable ideas.
Towards a General Theory of Counterdeception
2015-02-20
AFRL-OSR-VA-TR-2015-0067. TOWARDS A GENERAL THEORY OF COUNTERDECEPTION, Scott Craver, Research Foundation of State University of New York. Final...
Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai
2016-01-01
Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
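Two of the ingredients named above are easy to make concrete: the Gaussian membership of a sample to a cluster center, and the three evaluation criteria (MAE, RMSE, MARE). A minimal sketch with illustrative travel-time numbers, not the paper's data; the function names are my own:

```python
import math

def gaussian_membership(x, center, sigma):
    """Membership degree of sample x to a cluster with the given center."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def mae(actual, pred):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root mean square error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def mare(actual, pred):
    """Mean absolute relative error; assumes no zero actual values."""
    return sum(abs(a - p) / abs(a) for a, p in zip(actual, pred)) / len(actual)

# Illustrative link travel times (seconds): observed vs. estimated
obs = [120.0, 150.0, 90.0, 200.0]
est = [110.0, 155.0, 100.0, 190.0]
print(mae(obs, est), round(rmse(obs, est), 2), round(mare(obs, est), 4))
```

MARE normalizes each error by the observed value, which is why it is reported alongside MAE and RMSE when links with very different travel times are compared.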
2012-01-01
Background Although it has proven to be an important foundation for investigations of carnivoran ecology, biology and evolution, the complete species-level supertree for Carnivora of Bininda-Emonds et al. is showing its age. Additional, largely molecular sequence data are now available for many species and the advancement of computer technology means that many of the limitations of the original analysis can now be avoided. We therefore sought to provide an updated estimate of the phylogenetic relationships within all extant Carnivora, again using supertree analysis to be able to analyze as much of the global phylogenetic database for the group as possible. Results In total, 188 source trees were combined, representing 114 trees from the literature together with 74 newly constructed gene trees derived from nearly 45,000 bp of sequence data from GenBank. The greater availability of sequence data means that the new supertree is almost completely resolved and also better reflects current phylogenetic opinion (for example, supporting a monophyletic Mephitidae, Eupleridae and Prionodontidae; placing Nandinia binotata as sister to the remaining Feliformia). Following an initial rapid radiation, diversification rate analyses indicate a downturn in the net speciation rate within the past three million years as well as a possible increase some 18.0 million years ago; numerous diversification rate shifts within the order were also identified. Conclusions Together, the two carnivore supertrees remain the only complete phylogenetic estimates for all extant species and the new supertree, like the old one, will form a key tool in helping us to further understand the biology of this charismatic group of carnivores. PMID:22369503
The Value Of Enhanced Neo Surveys
NASA Astrophysics Data System (ADS)
Harris, Alan W.
2012-10-01
NEO surveys have now achieved, more or less, the “Spaceguard Goal” of cataloging 90% of NEAs larger than 1 km in diameter, and have thereby reduced the short-term hazard from cosmic impacts by about an order of magnitude: from an actuarial estimate of 1,000 deaths per year (actually about a billion every million years, with very little in between) to about 100 deaths per year, with a shift toward smaller but more frequent events accounting for the remaining risk. It is fair to ask, then, what the value is of a next-generation accelerated survey to “retire” much of the remaining risk. The curve of survey completion versus NEA size is remarkably similar for any survey, ground or space based, visible light or thermal IR, so it is possible to integrate risk over all sizes, with a time-variable curve of completion, to evaluate the actuarial value of speeding up survey completion. I will present my latest estimates of the NEA population and of survey completion. From those I will estimate the “value” of accelerated surveys such as Pan-STARRS, LSST, or space-based surveys, versus continuing with current surveys. My tentative conclusion is that we may have already reached the point, in cost-benefit terms, where accelerated surveys are not cost-effective at reducing impact risk; if we have not reached it yet, we soon will. On the other hand, the surveys, which find and catalog main-belt and other classes of small bodies as well as NEOs, have provided a gold mine of good science. The scientific value of continued or accelerated surveys needs to be emphasized as the impact risk is increasingly “retired.”
The impact of HIV and ART on recurrent tuberculosis in a sub-Saharan setting.
Houben, Rein M G J; Glynn, Judith R; Mboma, Sebastian; Mzemba, Themba; Mwaungulu, Nimrod J; Mwaungulu, Lorren; Mwenibabu, Michael; Mpunga, James; French, Neil; Crampin, Amelia C
2012-11-13
To estimate the impact of antiretroviral therapy (ART) on the incidence of recurrent tuberculosis (TB) in an African population. A long-term population cohort in Karonga District, northern Malawi. Patients who had completed treatment for laboratory-confirmed TB diagnosed since 1996 were visited annually to record vital status, ART use and screen for TB. Survival analysis estimated the effect of HIV/ART status at completion of treatment on mortality and recurrence. Analyses were stratified by time since treatment completion to estimate the effects on relapse (predominates during first year) and reinfection disease (predominates later). Among 1133 index TB cases contributing 4353 person-years of follow-up, there were 307 deaths and 103 laboratory-confirmed recurrences (recurrence rate 4.6 per 100 person-years). Half the recurrences occurred in the first year since completing treatment. HIV infection increased the recurrence rate [rate ratio adjusted for age, sex, period and TB type 2.69, 95% confidence interval (CI) 1.69-4.26], but with less effect in the first year (adjusted rate ratio 1.71, 95% CI 0.87-3.35) than subsequently (adjusted rate ratio 4.2, 95% CI 2.16-8.15). Recurrence rates on ART were intermediate between those of HIV-negative individuals and HIV-positive individuals without ART. Compared with HIV-positive individuals without ART, the adjusted rate ratio was 0.74 (95% CI 0.27-2.06) in the first year, and 0.43 (95% CI 0.11-1.73) later. The increased incidence of TB recurrence observed in HIV-positive patients appeared to be reduced by ART. The effects are mostly on later (likely reinfection) disease so the impact of ART on reducing recurrence will be highest in high TB incidence settings.
COSMOGRAIL XVII: Time Delays for the Quadruply Imaged Quasar PG 1115+080
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonvin, V.; et al.
We present time-delay estimates for the quadruply imaged quasar PG 1115+080. Our results are based on almost daily observations for seven months at the ESO MPIA 2.2 m telescope at La Silla Observatory, reaching a signal-to-noise ratio of about 1000 per quasar image. In addition, we re-analyse existing light curves from the literature, which we complete with an additional three seasons of monitoring with the Mercator telescope at La Palma Observatory. When exploring possible sources of bias we consider the so-called microlensing time delay, a potential source of systematic error never before directly accounted for in time-delay publications. In fifteen years of data on PG 1115+080, we find no strong evidence of microlensing time delay. Therefore, not accounting for this effect, our time-delay estimates on the individual data sets are in good agreement with each other and with the literature. Combining the data sets, we obtain the most precise time-delay estimates to date on PG 1115+080: Δt(AB) = 8.3 (+1.5, −1.6) days (18.7% precision), Δt(AC) = 9.9 ± 1.1 days (11.1%), and Δt(BC) = 18.8 ± 1.6 days (8.5%). Turning these time delays into cosmological constraints is done in a companion paper that makes use of ground-based adaptive optics (AO) with the Keck telescope.
NASA Astrophysics Data System (ADS)
Christos, Kourouklas; Eleftheria, Papadimitriou; George, Tsaklidis; Vassilios, Karakostas
2018-06-01
The determination of the recurrence time of strong earthquakes above a predefined magnitude, associated with specific fault segments, is an important component of seismic hazard assessment. The occurrence of these earthquakes is neither periodic nor completely random but often clustered in time. This fact, together with their limited number due to the short span of the available catalogs, inhibits a deterministic approach to recurrence time calculation, and for this reason the application of stochastic processes is required. In this study, recurrence times in the area of the North Aegean Trough (NAT) are determined by applying time-dependent stochastic models, introducing an elastic rebound motivated concept for individual fault segments located in the study area. For this purpose, all the available information on strong earthquakes (historical and instrumental) with Mw ≥ 6.5 is compiled and examined for magnitude completeness. Two possible starting dates of the catalog are assumed with the same magnitude threshold, Mw ≥ 6.5, and the data are divided into five sets according to a new segmentation model for the study area. Three Brownian Passage Time (BPT) models with different levels of aperiodicity are applied and evaluated with the Anderson-Darling test for each segment, in both catalog versions where possible. The preferable models are then used to estimate the occurrence probabilities of Mw ≥ 6.5 shocks on each segment of the NAT for the next 10, 20, and 30 years from 01/01/2016. Uncertainties in the probability calculations are also estimated using a Monte Carlo procedure. These results should be treated carefully because they depend on the initial assumptions, which exhibit large variability; alternative choices may return different final results.
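The conditional occurrence probabilities described above can be sketched under the standard BPT parameterization (mean recurrence time mu and aperiodicity alpha), with the CDF obtained by simple trapezoidal integration. The parameter values below are hypothetical, not the estimates for the NAT segments:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean mu
    and aperiodicity (coefficient of variation) alpha."""
    return math.sqrt(mu / (2 * math.pi * alpha**2 * t**3)) * \
           math.exp(-((t - mu) ** 2) / (2 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, steps=20000):
    """CDF by trapezoidal integration from (near) 0 to t."""
    if t <= 0:
        return 0.0
    h = t / steps
    total = 0.5 * (bpt_pdf(1e-9, mu, alpha) + bpt_pdf(t, mu, alpha))
    total += sum(bpt_pdf(i * h, mu, alpha) for i in range(1, steps))
    return total * h

def conditional_prob(elapsed, window, mu, alpha):
    """P(event within `window` years | quiescent for `elapsed` years)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + window, mu, alpha)
    return (f1 - f0) / (1.0 - f0)

# Hypothetical segment: mean recurrence 150 yr, alpha = 0.5,
# 80 yr elapsed since the last Mw >= 6.5 event, 30-yr forecast window.
print(round(conditional_prob(80.0, 30.0, 150.0, 0.5), 4))
```

The conditioning on elapsed quiescent time is what makes the model "time-dependent": unlike a Poisson model, the forecast probability changes as the segment accumulates time since its last event.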
Alternative nuclear technologies
NASA Astrophysics Data System (ADS)
Schubert, E.
1981-10-01
The lead times required to develop a select group of nuclear fission reactor types and fuel cycles to the point of readiness for full commercialization are compared. Along with lead times, fuel material requirements and comparative costs of producing electric power were estimated. A conservative approach and consistent criteria for all systems were used in estimating the steps required and the times involved in developing each technology. The impact of the inevitable exhaustion of the low- or reasonable-cost uranium reserves in the United States on the desirability of completing the breeder reactor program, with its favorable long-term effect on fission fuel supplies, is discussed. The long times projected to bring the most advanced alternative converter reactor technologies (the heavy water reactor and the high-temperature gas-cooled reactor) into commercial deployment, compared with the time projected to bring the breeder reactor into equivalent status, suggest that the country's best choice is to develop the breeder. The perceived diversion-proliferation problems with the uranium-plutonium fuel cycle have workable solutions that can be developed, enabling the use of those materials at substantially reduced levels of diversion risk.
Bayesian dynamic mediation analysis.
Huang, Jing; Yuan, Ying
2017-12-01
Most existing methods for mediation analysis assume that mediation is a stationary, time-invariant process, which overlooks the inherently dynamic nature of many human psychological processes and behavioral activities. In this article, we consider mediation as a dynamic process that continuously changes over time. We propose Bayesian multilevel time-varying coefficient models to describe and estimate such dynamic mediation effects. By taking the nonparametric penalized spline approach, the proposed method is flexible and able to accommodate any shape of the relationship between time and mediation effects. Simulation studies show that the proposed method works well and faithfully reflects the true nature of the mediation process. By modeling the mediation effect nonparametrically as a continuous function of time, our method provides a valuable tool to help researchers obtain a more complete understanding of the dynamic nature of the mediation process underlying psychological and behavioral phenomena. We also briefly discuss an alternative approach that uses a dynamic autoregressive mediation model to estimate the dynamic mediation effect. Computer code is provided to implement the proposed Bayesian dynamic mediation analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Loop transfer recovery for general nonminimum phase discrete time systems. I - Analysis
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Sannuti, Peddapullaiah; Shamash, Yacov
1992-01-01
A complete analysis of loop transfer recovery (LTR) for general nonstrictly proper, not necessarily minimum phase discrete time systems is presented. Three different observer-based controllers, namely, "prediction estimator" and full- or reduced-order "current estimator" based controllers, are used. The analysis corresponding to all three controllers is unified into a single mathematical framework. The LTR analysis given here focuses on three fundamental issues: (1) the recoverability of a target loop when it is arbitrarily given, (2) the recoverability of a target loop while taking into account its specific characteristics, and (3) the establishment of necessary and sufficient conditions on the given system so that it has at least one recoverable target loop transfer function or sensitivity function. Various differences that arise in LTR analysis of continuous and discrete systems are pointed out.
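The prediction-estimator and current-estimator families named above differ in whether the state estimate is corrected with the previous or the current measurement. A scalar sketch of that distinction; the plant, gains, and the scalar simplification are illustrative assumptions, not the paper's systems:

```python
# Sketch: prediction vs. current estimator for a scalar discrete-time
# system x[k+1] = a*x[k] + b*u[k], y[k] = c*x[k].
# The plant and gain values below are made up for illustration.

a, b, c = 0.9, 1.0, 1.0
kp, kc = 0.5, 0.5   # estimator gains (chosen so error dynamics are stable)

def prediction_estimator(xhat, u, y):
    """Correct with the current output, then predict one step ahead:
    xhat[k+1] = a*xhat[k] + b*u[k] + kp*(y[k] - c*xhat[k])."""
    return a * xhat + b * u + kp * (y - c * xhat)

def current_estimator(xhat, u, y_next):
    """Predict first, then correct with the *next* measurement:
    xbar = a*xhat + b*u;  xhat[k+1] = xbar + kc*(y[k+1] - c*xbar)."""
    xbar = a * xhat + b * u
    return xbar + kc * (y_next - c * xbar)

# Check: estimation error should decay for both observers.
x, xh_p, xh_c = 5.0, 0.0, 0.0
for k in range(50):
    u = 0.0
    y = c * x
    x_next = a * x + b * u
    xh_p = prediction_estimator(xh_p, u, y)
    xh_c = current_estimator(xh_c, u, c * x_next)
    x = x_next
print(abs(x - xh_p) < 1e-3, abs(x - xh_c) < 1e-3)
```

Because the current estimator uses the most recent measurement, its correction acts one sample earlier; in the scalar case the error contractions are (a − kp·c) versus (1 − kc·c)·a, both stable here by construction.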
Title I preliminary engineering for: A. S. E. F. solid waste to methane gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1976-01-01
An assignment to provide preliminary engineering of an Advanced System Experimental Facility for production of methane gas from urban solid waste by anaerobic digestion is documented. The experimental facility will be constructed on an existing solid waste shredding and landfill facility in Pompano Beach, Florida. Information is included on: general description of the project; justification of basic need; process design; preliminary drawings; outline specifications; preliminary estimate of cost; and time schedules for accomplishment of design and construction. The preliminary cost estimate for the design and construction phases of the experimental program is $2,960,000, based on Dec. 1975 and Jan. 1976 costs. A time schedule of eight months to complete the detailed design, equipment procurement, and award of subcontracts is given.
How many species of flowering plants are there?
Joppa, Lucas N.; Roberts, David L.; Pimm, Stuart L.
2011-01-01
We estimate the probable number of flowering plants. First, we apply a model that explicitly incorporates taxonomic effort over time to estimate the number of as-yet-unknown species. Second, we ask taxonomic experts their opinions on how many species are likely to be missing, on a family-by-family basis. The results are broadly comparable. We show that the current number of species should grow by between 10 and 20 per cent. There are, however, interesting discrepancies between expert and model estimates for some families, suggesting that our model does not always completely capture patterns of taxonomic activity. The as-yet-unknown species are probably similar to those taxonomists have described recently—overwhelmingly rare and local, and disproportionately in biodiversity hotspots, where there are high levels of habitat destruction. PMID:20610425
DOE Office of Scientific and Technical Information (OSTI.GOV)
Losey, London M; Andres, Robert Joseph; Marland, Gregg
2006-12-01
Detailed understanding of global carbon cycling requires estimates of CO2 emissions on temporal and spatial scales finer than annual and country. This is the first attempt to derive such estimates for a large, developing, Southern Hemisphere country. Though data on energy use are not complete in terms of time and geography, there are enough data available on the sale or consumption of fuels in Brazil to reasonably approximate the temporal and spatial patterns of fuel use and CO2 emissions. Given the available data, a strong annual cycle in emissions from Brazil is not apparent. CO2 emissions are unevenly distributed within Brazil as the population density and level of development both vary widely.
Richards, Joseph M.; Green, W. Reed
2013-01-01
Millwood Lake, in southwestern Arkansas, was constructed and is operated by the U.S. Army Corps of Engineers (USACE) for flood-risk reduction, water supply, and recreation. The lake was completed in 1966 and it is likely that with time sedimentation has resulted in the reduction of storage capacity of the lake. The loss of storage capacity can cause less water to be available for water supply, and lessens the ability of the lake to mitigate flooding. Excessive sediment accumulation also can cause a reduction in aquatic habitat in some areas of the lake. Although many lakes operated by the USACE have periodic bathymetric and sediment surveys, none have been completed for Millwood Lake. In March 2013, the U.S. Geological Survey (USGS), in cooperation with the USACE, surveyed the bathymetry of Millwood Lake to prepare an updated bathymetric map and area/capacity table. The USGS also collected sediment thickness data in June 2013 to estimate the volume of sediment accumulated in the lake.
Blocking for Sequential Political Experiments
Moore, Sally A.
2013-01-01
In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
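The biased-coin idea underlying such sequential designs can be illustrated with a deliberately simplified sketch: each arriving unit is assigned, with high probability, to the arm that leaves the two groups' covariate means closer together. The function name, the single-covariate mean-difference imbalance measure, and the bias probability p below are illustrative assumptions, not the authors' sequential blocking method.

```python
import random

def biased_coin_assign(new_x, treated, control, p=0.8, rng=None):
    """Assign an arriving unit with continuous covariate new_x to "T" or "C".

    With probability p, pick the arm that leaves the two groups' covariate
    means closer together (an Efron-style biased coin, generalized to one
    continuous covariate); otherwise, or on ties, randomize fairly.
    """
    rng = rng or random.Random()

    def imbalance(t_vals, c_vals):
        mean = lambda v: sum(v) / len(v) if v else 0.0
        return abs(mean(t_vals) - mean(c_vals))

    imb_t = imbalance(treated + [new_x], control)   # imbalance if assigned T
    imb_c = imbalance(treated, control + [new_x])   # imbalance if assigned C
    if imb_t == imb_c or rng.random() >= p:
        return rng.choice(["T", "C"])
    return "T" if imb_t < imb_c else "C"
```

With p near 1 the rule is almost deterministic; p = 0.5 recovers complete randomization, which is the comparison baseline in the abstract.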
Combinatorics of least-squares trees.
Mihaescu, Radu; Pachter, Lior
2008-09-09
A recurring theme in the least-squares approach to phylogenetics has been the discovery of elegant combinatorial formulas for the least-squares estimates of edge lengths. These formulas have proved useful for the development of efficient algorithms, and have also been important for understanding connections among popular phylogeny algorithms. For example, the selection criterion of the neighbor-joining algorithm is now understood in terms of the combinatorial formulas of Pauplin for estimating tree length. We highlight a phylogenetically desirable property that weighted least-squares methods should satisfy, and provide a complete characterization of methods that satisfy the property. The necessary and sufficient condition is a multiplicative four-point condition that the variance matrix needs to satisfy. The proof is based on the observation that the Lagrange multipliers in the proof of the Gauss-Markov theorem are tree-additive. Our results generalize and complete previous work on ordinary least squares, balanced minimum evolution, and the taxon-weighted variance model. They also provide a time-optimal algorithm for computation.
Integrated survival analysis using an event-time approach in a Bayesian framework
Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.
2015-01-01
Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times, which may be interval-censored, from individuals whose fates are known with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nests were within agricultural habitats.
Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
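The piecewise-constant hazard at the core of the application can be sketched in a few lines: the survival probability is the exponential of the negative cumulative hazard, accumulated interval by interval. This is a generic textbook construction, not the authors' integrated Bayesian model; the breakpoints and rates below are arbitrary illustrations.

```python
import math

def survival_prob(t, breakpoints, hazards):
    """Survival S(t) = exp(-cumulative hazard) under a piecewise-constant
    hazard. `breakpoints` are the interval right endpoints (increasing);
    `hazards` gives the constant rate on each interval, so
    len(hazards) == len(breakpoints)."""
    cum = 0.0
    left = 0.0
    for right, h in zip(breakpoints, hazards):
        if t <= left:
            break
        # Accumulate hazard over the overlap of [left, right) with [0, t).
        cum += h * (min(t, right) - left)
        left = right
    return math.exp(-cum)
```

For example, a high early hazard on days 0-5 and a low hazard afterward mimics the abstract's finding that mortality was highest for chicks under 5 days old.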
Trigonometric parallaxes for 1507 nearby mid-to-late M dwarfs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dittmann, Jason A.; Irwin, Jonathan M.; Charbonneau, David
The MEarth survey is a search for small rocky planets around the smallest, nearest stars to the Sun as identified by high proper motion with red colors. We augmented our planetary search time series with lower cadence astrometric imaging and obtained two million images of approximately 1800 stars suspected to be mid-to-late M dwarfs. We fit an astrometric model to MEarth's images for 1507 stars and obtained trigonometric distance measurements to each star with an average precision of 5 mas. Our measurements, combined with the Two Micron All Sky Survey photometry, allowed us to obtain an absolute K_s magnitude for each star. In turn, this allows us to better estimate the stellar parameters than those obtained with photometric estimates alone and to better prioritize the targets chosen to monitor at high cadence for planetary transits. The MEarth sample is mostly complete out to a distance of 25 pc for stars of type M5.5V and earlier, and mostly complete for later type stars out to 20 pc. We find eight stars that are within 10 pc of the Sun for which there did not exist a published trigonometric parallax distance estimate. We release with this work a catalog of the trigonometric parallax measurements for 1507 mid-to-late M dwarfs, as well as new estimates of their masses and radii.
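Two standard relations underpin such a catalog: a trigonometric parallax p in milliarcseconds gives the distance d = 1000/p parsecs, and an apparent magnitude plus distance gives the absolute magnitude via the distance modulus. A minimal sketch of these generic astronomy relations (not MEarth's astrometric model):

```python
import math

def parallax_to_distance_pc(parallax_mas):
    """Distance in parsecs from a parallax in milliarcseconds: d = 1000 / p."""
    return 1000.0 / parallax_mas

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance-modulus relation: M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)
```

A star at 40 mas parallax sits at 25 pc, the sample's quoted completeness limit for early-to-mid M types; at exactly 10 pc the absolute and apparent magnitudes coincide.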
Gebrekristos, R.A.; Shapiro, A.M.; Usher, B.H.
2008-01-01
An in situ method of estimating the effective diffusion coefficient for a chemical constituent that diffuses into the primary porosity of a rock is developed by abruptly changing the concentration of the dissolved constituent in a borehole in contact with the rock matrix and monitoring the time-varying concentration. The experiment was conducted in a borehole completed in mudstone on the campus of the University of the Free State in Bloemfontein, South Africa. Numerous tracer tests were conducted at this site, which left a residual concentration of sodium chloride in boreholes that diffused into the rock matrix over a period of years. Fresh water was introduced into a borehole in contact with the mudstone, and the time-varying increase of chloride was observed by monitoring the electrical conductivity (EC) at various depths in the borehole. Estimates of the effective diffusion coefficient were obtained by interpreting measurements of EC over 34 d. The effective diffusion coefficient at a depth of 36 m was approximately 7.8??10-6 m2/d, but was sensitive to the assumed matrix porosity. The formation factor and mass flux for the mudstone were also estimated from the experiment. ?? Springer-Verlag 2007.
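As an order-of-magnitude check on the reported result, the characteristic penetration depth of a diffusing solute scales as the square root of D times t; with the abstract's D of about 7.8e-6 m²/d over the 34-day monitoring window this is roughly 1.6 cm, meaning the EC signal samples only the rock immediately surrounding the borehole. The helper below is a generic scaling relation, not the authors' borehole interpretation model.

```python
import math

def diffusion_length(d_eff, t):
    """Characteristic diffusion penetration length L = sqrt(D * t).
    Units must be consistent: D in m^2/d and t in days give L in meters."""
    return math.sqrt(d_eff * t)
```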
NASA Astrophysics Data System (ADS)
Kim, Younsu; Audigier, Chloé; Dillow, Austin; Cheng, Alexis; Boctor, Emad M.
2017-03-01
Thermal monitoring for ablation therapy has high demands: healthy tissues must be preserved while malignant ones are removed completely. Various methods have been investigated; however, exposure to radiation, cost-effectiveness, and inconvenience hinder the use of X-ray or MRI methods. Owing to its non-invasiveness and real-time capabilities, ultrasound is widely used in intraoperative procedures, and ultrasound thermal monitoring methods have been developed for affordable monitoring in real time. We propose a new method for thermal monitoring using an ultrasound element. A lead zirconate titanate (PZT) element inserted into the liver tissue generates the ultrasound signal, and the one-way time of flight from the PZT element to the ultrasound transducer is recorded. We detect the speed of sound change caused by the increase in temperature during ablation therapy. We performed an ex vivo experiment with liver tissues to verify the feasibility of our speed of sound estimation technique. The time of flight information is used in an optimization method to recover the speed of sound maps during the ablation, which are then converted into temperature maps. The results show that the trend of temperature changes matches the temperature measured at a single point. The estimation error can be decreased by using a proper curve linking the speed of sound to the temperature. The average error over time was less than 3 degrees Celsius for a bovine liver. Speed of sound estimation using a single PZT element can thus be used for thermal monitoring.
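The core measurement chain reduces to two steps: a known path length and a one-way time of flight give the average speed of sound, and a calibration curve maps that speed to temperature. The sketch below assumes a linear calibration; the reference speed, reference temperature, and slope are illustrative placeholders, not measured liver-tissue constants (the abstract itself notes that a proper calibration curve reduces the error).

```python
def speed_of_sound(path_length_m, tof_s):
    """Average speed of sound (m/s) along a known path, from the one-way
    time of flight between the PZT element and the transducer."""
    return path_length_m / tof_s

def temperature_from_speed(c, c_ref=1540.0, t_ref=37.0, slope=1.6):
    """Invert an assumed linear calibration c = c_ref + slope * (T - t_ref).
    All three constants are hypothetical placeholders for illustration."""
    return t_ref + (c - c_ref) / slope
```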
Stroboscopic Training Enhances Anticipatory Timing.
Smith, Trevor Q; Mitroff, Stephen R
The dynamic aspects of sports often place heavy demands on visual processing. As such, an important goal for sports training should be to enhance visual abilities. Recent research has suggested that training in a stroboscopic environment, where visual experiences alternate between visible and obscured, may provide a means of improving attentional and visual abilities. The current study explored whether stroboscopic training could impact anticipatory timing - the ability to predict where a moving stimulus will be at a specific point in time. Anticipatory timing is a critical skill for both sports and non-sports activities, and thus finding training improvements could have broad impacts. Participants completed a pre-training assessment that used a Bassin Anticipation Timer to measure their abilities to accurately predict the timing of a moving visual stimulus. Immediately after this initial assessment, the participants completed training trials, but in one of two conditions. Those in the Control condition proceeded as before with no change. Those in the Strobe condition completed the training trials while wearing specialized eyewear that had lenses that alternated between transparent and opaque (rate of 100ms visible to 150ms opaque). Post-training assessments were administered immediately after training, 10-minutes after training, and 10-days after training. Compared to the Control group, the Strobe group was significantly more accurate immediately after training, was more likely to respond early than to respond late immediately after training and 10 minutes later, and was more consistent in their timing estimates immediately after training and 10 minutes later.
Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 sample h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.
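The key point, that a mechanistic model does not care whether the sampling times are contiguous, can be shown with a toy stand-in: fitting a single-exponential tracer washout by log-linear least squares works identically on gapped and complete time series, because no uniform sampling is assumed. The exponential form and the function name are illustrative assumptions, not the authors' transport model.

```python
import math

def fit_exponential(times, values):
    """Least-squares fit of y = A * exp(-k * t) via linear regression on
    log(y). Sampling times may be arbitrary, so interrupted (gapped) time
    series are handled exactly like complete ones. Returns (A, k)."""
    xs, ys = list(times), [math.log(v) for v in values]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx            # slope of log(y) vs t is -k
    return math.exp(my - slope * mx), -slope
```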
Predictors of validity and reliability of a physical activity record in adolescents
2013-01-01
Background Poor to moderate validity of self-reported physical activity instruments is commonly observed in young people in low- and middle-income countries. However, the reasons for such low validity have not been examined in detail. We tested the validity of a self-administered daily physical activity record in adolescents and assessed if personal characteristics or the convenience level of reporting physical activity modified the validity estimates. Methods The study comprised a total of 302 adolescents from an urban and rural area in Ecuador. Validity was evaluated by comparing the record with accelerometer recordings for seven consecutive days. Test-retest reliability was examined by comparing registrations from two records administered three weeks apart. Time spent on sedentary (SED), low (LPA), moderate (MPA) and vigorous (VPA) intensity physical activity was estimated. Bland Altman plots were used to evaluate measurement agreement. We assessed if age, sex, urban or rural setting, anthropometry and convenience of completing the record explained differences in validity estimates using a linear mixed model. Results Although the record provided higher estimates for SED and VPA and lower estimates for LPA and MPA compared to the accelerometer, it showed an overall fair measurement agreement for validity. There was modest reliability for assessing physical activity in each intensity level. Validity was associated with adolescents’ personal characteristics: sex (SED: P = 0.007; LPA: P = 0.001; VPA: P = 0.009) and setting (LPA: P = 0.000; MPA: P = 0.047). Reliability was associated with the convenience of completing the physical activity record for LPA (low convenience: P = 0.014; high convenience: P = 0.045). Conclusions The physical activity record provided acceptable estimates for reliability and validity on a group level. 
Sex and setting were associated with validity estimates, whereas convenience to fill out the record was associated with better reliability estimates for LPA. This tendency of improved reliability estimates for adolescents reporting higher convenience merits further consideration. PMID:24289296
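The Bland-Altman agreement analysis used above is simple to compute: the bias is the mean difference between the two methods, and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations of the differences. A generic sketch with made-up numbers, not the study's data:

```python
import math

def bland_altman(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement between two
    measurement methods, per the standard Bland-Altman approach."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```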
Likelihood Estimation Method for Completely Separated and Quasi-Completely Separated Data for a Dose-Response Model (ECBC-TN-068)
Park, Kyong H.; Lagan, Steven J.
2015-08-01
Research and Technology Directorate; approved for public release.
Phylogeny and temporal diversification of darters (Percidae: Etheostomatinae).
Near, Thomas J; Bossu, Christen M; Bradburd, Gideon S; Carlson, Rose L; Harrington, Richard C; Hollingsworth, Phillip R; Keck, Benjamin P; Etnier, David A
2011-10-01
Discussions aimed at resolution of the Tree of Life are most often focused on the interrelationships of major organismal lineages. In this study, we focus on the resolution of some of the most apical branches in the Tree of Life through exploration of the phylogenetic relationships of darters, a species-rich clade of North American freshwater fishes. With a near-complete taxon sampling of close to 250 species, we aim to investigate strategies for efficient multilocus data sampling and the estimation of divergence times using relaxed-clock methods when a clade lacks a fossil record. Our phylogenetic data set comprises a single mitochondrial DNA (mtDNA) gene and two nuclear genes sampled from 245 of the 248 darter species. This dense sampling allows us to determine if a modest amount of nuclear DNA sequence data can resolve relationships among closely related animal species. Darters lack a fossil record to provide age calibration priors in relaxed-clock analyses. Therefore, we use a near-complete species-sampled phylogeny of the perciform clade Centrarchidae, which has a rich fossil record, to assess two distinct strategies of external calibration in relaxed-clock divergence time estimates of darters: using ages inferred from the fossil record and molecular evolutionary rate estimates. Comparison of Bayesian phylogenies inferred from mtDNA and nuclear genes reveals that heterospecific mtDNA is present in approximately 12.5% of all darter species. 
We identify three patterns of mtDNA introgression in darters: proximal mtDNA transfer, which involves the transfer of mtDNA among extant and sympatric darter species, indeterminate introgression, which involves the transfer of mtDNA from a lineage that cannot be confidently identified because the introgressed haplotypes are not clearly referable to mtDNA haplotypes in any recognized species, and deep introgression, which is characterized by species diversification within a recipient clade subsequent to the transfer of heterospecific mtDNA. The results of our analyses indicate that DNA sequences sampled from single-copy nuclear genes can provide appreciable phylogenetic resolution for closely related animal species. A well-resolved near-complete species-sampled phylogeny of darters was estimated with Bayesian methods using a concatenated mtDNA and nuclear gene data set with all identified heterospecific mtDNA haplotypes treated as missing data. The relaxed-clock analyses resulted in very similar posterior age estimates across the three sampled genes and methods of calibration and therefore offer a viable strategy for estimating divergence times for clades that lack a fossil record. In addition, an informative rank-free clade-based classification of darters that preserves the rich history of nomenclature in the group and provides formal taxonomic communication of darter clades was constructed using the mtDNA and nuclear gene phylogeny. On the whole, the appeal of mtDNA for phylogeny inference among closely related animal species is diminished by the observations of extensive mtDNA introgression and by finding appreciable phylogenetic signal in a modest sampling of nuclear genes in our phylogenetic analyses of darters.
High-Fidelity Simulations of Moving and Flexible Airfoils at Low Reynolds Numbers (Postprint)
2010-02-01
Phase-averaged structures for both values of Reynolds number are found to be in good agreement with the experimental data.
2012 Anthropometric Survey of U.S. Army Personnel: Methods and Summary Statistics
2014-12-05
The software for participant scanning comprised CyScan for the whole-body and head scanners and INFOOT for the foot scanner. The survey addresses current design and engineering needs, as well as those anticipated well into the future, and reports ninety-four directly measured dimensions and 39 derived dimensions.
Special Operations Research Topics 2015
2014-01-01
Managing Network Security Policies in Tactical MANETs Using DRAMA
2010-08-04
Advanced Multi-Photon Chromophores for Broad-Band Ultra-Fast Optical Limiting
2014-11-04
These data on chromophores with an expanded π-system provide crucial insight into the push-pull 2PA-enhancement mechanism; several such systems were studied in detail.
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 5
2008-05-01
The publisher of CrossTalk provides both editorial oversight and technical review of the journal.
Experimental and Computational Analysis of a Miniature Ramjet at Mach 4.0
2013-09-01
Development was intermittent after the Second World War, with the most well-known example being Lockheed Martin's SR-71 Blackbird using the Pratt & Whitney J58 turbojet.
Environmental Assessment: T-10 Hush House Tinker Air Force Base, Oklahoma
2008-07-01
PUBLIC COMMENTS: A Notice of Availability for public review of the Draft EA was published.
Chairman of the Joint Chiefs of Staff Strategy Essay Competition: Essays 2002
2002-10-01
Introduction to the United States Air Force
2001-01-01
Royalties from both the book and the movie were used to construct a new orphanage near Seoul; Col Dean Hess retired from the USAF in 1969.
DECAF - Density Estimation for Cetaceans from Passive Acoustic Fixed Sensors
2007-01-01
The project aims, as far as possible, to leverage data that have already been collected, and classification and localization methods that have already been developed.
Cell Phone-Based Expert Systems for Smoking Cessation
2012-03-01
Participants completed the HRI at each timepoint. The Fagerström Nicotine Dependence Scale (FTND) is the most widely used tool for assessing severity of nicotine tolerance and dependence; it is a six-item, self-report scale.
2010-02-09
Corrosion-resistant design principles: no skip welds, abundant drain holes, sound material selection.
Evaluation of the Department of Defense Combating Trafficking in Persons Program
2014-06-16
Report No. DODIG-2014-079.
A Toolchain for the Detection of Structural and Behavioral Latent System Properties
2011-03-19
In software development, the cost to repair a defect increases with each successive phase; the toolchain thus reduces the number of defects propagated to later phases.
Tyner, David R; Johnson, Matthew R
2014-12-16
A comprehensive technical analysis of available industry-reported well activity and production data for Alberta in 2011 has been used to derive flaring, venting, and diesel combustion greenhouse gas and criteria air contaminant emission factors specifically linked to drilling, completion, and operation of hydraulically fractured natural gas wells. Analysis revealed that in-line ("green") completions were used at approximately 53% of wells completed in 2011, and in other cases the majority (99.5%) of flowback gases were flared rather than vented. Comparisons with limited analogous data available in the literature revealed that reported total flared and vented natural gas volumes attributable to tight gas well-completions were ∼ 6 times larger than Canadian Association of Petroleum Producers (CAPP) estimates for natural gas well-completion based on wells ca. 2000, but 62% less than an equivalent emission factor that can be derived from U.S. EPA data. Newly derived emission factors for diesel combustion during well drilling and completion are thought to be among the first such data available in the open literature, where drilling-related emissions for tight gas wells drilled in Alberta in 2011 were found to have increased by a factor of 2.8 relative to a typical well drilled in Canada in 2000 due to increased drilling lengths. From well-by-well analysis of production phase flared, vented, and fuel usage natural gas volumes reported at 3846 operating tight gas wells in 2011, operational emission factors were developed. Overall results highlight the importance of operational phase GHG emissions at upstream well sites (including on-site natural gas fuel use), and the critical levels of uncertainty in current estimates of liquid unloading emissions.
Atuegwu, Nkiruka C; Arlinghaus, Lori R; Li, Xia; Chakravarthy, A Bapsi; Abramson, Vandana G; Sanders, Melinda E; Yankeelov, Thomas E
2013-01-01
Diffusion-weighted and dynamic contrast-enhanced magnetic resonance imaging (MRI) data of 28 patients were obtained pretreatment, after one cycle, and after completion of all cycles of neoadjuvant chemotherapy (NAC). For each patient at each time point, the tumor cell number was estimated using the apparent diffusion coefficient and the extravascular extracellular (ve) and plasma (vp) volume fractions. The proliferation/death rate was obtained using the number of tumor cells from the first two time points in conjunction with the logistic model of tumor growth, which was then used to predict tumor cellularity at the conclusion of NAC. The Pearson correlation coefficient between the predicted and the experimental number of tumor cells measured at the end of NAC was 0.81 (P = .0043). The proliferation rate estimated after the first cycle of therapy was able to separate patients who went on to achieve pathologic complete response from those who did not (P = .021), with a sensitivity and specificity of 82.4% and 72.7%, respectively. These data provide preliminary results indicating that incorporating readily available quantitative MRI data into a simple model of tumor growth can lead to potentially clinically relevant information for predicting an individual patient's response to NAC. PMID:23730404
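The two-time-point fitting step described above can be sketched as follows. This is a minimal illustration of fitting the logistic growth model from two cell-count estimates and extrapolating, not the authors' code; the carrying capacity `theta` (maximum tumor cell number) is an assumed input here.

```python
import math

def logistic_predict(n0, n1, t1, t2, theta):
    """Fit the rate k of logistic growth dN/dt = k*N*(1 - N/theta)
    from counts n0 at time 0 and n1 at time t1, then predict N(t2).
    Uses the closed-form solution
        N(t) = theta*n0 / (n0 + (theta - n0)*exp(-k*t)).
    k > 0 indicates net proliferation, k < 0 net cell death."""
    ratio = n0 * (theta - n1) / (n1 * (theta - n0))
    k = -math.log(ratio) / t1
    return theta * n0 / (n0 + (theta - n0) * math.exp(-k * t2))
```

Because the model has a closed form, two time points identify k exactly; the prediction at the end of therapy then needs no further fitting.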
Resource Constrained Planning of Multiple Projects with Separable Activities
NASA Astrophysics Data System (ADS)
Fujii, Susumu; Morita, Hiroshi; Kanawa, Takuya
In this study we consider a resource-constrained planning problem of multiple projects with separable activities. The problem is to plan the processing of the activities subject to resource availability within a time window. We propose a solution algorithm based on the branch and bound method to obtain the optimal solution minimizing the completion time of all projects. We develop three methods for improving computational efficiency: obtaining an initial solution with a minimum-slack-time rule, estimating a lower bound that considers both time and resource constraints, and introducing an equivalence relation for the bounding operation. The effectiveness of the proposed methods is demonstrated by numerical examples. In particular, as the number of projects to be planned increases, the average computation time and the number of searched nodes are reduced.
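A minimal sketch of the minimum-slack-time rule used for the initial solution might look like the following. This is not the authors' branch-and-bound algorithm, only a greedy serial scheduler under a single renewable resource; the activity encoding (`name -> (duration, demand)`) is an assumption for illustration.

```python
def min_slack_schedule(acts, preds, capacity):
    """Greedy initial schedule via the minimum-slack-time rule.
    acts: name -> (duration, resource demand per time unit)
    preds: name -> set of predecessor names
    capacity: renewable resource limit per time unit."""
    # Forward pass (CPM, resources ignored): earliest starts.
    es = {}
    def earliest(a):
        if a not in es:
            es[a] = max((earliest(p) + acts[p][0] for p in preds[a]), default=0)
        return es[a]
    for a in acts:
        earliest(a)
    horizon = max(es[a] + acts[a][0] for a in acts)
    # Backward pass: latest starts, then slack = LS - ES.
    succs = {a: [b for b in acts if a in preds[b]] for a in acts}
    ls = {}
    def latest(a):
        if a not in ls:
            ls[a] = min((latest(s) for s in succs[a]), default=horizon) - acts[a][0]
        return ls[a]
    for a in acts:
        latest(a)
    slack = {a: ls[a] - es[a] for a in acts}
    # Serial scheduling: repeatedly start the eligible activity with
    # least slack at the first resource-feasible time.
    usage = [0] * (sum(d for d, _ in acts.values()) + 1)
    start, done = {}, set()
    while len(done) < len(acts):
        a = min((x for x in acts if x not in done and preds[x] <= done),
                key=lambda x: slack[x])
        dur, dem = acts[a]
        t = max((start[p] + acts[p][0] for p in preds[a]), default=0)
        while any(usage[t + i] + dem > capacity for i in range(dur)):
            t += 1
        for i in range(dur):
            usage[t + i] += dem
        start[a] = t
        done.add(a)
    return start
```

In a branch-and-bound setting, the makespan of this schedule would serve as the initial upper bound for pruning.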
Derieppe, Marc; de Senneville, Baudouin Denis; Kuijf, Hugo; Moonen, Chrit; Bos, Clemens
2014-10-01
Previously, we demonstrated the feasibility to monitor ultrasound-mediated uptake of a cell-impermeable model drug in real time with fibered confocal fluorescence microscopy. Here, we present a complete post-processing methodology, which corrects for cell displacements, to improve the accuracy of pharmacokinetic parameter estimation. Nucleus detection was performed based on the radial symmetry transform algorithm. Cell tracking used an iterative closest point approach. Pharmacokinetic parameters were calculated by fitting a two-compartment model to the time-intensity curves of individual cells. Cells were tracked successfully, improving time-intensity curve accuracy and pharmacokinetic parameter estimation. With tracking, 93 % of the 370 nuclei showed a fluorescence signal variation that was well-described by a two-compartment model. In addition, parameter distributions were narrower, thus increasing precision. Dedicated image analysis was implemented and enabled studying ultrasound-mediated model drug uptake kinetics in hundreds of cells per experiment, using fiber-based confocal fluorescence microscopy.
Heimes, F.J.; Luckey, R.R.; Stephens, D.M.
1986-01-01
Combining estimates of applied irrigation water, determined for selected sample sites, with information on irrigated acreage provides one alternative for developing areal estimates of groundwater pumpage for irrigation. The reliability of this approach was evaluated by comparing estimated pumpage with metered pumpage for two years for a three-county area in southwestern Nebraska. Meters on all irrigation wells in the three counties provided a complete data set for evaluation of equipment and comparison with pumpage estimates. Regression analyses were conducted on discharge, time-of-operation, and pumpage data collected at 52 irrigation sites in 1983 and at 57 irrigation sites in 1984 using data from inline flowmeters as the independent variable. The standard error of the estimate for regression analysis of discharge measurements made using a portable flowmeter was 6.8% of the mean discharge metered by inline flowmeters. The standard error of the estimate for regression analysis of time of operation determined from electric meters was 8.1% of the mean time of operation determined from in-line and 15.1% for engine-hour meters. Sampled pumpage, calculated by multiplying the average discharge obtained from the portable flowmeter by the time of operation obtained from energy or hour meters, was compared with metered pumpage from in-line flowmeters at sample sites. The standard error of the estimate for the regression analysis of sampled pumpage was 10.3% of the mean of the metered pumpage for 1983 and 1984 combined. The difference in the mean of the sampled pumpage and the mean of the metered pumpage was only 1.8% for 1983 and 2.3% for 1984. Estimated pumpage, for each county and for the study area, was calculated by multiplying application (sampled pumpage divided by irrigated acreages at sample sites) by irrigated acreage compiled from Landsat (Land satellite) imagery. Estimated pumpage was compared with total metered pumpage for each county and the study area. 
Estimated pumpage by county varied from 9% less, to 20% more, than metered pumpage in 1983 and from 0 to 15% more than metered pumpage in 1984. Estimated pumpage for the study area was 11% more than metered pumpage in 1983 and 5% more than metered pumpage in 1984. (Author's abstract)
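The regression comparisons above (portable vs. inline flowmeter discharge, sampled vs. metered pumpage) rest on the standard error of the estimate expressed as a percentage of the mean. A generic sketch of that computation, not the authors' analysis, is:

```python
def regression_see(x, y):
    """Least-squares fit y = a + b*x and the standard error of the
    estimate (SEE) expressed as a percentage of the mean of y.
    x, y: paired measurements (e.g. portable vs. inline flowmeter)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    see = (sse / (n - 2)) ** 0.5       # standard error of the estimate
    return a, b, 100 * see / my        # SEE as a percent of the mean
```

A SEE of 10.3% of the mean, as reported for sampled vs. metered pumpage, would come straight out of the third return value.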
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working on safety and security for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial key safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
Real-time source deformation modeling through GNSS permanent stations at Merapi volcano (Indonesia)
NASA Astrophysics Data System (ADS)
Beauducel, F.; Nurnaning, A.; Iguchi, M.; Fahmi, A. A.; Nandaka, M. A.; Sumarti, S.; Subandriyo, S.; Metaxian, J. P.
2014-12-01
Mt. Merapi (Java, Indonesia) is one of the most active and dangerous volcanoes in the world. A first GPS repetition network was set up in 1993 and measured periodically, allowing detection of a deep magma reservoir, quantification of magma flux in the conduit, and identification of shallow discontinuities around the former crater (Beauducel and Cornet, 1999; Beauducel et al., 2000, 2006). After the 2010 centennial eruption, when this network was almost completely destroyed, Indonesian and Japanese teams installed a new continuous GPS network for monitoring purposes (Iguchi et al., 2011), consisting of 3 stations located on the volcano flanks, plus a reference station at the Yogyakarta Observatory (BPPTKG). In the framework of the DOMERAPI project (2013-2016) we have completed this network with 5 additional stations located in the summit area and the volcano's surroundings. The new stations are 1-Hz-sampling GNSS (GPS + GLONASS) receivers with near real-time data streaming to the Observatory. An automatic processing chain, based on the GIPSY software, has been developed and included in the WEBOBS system (Beauducel et al., 2010); it computes precise daily moving solutions every hour, along with time series and velocity vectors over different time scales (2 months, 1 year, and 5 years). A real-time source modeling estimation has also been implemented. It uses the depth-varying point source solution (Mogi, 1958; Williams and Wadge, 1998) in a systematic inverse-problem model exploration that displays location, volume variation, and a 3-D probability map. The operational system should be able to better detect and estimate the location and volume variations of possible magma sources, and to follow magma transfer towards the surface. This should help monitoring and contribute to decision making during future unrest or eruptions.
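As a sketch of the forward problem behind such source modeling, the classical Mogi (1958) half-space point-source displacement at the free surface can be written in a few lines. This is the simplified constant-depth form, not the depth-varying solution or the observatory's code, and the Poisson ratio `nu` is an assumed parameter.

```python
import math

def mogi_displacement(r, d, dV, nu=0.25):
    """Surface displacement due to a Mogi point source.
    r: horizontal distance from the source axis (m)
    d: source depth (m); dV: source volume change (m^3)
    Returns (ur, uz): radial and vertical displacement (m),
    using u = (1 - nu)/pi * dV * (r, d) / (d^2 + r^2)^(3/2)."""
    c = (1 - nu) / math.pi * dV / (d * d + r * r) ** 1.5
    return c * r, c * d
```

Inverting for source location and volume change, as the real-time system does, amounts to searching over (position, depth, dV) for the model that best fits the GNSS displacement vectors.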
A Comparison of Two Fat Grafting Methods on Operating Room Efficiency and Costs.
Gabriel, Allen; Maxwell, G Patrick; Griffin, Leah; Champaneria, Manish C; Parekh, Mousam; Macarios, David
2017-02-01
Centrifugation (Cf) is a common method of fat processing but may be time consuming, especially when processing large volumes. To determine the effects on fat grafting time, volume efficiency, reoperations, and complication rates of Cf vs an autologous fat processing system (Rv) that incorporates fat harvesting and processing in a single unit. We performed a retrospective cohort study of consecutive patients who underwent autologous fat grafting during reconstructive breast surgery with Rv or Cf. Endpoints measured were volume of fat harvested (lipoaspirate) and volume injected after processing, time to complete processing, reoperations, and complications. A budget impact model was used to estimate cost of Rv vs Cf. Ninety-eight patients underwent fat grafting with Rv, and 96 patients received Cf. Mean volumes of lipoaspirate (506.0 vs 126.1 mL) and fat injected (177.3 vs 79.2 mL) were significantly higher (P < .0001) in the Rv vs Cf group, respectively. Mean time to complete fat grafting was significantly shorter in the Rv vs Cf group (34.6 vs 90.1 minutes, respectively; P < .0001). Proportions of patients with nodule and cyst formation and/or who received reoperations were significantly less in the Rv vs Cf group. Based on these outcomes and an assumed per minute operating room cost, an average per patient cost savings of $2,870.08 was estimated with Rv vs Cf. Compared to Cf, the Rv fat processing system allowed for a larger volume of fat to be processed for injection and decreased operative time in these patients, potentially translating to cost savings. LEVEL OF EVIDENCE 3. © 2016 The American Society for Aesthetic Plastic Surgery, Inc.
Odic, Darko; Lisboa, Juan Valle; Eisinger, Robert; Olivera, Magdalena Gonzalez; Maiche, Alejandro; Halberda, Justin
2016-01-01
What is the relationship between our intuitive sense of number (e.g., when estimating how many marbles are in a jar), and our intuitive sense of other quantities, including time (e.g., when estimating how long it has been since we last ate breakfast)? Recent work in cognitive, developmental, comparative psychology, and computational neuroscience has suggested that our representations of approximate number, time, and spatial extent are fundamentally linked and constitute a "generalized magnitude system". But, the shared behavioral and neural signatures between number, time, and space may alternatively be due to similar encoding and decision-making processes, rather than due to shared domain-general representations. In this study, we investigate the relationship between approximate number and time in a large sample of 6-8 year-old children in Uruguay by examining how individual differences in the precision of number and time estimation correlate with school mathematics performance. Over four testing days, each child completed an approximate number discrimination task, an approximate time discrimination task, a digit span task, and a large battery of symbolic math tests. We replicate previous reports showing that symbolic math abilities correlate with approximate number precision and extend those findings by showing that math abilities also correlate with approximate time precision. But, contrary to approximate number and time sharing common representations, we find that each of these dimensions uniquely correlates with formal math: approximate number correlates more strongly with formal math compared to time and continues to correlate with math even when precision in time and individual differences in working memory are controlled for. These results suggest that there are important differences in the mental representations of approximate number and approximate time and further clarify the relationship between quantity representations and mathematics. 
Copyright © 2015 Elsevier B.V. All rights reserved.
Generalizing boundaries for triangular designs, and efficacy estimation at extended follow-ups.
Allison, Annabel; Edwards, Tansy; Omollo, Raymond; Alves, Fabiana; Magirr, Dominic; E Alexander, Neal D
2015-11-16
Visceral leishmaniasis (VL) is a parasitic disease transmitted by sandflies and is fatal if left untreated. Phase II trials of new treatment regimens for VL are primarily carried out to evaluate safety and efficacy, while pharmacokinetic data are also important to inform future combination treatment regimens. The efficacy of VL treatments is evaluated at two time points: initial cure, when treatment is completed, and definitive cure, commonly 6 months post end of treatment, to allow for slow response to treatment and detection of relapses. This paper investigates a generalization of the triangular design to impose a minimum sample size for pharmacokinetic or other analyses, and methods to estimate efficacy at extended follow-up accounting for the sequential design and changes in cure status during extended follow-up. We provide R functions that generalize the triangular design to impose a minimum sample size before allowing stopping for efficacy. For estimation of efficacy at a second, extended, follow-up time, the performance of a shrinkage estimator (SHE), a probability tree estimator (PTE), and the maximum likelihood estimator (MLE) was assessed by simulation. The SHE and PTE are viable approaches to estimating efficacy at an extended follow-up, although the SHE performed better than the PTE: the bias and root mean square error were lower and coverage probabilities higher. Generalization of the triangular design is simple to implement for adaptations to meet requirements for pharmacokinetic analyses. Using the simple MLE approach to estimate efficacy at extended follow-up will lead to biased results, generally over-estimating treatment success. The SHE is recommended in trials of two or more treatments. The PTE is an acceptable alternative for one-arm trials or where use of the SHE is not possible due to computational complexity. NCT01067443, February 2010.
A digital signal processing system for coherent laser radar
NASA Technical Reports Server (NTRS)
Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry
1991-01-01
A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital-signal-processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor and collects 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.
Estimating time available for sensor fusion exception handling
NASA Astrophysics Data System (ADS)
Murphy, Robin R.; Rogers, Erika
1995-09-01
In previous work, we have developed a generate, test, and debug methodology for detecting, classifying, and responding to sensing failures in autonomous and semi-autonomous mobile robots. An important issue has arisen from these efforts: how much time is available to classify the cause of a failure and determine an alternative sensing strategy before the robot mission must be terminated? In this paper, we consider the impact of time for teleoperation applications where a remote robot attempts to autonomously maintain sensing in the presence of failures yet has the option to contact the local site for further assistance. Time limits are determined by using evidential reasoning with a novel generalization of Dempster-Shafer theory. Generalized Dempster-Shafer theory is used to estimate the time remaining until the robot behavior must be suspended because of uncertainty; this becomes the time limit on autonomous exception handling at the remote site. If the remote cannot complete exception handling in this time or needs assistance, responsibility is passed to the local site, while the remote assumes a `safe' state. An intelligent assistant then facilitates human intervention, either directing the remote without human assistance or coordinating data collection and presentation to the operator within time limits imposed by the mission. The impact of time on exception handling activities is demonstrated using video camera sensor data.
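For readers unfamiliar with the underlying machinery, the classical Dempster rule of combination (which the paper generalizes; this sketch is the standard rule, not the authors' generalization) can be stated compactly:

```python
def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions with Dempster's rule.
    m1, m2: dicts mapping frozenset hypotheses to mass (each sums to 1).
    Mass assigned to conflicting (empty-intersection) pairs is
    renormalized away."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict  # normalization; k == 0 means total conflict
    return {s: v / k for s, v in combined.items()}
```

In the paper's setting, evidence accumulating over time shifts mass toward "uncertain" states; the estimated time limit is reached when the combined belief no longer supports continuing autonomous operation.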
Tuberculosis incidence and treatment completion among Ugandan prison inmates
Schwitters, A.; Kaggwa, M.; Omiel, P.; Nagadya, G.; Kisa, N.; Dalal, S.
2016-01-01
SUMMARY BACKGROUND The Uganda Prisons Service (UPS) is responsible for the health of approximately 32 500 inmates in 233 prisons. In 2008 a rapid UPS assessment estimated TB prevalence at 654/100 000, three times that of the general population (183/100 000). Although treatment programs exist, little is known about treatment completion in sub-Saharan African prisons. METHODS We conducted a retrospective study of Ugandan prisoners diagnosed with TB from June 2011 to November 2012. We analyzed TB diagnosis, TB-HIV comorbidity and treatment completion from national registers and tracked prison transfers and releases. RESULTS A total of 469 prisoners were diagnosed with TB over the 1.5-year period (incidence 955/100 000 person-years). Of 466 prisoners starting treatment, 48% completed treatment, 43% defaulted, 5% died and 4% were currently on treatment. During treatment, 12% of prisoners remaining in the same prison defaulted, 53% of transfers defaulted and 81% of those released were lost to follow-up. The odds of defaulting were 8.36 times greater among prisoners who were transferred during treatment. CONCLUSIONS TB incidence and treatment default are high among Ugandan prisoners. Strategies to improve treatment completion and prevent multidrug resistance could include avoiding transfer of TB patients, improving communications between prisons to ensure treatment follow-up after transfer and facilitating transfer to community clinics for released prisoners. PMID:24902552
Brief communication: timing of spheno-occipital closure in modern Western Australians.
Franklin, Daniel; Flavel, Ambika
2014-01-01
The spheno-occipital synchondrosis is a craniofacial growth centre between the occipital and sphenoid bones; its ossification persists into adolescence, which, for the skeletal biologist, means it has potential application for estimating subadult age. Previous research indicates that the timing of spheno-occipital fusion varies widely between and within populations, with reports of complete fusion in individuals as young as 11 years of age and nonfusion in adults. The aim of this study is, therefore, to examine this structure in a mixed-sex sample of Western Australian individuals that developmentally span late childhood to adulthood. The objective is to develop statistically quantified age estimation standards based on scoring the degree of spheno-occipital fusion. The sample comprises multidetector computed tomography (MDCT) scans of 312 individuals (169 male; 143 female) between 5 and 25 years of age. Each MDCT scan is visualized in a standardized sagittal plane using three-dimensional oblique multiplanar reformatting. Fusion status is scored according to a four-stage system. Transition analysis is used to calculate age ranges for each defined stage and determine the mean age of transition between an unfused, fusing, and fused status. The maximum likelihood estimate for the transition from open to fusing in the endocranial half is 14.44 years (male) and 11.42 years (female); the transition from fusion in the ectocranial half to complete fusion is 16.16 years (male) and 13.62 years (female). This study affirms the potential value of assessing the degree of fusion in the spheno-occipital synchondrosis as an indicator of skeletal age. Copyright © 2013 Wiley Periodicals, Inc.
Cost of best-practice primary care management of chronic disease in a remote Aboriginal community.
Gador-Whyte, Andrew P; Wakerman, John; Campbell, David; Lenthall, Sue; Struber, Janet; Hope, Alex; Watson, Colin
2014-06-16
To estimate the cost of completing all chronic care tasks recommended by the Central Australian Rural Practitioners Association Standard Treatment Manual (CARPA STM) for patients with type 2 diabetes and chronic kidney disease (CKD). The study was conducted at a health service in a remote Central Australian Aboriginal community between July 2010 and May 2011. The chronic care tasks required were ascertained from the CARPA STM. The clinic database was reviewed for data on disease prevalence and adherence to CARPA STM guidelines. Recommended tasks were observed in a time-and-motion study of clinicians' work. Clinicians were interviewed about systematic management and its barriers. Expenditure records were analysed for salary and administrative costs. Diabetes and CKD prevalence; time spent on chronic disease care tasks; completion of tasks recommended by the CARPA STM; barriers to systematic care identified by clinicians; and estimated costs of optimal primary care management of all residents with diabetes or CKD. Projected annual costs of best-practice care for diabetes and CKD for this community of 542 people were $900 792, of which $645 313 would be met directly by the local primary care service. Estimated actual expenditure for these conditions in 2009-10 was $446 585, giving a projected funding gap of $198 728 per annum, or $1733 per patient. High staff turnover, acute care workload and low health literacy also hindered optimal chronic disease care. Barriers to optimal care included inadequate funding and workforce issues. Reduction of avoidable hospital admissions and overall costs necessitates adequate funding of primary care of chronic disease in remote communities.
Naughton, Felix; Cooper, Sue; Bowker, Katharine; Campbell, Katarzyna; Sutton, Stephen; Leonardi-Bee, Jo; Sloan, Melanie; Coleman, Tim
2015-01-01
Objectives To adapt a tailored short message service (SMS) text message smoking cessation intervention (MiQuit) for use without active health professional endorsement in routine antenatal care settings, to estimate ‘real-world’ uptake and test the feasibility of its use. Design Single-site service evaluation. Setting A Nottinghamshire (UK) antenatal clinic. Participants Pregnant women accessing the antenatal clinic (N=1750) over 6 months. Intervention A single-sheet A5 leaflet provided in the women's maternity notes folder describing the MiQuit text service. Similar materials were left on clinic desks and noticeboards. Outcome measures MiQuit activation requests and system interactions were logged for two time frames: 6 months (strict) and 8 months (extended). Local hospital data were used to estimate the denominator of pregnant smokers exposed to the materials. Results During the strict and extended time frames, 13 and 25 activation requests were received, representing 3% (95% CI 2% to 5%) and 4% (95% CI 3% to 6%) of estimated smokers, respectively. Only 11 (44%) of the 25 requesting activation sent a correctly formatted initiation text. Of those activating MiQuit and invited to complete tailoring questions (used to tailor support), 6 (67%) completed all 12 questions by text or website and 5 (56%) texted a quit date to the system. Of the 11 activating MiQuit, 5 (45%, 95% CI 21% to 72%) stopped the programme prematurely. Conclusions A low-intensity, cheap cessation intervention promoted at very low cost resulted in a small but potentially impactful uptake rate by pregnant smokers. PMID:26493459
Genome Evolution in the Primary Endosymbiont of Whiteflies Sheds Light on Their Divergence
Santos-Garcia, Diego; Vargas-Chavez, Carlos; Moya, Andrés; Latorre, Amparo; Silva, Francisco J.
2015-01-01
Whiteflies are important agricultural insect pests, whose evolutionary success is related to a long-term association with a bacterial endosymbiont, Candidatus Portiera aleyrodidarum. To completely characterize this endosymbiont clade, we sequenced the genomes of three new Portiera strains covering the two extant whitefly subfamilies. Using endosymbiont and mitochondrial sequences we estimated the divergence dates in the clade and used these values to understand the molecular evolution of the endosymbiont coding sequences. Portiera genomes were maintained almost completely stable in gene order and gene content during more than 125 Myr of evolution, except in the Bemisia tabaci lineage. The ancestor had already lost the genetic information transfer autonomy but was able to participate in the synthesis of all essential amino acids and carotenoids. The time of divergence of the B. tabaci complex was much more recent than previous estimations. The recent divergence of biotypes B (MEAM1 species) and Q (MED species) suggests that they still could be considered strains of the same species. We have estimated the rates of evolution of Portiera genes, synonymous and nonsynonymous, and have detected significant differences among-lineages, with most Portiera lineages evolving very slowly. Although the nonsynonymous rates were much smaller than the synonymous, the genomic dN/dS ratios were similar, discarding selection as the driver of among-lineage variation. We suggest variation in mutation rate and generation time as the responsible factors. In conclusion, the slow evolutionary rates of Portiera may have contributed to its long-term association with whiteflies, avoiding its replacement by a novel and more efficient endosymbiont. PMID:25716826
Boitard, Simon; Rodríguez, Willy; Jay, Flora; Mona, Stefano; Austerlitz, Frédéric
2016-01-01
Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles. PMID:26943927
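The core accept/reject loop of approximate Bayesian computation is easy to sketch. The following toy analogue illustrates the idea on a one-dimensional problem and is not PopSizeABC itself; the function names and the scalar summary statistic are illustrative assumptions.

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample,
                  n_draws=10000, eps=0.1):
    """Minimal ABC rejection sampler.
    observed_stat: summary statistic computed from the observed data
    simulate: theta -> simulated summary statistic
    prior_sample: () -> draw of theta from the prior
    Keeps draws whose simulated statistic falls within eps of the
    observed one; the accepted draws approximate the posterior."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed_stat) <= eps:
            accepted.append(theta)
    return accepted
```

PopSizeABC replaces the scalar statistic with a vector (folded allele frequency spectrum plus binned linkage disequilibrium) and the scalar parameter with a piecewise-constant population-size trajectory, but the rejection logic is the same in spirit.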
Nomura, Koh; Yonezawa, Takahiro; Mano, Shuhei; Kawakami, Shigehisa; Shedlock, Andrew M.; Hasegawa, Masami; Amano, Takashi
2013-01-01
Goats (Capra hircus) are one of the oldest domesticated species, and they are kept all over the world as an essential resource for meat, milk, and fiber. Although recent archeological and molecular biological studies suggested that they originated in West Asia, their domestication processes, such as the timing of population expansion and the dynamics of their selection pressures, are little known. With the aim of addressing these issues, the nearly complete mitochondrial protein-encoding genes were determined from East, Southeast, and South Asian populations. Our coalescent time estimations suggest that the timing of their major population expansions was in the Late Pleistocene and significantly predates the beginning of their domestication in the Neolithic era (≈10,000 years ago). The ω (ratio of the nonsynonymous to the synonymous substitution rate) for each lineage was also estimated. We found that the ω of the globally distributed haplogroup A, which is inherited by more than 90% of the goats examined, turned out to be extremely low, suggesting that these goats are under severe selection pressure, probably due to their large population size. Conversely, the ω of the Asian-specific haplogroup B, inherited by about 5% of goats, was relatively high. Although recent molecular studies suggest that domestication of animals may tend to relax selective constraints, the opposite pattern observed in our goat mitochondrial genome data indicates that the process of domestication is more complex than may be presently appreciated and cannot be explained only by a simple relaxation model. PMID:23936295
Reliability and convergent validity of the five-step test in people with chronic stroke.
Ng, Shamay S M; Tse, Mimi M Y; Tam, Eric W C; Lai, Cynthia Y Y
2018-01-10
(i) To estimate the intra-rater, inter-rater and test-retest reliabilities of the Five-Step Test (FST), as well as the minimum detectable change in FST completion times in people with stroke. (ii) To estimate the convergent validity of the FST with other measures of stroke-specific impairments. (iii) To identify the best cut-off times for distinguishing FST performance in people with stroke from that of healthy older adults. A cross-sectional study. University-based rehabilitation centre. Forty-eight people with stroke and 39 healthy controls. None. The FST, along with (for the stroke survivors only) scores on the Fugl-Meyer Lower Extremity Assessment (FMA-LE), the Berg Balance Scale (BBS), Limits of Stability (LOS) tests, and Activities-specific Balance Confidence (ABC) scale were tested. The FST showed excellent intra-rater (intra-class correlation coefficient; ICC = 0.866-0.905), inter-rater (ICC = 0.998), and test-retest (ICC = 0.838-0.842) reliabilities. A minimum detectable change of 9.16 s was found for the FST in people with stroke. The FST correlated significantly with the FMA-LE, BBS, and LOS results in the forward and sideways directions (r = -0.411 to -0.716, p < 0.004). The FST completion time of 13.35 s was shown to discriminate reliably between people with stroke and healthy older adults. The FST is a reliable, easy-to-administer clinical test for assessing stroke survivors' ability to negotiate steps and stairs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Kyung-Doo; Jeong, Jae-Jun; Lee, Seung-Wook
The Nuclear Steam Supply System (NSSS) thermal-hydraulic model adopted in the Korea Nuclear Plant Education Center (KNPEC)-2 simulator was provided in the early 1980s. The reference plant for KNPEC-2 is Yong Gwang Nuclear Unit 1, a Westinghouse-type 3-loop, 950 MW(electric) pressurized water reactor. Because of the limited computational capability at that time, the model uses overly simplified physical models and assumptions for real-time simulation of NSSS thermal-hydraulic transients. This may entail inaccurate results and thus the possibility of so-called ''negative training,'' especially for complicated two-phase flows in the reactor coolant system. To resolve the problem, we developed a realistic NSSS thermal-hydraulic program (named the ARTS code) based on the best-estimate code RETRAN-3D. A systematic assessment of ARTS has been conducted by both a stand-alone test and an integrated test in the simulator environment. The non-integrated stand-alone test (NIST) results were reasonable in terms of accuracy, real-time simulation capability, and robustness. After successful completion of the NIST, ARTS was integrated with a 3-D reactor kinetics model and other system models. The site acceptance test (SAT) has been completed successfully and confirmed to comply with the ANSI/ANS-3.5-1998 simulator software performance criteria. This paper presents our efforts in the ARTS development and some test results of the NIST and SAT.
Surman, G; da Silva, A A M; Kurinczuk, J J
2012-01-01
As the survival of very preterm and low-birthweight infants increases, so does the importance of monitoring the birth prevalence of childhood impairments; disease registers provide a means to do so for these rare conditions. High levels of ascertainment for disease research registers have become increasingly difficult to achieve in the face of additional challenges posed by consent and confidentiality issues. 4Child - Four Counties Database of Cerebral Palsy, Vision Loss and Hearing Loss in Children has been collecting data and monitoring these three major childhood impairments since 1984. This study used capture-recapture and related techniques to identify areas which are particularly affected by low ascertainment, to estimate the magnitude of missing cases on the 4Child register and to provide birth prevalence estimates of cerebral palsy which allow for these missing cases. Estimates suggest that while overall around 27% of cerebral palsy cases were not reported to 4Child, ascertainment for severely motor-impaired children (93% complete) and those born in two of the four counties was good (Oxfordshire: 90%, Northamptonshire: 94%). After allowing for missing cases, adjusted estimates of cerebral palsy birth prevalence for 1984-1993 were 3.0 per 1000 live births versus 2.5 per 1000 live births in 1994-2003. Capture-recapture techniques can identify areas of poor ascertainment and add to information around the provision of cerebral palsy birth prevalence estimates. Despite variation in ascertainment over time, capture-recapture estimates supported a decline in cerebral palsy birth prevalence between the earlier and later study periods in the four English counties of the geographical area covered by 4Child. © 2011 Blackwell Publishing Ltd.
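Two-source capture-recapture of the kind applied here is commonly based on the Lincoln-Petersen estimator; Chapman's correction is a standard nearly unbiased variant for small overlap counts. A minimal sketch with hypothetical counts (not the 4Child data):

```python
def chapman_estimate(n1: int, n2: int, m: int) -> float:
    """Chapman's nearly unbiased form of the Lincoln-Petersen two-source
    capture-recapture estimator: n1 cases found by source 1, n2 by source 2,
    m found by both. Returns the estimated total number of cases,
    including those missed by both sources."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts, not the 4Child register data: 80 cases from one
# notification source, 70 from a second, with 56 appearing on both lists.
total = chapman_estimate(80, 70, 56)
print(round(total))               # estimated total cases
print(round(1 - 80 / total, 2))   # estimated under-ascertainment of source 1
```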
North Alabama Lightning Mapping Array (LMA): VHF Source Retrieval Algorithm and Error Analyses
NASA Technical Reports Server (NTRS)
Koshak, W. J.; Solakiewicz, R. J.; Blakeslee, R. J.; Goodman, S. J.; Christian, H. J.; Hall, J.; Bailey, J.; Krider, E. P.; Bateman, M. G.; Boccippio, D.
2003-01-01
Two approaches are used to characterize how accurately the North Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and in time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA Marshall Space Flight Center (MSFC) and that is similar, but not identical to, the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared Curvature Matrix Theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., multiple sources per retrieval attempt) are neglected. The detailed spatial distributions of retrieval errors are provided. Given that the two methods are completely independent of one another, it is shown that they provide remarkably similar results. However, for many source locations, the Curvature Matrix Theory produces larger altitude error estimates than the (more realistic) Monte Carlo simulation.
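The scale of timing-driven retrieval errors can be previewed with a far simpler Monte Carlo than the paper's full 3-D treatment: in one dimension, a 50 ns rms timing error at each of two stations maps to a position error of c·σ_t·√2/2 ≈ 10.6 m. The baseline and source location below are hypothetical, not the LMA network geometry:

```python
import math
import random
import statistics

C = 2.998e8          # speed of light (m/s)
SIGMA_T = 50e-9      # assumed 50 ns rms timing error per station
L = 20_000.0         # hypothetical baseline between two stations (m)

def retrieve_x(t1: float, t2: float) -> float:
    """1-D TDOA retrieval for a source between stations at x=0 and x=L:
    t1 = x/c and t2 = (L - x)/c  =>  x = (L - c*(t2 - t1)) / 2."""
    return (L - C * (t2 - t1)) / 2.0

def monte_carlo_error(x_true: float, trials: int = 20_000) -> float:
    """RMS retrieval error when Gaussian timing noise is added per station."""
    random.seed(7)
    errs = []
    for _ in range(trials):
        t1 = x_true / C + random.gauss(0.0, SIGMA_T)
        t2 = (L - x_true) / C + random.gauss(0.0, SIGMA_T)
        errs.append(retrieve_x(t1, t2) - x_true)
    return math.sqrt(statistics.fmean(e * e for e in errs))

# Analytic expectation: c * sigma_t * sqrt(2) / 2, roughly 10.6 m
print(round(monte_carlo_error(7_500.0), 1))
```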
Mammographers’ Perception of Women’s Breast Cancer Risk
Egger, Joseph R.; Cutter, Gary R.; Carney, Patricia A.; Taplin, Stephen H.; Barlow, William E.; Hendrick, R. Edward; D’Orsi, Carl J.; Fosse, Jessica S.; Abraham, Linn; Elmore, Joann G.
2011-01-01
Objective To understand mammographers’ perception of individual women’s breast cancer risk. Materials and Methods Radiologists interpreting screening mammography examinations completed a mailed survey consisting of questions pertaining to demographic and clinical practice characteristics, as well as 2 vignettes describing different risk profiles of women. Respondents were asked to estimate the probability of a breast cancer diagnosis in the next 5 years for each vignette. Vignette responses were plotted against mean recall rates in actual clinical practice. Results The survey was returned by 77% of eligible radiologists. Ninety-three percent of radiologists overestimated risk in the vignette involving a 70-year-old woman; 96% overestimated risk in the vignette involving a 41-year-old woman. Radiologists who more accurately estimated breast cancer risk were younger, worked full-time, were affiliated with an academic medical center, had fellowship training, had fewer than 10 years’ experience interpreting mammograms, and worked more than 40% of the time in breast imaging. However, only age was statistically significant. No association was found between radiologists’ risk estimates and their recall rates. Conclusion U.S. radiologists have a heightened perception of breast cancer risk. PMID:15951455
Gray: a ray tracing-based Monte Carlo simulator for PET
NASA Astrophysics Data System (ADS)
Freese, David L.; Olcott, Peter D.; Buss, Samuel R.; Levin, Craig S.
2018-05-01
Monte Carlo simulation software plays a critical role in PET system design. Performing complex, repeated Monte Carlo simulations can be computationally prohibitive, as even a single simulation can require a large amount of time and a computing cluster to complete. Here we introduce Gray, a Monte Carlo simulation software for PET systems. Gray exploits ray tracing methods used in the computer graphics community to greatly accelerate simulations of PET systems with complex geometries. We demonstrate the implementation of models for positron range, annihilation acolinearity, photoelectric absorption, Compton scatter, and Rayleigh scatter. For validation, we simulate the GATE PET benchmark, and compare energy, distribution of hits, coincidences, and run time. We show a speedup using Gray, compared to GATE for the same simulation, while demonstrating nearly identical results. We additionally simulate the Siemens Biograph mCT system with both the NEMA NU-2 scatter phantom and sensitivity phantom. We estimate the total sensitivity within % when accounting for differences in peak NECR. We also estimate the peak NECR to be kcps, or within % of published experimental data. The activity concentration of the peak is also estimated within 1.3%.
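Peak NECR, the figure of merit compared against the published mCT data above, is conventionally computed as T²/(T + S + R) from the true, scattered, and random coincidence rates. A minimal sketch with hypothetical count rates (not values from the Gray/Biograph mCT study):

```python
def necr(trues: float, scatters: float, randoms: float) -> float:
    """Noise-equivalent count rate, NECR = T^2 / (T + S + R): the standard
    PET figure of merit compared between simulation and experiment."""
    return trues ** 2 / (trues + scatters + randoms)

# Hypothetical coincidence rates in kcps, chosen only to illustrate the formula:
print(round(necr(400.0, 250.0, 350.0), 1))  # -> 160.0 kcps
```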
NASA Astrophysics Data System (ADS)
Ha, Taesung
A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and the potential usefulness of quantifying model uncertainty as sensitivity analysis in the PRA model.
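The competing-random-variables idea described above, where a human error occurs when operator performance time exceeds the phenomenological time available, can be sketched with a direct Monte Carlo. Both distributions below are hypothetical stand-ins, not the fitted distributions from the study:

```python
import random

def hep_competing_times(trials: int = 100_000) -> float:
    """Reliability-physics HEP sketch: the human error probability is the
    probability that operator performance time exceeds the phenomenological
    time available. A normal phenomenological time and a lognormal
    performance time are assumed here purely for illustration."""
    random.seed(11)
    failures = 0
    for _ in range(trials):
        t_phenom = random.gauss(30.0, 5.0)            # minutes available
        t_perform = random.lognormvariate(3.0, 0.4)   # minutes needed
        if t_perform > t_phenom:
            failures += 1
    return failures / trials

print(hep_competing_times())
```

Quantifying the uncertainty in this HEP, as the study does, would then amount to placing Bayesian priors on the distribution parameters rather than fixing them as above.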