Sample records for complete time series

  1. Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015

    PubMed Central

    2012-01-01

Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method have allowed for the most comprehensive time series known to date. PMID:22846561
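For readers who want a concrete picture of the two-stage approach this abstract describes, here is a minimal sketch: each source series is completed per country (here with a simple log-linear OLS extrapolation, an assumption standing in for the paper's staged OLS and mixed-effects models), and the completed sources are then averaged into a composite series. Column names are illustrative.

```python
# Minimal sketch of the two-stage idea described in the abstract: complete each
# source series per country, then average sources into a composite series.
# Column names ('country', 'year', 'gdp_pc') and the log-linear OLS fallback are
# illustrative assumptions, not the paper's exact model specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def complete_series(df, years=range(1950, 2016)):
    """Fill missing years of one source series with a per-country log-linear OLS fit."""
    out = []
    for country, grp in df.groupby("country"):
        grp = grp.set_index("year").reindex(years)
        obs = grp["gdp_pc"].dropna()
        X = sm.add_constant(obs.index.to_numpy(dtype=float))
        fit = sm.OLS(np.log(obs.to_numpy()), X).fit()
        pred = np.exp(fit.predict(sm.add_constant(np.array(list(years), dtype=float))))
        grp["gdp_pc"] = grp["gdp_pc"].fillna(pd.Series(pred, index=list(years)))
        grp["country"] = country
        out.append(grp.reset_index())
    return pd.concat(out, ignore_index=True)

def average_sources(completed_sources):
    """Average several completed source series into one composite GDP per capita series."""
    stacked = pd.concat(completed_sources)
    return stacked.groupby(["country", "year"], as_index=False)["gdp_pc"].mean()
```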

  2. The incorporation of focused history in checklist for early recognition and treatment of acute illness and injury.

    PubMed

    Jayaprakash, Namita; Ali, Rashid; Kashyap, Rahul; Bennett, Courtney; Kogan, Alexander; Gajic, Ognjen

    2016-08-31

Diagnostic error and delay are critical impediments to the safety of critically ill patients. The Checklist for early recognition and treatment of acute illness and injury (CERTAIN) has been developed as a tool that facilitates timely and error-free evaluation of critically ill patients. While the focused history is an essential part of the CERTAIN framework, it is not clear how best to choreograph this step in the process of evaluation and treatment of the acutely decompensating patient. An unblinded crossover clinical simulation study was designed in which volunteer critical care clinicians (fellows and attendings) were randomly assigned to start with either obtaining a focused history choreographed in series (after) or in parallel to the primary survey. A focused history was obtained using the standardized SAMPLE model that is incorporated into Advanced Trauma Life Support (ATLS) and Pediatric Advanced Life Support (PALS). Clinicians were asked to assess six acutely decompensating patients using pre-determined clinical scenarios (three in series choreography, three in parallel). Once the initial choreography was completed, the clinician would cross over to the alternative choreography. The primary outcome was the cognitive burden assessed through the NASA task load index. The secondary outcome was time to completion of a focused history. A total of 84 simulated cases (42 in parallel, 42 in series) were tested on 14 clinicians. Both the overall cognitive load and time to completion improved with each successive practice scenario; however, no difference was observed between the series and parallel choreographies. The median (IQR) overall NASA TLX task load index for series was 39 (17 - 58) and for parallel 43 (27 - 52), p = 0.57. The median (IQR) time to completion of the tasks in series was 125 (112 - 158) seconds and in parallel 122 (108 - 158) seconds, p = 0.92. In this clinical simulation study assessing the incorporation of a focused history into the primary survey of a non-trauma critically ill patient, there was no difference in cognitive burden or time to task completion when using series choreography (after the exam) compared to parallel choreography (concurrent with the primary survey physical exam). However, with repetition of the task, both overall task load and time to completion improved in each of the choreographies.

  3. Different strategies in solving series completion inductive reasoning problems: an fMRI and computational study.

    PubMed

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng

    2014-08-01

The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks that share a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves directly retrieving arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. A randomized intervention of reminder letter for human papillomavirus vaccine series completion.

    PubMed

    Chao, Chun; Preciado, Melissa; Slezak, Jeff; Xu, Lanfang

    2015-01-01

Completion rate for the three-dose series of the human papillomavirus (HPV) vaccine has generally been low. This study evaluated the effectiveness of a reminder letter intervention on HPV vaccine three-dose series completion. Female members of Kaiser Permanente Southern California Health Plan who received at least one dose, but not more than two doses, of the HPV vaccine by February 13, 2013, and who were between ages 9 and 26 years at the time of first HPV vaccination were included. Eighty percent of these females were randomized to receive the reminder letter, and 20% were randomized to receive standard of care (control). The reminder letters were mailed quarterly to those who had not completed the series. The proportion of series completion at the end of the 12-month evaluation period was compared using a chi-square test. A total of 9,760 females were included in the intervention group and 2,445 in the control group. HPV vaccine series completion was 56.4% in the intervention group and 46.6% in the control group (p < .001). The effect of the intervention appeared to be stronger in girls aged 9-17 years compared with young women aged 18-26 years at the first dose and in blacks compared with whites. Reminder letters scheduled quarterly were effective in enhancing HPV vaccine series completion among those who initiated the vaccine. However, a large gap in series completion remained despite the intervention. Future studies should address other barriers to series completion, including those at the provider and health care system levels. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
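The headline comparison of completion proportions can be reproduced approximately with a two-by-two chi-square test; the sketch below reconstructs completer counts from the percentages reported in the abstract, so the figures are approximate rather than the study's actual individual-level data.

```python
# Sketch of the two-by-two chi-square comparison reported in the abstract.
# Completer counts are reconstructed from the reported percentages, so they are
# approximate; the original analysis used the actual individual-level data.
from scipy.stats import chi2_contingency

n_int, n_ctl = 9760, 2445
completed_int = round(0.564 * n_int)   # ~completers in the intervention group
completed_ctl = round(0.466 * n_ctl)   # ~completers in the control group

table = [
    [completed_int, n_int - completed_int],
    [completed_ctl, n_ctl - completed_ctl],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.3g}")
```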

  5. Application of time series discretization using evolutionary programming for classification of precancerous cervical lesions.

    PubMed

    Acosta-Mesa, Héctor-Gabriel; Rechy-Ramírez, Fernando; Mezura-Montes, Efrén; Cruz-Ramírez, Nicandro; Hernández Jiménez, Rodolfo

    2014-06-01

In this work, we present a novel application of time series discretization using evolutionary programming for the classification of precancerous cervical lesions. The approach optimizes the number of intervals in which the length and amplitude of the time series should be compressed, preserving the important information for classification purposes. Using evolutionary programming, the search for a good discretization scheme is guided by a cost function which considers three criteria: the entropy regarding the classification, the complexity measured as the number of different strings needed to represent the complete data set, and the compression rate assessed as the length of the discrete representation. This discretization approach is evaluated using time series data based on temporal patterns observed during a classical test used in cervical cancer detection; the classification accuracy reached by our method is compared with the well-known time series discretization algorithm SAX and the dimensionality reduction method PCA. Statistical analysis of the classification accuracy shows that the discrete representation is as efficient as the complete raw representation for the present application, reducing the dimensionality of the time series length by 97%. This representation is also very competitive in terms of classification accuracy when compared with similar approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
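A rough sketch of the kind of cost function the abstract describes, combining classification entropy, the number of distinct symbolic strings, and the length of the discrete representation. The weights and the segment/amplitude parameterization are assumptions, and the evolutionary-programming search loop that would optimize them is omitted.

```python
# Rough sketch of the three-criterion cost used to score a candidate
# discretization scheme (classification entropy, number of distinct symbolic
# strings, and compressed length). Weights and the evolutionary-programming
# search loop are omitted and would need tuning; this is not the paper's code.
import numpy as np
from collections import Counter

def discretize(series, amp_breaks, seg_len):
    """Compress one series into a symbol string: segment means binned by amplitude breakpoints."""
    series = np.asarray(series, dtype=float)
    n_seg = len(series) // seg_len
    means = series[: n_seg * seg_len].reshape(n_seg, seg_len).mean(axis=1)
    return tuple(np.digitize(means, amp_breaks))

def cost(dataset, labels, amp_breaks, seg_len, w_complexity=0.1, w_length=0.01):
    strings = [discretize(s, amp_breaks, seg_len) for s in dataset]
    # Class entropy within each distinct string (lower means the strings separate classes better).
    entropy = 0.0
    for string in set(strings):
        cls = [lab for s, lab in zip(strings, labels) if s == string]
        probs = np.array(list(Counter(cls).values()), dtype=float) / len(cls)
        entropy += len(cls) / len(strings) * -(probs * np.log2(probs)).sum()
    complexity = len(set(strings))      # number of distinct strings needed for the data set
    length = len(strings[0])            # length of the discrete representation
    return entropy + w_complexity * complexity + w_length * length
```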

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu

This paper intends to reveal the ability of the linear interpolation method to predict missing values in solar radiation time series. A reliable dataset depends on a complete observed time series: the absence of radiation data alters the long-term variation of solar radiation measurements, and such gaps increase the chance of biased outputs in modelling and in the validation process. The completeness of the observed dataset is therefore important for data analysis. Gaps and unreliable records in solar radiation time series are widespread and remain a major problem, yet only a limited number of studies have focused on estimating missing values in solar radiation datasets.
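The interpolation step itself is simple; below is a minimal pandas illustration (with assumed timestamps and synthetic values) of filling gaps in a solar-radiation series by linear interpolation in time.

```python
# Minimal illustration of filling gaps in a solar-radiation series by linear
# interpolation, as evaluated in the record above; values and timestamps are assumptions.
import pandas as pd

radiation = pd.Series(
    [412.0, None, None, 389.5, 401.2, None, 395.0],
    index=pd.date_range("2015-06-01", periods=7, freq="H"),
)
filled = radiation.interpolate(method="time")   # linear in time between neighbours
print(filled)
```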

  7. The Understanding Words Reading Intervention: Evidence from a Case Series Design

    ERIC Educational Resources Information Center

    Wright, Craig; Conlon, Elizabeth G.; Wright, Michalle

    2012-01-01

    Using a case-series design with double baseline and 10-week maintenance phase, 5 struggling readers from middle- to high-income families (age range 6.4-7.9 years) completed a 5-times-weekly intervention (96 sessions) administered by a parent. All participants completed the intervention with phonological decoding, text-reading accuracy and reading…

  8. 49 CFR 571.126 - Standard No. 126; Electronic stability control systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... series uses counterclockwise steering, and the other series uses clockwise steering. The maximum time... rate and to estimate its side slip or side slip derivative with respect to time; (4) That has a means... after completion of the sine with dwell steering input (time T0 + 1 in Figure 1) must not exceed 35...

  9. Predictors of Infant Hepatitis B Immunization in Cameroon: Data to Inform Implementation of a Hepatitis B Birth Dose.

    PubMed

    Dionne-Odom, Jodie; Westfall, Andrew O; Nzuobontane, Divine; Vinikoor, Michael J; Halle-Ekane, Gregory; Welty, Thomas; Tita, Alan T N

    2018-01-01

Although most African countries offer hepatitis B immunization through a 3-dose vaccine series recommended at 6, 10 and 14 weeks of age, very few provide birth dose vaccination. In support of Cameroon's national plan to implement the birth dose vaccine in 2017, we investigated predictors of infant hepatitis B virus (HBV) vaccination under the current program. Using the 2011 Demographic and Health Survey in Cameroon, we identified women with at least one living child (age 12-60 months) and information about the hepatitis B vaccine series. Vaccination rates were calculated, and logistic regression modeling was used to identify factors associated with 3-dose series completion. Changes over time were assessed with a linear logistic model. Among 4594 mothers analyzed, 66.7% (95% confidence interval [CI]: 64.1-69.3) of infants completed the hepatitis B vaccine series; however, an average 4-week delay in series initiation was noted with median dose timing at 10, 14 and 19 weeks of age. Predictors of series completion included facility delivery (adjusted odds ratio [aOR]: 2.1; 95% CI: 1.7-2.6), household wealth (aOR: 1.9; 95% CI: 1.2-3.1 comparing the highest and lowest quintiles), Christian religion (aOR: 1.8; 95% CI: 1.3-2.5 compared with Muslim religion) and older maternal age (aOR: 1.4; 95% CI: 1.2-1.7 for 10 year units). Birth dose vaccination to reduce vertical and early childhood transmission of hepatitis B may overcome some of the obstacles to timely and complete HBV immunization in Cameroon. Increased awareness is needed among pregnant women and high-risk groups about HBV vertical transmission, the importance of facility delivery and the effectiveness of prevention beginning with monovalent HBV vaccination at birth.
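A sketch of the sort of logistic model used to estimate adjusted odds ratios for three-dose completion; the formula terms are illustrative stand-ins for the survey variables, and DHS survey weights and clustering are omitted here.

```python
# Sketch of the kind of logistic model used to estimate adjusted odds ratios for
# series completion. Variable names and categories are illustrative assumptions;
# the survey-design weighting used with DHS data is omitted here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_completion_model(df: pd.DataFrame):
    """df is assumed to hold one row per child with a binary 'completed' outcome."""
    model = smf.logit(
        "completed ~ facility_delivery + C(wealth_quintile) + C(religion) + mother_age",
        data=df,
    ).fit()
    odds_ratios = np.exp(model.params).rename("aOR")   # adjusted odds ratios
    ci = np.exp(model.conf_int())                      # 95% confidence intervals
    return model, pd.concat([odds_ratios, ci], axis=1)
```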

  10. Improving Cancer-Related Outcomes with Connected Health - Acknowledgements

    Cancer.gov

    The President’s Cancer Panel is grateful to all participants who invested their time to take part in the series of workshops on connected health and cancer. A complete list of participants is in Series Information. The Panel is especially appreciative to the series co-chairs who graciously contributed their time and knowledge on this topic, providing valuable guidance during workshop planning and extensive input on this report.

  11. 77 FR 55811 - Fisheries of the South Atlantic; Southeast Data, Assessment, and Review (SEDAR); Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... queen triggerfish will consist of a series of workshops and webinars: This notice is for a webinar... time. The established times may be adjusted as necessary to accommodate the timely completion of..., or completed prior to, the time established by this notice. ADDRESSES: The meeting will be held via...

  12. An Analysis of AH-1Z Helicopter Pilots and Qualifications: The Impact of Fleet Squadron Training Progression Timelines

    DTIC Science & Technology

    2018-03-01

remaining 3000-series time block. Although there are fewer flight hours when compared to AHC, the coordination with external agencies may increase the...training duration to complete FAC(A). Additionally, pilots may conduct 4000 to 6000 series events concurrently, thus expanding the training time. 4...A)I training. Because FAC(A)I is included in the 5000-series training block, the time period cannot be finitely determined. 8. FL and AMC are the

  13. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product, a series of images representing ground displacements over time that can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product are difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
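The fan-out/fan-in pattern described above can be sketched with boto3 and AWS Batch job dependencies; the queue and job-definition names are placeholders rather than HyP3's actual configuration, and AWS Batch caps dependsOn at 20 jobs, so a real stack would need array jobs or batched dependencies.

```python
# Sketch of chaining parallel interferogram jobs into a single time-series job
# with AWS Batch dependencies. The queue, job definition, and parameter names
# are placeholders, not the actual HyP3 configuration.
import boto3

batch = boto3.client("batch")

def submit_time_series(granule_pairs):
    # Fan out: one interferogram job per SAR pair, all running in parallel.
    ifg_job_ids = []
    for reference, secondary in granule_pairs:
        job = batch.submit_job(
            jobName=f"ifg-{reference}-{secondary}",
            jobQueue="insar-queue",                  # placeholder
            jobDefinition="generate-interferogram",  # placeholder
            parameters={"reference": reference, "secondary": secondary},
        )
        ifg_job_ids.append(job["jobId"])

    # Fan in: the deformation time-series job waits on every interferogram.
    # Note: AWS Batch allows at most 20 entries in dependsOn, so a large stack
    # would need array jobs or batched dependencies instead of this direct list.
    return batch.submit_job(
        jobName="deformation-time-series",
        jobQueue="insar-queue",
        jobDefinition="sbas-time-series",            # placeholder
        dependsOn=[{"jobId": jid} for jid in ifg_job_ids],
    )
```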

  14. Staying on track: a cluster randomized controlled trial of automated reminders aimed at increasing human papillomavirus vaccine completion.

    PubMed

    Patel, Ashlesha; Stern, Lisa; Unger, Zoe; Debevec, Elie; Roston, Alicia; Hanover, Rita; Morfesis, Johanna

    2014-05-01

    To evaluate whether automated reminders increase on-time completion of the three-dose human papillomavirus (HPV) vaccine series. Ten reproductive health centers enrolled 365 women aged 19-26 to receive dose one of the HPV vaccine. Health centers were matched and randomized so that participants received either routine follow-up (control) or automated reminder messages for vaccine doses two and three (intervention). Intervention participants selected their preferred method of reminders - text, e-mail, phone, private Facebook message, or standard mail. We compared vaccine completion rates between groups over a period of 32 weeks. The reminder system did not increase completion rates, which overall were low at 17.2% in the intervention group and 18.9% in the control group (p=0.881). Exploratory analyses revealed that participants who completed the series on-time were more likely to be older (OR=1.15, 95% CI 1.01-1.31), report having completed a four-year college degree or more (age-adjusted OR=2.51, 95% CI 1.29-4.90), and report three or more lifetime sexual partners (age-adjusted OR=3.45, 95% CI 1.20-9.92). The study intervention did not increase HPV vaccine series completion. Despite great public health interest in HPV vaccine completion and reminder technologies, completion rates remain low. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. The Relationship of Negative Affect and Thought: Time Series Analyses.

    ERIC Educational Resources Information Center

    Rubin, Amy; And Others

    In recent years, the relationship between moods and thoughts has been the focus of much theorizing and some empirical work. A study was undertaken to examine the intraindividual relationship between negative affect and negative thoughts using a Box-Jenkins time series analysis. College students (N=33) completed a measure of negative mood and…
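A minimal illustration of a Box-Jenkins style fit to a single subject's daily negative-affect series; the ARMA order and the synthetic data are assumptions for the sketch, not the study's specification.

```python
# Illustrative Box-Jenkins style fit to one subject's daily negative-affect series;
# the order (1, 0, 1) and the synthetic data are assumptions for the sketch.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
mood = rng.normal(size=60).cumsum() * 0.1 + 3.0   # stand-in for ~60 daily ratings

fit = ARIMA(mood, order=(1, 0, 1)).fit()
print(fit.summary())
print("one-step-ahead forecast:", fit.forecast(steps=1))
```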

  16. Towards a study of synoptic-scale variability of the California current system

    NASA Technical Reports Server (NTRS)

    1985-01-01

A West Coast satellite time series advisory group was established to consider the scientific rationale for the development of complete west coast time series of imagery of sea surface temperature (as derived by the Advanced Very High Resolution Radiometer on the NOAA polar orbiter) and near-surface phytoplankton pigment concentrations (as derived by the Coastal Zone Color Scanner on Nimbus 7). The scientific and data processing requirements for such time series are also considered. It is determined that such time series are essential if a number of scientific questions regarding the synoptic-scale dynamics of the California Current System are to be addressed. These questions concern both biological and physical processes.

  17. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

Time series analysis is considered as a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis on time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data Mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g., irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use the trend pattern vectors to predict future time series sequences.
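A small sketch of the first phases described above: turning each series into a trend-symbol sequence and counting frequent trend patterns across series. The symbol alphabet, pattern length, and support threshold are arbitrary choices for illustration.

```python
# Sketch of the first two phases above: (a) turn each time series into a trend-symbol
# sequence, (b) count frequent trend patterns of a fixed length. The symbol alphabet,
# window length, and support threshold are assumptions.
from collections import Counter
import numpy as np

def trend_symbols(series, eps=1e-6):
    diffs = np.diff(np.asarray(series, dtype=float))
    return "".join("U" if d > eps else "D" if d < -eps else "S" for d in diffs)

def frequent_patterns(sequences, length=3, min_support=2):
    counts = Counter()
    for seq in sequences:
        trends = trend_symbols(seq)
        for i in range(len(trends) - length + 1):
            counts[trends[i : i + length]] += 1
    return {pat: c for pat, c in counts.items() if c >= min_support}

series_a = [1, 2, 3, 2, 2, 3, 4, 3]
series_b = [5, 6, 7, 6, 6, 7, 8, 7]   # a "dependent" series with the same trend shape
print(frequent_patterns([series_a, series_b]))
```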

  18. Impact of Delivery Modality, Student GPA, and Time-Lapse since High School on Successful Completion of College-Level Math after Taking Developmental Math

    ERIC Educational Resources Information Center

    Acosta, Diane; North, Teresa Lynn; Avella, John

    2016-01-01

This study considered whether delivery modality, student GPA, or time since high school affected whether 290 students who had completed a developmental math series at a community college were able to successfully complete college-level math. The data used in the study comprised 4 years of historical student data from Odessa College…

  19. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedded techniques.
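The second result, reconstructing state-space behavior from a single observable, is usually realized with delay-coordinate embedding; a minimal sketch (with arbitrary delay and dimension) follows.

```python
# Minimal delay-coordinate embedding sketch illustrating the second result above:
# reconstructing a state-space trajectory from a single observed time series.
# The delay and embedding dimension below are arbitrary choices for illustration.
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Return an (n, dim) array of delay vectors [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

t = np.linspace(0, 50, 2000)
observable = np.sin(t) + 0.5 * np.sin(2.2 * t)   # stand-in for a measured series
embedded = delay_embed(observable, dim=3, tau=20)
print(embedded.shape)
```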

  20. Predictors of Human Papillomavirus Vaccine Completion Among Female and Male Vaccine Initiators in Family Planning Centers.

    PubMed

    Simons, Hannah R; Unger, Zoe D; Lopez, Priscilla M; Kohn, Julia E

    2015-12-01

    We estimated human papillomavirus (HPV) vaccine series completion and examined predictors of completion among adolescents and young adults in a large family planning network. Our retrospective cohort study of vaccine completion within 12 months and time to completion used electronic health record data from 119 Planned Parenthood health centers in 11 US states for 9648 patients who initiated HPV vaccination between January 2011 and January 2013. Among vaccine initiators, 29% completed the series within 12 months. Patients who were male, younger than 22 years, or non-Hispanic Black or who had public insurance were less likely to complete within 12 months and completed more slowly than their counterparts. Gender appeared to modify the effect of public versus private insurance on completion (adjusted hazard ratio = 0.76 for women and 0.95 for men; relative excess risk due to interaction = 0.41; 95% confidence interval = 0.09, 0.73). Completion was low yet similar to previous studies conducted in safety net settings.

  1. Macroscopic Spatial Complexity of the Game of Life Cellular Automaton: A Simple Data Analysis

    NASA Astrophysics Data System (ADS)

    Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Rodríguez-Achach, M. E.

In this chapter we present a simple data analysis of an ensemble of 20 time series, generated by averaging the spatial positions of the living cells for each state of the Game of Life Cellular Automaton (GoL). We show that at the macroscopic level described by these time series, the complexity properties of GoL are also present, and the following emergent properties, typical of data extracted from complex systems such as financial or economic ones, come out: variations of the generated time series follow an asymptotic power-law distribution; large fluctuations tend to be followed by large fluctuations, and small fluctuations by small ones; and linear correlations decay quickly, whereas the correlations of the absolute variations exhibit long-range memory. Finally, a Detrended Fluctuation Analysis (DFA) of the generated time series indicates that the GoL spatial macro states described by the time series are neither completely ordered nor random, in a measurable and very interesting way.
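For reference, a compact sketch of detrended fluctuation analysis of the kind cited above; the box sizes and first-order detrending are common defaults, not the authors' exact settings.

```python
# Compact detrended fluctuation analysis (DFA) sketch of the kind referenced above;
# box sizes and first-order (linear) detrending are usual defaults, but this is an
# illustration rather than the authors' code.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated, mean-centred series
    flucts = []
    for s in scales:
        n_boxes = len(profile) // s
        rms = []
        for b in range(n_boxes):
            seg = profile[b * s : (b + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, 1)       # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # Scaling exponent = slope of log F(s) vs log s.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(1)
print(dfa_exponent(rng.normal(size=4096)))       # white noise should give alpha ~ 0.5
```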

  2. Effects of a nurse-managed program on hepatitis A and B vaccine completion among homeless adults.

    PubMed

    Nyamathi, Adeline; Liu, Yihang; Marfisee, Mary; Shoptaw, Steven; Gregerson, Paul; Saab, Sammy; Leake, Barbara; Tyler, Darlene; Gelberg, Lillian

    2009-01-01

    Hepatitis B virus (HBV) infection constitutes a major health problem for homeless persons. Ability to complete an HBV vaccination series is complicated by the need to prioritize competing needs, such as addiction issues, safe places to sleep, and food, over health concerns. The objectives of this study were to evaluate the effectiveness of a nurse-case-managed intervention compared with that of two standard programs on completion of the combined hepatitis A virus (HAV) and HBV vaccine series among homeless adults and to assess sociodemographic factors and risk behaviors related to the vaccine completion. A randomized, three-group, prospective, quasi-experimental design was conducted with 865 homeless adults residing in homeless shelters, drug rehabilitation sites, and outdoor areas in the Skid Row area of Los Angeles. The programs included (a) nurse-case-managed sessions plus targeted hepatitis education, incentives, and tracking (NCMIT); (b) standard targeted hepatitis education plus incentives and tracking (SIT); and (c) standard targeted hepatitis education and incentives only (SI). Sixty-eight percent of the NCMIT participants completed the three-series vaccine at 6 months, compared with 61% of SIT participants and 54% of SI participants. NCMIT participants had almost 2 times greater odds of completing vaccination than those of participants in the SI program. Completers were more likely to be older, to be female, to report fair or poor health, and not to have participated in a self-help drug treatment program. Newly homeless White adults were significantly less likely than were African Americans to complete the vaccine series. The use of vaccination programs incorporating nurse case management and tracking is critical in supporting adherence to completion of a 6-month HAV/HBV vaccine. The finding that White homeless persons were the least likely to complete the vaccine series suggests that programs tailored to address their unique cultural issues are needed.

  3. Classroom Management in the Social Studies Class. How to Do It Series, Series 2, No. 7.

    ERIC Educational Resources Information Center

    Sullivan, Cheryl Granade

    Classroom management is discussed in terms of effective instruction, successful group management, maximum use of space, time, and resources, meaningful discipline, student rights, and change strategies. The discussion of effective instruction stresses appropriateness, completeness, clarity, and a variety of lessons. Techniques for successful group…

  4. 78 FR 9699 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ...-time series design. Four groups of 25-30 employees will be established to test the two intervention... expects to complete data collection in 2014. There are no costs to respondents other than their time. The...

  5. 76 FR 30919 - Marine Mammals; File No. 15844

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-27

... minke whales, (2) continue one of the longest and most complete time-series data sets on humpback whale... ecotypes could be harassed up to 50 times each, to acquire 30 successful biopsy samples, per year. See the...

  6. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
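A sketch of the lower-level ingredient of this framework, a Gaussian process fitted to one irregularly sampled lab series and scored by absolute error on a held-out value; this shows only the GP building block, not the full hierarchical linear-dynamical-system model, and the data are synthetic.

```python
# Sketch of the lower-level ingredient of the framework above: a Gaussian process
# fitted to one irregularly sampled lab series, evaluated by absolute error on a
# held-out future value. This is only the GP building block, not the full
# hierarchical (linear dynamical system + GP) model; the data are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

hours = np.array([0.0, 7.5, 20.0, 31.0, 55.0, 80.0])    # irregular sampling times
values = np.array([11.2, 10.1, 9.4, 9.8, 10.6, 11.0])   # e.g. one CBC analyte

train_t, train_y = hours[:-1, None], values[:-1]
gp = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(0.1), normalize_y=True)
gp.fit(train_t, train_y)

pred, std = gp.predict(hours[-1:, None], return_std=True)
abs_err = np.abs(pred[0] - values[-1])
print(f"prediction {pred[0]:.2f} +/- {std[0]:.2f}, absolute error {abs_err:.2f}")
```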

  7. Surface Area Analysis Using the Brunauer-Emmett-Teller (BET) Method: Standard Operating Procedure Series: SOP-C

    DTIC Science & Technology

    2016-09-01

Surface Area Analysis Using the Brunauer-Emmett-Teller (BET) Method, Scientific Operating Procedure Series: SOP-C. Jonathon Brame and Chris Griggs, Environmental Laboratory, U.S. Army Engineer Research and... September 2016. [The remainder of the snippet is standard report-documentation boilerplate about the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing the form.]

  8. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

Data Fusion requires the ability to combine or “fuse” data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  9. Sikuliqiruq: Ice dynamics of the Meade river - Arctic Alaska, from freezeup to breakup from time-series ground imagery

    USGS Publications Warehouse

    Beck, R.A.; Rettig, A.J.; Ivenso, C.; Eisner, Wendy R.; Hinkel, Kenneth M.; Jones, Benjamin M.; Arp, C.D.; Grosse, G.; Whiteman, D.

    2010-01-01

Ice formation and breakup on Arctic rivers strongly influence river flow, sedimentation, river ecology, winter travel, and subsistence fishing and hunting by Alaskan Natives. We use time-series ground imagery of the Meade River to examine the process at high temporal and spatial resolution. Freezeup from complete liquid cover to complete ice cover of the Meade River at Atqasuk, Alaska in the fall of 2008 occurred in less than three days between 28 September and 2 October 2008. Breakup in 2009 occurred in less than two hours between 23:47 UTC on 23 May 2009 and 01:27 UTC on 24 May 2009. All times in UTC. Breakup in 2009 and 2010 was of the thermal style in contrast to the mechanical style observed in 1966 and is consistent with a warming Arctic. © 2010 Taylor & Francis.

  10. Advertising bans as a means of tobacco control policy: a systematic literature review of time-series analyses.

    PubMed

    Quentin, Wilm; Neubauer, Simone; Leidl, Reiner; König, Hans-Helmut

    2007-01-01

This paper reviews the international literature that employed time-series analysis to evaluate the effects of advertising bans on aggregate consumption of cigarettes or tobacco. A systematic search of the literature was conducted. Three groups of studies representing analyses of advertising bans in the U.S.A., in other countries and in 22 OECD countries were defined. The estimated effects of advertising bans and their significance were analysed. 24 studies were identified. They used a wide array of explanatory variables, models, estimating methods and data sources. 18 studies found a negative effect of an advertising ban on aggregate consumption, but only ten of these studies found a significant effect. Two studies using data from 22 OECD countries suggested that partial bans would have little or no influence on aggregate consumption, whereas complete bans would significantly reduce consumption. The results imply that advertising bans have a negative but sometimes only small impact on consumption, and that complete bans can be expected to be more effective. Because of the methodological limitations of analysing the effects of advertising bans with time-series approaches, other approaches should also be used in the future.

  11. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend identification methodologies such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend from the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is presented through an extensive Monte Carlo simulation technique and its comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
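The crossing criterion can be illustrated with a few lines of code: among candidate trend lines through the series centroid, choose the slope whose detrended residuals cross zero most often. The slope grid and synthetic data below are assumptions for illustration.

```python
# Sketch of the crossing idea described above: among candidate trend lines passing
# through the series centroid, pick the slope whose detrended residuals cross zero
# most often. The slope grid is an arbitrary choice for illustration.
import numpy as np

def crossing_trend_slope(y, slopes=np.linspace(-1.0, 1.0, 401)):
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    tc, yc = t.mean(), y.mean()                  # centroid of the series
    best_slope, best_crossings = 0.0, -1
    for m in slopes:
        resid = y - (yc + m * (t - tc))          # detrend by the candidate line
        crossings = np.count_nonzero(np.diff(np.sign(resid)) != 0)
        if crossings > best_crossings:
            best_slope, best_crossings = m, crossings
    return best_slope, best_crossings

rng = np.random.default_rng(2)
series = 0.3 * np.arange(100) + rng.normal(scale=2.0, size=100)
print(crossing_trend_slope(series))              # slope should be near 0.3
```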

  12. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation

    USDA-ARS?s Scientific Manuscript database

The scale mismatch between remotely sensed observations and crop growth model-simulated state variables decreases the reliability of crop yield estimates. To overcome this problem, we used two data assimilation phases: first, we generated a complete leaf area index (LAI) time series by combin...

  13. [Introduction and some problems of the rapid time series laboratory reporting system].

    PubMed

    Kanao, M; Yamashita, K; Kuwajima, M

    1999-09-01

We introduced an on-line system for biochemical, hematological, serological, urinary, bacteriological, and emergency examinations and the associated office work, using a client-server system (NEC PC-LACS) based on centralized outpatient blood collection, centralized outpatient reception, and outpatient examination by reservation. Using this on-line system, results for 71 items in chemical, serological, hematological, and urinary examinations are rapidly reported within 1 hour. Since the ordering system at our hospital has not been completed yet, we constructed a rapid time series reporting system in which time series data obtained on 5 serial occasions are printed on 2 sheets of A4 paper at the time of the final report. In each consultation room of the medical outpatient clinic, at the neuromedical outpatient clinic, and at the kidney center where examinations are frequently performed, terminal equipment and a printer for inquiry were established for real-time output of time series reports. Results are reported by FAX to the other outpatient clinics and wards, and subsequently, time series reports are output at the clinical laboratory department. This system allowed rapid examination, especially preconsultation examination. It was also useful for reducing office work and for making effective use of examination data.

  14. Improvements in Obstacle Clearance Parameters and Reaction Time Over a Series of Obstacles Revealed After Five Repeated Testing Sessions in Older Adults.

    PubMed

    Jehu, Deborah A; Lajoie, Yves; Paquet, Nicole

    2017-12-21

    The purpose of this study was to investigate obstacle clearance and reaction time parameters when crossing a series of six obstacles in older adults. A second aim was to examine the repeated exposure of this testing protocol once per week for 5 weeks. In total, 10 older adults (five females; age: 67.0 ± 6.9 years) walked onto and over six obstacles of varying heights (range: 100-200 mm) while completing no reaction time, simple reaction time, and choice reaction time tasks once per week for 5 weeks. The highest obstacles elicited the lowest toe clearance, and the first three obstacles revealed smaller heel clearance compared with the last three obstacles. Dual tasking negatively impacted obstacle clearance parameters when information processing demands were high. Longer and less consistent time to completion was observed in Session 1 compared with Sessions 2-5. Finally, improvements in simple reaction time were displayed after Session 2, but choice reaction time gradually improved and did not reach a plateau after repeated testing.

  15. Exploring total cardiac variability in healthy and pathophysiological subjects using improved refined multiscale entropy.

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2017-02-01

Multiscale entropy (MSE) and refined multiscale entropy (RMSE) techniques are widely used to evaluate the complexity of a time series across multiple time scales 't'. Both these techniques, at certain time scales (sometimes across all time scales, in the case of RMSE), assign higher entropy to the HRV time series of certain pathologies than to that of healthy subjects, and to their corresponding randomized surrogate time series. This incorrect assessment of signal complexity may be due to the fact that these techniques suffer from the following limitations: (1) the threshold value 'r' is updated as a function of the long-term standard deviation and is hence unable to explore the short-term variability as well as the substantial variability inherent in beat-to-beat fluctuations of long-term HRV time series; (2) in RMSE, the entropy values assigned to the different filtered, scaled time series are the result of changes in variance, but do not completely reflect the real structural organization inherent in the original time series. In the present work, we propose an improved RMSE (I-RMSE) technique by introducing a new procedure that sets the threshold value by taking into account the period-to-period variability inherent in a signal, and we evaluated it on simulated and real HRV databases. The proposed I-RMSE assigns higher entropy to the age-matched healthy subjects than to patients suffering from atrial fibrillation, congestive heart failure, sudden cardiac death and diabetes mellitus, across all time scales. The results strongly support the reduction in complexity of HRV time series in the female group, in old-aged subjects, in patients suffering from severe cardiovascular and non-cardiovascular diseases, and in their corresponding surrogate time series.
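For orientation, a sketch of the standard multiscale-entropy ingredients this abstract builds on, coarse-graining followed by sample entropy with the conventional threshold r = 0.15 x SD; the paper's improved rule for setting r from period-to-period variability is not reproduced here.

```python
# Sketch of the standard multiscale-entropy ingredients referred to above:
# coarse-graining at a given scale followed by sample entropy. The conventional
# threshold r = 0.15 * SD is used here; the paper's improved rule for setting r
# from period-to-period variability is not reproduced.
import numpy as np

def coarse_grain(x, scale):
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_factor=0.15):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i : i + length] for i in range(len(x) - length)])
        dists = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Count pairs (i != j) whose Chebyshev distance is within r.
        return (np.sum(dists <= r) - len(templates)) / 2

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
for scale in (1, 2, 5):
    print(scale, sample_entropy(coarse_grain(x, scale)))
```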

  16. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    PubMed

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
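A minimal segmented-regression sketch of the kind identified above as the most common interrupted time series method; the data are synthetic, and autocorrelation is handled only crudely with Newey-West standard errors.

```python
# Minimal segmented-regression sketch for an interrupted time series: monthly rates
# with a level-change and slope-change term at the intervention date. Data are
# synthetic; autocorrelation is handled here only crudely via HAC (Newey-West)
# standard errors, whereas a full analysis might model it explicitly.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

n, intervention_at = 48, 24
rng = np.random.default_rng(4)
df = pd.DataFrame({"t": np.arange(n)})
df["post"] = (df["t"] >= intervention_at).astype(int)          # level change
df["t_post"] = np.maximum(0, df["t"] - intervention_at)        # slope change
df["rate"] = 50 + 0.2 * df["t"] - 5 * df["post"] - 0.3 * df["t_post"] + rng.normal(0, 1, n)

model = smf.ols("rate ~ t + post + t_post", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3}
)
print(model.params)   # baseline trend, immediate level change, trend change
```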

  17. Initiation & completion rates of hepatitis A vaccination among US pediatric populations born between 2005 and 2009.

    PubMed

    Weiss, Thomas; Zhang, Dongmu; Borse, Nagesh N; Walter, Emmanuel B

    2015-11-27

To estimate hepatitis A vaccine series initiation and completion rates, assess time to vaccination, identify missed opportunities for the hepatitis A vaccine series, and examine factors associated with hepatitis A vaccine series initiation and completion. We conducted a retrospective, observational study using three healthcare claims databases separately. The study population comprised children born between 2005 and 2009 who were continuously enrolled for at least three and a half years from the date of birth. Every child was followed from date of birth for three and a half years for hepatitis A vaccination. There were 93,735 eligible children from Clinformatics Data Mart, 202,513 from MarketScan Commercial, and 207,545 from MarketScan Medicaid. The overall hepatitis A vaccine series initiation rate was 63.8-79.4% and completion rate was 45.1-66.8% across the three databases. About 62.8-90.1% of the children who never initiated hepatitis A vaccine had at least one well visit between 1 and three and a half years of age. Children were more likely to initiate and complete the hepatitis A vaccine series if they were from more recent birth cohorts, from states with a hepatitis A vaccination recommendation prior to the ACIP universal recommendation, from states with daycare/school entry requirements, were enrolled in an HMO health plan, had pediatricians as primary providers, had more doctor's office/well visits and received MMR/Varicella vaccines. In this study, approximately one in every three to five children remained unvaccinated against hepatitis A. Although the hepatitis A vaccine series initiation and completion improved from 2005 to 2009, vaccine coverage has stabilized in recent years. It is important for providers to identify every opportunity for hepatitis A vaccination and to assure that children get protection from this vaccine-preventable disease. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Hand Path Priming in Manual Obstacle Avoidance: Evidence for Abstract Spatiotemporal Forms in Human Motor Control

    ERIC Educational Resources Information Center

    van der Wel, Robrecht P. R. D.; Fleckenstein, Robin M.; Jax, Steven A.; Rosenbaum, David A.

    2007-01-01

    Previous research suggests that motor equivalence is achieved through reliance on effector-independent spatiotemporal forms. Here the authors report a series of experiments investigating the role of such forms in the production of movement sequences. Participants were asked to complete series of arm movements in time with a metronome and, on some…

  19. 36 CFR 1254.92 - How do I submit a request to microfilm records and donated historical materials?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., records preparation, and other NARA requirements in a shorter time frame. (1) You may include in your request only one project to microfilm a complete body of documents, such as an entire series, a major continuous segment of a very large series which is reasonably divisible, or a limited number of separate...

  20. 36 CFR 1254.92 - How do I submit a request to microfilm records and donated historical materials?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., records preparation, and other NARA requirements in a shorter time frame. (1) You may include in your request only one project to microfilm a complete body of documents, such as an entire series, a major continuous segment of a very large series which is reasonably divisible, or a limited number of separate...

  1. Change detection using landsat time series: A review of frequencies, preprocessing, algorithms, and applications

    NASA Astrophysics Data System (ADS)

    Zhu, Zhe

    2017-08-01

The free and open access to all archived Landsat images granted in 2008 has completely changed the way Landsat data are used. Many novel change detection algorithms based on Landsat time series have been developed. We present a comprehensive review of four important aspects of change detection studies based on Landsat time series, including frequencies, preprocessing, algorithms, and applications. We observed the trend that the more recent the study, the higher the frequency of Landsat time series used. We reviewed a series of image preprocessing steps, including atmospheric correction, cloud and cloud shadow detection, and composite/fusion/metrics techniques. We divided all change detection algorithms into six categories, including thresholding, differencing, segmentation, trajectory classification, statistical boundary, and regression. Within each category, six major characteristics of different algorithms, such as frequency, change index, univariate/multivariate, online/offline, abrupt/gradual change, and sub-pixel/pixel/spatial were analyzed. Moreover, some of the widely used change detection algorithms are also discussed. Finally, we reviewed different change detection applications by dividing these applications into two categories, change target and change agent detection.

  2. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671

  3. 21 CFR 1020.31 - Radiographic equipment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... time, but means may be provided to permit completion of any single exposure of the series in process.... Means shall be provided to terminate the exposure at a preset time interval, a preset product of current and time, a preset number of pulses, or a preset radiation exposure to the image receptor. (i) Except...

  4. Classifying with confidence from incomplete information.

    DOE PAGES

    Parrish, Nathan; Anderson, Hyrum S.; Gupta, Maya R.; ...

    2013-12-01

For this paper, we consider the problem of classifying a test sample given incomplete information. This problem arises naturally when data about a test sample is collected over time, or when costs must be incurred to compute the classification features. For example, in a distributed sensor network only a fraction of the sensors may have reported measurements at a certain time, and additional time, power, and bandwidth are needed to collect the complete data to classify. A practical goal is to assign a class label as soon as enough data is available to make a good decision. We formalize this goal through the notion of reliability, the probability that a label assigned given incomplete data would be the same as the label assigned given the complete data, and we propose a method to classify incomplete data only if some reliability threshold is met. Our approach models the complete data as a random variable whose distribution is dependent on the current incomplete data and the (complete) training data. The method differs from standard imputation strategies in that our focus is on determining the reliability of the classification decision, rather than just the class label. We show that the method provides useful reliability estimates of the correctness of the imputed class labels on a set of experiments on time-series data sets, where the goal is to classify the time-series as early as possible while still guaranteeing that the reliability threshold is met.
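The reliability idea can be sketched as Monte Carlo completion of the missing features followed by a vote over the resulting labels; resampling missing values from the training data is a crude stand-in for the paper's conditional model of the complete data.

```python
# Sketch of the reliability idea described above: complete the missing features
# many times, classify each completion, and only emit a label when the most
# common label's share clears a reliability threshold. Sampling missing values
# from the training data is a crude stand-in for the paper's conditional model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
X_train = rng.normal(size=(200, 4))
y_train = (X_train[:, 0] + X_train[:, 3] > 0).astype(int)
clf = LogisticRegression().fit(X_train, y_train)

def classify_if_reliable(x_partial, observed_mask, threshold=0.9, draws=500):
    completions = np.tile(x_partial, (draws, 1))
    missing = ~observed_mask
    # Impute missing coordinates by resampling them from the training data.
    idx = rng.integers(0, len(X_train), size=draws)
    completions[:, missing] = X_train[idx][:, missing]
    labels = clf.predict(completions)
    label, count = np.unique(labels, return_counts=True)
    best = np.argmax(count)
    reliability = count[best] / draws
    return (label[best], reliability) if reliability >= threshold else (None, reliability)

x = np.array([2.0, 0.0, 0.0, 0.0])               # only feature 0 observed so far
print(classify_if_reliable(x, np.array([True, False, False, False])))
```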

  5. LESSONS-LEARNED AND SUCCESS STORIES FROM EPA'S REAL-TIME ENVIRONMENTAL MONITORING, DATA DELIVERY, AND PUBLIC OUTREACH PROGRAM

    EPA Science Inventory

    TTSD has completed a series of technology transfer and risk communication handbooks, case studies, and summary reports for community-based environmental monitoring projects under EPA's Real-Time Environmental Monitoring, Data Delivery, and Public Outreach Program. The Program tak...

  6. The "Chaos Theory" and nonlinear dynamics in heart rate variability analysis: does it work in short-time series in patients with coronary heart disease?

    PubMed

    Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana

    2007-04-01

Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series from patients with coronary heart disease (CHD) during the exercise electrocardiogram (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic data had been obtained. Series of R-R and ST-T intervals were obtained from exercise ECG data after digital sampling. The rescaled range analysis method was used to determine the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of fluctuations in the time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test program separately, as well as in the exercise program as a whole. ApEn was significantly lower in the CHD group in both R-R and ST-T ECG intervals (P < 0.001). The nonlinear dynamic methods could have clinical and prognostic applicability also in short-time ECG series. Dynamic analysis based on chaos theory during the exercise ECG test points to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.

  7. Response to "Comment on 'Adaptive Q-S (lag, anticipated, and complete) time-varying synchronization and parameters identification of uncertain delayed neural networks'" [Chaos 17, 038101 (2007)]

    NASA Astrophysics Data System (ADS)

    Yu, Wenwu; Cao, Jinde

    2007-09-01

    Parameter identification of dynamical systems from time series has received increasing interest due to its wide applications in secure communication, pattern recognition, neural networks, and so on. Given the driving system, parameters can be estimated from the time series by using an adaptive control algorithm. Recently, it has been reported that for some stable systems the parameters are difficult to identify [Li et al., Phys. Lett. A 333, 269-270 (2004); Remark 5 in Yu and Cao, Physica A 375, 467-482 (2007); and Li et al., Chaos 17, 038101 (2007)]. In this paper, a brief discussion of whether parameters can be identified from time series is presented. Through some detailed analyses, the question of why the parameters of stable systems can hardly be estimated is discussed. Some interesting examples are given to verify the proposed analysis.

  8. Unraveling chaotic attractors by complex networks and measurements of stock market complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Hongduo; Li, Ying, E-mail: mnsliy@mail.sysu.edu.cn

    2014-03-15

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel–Ziv complexity), and is an appropriate measure of a series' complexity. The proposed method is used to research the complexity of the world's major capital markets. None of these markets are completely random, and they have different degrees of complexity, both over the entire length of their time series and at finer levels of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process.
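
    The abstract ties the index R to Kolmogorov complexity estimated through the Lempel–Ziv (LZ76) phrase count. A common way to obtain such an estimate from a price series, shown below as a generic sketch (not the authors' network construction), is to binarize the returns and count LZ76 phrases, normalized by the value expected for a random sequence.

```python
import numpy as np

def lz76_complexity(symbols):
    """Kaspar-Schuster counting of Lempel-Ziv (LZ76) phrases in a symbol sequence."""
    s = list(symbols)
    n = len(s)
    if n < 2:
        return n
    u, v, w, v_max, c = 0, 1, 1, 1, 1
    while True:
        if s[u + v - 1] == s[w + v - 1]:
            v += 1
            if w + v >= n:            # the remaining tail forms the final phrase
                return c + 1
        else:
            v_max = max(v, v_max)
            u += 1
            if u == w:                # no earlier copy exists: close the phrase
                c += 1
                w += v_max
                if w >= n:
                    return c
                u, v, v_max = 0, 1, 1
            else:
                v = 1

# Illustration: binarize daily log-returns (sign of the return) and normalize the
# phrase count by n/log2(n), the leading-order value for a random sequence.
prices = np.cumprod(1 + np.random.default_rng(1).normal(0, 0.01, 2000))
bits = (np.diff(np.log(prices)) > 0).astype(int)
n = len(bits)
print(lz76_complexity(bits) / (n / np.log2(n)))   # close to 1 for a random series
```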

  9. A videotape series for teaching physicians to evaluate sexually abused children.

    PubMed

    Jones, Jerry G; Garrett, Judy; Worthington, Toss

    2004-01-01

    A free videotape subscription series was utilized to increase the knowledge of general physicians in clinical practice about the medical evaluation of sexually abused children. Of the 65 physicians who requested the first tape, 39 (60%) completed it. Fourteen of the 39 physicians who completed the first tape (36%) completed the 5-tape series. Completion data suggested that series completion was unrelated to prior knowledge, years since training or number of sexual abuse examinations performed in the previous year. Evaluative comments suggested that quality of the tapes was not a factor in completion rate. On tests of immediate retention, the average posttest percent correct was significantly higher than on the pretest. In a 3-year follow-up of the 14 physicians who completed the series, 10 reported that they were still performing sexual abuse examinations.

  10. OceanSITES: Sustained Ocean Time Series Observations in the Global Ocean.

    NASA Astrophysics Data System (ADS)

    Weller, R. A.; Gallage, C.; Send, U.; Lampitt, R. S.; Lukas, R.

    2016-02-01

    Time series observations at critical or representative locations are an essential element of a global ocean observing system; they are unique and complement other approaches to sustained observing. OceanSITES is an international group of oceanographers associated with such time series sites. OceanSITES exists to promote the continuation and extension of ocean time series sites around the globe. It also exists to plan and oversee the global array of sites in order to address the needs of research, climate change detection, operational applications, and policy makers. OceanSITES is a voluntary group that sits as an Action Group of the JCOMM-OPS Data Buoy Cooperation Panel, where JCOMM-OPS is the operational ocean observing oversight group of the Joint Technical Commission for Oceanography and Marine Meteorology of the Intergovernmental Oceanographic Commission and the World Meteorological Organization. The way forward includes working to complete the global array, moving toward multidisciplinary instrumentation on a subset of the sites, and increasing utilization of the time series data, which are freely available from two Global Data Assembly Centers, one at the National Data Buoy Center and one at Coriolis at IFREMER. One recent OceanSITES initiative and several results from OceanSITES time series sites are presented. The recent initiative was the assembly of a pool of temperature/conductivity recorders for provision to OceanSITES sites in order to provide deep ocean temperature and salinity time series. Examples from specific sites include: a 15-year record of surface meteorology and air-sea fluxes from off northern Chile that shows evidence of long-term trends in surface forcing; changes in upper ocean salinity and stratification, in association with regional change in the hydrological cycle, seen at the Hawaii time series site; results from monitoring Atlantic meridional transport; and results from a European multidisciplinary time series site.

  11. Fast Determination of Distribution-Connected PV Impacts Using a Variable Time-Step Quasi-Static Time-Series Approach: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry

    The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase, and tools and methods must be developed to support it. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort needed to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors in metrics such as the highest and lowest voltage occurring on the feeder, the number of voltage regulator tap operations, and the total losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
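
    The core idea of a variable-time-step QSTS solver can be illustrated with a generic adaptive-step loop: sweep the year with a coarse step while the PV and load profiles are quiet, and drop to a fine step when they change quickly. In the sketch below, `solve_powerflow`, the step sizes, and the 0.02 per-unit threshold are placeholders, not the solver described in the paper.

```python
import numpy as np

def variable_step_qsts(pv, load, solve_powerflow, coarse=60, fine=1, tol=0.02):
    """Quasi-static time-series sweep with a variable time step (in seconds).

    pv, load        : 1-s resolution input profiles (per unit), equal-length arrays
    solve_powerflow : callable(pv_value, load_value) -> dict of circuit metrics
    Advance in `coarse` steps while inputs are quiet; fall back to `fine` steps
    whenever the change across the coarse window exceeds `tol` per unit.
    """
    results, t, n = [], 0, len(pv)
    while t < n:
        end = min(t + coarse, n)
        quiet = (np.ptp(pv[t:end]) < tol) and (np.ptp(load[t:end]) < tol)
        step = coarse if quiet else fine
        for k in range(t, end, step):
            results.append((k, solve_powerflow(pv[k], load[k])))
        t = end
    return results
```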

  12. Long-Term Stability of Radio Sources in VLBI Analysis

    NASA Technical Reports Server (NTRS)

    Engelhardt, Gerald; Thorandt, Volkmar

    2010-01-01

    Positional stability of radio sources is an important requirement for modeling only one source position over the complete span of VLBI data, which presently covers more than 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for normal distribution of the residuals to the weighted mean for each radio source coordinate component of the time series. Systematic phenomena in the time series can thus be detected. Nevertheless, an inspection of the rate estimates and of the weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 as could be identified. They include 100 ICRF2 axes-defining sources, which were determined independently of the method applied in the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 can also be used to augment the 295 ICRF2 defining sources.
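
    The screening described here, a normality test on residuals about the weighted mean together with rate and WRMS checks, can be mimicked per coordinate component as in the generic scipy-based sketch below (not the BKG analysis software).

```python
import numpy as np
from scipy import stats

def source_stability(t, x, sigma, alpha=0.05):
    """Screen one coordinate time series of a radio source.

    t, x, sigma : epochs, coordinate values, and formal errors
    Returns whether residuals about the weighted mean pass a normality test,
    the weighted RMS about that mean, and the fitted linear rate.
    """
    t, x, sigma = map(np.asarray, (t, x, sigma))
    w = 1.0 / sigma ** 2
    wmean = np.sum(w * x) / np.sum(w)
    resid = x - wmean
    wrms = np.sqrt(np.sum(w * resid ** 2) / np.sum(w))
    rate = np.polyfit(t, x, 1, w=1.0 / sigma)[0]       # weighted linear trend
    p_normal = stats.normaltest(resid).pvalue           # D'Agostino-Pearson test
    return {"normal": p_normal > alpha, "wrms": wrms, "rate": rate}
```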

  13. 76 FR 45799 - Agency Information Collection Activities; Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... the 2005 survey so that the results from it can be used as a baseline for a time-series analysis.\\1... 15 minutes to complete the pretest, the same time as that needed for the actual survey. The revised estimate takes further into account the presumed added time required to respond to questions unique to the...

  14. Necessary and sufficient conditions for the complete controllability and observability of systems in series using the coprime factorization of a rational matrix

    NASA Technical Reports Server (NTRS)

    Callier, F. M.; Nahum, C. D.

    1975-01-01

    The series connection of two linear time-invariant systems that have minimal state space system descriptions is considered. From these descriptions, strict-system-equivalent polynomial matrix system descriptions in the manner of Rosenbrock are derived. They are based on the factorization of the transfer matrix of each subsystem as a ratio of two right or left coprime polynomial matrices. They give rise to a simple polynomial matrix system description of the tandem connection. Theorem 1 states that for the complete controllability and observability of the state space system description of the series connection, it is necessary and sufficient that certain 'denominator' and 'numerator' groups are coprime. Consequences for feedback systems are drawn in Corollary 1. The role of pole-zero cancellations is explained by Lemma 3 and Corollaries 2 and 3.

  15. HMI Data Driven Magnetohydrodynamic Model Predicted Active Region Photospheric Heating Rates: Their Scale Invariant, Flare Like Power Law Distributions, and Their Possible Association With Flares

    NASA Technical Reports Server (NTRS)

    Goodman, Michael L.; Kwan, Chiman; Ayhan, Bulent; Shang, Eric L.

    2017-01-01

    There are many flare forecasting models. For an excellent review and comparison of some of them see Barnes et al. (2016). All these models are successful to some degree, but there is a need for better models. We claim the most successful models explicitly or implicitly base their forecasts on various estimates of components of the photospheric current density J, based on observations of the photospheric magnetic field B. However, none of the models we are aware of compute the complete J. We seek to develop a better model based on computing the complete photospheric J. Initial results from this model are presented in this talk. We present a data driven, near photospheric, 3 D, non-force free magnetohydrodynamic (MHD) model that computes time series of the total J, and associated resistive heating rate in each pixel at the photosphere in the neutral line regions (NLRs) of 14 active regions (ARs). The model is driven by time series of B measured by the Helioseismic & Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series of B in every AR pixel. Errors in B due to these periods can be significant.

  16. Automated Bayesian model development for frequency detection in biological time series.

    PubMed

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
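
    For a single stationary sinusoid with unknown amplitude, phase, and noise level, Bayesian Spectrum Analysis in the Bretthorst/Jaynes form reduces, approximately, to a posterior over frequency driven by the Schuster periodogram. The sketch below illustrates that special case for unevenly sampled data; it is a simplified stand-in for the model-comparison machinery described in the paper.

```python
import numpy as np

def log_posterior_frequency(t, d, freqs):
    """Approximate log-posterior over trial frequencies for a single-sinusoid model
    with amplitudes and noise variance marginalized, using the Schuster periodogram.
    t: sample times, d: data values, freqs: trial frequencies (cycles per time unit)."""
    t = np.asarray(t, float)
    d = np.asarray(d, float) - np.mean(d)
    N = len(d)
    d2bar = np.mean(d ** 2)
    logp = []
    for f in freqs:
        w = 2 * np.pi * f
        C = (np.dot(d, np.cos(w * t)) ** 2 + np.dot(d, np.sin(w * t)) ** 2) / N
        logp.append((2 - N) / 2 * np.log(np.clip(1 - 2 * C / (N * d2bar), 1e-12, None)))
    logp = np.array(logp)
    return logp - logp.max()

# Example: noisy, unevenly sampled oscillation at 0.1 cycles per time unit
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 120))
d = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.5, t.size)
freqs = np.linspace(0.01, 0.5, 2000)
print(freqs[np.argmax(log_posterior_frequency(t, d, freqs))])  # close to 0.1
```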

  17. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910

  18. Multiresolution forecasting for futures trading using wavelet decompositions.

    PubMed

    Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B

    2001-01-01

    We investigate the effectiveness of a financial time-series forecasting strategy that exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.
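
    A rough sketch of the strategy: build an additive, shift-invariant multiscale decomposition, fit one small network per scale on lagged windows of that scale, and sum the per-scale one-step forecasts. The moving-average a-trous-style decomposition and the MLP settings below are simplifying assumptions, not the authors' wavelet or relevance-determination configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def atrous_decompose(x, levels=3):
    """Additive a-trous-style decomposition: x = smooth + sum(details).
    Each level smooths with a centred moving average whose width roughly doubles."""
    smooth = np.asarray(x, float)
    details = []
    for j in range(levels):
        width = 2 ** (j + 1) + 1
        kernel = np.ones(width) / width
        smoother = np.convolve(smooth, kernel, mode="same")
        details.append(smooth - smoother)
        smooth = smoother
    return details + [smooth]            # shift-invariant, sums back to x exactly

def forecast_next(x, levels=3, window=8):
    """One-step forecast: model each scale separately, then add the forecasts."""
    parts = atrous_decompose(x, levels)
    total = 0.0
    for series in parts:
        X = np.array([series[i:i + window] for i in range(len(series) - window)])
        y = series[window:]
        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                             random_state=0).fit(X, y)
        total += model.predict(series[-window:][None, :])[0]
    return total
```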

  19. A New Strategy for Analyzing Time-Series Data Using Dynamic Networks: Identifying Prospective Biomarkers of Hepatocellular Carcinoma.

    PubMed

    Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui

    2016-08-31

    Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.

  20. A New Strategy for Analyzing Time-Series Data Using Dynamic Networks: Identifying Prospective Biomarkers of Hepatocellular Carcinoma

    NASA Astrophysics Data System (ADS)

    Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui

    2016-08-01

    Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.

  1. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization, and spectral analysis, each used to eliminate the periodic effect on time series stationarity. First, six time series, including 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term of these series is subsequently modeled with ARIMA. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, monthly water temperature is predicted with more accuracy than monthly streamflow. The ratio of the average stochastic term to the amplitude of the periodic term for monthly streamflow and monthly water temperature was 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively. As a result, the periodic term is more dominant relative to the stochastic term in the monthly water temperature series than in the streamflow series.
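
    Two of the stationarization options compared here, seasonal differencing and seasonal standardization, are easy to state in code; the sketch below (spectral-analysis removal of a fitted periodic term is omitted) assumes a monthly pandas series and is a generic illustration only.

```python
import numpy as np
import pandas as pd

def seasonal_difference(x, period=12):
    """Seasonal differencing: y_t = x_t - x_{t-period}."""
    return x.diff(period).dropna()

def seasonal_standardize(x):
    """Seasonal standardization: remove the month-of-year mean and scale by the
    month-of-year standard deviation."""
    month = x.index.month
    mu = x.groupby(month).transform("mean")
    sd = x.groupby(month).transform("std")
    return (x - mu) / sd

# Example with a synthetic monthly streamflow-like series
idx = pd.date_range("1980-01", periods=240, freq="MS")
flow = pd.Series(50 + 30 * np.sin(2 * np.pi * idx.month.to_numpy() / 12)
                 + np.random.default_rng(0).gamma(2.0, 5.0, 240), index=idx)
print(seasonal_difference(flow).std(), seasonal_standardize(flow).std())
```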

  2. Hispanic Mothers' Beliefs Regarding HPV Vaccine Series Completion in Their Adolescent Daughters

    ERIC Educational Resources Information Center

    Roncancio, A. M.; Ward, K. K.; Carmack, C. C.; Muñoz, B. T.; Cribbs, F. L.

    2017-01-01

    Rates of human papillomavirus (HPV) vaccine series completion among adolescent Hispanic females in Texas in 2014 (~39%) lag behind the Healthy People 2020 goal (80%). This qualitative study identifies Hispanic mothers' salient behavioral, normative and control beliefs regarding having their adolescent daughters complete the vaccine series.…

  3. TOMS and SBUV Data: Comparison to 3D Chemical-Transport Model Results

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Douglass, Anne R.; Steenrod, Steve; Frith, Stacey

    2003-01-01

    We have updated our merged ozone data (MOD) set using the TOMS data from the new version 8 algorithm. We then analyzed these data for contributions from the solar cycle, volcanoes, the QBO, and halogens using a standard statistical time series model. We have recently completed a hindcast run of our 3D chemical-transport model for the same years. This model uses off-line winds from the finite-volume GCM, a full stratospheric photochemistry package, and time-varying forcing due to halogens, solar UV, and volcanic aerosols. We will report on a parallel analysis of these model results using the same statistical time series technique as used for the MOD data.
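
    A standard statistical time series model of this kind is a multiple regression of the ozone record on a seasonal cycle plus solar-cycle, QBO, volcanic-aerosol, and halogen terms. The sketch below is generic; the proxy inputs and the single-harmonic seasonal term are assumptions, not the exact regression used for the MOD analysis.

```python
import numpy as np

def fit_ozone_regression(ozone, solar, qbo, aerosol, halogen, months):
    """Least-squares fit of a regression-style ozone time-series model:
    seasonal cycle + solar-cycle, QBO, volcanic-aerosol, and halogen terms.
    All inputs are 1-D arrays on the same monthly grid."""
    n = len(ozone)
    season = np.column_stack([np.sin(2 * np.pi * months / 12),
                              np.cos(2 * np.pi * months / 12)])
    X = np.column_stack([np.ones(n), season, solar, qbo, aerosol, halogen])
    coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)
    residual = ozone - X @ coef
    return coef, residual

# The same design matrix applied to the CTM hindcast output would allow a
# term-by-term comparison with the observed record.
```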

  4. Comparison of missing value imputation methods in time series: the case of Turkish meteorological data

    NASA Astrophysics Data System (ADS)

    Yozgatligil, Ceylan; Aslan, Sipan; Iyigun, Cem; Batmaz, Inci

    2013-04-01

    This study aims to compare several imputation methods for completing the missing values of spatio-temporal meteorological time series. To this end, six imputation methods are assessed with respect to various criteria, including accuracy, robustness, precision, and efficiency, for artificially created missing data in monthly total precipitation and mean temperature series obtained from the Turkish State Meteorological Service. Of these methods, the simple arithmetic average, the normal ratio (NR), and NR weighted with correlations comprise the simple ones, whereas the multilayer perceptron type neural network and the multiple imputation strategy adopted by Markov Chain Monte Carlo based on expectation-maximization (EM-MCMC) are the computationally intensive ones. In addition, we propose a modification of the EM-MCMC method. Besides using a conventional accuracy measure based on squared errors, we also suggest the correlation dimension (CD) technique of nonlinear dynamic time series analysis, which takes spatio-temporal dependencies into account, for evaluating imputation performance. Based on detailed graphical and quantitative analysis, it can be said that although the computational methods, particularly the EM-MCMC method, are computationally inefficient, they seem favorable for imputation of meteorological time series with respect to different missingness periods, considering both measures and both series studied. To conclude, using the EM-MCMC algorithm to impute missing values before conducting any statistical analyses of meteorological data will definitely decrease the amount of uncertainty and give more robust results. Moreover, the CD measure can be suggested for the performance evaluation of missing data imputation, particularly with computational methods, since it gives more precise results in meteorological time series.
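
    Of the simple methods compared, the normal ratio estimate (optionally weighted, e.g. by squared correlations with the target station) is straightforward to write down; a minimal sketch follows, with illustrative numbers rather than the Turkish station data.

```python
import numpy as np

def normal_ratio_impute(target_mean, neighbor_means, neighbor_values, weights=None):
    """Normal-ratio estimate of one missing monthly value at a target station.

    target_mean     : long-term mean of the target station
    neighbor_means  : long-term means of the neighbouring stations
    neighbor_values : the neighbours' observations for the missing month
    weights         : optional weights (e.g. squared correlations with the target);
                      equal weights give the plain normal-ratio method.
    """
    neighbor_means = np.asarray(neighbor_means, float)
    neighbor_values = np.asarray(neighbor_values, float)
    w = np.ones_like(neighbor_values) if weights is None else np.asarray(weights, float)
    scaled = (target_mean / neighbor_means) * neighbor_values
    return np.sum(w * scaled) / np.sum(w)

# Example: three neighbours, correlation-squared weights (illustrative values)
print(normal_ratio_impute(55.0, [60.0, 48.0, 52.0], [72.0, 40.0, 58.0],
                          weights=[0.81, 0.64, 0.72]))
```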

  5. Test Series 2. 4: detailed test plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Test Series 2.4 comprises the fourth sub-series of tests to be scheduled as a part of Test Series 2, the second stage of the combustion research program to be carried out at the Grimethorpe Experimental Pressurized Fluidized Bed Combustion Facility. Test Series 2.1, the first sub-series of tests, was completed in February 1983, and the first part of the second sub-series, Test Series 2.3, in October 1983. Test Series 2.2 was completed in February 1984 after which the second part of Test Series 2.3 commenced. The Plan for Test Series 2.4 consists of 350 data gathering hours to be completed within 520 coal burning hours. This document provides a brief description of the Facility and modifications which have been made following the completion of Test Series 2.1. No further modifications were made following the completion of the first part of Test Series 2.3 or Test Series 2.2. The operating requirements for Test Series 2.4 are specified. The tests will be performed using a UK coal (Lady Windsor), and a UK limestone (Middleton) both nominated by the FRG. Seven objectives are proposed which are to be fulfilled by thirteen test conditions. Six part load tests based on input supplied by Kraftwerk Union AG are included. The cascade is expected to be on line for each test condition and total cascade exposure is expected to be in excess of 450 hours. Details of sampling and special measurements are given. A test plan schedule envisages the full test series being completed within a two month calendar period. Finally, a number of contingency strategies are proposed. 3 figures, 14 tables.

  6. Predicting Long-Term College Success through Degree Completion Using ACT[R] Composite Score, ACT Benchmarks, and High School Grade Point Average. ACT Research Report Series, 2012 (5)

    ERIC Educational Resources Information Center

    Radunzel, Justine; Noble, Julie

    2012-01-01

    This study compared the effectiveness of ACT[R] Composite score and high school grade point average (HSGPA) for predicting long-term college success. Outcomes included annual progress towards a degree (based on cumulative credit-bearing hours earned), degree completion, and cumulative grade point average (GPA) at 150% of normal time to degree…

  7. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  8. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  9. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  10. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  11. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  12. 76 FR 4997 - Medicare Program; Inpatient Psychiatric Facilities Prospective Payment System-Update for Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-27

    .... Repeating this step for other periods produces a series of market basket levels over time. Dividing an index..., P.O. Box 8010, Baltimore, MD 21244-1850. Please allow sufficient time for mailed comments to be...); interrupted stays; and a per treatment adjustment for patients who undergo ECT. A complete discussion of the...

  13. Test Series 2. 2: Detailed Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Test Series 2.2 comprises the third sub-series of tests to be scheduled as a part of Test Series 2, the second stage of the combustion research program to be carried out at the Grimethorpe Experimental Pressurized Fluidized Bed Combustion Facility. Test Series 2.1, the first sub-series of tests, was completed in February 1983, and the first half of the second sub-series, Test Series 2.3, in October 1983. Test Series 2.2 is to consist of 350 data gathering hours, which it is hoped to complete within 560 coal burning hours. This document provides a brief description of the Facility and modifications which have been made following the completion of Test Series 2.1. No further modifications were made following the completion of the first half of Test Series 2.3. The operating requirements are specified. The tests will be performed using a UK coal (Kiveton Park), and a UK limestone (Middleton) both nominated by the FRG. Nine objectives are proposed which are to be fulfilled by thirteen test conditions. Six part load tests are included, as defined by Kraftwerk Union AG. The cascade is expected to be on line for each test condition and total cascade exposure is expected to be in excess of 450 hours. Details of sampling and special measurements are given. A test plan schedule envisages the test series being completed within a two month calendar period. Finally, a number of contingency strategies are proposed.

  14. Bioremediation: Hope/Hype for Environmental Cleanup (LBNL Summer Lecture Series)

    ScienceCinema

    Hazen, Terry [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Ecology Dept.

    2018-01-23

    Summer Lecture Series 2007: Terry Hazen, Senior Staff Scientist and Head of the LBNL Ecology Department, discusses when it's best to resort to engineered bioremediation of contaminated sites, and when it's best to rely on natural attenuation. Recent advances have greatly broadened the potential applications for bioremediation. At the same time, scientists' knowledge of biogeochemical processes has advanced and they can better gauge how quickly and completely contaminants can be degraded without human intervention.

  15. Bioremediation: Hope/Hype for Environmental Cleanup (LBNL Summer Lecture Series)

    ScienceCinema

    Hazen, Terry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Ecology Dept.

    2018-05-04

    Summer Lecture Series 2007: Terry Hazen, Senior Staff Scientist and Head of the LBNL Ecology Department, discusses when it's best to resort to engineered bioremediation of contaminated sites, and when it's best to rely on natural attenuation. Recent advances have greatly broadened the potential applications for bioremediation. At the same time, scientists' knowledge of biogeochemical processes has advanced and they can better gauge how quickly and completely contaminants can be degraded without human intervention.

  16. On the maximum-entropy/autoregressive modeling of time series

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (the z domain) to the complex frequency of one complex harmonic function in the time domain. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
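
    Prony's relation referred to above can be made concrete: fit AR coefficients, take the roots of the characteristic polynomial, and read off a frequency from each pole's angle and a damping rate from its radius. The sketch below uses Yule-Walker estimation from statsmodels as a stand-in for whichever AR fitting method is preferred.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

def ar_poles_to_complex_frequencies(x, order, dt=1.0):
    """Fit an AR(p) model and convert its z-plane poles to complex frequencies
    via Prony's relation: a pole r*exp(i*theta) maps to frequency theta/(2*pi*dt)
    and damping rate ln(r)/dt."""
    rho, sigma = yule_walker(x, order=order, method="mle")
    poles = np.roots(np.r_[1.0, -rho])        # z**p - rho1*z**(p-1) - ... - rho_p
    freqs = np.angle(poles) / (2 * np.pi * dt)
    damping = np.log(np.abs(poles)) / dt
    return poles, freqs, damping

# Example: a damped sinusoid; one conjugate pole pair should sit near 0.1 cycles/step
t = np.arange(500)
x = np.exp(-0.01 * t) * np.cos(2 * np.pi * 0.1 * t) + \
    np.random.default_rng(0).normal(0, 0.05, t.size)
print(ar_poles_to_complex_frequencies(x, order=4)[1])
```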

  17. Hepatitis A, B, and A/B vaccination series completion among US adults: a claims-based analysis.

    PubMed

    Ghaswalla, Parinaz K; Patterson, Brandon J; Cheng, Wendy Y; Duchesneau, Emilie; Macheca, Monica; Duh, Mei Sheng

    2018-06-20

    Hepatitis A and B disease burden persists in the US. We assessed hepatitis A and hepatitis B vaccination series completion rates among 350,240 commercial/Medicare and 12,599 Medicaid enrollees aged ≥19 years. A vaccination series was considered complete provided that the minimum interval between doses, as defined by the CDC, and the minimum number of doses were reached. We stratified completion rates by vaccine type (i.e. monovalent or bivalent) at initial vaccination for each cohort. In the commercial/Medicare cohort, the series completion rate was 32.0% for hepatitis A and 39.6% for hepatitis B among those who initiated with a monovalent vaccine, and 36.2% for hepatitis A and 48.9% for hepatitis B among those who initiated with a bivalent vaccine. In the Medicaid cohort, the series completion rate was 21.0% for hepatitis A and 24.0% for hepatitis B among those who initiated with a monovalent vaccine, and 19.0% for hepatitis A and 24.6% for hepatitis B among those who initiated with a bivalent vaccine. In conclusion, hepatitis A and B vaccination series completion rates were low, and appeared to be lower among Medicaid than among commercial/Medicare enrollees. Commercial/Medicare enrollees who initiated with a bivalent vaccine had higher series completion rates than those who initiated with monovalent vaccines - an observation that was not made among Medicaid enrollees.

  18. Using Social Marketing Theory as a Framework for Understanding and Increasing HPV Vaccine Series Completion Among Hispanic Adolescents: A Qualitative Study

    PubMed Central

    Ward, Kristy K.; Carmack, Chakema C.; Muñoz, Becky T.; Cano, Miguel A.; Cribbs, Felicity

    2016-01-01

    HPV vaccine series completion rates among adolescent Hispanic females and males (~39 and 21 %, respectively) are far below the Healthy People 80 % coverage goal. Completion of the 3-dose vaccine series is critical to reducing the incidence of HPV-associated cancers. This formative study applies social marketing theory to assess the needs and preferences of Hispanic mothers in order to guide the development of interventions to increase HPV vaccine completion. We conducted 51 in-depth interviews with Hispanic mothers of adolescents to identify the key concepts of social marketing theory (i.e., the four P’s: product, price, place and promotion). Results suggest that a desire to complete the vaccine series, vaccine reminders, and preventing illnesses and protecting their children against illnesses and HPV all influence vaccination (product). The majority of Completed mothers did not experience barriers that prevented vaccine series completion and Initiated mothers perceived a lack of health insurance and the cost of the vaccine as potential barriers. Informational barriers were prevalent across both market segments (price). Clinics are important locations for deciding to complete the vaccine series (place). They are the preferred sources to obtain information about the HPV vaccine, thus making them ideal locations to deliver intervention messages, followed by television, the child’s school and brochures (promotion). Increasing HPV vaccine coverage among Hispanic adolescents will reduce the rates of HPV-associated cancers and the cervical cancer health disparity among Hispanic women. This research can inform the development of an intervention to increase HPV vaccine series completion in this population. PMID:27624345

  19. Using Social Marketing Theory as a Framework for Understanding and Increasing HPV Vaccine Series Completion Among Hispanic Adolescents: A Qualitative Study.

    PubMed

    Roncancio, Angelica M; Ward, Kristy K; Carmack, Chakema C; Muñoz, Becky T; Cano, Miguel A; Cribbs, Felicity

    2017-02-01

    HPV vaccine series completion rates among adolescent Hispanic females and males (~39 and 21 %, respectively) are far below the Healthy People 80 % coverage goal. Completion of the 3-dose vaccine series is critical to reducing the incidence of HPV-associated cancers. This formative study applies social marketing theory to assess the needs and preferences of Hispanic mothers in order to guide the development of interventions to increase HPV vaccine completion. We conducted 51 in-depth interviews with Hispanic mothers of adolescents to identify the key concepts of social marketing theory (i.e., the four P's: product, price, place and promotion). Results suggest that a desire to complete the vaccine series, vaccine reminders, and preventing illnesses and protecting their children against illnesses and HPV all influence vaccination (product). The majority of Completed mothers did not experience barriers that prevented vaccine series completion and Initiated mothers perceived a lack of health insurance and the cost of the vaccine as potential barriers. Informational barriers were prevalent across both market segments (price). Clinics are important locations for deciding to complete the vaccine series (place). They are the preferred sources to obtain information about the HPV vaccine, thus making them ideal locations to deliver intervention messages, followed by television, the child's school and brochures (promotion). Increasing HPV vaccine coverage among Hispanic adolescents will reduce the rates of HPV-associated cancers and the cervical cancer health disparity among Hispanic women. This research can inform the development of an intervention to increase HPV vaccine series completion in this population.

  20. Orbital forced frequencies in the 975000 year pollen record from Tenagi Philippon (Greece)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mommersteeg, H.J.P.M.; Young, R.; Wijmstra, T.A.

    Frequency analysis was applied to different time series obtained from the 975 ka pollen record of Tenagi Philippon (Macedonia, Greece). These time series are characteristic of different vegetation types related to specific climatic conditions. Time control of the 196 m deep core was based on 11 finite 14C dates in the upper 17 m, magnetostratigraphy, and correlation with the marine oxygen isotope stratigraphy. Maximum entropy spectrum analysis and Thomson multi-taper spectrum analysis were applied using the complete time series. Periods of 95-99, 40-45, 24.0-25.5 and 19-21 ka, which can be related to orbital forcing, as well as periods of about 68 and 30 ka and of about 15.5, 13.5, 12 and 10.5 ka, were detected. The detected periods of about 68 and 30 ka and 16, 14, 12, 10.5 ka are likely to be harmonics and combination tones of the periods related to orbital forcing. The period of around 30 ka is possibly a secondary peak of obliquity. To study the stability of the detected periods through time, analysis with a moving window was employed. Signals in the eccentricity band were detected clearly during the last 650 ka. In the precession band, detected periods of about 24 ka show an increase in amplitude during the last 650 ka. The evolution of orbital frequencies during the last 1.0 Ma is in general agreement with the results of other marine and continental time series. Time series related to different climatic settings showed a different response to orbital forcing. Time series of vegetational elements sensitive to changes in net precipitation were forced in the precession and obliquity bands. Changes in precession caused changes in the monsoon system, which indirectly had a strong influence on the climatic history of Greece. Time series of vegetational elements which are more indicative of changes in annual temperature are forced in the eccentricity band. 54 refs., 12 figs., 3 tabs.

  1. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps using the endmembers. Then, the estimated endmember is the mean value of the "purified" pixels, where a purified pixel is the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with the "separate unmixing" approach.
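
    The two alternating steps described above (NNLS abundance estimation, then an endmember update from "purified" pixels) can be sketched as follows. This is a generic illustration with VCA initialization omitted; the dominance rule used to assign pixels to endmembers is an assumption, not the authors' K-P-Means code.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_step(Y, E):
    """One iteration of the abundance/endmember update.

    Y : (n_pixels, n_bands) stacked (time-series) spectra
    E : (n_endmembers, n_bands) current endmember estimates
    Returns updated abundances A and endmembers E_new, where each endmember is
    the mean of its purified pixels (pixel minus the contribution of all the
    other endmembers, taken over the pixels it dominates).
    """
    n_em = E.shape[0]
    A = np.array([nnls(E.T, y)[0] for y in Y])        # non-negative abundances
    E_new = E.copy()
    dominant = A.argmax(axis=1)
    for k in range(n_em):
        members = np.where(dominant == k)[0]
        if members.size == 0:
            continue
        others = [j for j in range(n_em) if j != k]
        purified = Y[members] - A[np.ix_(members, others)] @ E[others]
        E_new[k] = purified.mean(axis=0)
    return A, E_new
```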

  2. EFFECTS OF THERMAL TREATMENTS ON THE CHEMICAL REACTIVITY OF TRICHLOROETHYLENE

    EPA Science Inventory

    A series of experiments was completed to investigate abiotic degradation and reaction product formation of trichloroethylene (TCE) when heated. A quartz-tube apparatus was used to study short residence time and high temperature conditions that are thought to occur during thermal ...

  3. A statistical analysis of flank eruptions on Etna volcano

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Tinti, Stefano; Boschi, Enzo

    1985-02-01

    A singularly complete record exists for the eruptive activity of Etna volcano. The time series of occurrence of flank eruptions in the period 1600-1980, in which the record is presumably complete, is found to follow a stationary Poisson process. A revision of the available data shows that eruption durations are rather well correlated with the estimates of the volume of lava flows. This implies that the magnitude of an eruption can be defined directly by its duration. Extreme value statistics are then applied to the time series, using duration as a dependent variable. The probability of occurrence of a very long (300 days) eruption is greater than 50% only in time intervals of the order of 50 years. The correlation found between duration and total output also allows estimation of the probability of occurrence of a major event which exceeds a given duration and total flow of lava. The composite probabilities do not differ considerably from the pure ones. Paralleling a well established application to seismic events, extreme value theory can be profitably used in volcanic risk estimates, provided that appropriate account is also taken of all other variables.
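
    Under the stationary Poisson occurrence model, the chance of seeing at least one eruption longer than a given duration within a time horizon follows directly from the eruption rate and the empirical exceedance probability of durations; a minimal sketch with illustrative numbers (not the Etna catalogue) is given below.

```python
import numpy as np

def prob_long_eruption(durations, years_of_record, duration_threshold, horizon_years):
    """Probability of at least one eruption exceeding `duration_threshold` within
    `horizon_years`, assuming occurrences follow a stationary Poisson process and
    durations are drawn independently from their empirical distribution."""
    rate = len(durations) / years_of_record                  # eruptions per year
    p_exceed = np.mean(np.asarray(durations) > duration_threshold)
    return 1.0 - np.exp(-rate * p_exceed * horizon_years)

# Illustrative numbers only (not the catalogue analyzed in the paper):
durations = np.random.default_rng(2).lognormal(mean=3.5, sigma=1.0, size=60)
print(prob_long_eruption(durations, years_of_record=380,
                         duration_threshold=300, horizon_years=50))
```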

  4. Efficacy of memory aids after traumatic brain injury: A single case series.

    PubMed

    Bos, Hannah R; Babbage, Duncan R; Leathem, Janet M

    2017-01-01

    Individuals living with traumatic brain injury commonly have difficulties with prospective memory, the ability to remember a planned action at the intended time. Traditionally a memory notebook has been recommended as a compensatory memory aid. Electronic devices have the advantage of providing a cue at the appropriate time to remind participants to refer to the memory aid and complete tasks. Research suggests these have potential benefit in neurorehabilitation. This study aimed to investigate the efficacy of a memory notebook and specifically a smartphone as a compensatory memory aid. A single case series design was used to assess seven participants. A no-intervention baseline was followed by training and intervention with either the smartphone alone, or a memory notebook and later the smartphone. Memory was assessed with weekly assigned memory tasks. Participants using a smartphone showed improvements in their ability to complete assigned memory tasks accurately and within the assigned time periods. Use of a smartphone provided additional benefits over and above those already seen for those who received a memory notebook first. Smartphones have the potential to be a useful and cost-effective tool in neurorehabilitation practice.

  5. Multifractality and heteroscedastic dynamics: An application to time series analysis

    NASA Astrophysics Data System (ADS)

    Nascimento, C. M.; Júnior, H. B. N.; Jennings, H. D.; Serva, M.; Gleria, Iram; Viswanathan, G. M.

    2008-01-01

    An increasingly important problem in physics concerns scale invariance symmetry in diverse complex systems, often characterized by heteroscedastic dynamics. We investigate the nature of the relationship between the heteroscedastic and fractal aspects of the dynamics of complex systems, by analyzing the sensitivity to heteroscedasticity of the scaling properties of weakly nonstationary time series. By using multifractal detrended fluctuation analysis, we study the singularity spectra of currency exchange rate fluctuations, after partially or completely eliminating n-point correlations via data shuffling techniques. We conclude that heteroscedasticity can significantly increase multifractality and interpret these findings in the context of self-organizing and adaptive complex systems.

  6. Two-Way Satellite Time Transfer Between USNO and PTB

    DTIC Science & Technology

    2005-08-01

    Two completely independent two-way time and frequency transfer (TWSTFT) ... for the realization of TAI. The X-band data are provided as a backup. To reach the full potential of TWSTFT, especially for time scale comparisons, ... ns for both links were achieved. A change of the TWSTFT transmission frequencies or satellite changes in general cause discontinuities in the series

  7. Featureless classification of light curves

    NASA Astrophysics Data System (ADS)

    Kügler, S. D.; Gianniotis, N.; Polsterer, K. L.

    2015-08-01

    In the era of rapidly increasing amounts of time series data, classification of variable objects has become the main objective of time-domain astronomy. Classification of irregularly sampled time series is particularly difficult because the data cannot be represented naturally as a vector which can be directly fed into a classifier. In the literature, various statistical features serve as vector representations. In this work, we represent time series by a density model. The density model captures all the information available, including measurement errors. Hence, we view this model as a generalization to the static features which directly can be derived, e.g. as moments from the density. Similarity between each pair of time series is quantified by the distance between their respective models. Classification is performed on the obtained distance matrix. In the numerical experiments, we use data from the OGLE (Optical Gravitational Lensing Experiment) and ASAS (All Sky Automated Survey) surveys and demonstrate that the proposed representation performs up to par with the best currently used feature-based approaches. The density representation preserves all static information present in the observational data, in contrast to a less-complete description by features. The density representation is an upper boundary in terms of information made available to the classifier. Consequently, the predictive power of the proposed classification depends on the choice of similarity measure and classifier, only. Due to its principled nature, we advocate that this new approach of representing time series has potential in tasks beyond classification, e.g. unsupervised learning.

  8. Sustainability Logistics Basing - Science and Technology Objective - Demonstration; Industry Assessment and Demonstration Final Report

    DTIC Science & Technology

    2017-08-14

    ... c) Site visits took place for two of the candidate technologies, T-SERIES by ZeroBase and Sol-Char by the University of Colorado, within the ... visits during the planned timeframe within the SLB-STO-D master plan; d) The T-Series by ZeroBase appears to be the most mature of all the industry

  9. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Beyond the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.

  10. The Impact of a Standalone, Patient-centered Communication Course Series on Student Achievement, Preparedness, and Attitudes.

    PubMed

    Trujillo, Jennifer M; McNair, Chelsea D; Linnebur, Sunny A; Valdez, Connie; Trujillo, Toby C

    2016-12-25

    Objective. To evaluate the impact of a standalone, patient-centered communication (PCC) course series on student achievement of and perceived preparedness for PCC skills, and to assess student attitudes regarding the learning methods used. Design. During curriculum renewal, a standalone PCC course series that integrated horizontally and vertically within the curriculum was developed. Student achievement of outcomes was evaluated by aggregate performance on simulated evaluations. Students who completed the PCC series were surveyed to assess preparedness and attitudes. Students in the prior curriculum were also surveyed. Assessment. The majority of students who completed the PCC series met or exceeded expectations for the simulated evaluations. Preparedness responses were more positive from students who completed the PCC series than from those who completed the prior curriculum. Student attitudes about the learning methods used in the courses also were more positive. Conclusion. The standalone PCC course series effectively achieved PCC outcomes and improved student preparedness for communication-based activities.

  11. Primary Education in Latin America: The Unfinished Agenda. Sustainable Development Department Technical Papers Series.

    ERIC Educational Resources Information Center

    Wolff, Laurence; Schiefelbein, Ernesto; Schiefelbein, Paulina

    This paper assesses progress made in elementary education in Latin America from 1990 to 2000. Besides examining completion rates, it looks at four critical indicators: the extent to which repetition rates have declined over the decade; the extent of timely access and on-time ages of elementary school students; the level of elementary school students'…

  12. Characterizing rainfall in the Tenerife island

    NASA Astrophysics Data System (ADS)

    Díez-Sierra, Javier; del Jesus, Manuel; Losada Rodriguez, Inigo

    2017-04-01

    In many locations, rainfall data are collected through networks of meteorological stations. The data collection process is nowadays automated in many places, leading to the development of big databases of rainfall data covering extensive areas of territory. However, managers, decision makers and engineering consultants tend not to extract most of the information contained in these databases due to the lack of specific software tools for their exploitation. Here we present the modeling and development effort put in place on the island of Tenerife in order to develop MENSEI-L, a software tool capable of automatically analyzing a complete rainfall database to simplify the extraction of information from observations. MENSEI-L makes use of weather type information derived from atmospheric conditions to separate the complete time series into homogeneous groups where statistical distributions are fitted. Normal and extreme regimes are obtained in this manner. MENSEI-L is also able to complete missing data in the time series and to generate synthetic stations by using Kriging techniques. These techniques also serve to generate the spatial regimes of precipitation, both normal and extreme ones. MENSEI-L makes use of weather type information to also provide a stochastic three-day probability forecast for rainfall.
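
    As a rough illustration of the per-weather-type regime fitting mentioned above (not MENSEI-L's actual code), the sketch below groups a synthetic daily rainfall series by an assumed weather-type label and fits a gamma distribution to the wet days of each group; the station data, the four weather types and the gamma choice are all assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic daily rainfall (mm) and a weather-type label per day (illustrative)
n_days = 3000
weather_type = rng.integers(0, 4, n_days)            # e.g. 4 synoptic weather types
shape_by_type = {0: 0.6, 1: 0.9, 2: 1.5, 3: 2.5}     # wetter types -> heavier rainfall
rain = np.array([rng.gamma(shape_by_type[w], 6.0) if rng.random() < 0.4 else 0.0
                 for w in weather_type])

# Fit a gamma distribution to the wet days of each homogeneous weather-type group
for wt in sorted(shape_by_type):
    wet = rain[(weather_type == wt) & (rain > 0.1)]
    a, loc, scale = stats.gamma.fit(wet, floc=0.0)   # fix the location at zero for rainfall
    p95 = stats.gamma.ppf(0.95, a, loc=loc, scale=scale)
    print(f"weather type {wt}: n={wet.size:4d}, shape={a:.2f}, "
          f"scale={scale:.1f} mm, 95th percentile={p95:.1f} mm")
```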

  13. Mission Exploitation Platform PROBA-V

    NASA Astrophysics Data System (ADS)

    Goor, Erwin

    2016-04-01

    VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large scale on-demand processing of near real-time data. From November 2015 an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g. - A time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest. - Full-resolution viewing services for the complete data archive. - On-demand processing chains e.g. for the calculation of N-daily composites. - A Virtual Machine will be provided with access to the data archive and tools to work with this data, e.g. various toolboxes and support for R and Python. After an initial release in January 2016, a research platform will gradually be deployed allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful Web based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers a lot of technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data and hence open the large time series to a larger community of users. The presentation will address these benefits for the users and discuss the technical challenges in implementing this MEP.

  14. The scheme and research of TV series multidimensional comprehensive evaluation on cross-platform

    NASA Astrophysics Data System (ADS)

    Chai, Jianping; Bai, Xuesong; Zhou, Hongjun; Yin, Fulian

    2016-10-01

    To address the shortcomings of the traditional comprehensive evaluation system for TV programs, such as reliance on a single data source, neglect of new media, and the high time cost and difficulty of conducting surveys, a new evaluation of TV series is proposed in this paper, which takes a cross-platform, multidimensional perspective on evaluation after broadcasting. This scheme considers the data directly collected from cable television and the Internet as research objects. Based on the TOPSIS principle, after preprocessing and calculation, the data become primary indicators that reflect different profiles of the viewing of TV series. Then, after reasonable weighting and summation by six methods (PCA, AHP, etc.), the primary indicators form composite indices for different channels or websites. The scheme avoids the inefficiency and difficulty of surveying and marking; at the same time, it not only reflects different dimensions of viewing, but also combines TV media and new media, completing the multidimensional comprehensive evaluation of TV series across platforms.
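
    The TOPSIS step can be illustrated with a small sketch; the decision matrix, weights and indicator names below are invented for illustration and do not come from the paper.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) by TOPSIS closeness to the ideal solution.

    matrix : (n_alternatives, n_criteria) raw indicator values
    weights: criterion weights summing to 1 (e.g. derived from PCA, AHP, ...)
    benefit: True for criteria where larger is better, False otherwise
    """
    norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalization per criterion
    v = norm * weights                                    # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                        # closeness coefficient in [0, 1]

# Illustrative: 4 TV series scored on 3 viewing indicators (TV rating, web plays, comments)
data = np.array([[1.2, 8.0e6, 1.5e4],
                 [0.9, 2.0e7, 4.0e4],
                 [2.1, 5.0e6, 0.8e4],
                 [1.5, 1.2e7, 2.2e4]])
weights = np.array([0.4, 0.35, 0.25])
scores = topsis(data, weights, benefit=np.array([True, True, True]))
print("composite indices:", np.round(scores, 3), "ranking:", np.argsort(-scores) + 1)
```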

  15. Multiple imputation for multivariate data with missing and below-threshold measurements: time-series concentrations of pollutants in the Arctic.

    PubMed

    Hopke, P K; Liu, C; Rubin, D B

    2001-03-01

    Many chemical and environmental data sets are complicated by the existence of fully missing values or censored values known to lie below detection thresholds. For example, week-long samples of airborne particulate matter were obtained at Alert, NWT, Canada, between 1980 and 1991, where some of the concentrations of 24 particulate constituents were coarsened in the sense of being either fully missing or below detection limits. To facilitate scientific analysis, it is appealing to create complete data by filling in missing values so that standard complete-data methods can be applied. We briefly review commonly used strategies for handling missing values and focus on the multiple-imputation approach, which generally leads to valid inferences when faced with missing data. Three statistical models are developed for multiply imputing the missing values of airborne particulate matter. We expect that these models are useful for creating multiple imputations in a variety of incomplete multivariate time series data sets.
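
    A toy sketch of the multiple-imputation mechanics for below-detection-limit values, assuming a univariate lognormal concentration model (the paper develops richer multivariate models): each imputation draws the censored values from a fitted distribution truncated at the detection limit, and the per-imputation means are then pooled.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic weekly concentrations (lognormal), censored below a detection limit (DL)
true = rng.lognormal(mean=0.0, sigma=1.0, size=200)
dl = 0.5
observed = np.where(true >= dl, true, np.nan)           # below-DL values treated as "missing"
detects = observed[~np.isnan(observed)]

# Naive lognormal fit to the detects (illustrative only; ignores the censoring bias)
mu, sigma = np.log(detects).mean(), np.log(detects).std(ddof=1)

m = 10                                                  # number of multiple imputations
means = []
for _ in range(m):
    filled = observed.copy()
    n_cens = np.isnan(filled).sum()
    b = (np.log(dl) - mu) / sigma                       # standardized upper bound at the DL
    draws = stats.truncnorm.rvs(-np.inf, b, loc=mu, scale=sigma,
                                size=n_cens, random_state=rng)
    filled[np.isnan(filled)] = np.exp(draws)            # impute below-DL concentrations
    means.append(filled.mean())

pooled = np.mean(means)                                 # pooled multiple-imputation point estimate
print(f"true mean {true.mean():.3f}, pooled MI mean {pooled:.3f} over {m} imputations")
```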

  16. Period and phase comparisons of near-decadal oscillations in solar, geomagnetic, and cosmic ray time series

    NASA Astrophysics Data System (ADS)

    Juckett, David A.

    2001-09-01

    A more complete understanding of the periodic dynamics of the Sun requires continued exploration of non-11-year oscillations in addition to the benchmark 11-year sunspot cycle. In this regard, several solar, geomagnetic, and cosmic ray time series were examined to identify common spectral components and their relative phase relationships. Several non-11-year oscillations were identified within the near-decadal range with periods of ~8, 10, 12, 15, 18, 22, and 29 years. To test whether these frequency components were simply low-level noise or were related to a common source, the phases were extracted for each component in each series. The phases were nearly identical across the solar and geomagnetic series, while the corresponding components in four cosmic ray surrogate series exhibited inverted phases, similar to the known phase relationship with the 11-year sunspot cycle. Cluster analysis revealed that this pattern was unlikely to occur by chance. It was concluded that many non-11-year oscillations truly exist in the solar dynamical environment and that these contribute to the complex variations observed in geomagnetic and cosmic ray time series. Using the different energy sensitivities of the four cosmic ray surrogate series, a preliminary indication of the relative intensities of the various solar-induced oscillations was observed. It provides evidence that many of the non-11-year oscillations result from weak interplanetary magnetic field/solar wind oscillations that originate from corresponding variations in the open-field regions of the Sun.
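
    A small sketch of extracting the amplitude and phase of one near-decadal component from an annual series with the FFT; the synthetic ~11-year signal below stands in for the solar, geomagnetic and cosmic ray series actually analyzed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual series: an ~11-year oscillation plus noise (illustrative)
years = np.arange(1900, 2001)
n = years.size
signal = 50 * np.sin(2 * np.pi * years / 11.0 + 0.8) + rng.normal(0, 10, n)

# Real FFT and the frequency grid in cycles per year
spec = np.fft.rfft(signal - signal.mean())
freqs = np.fft.rfftfreq(n, d=1.0)

# Locate the component nearest to an 11-year period and read off amplitude and phase
target = 1.0 / 11.0
k = np.argmin(np.abs(freqs - target))
amplitude = 2.0 * np.abs(spec[k]) / n      # approximate amplitude of the sinusoid in that bin
phase = np.angle(spec[k])
print(f"period ~{1/freqs[k]:.1f} yr, amplitude ~{amplitude:.1f}, phase {phase:.2f} rad")
```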

  17. New Tools for Comparing Beliefs about the Timing of Recurrent Events with Climate Time Series Datasets

    NASA Astrophysics Data System (ADS)

    Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas

    2017-04-01

    For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
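
    A simplified sketch of scoring one respondent's elicited onset beliefs against a historical onset series: a discrete triangular PMF (ignoring the circular wrap-around that the CMTD handles) is built from the earliest/normal/latest dates, and the onsets from one definition are scored by log-likelihood; all dates below are invented.

```python
import numpy as np

def triangular_pmf(earliest, normal, latest, n_days=365):
    """Discrete triangular PMF over day-of-year (ignores wrap-around for simplicity)."""
    days = np.arange(1, n_days + 1)
    pmf = np.zeros(n_days)
    up = (days >= earliest) & (days <= normal)
    down = (days > normal) & (days <= latest)
    pmf[up] = (days[up] - earliest + 1) / (normal - earliest + 1)
    pmf[down] = (latest - days[down]) / (latest - normal)
    pmf += 1e-4          # small floor so onsets outside the elicited window are not impossible
    return days, pmf / pmf.sum()

# Respondent's elicited beliefs (day-of-year) and historical onsets from one definition
days, pmf = triangular_pmf(earliest=140, normal=160, latest=185)
onsets = np.array([152, 158, 163, 171, 149, 166, 160, 175])   # illustrative onset dates

log_score = np.log(pmf[onsets - 1]).sum()
print(f"log-likelihood of the definition's onsets under the belief PMF: {log_score:.2f}")
```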

  18. Potential and Pitfalls of High-Rate GPS

    NASA Astrophysics Data System (ADS)

    Smalley, R.

    2008-12-01

    With completion of the Plate Boundary Observatory (PBO), we are poised to capture a dense sampling of strong motion displacement time series from significant earthquakes in western North America with High-Rate GPS (HRGPS) data collected at 1 and 5 Hz. These data will provide displacement time series at potentially zero epicentral distance that, if valid, have great potential to contribute to understanding earthquake rupture processes. The caveat relates to whether or not the data are aliased: is the sampling rate fast enough to accurately capture the displacement's temporal history? Using strong motion recordings in the immediate epicentral area of several magnitude 6.7-7.5 events, which can be reasonably expected in the PBO footprint, even the 5 Hz data may be aliased. Some sort of anti-alias processing, currently not applied, will therefore be necessary at the closest stations to guarantee the veracity of the displacement time series. We discuss several solutions based on a-priori knowledge of the expected ground motion and practicality of implementation.
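
    A small sketch of why an anti-alias low-pass filter matters before reducing a densely sampled displacement record to GPS-like rates, using scipy.signal.decimate on a synthetic signal; the sampling rates and signal content are assumptions, not PBO data.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)

# Synthetic 100 Hz "strong motion" displacement with energy above the Nyquist of 5 Hz data (2.5 Hz)
fs = 100.0
t = np.arange(0, 60, 1 / fs)
disp = (np.sin(2 * np.pi * 0.5 * t)            # low-frequency component, recoverable at 5 Hz
        + 0.4 * np.sin(2 * np.pi * 4.0 * t)    # 4 Hz energy, above the 2.5 Hz Nyquist of 5 Hz data
        + 0.05 * rng.standard_normal(t.size))

q = int(fs / 5)                                 # decimate 100 Hz -> 5 Hz

# Naive subsampling (no anti-alias filter): the 4 Hz energy aliases into the 5 Hz series
aliased = disp[::q]

# Anti-aliased decimation: zero-phase FIR low-pass before downsampling
clean = signal.decimate(disp, q, ftype="fir", zero_phase=True)

rms_diff = np.sqrt(np.mean((aliased[:clean.size] - clean[:aliased.size]) ** 2))
print("RMS difference between naive and anti-aliased 5 Hz series:", round(float(rms_diff), 4))
```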

  19. Effects of Concurrent Music Listening on Emotional Processing

    ERIC Educational Resources Information Center

    Graham, Rodger; Robinson, Johanna; Mulhall, Peter

    2009-01-01

    Increased processing time for threatening stimuli is a reliable finding in emotional Stroop tasks. This is particularly pronounced among individuals with anxiety disorders and reflects heightened attentional bias for perceived threat. In this repeated measures study, 35 healthy participants completed a randomized series of Stroop tasks involving…

  20. Amplitude death and synchronized states in nonlinear time-delay systems coupled through mean-field diffusion

    NASA Astrophysics Data System (ADS)

    Banerjee, Tanmoy; Biswas, Debabrata

    2013-12-01

    We explore and experimentally demonstrate the phenomena of amplitude death (AD) and the corresponding transitions through synchronized states that lead to AD in coupled intrinsic time-delayed hyperchaotic oscillators interacting through mean-field diffusion. We identify a novel synchronization transition scenario leading to AD, namely transitions among AD, generalized anticipatory synchronization (GAS), complete synchronization (CS), and generalized lag synchronization (GLS). This transition is mediated by variation of the difference of intrinsic time-delays associated with the individual systems and has no analogue in non-delayed systems or coupled oscillators with coupling time-delay. We further show that, for equal intrinsic time-delays, increasing coupling strength results in a transition from the unsynchronized state to AD state via in-phase (complete) synchronized states. Using Krasovskii-Lyapunov theory, we derive the stability conditions that predict the parametric region of occurrence of GAS, GLS, and CS; also, using a linear stability analysis, we derive the condition of occurrence of AD. We use the error function of proper synchronization manifold and a modified form of the similarity function to provide the quantitative support to GLS and GAS. We demonstrate all the scenarios in an electronic circuit experiment; the experimental time-series, phase-plane plots, and generalized autocorrelation function computed from the experimental time series data are used to confirm the occurrence of all the phenomena in the coupled oscillators.
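
    A heavily simplified sketch of mean-field diffusive coupling between two time-delayed oscillators, using an Ikeda-type nonlinearity rather than the authors' hyperchaotic circuit equations; the parameters are illustrative and meant only to show how the coupling strength and mean-field parameter could be scanned for amplitude suppression.

```python
import numpy as np

# Two Ikeda-type delay oscillators: dx_i/dt = -a*x_i + b*sin(x_i(t - tau_i)) + eps*(Q*xbar - x_i)
# (illustrative stand-in for coupled intrinsic time-delayed systems; not the paper's circuit)
a, b = 1.0, 4.0
tau = np.array([2.0, 2.2])          # unequal intrinsic delays
eps, Q = 0.8, 0.6                   # mean-field diffusion parameters to scan
dt, t_end = 0.01, 200.0

n = int(t_end / dt)
d = (tau / dt).astype(int)          # delays expressed in integration steps
x = np.zeros((n, 2))
x[:d.max() + 1] = 1.0 + 0.1 * np.random.default_rng(5).standard_normal((d.max() + 1, 2))

for k in range(d.max(), n - 1):
    xbar = x[k].mean()                                   # instantaneous mean field
    delayed = np.array([x[k - d[0], 0], x[k - d[1], 1]])
    dxdt = -a * x[k] + b * np.sin(delayed) + eps * (Q * xbar - x[k])
    x[k + 1] = x[k] + dt * dxdt                          # explicit Euler step

# Post-transient peak-to-peak amplitude of each unit (values near zero suggest amplitude death)
amps = x[n // 2:].max(axis=0) - x[n // 2:].min(axis=0)
print("post-transient peak-to-peak amplitudes:", np.round(amps, 3))
```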

  1. Cross-bispectrum computation and variance estimation

    NASA Technical Reports Server (NTRS)

    Lii, K. S.; Helland, K. N.

    1981-01-01

    A method for the estimation of cross-bispectra of discrete real time series is developed. The asymptotic variance properties of the bispectrum are reviewed, and a method for the direct estimation of bispectral variance is given. The symmetry properties are described which minimize the computations necessary to obtain a complete estimate of the cross-bispectrum in the right-half-plane. A procedure is given for computing the cross-bispectrum by subdividing the domain into rectangular averaging regions which help reduce the variance of the estimates and allow easy application of the symmetry relationships to minimize the computational effort. As an example of the procedure, the cross-bispectrum of a numerically generated, exponentially distributed time series is computed and compared with theory.
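
    A sketch of a direct, segment-averaged cross-bispectrum estimate on synthetic quadratically coupled series; the segmenting, window and frequencies are assumptions, and the paper's variance-estimation procedure is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic series: z contains the sum frequency of the x- and y-band components (0.10 + 0.15)
n = 2 ** 14
t = np.arange(n)
x = np.cos(2 * np.pi * 0.10 * t) + 0.5 * rng.standard_normal(n)
y = np.cos(2 * np.pi * 0.15 * t) + 0.5 * rng.standard_normal(n)
z = np.cos(2 * np.pi * 0.25 * t) + 0.5 * rng.standard_normal(n)

def cross_bispectrum(x, y, z, nseg=64):
    """Direct segment-averaged estimate of B_xyz(f1, f2) = E[X(f1) Y(f2) conj(Z(f1+f2))]."""
    seg = len(x) // nseg
    acc = None
    for s in range(nseg):
        sl = slice(s * seg, (s + 1) * seg)
        X, Y, Z = (np.fft.rfft(v[sl] * np.hanning(seg)) for v in (x, y, z))
        m = len(X)
        f1, f2 = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
        valid = f1 + f2 < m                               # keep (f1, f2) with f1+f2 on the grid
        B = np.zeros((m, m), dtype=complex)
        B[valid] = X[f1[valid]] * Y[f2[valid]] * np.conj(Z[(f1 + f2)[valid]])
        acc = B if acc is None else acc + B
    return acc / nseg

B = cross_bispectrum(x, y, z)
i, j = np.unravel_index(np.argmax(np.abs(B)), B.shape)
freqs = np.fft.rfftfreq(len(x) // 64, d=1.0)
print(f"peak |B| at (f1, f2) = ({freqs[i]:.3f}, {freqs[j]:.3f}) cycles/sample")
```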

  2. Dynamic Black-Level Correction and Artifact Flagging for Kepler Pixel Time Series

    NASA Technical Reports Server (NTRS)

    Kolodziejczak, J. J.; Clarke, B. D.; Caldwell, D. A.

    2011-01-01

    Methods applied to the calibration stage of Kepler pipeline data processing [1] (CAL) do not currently use all of the information available to identify and correct several instrument-induced artifacts. These include time-varying crosstalk from the fine guidance sensor (FGS) clock signals, manifestations of drifting moiré pattern as locally correlated nonstationary noise, and rolling bands in the images which find their way into the time series [2], [3]. As the Kepler Mission continues to improve the fidelity of its science data products, we are evaluating the benefits of adding pipeline steps to more completely model and dynamically correct the FGS crosstalk, then use the residuals from these model fits to detect and flag spatial regions and time intervals of strong time-varying black-level which may complicate later processing or lead to misinterpretation of instrument behavior as stellar activity.

  3. Continuous Remote Measurements of Atmospheric O2 Concentrations in Relation to Interannual Variations in Biological Production and Carbon Cycling in the Oceans

    NASA Technical Reports Server (NTRS)

    Keeling, Ralph F.; Campbell, J. A. (Technical Monitor)

    2002-01-01

    We successfully initiated a program to obtain continuous time series of atmospheric O2 concentrations at a semi-remote coastal site, in Trinidad, California. The installation, which was completed in September 1999, consists of commercially available O2 and CO2 analyzers interfaced to a custom gas handling system and housed in a dedicated building at the Trinidad site. Ultimately, the data from this site are expected to provide constraints, complementing satellite data, on variations in ocean productivity and carbon exchange on annual and interannual time scales, in the context of human-induced changes in global climate and other perturbations. The existing time-series, of limited duration, have been used in support of studies of the O2/CO2 exchange from a wildfire (which fortuitously occurred nearby in October 1999) and to quantify air-sea N2O and O2 exchanges related to coastal upwelling events. More generally, the project demonstrates the feasibility of obtaining semi-continuous O2 time series at moderate cost from strategic locations globally.

  4. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents selected key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements, evaluated with statistical methods. The article presents technical aspects of the new portable meters. Emphasis was placed on assessing the comparability of the results using a methodology of stochastic and exploratory methods. The concept is based on the observation that a simple comparison of the result series in the time domain is insufficient. The comparison of regularity should be done in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on modeling results for five annual series of measurements from the new mobile devices and from the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence of the new devices' measurement results with the reference.

  5. Trident Warrior Buoy Testing

    DTIC Science & Technology

    2013-09-30

    the data to track swell events, accurately model swell refraction, and use the data to drive surf-forecasting and other nearshore models (e.g...Temperature (SST). • Addition of an 8 GB Micro-SD card for on-board time series storage (can be unpopulated or disabled). • A complete rewrite of the

  6. MOICC and GIS: An Impact Study. Final Evaluation Report.

    ERIC Educational Resources Information Center

    Ryan, Charles W.; Drummond, Robert J.

    The Guidance Information System (GIS) is a statewide computer-based career information system developed by the Maine Occupational Information Coordinating Committee (MOICC). A time-series design was utilized to investigate the impact of GIS on selected users in public schools and agencies. Participants completed questionnaires immediately after…

  7. 76 FR 20386 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-12

    ... series of online investor forms. Investors may access these forms through the SEC Center for Complaints... depends on the number of investors who use the forms each year and the estimated time it takes to complete... techniques or other forms of information technology. Consideration will be given to comments and suggestions...

  8. 78 FR 15001 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-08

    ... complete night time missions down to starlight and extreme low light conditions. The AN/AVS-9 is designed... responds to threats autonomously with a specific series of measures designed to protect the aircraft from... without interrupting his field of view through the cockpit canopy, the system uses a magnetic transmitter...

  9. 21 CFR 1020.31 - Radiographic equipment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... time, but means may be provided to permit completion of any single exposure of the series in process.... Radiation therapy simulation systems are exempt from this requirement. (iii) The edge of the light field at..., except when the spot-film device is provided for use with a radiation therapy simulation system: (1...

  10. Gap filling strategies and error in estimating annual soil respiration

    USDA-ARS?s Scientific Manuscript database

    Soil respiration (Rsoil) is one of the largest CO2 fluxes in the global carbon (C) cycle. Estimation of annual Rsoil requires extrapolation of survey measurements or gap-filling of automated records to produce a complete time series. While many gap-filling methodologies have been employed, there is ...
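
    The record above is truncated, but a simple gap-filling step of the kind it describes can be sketched as follows, assuming an hourly synthetic soil-respiration series and plain time-based interpolation (one of many possible gap-filling methods, not necessarily the manuscript's).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Synthetic hourly soil respiration (umol CO2 m-2 s-1) with a seasonal cycle and random gaps
idx = pd.date_range("2020-01-01", "2020-12-31 23:00", freq="h")
doy = idx.dayofyear.to_numpy()
rsoil = 2.0 + 1.5 * np.sin(2 * np.pi * (doy - 100) / 365) + 0.2 * rng.standard_normal(idx.size)
rsoil[rng.random(idx.size) < 0.15] = np.nan            # roughly 15% missing records

series = pd.Series(rsoil, index=idx)
filled = series.interpolate(method="time", limit_direction="both")   # complete time series

# Annual total: flux (umol m-2 s-1) * 3600 s per hour, converted to g C m-2 yr-1
annual_gC = (filled * 3600).sum() * 12e-6
print(f"gap fraction: {series.isna().mean():.1%}, annual Rsoil ~ {annual_gC:.0f} g C m-2")
```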

  11. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of their web services in an unambiguous, computer interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and the time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average model (ARMA for short) and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented and it is shown that our approach achieves higher prediction accuracy.
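
    The ARMA forecasting step of the framework can be sketched with statsmodels on a synthetic response-time series; the model order and the conversion of forecast response times into firing rates are illustrative assumptions, and the NMSPN mapping itself is not shown.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)

# Synthetic historical response times (ms) of one service component: AR(1) around 120 ms
n = 300
rt = np.empty(n)
rt[0] = 120.0
for t in range(1, n):
    rt[t] = 120.0 + 0.7 * (rt[t - 1] - 120.0) + rng.normal(0.0, 5.0)

# ARMA(p, q) is ARIMA(p, 0, q); the order here is illustrative, not tuned
model = ARIMA(rt, order=(2, 0, 1)).fit()
forecast_rt = model.forecast(steps=5)                 # predicted future response times (ms)
firing_rates = 1000.0 / forecast_rt                   # transitions per second, as NMSPN input
print("forecast response times (ms):", np.round(forecast_rt, 1))
print("implied firing rates (1/s):  ", np.round(firing_rates, 3))
```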

  12. Information Foraging Theory in Software Maintenance

    DTIC Science & Technology

    2012-09-30

    …time: for example a time series plot of model reaction times to many (simulated) stimuli presented to it in a run • “Statistical” abstractions summed…

  13. MIUS integration and subsystems test program

    NASA Technical Reports Server (NTRS)

    Beckham, W. S., Jr.; Shows, G. C.; Redding, T. E.; Wadle, R. C.; Keough, M. B.; Poradek, J. C.

    1976-01-01

    The MIUS Integration and Subsystems Test (MIST) facility at the Lyndon B. Johnson Space Center was completed and ready in May 1974 for conducting specific tests in direct support of the Modular Integrated Utility System (MIUS). A series of subsystems and integrated tests was conducted since that time, culminating in a series of 24-hour dynamic tests to further demonstrate the capabilities of the MIUS Program concepts to meet typical utility load profiles for a residential area. Results of the MIST Program are presented which achieved demonstrated plant thermal efficiencies ranging from 57 to 65 percent.

  14. Rover Takes a Sunday Drive

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This animation, made with images from the Mars Exploration Rover Spirit hazard-identification camera, shows the rover's perspective of its first post-egress drive on Mars Sunday. Engineers drove Spirit approximately 3 meters (10 feet) toward its first rock target, a football-sized, mountain-shaped rock called Adirondack. The drive took approximately 30 minutes to complete, including time stopped to take images. Spirit first made a series of arcing turns totaling approximately 1 meter (3 feet). It then turned in place and made a series of short, straightforward movements totaling approximately 2 meters (6.5 feet).

  15. Rotavirus vaccine coverage and factors associated with uptake using linked data: Ontario, Canada

    PubMed Central

    Chung, Hannah; Schwartz, Kevin L.; Guttmann, Astrid; Deeks, Shelley L.; Kwong, Jeffrey C.; Crowcroft, Natasha S.; Wing, Laura; Tu, Karen

    2018-01-01

    Background In August 2011, Ontario, Canada introduced a rotavirus immunization program using Rotarix™ vaccine. No assessments of rotavirus vaccine coverage have been previously conducted in Ontario. Methods We assessed vaccine coverage (series initiation and completion) and factors associated with uptake using the Electronic Medical Record Administrative data Linked Database (EMRALD), a collection of family physician electronic medical records (EMR) linked to health administrative data. Series initiation (1 dose) and series completion (2 doses) before and after the program’s introduction were calculated. To identify factors associated with series initiation and completion, adjusted odds ratios (aOR) and 95% confidence intervals (95%CI) were calculated using logistic regression. Results A total of 12,525 children were included. Series completion increased each year of the program (73%, 79% and 84%, respectively). Factors associated with series initiation included high continuity of care (aOR = 2.15; 95%CI, 1.61–2.87), maternal influenza vaccination (aOR = 1.55; 95%CI,1.24–1.93), maternal immigration to Canada in the last five years (aOR = 1.47; 95% CI, 1.05–2.04), and having no siblings (aOR = 1.62; 95%CI,1.30–2.03). Relative to the first program year, infants were more likely to initiate the series in the second year (aOR = 1.71; 95% CI 1.39–2.10) and third year (aOR = 2.02; 95% CI 1.56–2.61) of the program. Infants receiving care from physicians with large practices were less likely to initiate the series (aOR 0.91; 95%CI, 0.88–0.94, per 100 patients rostered) and less likely to complete the series (aOR 0.94; 95%CI, 0.91–0.97, per 100 patients rostered). Additional associations were identified for series completion. Conclusions Family physician delivery achieved moderately high coverage in the program’s first three years. This assessment demonstrates the usefulness of EMR data for evaluating vaccine coverage. Important insights into factors associated with initiation or completion (i.e. high continuity of care, smaller roster sizes, rural practice location) suggest areas for research and potential program supports. PMID:29444167

  16. Hispanic mothers’ beliefs regarding HPV vaccine series completion in their adolescent daughters

    PubMed Central

    Roncancio, A. M.; Ward, K. K.; Carmack, C. C.; Muñoz, B. T.; Cribbs, F. L.

    2017-01-01

    Abstract Rates of human papillomavirus (HPV) vaccine series completion among adolescent Hispanic females in Texas in 2014 (∼39%) lag behind the Healthy People 2020 goal (80%). This qualitative study identifies Hispanic mothers’ salient behavioral, normative and control beliefs regarding having their adolescent daughters complete the vaccine series. Thirty-two mothers of girls (aged 11–17) that had received at least one dose of the HPV vaccine, completed in-depth interviews. Six girls had received one dose of the HPV vaccine, 10 girls had received two doses, and 16 girls had received all three doses. The questions elicited salient: (i) experiential and instrumental attitudes (behavioral beliefs); (ii) supporters and non-supporters (normative beliefs) and (iii) facilitators and barriers (control beliefs). Directed content analysis was employed to select the most salient beliefs. Mothers: (i) expressed salient positive feelings (e.g. good, secure, happy and satisfied); (ii) believed that completing the series resulted in positive effects (e.g. protection, prevention); (iii) believed that the main supporters were themselves, their daughter’s father and doctor with some of their friends not supporting series completion and (iv) believed that vaccine affordability, information, transportation, ease of scheduling and keeping vaccination appointments and taking their daughter’s immunization card to appointments were facilitators. This study represents the first step in building theory-based framework of vaccine series completion for this population. The beliefs identified provide guidance for health care providers and intervention developers. PMID:28088755

  17. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h-1. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h-1. Selected designs with only a small amount of data points were found to be sufficient for an adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.

  18. Interannual Variations In the Low-Degree Components of the Geopotential derived from SLR and the Connections With Geophysical/Climatic Processes

    NASA Technical Reports Server (NTRS)

    Chao, Benjamin F.; Cox, Christopher M.; Au, Andrew Y.

    2004-01-01

    Recent Satellite Laser Ranging derived long wavelength gravity time series analysis has focused to a large extent on the effects of the recent large changes in the Earth's J2, and the potential causes. However, it is difficult to determine whether there are corresponding signals in the shorter wavelength zonals from the existing SLR-derived time variable gravity results, although it appears that geophysical fluid transport is being observed. For example, the recovered J3 time series shows remarkable agreement with NCEP-derived estimates of atmospheric gravity variations. Likewise, some of the non-zonal spherical harmonic coefficient series have significant interannual signal that appears to be related to mass transport. The non-zonal degree 2 terms show reasonable correlation with atmospheric signals, as well as climatic effects such as El Nino Southern Oscillation. While the formal uncertainty of these terms is significantly higher than that for J2, it is also clear that there is useful signal to be extracted. Consequently, the SLR time series is being reprocessed to improve the time variable gravity field recovery. We will present recent updates on the J2 evolution, as well as a look at other components of the interannual variations of the gravity field, complete through degree 4, and possible geophysical and climatic causes.

  19. Two-pass imputation algorithm for missing value estimation in gene expression time series.

    PubMed

    Tsiporkova, Elena; Boeva, Veselka

    2007-10-01

    Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects for each gene expression profile with missing values a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms, in particular for datasets with a higher level of missing entries, the neighborhood-wise and the position-wise algorithms. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former algorithm has appeared superior to the latter one. Motivated by these findings, indicating clearly the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. The software also provides for a choice between three different initial rough imputation methods.
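
    The DTW distance at the core of the candidate-profile selection can be sketched with the textbook dynamic-programming recursion; the gene profiles below are invented, and the position-wise, neighborhood-wise and two-pass imputation steps of the paper are not reproduced.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two profiles."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Rank candidate expression profiles by DTW similarity to a profile with missing values
target = np.array([0.1, 0.4, 0.9, 1.2, 0.8, 0.3])          # time course of one gene
candidates = {
    "geneA": np.array([0.0, 0.5, 1.0, 1.1, 0.7, 0.2]),      # similar shape, slight shift
    "geneB": np.array([1.2, 0.9, 0.5, 0.2, 0.4, 0.8]),      # opposite trend
}
for name, prof in candidates.items():
    print(name, "DTW distance:", round(dtw_distance(target, prof), 3))
```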

  20. The Effect of Two-dimensional and Stereoscopic Presentation on Middle School Students' Performance of Spatial Cognition Tasks

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Lee, Hee-Sun

    2010-02-01

    We investigated whether and how student performance on three types of spatial cognition tasks differs when worked with two-dimensional or stereoscopic representations. We recruited nineteen middle school students visiting a planetarium in a large Midwestern American city and analyzed their performance on a series of spatial cognition tasks in terms of response accuracy and task completion time. Results show that response accuracy did not differ between the two types of representations while task completion time was significantly greater with the stereoscopic representations. The completion time increased as the number of mental manipulations of 3D objects increased in the tasks. Post-interviews provide evidence that some students continued to think of stereoscopic representations as two-dimensional. Based on cognitive load and cue theories, we interpret that, in the absence of pictorial depth cues, students may need more time to be familiar with stereoscopic representations for optimal performance. In light of these results, we discuss potential uses of stereoscopic representations for science learning.

  1. Bayesian dynamic modeling of time series of dengue disease case counts

    PubMed Central

    López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-01-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model’s short-term performance for predicting dengue cases. The methodology is based on dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) statistic for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period, using the mean absolute percentage error. The results showed that the best model included first-order random walk time-varying coefficients for both the calendar trend and the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- and two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941

  2. Fuzzy Inference System Approach for Locating Series, Shunt, and Simultaneous Series-Shunt Faults in Double Circuit Transmission Lines

    PubMed Central

    Swetapadma, Aleena; Yadav, Anamika

    2015-01-01

    Many schemes are reported for shunt fault location estimation, but fault location estimation of series or open conductor faults has not been dealt with so far. The existing numerical relays only detect the open conductor (series) fault and give the indication of the faulty phase(s), but they are unable to locate the series fault. The repair crew needs to patrol the complete line to find the location of a series fault. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized phasors of voltage and current signals from both ends of the line have been used as input to the proposed fuzzy based fault location scheme. The percentage error in locating series faults is within 1%, and within 5% for shunt faults, for all the tested fault cases. Validation of the percentage error in location estimation is done using the Chi square test at both the 1% and 5% levels of significance. PMID:26413088

  3. High-Temperature Cyclic Oxidation Data, Volume 1

    NASA Technical Reports Server (NTRS)

    Barrett, C. A.; Garlick, R. G.; Lowell, C. E.

    1984-01-01

    This first in a series of cyclic oxidation handbooks contains specific-weight-change-versus-time data and X-ray diffraction results derived from high-temperature cyclic tests on high-temperature, high-strength nickel-base gamma/gamma' and cobalt-base turbine alloys. Each page of data summarizes a complete test on a given alloy sample.

  4. Family Child Care Tax Workbook. Redleaf Press Business Series.

    ERIC Educational Resources Information Center

    Copeland, Tom

    This workbook presents information to assist taxpayers in completing their 1996 federal income tax forms for their family child care business and is designed to be used in conjunction with "The Basic Guide to Family Child Care Record Keeping." Procedures prior to filing the tax return are discussed and calculation of the time-space…

  5. On the completeness and the linear dependence of the Cartesian multipole series in representing the solution to the Helmholtz equation.

    PubMed

    Liu, Yangfan; Bolton, J Stuart

    2016-08-01

    The (Cartesian) multipole series, i.e., the series comprising monopole, dipoles, quadrupoles, etc., can be used, as an alternative to the spherical or cylindrical wave series, in representing sound fields in a wide range of problems, such as source radiation, sound scattering, etc. The proofs of the completeness of the spherical and cylindrical wave series in these problems are classical results, and it is also generally agreed that the Cartesian multipole series spans the same space as the spherical waves: a rigorous mathematical proof of that statement has, however, not been presented. In the present work, such a proof of the completeness of the Cartesian multipole series, both in two and three dimensions, is given, and the linear dependence relations among different orders of multipoles are discussed, which then allows one to easily extract a basis from the multipole series. In particular, it is concluded that the multipoles comprising the two highest orders in the series form a basis of the whole series, since the multipoles of all the lower source orders can be expressed as a linear combination of that basis.
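
    One common way of writing the Cartesian multipole series in question, shown here only as a notation sketch (the paper's completeness and linear-dependence arguments are not reproduced):

```latex
% Cartesian multipole expansion of an exterior Helmholtz solution about the origin
% (notation sketch; details and normalization conventions vary between authors)
\[
  p(\mathbf{r}) \;=\; \sum_{\alpha,\beta,\gamma \ge 0}
      c_{\alpha\beta\gamma}\,
      \frac{\partial^{\alpha+\beta+\gamma}}{\partial x^{\alpha}\,\partial y^{\beta}\,\partial z^{\gamma}}\,
      G(\mathbf{r}),
  \qquad
  G(\mathbf{r}) = \frac{e^{\mathrm{i}kr}}{4\pi r},
  \quad r = |\mathbf{r}|,
\]
% The term with alpha+beta+gamma = 0 is the monopole, the three first derivatives are the
% dipoles, and so on; terms of a given order are not all linearly independent, which is why
% a basis can be extracted from the two highest orders retained, as the abstract states.
```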

  6. Earth Observing System, Conclusions and Recommendations

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The following Earth Observing Systems (E.O.S.) recommendations were suggested: (1) a program must be initiated to ensure that present time series of Earth science data are maintained and continued. (2) A data system that provides easy, integrated, and complete access to past, present, and future data must be developed as soon as possible. (3) A long term research effort must be sustained to study and understand these time series of Earth observations. (4) The E.O.S. should be established as an information system to carry out those aspects of the above recommendations which go beyond existing and currently planned activities. (5) The scientific direction of the E.O.S. should be established and continued through an international scientific steering committee.

  7. Tracking signal test to monitor an intelligent time series forecasting model

    NASA Astrophysics Data System (ADS)

    Deng, Yan; Jaraiedi, Majid; Iskander, Wafik H.

    2004-03-01

    Extensive research has been conducted on the subject of Intelligent Time Series forecasting, including many variations on the use of neural networks. However, investigation of model adequacy over time, after the training process is completed, remains to be fully explored. In this paper we demonstrate how a smoothed-error tracking signal test can be incorporated into a neuro-fuzzy model to monitor the forecasting process and serve as a statistical measure for keeping the forecasting model up-to-date. The proposed monitoring procedure is effective in the detection of nonrandom changes, due to model inadequacy or lack of unbiasedness in the estimation of model parameters and deviations from the existing patterns. This powerful detection device will result in improved forecast accuracy in the long run. An example data set has been used to demonstrate the application of the proposed method.
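
    A minimal sketch of the smoothed-error tracking signal (Trigg's signal) being monitored, assuming a naive constant forecast in place of the neuro-fuzzy model and an illustrative control limit.

```python
import numpy as np

def tracking_signal(actual, forecast, alpha=0.05):
    """Trigg's smoothed-error tracking signal: smoothed error / smoothed MAD."""
    e_s, mad = 0.0, 1e-9
    ts = np.empty(len(actual))
    for t, (a, f) in enumerate(zip(actual, forecast)):
        err = a - f
        e_s = alpha * err + (1 - alpha) * e_s          # smoothed (signed) error
        mad = alpha * abs(err) + (1 - alpha) * mad     # smoothed mean absolute deviation
        ts[t] = e_s / mad
    return ts                                          # values lie in [-1, 1]

rng = np.random.default_rng(6)
actual = 100 + rng.normal(0, 5, 120)
actual[80:] += 15                                      # level shift the model does not track
forecast = np.full(120, 100.0)                         # stand-in for the neuro-fuzzy forecasts

ts = tracking_signal(actual, forecast)
flagged = np.where(np.abs(ts) > 0.6)[0]                # 0.6 is an illustrative control limit
print("first flagged period:", flagged[0] if flagged.size else None)
```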

  8. Hispanic mothers' beliefs regarding HPV vaccine series completion in their adolescent daughters.

    PubMed

    Roncancio, A M; Ward, K K; Carmack, C C; Muñoz, B T; Cribbs, F L

    2017-02-01

    Rates of human papillomavirus (HPV) vaccine series completion among adolescent Hispanic females in Texas in 2014 (∼39%) lag behind the Healthy People 2020 goal (80%). This qualitative study identifies Hispanic mothers' salient behavioral, normative and control beliefs regarding having their adolescent daughters complete the vaccine series. Thirty-two mothers of girls (aged 11-17) that had received at least one dose of the HPV vaccine, completed in-depth interviews. Six girls had received one dose of the HPV vaccine, 10 girls had received two doses, and 16 girls had received all three doses. The questions elicited salient: (i) experiential and instrumental attitudes (behavioral beliefs); (ii) supporters and non-supporters (normative beliefs) and (iii) facilitators and barriers (control beliefs). Directed content analysis was employed to select the most salient beliefs. Mothers: (i) expressed salient positive feelings (e.g. good, secure, happy and satisfied); (ii) believed that completing the series resulted in positive effects (e.g. protection, prevention); (iii) believed that the main supporters were themselves, their daughter's father and doctor with some of their friends not supporting series completion and (iv) believed that vaccine affordability, information, transportation, ease of scheduling and keeping vaccination appointments and taking their daughter's immunization card to appointments were facilitators. This study represents the first step in building theory-based framework of vaccine series completion for this population. The beliefs identified provide guidance for health care providers and intervention developers. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Hepatitis A/B vaccine completion among homeless adults with history of incarceration.

    PubMed

    Nyamathi, Adeline M; Marlow, Elizabeth; Branson, Catherine; Marfisee, Mary; Nandy, Karabi

    2012-03-01

    Hepatitis B virus (HBV) vaccination rates for incarcerated adults remain low despite their high risk for infection. This study determined predictors of vaccine completion in homeless adults (N = 297) who reported histories of incarceration and who participated in one of three nurse-led hepatitis programs of different intensity. Moreover, time since release from incarceration was also considered. Just over half of the former prisoners completed the vaccine series. Older age (≥40), having a partner, and chronic homelessness were associated with vaccine completion. Recent research has documented the difficulty in providing vaccine services to younger homeless persons and homeless males at risk for HBV. Additional strategies are needed to achieve HBV vaccination completion rates greater than 50% for formerly incarcerated homeless men. © 2012 International Association of Forensic Nurses.

  10. Human Mars Lander Design for NASA's Evolvable Mars Campaign

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Chapman, Jack; Sutherlin, Steve; Taylor, Brian; Fabisinski, Leo; Collins, Tim; Cianciolo Dwyer, Alicia; Samareh, Jamshid; Robertson, Ed; Studak, Bill

    2016-01-01

    Landing humans on Mars will require entry, descent, and landing capability beyond the current state of the art. Nearly twenty times more delivered payload and an order of magnitude improvement in precision landing capability will be necessary. To better assess entry, descent, and landing technology options and sensitivities to future human mission design variations, a series of design studies on human-class Mars landers has been initiated. This paper describes the results of the first design study in the series of studies to be completed in 2016 and includes configuration, trajectory and subsystem design details for a lander with Hypersonic Inflatable Aerodynamic Decelerator (HIAD) entry technology. Future design activities in this series will focus on other entry technology options.

  11. A summary of measured hydraulic data for the series of steady and unsteady flow experiments over patterned roughness

    USGS Publications Warehouse

    Collins, Dannie L.; Flynn, Kathleen M.

    1979-01-01

    This report summarizes and makes available to other investigators the measured hydraulic data collected during a series of experiments designed to study the effect of patterned bed roughness on steady and unsteady open-channel flow. The patterned effect of the roughness was obtained by clear-cut mowing of designated areas of an otherwise fairly dense coverage of coastal Bermuda grass approximately 250 mm high. All experiments were conducted in the Flood Plain Simulation Facility during the period of October 7 through December 12, 1974. Data from 18 steady flow experiments and 10 unsteady flow experiments are summarized. Measured data included are ground-surface elevations, grass heights and densities, water-surface elevations and point velocities for all experiments. Additional tables of water-surface elevations and measured point velocities are included for the clear-cut areas for most experiments. One complete set of average water-surface elevations and one complete set of measured point velocities are tabulated for each steady flow experiment. Time series data, on a 2-minute time interval, are tabulated for both water-surface elevations and point velocities for each unsteady flow experiment. All data collected, including individual records of water-surface elevations for the steady flow experiments, have been stored on computer disk storage and can be retrieved using the computer programs listed in the attachment to this report. (Kosco-USGS)

  12. Activation of Central Pattern Generator for Respiration Following Complete High Cervical Spinal Cord Interruption

    DTIC Science & Technology

    2017-09-01

    PRINCIPAL INVESTIGATOR: Vitaliy Marchenko, MD, PhD. CONTRACTING ORGANIZATION: Drexel University, Philadelphia, PA 19104-2875. The views expressed are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision unless so designated by other…

  13. Quantifying the behavior of price dynamics at opening time in stock market

    NASA Astrophysics Data System (ADS)

    Ochiai, Tomoshiro; Takada, Hideyuki; Nacher, Jose C.

    2014-11-01

    The availability of huge volumes of financial data has offered the possibility for understanding the markets as a complex system characterized by several stylized facts. Here we first show that the time evolution of Japan’s Nikkei stock average index (Nikkei 225) futures follows the resistance and breaking-acceleration effects when the complete time series data is analyzed. However, in stock markets there are periods where no regular trades occur between the close of the market on one day and the next day’s open. To examine these time gaps we decompose the time series data into opening time and intermediate time. Our analysis indicates that for the intermediate time, both the resistance and the breaking-acceleration effects are still observed. However, for the opening time there are almost no resistance and breaking-acceleration effects, and volatility is always constantly high. These findings highlight unique dynamic differences between stock markets and the forex market and suggest that current risk management strategies may need to be revised to address the absence of these dynamic effects at the opening time.

  14. Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks

    PubMed Central

    Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav

    2017-01-01

    Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most of the current solutions are either theoretical or simulation-based where the problems are tackled using random field theory, computational geometry or computer simulations, limiting their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach where co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatiotemporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
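
    A small sketch of the pairwise co-integration check that underlies this kind of node selection, using the Engle-Granger test from statsmodels on synthetic temperature series; the data and the 0.05 threshold are assumptions, and the paper's full selection procedure is not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(8)

# Synthetic temperatures: a shared random-walk "site" signal observed at three nodes
n = 2000
site = np.cumsum(rng.normal(0, 0.1, n)) + 20.0
node_ref = site + rng.normal(0, 0.3, n)                    # reference node
node_a = 0.9 * site + 2.0 + rng.normal(0, 0.3, n)          # co-integrated with the reference
node_b = np.cumsum(rng.normal(0, 0.1, n)) + 20.0           # independent walk: not co-integrated

for name, series in [("node_a", node_a), ("node_b", node_b)]:
    t_stat, p_value, _ = coint(node_ref, series)           # Engle-Granger two-step test
    redundant = p_value < 0.05
    print(f"{name}: Engle-Granger p = {p_value:.3f} -> "
          f"{'co-integrated (candidate for removal)' if redundant else 'keep this node'}")
```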

  15. Burnt area mapping from ERS-SAR time series using the principal components transformation

    NASA Astrophysics Data System (ADS)

    Gimeno, Meritxell; San-Miguel Ayanz, Jesus; Barbosa, Paulo M.; Schmuck, Guido

    2003-03-01

    Each year thousands of hectares of forest burn across Southern Europe. To date, remote sensing assessments of this phenomenon have focused on the use of optical satellite imagery. However, the presence of clouds and smoke prevents the acquisition of this type of data in some areas. It is possible to overcome this problem by using synthetic aperture radar (SAR) data. Principal component analysis (PCA) was performed to quantify differences between pre- and post-fire images and to investigate the separability over a European Remote Sensing (ERS) SAR time series. Moreover, the transformation was carried out to determine the best conditions for acquiring optimal SAR imagery according to meteorological parameters and the procedures to enhance burnt area discrimination for fire damage assessment. A comparative neural network classification was performed in order to map and assess the burnt areas using either a complete ERS time series or just one image before and one image after the fire, according to the PCA. The results suggest that ERS is suitable for highlighting areas of localized changes associated with forest fire damage in Mediterranean landcover.
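
    A compact sketch of PCA over a stack of co-registered SAR acquisitions (pixels as observations, dates as variables) using a plain SVD; the synthetic scene and the post-fire backscatter drop are assumptions standing in for the ERS time series, and the neural network classification step is not shown.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic stack: 6 co-registered SAR dates over a 100x100 scene, burnt patch changes after date 3
n_dates, h, w = 6, 100, 100
stack = rng.normal(-8.0, 1.0, (n_dates, h, w))             # backscatter in dB (illustrative)
burnt = np.zeros((h, w), dtype=bool)
burnt[40:70, 40:70] = True
stack[3:, burnt] -= 3.0                                     # post-fire backscatter drop

# PCA over the time dimension: rows = pixels, columns = dates
X = stack.reshape(n_dates, -1).T                            # shape (n_pixels, n_dates)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                                          # principal-component images

var_explained = S ** 2 / np.sum(S ** 2)
pc_images = scores.T.reshape(n_dates, h, w)
print("variance explained by PC1, PC2:", np.round(var_explained[:2], 3))
print("mean PC1 value inside vs outside the burnt patch:",
      round(pc_images[0][burnt].mean(), 2), "vs", round(pc_images[0][~burnt].mean(), 2))
```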

  16. Measuring predictability in ultrasonic signals: an application to scattering material characterization.

    PubMed

    Carrión, Alicia; Miralles, Ramón; Lara, Guillermo

    2014-09-01

    In this paper, we present a novel and completely different approach to the problem of scattering material characterization: measuring the degree of predictability of the time series. Measuring predictability can provide information of the signal strength of the deterministic component of the time series in relation to the whole time series acquired. This relationship can provide information about coherent reflections in material grains with respect to the rest of incoherent noises that typically appear in non-destructive testing using ultrasonics. This is a non-parametric technique commonly used in chaos theory that does not require making any kind of assumptions about attenuation profiles. In highly scattering media (low SNR), it has been shown theoretically that the degree of predictability allows material characterization. The experimental results obtained in this work with 32 cement probes of 4 different porosities demonstrate the ability of this technique to do classification. It has also been shown that, in this particular application, the measurement of predictability can be used as an indicator of the percentages of porosity of the test samples with great accuracy. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Effects of Phone and Text Message Reminders on Completion of the Human Papillomavirus Vaccine Series.

    PubMed

    Rand, Cynthia M; Vincelli, Phyllis; Goldstein, Nicolas P N; Blumkin, Aaron; Szilagyi, Peter G

    2017-01-01

    To assess the effect of phone or text message reminders to parents of adolescents on human papillomavirus (HPV) vaccine series completion in Rochester, NY. We performed parallel randomized controlled trials of phone and text reminders for HPV vaccine for parents of 11- to 17-year olds in three urban primary care clinics. The main outcome measures were time to receipt of the third dose of HPV vaccine and HPV vaccination rates. We enrolled 178 phone intervention (180 control) and 191 text intervention (200 control) participants. In multivariate survival analysis controlling for gender, age, practice, insurance, race, and ethnicity, the time from enrollment to receipt of the third HPV dose for those receiving a phone reminder compared with controls was not significant overall (hazard ratio [HR] = 1.30, p = .12) but was for those enrolling at dose 1 (HR = 1.91, p = .007). There was a significant difference in those receiving a text reminder compared with controls (HR = 2.34, p < .0001; an average of 71 days earlier). At the end of the study, 48% of phone intervention versus 40% of phone control (p = .34), and 49% of text intervention versus 30% of text control (p = .001) adolescents had received 3 HPV vaccine doses. In this urban population of parents of adolescents, text message reminders for HPV vaccine completion for those who had already started the series were effective, whereas phone message reminders were only effective for those enrolled at dose 1. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  18. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
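
    For readers who want to cross-check the exact results numerically, the sketch below simulates the null distribution of short distances between successes placed completely at random within a fixed number of trials; it is a Monte-Carlo stand-in for, not a reimplementation of, the exact distribution in the fixedTimeEvents package, and all parameter values are illustrative.

      # Monte-Carlo sketch of the null distribution of distances between
      # successes in a fixed-length run of Bernoulli trials (complete randomness).
      import numpy as np

      def simulate_short_gap_counts(n_trials, n_successes, max_gap,
                                    n_sim=10000, seed=0):
          """For each simulation, place the successes at random positions and
          count how many consecutive pairs lie within max_gap of each other."""
          rng = np.random.default_rng(seed)
          counts = np.empty(n_sim, dtype=int)
          for k in range(n_sim):
              pos = np.sort(rng.choice(n_trials, size=n_successes, replace=False))
              counts[k] = np.sum(np.diff(pos) <= max_gap)
          return counts

      # Approximate p-value for seeing 5 or more short distances among 20 events
      # (e.g. genes) spread over 1000 positions.
      null = simulate_short_gap_counts(n_trials=1000, n_successes=20, max_gap=10)
      p_value = np.mean(null >= 5)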

  19. Spatial and temporal scales of time-averaged 700 mb height anomalies

    NASA Technical Reports Server (NTRS)

    Gutzler, D.

    1981-01-01

    The monthly and seasonal forecasting technique is based to a large extent on the extrapolation of trends in the positions of the centers of time-averaged geopotential height anomalies. The complete forecasted height pattern is subsequently drawn around the forecasted anomaly centers. The efficacy of this technique was tested, and time series of observed monthly mean and 5-day mean 700 mb geopotential heights were examined. Autocorrelation statistics are generated to document the tendency for persistence of anomalies. These statistics are compared to a red noise hypothesis to check for evidence of possible preferred time scales of persistence. Space-time spectral analyses at middle latitudes are checked for evidence of periodicities which could be associated with predictable month-to-month trends. A local measure of the average spatial scale of anomalies is devised for guidance in the completion of the anomaly pattern around the forecasted centers.

  20. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as inputs to the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
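
    The time-series half of that pipeline can be sketched with standard tooling; the fragment below fits an ARMA model to a synthetic response-time history and turns the forecast into rough firing-rate inputs. It is only an illustration of that step, with all data and parameter values assumed.

      # Sketch: fit ARMA to historical response times and forecast the next
      # values; the forecasts would feed the NMSPN reliability model.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(2)
      n = 300
      response_ms = 120 + 0.1 * np.cumsum(rng.normal(0, 1, n)) \
                    + rng.normal(0, 5, n)              # synthetic response times (ms)

      fit = ARIMA(response_ms, order=(1, 0, 1)).fit()  # ARMA(1,1), no differencing
      forecast_ms = fit.forecast(steps=10)             # predicted response times
      firing_rates = 1.0 / forecast_ms                 # crude rate estimate (1/ms)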

  1. Two-Dimensional Numerical Model of coupled Heat and Moisture Transport in Frost Heaving Soils.

    DTIC Science & Technology

    1982-08-01

    The integrated relations yield an exact solution in the form of the well-known series expansion, giving the complete mass balance formulation when integrated spatially and temporally. The diffusivity model can be approximately linearized by using values of diffusivity assumed constant for small intervals of space and time, by a series expansion.

  2. Outcomes of popliteal vascular injuries at Sri Lankan war-front military hospital: case series of 44 cases.

    PubMed

    Ratnayake, Amila; Samarasinghe, Bandula; Bala, Miklosh

    2014-05-01

    Traumatic injury to the popliteal vascular zone remains a challenging problem on the modern battlefield and is frequently associated with more complications than other vascular injuries. Limb salvage and morbidity (graft infection, thrombosis and delayed haemorrhage) were studied. All popliteal vascular injuries admitted to the Military Base Hospital over an 8-month period were analyzed. Local limb evaluation included confirmation of the presence of ischaemia, extent of soft tissue damage, muscle viability after calf fasciotomy, and neurological injury. Ischaemic time was recorded from the time of injury to definitive revascularization. If there was a prior attempt at reconstruction, the amputation was considered delayed. For this series of 44 patients with popliteal vascular injury, the average time to presentation was 390 min and 46% of limbs were completely ischaemic. Thirty-nine patients (89%) had popliteal artery injuries. There were 24 (62%) complete popliteal artery transections and associated venous (69%) and osseous (46%) injuries. The preferred technique of repair was interposition venous graft (IPVG) (54%). Eleven immediate amputations were performed (28%). There were 13 wound infections (33%), 5 early graft thromboses (5 of 21 IPVG, 24%) and 2 anastomotic disruptions (2 of 21 IPVG, 9%), which resulted in 4 delayed amputations. Mortality was 5% (2 patients). In this case series of popliteal artery injury, early identification of limbs at risk, early four-compartment fasciotomy, temporary intra-luminal shunting, definitive repair of concomitant venous injuries and aggressive treatment of haemodynamic instability were shown to be beneficial in achieving a reasonable outcome in an austere environment with limited resources. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Development of web tools to disseminate space geodesy data-related products

    NASA Astrophysics Data System (ADS)

    Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien

    2015-04-01

    In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented on the web site of the International DORIS Service (IDS) a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to the collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). Work towards visualizing and comparing position time series from the four main space geodetic techniques, DORIS, GNSS, SLR and VLBI, is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).

  4. Effect of stirring on the safety of flammable liquid mixtures.

    PubMed

    Liaw, Horng-Jang; Gerbaud, Vincent; Chen, Chan-Cheng; Shu, Chi-Min

    2010-05-15

    Flash point is the most important variable employed to characterize the fire and explosion hazard of liquids. The models developed for predicting the flash point of partially miscible mixtures in the literature to date are all based on the assumption of liquid-liquid equilibrium. In real-world environments, however, the liquid-liquid equilibrium assumption does not always hold, such as in the collection or accumulation of waste solvents without stirring, where complete stirring for a period of time is usually needed to ensure that the liquid phases are in equilibrium. This study investigated the effect of stirring on the flash-point behavior of binary partially miscible mixtures. Two series of partially miscible binary mixtures were employed to elucidate the effect of stirring. The first series was aqueous-organic mixtures, including water+1-butanol, water+2-butanol, water+isobutanol, water+1-pentanol, and water+octane; the second series was mixtures of two flammable solvents, which included methanol+decane, methanol+2,2,4-trimethylpentane, and methanol+octane. Results reveal that for binary aqueous-organic solutions the flash-point values of unstirred mixtures were located between those of the completely stirred mixtures and those of the flammable component. Therefore, risk assessment could be based on the flash-point value of the flammable component. However, for the assurance of safety, it is suggested to completely stir such mixtures before handling to reduce the risk. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  5. Fading channel simulator

    DOEpatents

    Argo, Paul E.; Fitzgerald, T. Joseph

    1993-01-01

    Fading channel effects on a transmitted communication signal are simulated with both frequency and time variations using a channel scattering function to affect the transmitted signal. A conventional channel scattering function is converted to a series of channel realizations by multiplying the square root of the channel scattering function by a complex number of which the real and imaginary parts are each independent variables. The two-dimensional inverse FFT of this complex-valued channel realization yields a matrix of channel coefficients that provides a complete frequency-time description of the channel. The transmitted radio signal is segmented to provide a series of signal segments, and each segment is subjected to an FFT to generate a series of signal coefficient matrices. The channel coefficient matrices and signal coefficient matrices are then multiplied and subjected to an inverse FFT to output a signal representing the received, channel-affected radio signal. A variety of channel scattering functions can be used to characterize the response of a transmitter-receiver system to such atmospheric effects.
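
    A compact numerical sketch of that recipe, under assumed shapes and a made-up scattering function, is given below; it generates one channel realization from the scattering function via a complex Gaussian draw and a two-dimensional inverse FFT, and applies one time slice of it to a single FFT'd signal segment, which simplifies the full matrix-by-matrix procedure of the patent.

      # Sketch: scattering function -> random channel realization -> apply to
      # one signal segment. Shapes, parameters and the scattering function are
      # illustrative; the full method multiplies whole coefficient matrices.
      import numpy as np

      rng = np.random.default_rng(3)
      n_freq, n_time = 64, 64

      # Hypothetical scattering function: power over (delay, Doppler) bins.
      scat = np.outer(np.exp(-np.arange(n_freq) / 8.0),
                      np.exp(-((np.arange(n_time) - 32) / 6.0) ** 2))

      # Channel realization: sqrt(power) times a unit-variance complex Gaussian.
      realization = np.sqrt(scat) * (rng.normal(size=scat.shape)
                                     + 1j * rng.normal(size=scat.shape)) / np.sqrt(2)
      channel = np.fft.ifft2(realization)          # frequency-time coefficients

      # Apply one time slice of the channel to a single transmitted segment.
      segment = rng.normal(size=n_freq) + 1j * rng.normal(size=n_freq)
      received = np.fft.ifft(np.fft.fft(segment) * channel[:, 0])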

  6. Manipulation Capabilities with Simple Hands

    DTIC Science & Technology

    2010-01-01

    allowing it to interpret online kinesthetic data, addressing two objectives: • Grasp classification: Distinguish between successful and unsuccessful ... determining the grasp outcome before the grasping process is complete, by using the entire time series or kinesthetic signature of the grasping process. As ... the grasp proceeds and additional kinesthetic data accumulates, the confidence also increases. In some cases ...

  7. High Resolution Time Series Observations of Bio-Optical and Physical Variability in the Arabian Sea

    DTIC Science & Technology

    1998-09-30

    1995-October 20, 1995). Multi-variable moored systems (MVMS) were deployed by our group at 35 and 80 m. The MVMS utilizes a VMCM to measure currents ... similar to that of the UCSB MVMSs. WORK COMPLETED: Our MVMS interdisciplinary systems with sampling intervals of a few minutes were placed on a mooring

  8. Dropout Rates in the United States: 1989.

    ERIC Educational Resources Information Center

    Kaufman, Phillip; Frase, Mary J.

    This is the second annual report to Congress required by the Hawkins-Stafford Elementary and Secondary School Improvement Amendments of 1988 (P.L. 100-297). It presents data on high school dropout and retention rates for 1989 and time series data since 1968. It also examines high school completion and graduation rates. Two kinds of dropout rates…

  9. An Interdisciplinary Invitation: A Study of "Gender and Aesthetics: An Introduction"

    ERIC Educational Resources Information Center

    Morton, Charlene

    2006-01-01

    The new reader "Gender and Aesthetics: An Introduction" is part of a series "designed for students who have typically completed an introductory course in philosophy and are coming to feminist philosophy for the first time". Why should music educators adopt this feminist introduction to gender and aesthetics when they can readily turn to more…

  10. Culturally Diverse Cohorts: The Exploration of Learning in Context and Community

    ERIC Educational Resources Information Center

    Callaghan, Carolyn M.

    2012-01-01

    This dissertation explores the experiences of culturally diverse interactions and learning in adult cohorts. A cohort is defined as a group of students who enter a program of study together and complete a series of common learning experiences during a specified period of time (Saltiel & Russo, 2001). There is much research on the general use,…

  11. Shoulder arthroscopy simulator training improves shoulder arthroscopy performance in a cadaveric model.

    PubMed

    Henn, R Frank; Shah, Neel; Warner, Jon J P; Gomoll, Andreas H

    2013-06-01

    The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaveric model of shoulder arthroscopy. Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and 9 of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The 2 groups were compared by use of Student t tests, and change over time within groups was analyzed with paired t tests. There were no observed differences between the 2 groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (P < .05). Time to completion was significantly faster in the simulator group compared with controls at the final evaluation (P < .05). No difference was observed between the groups on the subjective scores at the final evaluation (P = .98). Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaveric model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. There may be a role for simulator training in shoulder arthroscopy education. Copyright © 2013 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  12. Shoulder Arthroscopy Simulator Training Improves Shoulder Arthroscopy Performance in a Cadaver Model

    PubMed Central

    Henn, R. Frank; Shah, Neel; Warner, Jon J.P.; Gomoll, Andreas H.

    2013-01-01

    Purpose The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaver model of shoulder arthroscopy. Methods Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and nine of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The two groups were compared with Student t-tests, and change over time within groups was analyzed with paired t-tests. Results There were no observed differences between the two groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (p<0.05). Time to completion was significantly faster in the simulator group compared to controls at final evaluation (p<0.05). No difference was observed between the groups on the subjective scores at final evaluation (p=0.98). Conclusions Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaver model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. Clinical Relevance There may be a role for simulator training in shoulder arthroscopy education. PMID:23591380

  13. A Bayesian CUSUM plot: Diagnosing quality of treatment.

    PubMed

    Rosthøj, Steen; Jacobsen, Rikke-Line

    2017-12-01

    To present a CUSUM plot based on Bayesian diagnostic reasoning displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert the LR to a weight by taking its logarithm to base 2, and add up the weights sequentially in a plot showing how many times the odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In an ongoing treatment trial, with 22 of 37 children still at risk for an event, QOT has been monitored with average survival curves as reference, with the odds so far favouring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
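
    A minimal sketch of the weight accumulation for the known-outcome case is given below; the "healthy" and "sick" failure probabilities and the outcome sequence are invented for illustration and are not values from the paper.

      # Sketch: convert each observed outcome into a likelihood ratio for good
      # versus poor QOT, take log base 2, and accumulate the weights.
      import numpy as np

      def bayesian_cusum(outcomes, p_fail_good=0.10, p_fail_poor=0.20):
          """outcomes: 1 = treatment failure, 0 = success, in treatment order.
          Returns the running number of doublings of the odds for good QOT."""
          outcomes = np.asarray(outcomes)
          lr = np.where(outcomes == 1,
                        p_fail_good / p_fail_poor,              # failure observed
                        (1 - p_fail_good) / (1 - p_fail_poor))  # success observed
          return np.cumsum(np.log2(lr))

      plot_values = bayesian_cusum([0, 0, 1, 0, 0, 0, 1, 0])    # toy case series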

  14. Real-time magnetic resonance-guided ablation of typical right atrial flutter using a combination of active catheter tracking and passive catheter visualization in man: initial results from a consecutive patient series.

    PubMed

    Hilbert, Sebastian; Sommer, Philipp; Gutberlet, Matthias; Gaspar, Thomas; Foldyna, Borek; Piorkowski, Christopher; Weiss, Steffen; Lloyd, Thomas; Schnackenburg, Bernhard; Krueger, Sascha; Fleiter, Christian; Paetsch, Ingo; Jahnke, Cosima; Hindricks, Gerhard; Grothoff, Matthias

    2016-04-01

    Recently, cardiac magnetic resonance (CMR) imaging has been found feasible for the visualization of the underlying substrate of cardiac arrhythmias as well as for the visualization of cardiac catheters during diagnostic and ablation procedures. Real-time CMR-guided cavotricuspid isthmus ablation was performed in a series of six patients using a combination of active catheter tracking and catheter visualization with real-time MR imaging. Cardiac magnetic resonance utilizing a 1.5 T system was performed in patients under deep propofol sedation. A three-dimensional whole-heart sequence with navigator technique and a fast automated segmentation algorithm was used for online segmentation of all cardiac chambers, which were thereafter displayed on a dedicated image guidance platform. In three out of six patients complete isthmus block could be achieved in the MR scanner; two of these patients did not need any additional fluoroscopy. In the first patient technical issues called for completion of the procedure in a conventional laboratory; in another two patients the isthmus was partially blocked by magnetic resonance imaging (MRI)-guided ablation. The mean procedural time for the MR procedure was 109 ± 58 min. The intubation of the coronary sinus (CS) was performed within a mean time of 2.75 ± 2.21 min. Total fluoroscopy time for completion of the isthmus block ranged from 0 to 7.5 min. The combination of active catheter tracking and passive real-time visualization in CMR-guided electrophysiologic (EP) studies using advanced interventional hardware and software was safe and enabled efficient navigation, mapping, and ablation. These cases demonstrate significant progress in the development of MR-guided EP procedures. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  15. 18 years of continuous observation of tritium and atmospheric precipitations in Ramnicu Valcea (Romania): A time series analysis.

    PubMed

    Duliu, Octavian G; Varlam, Carmen; Shnawaw, Muataz Dheyaa

    2018-05-16

    To get more information on the origin of tritium and to detect any possible anthropogenic sources, the precipitation level and tritium concentration were recorded monthly between January 1999 and December 2016 and investigated by the Cryogenic Institute of Ramnicu Valcea, Romania. Compared with similar data covering a radius of about 1200 km westward, the measurements gave similar results concerning the time evolution of tritium content and precipitation level for the entire time interval, except for the period between 2009 and 2011, when the tritium concentrations showed a slight increase, most probably due to the activity of a neighboring experimental pilot plant for tritium and deuterium separation. Regardless of this fact, all data pointed towards a steady tendency of tritium concentrations to decrease at an annual rate of about 1.4 ± 0.05%. The experimental data on precipitation levels and tritium concentrations form two complete time series whose analysis showed, at p < 0.01, the presence of a single one-year periodicity; the coincident maxima, which correspond to the late spring and early summer months, suggest the existence of the Spring Leak mechanism with a possible contribution from soil moisture remobilization during the warm period. Copyright © 2018 Elsevier Ltd. All rights reserved.
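
    One straightforward way to check for a single one-year periodicity in such a monthly series is a periodogram computed with the sampling frequency set to 12 samples per year, so that spectral peaks read directly in cycles per year; the sketch below does this on synthetic data with an assumed annual cycle and slow decline, not on the Ramnicu Valcea measurements.

      # Sketch: detrend a monthly series, then look for an annual spectral peak.
      import numpy as np
      from scipy.signal import periodogram

      rng = np.random.default_rng(4)
      months = np.arange(18 * 12)                            # 18 years, monthly
      series = (10 + 3 * np.sin(2 * np.pi * months / 12)     # annual cycle
                - 0.01 * months                              # slow decrease
                + rng.normal(0, 1, months.size))

      detrended = series - np.polyval(np.polyfit(months, series, 1), months)
      freq, power = periodogram(detrended, fs=12)            # fs in samples/year
      dominant_cycles_per_year = freq[np.argmax(power)]      # expected near 1.0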

  16. [The calming process of anger experience: time series changes of affects, cognitions, and behaviors].

    PubMed

    Hibino, Kei; Yukawa, Shintaro

    2004-02-01

    This study investigated the time series changes and relationships of affects, cognitions, and behaviors immediately, a few days, and a week after anger episodes. Two hundred undergraduates (96 men and 104 women) completed a questionnaire. The results were as follows. Anger was intensely aroused immediately after the anger episodes and calmed rapidly as time passed. Anger and depression correlated in each period, so depression accompanied the anger experiences. The results of covariance structure analysis showed that aggressive behavior was evoked only by affects (especially anger) immediately after the episode, and only by cognitions (especially inflating) a few days after the episode. One week after the episode, aggressive behavior decreased and was not influenced by affects or cognitions. Anger elicited all anger-expressive behaviors, such as aggressive behavior, social sharing, and object-displacement, while the depression accompanying anger episodes elicited only object-displacement.

  17. Time-series analysis of the barriers for admission into a spinal rehabilitation unit.

    PubMed

    New, P W; Akram, M

    2016-02-01

    This is a prospective open-cohort case series. The objective of this study was to assess changes over time in the duration of key acute hospital process barriers for patients with spinal cord damage (SCD) from admission until transfer into a spinal rehabilitation unit (SRU) or other destinations. The study was conducted in acute hospitals in Victoria, Australia (2006-2013). The duration of the following discrete sequential processes was measured: acute hospital admission until referral to SRU, referral until SRU assessment, SRU assessment until ready for SRU transfer, and ready for transfer until SRU admission. Time-series analysis was performed using a generalised additive model (GAM). Seasonality of non-traumatic spinal cord dysfunction (SCDys) was examined. GAM analysis shows that the waiting time for admission into the SRU was significantly (P<0.001) longer for patients who were female, had tetraplegia, were motor complete, had a pelvic pressure ulcer or were referred from another health network. Age had a non-linear effect on the duration of waiting for transfer from the acute hospital to the SRU and on both the acute hospital and SRU length of stay (LOS). The duration patients spent waiting for SRU admission increased over the study period. There was an increase in the number of referrals over the study period and an increase in the number of patients accepted but not admitted into the SRU. There was no notable seasonal influence on the referral of patients with SCDys. Time-series analysis provides additional insights into changes in the waiting times for SRU admission and the LOS in hospital for patients with SCD.

  18. VO-ESD: a virtual observatory approach to describe the geomagnetic field temporal variations with application to Swarm data

    NASA Astrophysics Data System (ADS)

    Saturnino, Diana; Langlais, Benoit; Amit, Hagay; Mandea, Mioara; Civet, François; Beucler, Éric

    2017-04-01

    A complete description of the main geomagnetic field temporal variation is crucial to understand dynamics in the core. This variation, termed secular variation (SV), is known with high accuracy at ground magnetic observatory locations. However, the description of its spatial variability is hampered by the globally uneven distribution of the observatories. For the past two decades a global coverage of the field changes has been provided by satellites. Their surveys of the geomagnetic field have been used to derive and improve global spherical harmonic (SH) models through strict data selection schemes designed to minimise external field contributions. But discrepancies remain between ground measurements and field predictions by these models. Indeed, the global models do not reproduce the small spatial scales of the field temporal variations. To overcome this problem we propose a modified Virtual Observatory (VO) approach by defining a globally homogeneous mesh of VOs at satellite altitude. With this approach we directly extract time series of the field and its temporal variation from satellite measurements, as is done at observatory locations. As satellite measurements are acquired at different altitudes, a correction for altitude is needed. Therefore, we apply an Equivalent Source Dipole (ESD) technique for each VO and each given time interval to reduce all measurements to a unique location, leading to time series similar to those available at ground magnetic observatories. Synthetic data are first used to validate the new VO-ESD approach. Then, we apply our scheme to measurements from the Swarm mission. For the first time, a global mesh of VO time series with 2.5 degree resolution is built. The VO-ESD derived time series are locally compared to ground observations as well as to satellite-based model predictions. The approach is able to describe detailed temporal variations of the field at local scales. The VO-ESD time series are also used to derive global SH models. Without regularization these models describe well the secular trend of the magnetic field. The derivation of longer VO-ESD time series, as more data become available, will allow the study of features of the field's temporal variations, such as geomagnetic jerks.

  19. Orthodontic bracket slot dimensions as measured from entire bracket series.

    PubMed

    Brown, Paul; Wagner, Warren; Choi, Hyden

    2015-07-01

    To measure the slot dimensions of an entire series of metal orthodontic brackets. Ten bracket series approximating five complete sets of brackets each were imaged and measured. Descriptive statistics were generated. Slot dimensions varied significantly from series to series as well as within the series themselves. About one-third of the brackets would not accommodate a full-size wire, and 15% to 20% were at least 0.001 inches larger than the nominal advertised size. The clinician is unlikely to have on hand complete sets (upper and lower 5-5) of ideal brackets and should both expect and be able to accommodate tooth movement through wire bending in three planes of space to overcome any bracket deficiencies.

  20. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process.

    PubMed

    Moran, John L; Solomon, Patricia J

    2013-05-24

    Statistical process control (SPC), an initiative from the industrial sphere, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual intensive care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for (i) series autocorrelation and seasonality was demonstrated using (partial) autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Time-series methods were applied to an exemplar complete ICU series (1995 to end-2009) via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((generalised) autoregressive conditional heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247), respectively. For the raw mortality series, 71 sites had continuous data for assessment up to or beyond lag 40 and 35% had autocorrelation through to lag 40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues.
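
    As a simplified illustration of the EWMA control chart used for signalling, the sketch below computes the EWMA statistic and steady-state 3-sigma limits for an unadjusted monthly mortality proportion; the risk adjustment via the random-coefficient logistic model is not reproduced, and the smoothing constant, expected level and synthetic data are assumptions.

      # Sketch: EWMA statistic and steady-state control limits for a monthly
      # mortality proportion (no risk adjustment; illustrative parameters).
      import numpy as np

      def ewma_chart(p_observed, p_expected, lam=0.2, n_sigma=3):
          z = np.empty(len(p_observed))
          z_prev = p_expected                       # start the EWMA at the target
          for t, p in enumerate(p_observed):
              z_prev = lam * p + (1 - lam) * z_prev
              z[t] = z_prev
          sigma = np.std(p_observed, ddof=1)
          half_width = n_sigma * sigma * np.sqrt(lam / (2 - lam))
          return z, p_expected - half_width, p_expected + half_width

      rng = np.random.default_rng(5)
      monthly_mortality = np.clip(rng.normal(0.14, 0.02, 120), 0, 1)
      ewma, lcl, ucl = ewma_chart(monthly_mortality, p_expected=0.14)
      signal_months = np.where((ewma > ucl) | (ewma < lcl))[0]   # out of control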

  1. Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.

    PubMed

    Monserud, R A; Marshall, J D

    2001-09-01

    Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). The effect of converting from ring cellulose to whole-leaf tissue did not affect the analysis because it was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. We were unable to distinguish between individuals with and without significant autocorrelation beforehand. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive (AR(1)) model was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption, and can be used to make hypothesis tests.
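
    The pre-processing pipeline described, removing a linear or quadratic trend and then checking the residuals for autocorrelation before choosing an ARMA form, can be sketched with standard tooling; the series below is synthetic AR(1) noise on a linear trend, not tree-ring data, and the lag choices are illustrative.

      # Sketch: detrend an isotope-like series, test residual autocorrelation,
      # then fit a simple AR(1) model to the residuals.
      import numpy as np
      from statsmodels.stats.diagnostic import acorr_ljungbox
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(6)
      years = np.arange(73)                       # ~73 rings, like the mean tree age
      noise = np.zeros(years.size)
      for i in range(1, years.size):              # AR(1) noise with phi = 0.5
          noise[i] = 0.5 * noise[i - 1] + rng.normal(0, 0.3)
      delta13c = -26.0 + 0.02 * years + noise     # linear trend + autocorrelation

      residuals = delta13c - np.polyval(np.polyfit(years, delta13c, 1), years)
      ljung_box = acorr_ljungbox(residuals, lags=[1, 2])   # autocorrelation test
      ar1_fit = ARIMA(residuals, order=(1, 0, 0)).fit()    # AR(1) on the residuals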

  2. Use of Entertainment Elements in an Online Video Mini-Series to Train Pharmacy Preceptors.

    PubMed

    Cox, Craig D; Cheon, Jongpil; Crooks, Steven M; Lee, Jaehoon; Curtis, Jacob D

    2017-02-25

    Objective. To create an entertaining approach to training pharmacy preceptors. Design. A training program was developed to provide an innovative, entertaining, and flexible continuing education program for pharmacy preceptors. Three instructional design principles - providing an authentic context, offering a diversity of content, and engaging and maintaining attention - were foundational to this concept. The mini-series consisted of 12 online video episodes. Participants completed three reflective questions and one evaluation after watching each episode. Three months following completion of the training, a survey was distributed to analyze the long-term impact of the mini-series on precepting skills. Assessment. Two hundred two participants completed all 12 episodes. After completing the training series, the participants' confidence level in their knowledge pertaining to the objectives was significantly greater than before they started. Among the 32% of participants who responded to the three-month follow-up survey, the mean score for precepting confidence was 6.8 on a scale of 1 to 10 on which 1=no increase to 10=big increase. Also, 99% of participants indicated they would complete a similar training program and recommend to others. Conclusions. Feedback from the mini-series provides evidence of the effectiveness of its delivery format and use as a preceptor learning tool.

  3. Use of Entertainment Elements in an Online Video Mini-Series to Train Pharmacy Preceptors

    PubMed Central

    Cheon, Jongpil; Crooks, Steven M.; Lee, Jaehoon; Curtis, Jacob D.

    2017-01-01

    Objective. To create an entertaining approach to training pharmacy preceptors. Design. A training program was developed to provide an innovative, entertaining, and flexible continuing education program for pharmacy preceptors. Three instructional design principles – providing an authentic context, offering a diversity of content, and engaging and maintaining attention – were foundational to this concept. The mini-series consisted of 12 online video episodes. Participants completed three reflective questions and one evaluation after watching each episode. Three months following completion of the training, a survey was distributed to analyze the long-term impact of the mini-series on precepting skills. Assessment. Two hundred two participants completed all 12 episodes. After completing the training series, the participants' confidence level in their knowledge pertaining to the objectives was significantly greater than before they started. Among the 32% of participants who responded to the three-month follow-up survey, the mean score for precepting confidence was 6.8 on a scale of 1 to 10 on which 1=no increase to 10=big increase. Also, 99% of participants indicated they would complete a similar training program and recommend to others. Conclusions. Feedback from the mini-series provides evidence of the effectiveness of its delivery format and use as a preceptor learning tool. PMID:28289302

  4. Uptake and timeliness of rotavirus vaccination in Norway: The first year post-introduction.

    PubMed

    Valcarcel Salamanca, Beatriz; Hagerup-Jenssen, Maria Elisabeth; Flem, Elmira

    2016-09-07

    To minimise vaccine-associated risk of intussusception following rotavirus vaccination, Norway adopted very strict age limits for initiating and completing the vaccine series at the time rotavirus vaccination was included in the national immunisation programme, October 2014. Although Norway has a high coverage for routine childhood vaccines, these stringent age limits could negatively affect rotavirus coverage. We documented the status and impact of rotavirus vaccination on other infant vaccines during the first year after its introduction. We used individual vaccination data from the national immunisation register to calculate coverage for rotavirus and other vaccines and examine adherence with the recommended schedules. We identified factors associated with completing the full rotavirus series by performing multiple logistic regression analyses. We also evaluated potential changes in uptake and timeliness of other routine vaccines after the introduction of rotavirus vaccine using the Kaplan-Meier method. The national coverage for rotavirus vaccine achieved a year after the introduction was 89% for one dose and 82% for two doses, respectively. Among fully rotavirus-vaccinated children, 98% received both doses within the upper age limit and 90% received both doses according to the recommended schedule. The child's age at the initiation of rotavirus series and being vaccinated with diphtheria, tetanus, pertussis, polio and Haemophilus influenzae type b (DTaP/IPV/Hib) and pneumococcal vaccines were the strongest predictors of completing the full rotavirus series. No major changes in uptake and timeliness of other paediatric vaccines were observed after introduction of rotavirus vaccine. Norway achieved a high national coverage and excellent adherence with the strict age limits for rotavirus vaccine administration during the first year of introduction, indicating robustness of the national immunisation programme. Rotavirus vaccination did not impact coverage or timeliness of other infant vaccines. Copyright © 2016. Published by Elsevier Ltd.

  5. [Book review] Life histories of North American cardinals, grosbeaks, buntings, towhees, finches, sparrows, and allies

    USGS Publications Warehouse

    Banks, R.C.

    1969-01-01

    The completion of an ornithological series as important as the Bent Life Histories is an exciting event. Here is a series of 21 volumes, spanning a history of nearly 60 years from inception to completion, containing over 9,500 text pages of information about North American birds, largely the work of one man – who was not professionally an ornithologist. One cannot well review the final number of such a series without considering the series as a whole and that volume relative to the rest of the series, when the authorship of the last is different and varied.

  6. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated the processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event was associated positively with higher strain on the same day and negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that the effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. ((c) 2003 APA, all rights reserved)

  7. Inferring the interplay between network structure and market effects in Bitcoin

    NASA Astrophysics Data System (ADS)

    Kondor, Dániel; Csabai, István; Szüle, János; Pósfai, Márton; Vattay, Gábor

    2014-12-01

    A main focus in economics research is understanding the time series of prices of goods and assets. While statistical models using only the properties of the time series itself have been successful in many aspects, we expect to gain a better understanding of the phenomena involved if we can model the underlying system of interacting agents. In this article, we consider the history of Bitcoin, a novel digital currency system, for which the complete list of transactions is available for analysis. Using this dataset, we reconstruct the transaction network between users and analyze changes in the structure of the subgraph induced by the most active users. Our approach is based on the unsupervised identification of important features of the time variation of the network. Applying the widely used method of Principal Component Analysis to the matrix constructed from snapshots of the network at different times, we are able to show how structural changes in the network accompany significant changes in the exchange price of bitcoins.
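
    The unsupervised step described, applying PCA to a matrix built from network snapshots at different times, can be sketched generically; the snapshot features and their number below are placeholders rather than the measures used in the article.

      # Sketch: rows = time snapshots of the network, columns = structural
      # features of the active-user subgraph; PCA extracts the dominant modes
      # of temporal variation. Features and data are synthetic stand-ins.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      n_snapshots, n_features = 200, 12
      snapshots = rng.normal(size=(n_snapshots, n_features))
      snapshots[:, 0] += np.linspace(0, 5, n_snapshots)   # one drifting metric

      pca = PCA(n_components=3)
      modes = pca.fit_transform(StandardScaler().fit_transform(snapshots))
      explained = pca.explained_variance_ratio_           # variance captured per mode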

  8. Pi2 detection using Empirical Mode Decomposition (EMD)

    NASA Astrophysics Data System (ADS)

    Mieth, Johannes Z. D.; Frühauff, Dennis; Glassmeier, Karl-Heinz

    2017-04-01

    Empirical Mode Decomposition (EMD) has been used as an alternative to wavelet transformation to identify onset times of Pi2 pulsations in data sets of the Scandinavian Magnetometer Array (SMA). Pi2 pulsations are magnetohydrodynamic waves occurring during magnetospheric substorms. They are almost always observed at substorm onset at mid to low latitudes on Earth's nightside and are fed by the magnetic energy release caused by dipolarization processes. Their periods lie between 40 and 150 seconds. Usually, Pi2 are detected using wavelet transformation. Here, EMD is presented as an alternative approach to the traditional procedure. EMD is a relatively recent signal decomposition method designed for nonlinear and non-stationary time series. It provides an adaptive, data-driven, and complete decomposition of time series into slow and fast oscillations. An optimized version using Monte-Carlo-type noise assistance is used here. By displaying the results in time-frequency space, a characteristic frequency modulation is observed. This frequency modulation can be correlated with the onset of Pi2 pulsations. A basic algorithm to find the onset is presented. Finally, the results are compared to classical wavelet-based analysis. The use of different SMA stations furthermore allows the spatial analysis of Pi2 onset times. EMD mostly finds application in the fields of engineering and medicine; this work demonstrates the applicability of the method to geomagnetic time series.
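
    A minimal sketch of the decomposition step is given below, assuming the third-party PyEMD package (distributed on PyPI as EMD-signal); the synthetic signal, with an 80-second burst on a slow trend, only imitates the kind of Pi2-bearing magnetometer record described.

      # Sketch: decompose a synthetic Pi2-like signal into intrinsic mode
      # functions with EMD; the band-limited IMF would be inspected for onset.
      import numpy as np
      from PyEMD import EMD            # assumes the EMD-signal package is installed

      rng = np.random.default_rng(8)
      t = np.arange(0, 3600, 1.0)                       # one hour, 1 s sampling
      signal = 0.002 * t + rng.normal(0, 0.2, t.size)   # slow trend plus noise
      burst = (t > 1800) & (t < 2400)
      signal[burst] += np.sin(2 * np.pi * t[burst] / 80.0)   # ~80 s Pi2-like wave

      imfs = EMD().emd(signal)         # intrinsic mode functions, fast to slow
      # The IMF whose dominant period falls in the 40-150 s band would be
      # examined around its amplitude increase to estimate the Pi2 onset time.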

  9. Evidence for a fundamental and pervasive shift away from nature-based recreation

    PubMed Central

    Pergams, Oliver R. W.; Zaradic, Patricia A.

    2008-01-01

    After 50 years of steady increase, per capita visits to U.S. National Parks have declined since 1987. To evaluate whether we are seeing a fundamental shift away from people's interest in nature, we tested for similar longitudinal declines in 16 time series representing four classes of nature participation variables: (i) visitation to various types of public lands in the U.S. and National Parks in Japan and Spain, (ii) number of various types of U.S. game licenses issued, (iii) indicators of time spent camping, and (iv) indicators of time spent backpacking or hiking. The four variables with the greatest per capita participation were visits to Japanese National Parks, U.S. State Parks, U.S. National Parks, and U.S. National Forests, with an average individual participating 0.74–2.75 times per year. All four time series are in downtrends, with linear regressions showing ongoing losses of −1.0% to −3.1% per year. The longest and most complete time series tested suggest that typical declines in per capita nature recreation began between 1981 and 1991, are proceeding at rates of −1.0% to −1.3% per year, and total to date −18% to −25%. Spearman correlation analyses were performed on untransformed time series and on transformed percentage year-to-year changes. Results showed very highly significant correlations between many of the highest per capita participation variables in both untransformed and in difference models, further corroborating the general downtrend in nature recreation. In conclusion, all major lines of evidence point to an ongoing and fundamental shift away from nature-based recreation. PMID:18250312

  10. Assessment of Program Impact Through First Grade, Volume V: Impact on Children. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Berrueta-Clement, John; And Others

    Fifth in a series of six volumes reporting outcomes of the preliminary evaluation of an educational intervention, this report presents the findings of the effects of Project Developmental Continuity (PDC) up to the time the evaluation study's cohort of children completed grade 1. Preliminary findings concerning the relationship between variables…

  11. Question Asking and the Teaching of Writing.

    ERIC Educational Resources Information Center

    Odell, Lee

    This paper argues that provocative questions can be used by teachers to help students write. Described are a series of questions drawn from rhetorical theory, such as "How many times can I change focus so as to get the most complete understanding of a topic?" "When does X occur?" "Why does X occur?" and "What does X cause or prompt?" Several…

  12. Argument-Driven Inquiry as a Way to Help Undergraduate Students Write to Learn by Learning to Write in Chemistry

    ERIC Educational Resources Information Center

    Sampson, Victor; Walker, Joi Phelps

    2012-01-01

    This exploratory study examined how undergraduate students' ability to write in science changed over time as they completed a series of laboratory activities designed using a new instructional model called argument-driven inquiry. The study was conducted in a single section of an undergraduate general chemistry lab course offered at a large…

  13. Just-in-Time Research: A Call to Arms for Research into Mobile Technologies in Higher Education

    ERIC Educational Resources Information Center

    Byrne-Davis, Lucie; Dexter, Hilary; Hart, Jo; Cappelli, Tim; Byrne, Ged; Sampson, Ian; Mooney, Jane; Lumsden, Colin

    2015-01-01

    Mobile technologies are becoming commonplace in society and in education. In higher education, it is crucial to understand the impact of constant access to information on the development of the knowledge and competence of the learner. This study reports on a series of four surveys completed by UK-based medical students (n = 443) who received…

  14. How the Experience of Assessed Collaborative Writing Impacts on Undergraduate Students' Perceptions of Assessed Group Work

    ERIC Educational Resources Information Center

    Scotland, James

    2016-01-01

    A time-series analysis was used to investigate Arabic undergraduate students' (n = 50) perceptions of assessed group work in a major government institution of higher education in Qatar. A longitudinal mixed methods approach was employed. Likert scale questionnaires were completed over the duration of a collaborative writing event. Additionally,…

  15. Hausdorff clustering

    NASA Astrophysics Data System (ADS)

    Basalto, Nicolas; Bellotti, Roberto; de Carlo, Francesco; Facchi, Paolo; Pantaleo, Ester; Pascazio, Saverio

    2008-10-01

    A clustering algorithm based on the Hausdorff distance is analyzed and compared to the single, complete, and average linkage algorithms. The four clustering procedures are applied to a toy example and to time series of financial data. The dendrograms are scrutinized and their features compared. The Hausdorff linkage relies on firm mathematical grounds and turns out to be very effective when one has to discriminate among complex structures.
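
    The general idea, treating each object as a set of points and clustering on Hausdorff distances between the sets, can be sketched as below; this uses symmetrised Hausdorff distances with an ordinary average linkage for illustration, which is not identical to the Hausdorff linkage studied in the paper, and the point clouds are synthetic.

      # Sketch: hierarchical clustering of point clouds using pairwise
      # symmetrised Hausdorff distances (illustrative, not the paper's linkage).
      import numpy as np
      from scipy.spatial.distance import directed_hausdorff, squareform
      from scipy.cluster.hierarchy import linkage, fcluster

      def hausdorff(a, b):
          return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

      rng = np.random.default_rng(9)
      centers = [0.0, 0.0, 5.0, 5.0, 10.0]                 # two pairs plus a loner
      clouds = [rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in centers]

      n = len(clouds)
      dist = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              dist[i, j] = dist[j, i] = hausdorff(clouds[i], clouds[j])

      tree = linkage(squareform(dist), method='average')   # dendrogram structure
      labels = fcluster(tree, t=3, criterion='maxclust')   # expect three groups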

  16. Mapping Impervious Surface Expansion using Medium-resolution Satellite Image Time Series: A Case Study in the Yangtze River Delta, China

    NASA Technical Reports Server (NTRS)

    Gao, Feng; DeColstoun, Eric Brown; Ma, Ronghua; Weng, Qihao; Masek, Jeffrey G.; Chen, Jin; Pan, Yaozhong; Song, Conghe

    2012-01-01

    Cities have been expanding rapidly worldwide, especially over the past few decades. Mapping the dynamic expansion of impervious surface in both space and time is essential for an improved understanding of the urbanization process, land-cover and land-use change, and their impacts on the environment. Landsat and other medium-resolution satellites provide the necessary spatial details and temporal frequency for mapping impervious surface expansion over the past four decades. Since the US Geological Survey opened the historical record of the Landsat image archive for free access in 2008, the decades-old bottleneck of data limitation has gone. Remote-sensing scientists are now rich with data, and the challenge is how to make the best use of this precious resource. In this article, we develop an efficient algorithm to map the continuous expansion of impervious surface using a time series of four decades of medium-resolution satellite images. The algorithm is based on a supervised classification of the time-series image stack using a decision tree. Each impervious class represents urbanization starting in a different image. The algorithm also allows us to remove inconsistent training samples because impervious expansion is not reversible during the study period. The objective is to extract a time series of complete and consistent impervious surface maps from a corresponding time series of images collected from multiple sensors, and with a minimal amount of image preprocessing effort. The approach was tested in the lower Yangtze River Delta region, one of the fastest urban growth areas in China. Results from nearly four decades of medium-resolution satellite data from the Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic Mapper Plus (ETM+) and China-Brazil Earth Resources Satellite (CBERS) show an urbanization process that is consistent with economic development plans and policies. The time-series impervious spatial extent maps derived from this study agree well with an existing urban extent polygon data set that was previously developed independently. The overall mapping accuracy was estimated at about 92.5%, with 3% commission error and 12% omission error for the impervious type from all images regardless of image quality and initial spatial resolution.
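
    The core classification idea, treating each pixel's multi-date values as one feature vector and letting the class label encode the date at which the pixel became impervious, can be sketched as follows; the synthetic reflectance values, class coding and tree depth are illustrative assumptions, not the study's training data or settings.

      # Sketch: per-pixel time-series stack classified with a decision tree whose
      # classes encode the urbanization (impervious-onset) date; synthetic data.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(10)
      n_pixels, n_dates = 5000, 8
      stack = rng.normal(0.3, 0.05, size=(n_pixels, n_dates))   # vegetation-like
      labels = rng.integers(0, n_dates + 1, size=n_pixels)      # 0 = never impervious
      for p in range(n_pixels):
          if labels[p] > 0:                        # impervious from date labels[p]-1 on
              stack[p, labels[p] - 1:] += 0.25     # brighter, impervious-like signal

      clf = DecisionTreeClassifier(max_depth=8).fit(stack, labels)
      predicted_onset = clf.predict(stack)         # per-pixel urbanization date class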

  17. Machine learning methods as a tool to analyse incomplete or irregularly sampled radon time series data.

    PubMed

    Janik, M; Bossew, P; Kurihara, O

    2018-07-15

    Machine learning is a class of statistical techniques which has proven to be a powerful tool for modelling the behaviour of complex systems, in which response quantities depend on assumed controls or predictors in a complicated way. In this paper, as our first purpose, we propose the application of machine learning to reconstruct incomplete or irregularly sampled time series of indoor radon (222Rn). The physical assumption underlying the modelling is that Rn concentration in the air is controlled by environmental variables such as air temperature and pressure. The algorithms "learn" from complete sections of the multivariate series, derive a dependence model and apply it to sections where the controls are available but not the response (Rn), and in this way complete the Rn series. Three machine learning techniques are applied in this study, namely random forest, its extension called the gradient boosting machine, and deep learning. For comparison, we apply classical multiple regression in a generalized linear model version. Performance of the models is evaluated through different metrics. The performance of the gradient boosting machine is found to be superior to that of the other techniques. By applying learning machines, we show, as our second purpose, that missing data or periods of Rn series data can be reasonably reconstructed and resampled on a regular grid, if data on appropriate physical controls are available. The techniques also identify to which degree the assumed controls contribute to imputing missing Rn values. Our third purpose, though no less important from the viewpoint of physics, is identifying to which degree physical, in this case environmental, variables are relevant as Rn predictors, or in other words, which predictors explain most of the temporal variability of Rn. We show that the variables which contribute most to the Rn series reconstruction are temperature, relative humidity and day of the year. The first two are physical predictors, while "day of the year" is a statistical proxy or surrogate for missing or unknown predictors. Copyright © 2018 Elsevier B.V. All rights reserved.
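
    The imputation idea can be sketched in a few lines: learn the response from the environmental predictors on complete sections and predict it where only the predictors exist. Gradient boosting is used below because the paper reports it as the best performer, but the synthetic data, predictor set and missingness pattern are assumptions.

      # Sketch: fill gaps in a radon series from environmental predictors with
      # gradient boosting; all data below are synthetic stand-ins.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(11)
      n = 4000                                               # hourly records
      temperature = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / (24 * 365)) \
                    + rng.normal(0, 2, n)
      humidity = np.clip(60 + rng.normal(0, 10, n), 0, 100)
      day_of_year = (np.arange(n) // 24) % 365
      radon = 40 - 0.8 * temperature + 0.3 * humidity + rng.normal(0, 5, n)

      X = np.column_stack([temperature, humidity, day_of_year])
      missing = rng.random(n) < 0.2                          # 20% of radon missing

      model = GradientBoostingRegressor().fit(X[~missing], radon[~missing])
      radon_filled = radon.copy()
      radon_filled[missing] = model.predict(X[missing])      # reconstructed values
      importances = model.feature_importances_               # predictor relevance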

  18. Point-Process Models of Social Network Interactions: Parameter Estimation and Missing Data Recovery

    DTIC Science & Technology

    2014-08-01

    treating them as zero will have a de minimis impact on the results, but avoiding computing them (and computing with them) saves tremendous time. Set a... test the methods on simulated time series on artificial social networks, including some toy networks and some meant to resemble IkeNet. We conclude...the section by discussing the results in detail. In each of our tests we begin with a complete data set, whether it is real (IkeNet) or simulated. Then

  19. Quality Improvement to Immunization Coverage in Primary Care Measured in Medical Record and Population-Based Registry Data.

    PubMed

    Harder, Valerie S; Barry, Sara E; Ahrens, Bridget; Davis, Wendy S; Shaw, Judith S

    Despite the proven benefits of immunizations, coverage remains low in many states, including Vermont. This study measured the impact of a quality improvement (QI) project on immunization coverage in childhood, school-age, and adolescent groups. In 2013, a total of 20 primary care practices completed a 7-month QI project aimed to increase immunization coverage among early childhood (29-33 months), school-age (6 years), and adolescent (13 years) age groups. For this study, we examined random cross-sectional medical record reviews from 12 of the 20 practices within each age group in 2012, 2013, and 2014 to measure improvement in immunization coverage over time using chi-squared tests. We repeated these analyses on population-level data from Vermont's immunization registry for the 12 practices in each age group each year. We used difference-in-differences regressions in the immunization registry data to compare improvements over time between the 12 practices and those not participating in QI. Immunization coverage increased over 3 years for all ages and all immunization series (P ≤ .009) except one, as measured by medical record review. Registry results aligned partially with medical record review with increases in early childhood and adolescent series over time (P ≤ .012). Notably, the adolescent immunization series completion, including human papillomavirus, increased more than in the comparison practices (P = .037). Medical record review indicated that QI efforts led to increases in immunization coverage in pediatric primary care. Results were partially validated in the immunization registry particularly among early childhood and adolescent groups, with a population-level impact of the intervention among adolescents. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  20. The Homogeneity of the Potsdam Solar Radiation Data

    NASA Astrophysics Data System (ADS)

    Behrens, K.

    2009-04-01

    At the Meteorological Station in Potsdam (Germany), the measurement of sunshine duration started as early as 1893. Later, in 1937, the registration of global, diffuse and direct solar radiation was begun with pyranometers and a pyrheliometer. Since 1893 sunshine duration has been measured with the same method, the Campbell-Stokes sunshine recorder, at the same site, while the measurements of solar radiation have changed in equipment, measurement methods and location. Furthermore, it was firstly necessary to supplement some missing data within the time series and secondly desirable to extend the series of global radiation backward to 1893 by regression with the sunshine duration. Because solar radiation, especially global radiation, is one of the most important quantities for climate research, it is necessary to investigate the homogeneity of these time series. At first the history was studied and as much information as possible about all parameters which could influence the data was gathered. In a second step these metadata were reviewed critically, followed by a discussion of the potential effects of local factors on the homogeneity of the data. In a first step of data rehabilitation the so-called engineering corrections (data levelling to WRR and SI units) were made, followed by the supplementation of gaps. Finally, for every month and the year, the time series of measured data generated in this way (1937/2008) and the complete series, prolonged by regression and measurements (1893/2008), were tested for homogeneity with the following distribution-free tests: the WILCOXON (U) test, the MANN-KENDALL test and progressive analysis were used for the examination of the stability of the mean and the dispersion, while the Wald-Wolfowitz test checked the first-order autocorrelation. These non-parametric tests were used because radiation data frequently do not fulfil the assumption of a GAUSSian or normal distribution. The investigations showed that the discontinuities found are in most cases not related to metadata marking changes of site, equipment, etc. Nor is the point of intersection, where the calculated time series is connected to the measurements, marked by a discontinuity. This means that the time series are stable and that the measurements and the calculated part are in good agreement.
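
    One of the distribution-free tests named above, the Mann-Kendall test, can be sketched in a few lines; this is a generic textbook version (normal approximation, ties ignored) applied to a synthetic annual series, not the station's actual data or software:

      # Generic Mann-Kendall trend test, for illustration only.
      import numpy as np
      from scipy import stats

      def mann_kendall(x):
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          p = 2 * (1 - stats.norm.cdf(abs(z)))                  # two-sided p-value
          return z, p

      rng = np.random.default_rng(8)
      annual_series = 100 + rng.normal(0, 5, 70)                # synthetic annual values, arbitrary units
      z, p = mann_kendall(annual_series)
      print(f"Mann-Kendall z = {z:.2f}, p = {p:.3f}")           # large p -> no evidence of trend in the mean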

  1. Development of web tools to disseminate space geodesy data-related products

    NASA Astrophysics Data System (ADS)

    Soudarin, L.; Ferrage, P.; Mezerette, A.

    2014-12-01

    In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to the collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). The next step, currently in progress, is the creation of a database to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). Work to visualize and compare position time series of the four main space geodetic techniques, DORIS, GNSS, SLR and VLBI, is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).

  2. Research on maximum level noise contaminated of remote reference magnetotelluric measurements using synthesized data

    NASA Astrophysics Data System (ADS)

    Gang, Zhang; Fansong, Meng; Jianzhong, Wang; Mingtao, Ding

    2018-02-01

    Determining the magnetotelluric impedance precisely and accurately is fundamental to valid inversion and geological interpretation. This study aims to determine the minimum value of the signal-to-noise ratio (SNR) that maintains the effectiveness of the remote reference technique. Simulation of standard time series, addition of different levels of Gaussian noise to obtain time series with different SNRs, and analysis of the intermediate data, such as polarization direction, correlation coefficient, and impedance tensor, show that when the SNR value is larger than 23.5743, a smooth and accurate sounding curve can be obtained despite disorder in the morphology of the polarization direction. Under this condition, the correlation coefficient of nearly all segments between the base and remote station is larger than 0.9, and the impedance tensor Zxy presents only one aggregation, which meets the characteristics of the natural magnetotelluric signal.

  3. Conditioned empirical orthogonal functions for interpolation of runoff time series along rivers: Application to reconstruction of missing monthly records

    NASA Astrophysics Data System (ADS)

    Li, Lingqi; Gottschalk, Lars; Krasovskaia, Irina; Xiong, Lihua

    2018-01-01

    Reconstruction of missing runoff data is of great significance for resolving the contradiction between the common occurrence of gaps and the fundamental need for complete time series in reliable hydrological research. The conventional empirical orthogonal functions (EOF) approach has been documented to be useful for interpolating hydrological series based upon spatiotemporal decomposition of runoff variation patterns, without additional measurements (e.g., precipitation, land cover). This study develops a new EOF-based approach (abbreviated as CEOF) that conditions the EOF expansion on the oscillations at the outlet (or any other reference station) of a target basin and creates a set of residual series by removing the dependence on this reference series, in order to redefine the amplitude functions (components). This development allows a transparent hydrological interpretation of the dimensionless components and thereby strengthens their capacity to explain various runoff regimes in a basin. The two approaches are demonstrated in an application to discharge observations from the Ganjiang basin, China. Two alternatives for determining the amplitude functions, based on centred and standardised series respectively, are tested. The convergence of the reconstruction of observations at different sites as a function of the number of components, and its relation to the characteristics of the site, are analysed. Results indicate that the CEOF approach offers an efficient way to restore runoff records with only one to four components; it performs better in large nested basins than at headwater sites and often outperforms the EOF approach when using standardised series, especially in improving infilling accuracy for low flows. Comparisons against other interpolation methods (i.e., nearest neighbour, linear regression, inverse distance weighting) further confirm the advantage of the EOF-based approaches in avoiding spatial and temporal inconsistencies in the estimated series.
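
    For orientation, a minimal sketch of the unconditioned EOF gap-filling idea that the CEOF approach extends: fill gaps with column means, then iterate an SVD reconstruction from the leading modes. The stations-by-months matrix and all parameters are synthetic assumptions, and the conditioning step on a reference (outlet) series is not shown:

      # Assumed setup: rows = months, columns = stations, NaN marks missing records.
      import numpy as np

      def eof_infill(X, n_modes=3, n_iter=50):
          X = X.astype(float).copy()
          gaps = np.isnan(X)
          col_mean = np.nanmean(X, axis=0)
          X[gaps] = np.take(col_mean, np.where(gaps)[1])        # first guess: station climatology
          for _ in range(n_iter):
              mean = X.mean(axis=0)
              U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
              recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes] + mean
              X[gaps] = recon[gaps]                             # update only the missing entries
          return X

      rng = np.random.default_rng(1)
      truth = np.outer(np.sin(np.linspace(0, 12 * np.pi, 240)), rng.uniform(1, 5, 8))  # 240 months x 8 sites
      obs = truth + rng.normal(0, 0.1, truth.shape)
      obs[rng.random(obs.shape) < 0.15] = np.nan                # 15% of the records are gaps
      filled = eof_infill(obs, n_modes=2)
      print(float(np.nanmax(np.abs(filled - truth))))           # error of the infilled field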

  4. Kinetic Titration Series with Biolayer Interferometry

    PubMed Central

    Frenzel, Daniel; Willbold, Dieter

    2014-01-01

    Biolayer interferometry is a method to analyze protein interactions in real time. In this study, we illustrate its usefulness for quantitatively analyzing high-affinity protein-ligand interactions, employing a kinetic titration series to characterize the interactions between two pairs of interaction partners, in particular immunoglobulin G and protein G B1 as well as scFv IC16 and amyloid beta (1–42). Kinetic titration series are commonly used in surface plasmon resonance and involve sequential injections of analyte over a desired concentration range on a single ligand-coated sensor chip without waiting for complete dissociation between the injections. We show that applying this method to biolayer interferometry is straightforward and i) circumvents problems in data evaluation caused by unavoidable sensor differences, ii) saves resources and iii) increases throughput when screening a multitude of different analyte/ligand combinations. PMID:25229647

  5. Kinetic titration series with biolayer interferometry.

    PubMed

    Frenzel, Daniel; Willbold, Dieter

    2014-01-01

    Biolayer interferometry is a method to analyze protein interactions in real time. In this study, we illustrate its usefulness for quantitatively analyzing high-affinity protein-ligand interactions, employing a kinetic titration series to characterize the interactions between two pairs of interaction partners, in particular immunoglobulin G and protein G B1 as well as scFv IC16 and amyloid beta (1-42). Kinetic titration series are commonly used in surface plasmon resonance and involve sequential injections of analyte over a desired concentration range on a single ligand-coated sensor chip without waiting for complete dissociation between the injections. We show that applying this method to biolayer interferometry is straightforward and i) circumvents problems in data evaluation caused by unavoidable sensor differences, ii) saves resources and iii) increases throughput when screening a multitude of different analyte/ligand combinations.

  6. Biogeochemical Response to Mesoscale Physical Forcing in the California Current System

    NASA Technical Reports Server (NTRS)

    Niiler, Pearn P.; Letelier, Ricardo; Moisan, John R.; Marra, John A. (Technical Monitor)

    2001-01-01

    In the first part of the project, we investigated the local response of coastal ocean ecosystems (changes in chlorophyll concentration and chlorophyll fluorescence quantum yield) to physical forcing by developing and deploying Autonomous Drifting Ocean Stations (ADOS) within several mesoscale features along the U.S. west coast. We also compared the temporal and spatial variability registered by sensors mounted on the drifters to that registered by sensors mounted on the satellites in order to assess the scales of variability that are not resolved by the ocean color satellite. The second part of the project used the existing WOCE SVP Surface Lagrangian drifters to track individual water parcels through time. The individual drifter tracks were used to generate multivariate time series by interpolating/extracting the biological and physical data fields retrieved by remote sensors (ocean color, SST, wind speed and direction, wind stress curl, and sea level topography). The individual time series of the physical data (AVHRR, TOPEX, NCEP) were analyzed against the ocean color (SeaWiFS) time series to determine the time scale of biological response to the physical forcing. The results from this part of the research are being used to compare the decorrelation scales of chlorophyll in a Lagrangian and an Eulerian framework. The results from both parts of this research augmented the time series data needed to investigate the interactions between ocean mesoscale features, wind, and biogeochemical processes. Using the historical Lagrangian data sets, we have completed a comparison of the decorrelation scales in both the Eulerian and Lagrangian reference frames for the SeaWiFS data set. We are continuing to investigate how these results might be used in objective mapping efforts.

  7. Satellite Emission Range Inferred Earth Survey (SERIES) project

    NASA Technical Reports Server (NTRS)

    Buennagel, L. A.; Macdoran, P. F.; Neilan, R. E.; Spitzmesser, D. J.; Young, L. E.

    1984-01-01

    The Global Positioning System (GPS) was developed by the Department of Defense primarily for navigation use by the United States Armed Forces. The system will consist of a constellation of 18 operational Navigation Satellite Timing and Ranging (NAVSTAR) satellites by the late 1980's. During the last four years, the Satellite Emission Range Inferred Earth Surveying (SERIES) team at the Jet Propulsion Laboratory (JPL) has developed a novel receiver which is the heart of the SERIES geodetic system designed to use signals broadcast from the GPS. This receiver does not require knowledge of the exact code sequence being transmitted. In addition, when two SERIES receivers are used differentially to determine a baseline, few cm accuracies can be obtained. The initial engineering test phase has been completed for the SERIES Project. Baseline lengths, ranging from 150 meters to 171 kilometers, have been measured with 0.3 cm to 7 cm accuracies. This technology, which is sponsored by the NASA Geodynamics Program, has been developed at JPL to meet the challenge for high precision, cost-effective geodesy, and to complement the mobile Very Long Baseline Interferometry (VLBI) system for Earth surveying.

  8. Uncertainty estimation with bias-correction for flow series based on rating curve

    NASA Astrophysics Data System (ADS)

    Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta

    2014-03-01

    Streamflow discharge constitutes one of the fundamental data sets required to perform water balance studies and develop hydrological models. A rating curve, designed on the basis of a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate a complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available even though it has a significant impact on hydrological modelling. In this paper, we identify the discrepancy of the hydrographers' rating curves used to derive the historical discharge data series and propose a modification by bias correction, which, like the traditional rating curve, also takes the form of a power function. In order to obtain the uncertainty estimation, we further propose a two-sided Box-Cox transformation to stabilize the regression residuals as close to the normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series in the ensemble generation. We demonstrate the proposed method by applying it to gauging stations in the Flinders and Gilbert rivers in north-west Queensland, Australia.
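
    A minimal sketch of the general workflow, with synthetic stage-discharge data and without the bias correction of existing hydrographers' curves: fit a power-law rating curve in log-log space, Box-Cox transform both observed and fitted discharges with a common lambda, and use the roughly normal residuals to attach an uncertainty band:

      # Synthetic example; the cease-to-flow stage and the bias-correction step are omitted.
      import numpy as np
      from scipy import special, stats

      rng = np.random.default_rng(2)
      h = rng.uniform(0.2, 3.0, 200)                            # stage (m)
      q_obs = 12.0 * h ** 1.8 * rng.lognormal(0, 0.1, h.size)   # discharge (m3/s) with noise

      b, log_a = np.polyfit(np.log(h), np.log(q_obs), 1)        # power-law rating curve in log-log space
      q_fit = np.exp(log_a) * h ** b

      q_obs_bc, lam = stats.boxcox(q_obs)                       # transform the observations
      q_fit_bc = stats.boxcox(q_fit, lmbda=lam)                 # same lambda applied to the fitted curve
      sigma = (q_obs_bc - q_fit_bc).std(ddof=1)                 # residual spread in transformed space

      # approximate 95% uncertainty band, back-transformed to flow units
      lower = special.inv_boxcox(q_fit_bc - 1.96 * sigma, lam)
      upper = special.inv_boxcox(q_fit_bc + 1.96 * sigma, lam)
      print(f"a = {np.exp(log_a):.2f}, b = {b:.2f}, lambda = {lam:.2f}")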

  9. Revision of Primary Series Maps

    USGS Publications Warehouse

    ,

    2000-01-01

    In 1992, the U.S. Geological Survey (USGS) completed a 50-year effort to provide primary series map coverage of the United States. Many of these maps now need to be updated to reflect the construction of new roads and highways and other changes that have taken place over time. The USGS has formulated a graphic revision plan to help keep the primary series maps current. Primary series maps include 1:20,000-scale quadrangles of Puerto Rico, 1:24,000- or 1:25,000-scale quadrangles of the conterminous United States, Hawaii, and U.S. Territories, and 1:63,360-scale quadrangles of Alaska. The revision of primary series maps from new collection sources is accomplished using a variety of processes. The raster revision process combines the scanned content of paper maps with raster updating technologies. The vector revision process involves the automated plotting of updated vector files. Traditional processes use analog stereoplotters and manual scribing instruments on specially coated map separates. The ability to select from or combine these processes increases the efficiency of the National Mapping Division map revision program.

  10. Generalized synchronization in relay systems with instantaneous coupling

    NASA Astrophysics Data System (ADS)

    Gutiérrez, R.; Sevilla-Escoboza, R.; Piedrahita, P.; Finke, C.; Feudel, U.; Buldú, J. M.; Huerta-Cuellar, G.; Jaimes-Reátegui, R.; Moreno, Y.; Boccaletti, S.

    2013-11-01

    We demonstrate the existence of generalized synchronization in systems that act as mediators between two dynamical units that, in turn, show complete synchronization with each other. These are the so-called relay systems. Specifically, we analyze the Lyapunov spectrum of the full system to elucidate when complete and generalized synchronization appear. We show that once a critical coupling strength is achieved, complete synchronization emerges between the systems to be synchronized, and at the same point, generalized synchronization with the relay system also arises. Next, we use two nonlinear measures based on the distance between phase-space neighbors to quantify the generalized synchronization in discretized time series. Finally, we experimentally show the robustness of the phenomenon and of the theoretical tools here proposed to characterize it.

  11. An Examination of Master's Student Retention & Completion

    ERIC Educational Resources Information Center

    Barry, Melissa; Mathies, Charles

    2011-01-01

    This study was conducted at a research-extensive public university in the southeastern United States. It examined the retention and completion of master's degree students across numerous disciplines. Results were derived from a series of descriptive statistics, T-tests, and a series of binary logistic regression models. The findings from binary…

  12. Assessment of Program Impact Through First Grade, Volume IV: Impact on Teachers. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Wacker, Sally; And Others

    The fourth in a series reporting evaluation findings on the impact of Project Developmental Continuity (PDC), this volume reports treatment-related and other findings concerning teachers and classrooms up to the time the evaluation study's cohort of children had completed grade 1. Begun at 15 sites in 1974 with the purpose of ensuring that…

  13. Assessment of Program Impact Through First Grade, Volume III: Impact on Parents. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Morris, Mary; And Others

    Third in a series of six, this volume reports findings concerning the impact of Project Developmental Continuity (PDC) on the parents of the evaluation study's cohort of children as well as preliminary findings on the relationship between family characteristics and program outcome variables up to the time the children had completed grade 1. Begun…

  14. Assessment of Program Impact Through First Grade, Volume II: Impact on Institutions. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Rosario, Jose; And Others

    As part of a longitudinal study evaluating program effects, this report, the second in a series of six, describes the impact of Project Developmental Continuity (PDC) on the institutional policies and procedures of participating Head Start centers and elementary schools up to the time the evaluation study's cohort of children had completed grade…

  15. Mapping of volcanic units at Alba Patera, Mars

    NASA Technical Reports Server (NTRS)

    Cattermole, Peter

    1987-01-01

    Detailed photogeologic mapping of Alba Patera, Northern Tharsis, was completed and a geologic map prepared. This was supplemented by a series of detailed volcanic flow maps and used to study the morphometry of different flow types and analyze the way in which the behavior of the volcano has changed with time and also the manner in which flow fields developed in different sectors of the structure.

  16. Nationwide disturbance attribution on NASA’s earth exchange: experiences in a high-end computing environment

    Treesearch

    J. Chris Toney; Karen G. Schleeweis; Jennifer Dungan; Andrew Michaelis; Todd Schroeder; Gretchen G. Moisen

    2015-01-01

    The North American Forest Dynamics (NAFD) project’s Attribution Team is completing nationwide processing of historic Landsat data to provide a comprehensive annual, wall-to-wall analysis of US disturbance history, with attribution, over the last 25+ years. Per-pixel time series analysis based on a new nonparametric curve fitting algorithm yields several metrics useful...

  17. Use of Saliva for Assessment of Stress and Its Effect on the Immune System Prior to Gross Anatomy Practical Examinations

    ERIC Educational Resources Information Center

    Lester, S. Reid; Brown, Jason R.; Aycock, Jeffrey E.; Grubbs, S. Lee; Johnson, Roger B.

    2010-01-01

    The objective of this study was to determine the longitudinal effects of a series of stressful gross anatomy tests on the immune system. Thirty-six freshman occupational therapy students completed a written stress evaluation survey, and saliva samples were obtained at baseline and prior to each of three timed-practical gross anatomy tests.…

  18. Privatization and Access: The Chilean Higher Education Experiment and Its Discontents. Research and Occasional Paper Series: CSHE.11.15

    ERIC Educational Resources Information Center

    González, Cristina; Pedraja, Liliana

    2015-01-01

    President Barack Obama recently announced a proposal to eliminate tuition charges at community colleges so that everyone can easily complete the first two years of a university education. At the same time, the administration is creating new regulations to curb the worst abuses of for-profit universities. This suggests that the country has reached…

  19. Essential Study Skills: The Complete Guide to Success at University. Second Edition. Sage Study Skills Series

    ERIC Educational Resources Information Center

    Burns, Tom; Sinfield, Sandra

    2008-01-01

    The eagerly-awaited new edition of the successful "Essential Study Skills" continues to provide a truly practical guide to achieving success at university. Whether you are going to university straight from school, a mature student, or an overseas student studying in the UK for the first time, this is the book that will help you to better…

  20. Serial office-based steroid injections for treatment of idiopathic subglottic stenosis.

    PubMed

    Hoffman, Matthew R; Coughlin, Adam R; Dailey, Seth H

    2017-11-01

    Current treatment options for idiopathic subglottic stenosis include endoscopic interventions, resection, and tracheotomy. Recently, serial office-based steroid injections were proposed as an alternative that may stabilize or induce regression of airway stenosis without the need for repeated operations. Procedure completion rate, pain, complications, effect on stenosis, time since the last operation, and limitations have not been described. Retrospective case series. Retrospective series of 19 patients undergoing serial office-based steroid injection for idiopathic subglottic stenosis. Outcome measures included completion rate, procedure-related pain scores, complications, percentage of airway stenosis, and time since the last operative intervention. Procedure completion rate was 98.8%. Average pain score during the procedure was 2.3 ± 1.7 on a 10-point scale. There were no immediate complications. One patient underwent awake tracheotomy 8 days after her second injection and was later decannulated. Average stenosis decreased from 35% ± 15% to 25% ± 15% (n = 16; P = .086) over the first set of three injections and from 40% ± 15% to 25% ± 10% to 20% ± 10% (n = 8; P = .002) for those patients completing two sets of three injections. Fourteen of 17 patients undergoing at least three injections have not returned to the operating room since the first injection. Office-based steroid injection represents a promising new treatment pathway for a disease that requires long-term management, offering a purely pharmacologic approach to a disorder that has traditionally been approached from a mechanical perspective. It is safe, well tolerated, and effective. Furthermore, it may help patients and physicians avoid repeated trips to the operating room and the associated risks. Level of evidence: 4. Laryngoscope, 127:2475-2481, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  1. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process

    PubMed Central

    2013-01-01

    Background Statistical process control (SPC), an initiative originating in industry, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false-alarm frequency. Methods Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual intensive care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial) autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via the Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247), respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag 40 and 35% had autocorrelation through to lag 40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to the RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to the RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957
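
    A minimal sketch of the EWMA control-charting step on a raw-minus-expected mortality series (synthetic data; the paper's risk adjustment with a random-coefficient logistic model and its seasonal ARMA/GARCH modelling are not reproduced here):

      # Synthetic monthly mortality proportions; 'expected' plays the role of the
      # risk-adjusted expectation, and a simulated deterioration starts at month 150.
      import numpy as np

      rng = np.random.default_rng(3)
      months = 180
      expected = 0.14 + 0.02 * np.sin(2 * np.pi * np.arange(months) / 12)
      observed = expected + rng.normal(0, 0.015, months)
      observed[150:] += 0.03

      lam = 0.2                                   # EWMA smoothing constant
      resid = observed - expected
      sigma = resid[:120].std(ddof=1)             # in-control variability from an early baseline
      z = np.zeros(months)
      for t in range(months):
          z[t] = lam * resid[t] + (1 - lam) * (z[t - 1] if t else 0.0)

      k = np.arange(1, months + 1)                # time-varying 3-sigma limits for the EWMA statistic
      limit = 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
      signals = np.where(np.abs(z) > limit)[0]
      print("first out-of-control month:", int(signals[0]) if signals.size else None)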

  2. Automated classification of Permanent Scatterers time-series based on statistical characterization tests

    NASA Astrophysics Data System (ADS)

    Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal

    2013-04-01

    The application of space-borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the Earth's surface to the development of advanced multi-interferogram techniques to analyze any sort of natural phenomenon that involves movement of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among end-users. Despite the great potential of this technique, radar interpretation of slope movements is generally based on the sole analysis of average displacement velocities, while the information embraced in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series characterized by erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters which can be efficiently used to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines). This dataset was generated using standard processing, so the time series are affected by a significant noise-to-signal ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edge of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS makes it possible to detect deformation phenomena not visible from the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes of the displacement rate. Based on these encouraging results we integrated all the classification algorithms into a Graphical User Interface (called PSTime) which is freely available as a standalone application.
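
    The classification idea can be sketched by fitting candidate trend models of increasing complexity and keeping the one with the lowest BIC; the following toy example (not the PSTime implementation) covers only the uncorrelated, linear, quadratic and bilinear cases of the six-class scheme:

      # Toy illustration: candidate trend models compared by BIC; the model set,
      # noise level and bilinear breakpoint search are assumptions, not the published tests.
      import numpy as np

      def bic(y, fitted, n_par):
          n = y.size
          return n * np.log(np.sum((y - fitted) ** 2) / n) + n_par * np.log(n)

      def classify_ps(t, y):
          candidates = {
              "0 uncorrelated": (np.full_like(y, y.mean()), 1),
              "1 linear": (np.polyval(np.polyfit(t, y, 1), t), 2),
              "2 quadratic": (np.polyval(np.polyfit(t, y, 2), t), 3),
          }
          best_fit, best_rss = None, np.inf                     # bilinear: best single breakpoint
          for k in range(3, t.size - 3):
              fitted = np.concatenate([np.polyval(np.polyfit(t[:k], y[:k], 1), t[:k]),
                                       np.polyval(np.polyfit(t[k:], y[k:], 1), t[k:])])
              rss = np.sum((y - fitted) ** 2)
              if rss < best_rss:
                  best_fit, best_rss = fitted, rss
          candidates["3 bilinear"] = (best_fit, 5)
          scores = {name: bic(y, f, p) for name, (f, p) in candidates.items()}
          return min(scores, key=scores.get)

      rng = np.random.default_rng(4)
      t = np.arange(60, dtype=float)                            # epochs
      y = np.where(t < 30, -0.5 * t, -15 - 2.0 * (t - 30)) + rng.normal(0, 1.0, t.size)
      print(classify_ps(t, y))                                  # expected: "3 bilinear" (change in velocity)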

  3. Automatic location of L/H transition times for physical studies with a large statistical basis

    NASA Astrophysics Data System (ADS)

    González, S.; Vega, J.; Murari, A.; Pereira, A.; Dormido-Canto, S.; Ramírez, J. M.; contributors, JET-EFDA

    2012-06-01

    Completely automatic techniques to estimate and validate L/H transition times can be essential in L/H transition analyses. The generation of databases with hundreds of transition times and without human intervention is an important step towards (a) L/H transition physics analysis, (b) validation of L/H theoretical models and (c) creation of L/H scaling laws. An entirely unattended methodology to build large databases of transition times in JET using time series is presented in this paper. The proposed technique has been applied to a dataset of 551 JET discharges between campaigns C21 and C26. For discharges that show a clear signature in the time series, a prediction is made through the locating properties of the wavelet transform; the prediction is accurate and the uncertainty interval is ±3.2 ms. For discharges without a clear pattern in the time series, an L/H mode classifier built from the discharges with a clear signature is used. In this case, the estimation error shows a distribution with mean and standard deviation of 27.9 ms and 37.62 ms, respectively. Two different regression methods have been applied to the measurements acquired at the transition times identified by the automatic system. The obtained scaling laws for the threshold power are not significantly different from those obtained using the data at the transition times determined manually by the experts. The automatic methods allow physical studies with a large number of discharges, showing, for example, that there are statistically different types of transitions characterized by different scaling laws.
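
    The locating property exploited for discharges with a clear signature can be illustrated with a Haar-type wavelet: the transition time is taken where the coefficient at a chosen scale is largest in magnitude, i.e. where the signal changes most abruptly. The signal below is synthetic and the implementation is a simplified stand-in, not the JET system:

      # Synthetic edge-like signal with a sharp drop; a Haar-type coefficient is used
      # as a simple stand-in for the wavelet transform's locating property.
      import numpy as np

      def haar_coefficients(x, scale):
          """Mean of the next `scale` samples minus mean of the previous `scale` samples."""
          c = np.zeros_like(x, dtype=float)
          for i in range(scale, x.size - scale):
              c[i] = x[i:i + scale].mean() - x[i - scale:i].mean()
          return c

      rng = np.random.default_rng(5)
      t = np.linspace(0.0, 1.0, 2000)                           # seconds
      signal = 1.0 + 0.05 * rng.normal(size=t.size)
      signal[t >= 0.6324] -= 0.7                                # abrupt drop at the (synthetic) transition

      coefs = haar_coefficients(signal, scale=20)
      t_transition = t[np.argmin(coefs)]                        # largest negative step response
      print(f"estimated transition time: {t_transition * 1000:.1f} ms")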

  4. Deriving phenological metrics from NDVI through an open source tool developed in QGIS

    NASA Astrophysics Data System (ADS)

    Duarte, Lia; Teodoro, A. C.; Gonçalves, Hernãni

    2014-10-01

    Vegetation indices have been commonly used over the past 30 years for studying vegetation characteristics using images collected by remote sensing satellites. One of the most commonly used is the Normalized Difference Vegetation Index (NDVI). The various stages that green vegetation undergoes during a complete growing season can be summarized through time-series analysis of NDVI data. The analysis of such time series allows key phenological variables or metrics of a particular season to be extracted. These characteristics may not necessarily correspond directly to conventional, ground-based phenological events, but they do provide indications of ecosystem dynamics. A complete list of the phenological metrics that can be extracted from smoothed, time-series NDVI data is available in the USGS online resources (http://phenology.cr.usgs.gov/methods_deriving.php). This work aims to develop an open source application to automatically extract these phenological metrics from a set of satellite input data. The main advantage of QGIS for this specific application lies in the ease and speed of developing new plug-ins, using the Python language, based on the experience of the research group in other related works. QGIS has its own application programming interface (API) with functionalities and programs to develop new features. The toolbar developed for this application was implemented in the plug-in NDVIToolbar.py. The user introduces the raster files as input and obtains a plot and a report with the metrics. The report includes the following eight metrics: SOST (Start Of Season - Time), the day of the year identified as having a consistent upward trend in the NDVI time series; SOSN (Start Of Season - NDVI), the NDVI value associated with SOST; EOST (End Of Season - Time), the day of the year identified at the end of a consistent downward trend in the NDVI time series; EOSN (End Of Season - NDVI), the NDVI value associated with EOST; MAXN (Maximum NDVI), the maximum NDVI value; MAXT (Time of Maximum), the day associated with MAXN; DUR (Duration), the number of days between SOST and EOST; and AMP (Amplitude), the difference between MAXN and SOSN. This application provides all these metrics in a single step. Initially, the data points are interpolated using moving averages with five and three points. The eight metrics described above are then obtained from the spline using numpy functions. In the present work, the developed toolbar was applied to MODerate resolution Imaging Spectroradiometer (MODIS) data covering a particular region of Portugal; the approach can be applied generally to other satellite data and study areas. The code is open and can be modified according to the user's requirements. Another advantage of publishing the plug-in and application code is that other users can improve the application.
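
    A minimal sketch of the eight metrics on a synthetic one-season NDVI curve; the start and end of season are approximated here with a half-amplitude threshold rather than the consistent-trend criterion used by the toolbar:

      # Synthetic smoothed NDVI curve; simplified SOS/EOS definitions, illustration only.
      import numpy as np

      doy = np.arange(1, 366)
      ndvi = 0.2 + 0.5 * np.exp(-((doy - 200) / 45.0) ** 2)

      maxn = float(ndvi.max())
      maxt = int(doy[np.argmax(ndvi)])
      threshold = ndvi.min() + 0.5 * (maxn - ndvi.min())        # half-amplitude stand-in for SOS/EOS

      rising = np.where((doy < maxt) & (ndvi >= threshold))[0]
      falling = np.where((doy > maxt) & (ndvi <= threshold))[0]
      sost, eost = int(doy[rising[0]]), int(doy[falling[0]])
      sosn, eosn = float(ndvi[rising[0]]), float(ndvi[falling[0]])

      print({
          "SOST": sost, "SOSN": round(sosn, 3),
          "EOST": eost, "EOSN": round(eosn, 3),
          "MAXT": maxt, "MAXN": round(maxn, 3),
          "DUR": eost - sost,                                   # days between SOST and EOST
          "AMP": round(maxn - sosn, 3),                         # difference between MAXN and SOSN
      })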

  5. Development of a multikilowatt ion thruster power processor

    NASA Technical Reports Server (NTRS)

    Schoenfeld, A. D.; Goldin, D. S.; Biess, J. J.

    1972-01-01

    A feasibility study was made of the application of silicon-controlled rectifier (SCR) series resonant inverter power conditioning technology to electric propulsion power processing operating from a 200 to 400 Vdc solar array bus. A power system block diagram was generated to meet the electrical requirements of a 20 cm hollow cathode, mercury bombardment ion engine. The SCR series resonant inverter was developed as the primary means of power switching and conversion, and the analog signal-to-discrete-time-interval converter control system was applied to achieve good regulation. A complete breadboard was designed, fabricated, and tested with a resistive load bank, and critical power processor areas relating to efficiency, weight, and part count were identified.

  6. Human papillomavirus vaccination among adolescents in Georgia.

    PubMed

    Underwood, Natasha L; Weiss, Paul; Gargano, Lisa M; Seib, Katherine; Rask, Kimberly J; Morfaw, Christopher; Murray, Dennis; DiClemente, Ralph J; Hughes, James M; Sales, Jessica M

    2015-01-01

    Human papillomavirus (HPV) vaccination coverage for adolescent females and males remains low in the United States. We conducted a 3-arm randomized controlled trial (RCT) in middle and high schools in eastern Georgia from 2011-2013 to determine the effect of 2 educational interventions used to increase adolescent vaccination coverage for the 4 recommended adolescent vaccines: Tdap, MCV4, HPV and influenza. As part of this RCT, this article focuses on: 1) describing initiation and completion of the HPV vaccine series among a diverse population of male and female adolescents; 2) assessing parental attitudes toward the HPV vaccine; and 3) examining correlates of HPV vaccine series initiation and completion. Parental attitude score was the strongest predictor of HPV vaccine initiation among adolescents (adjusted odds ratio (aOR): 2.08; 95% confidence interval (CI): 1.80, 2.39). Other correlates that significantly predicted HPV series initiation were gender, study year, and intervention arm. Parental attitudes remained a significant predictor of receipt of 3 doses of HPV vaccine, along with gender, race, school type and insurance type. This study demonstrates that positive parental attitudes are important predictors of HPV vaccination and critical to increasing coverage rates. Our findings suggest that more research is needed to understand how parental attitudes are developed and evolve over time.

  7. Findings of multiple HPV genotypes in cervical carcinoma are associated with poor cancer-specific survival in a Swedish cohort of cervical cancer primarily treated with radiotherapy.

    PubMed

    Kaliff, Malin; Sorbe, Bengt; Mordhorst, Louise Bohr; Helenius, Gisela; Karlsson, Mats G; Lillsunde-Larsson, Gabriella

    2018-04-10

    Cervical cancer (CC) is one of the most common cancers in women, and virtually all cases of CC are the result of a persistent infection with human papillomavirus (HPV). Disease detected at an early stage can be treated curatively, but for late diagnoses with recurrent disease and metastasis the treatment options are limited. Here we evaluate the impact of HPV on treatment resistance and metastatic disease progression. The prevalence and distribution of HPV genotypes and HPV16 variants in a Swedish CC patient cohort (n=209) were evaluated, as well as the influence of HPV on patient prognosis. Tumor samples suitable for analysis (n=204) were genotyped using two different real-time PCR methods. HPV16 variant analysis was performed using pyrosequencing. Results showed that HPV prevalence in the total series was 93%. Of the HPV-positive samples, 13% contained multiple infections, typically with two high-risk HPV types together. The primary cure rate for the complete series was 95%. The recurrence rate of the complete series was 28%, and distant recurrences were the most frequent (20%). Patients with tumors containing multiple HPV strains, and particularly HPV genotypes belonging to the alpha 7 and alpha 9 species together, had a significantly higher rate of distant tumor recurrences and a worse cancer-specific survival rate.

  8. HMI Data Driven Magnetohydrodynamic Model Predicted Active Region Photospheric Heating Rates: Their Scale Invariant, Flare Like Power Law Distributions, and Their Possible Association With Flares

    NASA Technical Reports Server (NTRS)

    Goodman, Michael L.; Kwan, Chiman; Ayhan, Bulent; Shang, Eric L.

    2017-01-01

    A data driven, near photospheric, 3 D, non-force free magnetohydrodynamic model predicts time series of the complete current density, and the resistive heating rate Q at the photosphere in neutral line regions (NLRs) of 14 active regions (ARs). The model is driven by time series of the magnetic field B observed by the Helioseismic & Magnetic Imager on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series for B in every AR pixel. Errors in B due to these periods can be significant. The number of occurrences N(q) of values of Q ≥ q for each AR time series is found to be a scale invariant power law distribution, N(Q) ∝ Q^(-s), above an AR dependent threshold value of Q, where 0.3952 ≤ s ≤ 0.5298 with mean and standard deviation of 0.4678 and 0.0454, indicating little variation between ARs. Observations show that the number of occurrences N(E) of coronal flares with a total energy released ≥ E obeys the same type of distribution, N(E) ∝ E^(-S), above an AR dependent threshold value of E, with 0.38 ≲ S ≲ 0.60, also with little variation among ARs. Within error margins the ranges of s and S are nearly identical. This strong similarity between N(Q) and N(E) suggests a fundamental connection between the process that drives coronal flares and the process that drives photospheric NLR heating rates in ARs. In addition, results suggest it is plausible that spikes in Q, several orders of magnitude above background values, are correlated with times of the subsequent occurrence of M or X flares.

  9. HMI Data Driven Magnetohydrodynamic Model Predicted Active Region Photospheric Heating Rates: Their Scale Invariant, Flare Like Power Law Distributions, and Their Possible Association With Flares

    NASA Technical Reports Server (NTRS)

    Goodman, Michael L.; Kwan, Chiman; Ayhan, Bulent; Shang, Eric L.

    2017-01-01

    A data driven, near photospheric, 3 D, non-force free magnetohydrodynamic model predicts time series of the complete current density, and the resistive heating rate Q at the photosphere in neutral line regions (NLRs) of 14 active regions (ARs). The model is driven by time series of the magnetic field B observed by the Helioseismic and Magnetic Imager on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series for B in every AR pixel. Errors in B due to these periods can be significant. The number of occurrences N(q) of values of Q ≥ q for each AR time series is found to be a scale invariant power law distribution, N(Q) ∝ Q^(-s), above an AR dependent threshold value of Q, where 0.3952 ≤ s ≤ 0.5298 with mean and standard deviation of 0.4678 and 0.0454, indicating little variation between ARs. Observations show that the number of occurrences N(E) of coronal flares with a total energy released ≥ E obeys the same type of distribution, N(E) ∝ E^(-S), above an AR dependent threshold value of E, with 0.38 ≲ S ≲ 0.60, also with little variation among ARs. Within error margins the ranges of s and S are nearly identical. This strong similarity between N(Q) and N(E) suggests a fundamental connection between the process that drives coronal flares and the process that drives photospheric NLR heating rates in ARs. In addition, results suggest it is plausible that spikes in Q, several orders of magnitude above background values, are correlated with times of the subsequent occurrence of M or X flares.
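
    A minimal sketch of estimating the exponent s in N(Q) ∝ Q^(-s) by regressing log exceedance counts on log Q above a threshold (synthetic Pareto-type sample with an arbitrary exponent; the authors' estimator may differ):

      # Synthetic sample; log-log regression of exceedance counts as one simple estimator of s.
      import numpy as np

      rng = np.random.default_rng(6)
      s_true, q_min = 0.47, 1.0
      Q = q_min * (1 - rng.random(20000)) ** (-1.0 / s_true)    # survival N(Q)/n ~ (q_min/Q)**s

      thresholds = np.logspace(0, 3, 40)
      counts = np.array([(Q >= q).sum() for q in thresholds])
      keep = counts > 10                                        # avoid the noisy far tail
      slope, _ = np.polyfit(np.log(thresholds[keep]), np.log(counts[keep]), 1)
      print(f"estimated s = {-slope:.3f} (true value {s_true})")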

  10. Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.

    PubMed

    Yamamoto, Loren; Kanemori, Joan

    2010-06-01

    Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error prone. The purpose of this study is to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (sequence assigned to start with a conventional or a computer-assisted approach). Completion times, errors, and the reason for the error were recorded. Thirty-eight nurses completed the study. Summing the completion of all 4 parts, the mean conventional total time was 1243 seconds vs the mean computer program total time of 879 seconds (P < .001). The conventional manual method had a mean of 1.8 errors vs the computer program with a mean of 0.7 errors (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged, noting that reading/interpreting certain drug labels were more error prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.
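
    For context, the chain of calculations that the computer assistance automates can be sketched as follows; the drug parameters and numbers are purely illustrative, not a clinical reference:

      # Illustrative values only - not a clinical reference.
      def pediatric_dose(weight_kg, dose_mg_per_kg, max_dose_mg, concentration_mg_per_ml):
          dose_mg = min(weight_kg * dose_mg_per_kg, max_dose_mg)    # weight-based dose with a cap
          volume_ml = dose_mg / concentration_mg_per_ml             # volume to draw from the vial
          return round(dose_mg, 2), round(volume_ml, 2)

      # e.g. an 18 kg child, 10 mg/kg ordered, 500 mg maximum, 50 mg/mL vial
      print(pediatric_dose(18, 10, 500, 50))                        # -> (180, 3.6)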

  11. Bibliography on Tidal Hydraulics. Supplementary Material Compiled from June 1983 to June 1986. Tidal Flows in Rivers and Harbors. Supplement Number 10.

    DTIC Science & Technology

    1987-06-01

    Section VIII.) the total time. The reverse of this cir- culation (surface inflow, outflow at Edinger, J. E., and Buchak, E. M. "Estu- depth) and storage ...respect to their applicabil- Attempts have been made to determine the ity. Hourly sampled 70-hours time series flow characteristics in the estuary, ana- of...Integration Using Pumped Storage ." cient equations, it is obvious that the (See complete entry in Section V.) flow will not be properly simulated with

  12. Nonlinear wave vacillation in the atmosphere

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.

    1987-01-01

    The problem of vacillation in a baroclinically unstable flow field is studied through the time evolution of a single nonlinearly unstable wave. To this end a computer code is being developed to solve numerically for the time evolution of the amplitude of such a wave. The final working code will be the end product resulting from the development of a hierarchy of codes with increasing complexity. The first code in this series was completed and is undergoing several diagnostic analyses to verify its validity. The development of this code is detailed.

  13. GPS data exploration for seismologists and geodesists

    NASA Astrophysics Data System (ADS)

    Webb, F.; Bock, Y.; Kedar, S.; Dong, D.; Jamason, P.; Chang, R.; Prawirodirdjo, L.; MacLeod, I.; Wadsworth, G.

    2007-12-01

    Over the past decade, GPS and seismic networks spanning the western US plate boundaries have produced vast amounts of data that need to be made accessible to both the geodesy and seismology communities. Unlike seismic data, raw geodetic data require significant processing before geophysical interpretations can be made. This requires the generation of data products (time series, velocities and strain maps) and dissemination strategies to bridge these differences and assure efficient use of data across traditionally separate communities. "GPS DATA PRODUCTS FOR SOLID EARTH SCIENCE" (GDPSES) is a multi-year NASA funded project, designed to produce and deliver high quality GPS time series, velocities, and strain fields, derived from multiple GPS networks along the western US plate boundary, and to make these products easily accessible to geophysicists. Our GPS product dissemination is through modern web-based IT methodology. Product browsing is facilitated through a web tool known as GPS Explorer, and continuous streams of GPS time series are provided using web services to the seismic archive, where they can be accessed by seismologists using traditional seismic data viewing and manipulation tools. GPS Explorer enables users to efficiently browse several layers of data products, from raw data through time series, velocities and strain, by providing a web interface which seamlessly interacts with a continuously updated database of these data products through the use of web services. The current archive contains GDPSES data products beginning in 1995, and includes observations from GPS stations in EarthScope's Plate Boundary Observatory (PBO), as well as from real-time CGPS stations. The generic, standards-based approach used in this project enables GDPSES to expand seamlessly and indefinitely to include other space-time-dependent data products from additional GPS networks. The prototype GPS Explorer provides users with a personalized working environment in which the user may zoom in and access subsets of the data via web services. It provides users with a variety of interactive web tools interconnected in a portlet environment to explore and save datasets of interest to return to at a later date. At the same time, the GPS time series are also made available through the seismic data archive, where the GPS networks are treated as regular seismic networks whose data are made available in data formats used by seismic utilities such as SEED readers and SAC. A key challenge, stemming from the fundamental differences between seismic and geodetic time series, is the representation of reprocessed GPS data in the seismic archive. As GPS processing algorithms evolve and their accuracy increases, a periodic complete recreation of the GPS time series archive is necessary.

  14. Video capsule endoscopy after bariatric and gastric surgery: oral ingestion is associated with satisfactory completion rate.

    PubMed

    Stanich, Peter P; Kleinman, Bryan; Porter, Kyle M; Meyer, Marty M

    2015-01-01

    To investigate the outcomes of video capsule endoscopy (VCE) performed on patients after bariatric and gastric surgery with a focus on delivery method (oral ingestion or endoscopic placement). There is minimal published data regarding the use of VCE in patients after bariatric and gastric surgery and the optimal delivery method is unknown. Retrospective case series of patients with bariatric or gastric surgery undergoing VCE in a tertiary care center over 3 years. Outcomes of interest were completion of the procedure and bowel transit times. Twenty-three patients met study criteria. They underwent 24 VCE in the study period, with 13/16 (81.3%; 95% CI, 54%-96%) completed to the colon after oral ingestion and 5/8 (62.5%; 95% CI, 24%-91%) completed after endoscopic deployment. The median gastric transit time after oral ingestion was <1 minute (IQR, <1 to 99). Median total transit time after oral ingestion was 291 minutes (IQR, 213 to 434) and after endoscopic deployment was 364 minutes (IQR, 233 to >440) (P=0.48). There were no instances of capsule retention. Oral ingestion of VCE resulted in a satisfactory completion rate with rapid gastric transit after bariatric and gastric surgery. There were no capsule retention events. Given this and the favorable risk and cost profile, oral ingestion should be favored over endoscopic placement in this patient population.

  15. Single crystal synthesis and magnetism of the BaLn2O4 family (Ln = lanthanide)

    DOE PAGES

    Besara, Tiglet; Lundberg, Matthew S.; Sun, Jifeng; ...

    2014-05-27

    The series of compounds in the BaLn2O4 family (Ln = La–Lu, Y) has been synthesized for the first time in single crystalline form, using a molten metal flux. The series crystallizes in the CaV2O4 structure type with primitive orthorhombic symmetry (space group Pnma, #62), and a complete structural study of atomic positions, bonds, angles, and distortions across the lanthanide series is presented. With the exception of the Y, La, Eu, and Lu members, magnetic susceptibility measurements were performed between 2 K and 300 K. BaCe2O4 and BaYb2O4 display large crystal field effects and suppression of magnetic ordering. As a result, all compounds show signs of magnetic frustration due to the trigonal arrangements of the trivalent lanthanide cations in the structure.

  16. Decreasing triage time: effects of implementing a step-wise ESI algorithm in an EHR.

    PubMed

    Villa, Stephen; Weber, Ellen J; Polevoi, Steven; Fee, Christopher; Maruoka, Andrew; Quon, Tina

    2018-06-01

    To determine if adapting a widely-used triage scale into a computerized algorithm in an electronic health record (EHR) shortens emergency department (ED) triage time. Before-and-after quasi-experimental study. Urban, tertiary care hospital ED. Consecutive adult patient visits between July 2011 and June 2013. A step-wise algorithm, based on the Emergency Severity Index (ESI-5) was programmed into the triage module of a commercial EHR. Duration of triage (triage interval) for all patients and change in percentage of high acuity patients (ESI 1 and 2) completing triage within 15 min, 12 months before-and-after implementation of the algorithm. Multivariable analysis adjusted for confounders; interrupted time series demonstrated effects over time. Secondary outcomes examined quality metrics and patient flow. About 32 546 patient visits before and 33 032 after the intervention were included. Post-intervention patients were slightly older, census was higher and admission rate slightly increased. Median triage interval was 5.92 min (interquartile ranges, IQR 4.2-8.73) before and 2.8 min (IQR 1.88-4.23) after the intervention (P < 0.001). Adjusted mean triage interval decreased 3.4 min (95% CI: -3.6, -3.2). The proportion of high acuity patients completing triage within 15 min increased from 63.9% (95% CI 62.5, 65.2%) to 75.0% (95% CI 73.8, 76.1). Monthly time series demonstrated immediate and sustained improvement following the intervention. Return visits within 72 h and door-to-balloon time were unchanged. Total length of stay was similar. The computerized triage scale improved speed of triage, allowing more high acuity patients to be seen within recommended timeframes, without notable impact on quality.
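
    The step-wise logic behind the ESI can be sketched as below; this is a simplified illustration of the published algorithm, not the commercial EHR module evaluated in the study:

      # Simplified ESI decision points (A-D); illustrative only.
      def esi_level(needs_lifesaving, high_risk, resources, danger_zone_vitals):
          if needs_lifesaving:                  # A: immediate life-saving intervention required
              return 1
          if high_risk:                         # B: high-risk situation, confusion, severe pain/distress
              return 2
          if resources >= 2:                    # C: two or more resources anticipated
              return 2 if danger_zone_vitals else 3   # D: danger-zone vital signs upgrade to ESI 2
          return 4 if resources == 1 else 5

      print(esi_level(False, False, resources=3, danger_zone_vitals=False))  # -> 3
      print(esi_level(False, False, resources=3, danger_zone_vitals=True))   # -> 2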

  17. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as the "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition layer series. The final extension is obtained by summing the extensions of the individual layers. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, artificial neural network methods (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate and is found to be better than the other four methods compared. The theory employed and the approach developed here can be applied to the extension of data in other areas as well.
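
    A minimal sketch of the layer-wise extension idea: each series is split into wavelet components, a per-layer model is fitted on the overlap with a long reference series, and the layer extensions are summed. PyWavelets is assumed to be available, an ordinary linear regression stands in for the cloud model (which is not specified here), and all data are synthetic.

      import numpy as np
      import pywt
      from sklearn.linear_model import LinearRegression

      def wavelet_components(x, wavelet="db4", level=3):
          """Split a series into per-level components that sum back to the series."""
          coeffs = pywt.wavedec(x, wavelet, level=level)
          comps = []
          for i in range(len(coeffs)):
              kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
              comps.append(pywt.waverec(kept, wavelet)[: len(x)])
          return comps    # approximation layer plus detail layers

      rng = np.random.default_rng(0)
      t = np.arange(600)
      ref = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=600)   # long reference series
      target = 0.8 * ref[:400] + 0.05 * rng.normal(size=400)          # shorter series to be extended

      extension = np.zeros(200)
      for comp_ref, comp_tar in zip(wavelet_components(ref), wavelet_components(target)):
          layer_model = LinearRegression().fit(comp_ref[:400, None], comp_tar)   # per-layer model
          extension += layer_model.predict(comp_ref[400:, None])                 # summed layer extensions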

  18. Size of clinical trials and introductory prices of prophylactic vaccine series

    PubMed Central

    Weinberg, Steven H.; Butchart, Amy T.; Davis, Matthew M.

    2012-01-01

    Costs of completing the recommended immunization schedule have increased over the last decade. Access to prophylactic vaccines may become limited due to financing obstacles within current delivery systems. Vaccine prices reflect research and development expenses incurred by vaccine manufacturers, including costs associated with evaluating candidate vaccines in human subjects. If the number of subjects in clinical trials is increasing over time and associated with vaccine price, this may help explain increases in prices of vaccine series. We examined whether: (A) the initial public- and private-sector prices for recommended prophylactic vaccine series licensed and recommended in the US increased from 2000–2011, (B) the number of human subjects per licensed vaccine increased during the time period, and (C) the number of human subjects was associated with the initial public- and private-sector prices of the vaccine series. In regression analyses of 13 vaccines, approval year was not significantly associated with the number of human subjects, initial public-sector prices, or initial private-sector prices. While the number of phase II subjects was not significantly associated with prices, the numbers of phase III and combined late phase (phases II + III) subjects were significantly associated with initial public- and private-sector series prices (p < 0.05). The association between number of subjects and initial prices demonstrated diminishing marginal increases in price with increasing numbers of subjects. These findings may help guide the number of subjects required by the FDA in clinical trials, in order to reduce expenses for manufacturers and thereby help mitigate increases in initial vaccine series prices. PMID:22854668

  19. Size of clinical trials and introductory prices of prophylactic vaccine series.

    PubMed

    Weinberg, Steven H; Butchart, Amy T; Davis, Matthew M

    2012-08-01

    Costs of completing the recommended immunization schedule have increased over the last decade. Access to prophylactic vaccines may become limited due to financing obstacles within current delivery systems. Vaccine prices reflect research and development expenses incurred by vaccine manufacturers, including costs associated with evaluating candidate vaccines in human subjects. If the number of subjects in clinical trials is increasing over time and associated with vaccine price, this may help explain increases in prices of vaccine series. We examined whether: (A) the initial public- and private-sector prices for recommended prophylactic vaccine series licensed and recommended in the US increased from 2000-2011, (B) the number of human subjects per licensed vaccine increased during the time period, and (C) the number of human subjects was associated with the initial public- and private-sector prices of the vaccine series. In regression analyses of 13 vaccines, approval year was not significantly associated with the number of human subjects, initial public-sector prices, or initial private-sector prices. While the number of phase II subjects was not significantly associated with prices, the numbers of phase III and combined late phase (phases II + III) subjects were significantly associated with initial public- and private-sector series prices (p < 0.05). The association between number of subjects and initial prices demonstrated diminishing marginal increases in price with increasing numbers of subjects. These findings may help guide the number of subjects required by the FDA in clinical trials, in order to reduce expenses for manufacturers and thereby help mitigate increases in initial vaccine series prices.

  20. Naval War College Review. Winter 1988

    DTIC Science & Technology

    1988-01-01

    great many Americans to see as the culprits in the latest series of White House shenanigans two distinguished military officers on active duty...ination of Atlas and Titan missiles (ICBMs) from the SAC inventory for financial reasons. This completely ignores the military's cognizance of...connection to financial, commercial, and maritime interests. Most importantly, the authors, by examining the early stages of

  1. Area Handbook Series: Chad, A Country Study

    DTIC Science & Technology

    1988-12-01

    northern and central interests. As disaffection in these regions increased, in the late 1960s dissident groups formed an antigovernment coalition, the...decades-might finally ensue. December 13, 1988. After the research for this book was completed, several events occurred that greatly affected ...Chad's geographic position along major trans-Saharan trade routes has also affected its historical development. In early times, trade consisted of

  2. Paraquat and pine trees in east Tennessee. Progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnell, R.L.

    The Tennessee Valley Authority started a series of 8% Paraquat tests in east Tennessee on shortleaf, Virginia, and loblolly pines in the spring of 1974. In addition to species, the effects of season of application and of the length of time between the completed treatment and the harvest cut are also being tested. Wood samples are being analyzed by the Botany Department at the University of Tennessee in Knoxville.

  3. Etanercept therapy for toxic epidermal necrolysis.

    PubMed

    Paradisi, Andrea; Abeni, Damiano; Bergamo, Fabio; Ricci, Francesco; Didona, Dario; Didona, Biagio

    2014-08-01

    Toxic epidermal necrolysis (TEN) is a severe and potentially lethal drug reaction for which no standard treatment is available. To describe a case series of patients with TEN treated with a single dose of etanercept. We observed 10 consecutive patients with TEN. For each patient, we recorded the presence of comorbidities and all the drugs recently started (ie, in the last month). In all cases, 50 mg of etanercept was administered in a single subcutaneous injection. The clinical severity of disease was computed using the SCORe of Toxic Epidermal Necrosis (SCORTEN) scale. Using the probabilities of death linked to each level of SCORTEN score, we calculated the expected probability of death in our patients. Healing was defined as complete reepithelialization, and a time to healing curve was then obtained using the Kaplan-Meier method. All patients promptly responded to treatment, reaching complete reepithelialization without complications or side effects. The median time to healing was 8.5 days. This is a small, uncontrolled case series. These preliminary results suggest the possibility that tumor necrosis factor-alfa may be an effective target for control of TEN, a dangerous skin condition for which no effective cure has yet been found. Copyright © 2014 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
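
    A minimal sketch of the Kaplan-Meier time-to-healing step described above, using the lifelines package; the durations and event indicators are hypothetical placeholders, not the patients' data.

      from lifelines import KaplanMeierFitter

      days_to_reepithelialization = [6, 7, 8, 8, 9, 9, 10, 11, 12, 14]   # hypothetical durations
      healing_observed = [1] * 10                                        # 1 = event observed (no censoring)

      kmf = KaplanMeierFitter()
      kmf.fit(days_to_reepithelialization, event_observed=healing_observed, label="TEN, etanercept")
      print(kmf.median_survival_time_)      # median time to complete reepithelialization
      print(kmf.survival_function_.head())  # proportion not yet healed at each time point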

  4. Hierarchical structure of the energy landscape of proteins revisited by time series analysis. I. Mimicking protein dynamics in different time scales

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra

    2005-10-01

    Time series models, which are constructed from the projections of the molecular-dynamics (MD) runs on principal components (modes), are used to mimic the dynamics of two proteins: tendamistat and immunity protein of colicin E7 (ImmE7). Four independent MD runs of tendamistat and three independent runs of ImmE7 protein in vacuum are used to investigate the energy landscapes of these proteins. It is found that mean-square displacements of residues along the modes in different time scales can be mimicked by time series models, which are utilized in dividing protein dynamics into different regimes with respect to the dominating motion type. The first two regimes constitute the dominance of intraminimum motions during the first 5 ps and the random walk motion in a hierarchically higher-level energy minimum, which comprise the initial time period of the trajectories up to 20-40 ps for tendamistat and 80-120 ps for ImmE7. These are also the time ranges within which the linear nonstationary time series are completely satisfactory in explaining protein dynamics. Encountering energy barriers enclosing higher-level energy minima constrains the random walk motion of the proteins, and pseudorelaxation processes at different levels of minima are detected in tendamistat, depending on the sampling window size. Correlation (relaxation) times of 30-40 ps and 150-200 ps are detected for two energy envelopes of successive levels for tendamistat, which gives an overall idea about the hierarchical structure of the energy landscape. However, it should be stressed that correlation times of the modes are highly variable with respect to conformational subspaces and sampling window sizes, indicating the absence of an actual relaxation. The random-walk step sizes and the time length of the second regime are used to illuminate an important difference between the dynamics of the two proteins, which cannot be clarified by the investigation of relaxation times alone: ImmE7 has lower-energy barriers enclosing the higher-level energy minimum, preventing the protein from relaxing and letting it move in a random-walk fashion for a longer period of time.

  5. Time since maximum of Brownian motion and asymmetric Lévy processes

    NASA Astrophysics Data System (ADS)

    Martin, R. J.; Kearney, M. J.

    2018-07-01

    Motivated by recent studies of record statistics in relation to strongly correlated time series, we consider explicitly the drawdown time of a Lévy process, which is defined as the time since it last achieved its running maximum when observed over a fixed time period T. We show that the density function of this drawdown time, in the case of a completely asymmetric jump process, may be factored as a function of t multiplied by a function of T - t. This extends a known result for the case of pure Brownian motion. We state the factors explicitly for the cases of exponential down-jumps with drift, and for the downward inverse Gaussian Lévy process with drift.
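
    For the pure Brownian case mentioned above, the classical arcsine law already exhibits this factorization explicitly: writing rho_T(t) for the density of the drawdown time t on the interval [0, T],

      \rho_T(t) \;=\; \frac{1}{\pi\sqrt{t\,(T-t)}}
                \;=\; \frac{1}{\pi\sqrt{t}}\cdot\frac{1}{\sqrt{T-t}},
      \qquad 0 < t < T,

    so the density is indeed a function of t multiplied by a function of T - t, as in the general result.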

  6. Powerful lineup

    NASA Image and Video Library

    2012-10-11

    Two J-2X engines and a powerpack, developed for NASA by Pratt and Whitney Rocketdyne, sit side-by-side Oct. 11 at Stennis Space Center as work continues on the Space Launch System. Engine 10001 (far left) has been removed from the A-2 Test Stand after being hot-fire tested 21 times, for a total of 2,697 seconds. The engine is now undergoing a series of post-test inspections. A J-2X powerpack (center) has been removed from the A-1 Test Stand to receive additional instrumentation. So far, the powerpack has been hot-fire tested 10 times, for a total of 4,162 seconds. Meanwhile, assembly on the second J-2X engine, known as Engine 10002 and located to the far right, has begun in earnest, with engine completion scheduled for this November. Engine 10002 is about 15 percent complete.

  7. Changes in eating pathology and associated symptoms among chronically ill adults attending a brief psychoeducational group.

    PubMed

    von Ranson, Kristin M; Stevenson, Andrea S; Cannon, Colleen K; Shah, Wendy

    2010-08-01

    Two quasi-experimental pilot studies examined eating pathology, eating self-efficacy, shame, guilt, and pride in adults with chronic illness before and after participating in brief cognitive-behavioral psychoeducational groups addressing eating concerns. In Study 1, 60 adults completed assessments before and after a series of two groups; in Study 2, 21 adults also completed an assessment five weeks prior to the first group to identify time-related changes in symptoms. Study 1 participants improved across domains, whereas Study 2 analyses also examining time-related changes showed improvements in eating self-efficacy, shame, guilt, and pride, but not in eating pathology. Psychoeducational groups may help improve symptoms including eating pathology, eating self-efficacy, shame, guilt, and pride among chronically-ill adults with eating concerns. 2009 Elsevier Ltd. All rights reserved.

  8. Satellite Ocean Color: Present Status, Future Challenges

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.; McClain, Charles R.; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    We are midway into our 5th consecutive year of nearly continuous, high quality ocean color observations from space. The Ocean Color and Temperature Scanner/Polarization and Directionality of the Earth's Reflectances (OCTS/POLDER: Nov. 1996 - Jun. 1997), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS: Sep. 1997 - present), and now the Moderate Resolution Imaging Spectroradiometer (MODIS: Sep. 2000 - present) have provided and are providing unprecedented views of chlorophyll dynamics on global scales. Global synoptic views of ocean chlorophyll were once a fantasy for ocean color scientists. It took nearly the entire 8-year lifetime of limited Coastal Zone Color Scanner (CZCS) observations to compile seasonal climatologies. Now SeaWiFS produces comparably complete fields in about 8 days. For the first time, scientists may observe spatial and temporal variability never before seen in a synoptic context. Even more exciting, we are beginning to plausibly ask questions of interannual variability. We stand at the beginning of long-term time series of ocean color, from which we may begin to ask questions of interdecadal variability and climate change. These are the scientific questions being addressed by users of the 18-year Advanced Very High Resolution Radiometer time series with respect to terrestrial processes and ocean temperatures. The nearly 5-year time series of ocean color observations now being constructed, with possibilities of continued observations, can put us at comparable standing with our terrestrial and physical oceanographic colleagues, and enable us to understand how ocean biological processes contribute to, and are affected by, global climate change.

  9. An open-database of Grape Harvest dates for climate research: data description and quality assessment

    NASA Astrophysics Data System (ADS)

    Daux, V.; Garcia de Cortazar-Atauri, I.; Yiou, P.; Chuine, I.; Garnier, E.; Ladurie, E. Le Roy; Mestre, O.; Tardaguila, J.

    2011-11-01

    We present a dataset of grape harvest dates (GHD) series that has been compiled from international and non-translated French and Spanish literature and from unpublished documentary sources from public organizations and from wine-growers. As of June 2011, this GHD dataset comprises 378 series mainly from France (93% of the data) as well as series from Switzerland, Italy, Spain and Luxembourg. The series have variable length and contain gaps of variable sizes. The longest and most complete ones are from Burgundy, Switzerland, Southern Rhône valley, Jura and Ile-de-France. The GHD series were grouped into 27 regions according to their location, to geomorphological and geological criteria, and to past and present grape varieties. The GHD regional composite series (GHD-RCS) were calculated and compared pairwise to assess the quality of the series. Significant (p-value < 0.001) and strong correlations exist between most of them. As expected, the correlations tended to be higher when the vineyards are closer, the highest correlation (R = 0.91) being obtained between the High Loire Valley and the Ile-de-France GHD-RCS. The strong dependence of vine cycle on temperature and, therefore, the strong link between GHD and the temperature of the growing season was also used to test the quality of the GHD series. The strongest correlations are obtained between the GHD-RCS and the temperature series of the nearest weather stations. Moreover, the GHD-RCS/temperature correlation maps show spatial patterns similar to temperature correlation maps. The stability of the correlations over time is explored. The most striking feature is their generalized deterioration at the late 19th-early 20th turning point. The possible effects on the GHD of the phylloxera crisis, which took place at this time, are discussed. The median of the standardized GHD-RCS was calculated. The distribution of the extreme years of this general synthetic series is not homogenous. Extremely late years all occur during a two-century long time-window from the early 17th to the early 19th century, while extremely early years are frequent during the 16th and since the mid-19th century. The dataset is made accessible for climate research through the Internet. It should allow a variety of climate studies, including reconstructions of atmospheric circulation over Western Europe.

  10. Minding the immunization gap: family characteristics associated with completion rates in rural Ethiopia.

    PubMed

    Sullivan, Mary-Christine; Tegegn, Ayalew; Tessema, Fasil; Galea, Sandro; Hadley, Craig

    2010-02-01

    To examine risk factors for lack of immunization, we tested the impact of maternal, paternal, and household variables on child immunization status in children ≥1 year in a rural area of Ethiopia. Data collected by face-to-face interview on maternal, paternal, household and child variables from a cross-sectional, random-sample, community-based study on health and well-being in rural Ethiopia were used to test hypotheses on immunization status of children (n = 924). Bivariate and multivariate logistic regression models were used for two immunization outcomes: record of at least one vaccination, and record of DPT3, indicating completion of the DPT series. Complete data were available for 924 children ≥1 year, of which 79% had at least one vaccination. Of those, 64% had DPT3/Polio3, below the recommended coverage level. Children were more likely to be vaccinated if the mother reported antenatal care (ANC), and less likely to be vaccinated if the mother had a history of stillbirth or no opinion of the health center. Children were more likely to have DPT3 if the mother had ≥1 year of education, if the mother reported ANC, or if the father was older. Children were less likely to have DPT3 in households with food insecurity and no maternal opinion of the health center. The study had three findings with implications for immunization programming: (1) Mothers' completion of the recommended ANC visits is strongly associated with receiving at least one vaccination and with completing a vaccination series; (2) Maternal education is associated with a completed vaccination series; (3) Paternal characteristics may affect vaccination series completion.
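
    A minimal sketch of the multivariable logistic model for DPT3 completion; the column names and the synthetic data frame are illustrative stand-ins for the survey variables described above.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 924
      df = pd.DataFrame({
          "dpt3_complete": rng.integers(0, 2, n),
          "anc_reported": rng.integers(0, 2, n),         # mother reported antenatal care
          "maternal_education": rng.integers(0, 2, n),   # >= 1 year of schooling
          "food_insecure": rng.integers(0, 2, n),
          "paternal_age": rng.normal(35, 8, n),
      })
      fit = smf.logit(
          "dpt3_complete ~ anc_reported + maternal_education + food_insecure + paternal_age",
          data=df,
      ).fit(disp=0)
      print(np.exp(fit.params))   # odds ratios for completing the DPT series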

  11. Radiation safety in the cardiac catheterization lab: A time series quality improvement initiative.

    PubMed

    Abuzeid, Wael; Abunassar, Joseph; Leis, Jerome A; Tang, Vicky; Wong, Brian; Ko, Dennis T; Wijeysundera, Harindra C

    Interventional cardiologists have one of the highest annual radiation exposures, yet systems of care that promote radiation safety in cardiac catheterization labs are lacking. This study sought to reduce the frequency of radiation exposure, for PCI procedures, above 1.5 Gy in labs utilizing a Phillips system at our local institution by 40%, over a 12-month period. We performed a time series study to assess the impact of different interventions on the frequency of radiation exposure above 1.5 Gy. Process measures were the percent of procedures where collimation and magnification were used and the percent of completion of online educational modules. Balancing measures were the mean number of cases performed and mean fluoroscopy time. Information sessions, online modules, policies, and posters were implemented, followed by the introduction of a new lab with novel software (AlluraClarity©) to reduce radiation dose. There was a significant reduction (91%, p<0.05) in the frequency of radiation exposure above 1.5 Gy after utilizing the novel software (AlluraClarity©) in a new Phillips lab. Process measures of use of collimation (95.0% to 98.0%), use of magnification (20.0% to 14.0%) and completion of online modules (62%) helped track implementation. The mean number of cases performed and mean fluoroscopy time did not change significantly. While educational strategies had limited impact on reducing radiation exposure, implementing a novel software system provided the most effective means of reducing radiation exposure. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  12. Multiscale characterization and prediction of monsoon rainfall in India using Hilbert-Huang transform and time-dependent intrinsic correlation analysis

    NASA Astrophysics Data System (ADS)

    Adarsh, S.; Reddy, M. Janga

    2017-07-01

    In this paper, the Hilbert-Huang transform (HHT) approach is used for the multiscale characterization of All India Summer Monsoon Rainfall (AISMR) time series and monsoon rainfall time series from five homogeneous regions in India. The study employs the Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) for multiscale decomposition of monsoon rainfall in India and uses the Normalized Hilbert Transform and Direct Quadrature (NHT-DQ) scheme for the time-frequency characterization. The cross-correlation analysis between orthogonal modes of All India monthly monsoon rainfall time series and that of five climate indices such as Quasi Biennial Oscillation (QBO), El Niño Southern Oscillation (ENSO), Sunspot Number (SN), Atlantic Multi Decadal Oscillation (AMO), and Equatorial Indian Ocean Oscillation (EQUINOO) in the time domain showed that the links of different climate indices with monsoon rainfall are expressed well only for few low-frequency modes and for the trend component. Furthermore, this paper investigated the hydro-climatic teleconnection of ISMR in multiple time scales using the HHT-based running correlation analysis technique called time-dependent intrinsic correlation (TDIC). The results showed that both the strength and nature of association between different climate indices and ISMR vary with time scale. Stemming from this finding, a methodology employing Multivariate extension of EMD and Stepwise Linear Regression (MEMD-SLR) is proposed for prediction of monsoon rainfall in India. The proposed MEMD-SLR method clearly exhibited superior performance over the IMD operational forecast, M5 Model Tree (MT), and multiple linear regression methods in ISMR predictions and displayed excellent predictive skill during 1989-2012 including the four extreme events that have occurred during this period.
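
    A minimal sketch of the mode-wise cross-correlation step: the intrinsic mode functions are assumed to come from any CEEMDAN implementation (for example the PyEMD package), and random arrays stand in for the monsoon modes and the climate index.

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      n_months = 600
      rainfall_imfs = rng.normal(size=(6, n_months))   # placeholder IMFs plus residual trend
      enso_index = rng.normal(size=n_months)           # placeholder monthly climate index

      for k, imf in enumerate(rainfall_imfs, start=1):
          r, p = pearsonr(imf, enso_index)
          print(f"IMF {k}: r = {r:+.2f}, p = {p:.3f}")  # links are expected mainly in low-frequency modes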

  13. Adolescent Immunization Coverage and Implementation of New School Requirements in Michigan, 2010

    PubMed Central

    DeVita, Stefanie F.; Vranesich, Patricia A.; Boulton, Matthew L.

    2014-01-01

    Objectives. We examined the effect of Michigan’s new school rules and vaccine coadministration on time to completion of all the school-required vaccine series, the individual adolescent vaccines newly required for sixth grade in 2010, and initiation of the human papillomavirus (HPV) vaccine series, which was recommended but not required for girls. Methods. Data were derived from the Michigan Care Improvement Registry, a statewide Immunization Information System. We assessed the immunization status of Michigan children enrolled in sixth grade in 2009 or 2010. We used univariable and multivariable Cox regression models to identify significant associations between each factor and school completeness. Results. Enrollment in sixth grade in 2010 and coadministration of adolescent vaccines at the first adolescent visit were significantly associated with completion of the vaccines required for Michigan’s sixth graders. Children enrolled in sixth grade in 2010 had higher coverage with the newly required adolescent vaccines by age 13 years than did sixth graders in 2009, but there was little difference in the rate of HPV vaccine initiation among girls. Conclusions. Education and outreach efforts, particularly regarding the importance and benefits of coadministration of all recommended vaccines in adolescents, should be directed toward health care providers, parents, and adolescents. PMID:24922144

  14. Recurrent Vestibular Migraine Vertigo Attacks Associated With the Development of Profound Bilateral Vestibulopathy: A Case Series.

    PubMed

    Wester, Jacob L; Ishiyama, Akira; Ishiyama, Gail

    2017-09-01

    Bilateral vestibulopathy (BVP) is a debilitating condition characterized by gait ataxia, oscillopsia, and imbalance. Case series of patients with migraine-linked vertigo spells and profound BVP. PATIENT 1: A 69-year-old man presented with a history of recurrent severe vertigo spells lasting up to 3 days in duration associated with prostrating migraine headaches starting at age 60. His symptoms were misdiagnosed as an anxiety syndrome. At age 68, electronystagmography (ENG) revealed bilaterally absent caloric responses and complete BVP. His hearing was normal. PATIENT 2: A 51-year-old man presented with a history of "earthquake-like" vertigo, sharp head pain, and phonophobia. These episodes occurred a handful of times over a 7-year period. Previous ENG testing at age 43 was normal. However, his ENG at age 48 revealed complete BVP. He was started on acetazolamide and noted improved balance, although subsequent ENG was unchanged. PATIENT 3: A 49-year-old woman presented with a history of recurrent migraines with visual aura associated with vertigo lasting 1 hour. ENG at age 50 revealed complete BVP. Subjectively, she noted improved balance with acetazolamide and subsequent ENG demonstrated mild improvement. PATIENT 4: A 43-year-old man presented with a 5-year history of optical migraines and recurrent vertigo spells, lasting 30 seconds, which was misdiagnosed as positional vertigo. He additionally had a 10-year history of oscillopsia. ENG at age 61 revealed complete BVP. In these cases, vestibular migraine was linked to recurrent vertigo spells that eventually led to complete bilateral vestibulopathy.

  15. Spatiotemporal topology and temporal sequence identification with an adaptive time-delay neural network

    NASA Astrophysics Data System (ADS)

    Lin, Daw-Tung; Ligomenides, Panos A.; Dayhoff, Judith E.

    1993-08-01

    Inspired by the time delays that occur in neurobiological signal transmission, we describe an adaptive time delay neural network (ATNN) which is a powerful dynamic learning technique for spatiotemporal pattern transformation and temporal sequence identification. The dynamic properties of this network are formulated through the adaptation of time-delays and synapse weights, which are adjusted on-line based on gradient descent rules according to the evolution of observed inputs and outputs. We have applied the ATNN to examples that possess spatiotemporal complexity, with temporal sequences that are completed by the network. The ATNN can also be applied to pattern completion. Simulation results show that the ATNN learns the topology of circular and figure-eight trajectories within 500 on-line training iterations, and reproduces the trajectories dynamically with very high accuracy. The ATNN was also trained to model the Fourier series expansion of the sum of different odd harmonics. The resulting network provides more flexibility and efficiency than the TDNN and allows the network to seek optimal values for time-delays as well as optimal synapse weights.

  16. Why are U.S. girls getting meningococcal but not human papilloma virus vaccines? Comparison of factors associated with human papilloma virus and meningococcal vaccination among adolescent girls 2008 to 2012.

    PubMed

    Perkins, Rebecca B; Lin, Mengyun; Silliman, Rebecca A; Clark, Jack A; Hanchate, Amresh

    2015-01-01

    Human papilloma virus (HPV) vaccination rates in the United States remain low, compared with other recommended adolescent vaccines. We compared factors associated with intention to receive and receipt of HPV and meningococcal vaccines and completion of the HPV vaccine series among U.S. adolescent girls. Secondary analysis of data from the National Immunization Survey-Teen for 2008 through 2012 was performed. Multivariable logistic modeling was used to determine factors associated with intent to receive and receipt of HPV and meningococcal vaccination, completion of the HPV vaccine series among girls who started the series, and receipt of HPV vaccination among girls who received meningococcal vaccination. Provider recommendation increased the odds of receipt and intention to receive both HPV and meningococcal vaccines. Provider recommendation was also associated with a three-fold increase in HPV vaccination among girls who received meningococcal vaccination (p<.001), indicating a relationship between provider recommendation and missed vaccine opportunities. However, White girls were 10% more likely to report provider recommendation than Black or Hispanic girls (p<.01), yet did not have higher vaccination rates, implying a role for parental refusal. No factors predicted consistently the completion of the HPV vaccine series among those who started. Improving provider recommendation for co-administration of HPV and meningococcal vaccines would reduce missed opportunities for initiating the HPV vaccine series. However, different interventions may be necessary to improve series completion. Copyright © 2015 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  17. Direct determination of geocenter motion by combining SLR, VLBI, GNSS, and DORIS time series

    NASA Astrophysics Data System (ADS)

    Wu, X.; Abbondanza, C.; Altamimi, Z.; Chin, T. M.; Collilieux, X.; Gross, R. S.; Heflin, M. B.; Jiang, Y.; Parker, J. W.

    2013-12-01

    The longest-wavelength surface mass transport includes three degree-one spherical harmonic components involving hemispherical mass exchanges. The mass load causes geocenter motion between the center-of-mass of the total Earth system (CM) and the center-of-figure of the solid Earth surface (CF), and deforms the solid Earth. Estimation of the degree-1 surface mass changes through CM-CF and degree-1 deformation signatures from space geodetic techniques can thus complement GRACE's time-variable gravity data to form a complete change spectrum up to a high resolution. Currently, SLR is considered the most accurate technique for direct geocenter motion determination. By tracking satellite motion from ground stations, SLR determines the motion between CM and the geometric center of its ground network (CN). This motion is then used to approximate CM-CF and, subsequently, to derive degree-1 mass changes. However, the SLR network is very sparse and uneven in global distribution. The average number of operational tracking stations has been about 20 in recent years. The poor network geometry can have a large CN-CF motion and is not ideal for the determination of CM-CF motion and degree-1 mass changes. We recently realized an experimental Terrestrial Reference Frame (TRF) through station time series using the Kalman filter and the RTS smoother. The TRF has its origin defined at nearly instantaneous CM using weekly SLR measurement time series. VLBI, GNSS and DORIS time series are combined weekly with those of SLR and tied to the geocentric (CM) reference frame through local tie measurements and co-motion constraints on co-located geodetic stations. The unified geocentric time series of the four geodetic techniques provide a much better network geometry for direct geodetic determination of geocenter motion. Results from this direct approach using a 90-station network compare favorably with those obtained from joint inversions of GPS/GRACE data and ocean bottom pressure models. We will also show that a previously identified discrepancy in the X-component between direct SLR orbit-tracking and inversely determined geocenter motions is largely reconciled with the new unified network.

  18. GRACE RL03-v2 monthly time series of solutions from CNES/GRGS

    NASA Astrophysics Data System (ADS)

    Lemoine, Jean-Michel; Bourgogne, Stéphane; Bruinsma, Sean; Gégout, Pascal; Reinquin, Franck; Biancale, Richard

    2015-04-01

    Based on GRACE GPS and KBR Level-1B.v2 data, as well as on LAGEOS-1/2 SLR data, CNES/GRGS published in 2014 the third full re-iteration of its GRACE gravity field solutions. This monthly time series of solutions, named RL03-v1, complete to spherical harmonic degree/order 80, has displayed interesting performance in terms of spatial resolution and signal amplitude compared to JPL/GFZ/CSR RL05. This is due to a careful selection of the background models (FES2014 ocean tides, ECMWF ERA-interim (atmosphere) and TUGO (non IB-ocean) "dealiasing" models every 3 hours) and to the choice of an original method for gravity field inversion: truncated SVD. As with the previous CNES/GRGS releases, no additional filtering of the solutions is necessary before using them. Some problems have, however, been identified in CNES/GRGS RL03-v1: - an erroneous mass signal located in two small circular rings close to the Earth's poles, leading to the recommendation not to use RL03-v1 above 82° latitudes North and South; - a weakness in the sectorials due to an excessive downweighting of the GRACE GPS observations. These two problems have been understood and addressed, leading to the computation of a corrected time series of solutions, RL03-v2. The corrective steps have been: - to strengthen the determination of the very low degrees by adding Starlette and Stella SLR data to the normal equations; - to increase the weight of the GRACE GPS observations; - to adopt a two-step approach for the computation of the solutions: first a Cholesky inversion for the low degrees, followed by a truncated SVD solution. The identification of these problems will be discussed and the performance of the new time series evaluated.
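
    A minimal numpy sketch of the truncated-SVD step: an ill-conditioned least-squares system is solved by discarding small singular values; the matrices are random placeholders, not GRACE normal equations.

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(500, 120))                 # design matrix: observations x coefficients
      x_true = rng.normal(size=120)
      y = A @ x_true + 0.01 * rng.normal(size=500)    # noisy observations

      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      k = int(np.sum(s > 1e-3 * s[0]))                # keep singular values above a relative cutoff
      x_tsvd = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])  # truncated-SVD solution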

  19. Using spectrotemporal indices to improve the fruit-tree crop classification accuracy

    NASA Astrophysics Data System (ADS)

    Peña, M. A.; Liao, R.; Brenning, A.

    2017-06-01

    This study assesses the potential of spectrotemporal indices derived from satellite image time series (SITS) to improve the classification accuracy of fruit-tree crops. Six major fruit-tree crop types in the Aconcagua Valley, Chile, were classified by applying various linear discriminant analysis (LDA) techniques on a Landsat-8 time series of nine images corresponding to the 2014-15 growing season. As features we not only used the complete spectral resolution of the SITS, but also all possible normalized difference indices (NDIs) that can be constructed from any two bands of the time series, a novel approach to derive features from SITS. Due to the high dimensionality of this "enhanced" feature set we used the lasso and ridge penalized variants of LDA (PLDA). Although classification accuracies yielded by the standard LDA applied on the full-band SITS were good (misclassification error rate, MER = 0.13), they were further improved by 23% (MER = 0.10) with ridge PLDA using the enhanced feature set. The most important bands to discriminate the crops of interest were mainly concentrated on the first two image dates of the time series, corresponding to the crops' greenup stage. Despite the high predictor weights provided by the red and near infrared bands, typically used to construct greenness spectral indices, other spectral regions were also found important for the discrimination, such as the shortwave infrared band at 2.11-2.19 μm, sensitive to foliar water changes. These findings support the usefulness of spectrotemporal indices in the context of SITS-based crop type classifications, which until now have been mainly constructed by the arithmetic combination of two bands of the same image date in order to derive greenness temporal profiles like those from the normalized difference vegetation index.
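
    A minimal sketch of the feature-construction and classification steps: every pairwise normalized difference index is built from a band-by-date feature table, and scikit-learn's shrinkage LDA stands in for the penalized LDA variants used above; the reflectances and labels are synthetic.

      import numpy as np
      from itertools import combinations
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n_pixels, n_bands, n_dates = 300, 7, 9
      bands = rng.uniform(0.01, 0.6, size=(n_pixels, n_bands * n_dates))   # reflectances over the season
      labels = rng.integers(0, 6, n_pixels)                                # six fruit-tree classes

      # All normalized difference indices that can be formed from any two columns of the time series
      ndis = np.column_stack([
          (bands[:, i] - bands[:, j]) / (bands[:, i] + bands[:, j])
          for i, j in combinations(range(bands.shape[1]), 2)
      ])
      clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(ndis, labels)
      print(clf.score(ndis, labels))    # resubstitution accuracy of the toy example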

  20. SMM X-ray polychromator

    NASA Technical Reports Server (NTRS)

    Strong, Keith T.; Haisch, Bernhard M. (Compiler); Lemen, James R. (Compiler); Acton, L. W.; Bawa, H. S.; Claflin, E. S.; Freeland, S. L.; Slater, G. L.; Kemp, D. L.; Linford, G. A.

    1988-01-01

    The range of observing and analysis programs accomplished with the X-Ray Polychromator (XRP) instruments during the decline of solar cycle 21 and the rise of the solar cycle 22 is summarized. Section 2 describes XRP operations and current status. This is meant as a guide on how the instrument is used to obtain data and what its capabilities are for potential users. The science section contains a series of representative abstracts from recently published papers on major XRP science topics. It is not meant to be a complete list but illustrates the type of science that can come from the analysis of the XRP data. There then follows a series of appendixes that summarize the major data bases that are available. Appendix A is a complete bibliography of papers and presentations produced using XRP data. Appendix B lists all the spectroscopic data accumulated by the Flat Crystal Spectrometer (FCS). Appendix C is a compilation of the XRP flare catalogue for events equivalent to a GOES C-level flare or greater. It lists the start, peak and end times as well as the peak Ca XIX flux.

  1. SMM X-ray polychromator

    NASA Astrophysics Data System (ADS)

    Strong, Keith T.; Haisch, Bernhard M.; Lemen, James R.; Acton, L. W.; Bawa, H. S.; Claflin, E. S.; Freeland, S. L.; Slater, G. L.; Kemp, D. L.; Linford, G. A.

    1988-05-01

    The range of observing and analysis programs accomplished with the X-Ray Polychromator (XRP) instruments during the decline of solar cycle 21 and the rise of the solar cycle 22 is summarized. Section 2 describes XRP operations and current status. This is meant as a guide on how the instrument is used to obtain data and what its capabilities are for potential users. The science section contains a series of representative abstracts from recently published papers on major XRP science topics. It is not meant to be a complete list but illustrates the type of science that can come from the analysis of the XRP data. There then follows a series of appendixes that summarize the major data bases that are available. Appendix A is a complete bibliography of papers and presentations produced using XRP data. Appendix B lists all the spectroscopic data accumulated by the Flat Crystal Spectrometer (FCS). Appendix C is a compilation of the XRP flare catalogue for events equivalent to a GOES C-level flare or greater. It lists the start, peak and end times as well as the peak Ca XIX flux.

  2. Changing Trends in the Clinical Presentation and Management of Complete Hydatidiform Mole Among Brazilian Women.

    PubMed

    Braga, Antonio; Moraes, Valéria; Maestá, Izildinha; Amim Júnior, Joffre; Rezende-Filho, Jorge de; Elias, Kevin; Berkowitz, Ross

    2016-06-01

    The aim of the study was to evaluate potential changes in the clinical, diagnostic, and therapeutic parameters of complete hydatidiform mole in the last 25 years in Brazil. A retrospective cohort study was conducted involving the analysis of 2163 medical records of patients diagnosed with complete hydatidiform mole who received treatment at the Rio de Janeiro Reference Center for Gestational Trophoblastic Disease between January 1988 and December 2012. For the statistical analysis of the natural history of the patients with complete molar pregnancies, time series were evaluated using the Cox-Stuart test and adjusted by linear regression models. A downward linear temporal trend was observed for gestational age of complete hydatidiform mole at diagnosis, which is also reflected in the reduced occurrence of vaginal bleeding, hyperemesis and pre-eclampsia. We also observed an increase in the use of uterine vacuum aspiration to treat molar pregnancy. Although the duration of postmolar follow-up was found to decline, this was not accompanied by any alteration in the time to remission of the disease or its progression to gestational trophoblastic neoplasia. Early diagnosis of complete hydatidiform mole has altered the natural history of molar pregnancy, especially with a reduction in classical clinical symptoms. However, early diagnosis has not resulted in a reduction in the development of gestational trophoblastic neoplasia, a dilemma that still challenges professionals working with gestational trophoblastic disease.
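
    A minimal sketch of a Cox-Stuart trend test of the kind applied to the yearly clinical series above; the input vector is an illustrative placeholder, not the registry data.

      import numpy as np
      from scipy.stats import binomtest

      def cox_stuart_pvalue(series):
          """Two-sided Cox-Stuart test for a monotonic trend."""
          x = np.asarray(series, dtype=float)
          half = len(x) // 2
          diffs = x[-half:] - x[:half]      # pair the first half with the second half
          diffs = diffs[diffs != 0]         # ties are discarded
          n_pos = int(np.sum(diffs > 0))
          return binomtest(n_pos, n=len(diffs), p=0.5).pvalue

      rng = np.random.default_rng(0)
      gestational_age_at_dx = np.linspace(12, 9, 25) + rng.normal(0, 0.4, 25)   # synthetic yearly means
      print(cox_stuart_pvalue(gestational_age_at_dx))   # a small p-value indicates a downward trend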

  3. Medium term outcome of bipolar plasma vaporization in prostate cancer patients--a palliative modality of preserving spontaneous voiding.

    PubMed

    Geavlete, B; Moldoveanu, C; Niţă, Gh; Stănescu, F; Jecu, M; Geavlete, P

    2012-12-15

    This retrospective analysis evaluated the efficiency, safety, and medium term postoperative results of bipolar plasma vaporization (BPV) in prostate cancer (PCa) cases associating complete urinary retention. A series of 40 patients diagnosed with locally advanced or metastatic PCa and complete urinary retention requiring a Foley catheter indwelling underwent BPV aiming to restore spontaneous voiding. A total of 35 patients completed the one year evaluation protocol consisting of International Prostate Symptom Score (IPSS), quality of life score (QoL), maximum flow rate (Q(max)) and post-voiding residual urinary volume (PVR), measured at 1, 3, 6 and 12 months after surgery. BPV was successfully performed in all cases with satisfactory efficiency, as confirmed by the mean operation time (42.8 minutes) and hemoglobin drop (0.7 g/dl). A fast and safe postoperative recovery period was described in this series (hematuria rate--7.5%; mean catheterization period--36 hours; mean hospital stay--2.5 days; early-irritative symptoms' rate--15%). At 1, 3, 6 and 12 months, satisfactory values were determined in terms of IPSS, Qmax, QoL and PVR. These parameters emphasized a stable evolution throughout the entire follow-up, as 88.6% of the patients maintained spontaneous voiding. The present trial confirmed the plasma-button vaporization as a promising therapeutic approach in PCa cases associating complete urinary retention. The technique displayed good efficacy, low perioperative morbidity, short convalescence, and satisfactory urodynamics and symptom score parameters during the one-year follow-up period.

  4. A new edition of the Mars 1:5,000,000 map series

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Mcewen, Alfred S.; Wu, Sherman S. C.

    1991-01-01

    A new edition of the Mars 1:5,000,000 scale map series is in preparation. Two sheets will be made for each quadrangle. Sheet one will show shaded relief, contours, and nomenclature. Sheet 2 will be a full-color photomosaic prepared on the Mars digital image model (MDIM) base co-registered with the Mars low-resolution color database. The latter will have an abbreviated graticule (latitude/longitude ticks only) and no other line overprint. The four major databases used to assemble this series are now virtually complete. These are: (1) Viking-revised shaded relief maps at 1:5,000,000 scale; (2) contour maps at 1:2,000,000 scale; (3) the Mars digital image model; and (4) a color image mosaic of Mars. Together, these databases form the most complete planetwide cartographic definition of Mars that can be compiled with existing data. The new edition will supersede the published Mars 1:5,000,000 scale maps, including the original shaded relief and topographic maps made primarily with Mariner 9 data and the Viking-revised shaded relief and controlled photomosaic series. Publication of the new series will begin in late 1991 or early 1992, and it should be completed in two years.

  5. Studies of the Earth Energy Budget and Water Cycle Using Satellite Observations and Model Analyses

    NASA Technical Reports Server (NTRS)

    Campbell, G. G.; VonderHarr, T. H.; Randel, D. L.; Kidder, S. Q.

    1997-01-01

    During this research period we have utilized the ERBE data set in comparisons to surface properties and water vapor observations in the atmosphere. A relationship between cloudiness and surface temperature anomalies was found. This same relationship was found in a general circulation model, verifying the model. The attempt to construct a homogeneous time series from Nimbus 6, Nimbus 7 and ERBE data is not complete because we are still waiting for the ERBE reanalysis to be completed. It will be difficult to merge the Nimbus 6 data in because its observations occurred when the average weather was different than the other periods, so regression adjustments are not effective.

  6. Design of a nanoscale time-of-flight sensor and an integrated multiscale module for the point-of-care diagnosis of stroke

    NASA Astrophysics Data System (ADS)

    Andrus, Matthew

    Stroke is a leading cause of death and disability in the United States; however, there remains no rapid diagnostic test for differentiating between ischemic and hemorrhagic stroke within the three-hour treatment window. Here we describe the design of a multiscale microfluidic module with an embedded time-of-flight nanosensor for the clinical diagnosis of stroke. The nanosensor described utilizes two synthetic pores in series, relying on resistive pulse sensing (RPS) to measure the passage of molecules through the time-of-flight tube. Once the nanosensor design was completed, a multiscale module to process patient samples and house the sensors was designed in a similar iterative process. This design utilized pillar arrays, called "pixels," to immobilize oligonucleotides from patient samples so that ligase detection reactions (LDR) can be carried out. COMSOL simulations were performed to understand the operation and behavior of both the nanosensor and the modular chip once the designs were completed.

  7. Are graduate students rational? Evidence from the market for biomedical scientists.

    PubMed

    Blume-Kohout, Margaret E; Clack, John W

    2013-01-01

    The U.S. National Institutes of Health (NIH) budget expansion from 1998 through 2003 increased demand for biomedical research, raising relative wages and total employment in the market for biomedical scientists. However, because research doctorates in biomedical sciences can often take six years or more to complete, the full labor supply response to such changes in market conditions is not immediate, but rather is observed over a period of several years. Economic rational expectations models assume that prospective students anticipate these future changes, and also that students take into account the opportunity costs of their pursuing graduate training. Prior empirical research on student enrollment and degree completions in science and engineering (S&E) fields indicates that "cobweb" expectations prevail: that is, at least in theory, prospective graduate students respond to contemporaneous changes in market wages and employment, but do not forecast further changes that will arise by the time they complete their degrees and enter the labor market. In this article, we analyze time-series data on wages and employment of biomedical scientists versus alternative careers, on completions of S&E bachelor's degrees and biomedical sciences PhDs, and on research expenditures funded both by NIH and by biopharmaceutical firms, to examine the responsiveness of the biomedical sciences labor supply to changes in market conditions. Consistent with previous studies, we find that enrollments and completions in biomedical sciences PhD programs are responsive to market conditions at the time of students' enrollment. More striking, however, is the close correspondence between graduate student enrollments and completions, and changes in availability of NIH-funded traineeships, fellowships, and research assistantships.

  8. Microgravity

    NASA Image and Video Library

    1997-04-01

    Apfel's excellent match: This series of photos shows a water drop containing a surfactant (Triton-100) as it experiences a complete cycle of superoscillation on U.S. Microgravity Lab-2 (USML-2; October 1995). The time in seconds appears under the photos. The figures above the photos are the oscillation shapes predicted by a numerical model. The time shown with the predictions is nondimensional. Robert Apfel (Yale University) used the Drop Physics Module on USML-2 to explore the effect of surfactants on liquid drops. Apfel's research of surfactants may contribute to improvements in a variety of industrial processes, including oil recovery and environmental cleanup.

  9. Paraquat and pine trees in east Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnell, R.L.; Toennisson, R.L.

    The Tennessee Valley Authority started a series of 8% Paraquat tests in east Tennessee on loblolly, shortleaf, and Virginia pines in the spring of 1974. In addition to species, we are also testing the effects of season of treatment application and the length of time between the completed treatment and the harvest cut. Wood samples are being analyzed by the Botany Department at the University of Tennessee in Knoxville. All three species have shown increased oleoresin production. Season of treatment did not have a significant effect on enhancement nor did length of time between treatment and harvest.

  10. The digital darkroom, part 3: digital presentation in plastic surgery.

    PubMed

    Galdino, G M; Chiaramonte, M; Klatsky, S A

    2001-01-01

    We summarize here the third and final part of our series on the Digital Darkroom. In this part, we review the use of digital technology for medical and other presentations, including the kinds of equipment available, the advantages and disadvantages of digital projection, and the most common pitfalls encountered in preparing and presenting material in digital presentations. The full text of the complete series, including expanded illustrative material and complete bibliographic documentation, is now available at our journal web site at . Please see page 39 for instructions on how to access Aesthetic Surgery Journal Online and view the entire series.

  11. Nursing Case Management, Peer Coaching, and Hepatitis A and B Vaccine Completion Among Homeless Men Recently Released on Parole: Randomized Clinical Trial

    PubMed Central

    Nyamathi, Adeline; Salem, Benissa E.; Zhang, Sheldon; Farabee, David; Hall, Betsy; Khalilifard, Farinaz; Leake, Barbara

    2015-01-01

    Background Although hepatitis A virus (HAV) and hepatitis B virus (HBV) infections are vaccine-preventable diseases, few homeless parolees coming out of prisons and jails have received the hepatitis A and B vaccination series. Objectives The study focused on completion of the HAV and HBV vaccine series among homeless men on parole. The efficacy of three levels of peer coaching and nurse-delivered interventions was compared at 12-month follow-up: (a) intensive peer coaching and nurse case management (PC-NCM); (b) intensive peer coaching (PC) intervention condition, with minimal nurse involvement; and (c) a usual care (UC) intervention condition, which included minimal PC and nurse involvement. Further, we assessed predictors of vaccine completion among this targeted sample. Methods A randomized control trial was conducted with 600 recently paroled men to assess the impact of the three intervention conditions (PC-NCM vs. PC vs. UC) on reducing drug use and recidivism; of these, 345 seronegative, vaccine-eligible subjects were included in this analysis of completion of the Twinrix HAV/HBV vaccine. Logistic regression was added to assess predictors of completion of the HAV/HBV vaccine series and chi-squared analysis to compare completion rates across the three levels of intervention. Results Vaccine completion rates for the intervention conditions were 75.4% (PC-NCM), 71.8% (PC), and 71.9% (UC) (p = .78). Predictors of vaccine noncompletion included being Asian and Pacific Islander, experiencing high levels of hostility, positive social support, reporting a history of injection drug use, being released early from California prisons, and being admitted for psychiatric illness. Predictors of vaccine series completion included reporting six or more friends, recent cocaine use, and staying in drug treatment for at least 90 days. Discussion Findings allow greater understanding of factors affecting vaccination completion in order to design more effective programs among the high-risk population of men recently released from prison and on parole. PMID:25932697

  12. Nursing case management, peer coaching, and hepatitis a and B vaccine completion among homeless men recently released on parole: randomized clinical trial.

    PubMed

    Nyamathi, Adeline; Salem, Benissa E; Zhang, Sheldon; Farabee, David; Hall, Betsy; Khalilifard, Farinaz; Leake, Barbara

    2015-01-01

    Although hepatitis A virus (HAV) and hepatitis B virus (HBV) infections are vaccine-preventable diseases, few homeless parolees coming out of prisons and jails have received the hepatitis A and B vaccination series. The study focused on completion of the HAV and HBV vaccine series among homeless men on parole. The efficacy of three levels of peer coaching (PC) and nurse-delivered interventions was compared at 12-month follow-up: (a) intensive peer coaching and nurse case management (PC-NCM); (b) intensive PC intervention condition, with minimal nurse involvement; and (c) usual care (UC) intervention condition, which included minimal PC and nurse involvement. Furthermore, we assessed predictors of vaccine completion among this targeted sample. A randomized control trial was conducted with 600 recently paroled men to assess the impact of the three intervention conditions (PC-NCM vs. PC vs. UC) on reducing drug use and recidivism; of these, 345 seronegative, vaccine-eligible subjects were included in this analysis of completion of the Twinrix HAV/HBV vaccine. Logistic regression was added to assess predictors of completion of the HAV/HBV vaccine series and chi-square analysis to compare completion rates across the three levels of intervention. Vaccine completion rates for the intervention conditions were 75.4% (PC-NCM), 71.8% (PC), and 71.9% (UC; p = .78). Predictors of vaccine noncompletion included being Asian and Pacific Islander, experiencing high levels of hostility, positive social support, reporting a history of injection drug use, being released early from California prisons, and being admitted for psychiatric illness. Predictors of vaccine series completion included reporting having six or more friends, recent cocaine use, and staying in drug treatment for at least 90 days. Findings allow greater understanding of factors affecting vaccination completion in order to design more effective programs among the high-risk population of men recently released from prison and on parole.

  13. [Winter wheat area estimation with MODIS-NDVI time series based on parcel].

    PubMed

    Li, Le; Zhang, Jin-shui; Zhu, Wen-quan; Hu, Tan-gao; Hou, Dong

    2011-05-01

    Several attributes of MODIS (Moderate Resolution Imaging Spectroradiometer) data, especially the short revisit intervals and the global coverage, provide an extremely efficient way to map cropland and monitor its seasonal change. However, the reliability of the resulting measurements is challenged by the limited spatial resolution. Parcel data have clear geo-location and explicit cropland boundary information, and the spectral differences and the complexity of mixed pixels are weaker within parcels. These factors make parcel-based area estimation more advantageous than pixel-based estimation. In the present study, winter wheat area estimation based on MODIS-NDVI time series was performed with the support of cultivated land parcels in Tongzhou, Beijing. To extract the regional winter wheat acreage, multiple regression methods were used to model the stable regression relationship between MODIS-NDVI time series data and TM samples within parcels. With this approach, the consistency between the MODIS and TM extraction results stably reaches 96% when the samples account for 15% of the whole area. The results show that the use of parcel data can effectively reduce the recognition errors in MODIS-NDVI time series data caused by the low spatial resolution. Therefore, by combining moderate- and low-resolution data, winter wheat area estimation becomes feasible over large regions that lack complete medium-resolution imagery or whose images are obscured by clouds. The study also provides a preliminary basis for estimating the area of other crops.

  14. Detection of Undocumented Changepoints Using Multiple Test Statistics and Composite Reference Series.

    NASA Astrophysics Data System (ADS)

    Menne, Matthew J.; Williams, Claude N., Jr.

    2005-10-01

    An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from serially complete and homogeneous component series. However, the evaluated composite series are not all equally susceptible to the presence of changepoints in their components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
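
    As a rough illustration of the kind of test discussed above (not the paper's exact statistics), the Python sketch below scans a target-minus-reference difference series for the single most likely undocumented mean shift using a two-sample t statistic at each candidate break; in practice the critical value for the maximum statistic would come from simulation rather than standard t tables.

      import numpy as np

      def most_likely_shift(target, reference, min_seg=5):
          """Return the index and t statistic of the most likely mean shift."""
          d = np.asarray(target, float) - np.asarray(reference, float)
          n = len(d)
          best_t, best_k = 0.0, None
          for k in range(min_seg, n - min_seg):
              a, b = d[:k], d[k:]
              pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1)
                                + (len(b) - 1) * b.var(ddof=1)) / (n - 2))
              t = abs(a.mean() - b.mean()) / (pooled * np.sqrt(1 / len(a) + 1 / len(b)))
              if t > best_t:
                  best_t, best_k = t, k
          return best_k, best_t

      rng = np.random.default_rng(0)
      common = rng.normal(size=100)                        # shared climate signal
      target = common + 0.2 * rng.normal(size=100)
      target[60:] += 1.0                                   # undocumented shift at index 60
      reference = common + 0.2 * rng.normal(size=100)      # homogeneous composite
      print(most_likely_shift(target, reference))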

  15. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.

  16. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure that, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive to wide "depressions" in the input time series.
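
    A minimal sketch of the connectivity series referred to above may help: under the natural visibility criterion, two samples are linked when every intermediate sample lies strictly below the straight line joining them, and the per-node link counts form the connectivity (degree) time series. The naive O(n^2) Python construction below is for illustration only, not the authors' implementation.

      import numpy as np

      def visibility_degree_series(y):
          """Degree of each node in the natural visibility graph of series y."""
          y = np.asarray(y, dtype=float)
          n = len(y)
          degree = np.zeros(n, dtype=int)
          for a in range(n):
              for b in range(a + 1, n):
                  # (a, y[a]) sees (b, y[b]) if all intermediate points lie below the chord
                  visible = all(
                      y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                      for c in range(a + 1, b)
                  )
                  if visible:
                      degree[a] += 1
                      degree[b] += 1
          return degree

      rng = np.random.default_rng(1)
      print(visibility_degree_series(rng.normal(size=20)))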

  17. Rainfall height stochastic modelling as a support tool for landslides early warning

    NASA Astrophysics Data System (ADS)

    Capparelli, G.; Giorgio, M.; Greco, R.; Versace, P.

    2009-04-01

    Occurrence of landslides is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, rainfall spatial and temporal variability. Although heavy landslides frequently occurred in Campania, southern Italy, during the last decade, no complete data sets are available for natural slopes where landslides occurred. As a consequence, landslide risk assessment procedures and early warning systems in Campania still rely on simple empirical models based on correlation between daily rainfall records and observed landslides, like the FLAIR model [Versace et al., 2003]. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction. In mountainous areas, rainfall spatial and temporal variability are very pronounced due to orographic effects, making predictions even more complicated. Existing rain gauge networks are not dense enough to resolve the small scale spatial variability, and the same limitation of spatial resolution affects rainfall height maps provided by radar sensors as well as by meteorological physically based models. Therefore, analysis of on-site recorded rainfall height time series still represents the most effective approach for a reliable prediction of local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR and ARMA [Box and Jenkins, 1976]. Sometimes exogenous information coming from additional series of observations is also taken into account, and the models are called ARX and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted in conjunction with the FLAIR model to calculate the probability of flowslide occurrence. The final aim of the study is in fact to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. So far, the model has been applied only to data series recorded at a single rain gauge. Future extension will deal with spatial correlation between time series recorded at different gauges. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled ‘Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Box, G.E.P. and Jenkins, G.M., 1976. Time Series Analysis Forecasting and Control, Holden-Day, San Francisco. Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R.
Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71. Versace, P., Sirangelo. B. and Capparelli, G., 2003. Forewarning model of landslides triggered by rainfall. Proc. 3rd International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction and Assessment, Davos.

  18. An open-access database of grape harvest dates for climate research: data description and quality assessment

    NASA Astrophysics Data System (ADS)

    Daux, V.; Garcia de Cortazar-Atauri, I.; Yiou, P.; Chuine, I.; Garnier, E.; Ladurie, E. Le Roy; Mestre, O.; Tardaguila, J.

    2012-09-01

    We present an open-access dataset of grape harvest dates (GHD) series that has been compiled from international, French and Spanish literature and from unpublished documentary sources from public organizations and from wine-growers. As of June 2011, this GHD dataset comprises 380 series mainly from France (93% of the data) as well as series from Switzerland, Italy, Spain and Luxemburg. The series have variable length (from 1 to 479 data, mean length of 45 data) and contain gaps of variable sizes (mean ratio of observations/series length of 0.74). The longest and most complete ones are from Burgundy, Switzerland, Southern Rhône valley, Jura and Ile-de-France. The most ancient harvest date of the dataset is in 1354 in Burgundy. The GHD series were grouped into 27 regions according to their location, to geomorphological and geological criteria, and to past and present grape varieties. The GHD regional composite series (GHD-RCS) were calculated and compared pairwise to assess their reliability assuming that series close to one another are highly correlated. Most of the pairwise correlations are significant (p-value < 0.001) and strong (mean pairwise correlation coefficient of 0.58). As expected, the correlations tend to be higher when the vineyards are closer. The highest correlation (R = 0.91) is obtained between the High Loire Valley and the Ile-de-France GHD-RCS. The strong dependence of the vine cycle on temperature and, therefore, the strong link between the harvest dates and the temperature of the growing season was also used to test the quality of the GHD series. The strongest correlations are obtained between the GHD-RCS and the temperature series of the nearest weather stations. Moreover, the GHD-RCS/temperature correlation maps show spatial patterns similar to temperature correlation maps. The stability of the correlations over time is explored. The most striking feature is their generalised deterioration at the late 19th-early 20th century. The possible effects on GHD of the phylloxera crisis, which took place at this time, are discussed. The median of all the standardized GHD-RCS was calculated. The distribution of the extreme years of this general series is not homogenous. Extremely late years all occur during a two-century long time window from the early 17th to the early 19th century, while extremely early years are frequent during the 16th and since the mid-19th century.

  19. The Earth and Environmental Systems Podcast, and the Earth Explorations Video Series

    NASA Astrophysics Data System (ADS)

    Shorey, C. V.

    2015-12-01

    The Earth and Environmental Systems Podcast, a complete overview of the theoretical basics of Earth Science in 64 episodes, was completed in 2009, but has continued to serve the worldwide community as evidenced by listener feedback (e.g. "I am a 65 year old man. I have been retired for awhile and thought that retirement would be nothing more than waiting for the grave. However I want to thank you for your geo podcasts. They have given me a new lease on life and taught me a great deal." - FP, 2015). My current project is a video series on the practical basics of Earth Science titled "Earth Explorations". Each video is under 12 minutes long and tackles a major Earth Science concept. These videos go beyond a talking head, or even voice-over with static pictures or white-board graphics. Moving images are combined with animations created with Adobe After Effects, and aerial shots using a UAV. The dialog is scripted in a way to make it accessible at many levels, and the episodes as they currently stand have been used in K-12, and Freshman college levels with success. Though these videos are made to be used at this introductory level, they are also designed as remedial episodes for upper level classes, freeing up time given to review for new content. When completed, the series should contain close to 200 episodes, and this talk will cover the full range of resources I have produced, plan to produce, and how to access these resources. Both resources are available on iTunesU, and the videos are also available on YouTube.

  20. Measures of dependence for multivariate Lévy distributions

    NASA Astrophysics Data System (ADS)

    Boland, J.; Hurd, T. R.; Pivato, M.; Seco, L.

    2001-02-01

    Recent statistical analysis of a number of financial databases is summarized. Increasing agreement is found that logarithmic equity returns show a certain type of asymptotic behavior of the largest events, namely that the probability density functions have power law tails with an exponent α≈3.0. This behavior does not vary much over different stock exchanges or over time, despite large variations in trading environments. The present paper proposes a class of multivariate distributions which generalizes the observed qualities of univariate time series. A new consequence of the proposed class is the "spectral measure" which completely characterizes the multivariate dependences of the extreme tails of the distribution. This measure on the unit sphere in M-dimensions, in principle completely general, can be determined empirically by looking at extreme events. If it can be observed and determined, it will prove to be of importance for scenario generation in portfolio risk management.

  1. Representations of time coordinates in FITS. Time and relative dimension in space

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Bunclark, Peter S.; Calabretta, Mark R.; Allen, Steven L.; Manchester, Richard N.; Thompson, William T.

    2015-02-01

    Context. In a series of three previous papers, formulation and specifics of the representation of world coordinate transformations in FITS data have been presented. This fourth paper deals with encoding time. Aims: Time on all scales and precisions known in astronomical datasets is to be described in an unambiguous, complete, and self-consistent manner. Methods: Employing the well-established World Coordinate System (WCS) framework, and maintaining compatibility with the FITS conventions that are currently in use to specify time, the standard is extended to describe rigorously the time coordinate. Results: World coordinate functions are defined for temporal axes sampled linearly and as specified by a lookup table. The resulting standard is consistent with the existing FITS WCS standards and specifies a metadata set that achieves the aims enunciated above.

  2. Pharmacists' perception of synchronous versus asynchronous distance learning for continuing education programs.

    PubMed

    Buxton, Eric C

    2014-02-12

    To evaluate and compare pharmacists' satisfaction with the content and learning environment of a continuing education program series offered as either synchronous or asynchronous webinars. An 8-lecture series of online presentations on the topic of new drug therapies was offered to pharmacists in synchronous and asynchronous webinar formats. Participants completed a 50-question online survey at the end of the program series to evaluate their perceptions of the distance learning experience. Eighty-two participants completed the survey instrument (41 participants from the live webinar series and 41 participants from the asynchronous webinar series.) Responses indicated that while both groups were satisfied with the program content, the asynchronous group showed greater satisfaction with many aspects of the learning environment. The synchronous and asynchronous webinar participants responded positively regarding the quality of the programming and the method of delivery, but asynchronous participants rated their experience more positively overall.

  3. Pharmacists’ Perception of Synchronous Versus Asynchronous Distance Learning for Continuing Education Programs

    PubMed Central

    2014-01-01

    Objective. To evaluate and compare pharmacists’ satisfaction with the content and learning environment of a continuing education program series offered as either synchronous or asynchronous webinars. Methods. An 8-lecture series of online presentations on the topic of new drug therapies was offered to pharmacists in synchronous and asynchronous webinar formats. Participants completed a 50-question online survey at the end of the program series to evaluate their perceptions of the distance learning experience. Results. Eighty-two participants completed the survey instrument (41 participants from the live webinar series and 41 participants from the asynchronous webinar series.) Responses indicated that while both groups were satisfied with the program content, the asynchronous group showed greater satisfaction with many aspects of the learning environment. Conclusion. The synchronous and asynchronous webinar participants responded positively regarding the quality of the programming and the method of delivery, but asynchronous participants rated their experience more positively overall. PMID:24558276

  4. Control of the repeatability of high frequency multibeam echosounder backscatter by using natural reference areas

    NASA Astrophysics Data System (ADS)

    Roche, Marc; Degrendele, Koen; Vrignaud, Christophe; Loyer, Sophie; Le Bas, Tim; Augustin, Jean-Marie; Lurton, Xavier

    2018-06-01

    The increased use of backscatter measurements in time series for environmental monitoring necessitates the comparability of individual results. With the current lack of pre-calibrated multibeam echosounder systems for absolute backscatter measurement, a pragmatic solution is the use of natural reference areas for ensuring regular assessment of the backscatter measurement repeatability. This method mainly relies on the assumption of a sufficiently stable reference area regarding its backscatter signature. The aptitude of a natural area to provide a stable and uniform backscatter response must be carefully considered and demonstrated by a sufficiently long time-series of measurements. Furthermore, this approach requires a strict control of the acquisition and processing parameters. If all these conditions are met, stability check and relative calibration of a system are possible by comparison with the averaged backscatter values for the area. Based on a common multibeam echosounder and sampling campaign completed by available bathymetric and backscatter time series, the suitability as a backscatter reference area of three different candidates was evaluated. Two among them, Carré Renard and Kwinte, prove to be excellent choices, while the third one, Western Solent, lacks sufficient data over time, but remains a valuable candidate. The case studies and the available backscatter data on these areas prove the applicability of this method. The expansion of the number of commonly used reference areas and the growth of the number of multibeam echosounder controlled thereon could greatly contribute to the further development of quantitative applications based on multibeam echosounder backscatter measurements.

  5. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
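
    A minimal Python sketch of the construction described above follows, under the simplifying assumption that only the sequence of ordinal (permutation) patterns is regenerated, not the amplitude information the paper also recovers: each length-m window is mapped to its rank pattern, transition counts between consecutive patterns define the network, and a weighted random walk produces a surrogate pattern sequence.

      import numpy as np
      from collections import defaultdict

      def ordinal_patterns(x, m=3):
          """Map each length-m window to the tuple of ranks of its values."""
          return [tuple(int(r) for r in np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

      def build_network(patterns):
          """Count transitions between consecutive ordinal patterns."""
          counts = defaultdict(lambda: defaultdict(int))
          for a, b in zip(patterns[:-1], patterns[1:]):
              counts[a][b] += 1
          return counts

      def random_walk(counts, start, steps, rng):
          """Regenerate a surrogate pattern sequence by a weighted random walk."""
          node, walk = start, [start]
          for _ in range(steps):
              targets = list(counts[node])
              if not targets:                      # no observed outgoing transition
                  break
              weights = np.array([counts[node][t] for t in targets], dtype=float)
              node = targets[rng.choice(len(targets), p=weights / weights.sum())]
              walk.append(node)
          return walk

      rng = np.random.default_rng(2)
      x = np.sin(0.3 * np.arange(500)) + 0.1 * rng.normal(size=500)   # toy signal
      patterns = ordinal_patterns(x)
      network = build_network(patterns)
      print(random_walk(network, patterns[0], 10, rng))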

  6. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  7. Understanding Clinic Practices for Human Papilloma Virus Vaccination Series Completion in Clinics That Provide Primary Care: Survey of Clinic Managers in Iowa.

    PubMed

    Askelson, Natoshia M; Edmonds, Stephanie W; Momany, Elizabeth T; Tegegne, Mesay A

    2016-07-01

    Rates for human papilloma virus (HPV) vaccination are low across the United States. Evidence-based practices to increase immunization coverage have been recommended by public health organizations, yet many primary care clinics do not follow these practices. The purpose of this study was to examine whether primary care clinics use these best practices to promote completion of the HPV vaccine series for their adolescent patients. Understanding the prevalence of evidence-based immunization strategies is key to increasing vaccination coverage. We mailed 914 surveys to clinic managers of clinics that provide primary care in Iowa. The survey content was based on immunization strategies related to clinic practice and policies that have been proven effective in promoting the completion of the HPV vaccination series. Survey responses from 127 clinics were used in the final analysis. Most clinics always used the state's immunization information system to record HPV vaccinations (89.4%). Over a quarter of clinics (27.6%) did not use any type of reminder or recall system to alert parents or providers that an HPV vaccine was due, and 35.0% did not give the vaccine at sick visits. Clinics need to focus more on the recommended logistics and processes to ensure that patients receive the entire HPV vaccination series. Survey results indicate that clinics are not consistently implementing the recommended best practices to ensure that vaccination series are completed.

  8. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users: • JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS • JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies • ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused • ARIA data system - designed to integrate GPS and InSAR; GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu's talk tomorrow covers InSAR time series analysis.

  9. Chemical reactivity indices for the complete series of chlorinated benzenes: solvent effect.

    PubMed

    Padmanabhan, J; Parthasarathi, R; Subramanian, V; Chattaraj, P K

    2006-03-02

    We present a comprehensive analysis to probe the effect of solvation on the reactivity of the complete series of chlorobenzenes through the conceptual density functional theory (DFT)-based global and local descriptors. We propose a multiphilic descriptor in this study to explore the nature of attack at a particular site in a molecule. It is defined as the difference between nucleophilic and electrophilic condensed philicity functions. This descriptor is capable of explaining both the nucleophilicity and electrophilicity of the given atomic sites in the molecule simultaneously. The predictive ability of this descriptor is tested on the complete series of chlorobenzenes in gas and solvent media. A structure-toxicity analysis of these entire sets of chlorobenzenes toward aquatic organisms demonstrates the importance of the electrophilicity index in the prediction of the reactivity/toxicity.

  10. Impact of missing data on the efficiency of homogenisation: experiments with ACMANTv3

    NASA Astrophysics Data System (ADS)

    Domonkos, Peter; Coll, John

    2018-04-01

    The impact of missing data on the efficiency of homogenisation with ACMANTv3 is examined with simulated monthly surface air temperature test datasets. The homogeneous database is derived from an earlier benchmarking of daily temperature data in the USA, and then outliers and inhomogeneities (IHs) are randomly inserted into the time series. Three inhomogeneous datasets are generated and used, one with relatively few and small IHs, another one with IHs of medium frequency and size, and a third one with large and frequent IHs. All of the inserted IHs are changes to the means. Most of the IHs are single sudden shifts or pair of shifts resulting in platform-shaped biases. Each test dataset consists of 158 time series of 100 years length, and their mean spatial correlation is 0.68-0.88. For examining the impacts of missing data, seven experiments are performed, in which 18 series are left complete, while variable quantities (10-70%) of the data of the other 140 series are removed. The results show that data gaps have a greater impact on the monthly root mean squared error (RMSE) than the annual RMSE and trend bias. When data with a large ratio of gaps is homogenised, the reduction of the upper 5% of the monthly RMSE is the least successful, but even there, the efficiency remains positive. In terms of reducing the annual RMSE and trend bias, the efficiency is 54-91%. The inclusion of short and incomplete series with sufficient spatial correlation in all cases improves the efficiency of homogenisation with ACMANTv3.

  11. Area-based socioeconomic factors and Human Papillomavirus (HPV) vaccination among teen boys in the United States.

    PubMed

    Henry, Kevin A; Swiecki-Sikora, Allison L; Stroup, Antoinette M; Warner, Echo L; Kepka, Deanna

    2017-07-14

    This study is the first to examine associations between several area-based socioeconomic factors and human papillomavirus (HPV) vaccine uptake among boys in the United States (U.S.). Data from the 2012-2013 National Immunization Survey-Teen restricted-use data were analyzed to examine associations of HPV vaccination initiation (receipt of ≥1 dose) and series completion (receipt of three doses) among boys aged 13-17 years (N = 19,518) with several individual-level and ZIP Code Tabulation Area (ZCTA) census measures. Multivariable logistic regression was used to estimate the odds of HPV vaccination initiation and series completion separately. In 2012-2013 approximately 27.9% (95% CI 26.6%-29.2%) of boys initiated and 10.38% (95% CI 9.48%-11.29%) completed the HPV vaccine series. Area-based poverty was not statistically significantly associated with HPV vaccination initiation. It was, however, associated with series completion, with boys living in high-poverty areas (≥20% of residents living below poverty) having higher odds of completing the series (AOR 1.22, 95% CI 1.01-1.48) than boys in low-poverty areas (0-4.99%). Interactions between race/ethnicity and ZIP code-level poverty indicated that Hispanic boys living in high-poverty areas had a statistically significantly higher odds of  HPV vaccine initiation (AOR 1.43, 95% CI 1.03-1.97) and series completion (AOR 1.56, 95% CI 1.05-2.32)  than Hispanic boys in  low-poverty areas. Non-Hispanic Black boys in high poverty areas had higher odds of initiation (AOR 2.23, 95% CI 1.33-3.75) and completion (AOR 2.61, 95% CI 1.06-6.44) than non-Hispanic Black boys in low-poverty areas. Rural/urban residence and population density were also significant factors, with boys from urban or densely populated areas having higher odds of initiation and completion compared to boys living in non-urban, less densely populated areas. Higher HPV vaccination coverage in urban areas and among racial/ethnic minorities in areas with high poverty may be attributable to factors such as vaccine acceptance, health-care practices, and their access to HPV vaccines through the Vaccines for Children Program, which provides free vaccines to uninsured and under-insured children. Given the low HPV vaccination rates among boys in the U.S., these results provide important evidence to inform public health interventions to increase HPV vaccination.

  12. A Multi-Scale Structural Health Monitoring Approach for Damage Detection, Diagnosis and Prognosis in Aerospace Structures

    DTIC Science & Technology

    2012-01-20

    Ultrasonic Lamb waves were related to plastic strain and fatigue life. Theory was developed and validated to predict second harmonic generation for specific mode... Fatigue and damage generation and progression are processes consisting of a series of interrelated events that span large scales of space and time... A set of experiments was completed that worked to relate the acoustic nonlinearity measured with Lamb waves to both the plastic strain and the fatigue life.

  13. Investigating flow patterns and related dynamics in multi-instability turbulent plasmas using a three-point cross-phase time delay estimation velocimetry scheme

    NASA Astrophysics Data System (ADS)

    Brandt, C.; Thakur, S. C.; Tynan, G. R.

    2016-04-01

    Complexities of flow patterns in the azimuthal cross-section of a cylindrical magnetized helicon plasma and the corresponding plasma dynamics are investigated by means of a novel scheme for time delay estimation velocimetry. The advantage of this introduced method is the capability of calculating the time-averaged 2D velocity fields of propagating wave-like structures and patterns in complex spatiotemporal data. It is able to distinguish and visualize the details of simultaneously present superimposed entangled dynamics and it can be applied to fluid-like systems exhibiting frequently repeating patterns (e.g., waves in plasmas, waves in fluids, dynamics in planetary atmospheres, etc.). The velocity calculations are based on time delay estimation obtained from cross-phase analysis of time series. Each velocity vector is unambiguously calculated from three time series measured at three different non-collinear spatial points. This method, when applied to fast imaging, has been crucial to understand the rich plasma dynamics in the azimuthal cross-section of a cylindrical linear magnetized helicon plasma. The capabilities and the limitations of this velocimetry method are discussed and demonstrated for two completely different plasma regimes, i.e., for quasi-coherent wave dynamics and for complex broadband wave dynamics involving simultaneously present multiple instabilities.

  14. Investigating flow patterns and related dynamics in multi-instability turbulent plasmas using a three-point cross-phase time delay estimation velocimetry scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, C.; Max-Planck-Institute for Plasma Physics, Wendelsteinstr. 1, D-17491 Greifswald; Thakur, S. C.

    2016-04-15

    Complexities of flow patterns in the azimuthal cross-section of a cylindrical magnetized helicon plasma and the corresponding plasma dynamics are investigated by means of a novel scheme for time delay estimation velocimetry. The advantage of this introduced method is the capability of calculating the time-averaged 2D velocity fields of propagating wave-like structures and patterns in complex spatiotemporal data. It is able to distinguish and visualize the details of simultaneously present superimposed entangled dynamics and it can be applied to fluid-like systems exhibiting frequently repeating patterns (e.g., waves in plasmas, waves in fluids, dynamics in planetary atmospheres, etc.). The velocity calculations are based on time delay estimation obtained from cross-phase analysis of time series. Each velocity vector is unambiguously calculated from three time series measured at three different non-collinear spatial points. This method, when applied to fast imaging, has been crucial to understand the rich plasma dynamics in the azimuthal cross-section of a cylindrical linear magnetized helicon plasma. The capabilities and the limitations of this velocimetry method are discussed and demonstrated for two completely different plasma regimes, i.e., for quasi-coherent wave dynamics and for complex broadband wave dynamics involving simultaneously present multiple instabilities.

  15. Ribbons of semithin sections: an advanced method with a new type of diamond knife.

    PubMed

    Blumer, Michael J F; Gahleitner, P; Narzt, T; Handl, C; Ruthensteiner, B

    2002-10-15

    Complete series of semithin sections are imperative for 3-D reconstruction, but with traditional microtomy techniques it is difficult and time-consuming to trace stained and labeled structures. In the present study we introduce a method for making and collecting ribbons of semithin sections with a new, commercially available diamond knife (histo-jumbo-diamond knife, Diatome AG, Biel, Switzerland). The special feature of the diamond knife is the large water bath (boat) into which a glass slide can be dipped. The method has distinct advantages and the handling is simple. The resin block is trimmed into a truncated pyramid. Contact glue is applied to the leading face of the pyramid, which makes sections stick together to form a ribbon. Following sectioning, the ribbons are mounted onto glass slides and aligned in parallel. Stretching out and drying the ribbons on a hot plate is the final step of the method. Major advantages of this method are the perfect alignment of sections with identical orientation of structures, the completeness of series, and the significant saving of time. This facilitates tracing of stained and labeled structures, yielding quick 3-D reconstruction. Semithin sections can be cut from 0.5 to 2 μm thick and several ribbons can be mounted side by side onto the slide. Two examples are presented to illustrate the advantages of the method.

  16. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data.

    PubMed

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota

    2016-01-01

    The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there remains a gap between the provision of such data and its utilization. Although hundreds of metabolites can be measured, providing information on the metabolic reaction system, the simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remains to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  17. A comparative simulation study of AR(1) estimators in short time series.

    PubMed

    Krone, Tanja; Albers, Casper J; Timmerman, Marieke E

    2017-01-01

    Various estimators of the autoregressive model exist. We compare their performance in estimating the autocorrelation in short time series. In Study 1, under correct model specification, we compare the frequentist r1 estimator, C-statistic, ordinary least squares estimator (OLS) and maximum likelihood estimator (MLE), and a Bayesian method, considering flat (Bf) and symmetrized reference (Bsr) priors. In a completely crossed experimental design we vary the length of the time series (i.e., T = 10, 25, 40, 50 and 100) and the autocorrelation (from -0.90 to 0.90 with steps of 0.10). The results show the lowest bias for Bsr and the lowest variability for r1. The power in different conditions is highest for Bsr and OLS. For T = 10, the absolute performance of all measurements is poor, as expected. In Study 2, we study robustness of the methods through misspecification by generating the data according to an ARMA(1,1) model, but still analysing the data with an AR(1) model. We use the two methods with the lowest bias for this study, i.e., Bsr and MLE. The bias gets larger when the non-modelled moving average parameter becomes larger. Both the variability and power show dependency on the non-modelled parameter. The differences between the two estimation methods are negligible for all measurements.
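
    To make the comparison concrete, here is a small Python simulation in the spirit of Study 1, restricted to two of the frequentist estimators (the lag-one sample autocorrelation r1 and OLS); the Bayesian variants, the C-statistic and the MLE are not reproduced, and the exact estimator definitions used in the paper may differ in detail.

      import numpy as np

      def r1(x):
          """Lag-one sample autocorrelation (common denominator)."""
          x = x - x.mean()
          return np.sum(x[:-1] * x[1:]) / np.sum(x ** 2)

      def ols_ar1(x):
          """OLS slope of x[t] regressed on x[t-1] (demeaned)."""
          x = x - x.mean()
          return np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)

      def simulate_ar1(phi, n, rng):
          x = np.zeros(n)
          for t in range(1, n):
              x[t] = phi * x[t - 1] + rng.normal()
          return x

      rng = np.random.default_rng(3)
      phi, n, reps = 0.5, 25, 2000
      est_r1 = [r1(simulate_ar1(phi, n, rng)) for _ in range(reps)]
      est_ols = [ols_ar1(simulate_ar1(phi, n, rng)) for _ in range(reps)]
      print("r1 : bias %.3f, sd %.3f" % (np.mean(est_r1) - phi, np.std(est_r1)))
      print("OLS: bias %.3f, sd %.3f" % (np.mean(est_ols) - phi, np.std(est_ols)))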

  18. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems that require exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronologically-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best. Rather, this selection should be made considering the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is to discuss the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were first tested on synthetic examples, to have complete control over the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. Then, we applied the methodology to precipitation datasets collected in the Vancouver area and on a mining site in Peru.
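
    As a small illustration of the point about tails (not the study's code), the Python sketch below fits two of the candidate families named above, the Generalized Pareto and the Gamma, to synthetic wet-day rain depths and compares the extreme quantiles they imply; scipy.stats is assumed, and for simplicity the GPD is fitted to all depths rather than to threshold excesses.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      rain = rng.gamma(0.8, 6.0, size=2000)                # synthetic daily rain depths (mm)

      # location fixed at zero, since depths are non-negative
      gp_c, gp_loc, gp_scale = stats.genpareto.fit(rain, floc=0)
      ga_a, ga_loc, ga_scale = stats.gamma.fit(rain, floc=0)

      for q in (0.99, 0.999):
          gp_q = stats.genpareto.ppf(q, gp_c, loc=gp_loc, scale=gp_scale)
          ga_q = stats.gamma.ppf(q, ga_a, loc=ga_loc, scale=ga_scale)
          print("q=%.3f  GPD: %6.1f mm   Gamma: %6.1f mm" % (q, gp_q, ga_q))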

  19. A knowledge translation tool improved osteoporosis disease management in primary care: an interrupted time series analysis.

    PubMed

    Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E

    2014-09-25

    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada representing 5 family physicians with 2840 age eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P < 0.001), any osteoporosis medications (0.5%; P = 0.006), and calcium or vitamin D (1.2%; P = 0.001). Improvements were also observed at site level for all the three sites considered, but these results varied across the sites. Of 351 patients who completed the RAQ unprompted (mean age 64 years, 77% women), the mean time for completing the RAQ was 3.43 minutes, and 56% had any disease management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices, and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.
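
    The segmented regression at the heart of an interrupted time series design can be written down compactly; the Python sketch below uses simulated monthly values (not the study's data) and an ordinary least-squares fit with terms for the baseline trend, the level change at implementation, and the slope change afterwards.

      import numpy as np

      rng = np.random.default_rng(5)
      n_pre, n_post = 12, 12                       # monthly points before/after the tool
      t = np.arange(n_pre + n_post)
      post = (t >= n_pre).astype(float)            # intervention indicator
      t_since = np.where(post == 1, t - n_pre, 0)  # time since intervention

      # simulated outcome with a true level jump of 2.0 and slope change of 0.3
      y = 5 + 0.1 * t + 2.0 * post + 0.3 * t_since + rng.normal(scale=0.5, size=len(t))

      # design matrix: intercept, baseline trend, level change, slope change
      X = np.column_stack([np.ones_like(t), t, post, t_since])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("baseline slope %.2f, level change %.2f, slope change %.2f"
            % (beta[1], beta[2], beta[3]))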

  20. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Forecasting ESKAPE infections through a time-varying auto-adaptive algorithm using laboratory-based surveillance data.

    PubMed

    Ballarin, Antonio; Posteraro, Brunella; Demartis, Giuseppe; Gervasi, Simona; Panzarella, Fabrizio; Torelli, Riccardo; Paroni Sterbini, Francesco; Morandotti, Grazia; Posteraro, Patrizia; Ricciardi, Walter; Gervasi Vidal, Kristian A; Sanguinetti, Maurizio

    2014-12-06

    Mathematical and statistical tools are capable of providing valid help to improve surveillance systems for healthcare- and non-healthcare-associated bacterial infections. The aim of this work is to evaluate the time-varying auto-adaptive (TVA) algorithm-based use of a clinical microbiology laboratory database to forecast medically important drug-resistant bacterial infections. Using the TVA algorithm, six distinct time series were modelled, each one representing the number of episodes per single 'ESKAPE' (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter species) infecting pathogen that had occurred monthly between the 2002 and 2011 calendar years at the Università Cattolica del Sacro Cuore general hospital. Monthly moving averaged numbers of observed and forecasted ESKAPE infectious episodes were found to show a complete overlapping of their respective smoothed time series curves. Overall good forecast accuracy was observed, with percentages ranging from 82.14% for E. faecium infections to 90.36% for S. aureus infections. Our approach may regularly provide physicians with forecasted bacterial infection rates to alert them about the spread of antibiotic-resistant bacterial species, especially when clinical microbiological results of patients' specimens are delayed.
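
    The TVA algorithm itself is not specified in enough detail here to reproduce, so the Python sketch below substitutes a plain 12-month moving-average baseline purely to illustrate how monthly forecasts of episode counts can be scored as a percentage accuracy against observations; all numbers are simulated and the scoring rule is an assumption, not the paper's.

      import numpy as np

      def moving_average_forecast(counts, window=12):
          """Forecast each month as the mean of the preceding `window` months."""
          counts = np.asarray(counts, dtype=float)
          return np.array([counts[i - window:i].mean() for i in range(window, len(counts))])

      rng = np.random.default_rng(6)
      observed = rng.poisson(lam=20, size=120)             # ten years of monthly episodes
      forecast = moving_average_forecast(observed)
      actual = observed[12:]
      accuracy = 100 * (1 - np.abs(forecast - actual) / np.maximum(actual, 1))
      print("mean monthly forecast accuracy: %.1f%%" % accuracy.mean())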

  2. Genexpi: a toolset for identifying regulons and validating gene regulatory networks using time-course expression data.

    PubMed

    Modrák, Martin; Vohradský, Jiří

    2018-04-13

    Identifying regulons of sigma factors is a vital subtask of gene network inference. Integrating multiple sources of data is essential for correct identification of regulons and complete gene regulatory networks. Time series of expression data measured with microarrays or RNA-seq combined with static binding experiments (e.g., ChIP-seq) or literature mining may be used for inference of sigma factor regulatory networks. We introduce Genexpi: a tool to identify sigma factors by combining candidates obtained from ChIP experiments or literature mining with time-course gene expression data. While Genexpi can be used to infer other types of regulatory interactions, it was designed and validated on real biological data from bacterial regulons. In this paper, we put primary focus on CyGenexpi: a plugin integrating Genexpi with the Cytoscape software for ease of use. As a part of this effort, a plugin for handling time series data in Cytoscape called CyDataseries has been developed and made available. Genexpi is also available as a standalone command line tool and an R package. Genexpi is a useful part of gene network inference toolbox. It provides meaningful information about the composition of regulons and delivers biologically interpretable results.

  3. Backward transfer entropy: Informational measure for detecting hidden Markov models and its interpretations in thermodynamics, gambling and causality

    PubMed Central

    Ito, Sosuke

    2016-01-01

    The transfer entropy is a well-established measure of information flow, which quantifies directed influence between two stochastic time series and has been shown to be useful in a variety of fields of science. Here we introduce the transfer entropy of the backward time series, called the backward transfer entropy, and show that it quantifies how far the dynamics are from a hidden Markov model. Furthermore, we discuss physical interpretations of the backward transfer entropy in the completely different settings of thermodynamics for information processing and gambling with side information. In both the thermodynamics and gambling settings, the backward transfer entropy characterizes a possible loss of some benefit, where the conventional transfer entropy characterizes a possible benefit. Our result implies a deep connection between thermodynamics and gambling in the presence of information flow, and that the backward transfer entropy would be useful as a novel measure of information flow in nonequilibrium thermodynamics, biochemical sciences, economics and statistics. PMID:27833120
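
    A plug-in sketch of the quantity being discussed may be useful: for discrete series and history length one, the transfer entropy from Y to X compares p(x_{t+1} | x_t, y_t) with p(x_{t+1} | x_t), and the backward variant is simply the same estimator applied to the time-reversed series. The Python estimator below is illustrative, not the paper's derivation.

      import numpy as np
      from collections import Counter

      def transfer_entropy(x, y):
          """T_{Y->X}: how much y_t helps predict x_{t+1} beyond x_t (natural log)."""
          triples  = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
          pairs_xy = Counter(zip(x[:-1], y[:-1]))          # (x_t, y_t)
          pairs_xx = Counter(zip(x[1:], x[:-1]))           # (x_{t+1}, x_t)
          singles  = Counter(x[:-1])                       # x_t
          n = len(x) - 1
          te = 0.0
          for (x1, x0, y0), c in triples.items():
              p_joint = c / n
              p_full  = c / pairs_xy[(x0, y0)]             # p(x_{t+1} | x_t, y_t)
              p_self  = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{t+1} | x_t)
              te += p_joint * np.log(p_full / p_self)
          return te

      rng = np.random.default_rng(7)
      y = rng.integers(0, 2, size=5000)
      flips = (rng.random(5000) < 0.1).astype(int)
      x = (np.roll(y, 1) + flips) % 2                      # x_t mostly copies y_{t-1}
      print("forward  TE(Y->X): %.3f" % transfer_entropy(x, y))
      print("backward TE(Y->X): %.3f" % transfer_entropy(x[::-1], y[::-1]))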

  4. Imputation of missing data in time series for air pollutants

    NASA Astrophysics Data System (ADS)

    Junger, W. L.; Ponce de Leon, A.

    2015-02-01

    Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, which uses the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to degenerate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations yielded valid results, even when data were missing not at random. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
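
    A rough Python sketch of the EM idea under the normality assumption follows (the mtsdi R package mentioned above additionally filters the temporal component, which this simplified version omits): each record's missing entries are replaced by their conditional means given the observed entries, then the mean and covariance are re-estimated, and the two steps are iterated.

      import numpy as np

      def em_impute(data, n_iter=50):
          X = np.array(data, dtype=float)
          miss = np.isnan(X)
          X[miss] = np.take(np.nanmean(X, axis=0), np.where(miss)[1])   # start at column means
          for _ in range(n_iter):
              mu = X.mean(axis=0)
              cov = np.cov(X, rowvar=False)
              for i in range(X.shape[0]):
                  m = miss[i]
                  if not m.any():
                      continue
                  o = ~m
                  # conditional mean of the missing block given the observed block
                  reg = cov[np.ix_(m, o)] @ np.linalg.pinv(cov[np.ix_(o, o)])
                  X[i, m] = mu[m] + reg @ (X[i, o] - mu[o])
          return X

      rng = np.random.default_rng(8)
      full = rng.multivariate_normal([0, 0, 0], [[1, .8, .5], [.8, 1, .6], [.5, .6, 1]], size=200)
      obs = full.copy()
      obs[rng.random(obs.shape) < 0.1] = np.nan              # ~10% missing completely at random
      print("max imputation error:", np.abs(em_impute(obs) - full).max())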

  5. Correlated full-field and pointwise temporally resolved measurements of thermomechanical stress inside an operating power transistor

    NASA Astrophysics Data System (ADS)

    Borza, Dan N.; Gautrelet, Christophe

    2015-01-01

    The paper describes a measurement system based on time-resolved speckle interferometry, able to record long series of thermally induced full-field deformation maps of die and wire bonds inside an operating power transistor. The origin of the deformation is the transistor heating during its normal operation. The full-field results consist in completely unwrapped deformation maps for out-of-plane displacements greater than 14 μm, with nanometer resolution, in presence of discontinuities due to structural and material inhomogeneity. These measurements are synchronized with the measurement of heatsink temperature and of base-emitter junction temperature, so as to provide data related to several interacting physical parameters. The temporal histories of the displacement are also accessible for any point. They are correlated with the thermal and electrical time series. Mechanical full-field curvatures may also be estimated, making these measurements useful for inspecting physical origins of thermomechanical stresses and for interacting with numerical models used in reliability-related studies.

  6. Backward transfer entropy: Informational measure for detecting hidden Markov models and its interpretations in thermodynamics, gambling and causality

    NASA Astrophysics Data System (ADS)

    Ito, Sosuke

    2016-11-01

    The transfer entropy is a well-established measure of information flow, which quantifies directed influence between two stochastic time series and has been shown to be useful in a variety of fields of science. Here we introduce the transfer entropy of the backward time series, called the backward transfer entropy, and show that it quantifies how far the dynamics are from a hidden Markov model. Furthermore, we discuss physical interpretations of the backward transfer entropy in the completely different settings of thermodynamics for information processing and gambling with side information. In both the thermodynamics and gambling settings, the backward transfer entropy characterizes a possible loss of some benefit, where the conventional transfer entropy characterizes a possible benefit. Our result implies a deep connection between thermodynamics and gambling in the presence of information flow, and that the backward transfer entropy would be useful as a novel measure of information flow in nonequilibrium thermodynamics, biochemical sciences, economics and statistics.

  7. Applications and Comparisons of Four Time Series Models in Epidemiological Surveillance Data

    PubMed Central

    Young, Alistair A.; Li, Xiaosong

    2014-01-01

    Public health surveillance systems provide valuable data for reliable prediction of future epidemic events. This paper describes a study that used nine types of infectious disease data collected through a national public health surveillance system in mainland China to evaluate and compare the performance of four time series methods, namely two decomposition methods (regression and exponential smoothing), the autoregressive integrated moving average (ARIMA) model and the support vector machine (SVM). The data obtained from 2005 to 2011 and in 2012 were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The accuracy of the statistical models in forecasting future epidemic events demonstrated their effectiveness in epidemiological surveillance. Although the comparisons found that no single method is completely superior to the others, the present study indeed highlighted that the SVM outperforms the ARIMA model and the decomposition methods in most cases. PMID:24505382
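
    A hedged sketch of this kind of hold-out comparison: fit an ARIMA model and a lag-based support vector regressor on a training span, forecast the held-out span, and score both with MAE, MAPE and MSE. The synthetic series, the ARIMA order, the lag window and the SVR settings are placeholders, not the study's choices.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.svm import SVR

def metrics(y_true, y_pred):
    err = y_true - y_pred
    return {"MAE": np.mean(np.abs(err)),
            "MAPE": np.mean(np.abs(err) / y_true) * 100,
            "MSE": np.mean(err ** 2)}

# illustrative monthly count series; last 12 months held out for forecasting
rng = np.random.default_rng(1)
t = np.arange(96)
y = 50 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 96)
train, test = y[:-12], y[-12:]

# ARIMA forecast (order chosen arbitrarily for the sketch)
arima_fc = ARIMA(train, order=(2, 1, 1)).fit().forecast(steps=12)

# SVR on lagged values, forecasting recursively one step at a time
lags = 12
Xtr = np.array([train[i - lags:i] for i in range(lags, len(train))])
ytr = train[lags:]
svr = SVR(kernel="rbf", C=10.0).fit(Xtr, ytr)
history, svr_fc = list(train), []
for _ in range(12):
    yhat = svr.predict(np.array(history[-lags:]).reshape(1, -1))[0]
    svr_fc.append(yhat)
    history.append(yhat)

print("ARIMA:", metrics(test, np.asarray(arima_fc)))
print("SVR:  ", metrics(test, np.asarray(svr_fc)))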

  8. Automatic Derivation of Forest Cover and Forest Cover Change Using Dense Multi-Temporal Time Series Data from Landsat and SPOT 5 Take5

    NASA Astrophysics Data System (ADS)

    Storch, Cornelia; Wagner, Thomas; Ramminger, Gernot; Pape, Marlon; Ott, Hannes; Hausler, Thomas; Gomez, Sharon

    2016-08-01

    The paper presents a description of the methods developed for an automated processing chain for the classification of forest cover and forest cover change based on high-resolution multi-temporal time series of Landsat and SPOT 5 Take5 data, with a focus on the dry forest ecosystems of Africa. The method has been developed within the European Space Agency (ESA) funded Global Monitoring for Environment and Security Service Element for Forest Monitoring (GSE FM) project on dry forest areas; the demonstration site selected was in Malawi. The methods are based on the principles of a robust but still flexible monitoring system, able to cope with the most complex Earth Observation (EO) data scenarios, varying in terms of data quality, source, accuracy, information content, completeness, etc. The method allows automated tracking of change dates and data gap filling, and takes into account phenology, the seasonality of tree species with respect to leaf fall, and heavy cloud cover during the rainy season.

  9. The acute effects of a warm-up including static or dynamic stretching on countermovement jump height, reaction time, and flexibility.

    PubMed

    Perrier, Erica T; Pavol, Michael J; Hoffman, Mark A

    2011-07-01

    The purpose of this research was to compare the effects of a warm-up with static vs. dynamic stretching on countermovement jump (CMJ) height, reaction time, and low-back and hamstring flexibility and to determine whether any observed performance deficits would persist throughout a series of CMJs. Twenty-one recreationally active men (24.4 ± 4.5 years) completed 3 data collection sessions. Each session included a 5-minute treadmill jog followed by 1 of the stretch treatments: no stretching (NS), static stretching (SS), or dynamic stretching (DS). After the jog and stretch treatment, the participant performed a sit-and-reach test. Next, the participant completed a series of 10 maximal-effort CMJs, during which he was asked to jump as quickly as possible after seeing a visual stimulus (light). The CMJ height and reaction time were determined from measured ground reaction forces. A treatment × jump repeated-measures analysis of variance for CMJ height revealed a significant main effect of treatment (p = 0.004). The CMJ height was greater for DS (43.0 cm) than for NS (41.4 cm) and SS (41.9 cm) and was not less for SS than for NS. Analysis also revealed a significant main effect of jump (p = 0.005) on CMJ height: Jump height decreased from the early to the late jumps. The analysis of reaction time showed no significant effect of treatment. Treatment had a main effect (p < 0.001) on flexibility, however. Flexibility was greater after both SS and DS compared to after NS, with no difference in flexibility between SS and DS. Athletes in sports requiring lower-extremity power should use DS techniques in warm-up to enhance flexibility while improving performance.

  10. Fatalistic Beliefs and Completion of the HPV Vaccination Series Among a Sample of Young Appalachian Kentucky Women

    PubMed Central

    Vanderpool, Robin C.; Van Meter Dressler, Emily; Stradtman, Lindsay R.; Crosby, Richard A.

    2016-01-01

    Purpose Uptake and completion of the 3-dose human papillomavirus (HPV) vaccine is important for the primary prevention of cervical cancer. However, HPV vaccination rates among adolescent females and young women remain low in certain geographic areas of the United States, including Appalachia. Although greater fatalistic beliefs have been previously associated with lower rates of preventive cancer behaviors among adults, little research exists on the impact of fatalism on HPV vaccination behaviors, especially among younger individuals. Therefore, the purpose of this study was to examine the association between fatalistic beliefs and completion of the full HPV vaccine series among young women, ages 18–26, in Appalachian Kentucky. Methods Data for this study came from a baseline survey completed by 344 women randomized into a communication intervention trial focused on increasing adherence to the 3-dose HPV vaccine series. Principal components analysis was used to construct 2 fatalism-related subscales from 8 survey questions. Findings In a controlled analysis, 1 subscale—“lack of control over cancer”—was significantly associated with not completing the full HPV vaccine series. In a rural area that experiences higher rates of cervical cancer, poverty, limited access to health care, and negative cancer-related attitudes and experiences, fatalism may be common, even among young people. Conclusion Future educational and interventional research addressing fatalistic beliefs in a culturally sensitive manner may be warranted to improve HPV vaccination behaviors and impact cancer disparities among Appalachian women. PMID:25640763

  11. THE PATTERN OF LONGITUDINAL CHANGE IN SERUM CREATININE AND NINETY-DAY MORTALITY AFTER MAJOR SURGERY

    PubMed Central

    Hobson, Charles E; Pardalos, Panos

    2016-01-01

    Objective To calculate mortality risk that accounts for both the severity and the recovery of postoperative kidney dysfunction using the pattern of longitudinal change in creatinine. Summary Background Data Although the importance of renal recovery after acute kidney injury (AKI) is increasingly recognized, the complex association between longitudinal creatinine changes and mortality is not fully described. Methods We used routinely collected clinical information for 46,299 adult patients undergoing major surgery to develop a multivariable probabilistic model, optimized for the non-linearity of serum creatinine time series, that calculates the risk function for ninety-day mortality. We performed a 70/30 cross-validation analysis to assess the accuracy of the model. Results All creatinine time series exhibited a nonlinear risk function in relation to ninety-day mortality, and their addition to other clinical factors improved the model discrimination. For any given severity of AKI, patients with complete renal recovery, as manifested by the return of the discharge creatinine to the baseline value, experienced a significant decrease in the odds of dying within ninety days of admission compared to patients with partial recovery. Yet, for any severity of AKI, even complete renal recovery did not entirely mitigate the increased odds of dying, as patients with mild AKI and complete renal recovery still had significantly increased odds of dying compared to patients without AKI (odds ratio 1.48, 95% confidence interval 1.30-1.68). Conclusions We demonstrate the nonlinear relationship between both severity and recovery of renal dysfunction and ninety-day mortality after major surgery. We have developed an easily applicable computer algorithm that calculates this complex relationship. PMID:26181482

  12. Detection of a sudden change of the field time series based on the Lorenz system.

    PubMed

    Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes of the pressure field time series and the temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; thus, we provide a new method for the detection of a sudden change of the field time series.
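
    A minimal sketch of the three steps, under stated assumptions: integrate the Lorenz system, reduce the 3-D path to a scalar series with an inner product against a fixed reference vector, and scan the scalar series with a sliding t-test. The reference vector, window length and detection threshold are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import ttest_ind

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# numerical solution of the Lorenz system, treated as a vector time series
t_eval = np.arange(0, 60, 0.01)
sol = solve_ivp(lorenz, (0, 60), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
path = sol.y.T                                   # shape (n_steps, 3)

# scalar series: inner product of each state with a fixed reference vector
ref = np.array([1.0, 1.0, 0.0])
series = path @ ref

# sliding t-test: compare the means of the two windows adjacent to every step
win = 50
t_stat = np.full(series.size, np.nan)
for i in range(win, series.size - win):
    t_stat[i] = ttest_ind(series[i - win:i], series[i:i + win]).statistic

# flag candidate sudden-change times where |t| exceeds an (assumed) threshold
changes = np.where(np.abs(t_stat) > 6.0)[0]
print("candidate change points:", changes[:10], "...")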

  13. Parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccines

    PubMed Central

    Vercruysse, Jessica; Chigurupati, Nagasudha L.; Fung, Leslie; Apte, Gauri; Pierre-Joseph, Natalie; Perkins, Rebecca B.

    2016-01-01

    ABSTRACT Objective: To determine parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccination. Methods: Parents/guardians of 11–17 y old girls and pediatric healthcare providers at one inner-city public clinic and three private practices completed semi-structured interviews in 2012-2013. Participants were asked open-ended questions regarding their attitudes toward school-located provision and school-entry requirements for HPV vaccination. Parents' answers were analyzed in relation to whether their daughters had not initiated, initiated but not completed, or completed the HPV vaccine series. Qualitative analysis was used to identify themes related to shared views. Results: 129 parents/guardians and 34 providers participated. 61% of parents supported providing HPV vaccinations in schools, citing reasons of convenience, improved access, and positive peer pressure. Those who opposed school-located provision raised concerns related to privacy and the capacity of school nurses to manage vaccine-related reactions. Parents whose daughters had initiated but not completed the series were more likely to intend to vaccinate their daughters in schools (70%) and support requirements (64%) than parents who had not initiated vaccination (42% would vaccinate at school, 46% support requirements) or completed the series (42% would vaccinate at school, 32% support requirements; p < 0.05 for all comparisons). 81% of providers supported offering vaccination in schools, wanting to take advantage of the captive audience, improve vaccine completion rates, and decrease the administrative burden on medical office staff, but were concerned about adequate information transfer between schools and medical offices. Only 32% of providers supported school-entry requirements, largely because they felt that a requirement might provoke a public backlash that could further hinder vaccination efforts. Conclusions: School-located provision of HPV vaccination was widely accepted by healthcare providers and parents whose children have not completed the series, indicating that this venue might be a valuable addition to improve completion rates. Support for school-entry requirements was limited among both parents and healthcare providers. PMID:26934421

  14. Parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccines.

    PubMed

    Vercruysse, Jessica; Chigurupati, Nagasudha L; Fung, Leslie; Apte, Gauri; Pierre-Joseph, Natalie; Perkins, Rebecca B

    2016-06-02

    To determine parents' and providers' attitudes toward school-located provision and school-entry requirements for HPV vaccination. Parents/guardians of 11-17 y old girls and pediatric healthcare providers at one inner-city public clinic and three private practices completed semi-structured interviews in 2012-2013. Participants were asked open-ended questions regarding their attitudes toward school-located provision and school-entry requirements for HPV vaccination. Parents' answers were analyzed in relation to whether their daughters had not initiated, initiated but not completed, or completed the HPV vaccine series. Qualitative analysis was used to identify themes related to shared views. 129 parents/guardians and 34 providers participated. 61% of parents supported providing HPV vaccinations in schools, citing reasons of convenience, improved access, and positive peer pressure. Those who opposed school-located provision raised concerns related to privacy and the capacity of school nurses to manage vaccine-related reactions. Parents whose daughters had initiated but not completed the series were more likely to intend to vaccinate their daughters in schools (70%) and support requirements (64%) than parents who had not initiated vaccination (42% would vaccinate at school, 46% support requirements) or completed the series (42% would vaccinate at school, 32% support requirements; p < 0.05 for all comparisons). 81% of providers supported offering vaccination in schools, wanting to take advantage of the captive audience, improve vaccine completion rates, and decrease the administrative burden on medical office staff, but were concerned about adequate information transfer between schools and medical offices. Only 32% of providers supported school-entry requirements, largely because they felt that a requirement might provoke a public backlash that could further hinder vaccination efforts. School-located provision of HPV vaccination was widely accepted by healthcare providers and parents whose children have not completed the series, indicating that this venue might be a valuable addition to improve completion rates. Support for school-entry requirements was limited among both parents and healthcare providers.

  15. Practical analysis of tide gauges records from Antarctica

    NASA Astrophysics Data System (ADS)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining 11 stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness = 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The resulting time series have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP the trend is 1.8 ± 0.2 mm/yr and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trends of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which are interpreted as the effect of current ice melting and steric ocean contributions. Using the Ensemble Empirical Mode Decomposition method, we have detected different oscillations embedded in the sea-level signals for Antarctica and the AP. This confirms previously recognized connections between sea-level variations in Antarctica and ocean modes such as ENSO.
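
    The simple regression step (trend plus possible acceleration) can be sketched as a least-squares fit of a constant, a linear term and a quadratic term to the stacked series. The data below are synthetic and purely illustrative; only the fitting idea is taken from the abstract.

```python
import numpy as np

def trend_and_acceleration(years, sea_level_mm):
    """Fit h(t) = a + b*t + 0.5*c*t^2; b is the linear trend (mm/yr)
    and c the acceleration (mm/yr^2)."""
    t = np.asarray(years, float) - np.mean(years)
    A = np.column_stack([np.ones_like(t), t, 0.5 * t ** 2])
    coef, *_ = np.linalg.lstsq(A, np.asarray(sea_level_mm, float), rcond=None)
    return coef[1], coef[2]

# illustrative stacked annual series: a 2 mm/yr trend plus noise
yrs = np.arange(1957, 2014)
rng = np.random.default_rng(2)
stack = 2.0 * (yrs - yrs[0]) + rng.normal(0, 15, yrs.size)
print(trend_and_acceleration(yrs, stack))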

  16. An evaluation of grease type ball bearing lubricants operating in various environments

    NASA Technical Reports Server (NTRS)

    Mcmurtrey, E. L.

    1984-01-01

    Because many future spacecraft or space stations will require mechanisms to operate for long periods of time in environments which are adverse to most bearing lubricants, a series of tests has been completed to evaluate 38 grease type lubricants in R-4 size bearings in five different environments for a 1-year period. Four repetitions of each test were made to provide statistical samples. These tests were also used to select four lubricants for 5-year tests in selected environments, with five repetitions of each test for statistical samples. In this program, 172 test sets have been completed. Three 5-year tests have also been completed: (1) continuous operation in vacuum at ambient temperature, (2) start-stop operation in vacuum at ambient temperature, and (3) continuous vacuum operation at 93.3 C. In both the 1-year and 5-year tests, the best results in all environments have been obtained with a high-viscosity-index perfluoroalkylpolyether (PFPE) grease.

  17. Volatility of linear and nonlinear time series

    NASA Astrophysics Data System (ADS)

    Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo

    2005-07-01

    Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series |ui|, the “volatility.” However, the origin of this empirical observation remains unclear, and the exact relation between the correlations in ui and the correlations in |ui| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series |ui|. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series (that represents the sign series sgn(ui)). We apply our techniques to daily deep-ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with a 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
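
    A sketch of the generation scheme described above, under stated assumptions: a long-range correlated Gaussian series is produced by Fourier filtering with a 1/f^beta spectrum, its absolute value is taken as the magnitude series (a simplification), and it is multiplied by an uncorrelated random sign series. Parameter values are illustrative.

```python
import numpy as np

def long_range_correlated(n, beta=0.8, seed=0):
    """Gaussian series with power spectrum ~ 1/f^beta, via Fourier filtering."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)            # spectral amplitudes
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (x - x.mean()) / x.std()

n = 2 ** 14
magnitude = np.abs(long_range_correlated(n, beta=0.8))     # long-range correlated magnitudes
signs = np.random.default_rng(1).choice([-1.0, 1.0], n)    # uncorrelated sign series
series = magnitude * signs                                  # nonlinear (multifractal-like) series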

  18. Duality between Time Series and Networks

    PubMed Central

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
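
    One concrete map of this kind partitions the amplitude range into Q quantiles, treats each quantile as a node and each transition between consecutive time points as a weighted directed edge; a random walk on the weighted graph then gives an approximate inverse. The sketch below follows that idea as an illustration and is not necessarily the exact map used in the paper; Q and all names are assumptions.

```python
import numpy as np

def series_to_network(x, Q=10):
    """Map a time series to a Q-node weighted transition matrix:
    node = amplitude quantile, edge weight = empirical transition probability."""
    edges = np.quantile(x, np.linspace(0, 1, Q + 1))
    labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, Q - 1)
    W = np.zeros((Q, Q))
    for a, b in zip(labels[:-1], labels[1:]):
        W[a, b] += 1
    W /= W.sum(axis=1, keepdims=True).clip(min=1)
    return W, edges

def network_to_series(W, edges, n, seed=0):
    """Approximate inverse: a random walk on the transition matrix,
    emitting the midpoint of each visited quantile."""
    rng = np.random.default_rng(seed)
    mids = 0.5 * (edges[:-1] + edges[1:])
    state, out = 0, []
    for _ in range(n):
        out.append(mids[state])
        row = W[state]
        p = row if row.sum() > 0 else np.full(len(mids), 1.0 / len(mids))
        state = rng.choice(len(mids), p=p)
    return np.array(out)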

  19. Three new defined proton affinities for polybasic molecules in the gas-phase: Proton microaffinity, proton macroaffinity and proton overallaffinity

    NASA Astrophysics Data System (ADS)

    Salehzadeh, Sadegh; Bayat, Mehdi

    2006-08-01

    A theoretical study on complete protonation of a series of tetrabasic molecules with general formula N[(CH 2) nNH 2][(CH 2) mNH 2][(CH 2) pNH 2] (tren, pee, ppe, tpt, epb and ppb) is reported. For first time, three kinds of gas-phase proton affinities for each polybasic molecule are defined as: 'proton microaffinity (PA n, i)', 'proton macroaffinity (PA)' and 'proton overall affinity ( PA)'. The variations of calculated logPA in the series of these molecules is very similar to that of their measured log Kn. There is also a good correlation between the calculated gas-phase proton macroaffinities and proton overallaffinities with corresponding equilibrium macroconstants and overall protonation constants in solution.

  20. Self-rated health: patterns in the journeys of patients with multi-morbidity and frailty.

    PubMed

    Martin, Carmel Mary

    2014-12-01

    Self-rated health (SRH) is a single-measure predictor of hospital utilization and health outcomes in epidemiological studies. There have been few studies of SRH in patient journeys in clinical settings. Reduced resilience to stressors, reflected by SRH, exposes older people (complex systems) to the risk of hospitalization. It is proposed that SRH reflects rather than predicts deteriorations and hospital use, with low SRH autocorrelation in the time series. The aim was to investigate SRH fluctuations in regular outbound telephone calls (on average biweekly) to patients by Care Guides. Descriptive case study using quantitative autoregressive techniques and qualitative case analysis of SRH time series. Fourteen participants were randomly selected from the Patient Journey Record System (PaJR) database. The PaJR database recorded 198 consecutively sampled older multi-morbid patient journeys in three primary care settings. Analysis consisted of triangulation of SRH (0 very poor - 6 excellent) patterns from three analyses: associations of SRH gradations with service utilization; time series modelling (autocorrelation and step-ahead forecasting); and qualitative categorization of deteriorations. Fourteen patients reported a mean SRH of 2.84 (poor-fair) in 818 calls over 13 ± 6.4 months of follow-up. In 24% of calls, SRH was poor-fair and significantly associated with hospital use. SRH autocorrelation was low in the 14 time series (-0.11 to 0.26) with little difference (χ² = 6.46, P = 0.91) among them. Fluctuations between better and worse health were very common and poor health was associated with hospital use. It is not clear why some patients continued on a downward trajectory, whereas others who destabilized appeared to recover completely, and even improved over time. SRH reflects an individual's complex health trajectory, but as a single measure it did not predict when and how deteriorations would occur in this study. Individual patients appear to behave as complex adaptive systems. The dynamics of SRH and its influences in destabilizations warrant further research. © 2014 John Wiley & Sons, Ltd.
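
    For the time series modelling component (autocorrelation and step-ahead forecasting), a minimal sketch is a lag-1 autocorrelation estimate and a naive AR(1) one-step-ahead forecast on a short ordinal SRH series. The ratings below are invented for illustration, not patient data.

```python
import numpy as np

def lag1_autocorr(x):
    x = np.asarray(x, float)
    x0, x1 = x[:-1] - x[:-1].mean(), x[1:] - x[1:].mean()
    return np.sum(x0 * x1) / np.sqrt(np.sum(x0 ** 2) * np.sum(x1 ** 2))

srh = np.array([3, 2, 3, 4, 2, 1, 2, 3, 3, 2, 4, 3])        # illustrative call-by-call ratings (0-6)
r1 = lag1_autocorr(srh)
forecast_next = srh.mean() + r1 * (srh[-1] - srh.mean())    # naive AR(1) step-ahead forecast
print(r1, forecast_next)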

  1. Evaluating a community-based exercise intervention with adults living with HIV: protocol for an interrupted time series study

    PubMed Central

    O'Brien, Kelly K; Bayoumi, Ahmed M; Solomon, Patricia; Tang, Ada; Murzin, Kate; Chan Carusone, Soo; Zobeiry, Mehdi; Nayar, Ayesha; Davis, Aileen M

    2016-01-01

    Introduction Our aim was to evaluate a community-based exercise (CBE) intervention with the goal of reducing disability and enhancing health for community-dwelling people living with HIV (PLWH). Methods and analysis We will use a mixed-methods implementation science study design, including a prospective longitudinal interrupted time series study, to evaluate a CBE intervention with PLWH in Toronto, Canada. We will recruit PLWH who consider themselves medically stable and safe to participate in exercise. In the baseline phase (0–8 months), participants will be monitored bimonthly. In the intervention phase (8–14 months), participants will take part in a 24-week CBE intervention that includes aerobic, resistance, balance and flexibility exercise at the YMCA 3 times per week, with weekly supervision by a fitness instructor, and monthly educational sessions. In the follow-up phase (14–22 months), participants will be encouraged to continue to engage in unsupervised exercise 3 times per week. Quantitative assessment: We will assess cardiopulmonary fitness, strength, weight, body composition and flexibility outcomes followed by the administration of self-reported questionnaires to assess disability and contextual factor outcomes (coping, mastery, stigma, social support) bimonthly. We will use time series regression analysis to determine the level and trend of outcomes across each phase in relation to the intervention. Qualitative assessment: We will conduct a series of face-to-face interviews with a subsample of participants and recreation providers at initiation, midpoint and completion of the 24-week CBE intervention. We will explore experiences and anticipated benefits with exercise, perceived impact of CBE for PLWH and the strengths and challenges of implementing a CBE intervention. Interviews will be audio recorded and analysed thematically. Ethics and dissemination Protocol approved by the University of Toronto HIV/AIDS Research Ethics Board. Knowledge translation will occur with stakeholders in the form of presentations and publications in open access peer-reviewed journals. Trial registration number NCT02794415; Pre-results. PMID:27798038
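
    The planned time series regression (level and trend of outcomes across phases) is commonly implemented as segmented regression with a phase indicator and a time-since-intervention term. The sketch below is a generic segmented-regression illustration under that assumption, with invented bimonthly data; it is not the protocol's specified model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# illustrative bimonthly outcome across baseline and intervention phases
df = pd.DataFrame({"t": np.arange(12)})
df["intervention"] = (df["t"] >= 4).astype(int)          # phase indicator
df["t_since"] = np.maximum(df["t"] - 4, 0)               # time since intervention start
rng = np.random.default_rng(3)
df["outcome"] = (50 + 0.5 * df["t"] + 4 * df["intervention"]
                 + 1.5 * df["t_since"] + rng.normal(0, 1, len(df)))

# level change = 'intervention' coefficient, trend change = 't_since' coefficient
model = smf.ols("outcome ~ t + intervention + t_since", data=df).fit()
print(model.params)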

  2. Institute of Educational Technology. Series List. Reports and Papers Produced in Series 1969-1991. A Companion to "The Institute Bibliography."

    ERIC Educational Resources Information Center

    Whitehead, Don J., Comp.

    Compiled to make available details of reports which contain the record of work by the research groups of the Open University's Institute of Educational Technology, this list is divided into two sections: (1) Completed Series; and (2) Current Series. Within each section the lists are placed in alphabetical order by program and, where known, the…

  3. Evaluation of Mechanisms to Improve Performance of Mobile Phone Surveys in Low- and Middle-Income Countries: Research Protocol.

    PubMed

    Gibson, Dustin G; Pariyo, George William; Wosu, Adaeze C; Greenleaf, Abigail R; Ali, Joseph; Ahmed, Saifuddin; Labrique, Alain B; Islam, Khaleda; Masanja, Honorati; Rutebemberwa, Elizeus; Hyder, Adnan A

    2017-05-05

    Mobile phone ownership and access have increased rapidly across low- and middle-income countries (LMICs) within the last decade. Concomitantly, LMICs are experiencing demographic and epidemiologic transitions, where non-communicable diseases (NCDs) are increasingly becoming leading causes of morbidity and mortality. Mobile phone surveys could aid data collection for prevention and control of these NCDs but limited evidence of their feasibility exists. The objective of this paper is to describe a series of sub-studies aimed at optimizing the delivery of interactive voice response (IVR) and computer-assisted telephone interviews (CATI) for NCD risk factor data collection in LMICs. These sub-studies are designed to assess the effect of factors such as airtime incentive timing, amount, and structure, survey introduction characteristics, different sampling frames, and survey modality on key survey metrics, such as survey response, completion, and attrition rates. In a series of sub-studies, participants will be randomly assigned to receive different airtime incentive amounts (eg, 10 minutes of airtime versus 20 minutes of airtime), different incentive delivery timings (airtime delivered before survey begins versus delivery upon completion of survey), different survey introductions (informational versus motivational), different narrative voices (male versus female), and different sampling frames (random digit dialing versus mobile network operator-provided numbers) to examine which study arms will yield the highest response and completion rates. Furthermore, response and completion rates and the inter-modal reliability of the IVR and CATI delivery methods will be compared. Research activities are expected to be completed in Bangladesh, Tanzania, and Uganda in 2017. This is one of the first studies to examine the feasibility of using IVR and CATI for systematic collection of NCD risk factor information in LMICs. Our findings will inform the future design and implementation of mobile phone surveys in LMICs. ©Dustin G Gibson, George William Pariyo, Adaeze C Wosu, Abigail R Greenleaf, Joseph Ali, Saifuddin Ahmed, Alain B Labrique, Khaleda Islam, Honorati Masanja, Elizeus Rutebemberwa, Adnan A Hyder. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 05.05.2017.

  4. Inductive reasoning and implicit memory: evidence from intact and impaired memory systems.

    PubMed

    Girelli, Luisa; Semenza, Carlo; Delazer, Margarete

    2004-01-01

    In this study, we modified a classic problem-solving task, number series completion, in order to explore the contribution of implicit memory to inductive reasoning. Participants were required to complete number series sharing the same underlying algorithm (e.g., +2), differing in both constituent elements (e.g., 2 4 6 8 versus 5 7 9 11) and correct answers (e.g., 10 versus 13). In Experiment 1, reliable priming effects emerged, whether primes and targets were separated by four or ten fillers. Experiment 2 provided direct evidence that the observed facilitation arises at central stages of problem solving, namely the identification of the algorithm and its subsequent extrapolation. The observation of analogous priming effects in a severely amnesic patient strongly supports the hypothesis that the facilitation in number series completion was largely determined by implicit memory processes. These findings demonstrate that the influence of implicit processes extends to higher-level cognitive domains such as inductive reasoning.
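
    The two stages the experiment isolates, identifying the algorithm and extrapolating it, can be illustrated for constant-difference series with a few lines of Python; this is only a didactic sketch of the task, not part of the study.

```python
def complete_series(series):
    """Infer a constant additive rule from a number series and extrapolate it."""
    diffs = {b - a for a, b in zip(series, series[1:])}
    if len(diffs) != 1:
        raise ValueError("no single additive rule fits this series")
    step = diffs.pop()
    return series[-1] + step

print(complete_series([2, 4, 6, 8]))     # -> 10
print(complete_series([5, 7, 9, 11]))    # -> 13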

  5. Detection of a sudden change of the field time series based on the Lorenz system

    PubMed Central

    Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes of the pressure field time series and the temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; thus, we provide a new method for the detection of a sudden change of the field time series. PMID:28141832

  6. Correlates of human papillomavirus vaccination rates in low-income, minority adolescents: a multicenter study.

    PubMed

    Perkins, Rebecca B; Brogly, Susan B; Adams, William G; Freund, Karen M

    2012-08-01

    Low rates of human papillomavirus (HPV) vaccination in low-income, minority adolescents may exacerbate racial disparities in cervical cancer incidence. Using electronic medical record data and chart abstraction, we examined correlates of HPV vaccine series initiation and completion among 7702 low-income and minority adolescents aged 11-21 receiving primary care at one of seven medical centers between May 1, 2007, and June 30, 2009. Our population included 61% African Americans, 13% Caucasians, 15% Latinas, and 11% other races; 90% receive public insurance (e.g., Medicaid). We used logistic regression to estimate the associations between vaccine initiation and completion and age, race/ethnicity, number of contacts with the healthcare system, provider documentation, and clinical site of care. Of the 41% of adolescent girls who initiated HPV vaccination, 20% completed the series. A higher proportion of girls aged 11-<13 (46%) and 13-<18 (47%) initiated vaccination than those aged 18-21 (28%). In adjusted analyses, receipt of other recommended adolescent vaccines was associated with vaccine initiation, and increased contact with the medical system was associated with both initiation and completion of the series. Conversely, provider failure to document risky health behaviors predicted nonvaccination. Manual review of a subset of unvaccinated patients' charts revealed no documentation of vaccine discussions in 67% of cases. Fewer than half of low-income and minority adolescents receiving health maintenance services initiated HPV vaccination, and only 20% completed the series. Provider failure to discuss vaccination with their patients appears to be an important contributor to nonvaccination. Future research should focus on improving both initiation and completion of HPV vaccination in high-risk adolescents.

  7. Predicting Long-term Temperature Increase for Time-Dependent SAR Levels with a Single Short-term Temperature Response

    PubMed Central

    Carluccio, Giuseppe; Bruno, Mary; Collins, Christopher M.

    2015-01-01

    Purpose Present a novel method for rapid prediction of temperature in vivo for a series of pulse sequences with differing levels and distributions of specific energy absorption rate (SAR). Methods After the temperature response to a brief period of heating is characterized, a rapid estimate of temperature during a series of periods at different heating levels is made using a linear heat equation and Impulse-Response (IR) concepts. Here the initial characterization and long-term prediction for a complete spine exam are made with the Pennes’ bioheat equation where, at first, core body temperature is allowed to increase and local perfusion is not. Then corrections through time allowing variation in local perfusion are introduced. Results The fast IR-based method predicted maximum temperature increase within 1% of that with a full finite difference simulation, but required less than 3.5% of the computation time. Even higher accelerations are possible depending on the time step size chosen, with loss in temporal resolution. Correction for temperature-dependent perfusion requires negligible additional time, and can be adjusted to be more or less conservative than the corresponding finite difference simulation. Conclusion With appropriate methods, it is possible to rapidly predict temperature increase throughout the body for actual MR examinations. PMID:26096947
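
    Under a linear heat equation, the impulse-response idea amounts to convolving the SAR time course with the temperature response to a brief unit-SAR heating pulse. The sketch below shows only that convolution step with an assumed exponential impulse response and invented SAR levels; it is not the authors' implementation and omits the perfusion correction.

```python
import numpy as np

def predict_temperature(sar_timecourse, impulse_response, dt):
    """Linear prediction: dT(t) = (SAR * h)(t)*dt, where h is the measured
    temperature response to a brief unit-SAR heating pulse."""
    return np.convolve(sar_timecourse, impulse_response)[:len(sar_timecourse)] * dt

# assumed impulse response: fast rise, slow perfusion-driven decay (per time step dt)
dt = 1.0                                    # seconds
t = np.arange(0, 600, dt)
h = 0.002 * np.exp(-t / 150.0)              # illustrative, not a measured response

# SAR levels (W/kg) for a series of pulse sequences, e.g. a multi-sequence exam
sar = np.concatenate([np.full(200, 2.0), np.full(200, 0.5), np.full(200, 3.0)])
dT = predict_temperature(sar, h, dt)
print(dT.max())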

  8. Predicting long-term temperature increase for time-dependent SAR levels with a single short-term temperature response.

    PubMed

    Carluccio, Giuseppe; Bruno, Mary; Collins, Christopher M

    2016-05-01

    Present a novel method for rapid prediction of temperature in vivo for a series of pulse sequences with differing levels and distributions of specific energy absorption rate (SAR). After the temperature response to a brief period of heating is characterized, a rapid estimate of temperature during a series of periods at different heating levels is made using a linear heat equation and impulse-response (IR) concepts. Here the initial characterization and long-term prediction for a complete spine exam are made with the Pennes' bioheat equation where, at first, core body temperature is allowed to increase and local perfusion is not. Then corrections through time allowing variation in local perfusion are introduced. The fast IR-based method predicted maximum temperature increase within 1% of that with a full finite difference simulation, but required less than 3.5% of the computation time. Even higher accelerations are possible depending on the time step size chosen, with loss in temporal resolution. Correction for temperature-dependent perfusion requires negligible additional time and can be adjusted to be more or less conservative than the corresponding finite difference simulation. With appropriate methods, it is possible to rapidly predict temperature increase throughout the body for actual MR examinations. © 2015 Wiley Periodicals, Inc.

  9. Proba-V Mission Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Goor, Erwin; Dries, Jeroen

    2017-04-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015 an operational Proba-V MEP environment, as an ESA operations service, has been gradually deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016 the platform has been operational and several applications have already been released to users, e.g.: - A time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest. - Full-resolution viewing services for the complete data archive. - On-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites. - Virtual Machines can be provided with access to the data archive and tools to work with these data, e.g. various toolboxes (GDAL, QGIS, GrassGIS, SNAP toolbox, …) and support for R and Python. This allows users to immediately work with the data without having to install tools or download data, as well as to design, debug and test applications on the platform. - A prototype of Jupyter Notebooks is available with some worked examples showing the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel the platform is being further improved and extended. From the Proba-V MEP, access to Sentinel-2 and Landsat data will also be available soon. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components (e.g. Geotrellis). The impact of this MEP on the user community will be high and will completely change the way of working with the data, hence opening the large time series to a larger community of users. The presentation will address these benefits for the users and discuss the technical challenges in implementing this MEP. Furthermore, demonstrations will be given. Platform URL: https://proba-v-mep.esa.int/

  10. Estimation of surface heat and moisture fluxes over a prairie grassland. I - In situ energy budget measurements incorporating a cooled mirror dew point hygrometer

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.; Crosson, William L.; Tanner, Bertrand D.

    1992-01-01

    Attention is focused on in situ measurements taken during FIFE required to support the development and validation of a biosphere model. Seasonal time series of surface flux measurements obtained from two surface radiation and energy budget stations utilized to support the FIFE surface flux measurement subprogram are examined. Data collection and processing procedures are discussed along with the measurement analysis for the complete 1987 test period.

  11. Are Graduate Students Rational? Evidence from the Market for Biomedical Scientists

    PubMed Central

    Blume-Kohout, Margaret E.; Clack, John W.

    2013-01-01

    The U.S. National Institutes of Health (NIH) budget expansion from 1998 through 2003 increased demand for biomedical research, raising relative wages and total employment in the market for biomedical scientists. However, because research doctorates in biomedical sciences can often take six years or more to complete, the full labor supply response to such changes in market conditions is not immediate, but rather is observed over a period of several years. Economic rational expectations models assume that prospective students anticipate these future changes, and also that students take into account the opportunity costs of their pursuing graduate training. Prior empirical research on student enrollment and degree completions in science and engineering (S&E) fields indicates that “cobweb” expectations prevail: that is, at least in theory, prospective graduate students respond to contemporaneous changes in market wages and employment, but do not forecast further changes that will arise by the time they complete their degrees and enter the labor market. In this article, we analyze time-series data on wages and employment of biomedical scientists versus alternative careers, on completions of S&E bachelor's degrees and biomedical sciences PhDs, and on research expenditures funded both by NIH and by biopharmaceutical firms, to examine the responsiveness of the biomedical sciences labor supply to changes in market conditions. Consistent with previous studies, we find that enrollments and completions in biomedical sciences PhD programs are responsive to market conditions at the time of students' enrollment. More striking, however, is the close correspondence between graduate student enrollments and completions, and changes in availability of NIH-funded traineeships, fellowships, and research assistantships. PMID:24376573

  12. Computer Assisted Instruction in Teacher Education: A Full Length Course.

    ERIC Educational Resources Information Center

    Cartwright, G. Phillip

    Pennsylvania State University has developed, evaluated, and implemented a series of modules and an entire three-credit teacher education course which is offered completely by microcomputer. The course is entitled "Educating Special Learners." The modules use the Apple II series and the IBM PC series. Evaluation of the course, based on…

  13. Atomic Fuel, Understanding the Atom Series. Revised.

    ERIC Educational Resources Information Center

    Hogerton, John F.

    This publication is part of the "Understanding the Atom" series. Complete sets of the series are available free to teachers, schools, and public librarians who can make them available for reference or use by groups. Among the topics discussed are: What Atomic Fuel Is; The Odyssey of Uranium; Production of Uranium; Fabrication of Reactor…

  14. Piezometer completion report for borehole cluster sites DC-19, DC-20, and DC-22

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, R.L.; Diediker, L.D.; Ledgerwood, R.K.

    1984-07-01

    This report describes the design and installation of multi-level piezometers at borehole cluster sites DC-19, DC-20 and DC-22. The network of borehole cluster sites will provide facilities for multi-level water-level monitoring across the RRL for piezometer baseline monitoring and for large-scale hydraulic stress testing. These groundwater-monitoring facilities were installed between August 1983 and March 1984. Three series of piezometer nests (A-, C- and D-series) were installed in nine hydrogeologic units (monitoring horizons) within the Columbia River Basalt Group at each borehole cluster site. In addition to the piezometer facilities, a B-series pumping well was installed at borehole cluster sites DC-20 and DC-22. The A-series piezometer nest monitors the basal Ringold sediments and the Rattlesnake Ridge interbed. The C-series piezometer nests monitor the six deepest horizons, which are, in order of increasing depth, the Priest Rapids interflow, the Sentinel Gap flow top, the Ginkgo flow top, the Rocky Coulee flow top, the Cohassett flow top and the Umtanum flow top. The D-series piezometer monitors the Mabton interbed. The B-series pumping well was completed in the Priest Rapids interflow. 21 refs., 6 figs., 6 tabs.

  15. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chichester, Heather Jean MacLean; Hayes, Steven Lowe; Dempsey, Douglas

    This report summarizes the objectives of the current irradiation testing activities being undertaken by the Advanced Fuels Campaign relative to supporting the development and demonstration of innovative design features for metallic fuels in order to realize reliable performance to ultra-high burnups. The AFC-3 and AFC-4 test series are nearing completion; the experiments in these test series that have been completed or are in progress are reviewed, and the objectives and test matrices for the final experiments in these two series are defined. The objectives, testing strategy, and test parameters associated with a future AFC test series, AFC-5, are documented. Finally, the future intersections and/or synergies of the AFC irradiation testing program with those of the TREAT transient testing program, the emerging needs of proposed Versatile Test Reactor concepts, and the Joint Fuel Cycle Study program's Integrated Recycle Test are discussed.

  17. Beginning of grain harvest in the tri-border region Basel as a proxy for mean April - May temperatures; creation of a long Swiss series c. 1455 AD - 1950 AD

    NASA Astrophysics Data System (ADS)

    Wetter, Oliver; Pfister, Christian

    2010-05-01

    Before agricultural harvesting machines replaced manual labour, the date of the grain harvest was largely dependent on mean temperatures from spring to early summer. It thus constitutes a very valuable source of information for reconstructing these temperatures: the later the harvest began, the cooler spring and early summer must have been, and vice versa. For this reconstruction a new data series of grain harvest dates in the tri-border region of Basel (representative of north-west Switzerland, Alsace (France) and south-west Germany) was used as a temperature proxy. The harvesting dates have been extracted from the account books of the hospital of Basel, which cover the period from c. 1454 AD to 1705 AD. This series could be completed with several series of grain tithe dates originating from the Swiss Midland, covering the period between 1557 and 1825, and with several grain harvest date series covering the time between 1825 and 1950. Thus a series of almost 500 years could be compiled. Since the method of harvesting remained unchanged until the 1950s, when manual labour was replaced by machines, the harvest dates of the modern series, which lie within the period of instrumental temperature measurements, could be used for calibrating the medieval dates.

  18. Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.

    2009-04-01

    The subdivision of a time series into homogeneous segments has been performed using various methods applied in different disciplines. In climatology, for example, it is associated with the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model is applied, estimating the parameters and the best state sequence with the Baum-Welch and Viterbi algorithms, respectively. In order to avoid the well-known dependence of the Baum-Welch algorithm on the initial conditions, a Genetic Algorithm was developed, characterized by mutation, elitism and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized was derived following the approach of Kehagias (2004), i.e. it is the so-called complete log-likelihood. The number of states was determined by applying a two-fold cross-validation procedure (Celeux and Durand, 2008). Because this last issue is complex and influences the whole analysis, a Multi-Response Permutation Procedure (MRPP; Mielke et al., 1981) was added: it tests the model with K+1 states (where K is the number of states of the best model) when its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break-detection method in the field of climate time series homogenization, is shown. References: 1. G. Celeux and J.B. Durand, Comput Stat, 2008. 2. A. Kehagias, Stoch Envir Res, 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev, 1981.
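
    The GA addresses the sensitivity of Baum-Welch to its initialization. A crude stand-in, shown below only to make the idea concrete, is to run several randomly initialized fits and keep the one with the best log-likelihood, then read the Viterbi state sequence as a segmentation. The sketch uses the hmmlearn package, omits the left-to-right constraint and the cross-validated choice of the number of states, and is not the GAMM method itself.

```python
import numpy as np
from hmmlearn import hmm

def best_hmm_segmentation(x, n_states=3, n_restarts=20):
    """Fit Gaussian HMMs from many random starts (in place of the GA) and
    return the Viterbi state sequence of the best-scoring model."""
    X = np.asarray(x, float).reshape(-1, 1)
    best_model, best_ll = None, -np.inf
    for seed in range(n_restarts):
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="full",
                                n_iter=200, random_state=seed)
        model.fit(X)
        ll = model.score(X)
        if ll > best_ll:
            best_model, best_ll = model, ll
    return best_model.predict(X), best_ll      # Viterbi states define the segments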

  19. A solid solution series of atacamite type Ni2xMg2−2xCl(OH)3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bette, Sebastian; Dinnebier, Robert E.; Röder, Christian

    2015-08-15

    For the first time a complete solid solution series Ni2xMg2−2xCl(OH)3 of an atacamite-type alkaline main group metal chloride, Mg2Cl(OH)3, and a transition metal chloride, Ni2Cl(OH)3, was prepared and characterized by chemical and thermal analysis as well as by Raman and IR spectroscopy and high-resolution laboratory X-ray powder diffraction. All members of the solid solution series crystallize in space group Pnam (62). The main building units of these crystal structures are distorted, edge-linked Ni/MgO4Cl2 and Ni/MgO5Cl octahedra. The distribution of Ni²⁺ and Mg²⁺ ions between these two metal sites within the solid solution series is discussed in detail. The crystallization of the solid solution phases occurs via an intermediate solid solution series, (Ni/Mg)Cl2x(OH)2−2x, with a variable Cl:OH ratio up to the 1:3 ratio corresponding to the formula Ni2xMg2−2xCl(OH)3. For one isolated intermediate solid solution member, Ni0.70Mg0.30Cl0.58(OH)1.42, the formation and crystal structure are presented as well. - Graphical abstract: For the first time a complete solid solution series, Ni2xMg2−2xCl(OH)3, was synthesized and characterized. Structure solution revealed that Ni²⁺ prefers to occupy the Jahn-Teller-like distorted hole out of the two available cation sites. Substitution of Ni²⁺ by Mg²⁺ in atacamite-type Ni2Cl(OH)3 results in systematic band shifts in the Raman and IR spectra as well as in systematic changes in thermal properties. The α-polymorphs M2Cl(OH)3 with M = Mg²⁺, Ni²⁺ and other divalent transition metal ions, as described in the literature, were identified as separate compounds. - Highlights: • First synthesis of a solid solution series between a main group and a transition metal chloride. • Ni²⁺ prefers to occupy Jahn-Teller-like distorted octahedral holes. • Substitution of Ni²⁺ by Mg²⁺ results in systematic Raman and IR band shifts. • The α-polymorphs M2Cl(OH)3 with M = Mg²⁺, Ni²⁺, … as described in the literature do not exist.

  20. Quantifying nonstationary radioactivity concentration fluctuations near Chernobyl: A complete statistical description

    NASA Astrophysics Data System (ADS)

    Viswanathan, G. M.; Buldyrev, S. V.; Garger, E. K.; Kashpur, V. A.; Lucena, L. S.; Shlyakhter, A.; Stanley, H. E.; Tschiersch, J.

    2000-09-01

    We analyze nonstationary ¹³⁷Cs atmospheric activity concentration fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram of fluctuations is well described by a log-normal distribution; (ii) there is a pronounced spectral component with period T = 1 yr; and (iii) the fluctuations are long-range correlated. These findings allow us to quantify two fundamental statistical properties of the data: the probability distribution and the correlation properties of the time series. We interpret our findings as evidence that the atmospheric radionuclide resuspension processes are tightly coupled to the surrounding ecosystems and to large time scale weather patterns.
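
    Two of the descriptive steps can be sketched as follows: fit a log-normal distribution to the concentration values and look for power near the annual frequency in the periodogram of the log series; the long-range correlation analysis (e.g., DFA) is omitted. The data generated here are synthetic and only illustrate the shape of the computation.

```python
import numpy as np
from scipy import stats

def describe_concentrations(c, samples_per_year):
    """c: positive activity-concentration samples, regularly spaced in time."""
    # (i) log-normal fit to the amplitude distribution
    shape, loc, scale = stats.lognorm.fit(c, floc=0)
    # (ii) periodogram of the log series; inspect power near f = 1 cycle/yr
    x = np.log(c) - np.mean(np.log(c))
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / samples_per_year)   # cycles per year
    annual_bin = np.argmin(np.abs(freqs - 1.0))
    return (shape, scale), freqs[annual_bin], power[annual_bin]

# illustrative series: ten years of weekly samples with an annual modulation
rng = np.random.default_rng(4)
t = np.arange(0, 10, 1 / 52)
c = np.exp(0.5 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size))
print(describe_concentrations(c, samples_per_year=52))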

  1. On the asymptotic optimality and improved strategies of SPTB heuristic for open-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Bai, Danyu; Zhang, Zhihai

    2014-08-01

    This article investigates the open-shop scheduling problem with the criterion of minimising the sum of quadratic completion times. For this NP-hard problem, the asymptotic optimality of the shortest processing time block (SPTB) heuristic is proven in the limit. Moreover, three different improvements, namely the job-insert scheme, tabu search and a genetic algorithm, are introduced to enhance the quality of the original solution generated by the SPTB heuristic. At the end of the article, a series of numerical experiments demonstrates the convergence of the heuristic, the performance of the improvements and the effectiveness of the quadratic objective.
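
    The exact SPTB block construction is not reproduced here; as a rough illustration of an SPT-style dispatching rule for the open shop and of the quadratic objective, the sketch below repeatedly starts the shortest unscheduled operation whose job and machine are idle and then evaluates the sum of squared job completion times. All details are assumptions.

```python
import numpy as np

def spt_open_shop(p):
    """p[j][m] = processing time of job j on machine m (open shop: any operation order).
    Greedy SPT-style dispatch; returns the sum of squared job completion times."""
    n_jobs, n_mach = p.shape
    job_free = np.zeros(n_jobs)      # time at which each job becomes available
    mach_free = np.zeros(n_mach)     # time at which each machine becomes available
    done = np.zeros_like(p, dtype=bool)
    completion = np.zeros(n_jobs)
    for _ in range(n_jobs * n_mach):
        best, best_key = None, None
        for j in range(n_jobs):
            for m in range(n_mach):
                if done[j, m]:
                    continue
                start = max(job_free[j], mach_free[m])
                key = (p[j, m], start)              # shortest processing time first
                if best_key is None or key < best_key:
                    best, best_key = (j, m, start), key
        j, m, start = best
        finish = start + p[j, m]
        job_free[j] = mach_free[m] = finish
        completion[j] = max(completion[j], finish)
        done[j, m] = True
    return float(np.sum(completion ** 2))

print(spt_open_shop(np.array([[3, 2], [1, 4], [2, 2]])))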

  2. Testing a Nursing-Specific Model of Electronic Patient Record documentation with regard to information completeness, comprehensiveness and consistency.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn; Aasland, Olaf Gjerløw

    2012-10-01

    To present the results from the test site application of the documentation model KPO (quality assurance, problem solving and caring) designed to impact the quality of nursing information in the electronic patient record (EPR). The KPO model was developed by means of consensus group and clinical testing. Four documentation arenas and eight content categories, nursing terminologies and a decision-support system were designed to impact the completeness, comprehensiveness and consistency of nursing information. The testing was performed in a pre-test/post-test time series design, three times at a one-year interval. Content analysis of nursing documentation was accomplished through the identification, interpretation and coding of information units. Data from the pre-test and post-test 2 were subjected to statistical analyses. To estimate the differences, paired t-tests were used. At post-test 2, the information was found to be more complete, comprehensive and consistent than at pre-test. The findings indicate that documentation arenas combining work flow and content categories deduced from theories on nursing practice can influence the quality of nursing information. The KPO model can be used as a guide when shifting from paper-based to electronic-based nursing documentation with the aim of obtaining complete, comprehensive and consistent nursing information. © 2012 Blackwell Publishing Ltd.

  3. The Multiple Sclerosis Self-Management Scale

    PubMed Central

    Ghahari, Setareh; Khoshbin, Lana S.

    2014-01-01

    Background: The Multiple Sclerosis Self-Management Scale (MSSM) is currently the only measure that was developed specifically to address self-management among individuals with multiple sclerosis (MS). While good internal consistency (α = 0.85) and construct validity have been demonstrated, other psychometric properties have not been established. This study was undertaken to evaluate the criterion validity, test-retest reliability, and face validity of the MSSM. Methods: Thirty-one individuals with MS who met the inclusion criteria were recruited to complete a series of questionnaires at two time points. At Time 1, participants completed the MSSM and two generic self-management tools—the Partners in Health (PIH-12) and the Health Education Impact Questionnaire (heiQ)—as well as a short questionnaire to capture participants' opinions about the MSSM. At Time 2, approximately 2 weeks after Time 1, participants completed the MSSM again. Results: The available MSSM factors showed moderate to high correlations with both PIH-12 and heiQ and were deemed to have satisfactory test-retest reliability. Face validity pointed to areas of the MSSM that need to be revised in future work. As indicated by the participants, some dimensions of MS self-management are missing in the MSSM and some items such as medication are redundant. Conclusions: This study provides evidence for the reliability and validity of the MSSM; however, further changes are required for both researchers and clinicians to use the tool meaningfully in practice. PMID:25061429

  4. Efficient Algorithms for Segmentation of Item-Set Time Series

    NASA Astrophysics Data System (ADS)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
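    The sketch below illustrates the segmentation machinery described above under two simplifying assumptions: set intersection is used as the measure function and the summed symmetric-difference size as the segment difference (the paper defines several alternatives), and a plain dynamic program splits the series into k contiguous segments of minimum total difference.

    ```python
    # Minimal sketch of optimal segmentation of an item-set time series.
    # Assumptions: measure function = set intersection; segment difference =
    # total symmetric-difference size against each time point's item set.
    from functools import reduce

    def segment_difference(sets, i, j):
        """Difference value for the segment covering time points i..j (inclusive)."""
        seg_set = reduce(set.intersection, sets[i:j + 1])
        return sum(len(seg_set ^ s) for s in sets[i:j + 1])

    def optimal_segmentation(sets, k):
        """Dynamic program: split `sets` into k contiguous segments minimizing
        the summed segment differences; returns (cost, boundaries)."""
        n = len(sets)
        INF = float("inf")
        cost = [[INF] * (k + 1) for _ in range(n + 1)]
        back = [[0] * (k + 1) for _ in range(n + 1)]
        cost[0][0] = 0
        for end in range(1, n + 1):
            for segs in range(1, min(k, end) + 1):
                for start in range(segs - 1, end):
                    c = cost[start][segs - 1] + segment_difference(sets, start, end - 1)
                    if c < cost[end][segs]:
                        cost[end][segs], back[end][segs] = c, start
        # recover segment boundaries by walking the back-pointers
        bounds, end, segs = [], n, k
        while segs:
            start = back[end][segs]
            bounds.append((start, end - 1))
            end, segs = start, segs - 1
        return cost[n][k], bounds[::-1]

    series = [{"a", "b"}, {"a", "b"}, {"a", "c"}, {"c", "d"}, {"c", "d"}]
    print(optimal_segmentation(series, 2))
    ```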

  5. Surgical management of giant sphenoid wing meningiomas encasing major cerebral arteries.

    PubMed

    Champagne, Pierre-Olivier; Lemoine, Emile; Bojanowski, Michel W

    2018-04-01

    OBJECTIVE Sphenoid wing meningiomas are a heterogeneous group of tumors with variable surgical risks and prognosis. Those that have grown to a very large size, encasing the major cerebral arteries, are associated with a high risk of stroke. In reviewing the authors' series of giant sphenoid wing meningiomas, the goal was to evaluate how the extent of the tumor's invasion of surrounding structures affected the ability to safely remove the tumor and restore function. METHODS The authors conducted a retrospective study of a series of giant sphenoid wing meningiomas operated on between 1996 and 2016. Inclusion criteria were meningiomas with a globoid component ≥ 6 cm, encasing at least 1 major intradural cerebral artery. Extent of resection was measured according to Simpson grade. RESULTS This series included 12 patients, with a mean age of 59 years. Visual symptoms were the most common clinical presentation. There was complete or partial encasement of all 3 major cerebral arteries except for 3 cases in which only the anterior cerebral artery was not involved. The lateral wall of the cavernous sinus was invaded in 8 cases (67%) and the optic canal in 6 (50%). Complete resection was achieved in 2 cases (Simpson grades 2 and 3). In the remaining 10 cases of partial resection (Simpson grade 4), radical removal (> 90%) was achieved in 7 cases (70%). In the immediate postoperative period, there were no deaths. Four of 9 patients with visual deficits improved, while the 5 others remained unchanged. Two patients experienced transient neurological deficits. Other than an asymptomatic lacuna of the internal capsule, there were no ischemic lesions following surgery. Tumor recurrence occurred in 5 patients, between 24 and 168 months (mean 61 months) following surgery. CONCLUSIONS Although these giant lesions encasing major cerebral arteries are particularly treacherous for surgery, this series demonstrates that it is possible to safely achieve radical removal and at times even gross-total resection. However, the risk of recurrence remains high and larger studies are needed to see if and how improvement can be achieved, whether in surgical technique or technological advances, and by determining the timing and modality of adjuvant radiation therapy.

  6. A novel weight determination method for time series data aggregation

    NASA Astrophysics Data System (ADS)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
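    A minimal sketch of the combination step follows, with two stand-ins for the paper's operators: exponential time-decay weights play the role of the IOWA part, and normalized degrees of the natural visibility graph play the role of the VGA part; the parameter `alpha` blends the two. All values are illustrative assumptions.

    ```python
    # Minimal sketch: blend time-decay weights with visibility-graph degree weights.
    import numpy as np

    def visibility_degrees(x):
        """Degree of each point in the natural visibility graph of series x."""
        n = len(x)
        deg = np.zeros(n)
        for i in range(n):
            for j in range(i + 1, n):
                # i and j "see" each other if no intermediate point blocks the line
                visible = all(
                    x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                    for k in range(i + 1, j)
                )
                if visible:
                    deg[i] += 1
                    deg[j] += 1
        return deg

    def combined_weights(x, alpha=0.5, decay=0.9):
        n = len(x)
        w_decay = decay ** np.arange(n - 1, -1, -1)      # newer points weigh more
        w_decay /= w_decay.sum()
        w_vg = visibility_degrees(x)
        w_vg /= w_vg.sum()
        return alpha * w_decay + (1 - alpha) * w_vg

    x = np.array([3.0, 1.0, 4.0, 1.5, 5.0, 2.0, 6.0])
    w = combined_weights(x)
    print(w, float(np.dot(w, x)))                        # weights and aggregated value
    ```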

  7. Absolute continuity for operator valued completely positive maps on C∗-algebras

    NASA Astrophysics Data System (ADS)

    Gheondea, Aurelian; Kavruk, Ali Şamil

    2009-02-01

    Motivated by applicability to quantum operations, quantum information, and quantum probability, we investigate the notion of absolute continuity for operator valued completely positive maps on C∗-algebras, previously introduced by Parthasarathy [in Athens Conference on Applied Probability and Time Series Analysis I (Springer-Verlag, Berlin, 1996), pp. 34-54]. We obtain an intrinsic definition of absolute continuity, we show that the Lebesgue decomposition defined by Parthasarathy is the maximal one among all other Lebesgue-type decompositions and that this maximal Lebesgue decomposition does not depend on the jointly dominating completely positive map, we obtain more flexible formulas for calculating the maximal Lebesgue decomposition, and we point out the nonuniqueness of the Lebesgue decomposition as well as a sufficient condition for uniqueness. In addition, we consider Radon-Nikodym derivatives for absolutely continuous completely positive maps that, in general, are unbounded positive self-adjoint operators affiliated to a certain von Neumann algebra, and we obtain a spectral approximation by bounded Radon-Nikodym derivatives. An application to the existence of the infimum of two completely positive maps is indicated, and formulas in terms of Choi's matrices for the Lebesgue decomposition of completely positive maps in matrix algebras are obtained.

  8. Finding Your Workforce: The Top 25 Institutions Graduating Latinos in Science, Technology, Engineering, and Math (STEM) by Academic Level--2009-10. Third in a Series Linking College Completion with U.S. Workforce Needs

    ERIC Educational Resources Information Center

    Santiago, Deborah; Soliz, Megan

    2012-01-01

    Drawing attention to the institutions graduating Latinos in postsecondary education links the college completion goals of the U.S. with the workforce needs of the country. This third brief in the Finding Your Workforce series provides a summary of the top 25 institutions at each academic level graduating Latinos from certificates to doctoral…

  9. Teleoperation in surgical robotics--network latency effects on surgical performance.

    PubMed

    Lum, Mitchell J H; Rosen, Jacob; King, Hawkeye; Friedman, Diana C W; Lendvay, Thomas S; Wright, Andrew S; Sinanan, Mika N; Hannaford, Blake

    2009-01-01

    A teleoperated surgical robotic system allows surgical procedures to be conducted across long distances while utilizing wired and wireless communication with a wide spectrum of performance that may affect the outcome. An open architecture portable surgical robotic system (Raven) was developed for both open and minimally invasive surgery. The system has been the subject of an intensive telesurgical experimental protocol aimed at exploring the boundaries of the system and surgeon performance during a series of field experiments in extreme environments (desert and underwater) with teleoperation between the US, Europe, and Japan, as well as lab experiments under synthetic fixed time delay. One standard task (block transfer emulating tissue manipulation) of the Fundamentals of Laparoscopic Surgery (FLS) training kit was used for the experimental protocol. Network characterization indicated a typical time delay in the range of 16-172 ms in field experiments. The results of the lab experiments showed that the completion time of the task as well as the length of the tool tip trajectory significantly increased (alpha < 0.02) as the time delay increased in the range of 0-0.5 sec. For teleoperation with a time delay of 0.25 s and 0.5 s the task completion time was lengthened by a factor of 1.45 and 2.04 with respect to no time delay, whereas the length of the tools' trajectory increased by a factor of 1.28 and 1.53 with respect to no time delay. There were no statistical differences between experienced surgeons and non-surgeons in the number of errors (block dropping), the completion time, or the tool tip path length at different time delays.

  10. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    NASA Astrophysics Data System (ADS)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.

  11. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
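    The toy sketch below conveys the core idea of such a feature-based representation on synthetic data: each series is reduced to a handful of generic features, which can then drive classification. The feature set is a tiny illustrative subset, not the library of thousands of methods analysed in the paper.

    ```python
    # Minimal sketch: reduced feature representation of time series plus a
    # nearest-centroid classification in feature space (synthetic data).
    import numpy as np

    def ts_features(x):
        """Reduce a time series to a small vector of interpretable features."""
        x = np.asarray(x, dtype=float)
        diff = np.diff(x)
        lag1 = ((x[:-1] - x.mean()) * (x[1:] - x.mean())).mean() / x.var()
        skew = ((x - x.mean()) ** 3).mean() / x.std() ** 3
        return np.array([x.mean(), x.std(), lag1, np.abs(diff).mean(), skew])

    # Two synthetic classes: noisy sine waves vs. white noise
    rng = np.random.default_rng(1)
    t = np.linspace(0, 10, 200)
    series = [np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size) for _ in range(20)]
    series += [rng.standard_normal(t.size) for _ in range(20)]
    labels = np.array([0] * 20 + [1] * 20)

    X = np.vstack([ts_features(s) for s in series])
    centroids = np.vstack([X[labels == c].mean(axis=0) for c in (0, 1)])
    pred = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    print("training accuracy:", (pred == labels).mean())
    ```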

  12. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  13. Promotora outreach, education and navigation support for HPV vaccination to Hispanic women with unvaccinated daughters

    PubMed Central

    Parra-Medina, Deborah; Morales-Campos, Daisy Y.; Mojica, Cynthia; Ramirez, Amelie G.

    2015-01-01

    Background Cervical cancer disparities persist in the predominantly Hispanic population of South Texas, and Hispanic girls are less likely to initiate and complete the three-dose HPV vaccine series. Culturally relevant interventions are needed to eliminate these disparities and improve HPV vaccine initiation and completion. Subjects We enrolled 372 Hispanic women from South Texas’ Cameron and Hidalgo counties with a daughter aged 11–17 who had not received HPV vaccine. Intervention All participants received an HPV vaccine educational brochure in their preferred language (English or Spanish) and were invited to participate in the Entre Madre e Hija (EMH) program, a culturally relevant cervical cancer prevention program. EMH participants (n= 257) received group health education, referral and navigation support from a promotora (a trained, culturally competent community health worker). Those who declined participation in EMH received the brochure only (n=115). Results Eighty-four percent of enrolled participants initiated the HPV vaccine, and no differences were observed between EMH program and brochure-only participants. Compared to brochure-only participants, EMH participants were more likely to complete the vaccine series [Adj. OR=2.24, 95% CI (1.25, 4.02)]. In addition, participants who were employed and insured had lower odds of completing the vaccine series [Adj. OR=.45, 95% CI (.21 – .96); Adj. OR=.36, 95% CI (.13 – .98), respectively]. Conclusion All enrolled participants had high vaccine initiation rates (>80%); however, EMH program participants were more likely to complete the vaccine series. HPV vaccine promotion efforts that include referral and navigation support in addition to education show promise. PMID:24898942

  14. Uncertain Classification of Variable Stars: Handling Observational GAPS and Noise

    NASA Astrophysics Data System (ADS)

    Castro, Nicolás; Protopapas, Pavlos; Pichara, Karim

    2018-01-01

    Automatic classification methods applied to sky surveys have revolutionized the astronomical target selection process. Most surveys generate a vast amount of time series, or “lightcurves,” that represent the brightness variability of stellar objects in time. Unfortunately, lightcurves’ observations take several years to be completed, producing truncated time series that generally remain without the application of automatic classifiers until they are finished. This happens because state-of-the-art methods rely on a variety of statistical descriptors or features that present an increasing degree of dispersion when the number of observations decreases, which reduces their precision. In this paper, we propose a novel method that increases the performance of automatic classifiers of variable stars by incorporating the deviations that scarcity of observations produces. Our method uses Gaussian process regression to form a probabilistic model of each lightcurve’s observations. Then, based on this model, bootstrapped samples of the time series features are generated. Finally, a bagging approach is used to improve the overall performance of the classification. We perform tests on the MAssive Compact Halo Object (MACHO) and Optical Gravitational Lensing Experiment (OGLE) catalogs; the results show that our method effectively classifies some variability classes using a small fraction of the original observations. For example, we found that RR Lyrae stars can be classified with ~80% accuracy just by observing the first 5% of the whole lightcurves’ observations in the MACHO and OGLE catalogs. We believe these results prove that, when studying lightcurves, it is important to consider the features’ error and how the measurement process impacts it.
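    A minimal sketch of the underlying idea, with an illustrative kernel and a synthetic lightcurve rather than MACHO/OGLE data: a Gaussian process is fitted to sparse observations, posterior draws of the curve are taken, and bootstrapped feature vectors are computed from those draws (features which a bagged classifier could then consume).

    ```python
    # Minimal sketch: GP model of a sparse lightcurve and bootstrapped features
    # from posterior draws (kernel, features, and data are illustrative choices).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(2)
    t_obs = np.sort(rng.uniform(0, 10, 25))[:, None]          # few, irregular epochs
    mag = np.sin(2 * np.pi * t_obs[:, 0] / 3.0) + 0.1 * rng.standard_normal(25)

    gp = GaussianProcessRegressor(kernel=1.0 * RBF(1.0) + WhiteKernel(0.01),
                                  normalize_y=True).fit(t_obs, mag)

    t_grid = np.linspace(0, 10, 200)[:, None]
    draws = gp.sample_y(t_grid, n_samples=50, random_state=3)  # (200, 50) posterior draws

    # Bootstrapped feature vectors (amplitude and std here), one per posterior draw
    feats = np.column_stack([draws.max(0) - draws.min(0), draws.std(0)])
    print(feats.mean(0), feats.std(0))   # feature means and their uncertainties
    ```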

  15. Methods for removal of unwanted signals from gravity time-series: Comparison using linear techniques complemented with analysis of system dynamics

    NASA Astrophysics Data System (ADS)

    Valencio, Arthur; Grebogi, Celso; Baptista, Murilo S.

    2017-10-01

    The presence of undesirable dominating signals in geophysical experimental data is a challenge in many subfields. One remarkable example is surface gravimetry, where frequencies from Earth tides correspond to time-series fluctuations up to a thousand times larger than the phenomena of major interest, such as hydrological gravity effects or co-seismic gravity changes. This work discusses general methods for the removal of unwanted dominating signals by applying them to 8 long-period gravity time-series of the International Geodynamics and Earth Tides Service, equivalent to the acquisition from 8 instruments in 5 locations representative of the network. We compare three different conceptual approaches for tide removal: frequency filtering, physical modelling, and data-based modelling. Each approach reveals a different limitation to be considered depending on the intended application. Vestiges of tides remain in the residues for the modelling procedures, whereas the signal was distorted in different ways by the filtering and data-based procedures. The linear techniques employed were power spectral density, spectrogram, cross-correlation, and classical harmonics decomposition, while the system dynamics was analysed by state-space reconstruction and estimation of the largest Lyapunov exponent. Although the tides could not be completely eliminated, they were sufficiently reduced to allow observation of geophysical events of interest above the 10 nm s-2 level, exemplified by a hydrology-related event of 60 nm s-2. The implementations adopted for each conceptual approach are general, so that their principles could be applied to other kinds of data affected by undesired signals composed mainly by periodic or quasi-periodic components.
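    As a small illustration of the frequency-filtering approach (one of the three compared above), the sketch below notches out two synthetic tidal lines near the diurnal and semidiurnal frequencies; the frequencies, bandwidths, and signal amplitudes are illustrative assumptions, not values from the IGETS records.

    ```python
    # Minimal sketch: suppress narrow tidal bands with IIR notch filters so a
    # smaller hydrology-like signal becomes visible (synthetic data).
    import numpy as np
    from scipy.signal import iirnotch, filtfilt

    fs = 24.0                                   # samples per day (hourly data)
    t = np.arange(0, 60 * 24) / fs              # 60 days, time in days
    rng = np.random.default_rng(4)
    tides = 800 * np.sin(2 * np.pi * 1.9323 * t) + 400 * np.sin(2 * np.pi * 1.0027 * t)
    hydrology = 40 * np.exp(-((t - 30) / 2.0) ** 2)   # small signal of interest
    g = tides + hydrology + 5 * rng.standard_normal(t.size)

    residual = g.copy()
    for f_tide in (1.9323, 1.0027):             # M2- and K1-like lines (cycles/day)
        b, a = iirnotch(w0=f_tide, Q=30.0, fs=fs)
        residual = filtfilt(b, a, residual)

    print(np.abs(residual - hydrology).mean())  # residual now tracks the small signal
    ```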

  16. J-2X concludes series of tests

    NASA Image and Video Library

    2008-05-09

    NASA engineers successfully complete the first series of tests in the early development of the J-2X engine that will power the Ares I and Ares V rockets, key components of NASA's Constellation Program.

  17. Reconstructing biochemical pathways from time course data.

    PubMed

    Srividhya, Jeyaraman; Crampin, Edmund J; McSharry, Patrick E; Schnell, Santiago

    2007-03-01

    Time series data on biochemical reactions reveal transient behavior, away from chemical equilibrium, and contain information on the dynamic interactions among reacting components. However, this information can be difficult to extract using conventional analysis techniques. We present a new method to infer biochemical pathway mechanisms from time course data using a global nonlinear modeling technique to identify the elementary reaction steps which constitute the pathway. The method involves the generation of a complete dictionary of polynomial basis functions based on the law of mass action. Using these basis functions, there are two approaches to model construction, namely the general to specific and the specific to general approach. We demonstrate that our new methodology reconstructs the chemical reaction steps and connectivity of the glycolytic pathway of Lactococcus lactis from time course experimental data.
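    The sketch below illustrates the dictionary idea on synthetic data: mass-action candidate terms (species and pairwise products) are built, time derivatives are estimated numerically, and coefficients are fitted by least squares with a crude threshold. It is not the paper's general-to-specific or specific-to-general selection procedure, and the dynamics, constants, and threshold are illustrative assumptions.

    ```python
    # Minimal sketch: mass-action polynomial dictionary + least-squares recovery
    # of elementary-reaction terms from a synthetic time course.
    import numpy as np
    from itertools import combinations_with_replacement

    def mass_action_dictionary(X):
        """Columns: 1, each species, and each pairwise product of species."""
        n_t, n_s = X.shape
        cols = [np.ones(n_t)] + [X[:, i] for i in range(n_s)]
        cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(n_s), 2)]
        return np.column_stack(cols)

    # Synthetic mass-action system (Lotka-Volterra form):
    #   dx0/dt = 1.0*x0 - 0.5*x0*x1,   dx1/dt = -0.8*x1 + 0.3*x0*x1
    dt, n = 0.005, 4000
    X = np.zeros((n, 2)); X[0] = [2.0, 1.0]
    for i in range(n - 1):
        x0, x1 = X[i]
        X[i + 1] = X[i] + dt * np.array([1.0 * x0 - 0.5 * x0 * x1,
                                         -0.8 * x1 + 0.3 * x0 * x1])

    dXdt = np.gradient(X, dt, axis=0)            # numerical time derivatives
    Theta = mass_action_dictionary(X)            # columns: 1, x0, x1, x0^2, x0*x1, x1^2
    coef, *_ = np.linalg.lstsq(Theta, dXdt, rcond=None)
    coef[np.abs(coef) < 0.05] = 0.0              # crude pruning of negligible terms
    print(np.round(coef, 2))                     # nonzero rows: x0, x1 and x0*x1 terms
    ```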

  18. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  19. A Review of Subsequence Time Series Clustering

    PubMed Central

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  20. A review of subsequence time series clustering.

    PubMed

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  1. Multiscale structure of time series revealed by the monotony spectrum.

    PubMed

    Vamoş, Călin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
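    A minimal sketch of the construction described above: monotonic runs of the series are extracted, their mean amplitude and mean duration are recorded, and the pair is traced under successive moving-average smoothings. The averaging kernel and parameters are illustrative choices, not the paper's exact procedure.

    ```python
    # Minimal sketch of a monotony-spectrum-style analysis on synthetic data.
    import numpy as np

    def monotonic_segments(x):
        """Mean |amplitude| and mean duration of the monotonic runs of x."""
        turning = np.where(np.diff(np.sign(np.diff(x))) != 0)[0] + 1
        idx = np.concatenate(([0], turning, [len(x) - 1]))
        amps = np.abs(np.diff(x[idx]))
        durs = np.diff(idx).astype(float)
        return amps.mean(), durs.mean()

    def monotony_spectrum(x, n_passes=8, window=5):
        """Trace (mean amplitude, mean local time scale) under repeated smoothing."""
        kernel = np.ones(window) / window
        points, y = [], np.asarray(x, dtype=float)
        for _ in range(n_passes):
            points.append(monotonic_segments(y))
            y = np.convolve(y, kernel, mode="valid")
        return np.array(points)        # column 0: amplitude, column 1: time scale

    rng = np.random.default_rng(5)
    t = np.arange(2000)
    x = np.sin(2 * np.pi * t / 200) + 0.4 * rng.standard_normal(t.size)
    print(monotony_spectrum(x))        # amplitude and time scale grow as noise is averaged out
    ```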

  2. Timing of HPV vaccine intervals among United States teens with consideration to the current ACIP schedule and the WHO 2-dose schedule

    PubMed Central

    Cloessner, Emily A.; Stokley, Shannon; Yankey, David; Markowitz, Lauri E.

    2016-01-01

    Abstract The current recommendation for human papillomavirus (HPV) vaccination in the United States is for 3 doses to be administered over a 6 month period. In April 2014, the World Health Organization (WHO) recommended adoption of a 2-dose schedule, with doses spaced a minimum of 6 months apart, for teens who begin the series before age 15. We analyzed data from the 2013 National Immunization Survey-Teen to examine the timing of second and third dose receipt among US adolescents. All analyses were restricted to adolescents age 13–17 y who had adequate provider data. The Wilcoxon–Mann–Whitney test measured differences in time to receive vaccine doses among demographic and socioeconomic groups. Logistic regression identified socioeconomic characteristics associated with receiving the second dose of HPV vaccine at least 6 months after the first dose. The median time for teens to receive the second dose of HPV vaccine was 2.6 months after the first dose, and the median time to receive the third dose was 4.9 months after the second dose. Minority teens and teens living below the poverty level took significantly longer to receive doses. Among teens that initiated the HPV vaccine series before age 15 y, 28.6% received the second dose at least 6 months after the first dose. If these teens, who met the WHO criteria for up-to-date HPV vaccination, were classified as having completed the vaccination series, overall coverage in the US would increase 3.9 percentage points, with African American and Hispanic teens having the greatest increases in coverage. PMID:26587886

  3. Timing of HPV vaccine intervals among United States teens with consideration to the current ACIP schedule and the WHO 2-dose schedule.

    PubMed

    Cloessner, Emily A; Stokley, Shannon; Yankey, David; Markowitz, Lauri E

    2016-06-02

    The current recommendation for human papillomavirus (HPV) vaccination in the United States is for 3 doses to be administered over a 6 month period. In April 2014, the World Health Organization (WHO) recommended adoption of a 2-dose schedule, with doses spaced a minimum of 6 months apart, for teens who begin the series before age 15. We analyzed data from the 2013 National Immunization Survey-Teen to examine the timing of second and third dose receipt among US adolescents. All analyses were restricted to adolescents age 13-17 y who had adequate provider data. The Wilcoxon-Mann-Whitney test measured differences in time to receive vaccine doses among demographic and socioeconomic groups. Logistic regression identified socioeconomic characteristics associated with receiving the second dose of HPV vaccine at least 6 months after the first dose. The median time for teens to receive the second dose of HPV vaccine was 2.6 months after the first dose, and the median time to receive the third dose was 4.9 months after the second dose. Minority teens and teens living below the poverty level took significantly longer to receive doses. Among teens that initiated the HPV vaccine series before age 15 y, 28.6% received the second dose at least 6 months after the first dose. If these teens, who met the WHO criteria for up-to-date HPV vaccination, were classified as having completed the vaccination series, overall coverage in the US would increase 3.9 percentage points, with African American and Hispanic teens having the greatest increases in coverage.

  4. Healing and the Mind with Bill Moyers. Resource Materials.

    ERIC Educational Resources Information Center

    Grippo, Lois; Kelso, Richard

    This high school resource package for the public television series "Healing and the Mind with Bill Moyers" includes: (1) a teacher's guide that provides complete lesson plans for each program in the series; (2) a glossary that features definitions of the terms used in the series; (3) a bibliography containing books of interest to both…

  5. 31 CFR 321.8 - Redemption-exchange of Series E and EE savings bonds and savings notes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... eligible for exchange are: (1) Series EE bonds bearing issue dates of January 1, 2003, or earlier... February 1, 2003, or thereafter, presented no earlier than 12 months from their issue dates; and (3) Series... securities on exchange unless: (1) The securities are accompanied by a completed exchange subscription signed...

  6. Index to the Understanding the Atom Series.

    ERIC Educational Resources Information Center

    Atomic Energy Commission, Oak Ridge, TN. Div. of Technical Information.

    This index was prepared for the set of 51 booklets in the "Understanding the Atom Series" published by the U. S. Atomic Energy Commission for high school students and their teachers. In addition to the index, a complete list of the series is provided in which the booklets are grouped into the categories of physics, chemistry, biology, nuclear…

  7. A dynamic scheduling algorithm for single-arm two-cluster tools with flexible processing times

    NASA Astrophysics Data System (ADS)

    Li, Xin; Fung, Richard Y. K.

    2018-02-01

    This article presents a dynamic algorithm for job scheduling in two-cluster tools producing multi-type wafers with flexible processing times. Flexible processing times mean that the actual times for processing wafers should be within given time intervals. The objective of the work is to minimize the completion time of the newly inserted wafer. To deal with this issue, a two-cluster tool is decomposed into three reduced single-cluster tools (RCTs) in series, based on a decomposition approach proposed in this article. For each single-cluster tool, a dynamic scheduling algorithm based on temporal constraints is developed to schedule the newly inserted wafer. Three experiments have been carried out to test the proposed dynamic scheduling algorithm, comparing its results with the 'earliest starting time' (EST) heuristic adopted in previous literature. The results show that the dynamic algorithm proposed in this article is effective and practical.

  8. A simple two-stage model predicts response time distributions.

    PubMed

    Carpenter, R H S; Reddi, B A J; Anderson, A J

    2009-08-15

    The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold performing the decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to having a complete description of reaction time and the decision processes that underlie it.
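    A minimal simulation of the two-stage idea, with illustrative parameter values: a noisy random-walk detector is followed by a LATER-style linear rise-to-threshold whose rate varies from trial to trial, and the response time is the sum of the two stage durations.

    ```python
    # Minimal sketch: simulate response times from a detector (diffusion to bound)
    # in series with a linear rise-to-threshold decision stage.
    import numpy as np

    rng = np.random.default_rng(6)

    def simulate_rt(n_trials, drift=4.0, noise=1.0, detect_bound=1.0,
                    rate_mean=5.0, rate_sd=1.5, decision_bound=1.0, dt=0.001):
        rts = np.empty(n_trials)
        for i in range(n_trials):
            # Stage 1: random-walk (diffusion) integration of noisy sensory evidence
            evidence, t = 0.0, 0.0
            while evidence < detect_bound:
                evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                t += dt
            # Stage 2: LATER-style linear rise with trial-to-trial rate variability
            rate = max(rng.normal(rate_mean, rate_sd), 1e-6)
            rts[i] = t + decision_bound / rate
        return rts

    rts = simulate_rt(2000)
    print(f"median RT {np.median(rts):.3f} s, 5th-95th pct "
          f"{np.percentile(rts, 5):.3f}-{np.percentile(rts, 95):.3f} s")
    ```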

  9. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest: a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selection, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  10. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator.

    PubMed

    Chang, Howard H; Fuentes, Montserrat; Frey, H Christopher

    2012-09-01

    This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model reduces the computational burden associated with simulating personal exposures considerably. This allows us to consider personal exposures at a finer spatial resolution to improve exposure assessment and for a longer study period. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM(2.5)) and daily mortality in the New York City metropolitan area during the period 2001-2005. Personal PM(2.5) exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per a 10 μg/m(3) increase in personal exposure to PM(2.5) from outdoor sources on the previous day. The corresponding estimates per a 10 μg/m(3) increase in PM(2.5) ambient concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM(2.5) were also higher during the summer months.

  11. Estimating the basic reproduction rate of HFMD using the time series SIR model in Guangdong, China

    PubMed Central

    Du, Zhicheng; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng

    2017-01-01

    Hand, foot, and mouth disease (HFMD) has caused a substantial burden of disease in China, especially in Guangdong Province. Based on notifiable cases, we use the time series Susceptible-Infected-Recovered model to estimate the basic reproduction rate (R0) and the herd immunity threshold, in order to understand the transmission and persistence of HFMD more completely and support efficient intervention in this province. The standardized difference between the reported and fitted time series of HFMD was 0.009 (<0.2). The median basic reproduction rates of total, enterovirus 71, and coxsackievirus 16 cases in Guangdong were 4.621 (IQR: 3.907–5.823), 3.023 (IQR: 2.289–4.292) and 7.767 (IQR: 6.903–10.353), respectively. The heatmap of R0 showed semiannual peaks of activity, including a major peak in spring and early summer (about the 12th week) followed by a smaller peak in autumn (about the 36th week). The county-level model showed that Longchuan (R0 = 33), Gaozhou (R0 = 24), Huazhou (R0 = 23) and Qingxin (R0 = 19) counties have higher basic reproduction rates than other counties in the province. The epidemic of HFMD in Guangdong Province is still grim, and strategies like the World Health Organization’s expanded program on immunization need to be implemented. An elimination of HFMD in Guangdong might require a herd immunity threshold of 78%. PMID:28692654
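    The sketch below illustrates, on synthetic weekly counts covering only the growth phase of one outbreak, how a time-series SIR fit can recover a transmission parameter by log-linear regression, with R0 read off as the per-generation transmission rate. The population size, case counts, and regression form are illustrative assumptions, not the study's surveillance data or exact model.

    ```python
    # Minimal sketch of a time-series SIR (TSIR) fit on simulated case counts.
    import numpy as np

    beta_true, N, weeks = 4.5, 1_000_000, 8
    S, I = np.empty(weeks), np.empty(weeks)
    S[0], I[0] = 0.9 * N, 50
    rng = np.random.default_rng(7)
    for t in range(weeks - 1):
        lam = beta_true * I[t] * S[t] / N
        I[t + 1] = max(rng.poisson(lam), 1)           # observed new cases
        S[t + 1] = max(S[t] - I[t + 1], 1)            # susceptibles deplete

    # log-linear TSIR regression: log I_{t+1} = log(beta) + a*log(I_t) + log(S_t/N)
    y = np.log(I[1:]) - np.log(S[:-1] / N)
    A = np.column_stack([np.ones(weeks - 1), np.log(I[:-1])])
    (log_beta, alpha_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
    print(f"estimated R0 ~ {np.exp(log_beta):.2f} (mixing exponent a = {alpha_hat:.2f})")
    ```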

  12. Simulation Exploration through Immersive Parallel Planes: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest: a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selection, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  13. CHRONOS: a time-varying method for microRNA-mediated subpathway enrichment analysis.

    PubMed

    Vrahatis, Aristidis G; Dimitrakopoulou, Konstantina; Balomenos, Panos; Tsakalidis, Athanasios K; Bezerianos, Anastasios

    2016-03-15

    In the era of network medicine and the rapid growth of paired time series mRNA/microRNA expression experiments, there is an urgent need for pathway enrichment analysis methods able to capture the time- and condition-specific 'active parts' of the biological circuitry as well as the microRNA impact. Current methods ignore the multiple dynamical 'themes'-in the form of enriched biologically relevant microRNA-mediated subpathways-that determine the functionality of signaling networks across time. To address these challenges, we developed time-vaRying enriCHment integrOmics Subpathway aNalysis tOol (CHRONOS) by integrating time series mRNA/microRNA expression data with KEGG pathway maps and microRNA-target interactions. Specifically, microRNA-mediated subpathway topologies are extracted and evaluated based on the temporal transition and the fold change activity of the linked genes/microRNAs. Further, we provide measures that capture the structural and functional features of subpathways in relation to the complete organism pathway atlas. Our application to synthetic and real data shows that CHRONOS outperforms current subpathway-based methods in unraveling the inherent dynamic properties of pathways. CHRONOS is freely available at http://biosignal.med.upatras.gr/chronos/. Contact: tassos.bezerianos@nus.edu.sg. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. The revised solar array synthesis computer program

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The Revised Solar Array Synthesis Computer Program is described. It is a general-purpose program which computes solar array output characteristics while accounting for the effects of temperature, incidence angle, charged-particle irradiation, and other degradation effects on various solar array configurations in either circular or elliptical orbits. Array configurations may consist of up to 75 solar cell panels arranged in any series-parallel combination not exceeding three series-connected panels in a parallel string and no more than 25 parallel strings in an array. Up to 100 separate solar array current-voltage characteristics, corresponding to 100 equal-time increments during the sunlight illuminated portion of an orbit or any 100 user-specified combinations of incidence angle and temperature, can be computed and printed out during one complete computer execution. Individual panel incidence angles may be computed and printed out at the user's option.
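    The series-parallel combination step can be sketched as follows: panels in series add voltages at a common current, and parallel strings add currents at a common voltage. The single-panel I-V curve used here is a generic exponential-knee shape, not the program's irradiated-cell model, and the panel counts follow the three-in-series, 25-strings-in-parallel limits mentioned above.

    ```python
    # Minimal sketch: combine panel I-V curves in series and parallel (illustrative
    # panel model; a real run would use degraded cell characteristics per orbit point).
    import numpy as np

    def panel_iv(n_pts=500, i_sc=2.5, v_oc=30.0, a=3.0):
        """Return (V, I) for one panel using a simple exponential-knee shape."""
        v = np.linspace(0.0, v_oc, n_pts)
        i = i_sc * (1.0 - (np.exp(v / (a * v_oc)) - 1.0) / (np.exp(1.0 / a) - 1.0))
        return v, np.clip(i, 0.0, None)

    def series_combine(curves):
        """Sum voltages of series-connected panels on a common current grid."""
        i_grid = np.linspace(0.0, min(c[1].max() for c in curves), 500)
        v_total = sum(np.interp(i_grid, c[1][::-1], c[0][::-1]) for c in curves)
        return v_total[::-1], i_grid[::-1]          # return with voltage ascending

    def parallel_combine(strings):
        """Sum currents of parallel strings on a common voltage grid."""
        v_grid = np.linspace(0.0, min(s[0].max() for s in strings), 500)
        i_total = sum(np.interp(v_grid, s[0], s[1]) for s in strings)
        return v_grid, i_total

    string = series_combine([panel_iv()] * 3)        # 3 panels per string
    v, i = parallel_combine([string] * 25)           # 25 strings in the array
    p = v * i
    print(f"array max power ~ {p.max():.0f} W at {v[np.argmax(p)]:.1f} V")
    ```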

  15. Improving outcomes for people with progressive cancer: interrupted time series trial of a needs assessment intervention.

    PubMed

    Waller, Amy; Girgis, Afaf; Johnson, Claire; Lecathelinais, Christophe; Sibbritt, David; Forstner, Dion; Liauw, Winston; Currow, David C

    2012-03-01

    Improving the effectiveness of cancer care delivery has become a major focus of research. This study assessed the uptake and impact of the Palliative Care Needs Assessment Guidelines and Needs Assessment Tool: Progressive Disease--Cancer (NAT: PD-C) on the outcomes of people with advanced cancer. Given widely varying survival in people with advanced cancer, an interrupted time series design was used, with data on unmet needs, depression, anxiety, and quality of life collected from 195 patients using telephone interviews every two months, for up to 18 months. Patients completed at least two baseline interviews before health professionals were academically detailed in the use of the Palliative Care Needs Assessment Guidelines and NAT: PD-C. Health professionals completed the NAT: PD-C with patients approximately monthly for the remainder of the study. Changes in patients' outcomes were compared prior to and following the introduction of the NAT: PD-C using generalized estimating equations. Moderate to high needs across all domains were frequently seen in the preintervention phase. The use of the NAT: PD-C was associated with a significant reduction in health system and information and patient care and support needs. These resources have the potential to serve as an efficient and acceptable strategy for supporting needs-based cancer care. Further work is required to determine their unique contribution to improvements in patient outcomes. Copyright © 2012 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  16. Transarterial treatment with Onyx of Cognard type IV anterior cranial fossa dural arteriovenous fistulas.

    PubMed

    Li, Chuanhui; Wu, Zhongxue; Yang, Xinjian; Li, Youxiang; Jiang, Chuhan; He, Hongwei

    2014-03-01

    Cognard type IV anterior cranial fossa dural arteriovenous fistulas (DAVFs) are rare lesions with a high risk of intracranial hemorrhage. We present our experience with the use of Onyx via the arterial route in these aggressive lesions. Between October 2009 and October 2011, six consecutive patients diagnosed with Cognard type IV anterior cranial fossa DAVFs were treated transarterially with Onyx in our department. All patients were male; mean age was 55 years (range 38-68). Four patients presented with intracranial hemorrhage as the initial manifestation; one patient presented with seizures at the time of diagnosis and experienced intracranial hemorrhage during the antiepileptic therapy; and the other patient was asymptomatic. In five patients, complete obliteration was achieved with transarterial Onyx injection in a single treatment session; in the remaining patient, subtotal occlusion was achieved and gamma knife treatment was followed. The average time of injection was 19 min (range 5-28) for every pedicle catheterized and the average amount of Onyx was 3.2 ml (range 0.4-6.3) for each lesion. All patients recovered uneventfully after embolization. No mortality or permanent morbidity was observed in this series. Follow-up digital subtraction or MR angiography confirmed durable obliteration of the fistulas in five cured cases. No patients suffered intracranial hemorrhage during the follow-up period. In this small series, our experience with the use of Onyx for arterial embolization of Cognard type IV DAVFs is encouraging, with durable complete cure in most lesions without severe complications.

  17. 76 FR 6646 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-07

    ... Series, adjusted option series and any options series until the time to expiration for such series is... time to expiration for such series is less than nine months be treated differently. Specifically, under... series until the time to expiration for such series is less than nine months. Accordingly, the...

  18. Complexity quantification of cardiac variability time series using improved sample entropy (I-SampEn).

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2016-09-01

    The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series. Higher complexity, and hence higher entropy, is associated with the RR-interval time series of healthy subjects. However, SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series as well as to certain pathological time series, which is a misleading observation. This incorrect estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of the time series. Time series of certain pathologies, however, exhibit substantial variability in beat-to-beat fluctuations, so the SD of the first-order difference (short-term SD) of the time series should be considered when updating the threshold value, to account for the period-to-period variations inherent in the time series. In the present work, a new methodology, improved sample entropy (I-SampEn), is proposed in which the threshold value is updated by considering the period-to-period variations of the time series. The I-SampEn technique assigns higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
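    The sketch below contrasts the two tolerance choices on a synthetic RR series: the classic threshold proportional to the long-term SD versus an I-SampEn-style threshold proportional to the SD of the first-order differences. The SampEn routine is the standard brute-force formulation; the AR(1) surrogate for RR intervals and all constants are illustrative assumptions.

    ```python
    # Minimal sketch: sample entropy with two tolerance (threshold) choices.
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """Standard SampEn: -ln(A/B) with template length m and tolerance r."""
        x = np.asarray(x, dtype=float)
        n = len(x)

        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(n - m)])
            count = 0
            for i in range(len(templates)):
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(d <= r)
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    # Synthetic RR-interval surrogate: AR(1) fluctuations around 800 ms
    rng = np.random.default_rng(8)
    rr = np.empty(1000); rr[0] = 800.0
    for i in range(1, 1000):
        rr[i] = 800.0 + 0.9 * (rr[i - 1] - 800.0) + 10.0 * rng.standard_normal()

    r_classic = 0.2 * np.std(rr)            # long-term SD (classic SampEn)
    r_short   = 0.2 * np.std(np.diff(rr))   # SD of first differences (I-SampEn-style)
    print(sample_entropy(rr, r=r_classic), sample_entropy(rr, r=r_short))
    ```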

  19. Preparations for the IGS realization of ITRF2014

    NASA Astrophysics Data System (ADS)

    Rebischung, Paul; Schmid, Ralf

    2016-04-01

    The International GNSS Service (IGS) currently prepares its own realization, called IGS14, of the latest release of the International Terrestrial Reference Frame (ITRF2014). This preparation involves: - a selection of the most suitable reference frame (RF) stations from the complete set of GNSS stations in ITRF2014; - the design of a well-distributed core network of RF stations for the purpose of aligning global GNSS solutions; - a re-evaluation of the GPS and GLONASS satellite antenna phase center offsets (PCOs), based on the SINEX files provided by the IGS Analysis Centers (ACs) in the frame of the second IGS reprocessing campaign repro2. This presentation will first cover the criteria used for the selection of the IGS14 and IGS14 core RF stations as well as preliminary station selection results. We will then use the preliminary IGS14 RF to re-align the daily IGS combined repro2 SINEX solutions and study the impact of the RF change on GNSS-derived geodetic parameter time series. In a second part, we will focus on the re-evaluation of the GNSS satellite antenna PCOs. A re-evaluation of at least their radial (z) components is indeed required, despite the negligible scale difference between ITRF2008 and ITRF2014, because of modeling changes recently introduced within the IGS which affect the scale of GNSS terrestrial frames (Earth radiation pressure, antenna thrust). Moreover, the 13 GPS and GLONASS satellites launched since September 2012 are currently assigned preliminary block-specific mean PCO values which need to be updated. From the daily AC repro2 SINEX files, we will therefore derive time series of satellite z-PCO estimates and analyze the resulting time series. Since several ACs provided all three components of the satellite PCOs in their SINEX files, we will additionally derive similar x- and y-PCO time series and discuss the relevance of their potential re-evaluation.

  20. Aerosol Climate Time Series Evaluation In ESA Aerosol_cci

    NASA Astrophysics Data System (ADS)

    Popp, T.; de Leeuw, G.; Pinnock, S.

    2015-12-01

    Within the ESA Climate Change Initiative (CCI) Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015 full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in few selected regions with sparse ground-based observations the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets comparable to other satellite retrievals and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which are also validated. The paper will summarize and discuss the results of major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel level uncertainties validation will be summarized and discussed including unknown components and their potential usefulness and limitations. Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products (e.g. dust vs. total AOD, ensembles from different algorithms for the same sensor) will be discussed.

  1. Aerosol Climate Time Series in ESA Aerosol_cci

    NASA Astrophysics Data System (ADS)

    Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon

    2016-04-01

    Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI), together with sensitivity information and an AAI model simulator, is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of the first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) demonstrated that the quality of the available datasets is comparable to that of other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel-level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of the pixel-level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. Opportunities for time series extension with successor instruments of the Sentinel family will be described, and the complementarity of the different satellite aerosol products (e.g. dust vs. total AOD, ensembles from different algorithms for the same sensor) will be discussed.

  2. Reverse engineering gene regulatory networks from measurement with missing values.

    PubMed

    Ogundijo, Oyetunji E; Elmas, Abdulkadir; Wang, Xiaodong

    2016-12-01

    Gene expression time series data are usually in the form of high-dimensional arrays. Unfortunately, the data may sometimes contain missing values: either the expression values of some genes at some time points, or all expression values at a single time point or at sets of consecutive time points. This significantly affects the performance of many algorithms for gene expression analysis that take as input the complete matrix of gene expression measurements. For instance, previous works have shown that gene regulatory interactions can be estimated from the complete matrix of gene expression measurements. Yet, to date, few algorithms have been proposed for the inference of gene regulatory networks from gene expression data with missing values. We describe a nonlinear dynamic stochastic model for the evolution of gene expression. The model captures the structural, dynamical, and nonlinear natures of the underlying biomolecular systems. We present point-based Gaussian approximation (PBGA) filters for joint state and parameter estimation of the system with one-step or two-step missing measurements. The PBGA filters use Gaussian approximation and various quadrature rules, such as the unscented transform (UT), the third-degree cubature rule and the central difference rule, for computing the related posteriors. The proposed algorithm is evaluated with satisfying results on synthetic networks, in silico networks released as part of the DREAM project, and a real biological network, the in vivo reverse engineering and modeling assessment (IRMA) network of the yeast Saccharomyces cerevisiae. PBGA filters are proposed to elucidate the underlying gene regulatory network (GRN) from time series gene expression data that contain missing values. In our state-space model, we propose a measurement model that incorporates the effect of the missing data points into the sequential algorithm. This approach produces better inference of the model parameters and hence more accurate prediction of the underlying GRN compared to using conventional Gaussian approximation (GA) filters that ignore the missing data points.
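
    The filtering idea can be illustrated with a much simpler stand-in. The sketch below is not the paper's PBGA filter (which uses a nonlinear model and sigma-point or cubature rules); it is a linear-Gaussian Kalman filter in which a missing measurement simply skips the update step, so the state estimate is propagated through the gap. All matrices and the toy data are assumptions for illustration.

```python
import numpy as np

def kalman_with_missing(y, F, H, Q, R, x0, P0):
    """Sequential filtering of a series y that may contain missing values
    (np.nan). At a missing time point only the prediction step runs; the
    update is skipped, so the state estimate is propagated through the gap.
    A linear-Gaussian stand-in for handling missing measurements sequentially."""
    x, P = x0.copy(), P0.copy()
    estimates = []
    for yt in y:
        # prediction step
        x = F @ x
        P = F @ P @ F.T + Q
        if not np.isnan(yt):
            # update step, only when a measurement is available
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.atleast_1d(yt) - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Toy usage: a noisy random-walk "expression level" with two missing samples.
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.1, 50))
obs = truth + rng.normal(0.0, 0.3, 50)
obs[[10, 11]] = np.nan
est = kalman_with_missing(obs, F=np.eye(1), H=np.eye(1),
                          Q=0.01 * np.eye(1), R=0.09 * np.eye(1),
                          x0=np.zeros(1), P0=np.eye(1))
```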

  3. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series

    PubMed Central

    2011-01-01

    Background Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Results Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. Conclusions The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html. PMID:21851598

  4. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series.

    PubMed

    Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp

    2011-08-18

    Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.
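
    For readers unfamiliar with the underlying machinery, the following is a minimal dynamic-programming DTW distance, the building block that DTW-S extends with interpolated time points and significance testing. The function name and toy profiles are illustrative and are not part of the TimeShift package API.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D series: D[i, j] is the
    cost of the best warping path aligning a[:i] with b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

# Two expression profiles differing only by a time shift still align cheaply.
t = np.linspace(0.0, 1.0, 20)
print(dtw_distance(np.sin(2 * np.pi * t), np.sin(2 * np.pi * (t - 0.1))))
```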

  5. Single paracostal approach to thoracic duct and cisterna chyli: experimental study and case series.

    PubMed

    Staiger, Benjamin A; Stanley, Bryden J; McAnulty, Jonathan F

    2011-10-01

    To determine the feasibility of a single paracostal abdominal approach for thoracic duct ligation (TDL) and cisterna chyli ablation (CCA) in dogs with chylothorax. Observational study and prospective case series. Normal dogs (n = 5) and dogs with chylothorax (n = 8). A single paracostal approach with transdiaphragmatic extension for TDL and CCA was developed experimentally (n = 5) and used in 8 clinical cases, with subtotal pericardectomy (SPE) performed in 4 dogs. Surgery time, complications, hospitalization time, outcome, and follow-up of clinical cases were recorded. Exposure of relevant anatomy was excellent; vital lymphatic staining facilitated identification of lymphatic structures. In clinical cases, mean surgery time for TDL + CCA was 136 minutes. Mean hospitalization time was 3.1 days. Seven of 8 cases survived, with 1 dog dying of heart failure shortly after discharge. One dog required a second (left) paracostal approach to ligate 2 more lymphatic vessels. On follow-up (median, 7 months; range, 2-20 months), there was complete resolution of chylothorax in 6 dogs. A single paracostal approach provides excellent exposure of the cisterna chyli, caudal thoracic duct, and intestinal lymphatics. This approach eliminates the need for repositioning during combined TDL + CCA procedures and avoids an intercostal thoracotomy. © Copyright 2011 by The American College of Veterinary Surgeons.

  6. Asymmetric collapse by dissolution or melting in a uniform flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rycroft, Chris H.; Bazant, Martin Z.

    An advection-diffusion-limited dissolution model of an object being eroded by a two-dimensional potential flow is presented. By taking advantage of the conformal invariance of the model, a numerical method is introduced that tracks the evolution of the object boundary in terms of a time-dependent Laurent series. Simulations of a variety of dissolving objects are shown, which shrink and collapse to a single point in finite time. The simulations reveal a surprising exact relationship, whereby the collapse point is the root of a non-analytic function given in terms of the flow velocity and the Laurent series coefficients describing the initial shape. This result is subsequently derived using residue calculus. The structure of the non-analytic function is examined for three different test cases, and a practical approach to determine the collapse point using a generalized Newton-Raphson root-finding algorithm is outlined. These examples also illustrate the possibility that the model breaks down in finite time prior to complete collapse, due to a topological singularity, as the dissolving boundary overlaps itself rather than breaking up into multiple domains (analogous to droplet pinch-off in fluid mechanics). In conclusion, the model raises fundamental mathematical questions about broken symmetries in finite-time singularities of both continuous and stochastic dynamical systems.

  7. Asymmetric collapse by dissolution or melting in a uniform flow

    PubMed Central

    Bazant, Martin Z.

    2016-01-01

    An advection–diffusion-limited dissolution model of an object being eroded by a two-dimensional potential flow is presented. By taking advantage of the conformal invariance of the model, a numerical method is introduced that tracks the evolution of the object boundary in terms of a time-dependent Laurent series. Simulations of a variety of dissolving objects are shown, which shrink and collapse to a single point in finite time. The simulations reveal a surprising exact relationship, whereby the collapse point is the root of a non-analytic function given in terms of the flow velocity and the Laurent series coefficients describing the initial shape. This result is subsequently derived using residue calculus. The structure of the non-analytic function is examined for three different test cases, and a practical approach to determine the collapse point using a generalized Newton–Raphson root-finding algorithm is outlined. These examples also illustrate the possibility that the model breaks down in finite time prior to complete collapse, due to a topological singularity, as the dissolving boundary overlaps itself rather than breaking up into multiple domains (analogous to droplet pinch-off in fluid mechanics). The model raises fundamental mathematical questions about broken symmetries in finite-time singularities of both continuous and stochastic dynamical systems. PMID:26997890

  8. Asymmetric collapse by dissolution or melting in a uniform flow

    DOE PAGES

    Rycroft, Chris H.; Bazant, Martin Z.

    2016-01-06

    An advection-diffusion-limited dissolution model of an object being eroded by a two-dimensional potential flow is presented. By taking advantage of the conformal invariance of the model, a numerical method is introduced that tracks the evolution of the object boundary in terms of a time-dependent Laurent series. Simulations of a variety of dissolving objects are shown, which shrink and collapse to a single point in finite time. The simulations reveal a surprising exact relationship, whereby the collapse point is the root of a non-analytic function given in terms of the flow velocity and the Laurent series coefficients describing the initial shape. This result is subsequently derived using residue calculus. The structure of the non-analytic function is examined for three different test cases, and a practical approach to determine the collapse point using a generalized Newton-Raphson root-finding algorithm is outlined. These examples also illustrate the possibility that the model breaks down in finite time prior to complete collapse, due to a topological singularity, as the dissolving boundary overlaps itself rather than breaking up into multiple domains (analogous to droplet pinch-off in fluid mechanics). In conclusion, the model raises fundamental mathematical questions about broken symmetries in finite-time singularities of both continuous and stochastic dynamical systems.
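
    The root-finding step mentioned above can be sketched generically. The snippet below is a plain complex-plane Newton-Raphson iteration of the kind described; the function g is a placeholder, not the paper's combination of flow velocity and Laurent coefficients.

```python
def newton_complex(g, dg, z0, tol=1e-12, max_iter=100):
    """Generalized Newton-Raphson in the complex plane: iterate
    z <- z - g(z)/g'(z) until the step size falls below tol."""
    z = complex(z0)
    for _ in range(max_iter):
        step = g(z) / dg(z)
        z -= step
        if abs(step) < tol:
            break
    return z

# Placeholder root-finding problem (NOT the paper's collapse-point function).
g = lambda z: z**3 - 1.0 + 0.2j
dg = lambda z: 3.0 * z**2
print(newton_complex(g, dg, z0=0.5 + 0.5j))
```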

  9. Atmospheric circulation patterns associated to the variability of River Ammer floods: evidence from observed and proxy data

    NASA Astrophysics Data System (ADS)

    Rimbu, N.; Czymzik, M.; Ionita, M.; Lohmann, G.; Brauer, A.

    2015-09-01

    The relationship between the frequency of River Ammer floods (southern Germany) and atmospheric circulation variability is investigated based on observational Ammer discharge data back to 1926 and a flood layer time series from varved sediments of the downstream Lake Ammersee for the pre-instrumental period back to 1766. A composite analysis reveals that, at synoptic time scales, observed River Ammer floods are associated with enhanced moisture transport from the Atlantic Ocean and the Mediterranean towards the Ammer region, a pronounced trough over Western Europe as well as enhanced potential vorticity at upper levels. We argue that this synoptic scale configuration can trigger heavy precipitation and floods in the Ammer region. Interannual to multidecadal increases in flood frequency as recorded in the instrumental discharge record are associated with a wave-train pattern extending from the North Atlantic to western Asia with a prominent negative center over western Europe. A similar atmospheric circulation pattern is associated with increases in flood layer frequency in the Lake Ammersee sediment record during the pre-instrumental period. We argue that the complete flood layer time series from Lake Ammersee sediments, covering the last 5500 years, contains information about atmospheric circulation variability on inter-annual to millennial time scales.

  10. Near-Field Postseismic Deformation Measurements from the Andaman and Nicobar Islands

    NASA Astrophysics Data System (ADS)

    Freymueller, J. T.; Rajendran, C.; Rajendran, K.; Rajamani, A.

    2006-12-01

    Since the December 26, 2004 Sumatra-Andaman Islands earthquake, we have carried out campaign GPS measurements at several sites in the Andaman and Nicobar Islands (India) and installed three continuous GPS sites in the region. Most of these sites had pre-earthquake measurements, which showed slow westward motion relative to the Indian plate. Postseismic measurements, on the other hand, show average westward velocities of several cm/yr to a few decimeters per year relative to the Indian plate. The motion of all sites is strongly non-linear in time, and is not uniform in space. We use a combination of continuous site time series and nearby campaign site time series to construct the most complete possible postseismic displacement records. Postseismic deformation from large earthquakes is likely to be dominated by a combination of afterslip on the deeper subduction interface, and viscoelastic relaxation of the mantle. Afterslip following the (similar magnitude) 1964 Alaska earthquake amounted to 20-50% of the magnitude of the coseismic slip, and smaller subduction zone earthquakes have exhibited the same or even larger proportion of afterslip to coseismic slip. We compare the time decay and spatial pattern of the observed postseismic displacement to postseismic deformation models and to observations from the Alaska earthquake.

  11. Dynamically downscaled climate simulations over North America: Methods, evaluation, and supporting documentation for users

    USGS Publications Warehouse

    Hostetler, S.W.; Alder, J.R.; Allan, A.M.

    2011-01-01

    We have completed an array of high-resolution simulations of present and future climate over Western North America (WNA) and Eastern North America (ENA) by dynamically downscaling global climate simulations using a regional climate model, RegCM3. The simulations are intended to provide long time series of internally consistent surface and atmospheric variables for use in climate-related research. In addition to providing high-resolution weather and climate data for the past, present, and future, we have developed an integrated data flow and methodology for processing, summarizing, viewing, and delivering the climate datasets to a wide range of potential users. Our simulations were run over 50- and 15-kilometer model grids in an attempt to capture more of the climatic detail associated with processes such as topographic forcing than can be captured by general circulation models (GCMs). The simulations were run using output from four GCMs. All simulations span the present (for example, 1968-1999), common periods of the future (2040-2069), and two simulations continuously cover 2010-2099. The trace gas concentrations in our simulations were the same as those of the GCMs: the IPCC 20th century time series for 1968-1999 and the A2 time series for simulations of the future. We demonstrate that RegCM3 is capable of producing present day annual and seasonal climatologies of air temperature and precipitation that are in good agreement with observations. Important features of the high-resolution climatology of temperature, precipitation, snow water equivalent (SWE), and soil moisture are consistently reproduced in all model runs over WNA and ENA. The simulations provide a potential range of future climate change for selected decades and display common patterns of the direction and magnitude of changes. As expected, there are some model to model differences that limit interpretability and give rise to uncertainties. Here, we provide background information about the GCMs and the RegCM3, a basic evaluation of the model output and examples of simulated future climate. We also provide information needed to access the web applications for visualizing and downloading the data, and give complete metadata that describe the variables in the datasets.

  12. An evaluation of grease-type ball bearing lubricants operation in various environments

    NASA Technical Reports Server (NTRS)

    Mcmurtrey, E. L.

    1983-01-01

    Because many future spacecraft or space stations will require mechanisms to operate for long periods of time in environments which are adverse to most bearing lubricants, a series of tests is continuing to evaluate 38 grease-type lubricants in R-4 size bearings in five different environments for a 1-year period. Four repetitions of each test are made to provide statistical samples. These tests have also been used to select four lubricants for 5-year tests in selected environments with five repetitions of each test for statistical samples. At the present time, 142 test sets have been completed and 30 test sets are underway. The three 5-year tests in (1) continuous operation and (2) start-stop operation, with both in vacuum at ambient temperatures, and (3) continuous vacuum operation at 93.3 °C are now completed. To date, in both the 1-year and 5-year tests, the best results in all environments have been obtained with a high-viscosity-index perfluoroalkylpolyether (PFPE) grease.

  13. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks

    PubMed Central

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-01-01

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, which is the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even if the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function that scores the similarity of two subsequences with full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially when the time series is short, compared to MSE and composite multiscale entropy (CMSE). The proposed FMSE method is capable of improving the performance of topology and traffic congestion control techniques based on time series analysis. PMID:28383496

  14. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks.

    PubMed

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-04-06

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, which is the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even if the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function that scores the similarity of two subsequences with full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially when the time series is short, compared to MSE and composite multiscale entropy (CMSE). The proposed FMSE method is capable of improving the performance of topology and traffic congestion control techniques based on time series analysis.
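
    The core idea, replacing sample entropy's hard 0/1 template match with a smooth similarity in [0, 1], can be sketched as follows. The sigmoid similarity used here is an illustrative choice and not necessarily the exact function proposed in the paper.

```python
import numpy as np

def soft_sample_entropy(x, m=2, r=0.2, slope=20.0, soft=True):
    """Sample-entropy-style statistic in which the hard 0/1 template match is
    replaced by a smooth similarity in [0, 1] (a sigmoid of the Chebyshev
    distance). With soft=False it reduces to the ordinary hard threshold."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def total_similarity(length):
        # All templates of the given length, using N - m of them for both
        # lengths so the counts stay comparable (standard SampEn convention).
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        total = 0.0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev
            d = np.delete(d, i)                                   # no self-match
            if soft:
                total += np.sum(1.0 / (1.0 + np.exp(np.clip(slope * (d - tol),
                                                            -50.0, 50.0))))
            else:
                total += np.sum(d <= tol)
        return total

    return -np.log(total_similarity(m + 1) / total_similarity(m))

rng = np.random.default_rng(1)
print(soft_sample_entropy(rng.standard_normal(300)))               # smooth variant
print(soft_sample_entropy(rng.standard_normal(300), soft=False))   # hard variant
```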

  15. Empirical method to measure stochasticity and multifractality in nonlinear time series

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping

    2013-12-01

    An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.
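
    One simple, hedged way to quantify deviation from a Wiener process is to estimate the scaling exponent of the increment variances and measure its distance from the Brownian value H = 0.5; this is an illustrative parameter, not necessarily the one defined by the authors.

```python
import numpy as np

def wiener_deviation(x, lags=range(1, 21)):
    """Estimate the scaling exponent H from the lag dependence of increment
    variance, Var[x(t+tau) - x(t)] ~ tau**(2H), and report |H - 0.5| as a
    simple deviation-from-Wiener measure."""
    x = np.asarray(x, dtype=float)
    lags = np.asarray(list(lags))
    variances = np.array([np.var(x[lag:] - x[:-lag]) for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(variances), 1)
    H = slope / 2.0
    return H, abs(H - 0.5)

rng = np.random.default_rng(2)
random_walk = np.cumsum(rng.standard_normal(5000))  # discrete Wiener-like walk
print(wiener_deviation(random_walk))                # H should be close to 0.5
```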

  16. The Role of Amodal Surface Completion in Stereoscopic Transparency

    PubMed Central

    Anderson, Barton L.; Schmid, Alexandra C.

    2012-01-01

    Previous work has shown that the visual system can decompose stereoscopic textures into percepts of inhomogeneous transparency. We investigate whether this form of layered image decomposition is shaped by constraints on amodal surface completion. We report a series of experiments that demonstrate that stereoscopic depth differences are easier to discriminate when the stereo images generate a coherent percept of surface color than when the images require amodally integrating a series of color changes into a coherent surface. Our results provide further evidence for the intimate link between the segmentation processes that occur in conditions of transparency and occlusion, and the interpolation processes involved in the formation of amodally completed surfaces. PMID:23060829

  17. Complete description of all self-similar models driven by Lévy stable noise

    NASA Astrophysics Data System (ADS)

    Weron, Aleksander; Burnecki, Krzysztof; Mercik, Szymon; Weron, Karina

    2005-01-01

    A canonical decomposition of H-self-similar Lévy symmetric α-stable processes is presented. The resulting components, completely described by both deterministic kernels and the corresponding stochastic integral with respect to the Lévy symmetric α-stable motion, are shown to be related to the dissipative and conservative parts of the dynamics. This result provides stochastic analysis tools for studying anomalous diffusion phenomena in the Langevin equation framework. For example, a simple computer test for the origins of self-similarity is implemented for four real empirical time series recorded from different physical systems: an ionic current flow through a single channel in a biological membrane, the energy of solar flares, a seismic electric signal recorded during seismic Earth activity, and foreign exchange rate daily returns.
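
    For reference, the H-self-similarity property on which the decomposition rests is the distributional scaling relation below (Brownian motion has H = 1/2; symmetric α-stable Lévy motion has H = 1/α).

```latex
% H-self-similarity of a process {X(t)}: for every a > 0 the time-rescaled
% process has the same finite-dimensional distributions as the original
% process scaled by a^H.
\{ X(at) \}_{t \ge 0} \;\overset{d}{=}\; \{ a^{H} X(t) \}_{t \ge 0},
\qquad \text{for all } a > 0 .
```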

  18. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    PubMed

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
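
    The construction described, coarse-graining followed by delay maps, can be sketched in a few lines; scale 1 reproduces the ordinary Poincaré plot, and the function names and figure layout below are illustrative choices.

```python
import numpy as np
import matplotlib.pyplot as plt

def coarse_grain(x, scale):
    """Non-overlapping window averages, as used in multiscale entropy/MSP."""
    n = (len(x) // scale) * scale
    return np.asarray(x[:n], dtype=float).reshape(-1, scale).mean(axis=1)

def multiscale_poincare(rr, scales=(1, 2, 4, 8)):
    """Plot RR[n+1] versus RR[n] for the original and coarse-grained series."""
    fig, axes = plt.subplots(1, len(scales), figsize=(3 * len(scales), 3))
    for ax, scale in zip(axes, scales):
        y = coarse_grain(rr, scale)
        ax.plot(y[:-1], y[1:], ".", markersize=2)
        ax.set_title(f"scale {scale}")
        ax.set_xlabel("RR[n]")
        ax.set_ylabel("RR[n+1]")
    fig.tight_layout()
    return fig

# White noise contracts sharply with scale; a 1/f-like series would not.
rng = np.random.default_rng(3)
multiscale_poincare(rng.standard_normal(4000))
plt.show()
```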

  19. Measurement error in time-series analysis: a simulation study comparing modelled and monitored data.

    PubMed

    Butland, Barbara K; Armstrong, Ben; Atkinson, Richard W; Wilkinson, Paul; Heal, Mathew R; Doherty, Ruth M; Vieno, Massimo

    2013-11-13

    Assessing health effects from background exposure to air pollution is often hampered by the sparseness of pollution monitoring networks. However, regional atmospheric chemistry-transport models (CTMs) can provide pollution data with national coverage at fine geographical and temporal resolution. We used statistical simulation to compare the impact on epidemiological time-series analysis of additive measurement error in sparse monitor data as opposed to geographically and temporally complete model data. Statistical simulations were based on a theoretical area of 4 regions each consisting of twenty-five 5 km × 5 km grid-squares. In the context of a 3-year Poisson regression time-series analysis of the association between mortality and a single pollutant, we compared the error impact of using daily grid-specific model data as opposed to daily regional average monitor data. We investigated how this comparison was affected if we changed the number of grids per region containing a monitor. To inform simulations, estimates (e.g. of pollutant means) were obtained from observed monitor data for 2003-2006 for national network sites across the UK and corresponding model data that were generated by the EMEP-WRF CTM. Average within-site correlations between observed monitor and model data were 0.73 and 0.76 for rural and urban daily maximum 8-hour ozone respectively, and 0.67 and 0.61 for rural and urban loge(daily 1-hour maximum NO2). When regional averages were based on 5 or 10 monitors per region, health effect estimates exhibited little bias. However, with only 1 monitor per region, the regression coefficient in our time-series analysis was attenuated by an estimated 6% for urban background ozone, 13% for rural ozone, 29% for urban background loge(NO2) and 38% for rural loge(NO2). For grid-specific model data the corresponding figures were 19%, 22%, 54% and 44% respectively, i.e. similar for rural loge(NO2) but more marked for urban loge(NO2). Even if correlations between model and monitor data appear reasonably strong, additive classical measurement error in model data may lead to appreciable bias in health effect estimates. As process-based air pollution models become more widely used in epidemiological time-series analysis, assessments of error impact that include statistical simulation may be useful.
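
    The attenuation mechanism the study quantifies can be reproduced numerically on synthetic data: classical additive error in the exposure biases the Poisson regression coefficient toward the null. The error variances, baseline rate, and series length below are arbitrary illustrations, not values from the paper; statsmodels is assumed for the GLM fit.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_days, beta_true = 1095, 0.02            # 3-year daily series, true log-rate slope

true_pollutant = 20 + 5 * np.sin(2 * np.pi * np.arange(n_days) / 365) \
                 + rng.normal(0, 2, n_days)
deaths = rng.poisson(np.exp(2.0 + beta_true * true_pollutant))

def fitted_slope(exposure):
    X = sm.add_constant(exposure)
    return sm.GLM(deaths, X, family=sm.families.Poisson()).fit().params[1]

# Classical additive measurement error in the exposure (standing in for sparse
# monitors or an imperfect chemistry-transport model) attenuates the estimate.
for error_sd in (0.0, 2.0, 4.0):
    measured = true_pollutant + rng.normal(0, error_sd, n_days)
    print(f"error sd {error_sd}: estimated beta = {fitted_slope(measured):.4f} "
          f"(true {beta_true})")
```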

  20. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series through minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method for the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach using a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
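
    A compact sketch of DTW barycenter averaging (DBA), the baseline that m_g(dp)² builds on, is given below. This is the standard averaging idea, not the paper's degree-pruning g(dp)² algorithm, and the test data are synthetic.

```python
import numpy as np

def dtw_path(a, b):
    """DTW alignment path between two 1-D series via dynamic programming."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = abs(a[i - 1] - b[j - 1]) + min(D[i - 1, j], D[i, j - 1],
                                                     D[i - 1, j - 1])
    path, (i, j) = [], (n, m)
    while i > 0 and j > 0:                          # backtrack the cheapest path
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j), (i, j - 1), (i - 1, j - 1)],
                   key=lambda idx: D[idx])
    return path[::-1]

def dba_iteration(center, series_set):
    """One DTW-barycenter-averaging step: align every series to the current
    center and replace each center point by the mean of the points mapped
    onto it. Iterating this refines a central (average) time series."""
    buckets = [[] for _ in center]
    for s in series_set:
        for i, j in dtw_path(center, s):
            buckets[i].append(s[j])
    return np.array([np.mean(b) if b else c for b, c in zip(buckets, center)])

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 50)
group = [np.sin(2 * np.pi * (t - rng.uniform(0.0, 0.1))) for _ in range(10)]
center = group[0].copy()
for _ in range(5):                                  # a few refinement iterations
    center = dba_iteration(center, group)
```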

  1. Maxillary reaction patterns identified by three-dimensional analysis of casts from infants with unilateral cleft lip and palate.

    PubMed

    Neuschulz, J; Schaefer, I; Scheer, M; Christ, H; Braumann, B

    2013-07-01

    In order to visualize and quantify the direction and extent of morphological upper-jaw changes in infants with unilateral cleft lip and palate (UCLP) during early orthodontic treatment, a three-dimensional method of cast analysis for routine application was developed. In the present investigation, this method was used to identify reaction patterns associated with specific cleft forms. The study included a cast series reflecting the upper-jaw situations of 46 infants with complete (n=27) or incomplete (n=19) UCLP during week 1 and months 3, 6, and 12 of life. Three-dimensional datasets were acquired and visualized with scanning software (DigiModel®; OrthoProof, The Netherlands). Following interactive identification of landmarks on the digitized surface relief, a defined set of representative linear parameters were three-dimensionally measured. At the same time, the three-dimensional surfaces of one patient series were superimposed based on a defined reference plane. Morphometric differences were statistically analyzed. Thanks to the user-friendly software, all landmarks could be identified quickly and reproducibly, thus, allowing for simultaneous three-dimensional measurement of all defined parameters. The measured values revealed that significant morphometric differences were present in all three planes of space between the two patient groups. Patients with complete UCLP underwent significantly larger reductions in cleft width (p<0.001), and sagittal growth in the complete UCLP group exceeded sagittal growth in the incomplete UCLP group by almost 50% within the first year of life. Based on patients with incomplete versus complete UCLP, different reaction patterns were identified that depended not on apparent severities of malformation but on cleft forms.

  2. From Networks to Time Series

    NASA Astrophysics Data System (ADS)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
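
    The transformation described, classical multidimensional scaling applied to a network's shortest-path distance matrix, can be sketched as follows; the networkx package is assumed for the graph and the distance computation.

```python
import numpy as np
import networkx as nx

def network_to_series(G, dim=1):
    """Classical multidimensional scaling (MDS) of a graph's shortest-path
    distance matrix; reading the leading coordinate(s) in node order yields a
    'time series' representation of the network."""
    D = np.asarray(nx.floyd_warshall_numpy(G), dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                   # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:dim]       # leading components
    coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))
    return coords[:, 0] if dim == 1 else coords

# A ring lattice (rewiring probability 0) maps to a periodic-looking series;
# increasing the rewiring probability makes the resulting series noisier.
series = network_to_series(nx.watts_strogatz_graph(100, 4, 0.0))
```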

  3. Forest canopy growth dynamic modeling based on remote sensing products and meteorological data in Daxing'anling of Northeast China

    NASA Astrophysics Data System (ADS)

    Wu, Qiaoli; Song, Jinling; Wang, Jindi; Xiao, Zhiqiang

    2014-11-01

    Leaf Area Index (LAI) is an important biophysical variable for vegetation. Compared with vegetation indexes such as NDVI and EVI, LAI is more capable of monitoring forest canopy growth quantitatively. GLASS LAI is a spatially complete and temporally continuous product derived from AVHRR and MODIS reflectance data. In this paper, we present an approach to building dynamic LAI growth models for young and mature Larix gmelinii forest in north Daxing'anling in Inner Mongolia of China using the Dynamic Harmonic Regression (DHR) model and the Double Logistic (D-L) model respectively, based on the time series extracted from multi-temporal GLASS LAI data. Meanwhile, we used the dynamic threshold method to extract the key phenological phases of Larix gmelinii forest from the simulated time series. Then, through analysis of the relationship between the phenological phases and meteorological factors, we found that the annual peak LAI and the annual maximum temperature are well correlated. The results indicate that this forest canopy growth dynamic model is very effective in predicting forest canopy LAI growth and extracting forest canopy LAI growth dynamics.
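
    A commonly used double-logistic seasonal curve, fitted here with scipy, illustrates the D-L model and the dynamic-threshold phenology extraction described above; the parameter values and synthetic samples are illustrative, not GLASS data.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(doy, base, amp, k1, t_on, k2, t_off):
    """Standard double-logistic seasonal curve: a spring rise and an autumn
    decline superimposed on a background (winter) LAI level."""
    rise = 1.0 / (1.0 + np.exp(-k1 * (doy - t_on)))
    fall = 1.0 / (1.0 + np.exp(-k2 * (doy - t_off)))
    return base + amp * (rise - fall)

# Illustrative 8-day LAI samples over one growing season (synthetic, not GLASS).
doy = np.arange(1, 366, 8, dtype=float)
rng = np.random.default_rng(6)
lai = double_logistic(doy, 0.5, 3.0, 0.12, 140, 0.10, 280) \
      + rng.normal(0.0, 0.15, doy.size)

popt, _ = curve_fit(double_logistic, doy, lai,
                    p0=[0.5, 3.0, 0.1, 130, 0.1, 270], maxfev=10000)
fitted = double_logistic(doy, *popt)

# Dynamic-threshold phenology: start of season is the first day on which the
# fitted curve exceeds a fixed fraction of its seasonal amplitude.
threshold = fitted.min() + 0.2 * (fitted.max() - fitted.min())
start_of_season = doy[np.argmax(fitted > threshold)]
```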

  4. 76 FR 14111 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... Options Series, adjusted option series and any options series until the time to expiration for such series... time to expiration for such series is less than nine months be treated differently. Specifically, under... until the time to expiration for such series is less than nine months. Accordingly, the requirement to...

  5. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance

    PubMed Central

    Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it can extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. By adopting the Single-Pass and Online patterns, our algorithms can handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms use DTW to measure the distance between pairs of time series, encouraging higher clustering accuracy because DTW can determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches yield high-quality clusters and are better than all the competitors in terms of clustering accuracy. PMID:29795600

  6. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance.

    PubMed

    Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it can extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. By adopting the Single-Pass and Online patterns, our algorithms can handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms use DTW to measure the distance between pairs of time series, encouraging higher clustering accuracy because DTW can determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches yield high-quality clusters and are better than all the competitors in terms of clustering accuracy.

  7. Solar panel acceptance testing using a pulsed solar simulator

    NASA Technical Reports Server (NTRS)

    Hershey, T. L.

    1977-01-01

    Utilizing specific parameters such as the area of an individual cell, the number of cells in series and parallel, and established coefficients of current and voltage temperature dependence, a solar array irradiated with one solar constant at AM0 and at ambient temperature can be characterized by a current-voltage curve for different intensities, temperatures, and even different configurations. Calibration techniques include uniformity in area, depth and time; absolute and transfer irradiance standards; and dynamic and functional checkout procedures. Typical data are given for an individual cell (2x2 cm) through a complete flat solar array (5x5 feet) with 2660 cells, and for cylindrical test items with up to 10,000 cells. The time and energy savings of such testing techniques are emphasized.

  8. Effect of time delay on surgical performance during telesurgical manipulation.

    PubMed

    Fabrizio, M D; Lee, B R; Chan, D Y; Stoianovici, D; Jarrett, T W; Yang, C; Kavoussi, L R

    2000-03-01

    Telementoring allows a less experienced surgeon to benefit from an expert surgical consultation, reducing cost, travel, and the learning curve associated with new procedures. However, there are several technical limitations that affect practical applications. One potentially serious problem is the time delay that occurs any time data are transferred across long distances. To date, the effect of time delay on surgical performance has not been studied. A two-phase trial was designed to examine the effect of time delay on surgical performance. In the first phase, a series of tasks was performed, and the number of robotic movements required for completion was counted. Programmed incremental time delays were made in audiovisual acquisition and robotic controls. The number of errors made while performing each task at various time delay intervals was noted. In the second phase, a remote surgeon in Baltimore performed the tasks 9000 miles away in Singapore. The number of errors made was recorded. As the time delay increased, the number of operator errors increased. The accuracy needed to perform remote robotic procedures was diminished as the time delay increased. A learning curve did exist for each task, but as the time delay interval increased, it took longer to complete the task. Time delay does affect surgical performance. There is an acceptable delay of <700 msec in which surgeons can compensate for this phenomenon. Clinical studies will be needed to evaluate the true impact of time delay.

  9. Design and Implementation of a Professional Development Course Series.

    PubMed

    Welch, Beth; Spooner, Joshua J; Tanzer, Kim; Dintzner, Matthew R

    2017-12-01

    Objective. To design and implement a longitudinal course series focused on professional development and professional identity formation in pharmacy students at Western New England University. Methods. A four-year, theme-based course series was designed to sequentially and longitudinally impart the values, attributes, and characteristics of a professional pharmacist. Requirements of the course include: goal planning and reflective assignments, submission of "Best Works," attendance at professional meetings, completion of service hours, annual completion of a Pharmacy Professionalism Instrument, attendance at Dean's Seminar, participation in roundtable discussions, and maintenance of an electronic portfolio. Though the Professional Development course series carries no credit, these courses are progression requirements and students are assessed on a pass/fail basis. Results. Course pass rates in the 2015-2016 academic year for all four classes were 99% to 100%, suggesting the majority of students take professional development seriously and are achieving the intended outcomes of the courses. Conclusion. A professional development course series was designed and implemented in the new Doctor of Pharmacy program at Western New England University to enhance the professional identity formation of students.

  10. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
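
    The natural visibility criterion underlying such graphs (two samples are linked if the straight line between them stays above every intermediate sample) can be implemented directly; the function below is a plain O(n²) illustration, not the authors' code.

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility graph of a series: samples i and j are connected if
    every intermediate sample lies strictly below the straight line joining
    (i, x[i]) and (j, x[j])."""
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)   # line height at k
            if np.all(x[k] < line):
                edges.add((i, j))
    return edges

rng = np.random.default_rng(7)
segment = rng.standard_normal(50)
print(len(visibility_graph(segment)), "edges in this segment's visibility graph")
```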

  11. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Essentials of Pediatric Emergency Medicine Fellowship: Part 6: Program Administration.

    PubMed

    Kim, In K; Zuckerbraun, Noel; Kou, Maybelle; Vu, Tien; Levasseur, Kelly; Yen, Kenneth; Chapman, Jennifer; Doughty, Cara; McAneney, Constance; Zaveri, Pavan; Hsu, Deborah

    2016-10-01

    This article is the sixth in a 7-part series that aims to comprehensively describe the current state and future directions of pediatric emergency medicine (PEM) fellowship training from the essential requirements to considerations for successfully administering and managing a program to the careers that may be anticipated upon program completion. This article provides a broad overview of administering and supervising a PEM fellowship program. It explores 3 topics: the principles of program administration, committee management, and recommendations for minimum time allocated for PEM fellowship program directors to administer their programs.

  13. The male reproductive system of Hippolyte inermis Leach 1815 (Decapoda, Caridea)

    NASA Astrophysics Data System (ADS)

    Cobos, Vanesa; Díaz, Vanessa; Raso, Jose Enrique García; Manjón-Cabeza, M. E.

    2011-03-01

    The present work completes a series of studies on the biology of the shrimp Hippolyte inermis Leach 1815, where we suggested the species to be gonochoristic. The morphology of the male reproductive system (testes, vasa deferentia, gonopores) and the different stages of male germ cell development are described for the first time in the genus Hippolyte, using TEM, SEM, and histological methods. All males from 1.70 to 3.42 mm in carapace length had active testes and well-developed vasa deferentia. No case of sex reversal could be found.

  14. Statistical physics of interacting neural networks

    NASA Astrophysics Data System (ADS)

    Kinzel, Wolfgang; Metzler, Richard; Kanter, Ido

    2001-12-01

    Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences, and the overlaps with the sequence generator as well as the prediction errors are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game, a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.

  15. Directional wave navigation radar measurements compared with pitch-roll buoy data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.-Munoyerro, M.A.; Borge, J.C.N.

    1997-02-01

    Knowledge of the spectral behavior of a specific sea region is complete when both surface elevations and directional wave movements are known. Usually, directional descriptions of the sea have been made using pitch-roll buoys, which can provide several wave-characteristic time series. Alternatively, there are other measurement systems, belonging to remote sensing techniques, such as shipboard navigation radars. The aim of the present work is to compare results obtained from pitch-roll buoy data and ship radar wave measurements acquired during a campaign in the Cantabrian Sea.

  16. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    PubMed

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
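
    A minimal continuous-wavelet sketch, written as an explicit Morlet convolution with numpy to avoid depending on any particular wavelet library, shows how the dominant period of a drifting rhythm can be tracked over time; the circadian-like test signal is synthetic.

```python
import numpy as np

def morlet_cwt(x, periods, dt=1.0, w0=6.0):
    """Continuous wavelet transform with a complex Morlet mother wavelet,
    implemented as direct convolution. Returns |coefficients| with shape
    (len(periods), len(x)); the column-wise maximum tracks the dominant
    period over time. The signal should be longer than the wavelet support
    (roughly six times the longest period)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.empty((len(periods), len(x)))
    for i, p in enumerate(periods):
        s = p * w0 / (2.0 * np.pi)                  # scale giving carrier period p
        t = np.arange(-3.0 * s, 3.0 * s + dt, dt)   # truncated wavelet support
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        power[i] = np.abs(np.convolve(x, np.conj(wavelet), mode="same"))
    return power

# A circadian-like rhythm whose period drifts from about 24 h to 26 h.
dt = 0.5                                            # sampling step in hours
hours = np.arange(0.0, 480.0, dt)
period_true = 24.0 + 2.0 * hours / hours.max()
signal = np.cos(2.0 * np.pi * np.cumsum(dt / period_true))
periods = np.arange(20.0, 30.0, 0.25)
ridge = periods[np.argmax(morlet_cwt(signal, periods, dt=dt), axis=0)]  # period vs time
```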

  17. GIAnT - Generic InSAR Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Agram, P.; Jolivet, R.; Riel, B. V.; Simons, M.; Doin, M.; Lasserre, C.; Hetland, E. A.

    2012-12-01

    We present a computing framework for studying the spatio-temporal evolution of ground deformation from interferometric synthetic aperture radar (InSAR) data. Several open-source tools including Repeat Orbit Interferometry PACkage (ROI-PAC) and InSAR Scientific Computing Environment (ISCE) from NASA-JPL, and Delft Object-oriented Repeat Interferometric Software (DORIS), have enabled scientists to generate individual interferograms from raw radar data with relative ease. Numerous computational techniques and algorithms that reduce phase information from multiple interferograms to a deformation time-series have been developed and verified over the past decade. However, the sharing and direct comparison of products from multiple processing approaches has been hindered by - 1) absence of simple standards for sharing of estimated time-series products, 2) use of proprietary software tools with license restrictions and 3) the closed source nature of the exact implementation of many of these algorithms. We have developed this computing framework to address all of the above issues. We attempt to take the first steps towards creating a community software repository for InSAR time-series analysis. To date, we have implemented the short baseline subset algorithm (SBAS), NSBAS and multi-scale interferometric time-series (MInTS) in this framework and the associated source code is included in the GIAnT distribution. A number of the associated routines have been optimized for performance and scalability with large data sets. Some of the new features in our processing framework are - 1) the use of daily solutions from continuous GPS stations to correct for orbit errors, 2) the use of meteorological data sets to estimate the tropospheric delay screen and 3) a data-driven bootstrapping approach to estimate the uncertainties associated with estimated time-series products. We are currently working on incorporating tidal load corrections for individual interferograms and propagation of noise covariance models through the processing chain for robust estimation of uncertainties in the deformation estimates. We will demonstrate the ease of use of our framework with results ranging from regional scale analysis around Long Valley, CA and Parkfield, CA to continental scale analysis in Western South America. We will also present preliminary results from a new time-series approach that simultaneously estimates deformation over the complete spatial domain at all time epochs on a distributed computing platform. GIAnT has been developed entirely using open source tools and uses Python as the underlying platform. We build on the extensive numerical (NumPy) and scientific (SciPy) computing Python libraries to develop an object-oriented, flexible and modular framework for time-series InSAR applications. The toolbox is currently configured to work with outputs from ROI-PAC, ISCE and DORIS, but can easily be extended to support products from other SAR/InSAR processors. The toolbox libraries include support for hierarchical data format (HDF5) memory mapped files, parallel processing with Python's multi-processing module and support for many convex optimization solvers like CSDP, CVXOPT etc. An extensive set of routines to deal with ASCII and XML files has also been included for controlling the processing parameters.
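
    The SBAS idea implemented in the toolbox can be illustrated for a single pixel: each interferogram is a phase (displacement) difference between two acquisition epochs, so stacking them yields a linear system solved by least squares for the displacement time series. The pair network and data below are synthetic and do not reflect GIAnT's actual interface.

```python
import numpy as np

# Toy SBAS-style inversion for a single pixel: every interferogram measures the
# displacement difference between two acquisition epochs, so the set of
# interferograms defines a linear system d = A m solved by least squares for
# the displacement at each epoch (relative to the first).
n_epochs = 12
pairs = [(i, j) for i in range(n_epochs) for j in range(i + 1, min(i + 4, n_epochs))]

rng = np.random.default_rng(8)
true_disp = np.cumsum(rng.normal(0.2, 0.1, n_epochs))    # synthetic deformation (cm)
true_disp -= true_disp[0]                                # referenced to first epoch

A = np.zeros((len(pairs), n_epochs - 1))
d = np.empty(len(pairs))
for k, (i, j) in enumerate(pairs):
    if i > 0:
        A[k, i - 1] = -1.0
    A[k, j - 1] = 1.0
    d[k] = true_disp[j] - true_disp[i] + rng.normal(0.0, 0.05)  # noisy interferogram

m, *_ = np.linalg.lstsq(A, d, rcond=None)
estimated_series = np.concatenate([[0.0], m])            # displacement time series
```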

  18. EMC: Air Quality Forecast Home page

    Science.gov Websites

    archive NAM Verification Meteorology Error Time Series EMC NAM Spatial Maps Real Time Mesoscale Analysis Precipitation verification NAQFC VERIFICATION CMAQ Ozone & PM Error Time Series AOD Error Time Series HYSPLIT Smoke forecasts vs GASP satellite Dust and Smoke Error Time Series HYSPLIT WCOSS Upgrade (July

  19. Time Series Remote Sensing in Monitoring the Spatio-Temporal Dynamics of Plant Invasions: A Study of Invasive Saltcedar (Tamarix Spp.)

    NASA Astrophysics Data System (ADS)

    Diao, Chunyuan

    In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand complex and dynamic systems (e.g., plant invasions). Time series remote sensing is becoming increasingly important for monitoring earth system dynamics and interactions. To date, most time series remote sensing studies have been conducted with images acquired at coarse spatial scales, owing to their relatively high temporal resolution. The construction of time series at fine spatial scales, however, has been limited to a few discrete images acquired within or across years. The objective of this research is to advance time series remote sensing at fine spatial scales, particularly to shift from discrete to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., a phenological time series remote sensing model, a temporal partial unmixing method, a multiyear spectral angle clustering model, and a time series remote sensing-based spatially explicit species distribution model) were developed to achieve these objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions by characterizing the seasonal phenological dynamics of plant species throughout the year. The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the detailed temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repeated saltcedar mapping over space and time. By incorporating spatial autocorrelation, the species distribution model developed in the study could identify suitable habitats of saltcedar at a fine spatial scale and locate areas at high risk of saltcedar infestation. Among the 10 environmental variables, the distance to the river and the phenological attributes summarized from the time series remote sensing were regarded as the most important. The methods developed in this study provide new perspectives on how continuous time series can be leveraged under various conditions to investigate plant invasion dynamics.

  20. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, time series forecasting has attracted considerable attention in the analysis of financial time series data, specifically stock market indices. Stock market forecasting remains a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to demonstrate the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to that of the traditional Holt-Winters method.
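
    As a rough illustration of the hybrid idea described above (not the authors' implementation), the sketch below decomposes a series into intrinsic mode functions, forecasts each component with Holt-Winters exponential smoothing, and sums the component forecasts. It assumes the third-party PyEMD package and statsmodels are available; the toy price series and the forecast horizon are hypothetical.

```python
import numpy as np
from PyEMD import EMD                        # assumes the PyEMD ("EMD-signal") package is installed
from statsmodels.tsa.holtwinters import ExponentialSmoothing


def emd_hw_forecast(y, horizon=10):
    """Illustrative EMD + Holt-Winters hybrid: decompose, forecast each component, recombine."""
    y = np.asarray(y, dtype=float)
    imfs = EMD().emd(y)                      # intrinsic mode functions
    residue = y - imfs.sum(axis=0)           # whatever remains after the extracted IMFs
    components = list(imfs) + [residue]

    forecast = np.zeros(horizon)
    for c in components:
        fit = ExponentialSmoothing(c, trend="add", seasonal=None).fit()
        forecast += np.asarray(fit.forecast(horizon))
    return forecast


# Toy "daily index" series (hypothetical data, not one of the 11 markets studied).
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0.1, 1.0, 500))
print(emd_hw_forecast(prices, horizon=5))
```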

  1. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
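
    hctsa itself is a MATLAB toolbox computing over 7,700 features; the Python toy below only illustrates the underlying idea of extracting many features from labelled time series and ranking them by how well they separate two groups. The handful of features, the simulated "phenotype" groups, and the separation score are all hypothetical stand-ins for the real pipeline.

```python
import numpy as np

# Toy "highly comparative" feature extraction: compute a small feature dictionary per series,
# then rank features by a crude two-sample separation score between two groups.

def features(x):
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    return {
        "mean": x.mean(),
        "std": x.std(),
        "lag1_autocorr": np.corrcoef(x[:-1], x[1:])[0, 1],
        "mean_abs_change": np.abs(d).mean(),
        "skewness": ((x - x.mean()) ** 3).mean() / (x.std() ** 3 + 1e-12),
    }

rng = np.random.default_rng(1)
group_a = [np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 0.3, 300) for _ in range(20)]
group_b = [np.cumsum(rng.normal(0, 0.3, 300)) for _ in range(20)]

names = list(features(group_a[0]))
A = np.array([[features(x)[n] for n in names] for x in group_a])
B = np.array([[features(x)[n] for n in names] for x in group_b])

# Difference of group means divided by the pooled standard deviation, per feature.
score = np.abs(A.mean(0) - B.mean(0)) / np.sqrt(0.5 * (A.var(0) + B.var(0)) + 1e-12)
for name, s in sorted(zip(names, score), key=lambda t: -t[1]):
    print(f"{name:18s} separation = {s:.2f}")
```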

  2. Temporal and spatial distribution of isotopes in river water in Central Europe: 50 years experience with the Austrian network of isotopes in rivers.

    PubMed

    Rank, Dieter; Wyhlidal, Stefan; Schott, Katharina; Weigand, Silvia; Oblin, Armin

    2018-05-01

    The Austrian network of isotopes in rivers comprises about 15 sampling locations and has been operated since 1976. The Danube isotope time series goes back to 1963. The isotopic composition of river water in Central Europe is mainly governed by the isotopic composition of precipitation in the catchment area; evaporation effects play only a minor role. Short-term and long-term isotope signals in precipitation are thus transmitted through the whole catchment. The influence of climatic changes has become observable in the long-term stable isotope time series of precipitation and surface waters. Environmental ³H values were around 8 TU in 2015; short-term ³H pulses of up to about 80 TU in the rivers Danube and March were a consequence of releases from nuclear power plants. The complete isotope data series of this network will be included in the Global Network of Isotopes in Rivers database of the International Atomic Energy Agency (IAEA) in 2017. This article comprises a review of 50 years of isotope monitoring on rivers and is also intended to provide base information on the (isotope-)hydrological conditions in Central Europe, specifically for the end-users of these data, e.g. for modelling hydrological processes. Furthermore, this paper includes the 2006-2015 supplement adding to the Danube isotope set published earlier.

  3. Enhancing sequential time perception and storytelling ability of deaf and hard of hearing children.

    PubMed

    Ingber, Sara; Eden, Sigal

    2011-01-01

    A 3-month intervention was conducted to enhance the sequential time perception and storytelling ability of young children with hearing loss. The children were trained to arrange pictorial episodes of temporal scripts and tell the stories they created. Participants (N = 34, aged 4-7 years) were divided into 2 groups based on whether their spoken-language gap was more or less than 1 year compared to age norms. They completed A. Kaufman and N. Kaufman's (1983) picture series subtest and Guralnik's (1982) storytelling test at pretest and posttest. Measures demonstrated significant improvement in sequential time and storytelling achievement postintervention. Three of the examined demographic variables revealed correlations: Participants with genetic etiology showed greater improvement in time sequencing and storytelling than participants with unknown etiology; early onset of treatment correlated with better achievement in time sequencing; cochlear implant users showed greater storytelling improvement than hearing aid users.

  4. Is globalization really good for public health?

    PubMed

    Tausch, Arno

    2016-10-01

    In the light of recent very prominent studies, especially that of Mukherjee and Krieckhaus (), one might initially be tempted to assume that nowadays globalization is a driver of good public health performance in the entire world system. Most of these studies use time series analyses based on the KOF Index of Globalization. We attempt to re-analyze the entire question, using a variety of methodological approaches and data. Our re-analysis shows that neoliberal globalization has resulted in very important implosions of public health development in various regions of the world and in increasing inequality in the countries of the world system, which in turn negatively affects health performance. We use standard IBM SPSS ordinary least squares (OLS) regressions, time series and cross-correlation analyses based on aggregate, freely available data. Different components of the KOF Index, most notably actual capital inflows, affect public health negatively. The "decomposition" of the available data suggests that for most of the time period of the last four decades, globalization inflows even implied an aggregate deterioration of public health, quite in line with globalization-critical studies. We introduce the effects of inequality on public health, widely debated in global public health research. Our annual time series for 99 countries show that globalization indeed leads to increased inequality, and this, in turn, leads to a deteriorating public health performance. In only 19 of the surveyed 99 nations with complete data (i.e., 19.1%) did globalization actually precede an improvement in public health performance. Far from falsifying globalization-critical research, our analyses show the basic weaknesses of the new "pro-globalization" literature in the public health profession. Copyright © 2015 John Wiley & Sons, Ltd.

  5. Trend analysis of air temperature and precipitation time series over Greece: 1955-2010

    NASA Astrophysics Data System (ADS)

    Marougianni, G.; Melas, D.; Kioutsioukis, I.; Feidas, H.; Zanis, P.; Anandranistakis, E.

    2012-04-01

    In this study, a database of air temperature and precipitation time series from the network of the Hellenic National Meteorological Service has been developed in the framework of the project GEOCLIMA, co-financed by the European Union and Greek national funds through the Operational Program "Competitiveness and Entrepreneurship" of the Research Funding Program COOPERATION 2009. Initially, a quality test was applied to the raw data, and missing observations were then imputed with a regularized spatio-temporal expectation-maximization algorithm to complete the climatic record. Next, a quantile-matching algorithm was applied in order to verify the homogeneity of the data. The processed time series were used to calculate annual and seasonal trends of air temperature and precipitation. Monthly maximum and minimum surface air temperature and precipitation means at all available stations in Greece were analyzed for temporal trends and spatial variation patterns over the longest common period of homogeneous data (1955-2010), applying the Mann-Kendall test. The majority of the examined stations showed a significant increase in summer maximum and minimum temperatures; this could possibly be physically linked to the Etesian winds, because of the less frequent expansion of the low over the southeastern Mediterranean. Summer minimum temperatures have been increasing at a faster rate than summer maximum temperatures, reflecting an asymmetric change of the extreme temperature distributions. Total annual precipitation has decreased significantly at stations located in western Greece, as well as in the southeast, while the remaining areas exhibit a non-significant negative trend. This reduction is very likely linked to the positive phase of the NAO, which resulted in an increase in the frequency and persistence of anticyclones over the Mediterranean.
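
    For reference, the Mann-Kendall trend test used above can be written in a few lines. The sketch below omits the tie correction that a production implementation would include and is applied to a hypothetical annual temperature-anomaly series, not to the GEOCLIMA data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Minimal Mann-Kendall trend test (no correction for tied values)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))       # two-sided p-value
    return s, z, p

# Hypothetical annual summer-minimum temperature anomalies, 1955-2010.
rng = np.random.default_rng(2)
years = np.arange(1955, 2011)
anomalies = 0.02 * (years - 1955) + rng.normal(0, 0.3, len(years))
print(mann_kendall(anomalies))
```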

  6. Methodology and results of calculating central California surface temperature trends: Evidence of human-induced climate change?

    USGS Publications Warehouse

    Christy, J.R.; Norris, W.B.; Redmond, K.; Gallo, K.P.

    2006-01-01

    A procedure is described to construct time series of regional surface temperatures and is then applied to interior central California stations to test the hypothesis that century-scale trend differences between irrigated and nonirrigated regions may be identified. The procedure requires documentation of every point in time at which a discontinuity in a station record may have occurred, through (a) the examination of metadata forms (e.g., station moves) and (b) simple statistical tests. From this, "homogeneous segments" of temperature records for each station are defined. Biases are determined for each segment relative to all others through a method employing mathematical graph theory. The debiased segments are then merged, forming a complete regional time series. Time series of daily maximum and minimum temperatures for stations in the irrigated San Joaquin Valley (Valley) and nearby nonirrigated Sierra Nevada (Sierra) were generated for 1910-2003. Results show that twentieth-century Valley minimum temperatures are warming at a highly significant rate in all seasons, being greatest in summer and fall (> +0.25 °C decade⁻¹). The Valley trend of annual mean temperatures is +0.07 ± 0.07 °C decade⁻¹. Sierra summer and fall minimum temperatures appear to be cooling, but at a less significant rate, while the trend of annual mean Sierra temperatures is an unremarkable -0.02 ± 0.10 °C decade⁻¹. A working hypothesis is that the relative positive trends in Valley minus Sierra minima (>0.4 °C decade⁻¹ for summer and fall) are related to the altered surface environment brought about by the growth of irrigated agriculture, essentially changing a high-albedo desert into a darker, moister, vegetated plain. © 2006 American Meteorological Society.

  7. Changing Family Roles - Across the Deployment Cycle

    DTIC Science & Technology

    2017-09-01

    recorded. In addition to the home interviews, we gather additional data using surveys and data bursts – a series of brief data collections within a...week. During deployment, the spouse / partner completes a series of surveys regarding daily communication with the (deployed) service member

  8. Hypoplasia or Absence of Posterior Leaflet: A Rare Congenital Anomaly of The Mitral Valve in Adulthood - Case Series.

    PubMed

    Parato, Vito Maurizio; Masia, Stefano Lucio

    2018-01-01

    We present a case series of two adult patients with almost complete absence of the posterior mitral valve leaflet, who were asymptomatic or mildly symptomatic and had two different degrees of mitral regurgitation.

  9. 3D displacement time series in the Afar rift zone computed from SAR phase and amplitude information

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manconi, Andrea

    2013-04-01

    Large and rapid deformations, such as those caused by earthquakes, eruptions, and landslides, cannot be fully measured using standard DInSAR applications. Indeed, the phase information often degrades and some areas of the interferograms are affected by high fringe rates, leading to difficulties in phase unwrapping and/or to complete loss of coherence due to significant misregistration errors. This limitation can be overcome by exploiting the SAR image amplitude information instead of the phase, and by calculating the Pixel-Offset (PO) field from SAR image pairs, in both the range and azimuth directions. Moreover, it is possible to combine the PO results following the same rationale as the SBAS technique, to retrieve an offset-based deformation time series. This technique, named PO-SBAS, retrieves the deformation field in areas affected by very large displacements with an accuracy that, for ENVISAT data, corresponds to 30 cm in range and 15 cm in azimuth [1]. Moreover, the combination of SBAS and PO-SBAS time series can help to better study and model deformation phenomena characterized by spatial and temporal heterogeneities [2]. The Dabbahu rift segment of the Afar depression has been active since 2005, when a 2.5 km³ dyke intrusion and hundreds of earthquakes marked the onset of a rifting episode that continues to date. The ENVISAT satellite has repeatedly imaged the Afar depression since 2003, generating a large SAR archive. In this work, we study the deformation of the Afar rift region using both the phase and amplitude information of several sets of SAR images acquired from ascending and descending ENVISAT tracks. We combined sets of small baseline interferograms through the SBAS algorithm, and we generated both ground deformation maps and time series along the satellite Line-Of-Sight (LOS). In areas where the deformation gradient causes loss of coherence, we retrieved the displacement field through the amplitude information. Furthermore, we could also retrieve the full 3D deformation field by considering the North-South displacement component obtained from the azimuth PO information. The combination of SBAS and PO-SBAS information makes it possible to better retrieve and constrain the full deformation field due to repeated intrusions, fault movements, and the movement of magma from individual magma chambers. [1] Casu, F., A. Manconi, A. Pepe and R. Lanari, 2011. Deformation time-series generation in areas characterized by large displacement dynamics: the SAR amplitude Pixel-Offset SBAS technique, IEEE Transactions on Geoscience and Remote Sensing. [2] Manconi, A. and F. Casu, 2012. Joint analysis of displacement time series retrieved from SAR phase and amplitude: impact on the estimation of volcanic source parameters, Geophysical Research Letters, doi:10.1029/2012GL052202.

  10. Time series momentum and contrarian effects in the Chinese stock market

    NASA Astrophysics Data System (ADS)

    Shi, Huai-Long; Zhou, Wei-Xing

    2017-10-01

    This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.

  11. A Multitaper, Causal Decomposition for Stochastic, Multivariate Time Series: Application to High-Frequency Calcium Imaging Data.

    PubMed

    Sornborger, Andrew T; Lauderdale, James D

    2016-11-01

    Neural data analysis has increasingly incorporated causal information to study circuit connectivity. Dimensional reduction forms the basis of most analyses of large multivariate time series. Here, we present a new, multitaper-based decomposition for stochastic, multivariate time series that acts on the covariance of the time series at all lags, C(τ), as opposed to standard methods that decompose the time series, X(t), using only information at zero lag. In both simulated and neural imaging examples, we demonstrate that methods that neglect the full causal structure may be discarding important dynamical information in a time series.
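
    The contrast between zero-lag and lagged covariance can be made concrete with a short sketch. The code below is not the authors' multitaper estimator; it only shows, on a hypothetical three-channel series with a delayed coupling, why a decomposition that sees only C(0) can miss structure that appears in C(τ) at the right lag.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 2000, 3
x = rng.normal(size=(T, n))
x[5:, 1] += 0.8 * x[:-5, 0]              # channel 1 is driven by channel 0 with a 5-sample delay

def lagged_cov(x, tau):
    """Sample covariance between x(t) and x(t + tau), columns centred first."""
    x = x - x.mean(axis=0)
    if tau == 0:
        return x.T @ x / len(x)
    return x[:-tau].T @ x[tau:] / (len(x) - tau)

C0 = lagged_cov(x, 0)
C5 = lagged_cov(x, 5)
print("zero-lag coupling 0 -> 1:", round(C0[0, 1], 3))   # near zero: invisible at zero lag
print("lag-5  coupling 0 -> 1:", round(C5[0, 1], 3))     # clearly nonzero: visible only at lag 5
```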

  12. A Scandcleft randomised trials of primary surgery for unilateral cleft lip and palate: 1. Planning and management.

    PubMed

    Semb, Gunvor; Enemark, Hans; Friede, Hans; Paulin, Gunnar; Lilja, Jan; Rautio, Jorma; Andersen, Mikael; Åbyholm, Frank; Lohmander, Anette; Shaw, William; Mølsted, Kirsten; Heliövaara, Arja; Bolund, Stig; Hukki, Jyri; Vindenes, Hallvard; Davenport, Peter; Arctander, Kjartan; Larson, Ola; Berggren, Anders; Whitby, David; Leonard, Alan; Neovius, Erik; Elander, Anna; Willadsen, Elisabeth; Bannister, R Patricia; Bradbury, Eileen; Henningsson, Gunilla; Persson, Christina; Eyres, Philip; Emborg, Berit; Kisling-Møller, Mia; Küseler, Annelise; Granhof Black, Birthe; Schöps, Antje; Bau, Anja; Boers, Maria; Andersen, Helene Søgaard; Jeppesen, Karin; Marxen, Dorte; Paaso, Marjukka; Hölttä, Elina; Alaluusua, Suvi; Turunen, Leena; Humerinta, Kirsti; Elfving-Little, Ulla; Tørdal, Inger Beate; Kjøll, Lillian; Aukner, Ragnhild; Hide, Øydis; Feragen, Kristin Billaud; Rønning, Elisabeth; Skaare, Pål; Brinck, Eli; Semmingsen, Ann-Magritt; Lindberg, Nina; Bowden, Melanie; Davies, Julie; Mooney, Jeanette; Bellardie, Haydn; Schofield, Nina; Nyberg, Jill; Lundberg, Maria; Karsten, Agneta Linder-Aronson; Larson, Margareta; Holmefjord, Anders; Reisæter, Sigvor; Pedersen, Nina-Helen; Rasmussen, Therese; Tindlund, Rolf; Sæle, Paul; Blomhoff, Reidunn; Jacobsen, Gry; Havstam, Christina; Rizell, Sara; Enocson, Lars; Hagberg, Catharina; Najar Chalien, Midia; Paganini, Anna; Lundeborg, Inger; Marcusson, Agneta; Mjönes, Anna-Britta; Gustavsson, Annica; Hayden, Christine; McAleer, Eilish; Slevan, Emma; Gregg, Terry; Worthington, Helen

    2017-02-01

    Longstanding uncertainty surrounds the selection of surgical protocols for the closure of unilateral cleft lip and palate, and randomised trials have only rarely been performed. This paper is an introduction to three randomised trials of primary surgery for children born with complete unilateral cleft lip and palate (UCLP). It presents the protocol developed for the trials in CONSORT format, and describes the management structure that was developed to achieve the long-term engagement and commitment required to complete the project. Ten established national or regional cleft centres participated. Lip and soft palate closure at 3-4 months, and hard palate closure at 12 months served as a common method in each trial. Trial 1 compared this with hard palate closure at 36 months. Trial 2 compared it with lip closure at 3-4 months and hard and soft palate closure at 12 months. Trial 3 compared it with lip and hard palate closure at 3-4 months and soft palate closure at 12 months. The primary outcomes were speech and dentofacial development, with a series of perioperative and longer-term secondary outcomes. Recruitment of 448 infants took place over a 9-year period, with 99.8% subsequent retention at 5 years. The series of reports that follow this introductory paper include comparisons at age 5 of surgical outcomes, speech outcomes, measures of dentofacial development and appearance, and parental satisfaction. The outcomes recorded and the numbers analysed for each outcome and time point are described in the series. ISRCTN29932826.

  13. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
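
    The wrap-around idea can be sketched directly with a polar plot: each month is mapped to an angle so that every year traces one ring, and the plotted radius carries the value. The monthly series, ring spacing, and labels below are hypothetical; this is not the RRose or WATS software itself.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
months = np.arange(120)                                    # ten years of monthly values
values = 100 + 5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 120)

theta = 2 * np.pi * (months % 12) / 12                     # angular position within the year
radius = months / 12 + values / values.max()               # one ring per year, modulated by the value

ax = plt.subplot(projection="polar")
ax.plot(theta, radius, marker=".", linewidth=0.8)
ax.set_xticks(2 * np.pi * np.arange(12) / 12)
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
ax.set_title("Wrap-around time series plot (illustrative)")
plt.show()
```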

  14. Nonlinear parametric model for Granger causality of time series

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2006-06-01

    The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
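
    For comparison with the nonlinear radial-basis-function approach proposed in the paper, the standard linear Granger-causality tests are readily available; the sketch below applies them to a simulated pair of series in which the second series drives the first with a two-step delay. The data and lag choices are hypothetical.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(5)
T = 500
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 2] + rng.normal(scale=0.5)   # x drives y with lag 2

# statsmodels convention: test whether the SECOND column Granger-causes the FIRST column.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=3)
```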

  15. Multivariate time series clustering on geophysical data recorded at Mt. Etna from 1996 to 2003

    NASA Astrophysics Data System (ADS)

    Di Salvo, Roberto; Montalto, Placido; Nunnari, Giuseppe; Neri, Marco; Puglisi, Giuseppe

    2013-02-01

    Time series clustering is an important task in data analysis, used to extract implicit, previously unknown, and potentially useful information from a large collection of data. Finding useful similar trends in multivariate time series represents a challenge in several areas, including geophysical and environmental research. While traditional time series analysis methods deal only with univariate time series, multivariate time series analysis is a more suitable approach in fields of research where different kinds of data are available. Moreover, conventional time series clustering techniques do not provide the desired results for geophysical datasets, due to the huge amount of data and sampling rates that differ according to the nature of the signal. In this paper, a novel approach to geophysical multivariate time series clustering is proposed, using dynamic time series segmentation and Self-Organizing Map techniques. This method allows finding couplings among trends of different geophysical data recorded by monitoring networks at Mt. Etna from 1996 to 2003, when the transition from summit eruptions to flank eruptions occurred. This information can be used to carry out a more careful evaluation of the state of the volcano and to support hazard assessment at Mt. Etna.

  16. Piloted simulation tests of propulsion control as backup to loss of primary flight controls for a mid-size jet transport

    NASA Technical Reports Server (NTRS)

    Bull, John; Mah, Robert; Davis, Gloria; Conley, Joe; Hardy, Gordon; Gibson, Jim; Blake, Matthew; Bryant, Don; Williams, Diane

    1995-01-01

    Failures of aircraft primary flight-control systems during flight have led to catastrophic accidents with subsequent loss of lives (e.g., the DC-10 crash, B-747 crash, C-5 crash, B-52 crash, and others). Dryden Flight Research Center (DFRC) investigated the use of engine thrust for emergency flight control of several airplanes, including the B-720, Lear 24, F-15, C-402, and B-747. A series of three piloted simulation tests has been conducted at Ames Research Center to investigate propulsion control for safely landing a medium-size jet transport that has experienced a total primary flight-control failure. The first series of tests was completed in July 1992 and defined the best interface for the pilot commands to drive the engines. The second series of tests was completed in August 1994 and investigated propulsion-controlled aircraft (PCA) display requirements and various command modes. The third series of tests was completed in May 1995 and investigated PCA full-flight-envelope capabilities. This report describes the concept of a PCA; discusses pilot controls, displays, and procedures; and presents the results of piloted simulation evaluations of the concept by a cross-section of air transport pilots.

  17. Evaluating a community-based exercise intervention with adults living with HIV: protocol for an interrupted time series study.

    PubMed

    O'Brien, Kelly K; Bayoumi, Ahmed M; Solomon, Patricia; Tang, Ada; Murzin, Kate; Chan Carusone, Soo; Zobeiry, Mehdi; Nayar, Ayesha; Davis, Aileen M

    2016-10-20

    Our aim was to evaluate a community-based exercise (CBE) intervention with the goal of reducing disability and enhancing health for community-dwelling people living with HIV (PLWH). We will use a mixed-methods implementation science study design, including a prospective longitudinal interrupted time series study, to evaluate a CBE intervention with PLWH in Toronto, Canada. We will recruit PLWH who consider themselves medically stable and safe to participate in exercise. In the baseline phase (0-8 months), participants will be monitored bimonthly. In the intervention phase (8-14 months), participants will take part in a 24-week CBE intervention that includes aerobic, resistance, balance and flexibility exercise at the YMCA 3 times per week, with weekly supervision by a fitness instructor, and monthly educational sessions. In the follow-up phase (14-22 months), participants will be encouraged to continue to engage in unsupervised exercise 3 times per week. Quantitative assessment: We will assess cardiopulmonary fitness, strength, weight, body composition and flexibility outcomes followed by the administration of self-reported questionnaires to assess disability and contextual factor outcomes (coping, mastery, stigma, social support) bimonthly. We will use time series regression analysis to determine the level and trend of outcomes across each phase in relation to the intervention. Qualitative assessment: We will conduct a series of face-to-face interviews with a subsample of participants and recreation providers at initiation, midpoint and completion of the 24-week CBE intervention. We will explore experiences and anticipated benefits with exercise, perceived impact of CBE for PLWH and the strengths and challenges of implementing a CBE intervention. Interviews will be audio recorded and analysed thematically. Protocol approved by the University of Toronto HIV/AIDS Research Ethics Board. Knowledge translation will occur with stakeholders in the form of presentations and publications in open access peer-reviewed journals. NCT02794415; Pre-results. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
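
    Interrupted time series analyses of this kind are commonly implemented as a segmented regression in which the level and trend are allowed to change when the intervention phase begins. The sketch below shows that structure on simulated bimonthly outcome data; the change point, coefficients, and data are hypothetical and are not taken from the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
t = np.arange(24)                                  # bimonthly assessments (hypothetical)
intervention_start = 8
post = (t >= intervention_start).astype(float)     # 1 once the intervention phase begins
time_since = np.where(post == 1, t - intervention_start, 0.0)

# Simulated outcome: flat baseline, then a jump in level and a positive slope post-intervention.
y = 50 + 0.0 * t + 4.0 * post + 0.8 * time_since + rng.normal(0, 1.5, len(t))

X = sm.add_constant(np.column_stack([t, post, time_since]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [baseline level, baseline trend, level change, trend change]
```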

  18. Reclaiming the past: Using hierarchical Bayesian analysis to fill missing values in the tide gauge mean sea level record, with application to extreme value analysis

    NASA Astrophysics Data System (ADS)

    Piecuch, C. G.; Huybers, P. J.; Tingley, M.

    2015-12-01

    Tide gauge records of mean sea level are some of the most valuable instrumental time series of oceanic variability and change. Yet these time series sometimes have short record lengths and intermittently missing values. Such issues can limit the utility of the data, for example, precluding rigorous analyses of return periods of extreme mean sea level events and whether they are unprecedented. With a view to filling gaps in the tide gauge mean sea level time series, we describe a hierarchical Bayesian modeling approach. The model, which is predicated on the notion of conditional probabilities, comprises three levels: a process level, which casts mean sea level as a field with spatiotemporal covariance; a data level, which represents tide gauge observations as noisy, biased versions of the true process; and a prior level, which gives prior functional forms to model parameters. Using Bayes' rule, this technique gives estimates of the posterior probability of the process and the parameters given the observations. To demonstrate the approach, we apply it to 2,967 station-years of annual mean sea level observations over 1856-2013 from 70 tide gauges along the United States East Coast from Florida to Maine (i.e., 26.8% record completeness). The model overcomes the data paucity by sharing information across space and time. The result is an ensemble of realizations, each member of which is a possible history of sea level changes at these locations over this period, which is consistent with and equally likely given the tide gauge data and underlying model assumptions. Using the ensemble of histories furnished by the Bayesian model, we identify extreme events of mean sea level change in the tide gauge time series. Specifically, we use the model to address the particular hypothesis (with rigorous uncertainty quantification) that a recently reported interannual sea level rise during 2008-2010 was unprecedented in the instrumental record along the northeast coast of North America, and that it had a return period of 850 years. Preliminary analysis suggests that this event was likely unprecedented on the coast of Maine in the last century.

  19. Residual settlements detection of ocean reclaimed lands with multi-platform SAR time series and SBAS technique: a case study of Shanghai Pudong International Airport

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Yang, Tianliang; Zhao, Qing; Pepe, Antonio; Dong, Hongbin; Sun, Zhibin

    2017-09-01

    Shanghai Pudong International Airport is one of the three major international airports in China. The airport is located at the Yangtze estuary, a sensitive belt of sea-land interaction. The majority of the buildings and facilities in the airport are built on ocean-reclaimed land and silt tidal flats. Residual ground settlement may therefore occur after completion of the airport construction. The current status of ground settlement at the airport, and whether it is within a safe range, needs to be investigated. In order to continuously monitor the ground settlement of the airport, two Synthetic Aperture Radar (SAR) time series, acquired by the X-band TerraSAR-X (TSX) and TanDEM-X (TDX) sensors from December 2009 to December 2010 and from April 2013 to July 2015, were analyzed with the SBAS technique. We first obtained a ground deformation measurement from each SAR subset. Both measurements show that obvious ground subsidence occurred at the airport, especially in the second runway, the second terminal, the sixth cargo plane and the eighth apron. The maximum vertical ground deformation rates in both SAR subsets exceeded -30 mm/year, while the cumulative ground deformations reached -30 mm and -35 mm, respectively. After generating the SBAS-retrieved ground deformation for each SAR subset, we performed a joint analysis to combine the time series of each common coherent point by applying a geotechnical model. The results show that three concentrated areas of ground deformation existed in the airport, mainly distributed in the sixth cargo plane, the fifth apron and the fourth apron. The maximum vertical cumulative ground subsidence was more than -70 mm. In addition, by analyzing the combined time series of four selected points, we found that the ground deformation rates of points located at the second runway, the third runway, and the second terminal became progressively smaller over time, indicating that the foundations around these points were gradually stabilizing.

  20. Comparison between weather station data in south-eastern Italy and CRU precipitation datasets

    NASA Astrophysics Data System (ADS)

    Miglietta, D.

    2009-04-01

    Monthly precipitation data in south-eastern Italy from 1920 to 2005 have been extensively analyzed. Data were collected at almost 200 weather stations located 10-20 km apart from each other and almost uniformly distributed over the Puglia and Basilicata regions. Apart from a few years around World War II, the time series are mostly complete and allow a reliable reconstruction of climate variability in the considered region. Statistically significant trends have been studied by applying the Mann-Kendall test to annual, seasonal and monthly values. A comparison has been made between observations and precipitation data from the Climatic Research Unit (CRU), University of East Anglia, on both low (30') and high (10') spatial resolution grids. In particular, rainfall records, time series behaviors and annual cycles at each station have been compared to the corresponding CRU data. CRU time series show a large negative trend for winter since 1970. The trend is not significant if the whole 20th century is considered (both for the whole year and for winter only). This might be considered evidence of a recent acceleration towards increasingly dry conditions. However, the correlation between CRU data and observations is not very high, and large percentage errors are present mainly in the mountainous regions, where observations show a large annual cycle, with intense precipitation in winter, which is not present in the CRU data. Therefore, to identify trends, observed data are needed, even at the monthly scale. In particular, observations confirm the overall trend, but also indicate large spatial variability, with locations where precipitation has even increased since 1970. Daily precipitation data from a subset of weather stations have also been studied for the same time period. The distributions of maximum annual rainfalls, wet spells and dry spells were analyzed for each station, together with their time series. The tools of statistical analysis of extremes have been used in order to evaluate return values and their spatial distribution over the considered region. A procedure for data quality control and homogeneity testing of monthly rainfall records is also being applied, while kriging techniques are being developed in order to fully understand rainfall climatology in south-eastern Italy.

  1. An Energy-Based Similarity Measure for Time Series

    NASA Astrophysics Data System (ADS)

    Boudraa, Abdel-Ouahab; Cexus, Jean-Christophe; Groussat, Mathieu; Brunagel, Pierre

    2007-12-01

    A new similarity measure for time series analysis, called SimilB, based on the cross energy operator introduced in 2004, is presented. This operator is a nonlinear measure that quantifies the interaction between two time series. Compared to the Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series through their first and second derivatives. SimilB is well suited for both nonstationary and stationary time series, particularly those presenting discontinuities. Some new properties of the operator are presented; in particular, we show that, as a similarity measure, it is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and ED measures.

  2. A hybrid algorithm for clustering of time series data based on affinity search technique.

    PubMed

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using syntactic and real-world time series datasets.

  3. A Hybrid Algorithm for Clustering of Time Series Data Based on Affinity Search Technique

    PubMed Central

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A.; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using syntactic and real-world time series datasets. PMID:24982966

  4. Influence of denitrification reactor retention time distribution (RTD) on dissolved oxygen control and nitrogen removal efficiency.

    PubMed

    Raboni, Massimo; Gavasci, Renato; Viotti, Paolo

    2015-01-01

    Low concentrations of dissolved oxygen (DO) are usually found in biological anoxic pre-denitrification reactors, causing a reduction in nitrogen removal efficiency. Therefore, the reduction of DO in such reactors is fundamental for achieving good nutrient removal. The article shows the results of an experimental study carried out to evaluate the effect of the anoxic reactor hydrodynamic model on both residual DO concentration and nitrogen removal efficiency. In particular, two hydrodynamic models were considered: a single completely mixed reactor and a series of four reactors that resembles plug-flow behaviour. The latter proved to be more effective in oxygen consumption, allowing a lower residual DO concentration than the former. The series of reactors also achieves better specific denitrification rates and higher denitrification efficiency. Moreover, the denitrification food-to-microorganism (F:M) ratio (F:M_DEN) demonstrates a relevant synergic action in both controlling residual DO and improving the denitrification performance.
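
    The contrast between a single completely mixed reactor and a cascade of reactors can be summarised by the classical tanks-in-series residence time distribution, E(t) = t^(N-1) exp(-t/τ_i) / ((N-1)! τ_i^N), with τ_i the residence time of each of the N tanks. The sketch below evaluates this distribution for N = 1 and N = 4 with illustrative values; it is a textbook model, not the hydrodynamic model fitted in the study.

```python
import math
import numpy as np

def rtd(t, total_tau, n):
    """E(t) for n equal completely mixed reactors in series with overall residence time total_tau."""
    tau_i = total_tau / n
    return (t ** (n - 1)) / (math.factorial(n - 1) * tau_i ** n) * np.exp(-t / tau_i)

t = np.linspace(0.0, 6.0, 7)       # hours (hypothetical)
total_tau = 2.0                    # overall hydraulic residence time, hours (hypothetical)
print("N=1:", np.round(rtd(t, total_tau, 1), 3))
print("N=4:", np.round(rtd(t, total_tau, 4), 3))   # narrower, more plug-flow-like distribution
```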

  5. Geostationary Operational Environmental Satellite (GOES-N report)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Advanced Missions Analysis Office (AMAO) of GSFC has completed a study of the Geostationary Operational Environmental Satellites (GOES-N) series. The feasibility, risks, schedules, and associated costs of advanced space and ground system concepts responsive to National Oceanic and Atmospheric Administration (NOAA) requirements were evaluated. The study is the first step in a multi-phased procurement effort that is expected to result in launch ready hardware in the post 2000 time frame. This represents the latest activity of GSFC in translating meteorological requirements of NOAA into viable space systems in geosynchronous earth orbits (GEO). GOES-N represents application of the latest spacecraft, sensor, and instrument technologies to enhance NOAA meteorological capabilities via remote and in-situ sensing from GEO. The GOES-N series, if successfully developed, could become another significant step in NOAA weather forecasting space systems, meeting increasingly complex emerging national needs for that agency's services.

  6. Observations of a pressurized hydraulic hose under lateral liquid impacts

    NASA Astrophysics Data System (ADS)

    Stewart, C. D.; Gorman, D. G.

    The effects of 'pin-hole' failure of one pressurized hydraulic hose on its neighbour are investigated. A pressurized test hose was inserted into a custom testing apparatus and subjected to a series of ten short duration liquid impacts simulating the pin-hole failure of an initial hose. Subsequent displacements of the hose were filmed and plotted with respect to time. Three distinct pattern groups emerged which were used to explain the resultant damage to the hose. It was observed that the middle pattern, corresponding to impacts 6 and 7, appears to be the point where the very damaging hydraulic penetration mechanism became dominant and the outer layer of the hose failed. On completion of the ten impact series it was observed that a small hole on the outer surface of the hose gave way to a relatively large damaged area in the strength bearing inner braid material.

  7. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
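
    A stripped-down version of the procedure can be sketched with pairwise correlations standing in for the symbolic-dynamics similarity used in the paper: build a weighted graph from sufficiently similar pairs, then partition it with a modularity-based community detection algorithm. The threshold, the simulated series, and the two planted groups below are all hypothetical.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(7)
T, n_per_group = 300, 5
common_a, common_b = rng.normal(size=T), rng.normal(size=T)
series = [common_a + 0.5 * rng.normal(size=T) for _ in range(n_per_group)] + \
         [common_b + 0.5 * rng.normal(size=T) for _ in range(n_per_group)]

# Weighted similarity graph: keep only reasonably correlated pairs.
corr = np.corrcoef(series)
G = nx.Graph()
for i in range(len(series)):
    for j in range(i + 1, len(series)):
        if corr[i, j] > 0.3:
            G.add_edge(i, j, weight=corr[i, j])

parts = community.greedy_modularity_communities(G, weight="weight")
print([sorted(p) for p in parts])    # expected: one community per planted group
```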

  8. Carbon Fiber Reinforced/Silicon Carbide Turbine Blisk Testing in the SIMPLEX Turbopump

    NASA Technical Reports Server (NTRS)

    Genge, Gary G.; Marsh, Matthew W.

    1999-01-01

    A program designed to implement a ceramic matrix composite integrally bladed disk (blisk) into rocket engine style turbomachinery has successfully completed testing. The Marshall Space Flight Center (MSFC) program, utilizing the MSFC turbomachinery design, analysis, and testing capabilities along with materials development capabilities from both Glenn Research Center (GRC) and MSFC, has tested two carbon fiber reinforced silicon carbide blisks in the Simplex Turbopump at MSFC's Test Stand 500. One blisk contained a polar woven fiber preform, while the second blisk tested utilized a quasi-isotropic preform. While earlier papers have chronicled the program's design, material testing, and torque testing efforts, this paper focuses on the testing of the blisks in the Simplex turbopump. Emphasis will be placed on the actual condition of the blisks before and after the testing, the test program design methodology, and conclusions that can be drawn from the test data and blisk final conditions. The program performed three separate test series. The first series was needed to validate that the Simplex turbopump was correctly re-built following a major incident to the turbopump. The turbopump had two major differences from the original design. The most obvious difference was the sleeve required throughout the bore of the main housing. The second major difference was modifications to the pump diffuser to improve performance. Several areas were burnt during the incident and were either repaired by weld repair (pump inlet housing) or simply smoothed out (turbine nozzle discharge). The test series was designed to weed out any turbopump design and manufacturing flaws or fatigue issues prior to putting the C/SiC blisks into it. The second and third series were the C/SiC blisk test series. The primary goal of these series was to expose the blisks to as much fatigue-causing dynamic stress as possible to examine the material's capability. Initially, the test plan was to put equal time on the two blisks; however, as the test series progressed, the funding allowed additional testing to occur. The additional test time was placed on the polar weave blisk. The total test time accrued on the polar blisk was 2550 seconds, with 860 seconds near the turbopump design speed of 25,000 rpm. This testing included 6 tests / 775 seconds pumping liquid nitrogen and 7 tests / 1775 seconds pumping liquid oxygen. The drive gas for all of the tests was gaseous nitrogen due to the lack of a hot gas source for the Simplex turbopump. The quasi-isotropic blisk was tested for XX total tests and XXXX seconds, with X tests/XXXX seconds pumping liquid nitrogen and X tests/XXXX seconds pumping liquid oxygen. During the test series, the blisks were inspected following each test. Inspections were initially made from the downstream side of the blisks only. Midway through the testing, a method of borescoping the leading edges of the blades was devised, and subsequently, both sides of the blades were inspected following each test. The leading and trailing edges of the polar blisk held up better than those of the quasi-isotropic blisk. This was a known possibility due to the varying fiber direction in the blades as the rectangular preform weave is cut in a circular pattern. The surprising fact about the testing was that there was no measurable performance loss due to the inaccuracies in the blade manufacturing of the C/SiC blisks, the surface roughness of the C/SiC blades, or the loss of material in the polar blisk. A performance shift was seen in the quasi-isotropic blisk as portions of the leading and trailing edges were lost. After the testing was completed, detailed inspections of the blisks were performed. The largest surprise was that the polar blisk had an obvious crack in a single blade, located nearly at midspan, which was not detected during testing. The crack ran completely through the blade circumferentially and through the radial length of the blade. However, the crack does not appear to extend into the blisk hub. Although the cause of the crack is still under investigation, the material appears to be tolerant of this crack and of other hairline cracks discovered under higher magnification. This bodes well for eventual use of this material in actual flight turbopumps, where monolithic fracture toughness issues limit its use.

  9. Dynamics of transit times and StorAge Selection functions in four forested catchments from stable isotope data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Nicolas B.; McGuire, Kevin J.; Klaus, Julian

    2017-04-01

    Transit time distributions, residence time distributions and StorAge Selection functions are fundamental integrated descriptors of water storage, mixing, and release in catchments. In this contribution, we determined these time-variant functions in four neighboring forested catchments in the H.J. Andrews Experimental Forest, Oregon, USA, by employing a two-year time series of ¹⁸O in precipitation and discharge. Previous studies in these catchments assumed stationary, exponentially distributed transit times and complete mixing/random sampling to explore the influence of various catchment properties on the mean transit time. Here we relaxed such assumptions to relate transit time dynamics and the variability of StorAge Selection functions to catchment characteristics, catchment storage, and the seasonality of meteorological forcing. Conceptual models of the catchments, consisting of two reservoirs combined in series-parallel, were calibrated to discharge and stable isotope tracer data. We assumed randomly sampled/fully mixed conditions for each reservoir, which resulted in an incompletely mixed system overall. Based on the results, we solved the Master Equation, which describes the dynamics of water ages in storage and in catchment outflows. Consistently across all catchments, we found that transit times were generally shorter during wet periods, indicating the contribution of shallow storage (soil, saprolite) to discharge. During extended dry periods, transit times increased significantly, indicating the contribution of deeper storage (bedrock) to discharge. Our work indicated that the strong seasonality of precipitation impacted transit times by leading to a dynamic selection of stored water ages, whereas catchment size was not a control on transit times. In general, this work showed the usefulness of time-variant transit times with conceptual models and confirmed the existence of the catchment age-mixing behaviors emerging from other similar studies.

  10. Relationship of occupational therapy inpatient rehabilitation interventions and patient characteristics to outcomes following spinal cord injury: The SCIRehab Project

    PubMed Central

    Ozelie, Rebecca; Gassaway, Julie; Buchman, Emily; Thimmaiah, Deepa; Heisler, Lauren; Cantoni, Kara; Foy, Teresa; Hsieh, Ching-Hui (Jean); Smout, Randall J.; Kreider, Scott E. D.; Whiteneck, Gale

    2012-01-01

    Background/objective Describe associations of occupational therapy (OT) interventions delivered during inpatient spinal cord injury (SCI) rehabilitation and patient characteristics with outcomes at the time of discharge and 1-year post-injury. Methods Occupational therapists at six inpatient rehabilitation centers documented detailed information about treatment provided. Least squares regression modeling was used to predict outcomes at discharge and 1-year injury anniversary for a 75% subset; models were validated with the remaining 25%. Functional outcomes for injury subgroups (motor complete low tetraplegia and motor complete paraplegia) also were examined. Results OT treatment variables explain a small amount of variation in Functional Independence Measure (FIM) outcomes for the full sample and significantly more in two functionally homogeneous subgroups. For patients with motor complete paraplegia, more time spent in clothing management and hygiene related to toileting was a strong predictor of higher scores on the lower body items of the self-care component of the discharge motor FIM. Among patients with motor complete low tetraplegia, higher scores for the FIM lower body self-care items were associated with more time spent on lower body dressing, manual wheelchair mobility training, and bathing training. Active patient participation during OT treatment sessions also was predictive of FIM and other outcomes. Conclusion OT treatments add to explained variance (in addition to patient characteristics) for multiple outcomes. The impact of OT treatment on functional outcomes is more evident when examining more homogeneous patient groupings and outcomes specific to the groupings. Note This is the third of nine articles in the SCIRehab series. PMID:23318035

  11. Retention of fundamental surgical skills learned in robot-assisted surgery.

    PubMed

    Suh, Irene H; Mukherjee, Mukul; Shah, Bhavin C; Oleynikov, Dmitry; Siu, Ka-Chun

    2012-12-01

    Evaluation of the learning curve for robotic surgery has shown reduced errors and decreased task completion and training times compared with regular laparoscopic surgery. However, most training evaluations of robotic surgery have only addressed short-term retention after the completion of training. Our goal was to investigate the amount of surgical skills retained after 3 months of training with the da Vinci™ Surgical System. Seven medical students without any surgical experience were recruited. Participants were trained with a 4-day training program of robotic surgical skills and underwent a series of retention tests at 1 day, 1 week, 1 month, and 3 months post-training. Data analysis included time to task completion, speed, distance traveled, and movement curvature by the instrument tip. Performance of the participants was graded using the modified Objective Structured Assessment of Technical Skills (OSATS) for robotic surgery. Participants filled out a survey after each training session by answering a set of questions. Time to task completion and the movement curvature was decreased from pre- to post-training and the performance was retained at all the corresponding retention periods: 1 day, 1 week, 1 month, and 3 months. The modified OSATS showed improvement from pre-test to post-test and this improvement was maintained during all the retention periods. Participants increased in self-confidence and mastery in performing robotic surgical tasks after training. Our novel comprehensive training program improved robot-assisted surgical performance and learning. All trainees retained their fundamental surgical skills for 3 months after receiving the training program.

  12. Discovery of Jurassic ammonite-bearing series in Jebel Bou Hedma (South-Central Tunisian Atlas): Implications for stratigraphic correlations and paleogeographic reconstruction

    NASA Astrophysics Data System (ADS)

    Bahrouni, Néjib; Houla, Yassine; Soussi, Mohamed; Boughdiri, Mabrouk; Ali, Walid Ben; Nasri, Ahmed; Bouaziz, Samir

    2016-01-01

Recent geological mapping in the South-Central Atlas of Tunisia led to the discovery of Jurassic ammonite-bearing series in the Jebel Bou Hedma E-W anticline. These series represent the southernmost Jurassic rocks documented so far in outcrops of the Tunisian Atlas. Outcropping in a transitional zone between the Southern Tunisian Atlas and the Chott basin, they offer a valuable benchmark for new stratigraphic correlations with the well-known Jurassic series of the North-South Axis of Central Tunisia and with the Jurassic subsurface successions penetrated by petroleum wells in the study area. Preliminary investigations of the most complete section, outcropping in the center of the structure, identified numerous useful biochronological and sedimentological markers that help establish an updated Jurassic stratigraphic framework for South-Western Tunisia. Additionally, the Late Jurassic succession records syn-sedimentary features such as slumping, erosion, and reworking of sediments and ammonite faunas, which can be regarded as strong evidence of an important geodynamic event around the Jurassic-Cretaceous boundary. These new stratigraphic and geodynamic data make the Jurassic of Jebel Bou Hedma a key succession for correlating the Tunisian Atlas series with those currently buried in the Chott basin or outcropping on the Saharan platform. Furthermore, the several ammonite-rich horizons identified within the Middle and Upper Jurassic series constitute reliable time lines useful for paleogeographic and geodynamic reconstructions of this part of the North African Tethyan margin, as well as for refining the potential migration routes of ammonite populations from the Maghrebian Southern Tethys to Arabia.

  13. Strategies for Success: Promising Ideas in Adult College Completion. Policy Exchanges

    ERIC Educational Resources Information Center

    Lane, Patrick

    2012-01-01

    This publication is the first of a series focusing on promising new ideas and innovative practices developed through the Adult College Completion Network. The brief addresses five topics of importance to those working to improve adult college completion: (1) Data availability particular to the returning adult population; (2) Partnerships between…

  14. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

This paper proposes a method for time series prediction based on perturbation theory. The approach continuously adjusts an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is fitted to the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models, in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods evaluated under the same conditions and with ten results reported in the literature. The results show that the proposed method not only significantly improves the performance of the initial model but also outperforms the previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
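    The residual-iteration loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: plain least-squares AR models stand in for their predictive models, and a lag-1 autocorrelation threshold stands in for a proper white-noise test (both are assumptions).

    ```python
    import numpy as np

    def fit_ar(series, order=3):
        """Ordinary least-squares fit of an AR(order) model; returns [intercept, a1..ap]."""
        lags = np.column_stack([series[order - k: len(series) - k] for k in range(1, order + 1)])
        X = np.column_stack([np.ones(len(lags)), lags])
        coef, *_ = np.linalg.lstsq(X, series[order:], rcond=None)
        return coef

    def predict_ar(series, coef, order=3):
        """In-sample one-step-ahead predictions; the first `order` points are filled
        with a local mean so every stage stays aligned with the original series."""
        lags = np.column_stack([series[order - k: len(series) - k] for k in range(1, order + 1)])
        X = np.column_stack([np.ones(len(lags)), lags])
        pred = np.full(len(series), series[:order].mean())
        pred[order:] = X @ coef
        return pred

    def perturbative_forecast(series, order=3, max_stages=5, acf_tol=0.1):
        """Sum the predictions of models fitted to successive residual series, stopping
        once the residual's lag-1 autocorrelation looks like white noise."""
        combined = np.zeros(len(series))
        residual = np.asarray(series, dtype=float).copy()
        for _ in range(max_stages):
            coef = fit_ar(residual, order)
            pred = predict_ar(residual, coef, order)
            combined += pred
            residual = residual - pred
            r1 = np.corrcoef(residual[:-1], residual[1:])[0, 1]
            if abs(r1) < acf_tol:  # crude white-noise proxy, not the paper's criterion
                break
        return combined, residual

    # toy usage: a noisy sine wave
    t = np.linspace(0, 20, 400)
    fitted, resid = perturbative_forecast(np.sin(t) + 0.2 * np.random.randn(len(t)))
    ```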

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H; Guerrero, M; Prado, K

Purpose: Building up a TG-71 based electron monitor-unit (MU) calculation protocol usually involves extensive measurements. This work investigates a minimum data set of measurements and its calculation accuracy and measurement time. Methods: For the 6, 9, 12, 16, and 20 MeV beams of our Varian Clinac-series linear accelerators, complete measurements were performed at different depths using 5 square applicators (6, 10, 15, 20, and 25 cm) with different cutouts (2, 3, 4, 6, 10, 15, and 20 cm, up to the applicator size) at 5 different SSDs. For each energy there were 8 PDD scans and 150 point measurements for applicator factors, cutout factors, and effective SSDs, which were then converted to air-gap factors for SSDs of 99-110 cm. The dependence of each dosimetric quantity on field size and SSD was examined to determine the minimum data set of measurements as a subset of the complete measurements. The "missing" data excluded from the minimum data set were approximated by linear or polynomial fitting functions based on the included data. The total measurement time and the calculated electron MUs using the minimum and the complete data sets were compared. Results: The minimum data set includes 4 or 5 PDDs and 51 to 66 point measurements per electron energy; more PDDs and fewer point measurements are generally needed as energy increases. Using less than 50% of the complete measurement time, the minimum data set yields acceptable MU calculation results compared with the complete data set: the PDD difference is within 1 mm and the calculated MU difference is less than 1.5%. Conclusion: The data set measured for TG-71 electron MU calculations can be minimized based on knowledge of how each dosimetric quantity depends on the various setup parameters. The suggested minimum data set allows acceptable MU calculation accuracy and shortens measurement time by a few hours.
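    The "missing" points excluded from the minimum data set are approximated by low-order fits to the measured points. A minimal sketch of that idea with numpy.polyfit is shown below; the cutout sizes and factor values are hypothetical illustrative numbers, not clinical data.

    ```python
    import numpy as np

    # Hypothetical cutout-factor measurements for one energy/applicator combination
    # (illustrative values only, not clinical data).  Sizes in cm.
    measured_size   = np.array([3.0, 4.0, 6.0, 10.0])
    measured_factor = np.array([0.962, 0.978, 0.991, 1.000])

    # Fit a low-order polynomial to the points kept in the minimum data set ...
    coeffs = np.polyfit(measured_size, measured_factor, deg=2)

    # ... and estimate the "missing" cutout sizes that were not measured
    # (interpolation only; extrapolating outside the measured range is not advisable).
    for size in (5.0, 8.0):
        print(f"cutout {size:4.1f} cm -> estimated factor {np.polyval(coeffs, size):.3f}")
    ```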

  16. 7. Cable Creek Bridge after completion. Zion National Park negative ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Cable Creek Bridge after completion. Zion National Park negative number 1485, classification series 002, 12. - Floor of the Valley Road, Cable Creek Bridge, Spanning Cable Creek on Floor of Valley, Springdale, Washington County, UT

  17. Multifractal analysis of the Korean agricultural market

    NASA Astrophysics Data System (ADS)

    Kim, Hongseok; Oh, Gabjin; Kim, Seunghwan

    2011-11-01

    We have studied the long-term memory effects of the Korean agricultural market using the detrended fluctuation analysis (DFA) method. In general, the return time series of various financial data, including stock indices, foreign exchange rates, and commodity prices, are uncorrelated in time, while the volatility time series are strongly correlated. However, we found that the return time series of Korean agricultural commodity prices are anti-correlated in time, while the volatility time series are correlated. The n-point correlations of time series were also examined, and it was found that a multifractal structure exists in Korean agricultural market prices.
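    A minimal sketch of the DFA scaling exponent used above, assuming the standard algorithm (integrate, detrend in windows, fit the log-log fluctuation curve); it is not the authors' code. An exponent near 0.5 indicates an uncorrelated series, below 0.5 an anti-correlated one, and above 0.5 a positively correlated one.

    ```python
    import numpy as np

    def dfa_exponent(x, scales=None, order=1):
        """Detrended fluctuation analysis scaling exponent alpha of a 1-D series."""
        x = np.asarray(x, dtype=float)
        profile = np.cumsum(x - x.mean())                      # integrated (profile) series
        if scales is None:
            scales = np.unique(np.logspace(np.log10(4), np.log10(len(x) // 4), 15).astype(int))
        flucts = []
        for s in scales:
            n_seg = len(profile) // s
            sq = []
            for i in range(n_seg):
                seg = profile[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial detrending
                sq.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(sq)))
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]   # slope in log-log space

    # white noise should give alpha close to 0.5; anti-correlated series give alpha < 0.5
    print(dfa_exponent(np.random.randn(5000)))
    ```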

  18. Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors

    NASA Astrophysics Data System (ADS)

    Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.

    2012-12-01

Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem because of its near theoretical intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of one such adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models, and nonparametric smoothing for trend estimation. We introduce pairwise-moving block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on the computationally intensive calibration of bootstrap confidence intervals and consider options for parallelizing the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
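    The pairwise-moving block bootstrap mentioned above can be illustrated with a short sketch: blocks of (x, y) pairs are resampled together so that autocorrelation inside a block is preserved, and the spread of the resampled correlations gives a confidence interval. This is a generic sketch under simple assumptions (fixed block length, no timescale-error simulation), not the authors' software.

    ```python
    import numpy as np

    def block_bootstrap_corr(x, y, block_len=20, n_boot=2000, alpha=0.05, seed=None):
        """Pairwise moving-block bootstrap for the Pearson correlation of two series."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x, float), np.asarray(y, float)
        n = len(x)
        n_blocks = int(np.ceil(n / block_len))
        estimates = np.empty(n_boot)
        for b in range(n_boot):
            starts = rng.integers(0, n - block_len + 1, size=n_blocks)
            idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
            estimates[b] = np.corrcoef(x[idx], y[idx])[0, 1]
        lo, hi = np.percentile(estimates, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return np.corrcoef(x, y)[0, 1], (lo, hi)

    # toy usage: two autocorrelated, weakly coupled series
    rng = np.random.default_rng(0)
    x = np.convolve(rng.standard_normal(500), np.ones(10) / 10, mode="same")
    y = 0.5 * x + 0.1 * rng.standard_normal(500)
    print(block_bootstrap_corr(x, y, seed=1))
    ```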

  19. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

Network-based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
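    The natural visibility criterion that underlies the mapping from series segments to graphs can be sketched directly; the window length used below to form the "network of networks" is an arbitrary assumption, and the sketch is not the authors' pipeline.

    ```python
    import numpy as np

    def visibility_graph(y):
        """Natural visibility graph: nodes a < b are linked if every intermediate point
        lies strictly below the straight line joining (a, y[a]) and (b, y[b])."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        adj = np.zeros((n, n), dtype=bool)
        for a in range(n - 1):
            for b in range(a + 1, n):
                c = np.arange(a + 1, b)
                line = y[b] + (y[a] - y[b]) * (b - c) / (b - a)   # visibility line at points c
                if np.all(y[c] < line):
                    adj[a, b] = adj[b, a] = True
        return adj

    # map successive windows of a series to visibility graphs (window length is an assumption)
    series = np.random.randn(300)
    graphs = [visibility_graph(series[i:i + 50]) for i in range(0, len(series), 50)]
    print([int(g.sum()) // 2 for g in graphs])   # edge count of each window's graph
    ```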

  20. The MOPITT instrument as a Prototype for Long-Term Space-Based Atmospheric Measurements in the Anthropocene

    NASA Astrophysics Data System (ADS)

    Drummond, James

    2016-07-01

One of the major characteristics of the Anthropocene will be changes in all the Earth systems on many timescales. Changes that occur within a generation will be very significant for policy decisions, and these will require measurements on corresponding timescales from space-based instruments, but these times are long compared to traditional satellite lifetimes. Whether by luck or by good design, there are now a number of satellite missions that are recording data over long time periods. With a single instrument, decadal and longer time series of relevant atmospheric parameters have been achieved, and the Measurements Of Pollution In The Troposphere (MOPITT) instrument is one such instrument. Launched on 18th December 1999 on the Terra spacecraft, MOPITT has now completed more than 16 years of operation measuring carbon monoxide (CO) over the planet, and the mission continues. It is entirely possible that these measurements will span two decades before completion. MOPITT therefore offers a case study of a very long single-instrument time series, albeit one with challenges because this longevity was not part of the original design criteria: the original design specified about a five-year life, and this has already been considerably exceeded. MOPITT does enable us to look at long-term trends and intermittent phenomena over the planet for an extended period of time encompassing an entire solar cycle and many cycles of El Niño and other quasi-periodic phenomena. This presentation will consider, with examples, some of the advantages and some of the problems of these long-term space measurements with an eye to the future and the needs of future generations. MOPITT was provided to NASA's Terra spacecraft by the Canadian Space Agency and was built by COMDEV of Cambridge, Ontario. Data processing is performed by the MOPITT team at the National Center for Atmospheric Research, Boulder, CO. Instrument control is by the team at the University of Toronto.

  1. CLEAR Instructional Reports Series, Numbers 1-7.

    ERIC Educational Resources Information Center

    Center for Applied Linguistics, Washington, DC. Center for Language Education and Research.

    The complete Instructional Reports Series of the Center for Language Education and Research (CLEAR) is presented. Included are (1) "Using Video in the Foreign Language Classroom" (Ingrid Berdahl); (2) "Strategies for Integrating Language and Content Instruction: Art, Music, and Physical Education" (Carolyn Andrade, Carol Ann…

  2. Feasibility of a school reintegration programme for children with acute lymphoblastic leukaemia.

    PubMed

    Annett, R D; Erickson, S J

    2009-07-01

Although children with acute lymphoblastic leukaemia miss a significant amount of school, little empirical literature guides the optimal content, setting and timing of a school reintegration programme. We examined the feasibility of a 4-month school reintegration intervention by: (1) developing collaboration with a community-based advocacy organisation; (2) developing intervention modules and observable end points; and (3) determining how the study achieved recruitment expectations. Eight families, comprising children aged 6-12 years diagnosed with acute lymphoblastic leukaemia and their parents, were enrolled in the study. An experienced advocate implemented a series of eight modules over a 4-month period (twice per month) with the families. Participants completed pre- and post-intervention measures. Successful collaboration with the advocacy organisation and the development of an intervention module series were achieved. Recruitment aims proved more difficult: enrolment was extended when recruitment for the original 1- to 6-month post-diagnosis window proved difficult. The advocate was able to complete between three and seven of the modules (mean = 5.2, standard deviation = 1.5). Families preferred clinic-based intervention. Challenges faced and lessons learned include: (1) advocacy organisations may be useful resources for school reintegration interventions; (2) school reintegration interventions must be flexibly applied; and (3) measurement end points should be constructed to gauge programme effectiveness.

  3. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  4. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  5. Scale-dependent intrinsic entropies of complex time series.

    PubMed

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).
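    For reference, the classic multi-scale entropy baseline that the paper builds on (coarse-graining followed by sample entropy) can be sketched as below. The EMD-based detrending step that is the paper's actual contribution is not reproduced here, and the tolerance is recomputed per scale, which is a simplification.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.15):
        """Sample entropy: -log of the ratio of (m+1)-point to m-point template matches,
        with tolerance r times the standard deviation of the series."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            total = 0
            for i in range(len(templates) - 1):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                total += np.sum(dist <= tol)
            return total

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def multiscale_entropy(x, scales=range(1, 11), m=2, r=0.15):
        """Classic MSE: non-overlapping coarse-graining, then sample entropy per scale."""
        x = np.asarray(x, dtype=float)
        values = []
        for s in scales:
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)
            values.append(sample_entropy(coarse, m, r))
        return values

    # white noise: entropy decreases with scale; 1/f-like signals stay roughly flat
    print(multiscale_entropy(np.random.randn(3000)))
    ```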

  6. Hilbert-Huang spectral analysis for characterizing the intrinsic time-scales of variability in decennial time-series of surface solar radiation

    NASA Astrophysics Data System (ADS)

    Bengulescu, Marc; Blanc, Philippe; Wald, Lucien

    2016-04-01

An analysis of the variability of the surface solar irradiance (SSI) at different local time-scales is presented in this study. Since geophysical signals, such as long-term measurements of the SSI, are often produced by the non-linear interaction of deterministic physical processes that may also be under the influence of non-stationary external forcings, the Hilbert-Huang transform (HHT), an adaptive, noise-assisted, data-driven technique, is employed to extract locally - in time and in space - the embedded intrinsic scales at which a signal oscillates. The transform consists of two distinct steps. First, by means of the Empirical Mode Decomposition (EMD), the time-series is "de-constructed" into a finite number - often small - of zero-mean components that have distinct temporal scales of variability, termed hereinafter the Intrinsic Mode Functions (IMFs). The signal model of the components is an amplitude modulation - frequency modulation (AM - FM) one, and can also be thought of as an extension of a Fourier series having both time-varying amplitude and frequency. Following the decomposition, Hilbert spectral analysis is then employed on the IMFs, yielding a time-frequency-energy representation that portrays changes in the spectral contents of the original data with respect to time. As measurements of surface solar irradiance may possibly be contaminated by the manifestation of different types of stochastic processes (i.e. noise), the identification of real, physical processes from this background of random fluctuations is of interest. To this end, an adaptive background noise null hypothesis is assumed, based on the robust statistical properties of the EMD when applied to time-series of different classes of noise (e.g. white, red or fractional Gaussian). Since the algorithm acts as an efficient constant-Q dyadic, "wavelet-like", filter bank, the different noise inputs are decomposed into components having the same spectral shape, but that are translated to the next lower octave in the spectral domain. Thus, when the sampling step is increased, the spectral shape of IMFs cannot remain at its original position, due to the new lower Nyquist frequency, and is instead pushed toward the lower scaled frequency. Based on these features, the identification of potential signals within the data should become possible without any prior knowledge of the background noises. When applying the procedure outlined above to decennial time-series of surface solar irradiance, only the component that has an annual time-scale of variability is shown to have statistical properties that diverge from those of noise. Nevertheless, the noise-like components are not completely devoid of information, as it is found that their AM components have a non-null rank correlation coefficient with the annual mode, i.e. the background noise intensity seems to be modulated by the seasonal cycle. The findings have possible implications for the modelling and forecasting of the surface solar irradiance, by discriminating its deterministic from its quasi-stochastic constituents, at distinct local time-scales.

  7. Complete Fabrication of a Traversable 3 µm Thick NbN Film Superconducting Coil with Cu plated layer of 42m in Length in a Spiral Three-Storied Trench Engraved in a Si Wafer of 76.2 mm in Diameter Formed by MEMS Technology for a Compact SMES with High Energy Storage Volume Density

    NASA Astrophysics Data System (ADS)

    Suzuki, Yasuhiro; Iguchi, Nobuhiro; Adachi, Kazuhiro; Ichiki, Akihisa; Hioki, Tatsumi; Hsu, Che-Wei; Sato, Ryoto; Kumagai, Shinya; Sasaki, Minoru; Noh, Joo-Hyong; Sakurahara, Yuuske; Okabe, Kyohei; Takai, Osamu; Honma, Hideo; Watanabe, Hideo; Sakoda, Hitoshi; Sasagawa, Hiroaki; Doy, Hideyuki; Zhou, Shuliang; Hori, H.; Nishikawa, Shigeaki; Nozaki, Toshihiro; Sugimoto, Noriaki; Motohiro, Tomoyoshi

    2017-09-01

Based on the previously proposed concept of a compact SMES unit composed of a stack of Si wafers fabricated by a MEMS process, the complete fabrication of a traversable 3 µm thick NbN film superconducting coil, lined with a Cu plated layer, 42 m in length, in a spiral three-storied trench engraved in and extended over a whole Si wafer of 76.2 mm in diameter, was attained for the first time. With decreasing temperature, the DC resistivity showed a metallic decrease, indicating that the current path was in the Cu plated layer, and then fell suddenly to a residual contact resistance, indicating a shift of the current path from the Cu plated layer to the NbN film at the superconducting critical temperature Tc of 15.5 K. The temperature dependence of the I-V curve showed an increase in the critical current with decreasing temperature; the highest critical current measured was 220 mA at 4 K, five times as large as that obtained in the test fabrication presented as the experimental proof of concept in the previous report. This completion of a one-wafer superconducting NbN coil is an indispensable step toward the next proof of concept, the fabrication of two series-connected wafer coils via a superconductive joint, which will finally lead to 600 series-connected wafer coils, and toward the replacement of NbN by a high-Tc superconductor such as YBa2Cu3O7-x for operation using the cold energy of liquid hydrogen or liquid nitrogen.

  8. Patient Compliance With Electronic Patient Reported Outcomes Following Shoulder Arthroscopy.

    PubMed

    Makhni, Eric C; Higgins, John D; Hamamoto, Jason T; Cole, Brian J; Romeo, Anthony A; Verma, Nikhil N

    2017-11-01

To determine patient compliance in completing electronically administered patient-reported outcome (PRO) scores following shoulder arthroscopy, and to determine if dedicated research assistants improve patient compliance. Patients undergoing arthroscopic shoulder surgery from January 1, 2014, to December 31, 2014, were prospectively enrolled into an electronic data collection system with retrospective review of compliance data. A total of 143 patients were included in this study; 406 patients were excluded (for one or more of the following reasons: incomplete follow-up, inability to access the order sets, or inability to complete the order sets). All patients were assigned an order set of PROs through an electronic reporting system, with order sets to be completed prior to surgery, as well as 6 and 12 months postoperatively. Compliance rates of form completion were documented. Patients who underwent arthroscopic anterior and/or posterior stabilization were excluded. The average age of the patients was 53.1 years, ranging from 20 to 83. Compliance with form completion was highest preoperatively (76%) and dropped subsequently at 6 months postoperatively (57%) and 12 months postoperatively (45%). Use of research assistants improved compliance by approximately 20% at each time point. No differences were found according to patient gender and age group. Of those completing forms, a majority completed forms at home or elsewhere prior to returning to the office for the clinic visit. Electronic administration of PROs may decrease the amount of time required in the office setting for PRO completion by patients. This may be mutually beneficial to providers and patients. It is unclear whether an electronic system improves patient compliance with voluntary PRO completion. Compliance rates at final follow-up remain a concern if data are to be used for establishing quality or outcome metrics. Level IV, case series. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  9. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.

    2005-01-01

We analyzed databases containing gait time series of healthy adults and of persons with Parkinson disease, Huntington disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish gait time series of healthy persons from those of ill persons. The global Hurst exponents of these series do not show clear tendencies; we suggest that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra and their widths and found that the spectra of healthy young persons are almost monofractal, whereas the spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path wearing sensors on both feet, giving one time series for the left foot and another for the right foot. We first analyzed these time series separately and then compared the results, both directly and with a cross-correlation analysis, looking for differences between the two time series that could be used as indicators of equilibrium problems.

  10. The gravitational potential due to uniform disks and rings

    NASA Astrophysics Data System (ADS)

    Lass, H.; Blitzer, L.

    1983-07-01

    The gravitational potential of bodies possessing axial symmetry can be expressed as a power series in distance, with the Legendre polynomials as coefficients. Such series, however, converge so slowly in the neighborhood of thin, uniform disks and rings that too many series terms must be summed in order to obtain an accurate field measure. A gravitational potential expression is presently obtained in closed form, in terms of complete elliptic integrals.

  11. The examination of headache activity using time-series research designs.

    PubMed

    Houle, Timothy T; Remble, Thomas A; Houle, Thomas A

    2005-05-01

    The majority of research conducted on headache has utilized cross-sectional designs which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. The blending of a time-series approach with an interactive process model allows consideration of the relationships of intra-individual dynamic processes, while not precluding the researcher to examine inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without its problems and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examine the time-series interactive process model.

  12. Long-range correlations in time series generated by time-fractional diffusion: A numerical study

    NASA Astrophysics Data System (ADS)

    Barbieri, Davide; Vivoli, Alessandro

    2005-09-01

    Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for such kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.

  13. Improving estimates of ecosystem metabolism by reducing effects of tidal advection on dissolved oxygen time series

    EPA Science Inventory

    In aquatic systems, time series of dissolved oxygen (DO) have been used to compute estimates of ecosystem metabolism. Central to this open-water method is the assumption that the DO time series is a Lagrangian specification of the flow field. However, most DO time series are coll...

  14. 76 FR 28897 - Magnuson-Stevens Act Provisions; Fisheries Off West Coast States; Pacific Coast Groundfish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-19

    ... time series became closer (while depletion at the end of the time series became more divergent); (4) the agreement in the recruitment time series was much improved; (5) recruitment deviations in log space showed much closer agreement; and (6) the fishing intensity time series showed much closer...

  15. 75 FR 4570 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ... applications. Signal-to-Noise Enhancement in Imaging Applications Using a Time-Series of Images Description of... applications that use a time-series of images. In one embodiment of the invention, a time-series of images is... Imaging Applications Using a Time-Series of Images'' (HHS Reference No. E-292- 2009/0-US-01). Related...

  16. Systematic review of FDG-PET prediction of complete pathological response and survival in rectal cancer.

    PubMed

    Memon, Sameer; Lynch, A Craig; Akhurst, Timothy; Ngan, Samuel Y; Warrier, Satish K; Michael, Michael; Heriot, Alexander G

    2014-10-01

    Advances in the management of rectal cancer have resulted in an increased application of multimodal therapy with the aim of tailoring therapy to individual patients. Complete pathological response (pCR) is associated with improved survival and may be potentially managed without radical surgical resection. Over the last decade, there has been increasing interest in the ability of functional imaging to predict complete response to treatment. The aim of this review was to assess the role of (18)F-flurordeoxyglucose positron emission tomography (FDG-PET) in prediction of pCR and prognosis in resectable locally advanced rectal cancer. A search of the MEDLINE and Embase databases was conducted, and a systematic review of the literature investigating positron emission tomography (PET) in the prediction of pCR and survival in rectal cancer was performed. Seventeen series assessing PET prediction of pCR were included in the review. Seven series assessed postchemoradiation SUVmax, which was significantly different between response groups in all six studies that assessed this. Nine series assessed the response index (RI) for SUVmax, which was significantly different between response groups in seven series. Thirteen studies investigated PET response for prediction of survival. Metabolic complete response assessed by SUV2max or visual response and RISUVmax showed strong associations with disease-free survival (DFS) and overall survival (OS). SUV2max and RISUVmax appear to be useful FDG-PET markers for prediction of pCR and these parameters also show strong associations with DFS and OS. FDG-PET may have a role in outcome prediction in patients with advanced rectal cancer.

  17. Terminator field-aligned current system: A new finding from model-assimilated data set (MADS)

    NASA Astrophysics Data System (ADS)

    Zhu, L.; Schunk, R. W.; Scherliess, L.; Sojka, J. J.; Gardner, L. C.; Eccles, J. V.; Rice, D.

    2013-12-01

Physics-based data assimilation models have been recognized by the space science community as the most accurate approach to specify and forecast the space weather of the solar-terrestrial environment. The model-assimilated data sets (MADS) produced by these models constitute an internally consistent time series of global three-dimensional fields whose accuracy can be estimated. Because of its internally consistent physics and its complete description of the state of global systems, the MADS has also been a powerful tool to identify systematic errors in measurements, reveal missing physics in physical models, and discover important dynamical physical processes that are inadequately observed or missed by measurements due to observational limitations. In recent years, we developed a data assimilation model for high-latitude ionospheric plasma dynamics and electrodynamics. With a set of physical models, an ensemble Kalman filter, and the ingestion of data from multiple observations, the data assimilation model can produce a self-consistent time series of complete descriptions of the global high-latitude ionosphere, including the convection electric field, horizontal and field-aligned currents, conductivity, and 3-D plasma densities and temperatures. In this presentation, we will show a new field-aligned current system discovered from the analysis of the MADS produced by our data assimilation model. This new current system appears and develops near the ionospheric terminator. The dynamical features of this current system will be described, and its connection to the active role of the ionosphere in the M-I coupling will be discussed.

  18. 78 FR 15385 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-11

    ... Series, any adjusted option series, and any option series until the time to expiration for such series is... existing requirement may at times discourage liquidity in particular options series because a market maker... the option is subject to the Price/Time execution algorithm, the Directed Market Maker shall receive...

  19. A novel water quality data analysis framework based on time-series data mining.

    PubMed

    Deng, Weihui; Wang, Guoyin

    2017-07-01

The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and to calculate similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly dissolved oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and in the tributary, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. End-to-end test of the electron-proton spectrometer

    NASA Technical Reports Server (NTRS)

    Cash, B. L.

    1972-01-01

    A series of end-to-end tests were performed to demonstrate the proper functioning of the complete Electron-Proton Spectrometer (EPS). The purpose of the tests was to provide experimental verification of the design and to provide a complete functional performance check of the instrument from the excitation of the sensors to and including the data processor and equipment test set. Each of the channels of the EPS was exposed to a calibrated beam of energetic particles, and counts were accumulated for a predetermined period of time for each of several energies. The counts were related to the known flux of particles to give a monodirectional response function for each channel. The measured response function of the test unit was compared to the response function determined for the calibration sensors from the data taken from the calibration program.

  1. Algorithms exploiting ultrasonic sensors for subject classification

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Quoraishee, Shafik

    2009-09-01

Proposed here is a series of techniques exploiting micro-Doppler ultrasonic sensors capable of characterizing detected mammalian targets based on their physiological movements, captured as a series of robust features. A combination of unique and conventional digital signal processing techniques is employed and arranged so as to classify a series of walkers. These feature-extraction processes develop a robust feature space capable of discriminating movements generated by bipeds and quadrupeds, further subdivided into large or small. These movements can be exploited to provide specific information about a given signature, dividing it into a series of subset signatures; wavelets are used to generate start/stop times. After viewing a series of spectrograms of the signature, distinct differences become visible, and using kurtosis we generate an envelope detector capable of isolating each of the step cycles generated during a walk. The walk cycle is defined as one complete sequence of walking or running, from the foot pushing off the ground until it returns to the ground. This timing information segments events that are readily seen in the spectrogram but obscured in the temporal domain into individual walk sequences. Each walking sequence is then translated into a three-dimensional waterfall plot defining the expected energy value associated with the motion at a particular instance of time and frequency. This value is repeatable for each class and can be employed to discriminate events. Highly reliable classification is realized by a classifier trained on a candidate sample space derived from the gyrations created by the motion of actors of interest. The classifier developed herein can classify events as adult humans, child humans, horses, and dogs at potentially high rates on the tested sample space. The algorithm developed and described here provides utility to an underused sensor modality for human intrusion detection, where the current high rate of false alarms is a concern. The active ultrasonic sensor, coupled in a multi-modal sensor suite with binary, less descriptive sensors such as seismic devices, achieves greater detection accuracy for persons of interest for homeland security purposes.

  2. Preformed, patterned striping material : "Stamark Pliant Polymer Marking Tape" : "Series 5730" , "Series A350" : final report.

    DOT National Transportation Integrated Search

    1991-11-01

    In 1989, two pavement striping tape materials were placed on two new asphalt pavements. A two-year performance evaluation of the materials has been completed by the Oregon State Highway Division's (OSHD's) Materials and Research Section. : On the fir...

  3. Hypoplasia or Absence of Posterior Leaflet: A Rare Congenital Anomaly of The Mitral Valve in Adulthood – Case Series

    PubMed Central

    Parato, Vito Maurizio; Masia, Stefano Lucio

    2018-01-01

    We present a case series of two adult patients with almost complete absence of the posterior mitral valve leaflet and who are asymptomatic or mildly symptomatic, with two different degrees of mitral regurgitation. PMID:29629259

  4. WNV Typer: a server for genotyping of West Nile viruses using an alignment-free method based on a return time distribution.

    PubMed

    Kolekar, Pandurang; Hake, Nilesh; Kale, Mohan; Kulkarni-Kale, Urmila

    2014-03-01

    West Nile virus (WNV), genus Flavivirus, family Flaviviridae, is a major cause of viral encephalitis with broad host range and global spread. The virus has undergone a series of evolutionary changes with emergence of various genotypic lineages that are known to differ in type and severity of the diseases caused. Currently, genotyping is carried out using molecular phylogeny of complete coding sequences and genotype is assigned based on proximity to reference genotypes in tree topology. Efficient epidemiological surveillance of WNVs demands development of objective criteria for typing. An alignment-free approach based on return time distribution (RTD) of k-mers has been validated for genotyping of WNVs. The RTDs of complete genome sequences at k=7 were found to be optimum for classification of the known lineages of WNVs as well as for genotyping. It provides time and computationally efficient alternative for genome based annotation of WNV lineages. The development of a WNV Typer server based on RTD is described (http://bioinfo.net.in/wnv/homepage.html). Both the method and the server have 100% sensitivity and specificity. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
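    The return-time-distribution idea can be sketched briefly: for every k-mer, collect the gaps between its successive occurrences, summarize each gap distribution by its mean and standard deviation, and compare sequences by the distance between these feature vectors (nearest reference genotype wins). The toy sequences and k = 3 below are illustrative assumptions; the paper uses k = 7 on complete coding sequences, and this sketch is not the WNV Typer implementation.

    ```python
    from itertools import product
    import numpy as np

    def rtd_features(seq, k=3, alphabet="ACGT"):
        """Mean and standard deviation of return times (gaps between successive
        occurrences) for every possible k-mer; zeros if a k-mer occurs fewer than twice."""
        gaps = {kmer: [] for kmer in map("".join, product(alphabet, repeat=k))}
        last_pos = {}
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            if kmer in gaps:
                if kmer in last_pos:
                    gaps[kmer].append(i - last_pos[kmer])
                last_pos[kmer] = i
        feats = []
        for kmer in sorted(gaps):
            g = gaps[kmer]
            feats.extend([np.mean(g), np.std(g)] if g else [0.0, 0.0])
        return np.array(feats)

    def rtd_distance(a, b, k=3):
        """Euclidean distance between the RTD feature vectors of two sequences."""
        return float(np.linalg.norm(rtd_features(a, k) - rtd_features(b, k)))

    # toy sequences only; a real classifier would compare a query against reference genotypes
    s1 = "ACGTACGTTACGGACGTACGA" * 20
    s2 = "ACGGACTTACGTACGAACGTT" * 20
    s3 = "TTTTACGGGTTCACGTTTTGA" * 20
    print(rtd_distance(s1, s2), rtd_distance(s1, s3))
    ```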

  5. Climate Prediction Center - Stratosphere: Polar Stratosphere and Ozone

    Science.gov Websites

Web page content (fragment): latitudinal-time cross sections show the thermal evolution of regions where ozone depletion processes can occur; linked time-series products include the UV daily dosage estimate, the South Polar vertical ozone profile, and time series of the size of the S.H. polar vortex, the S.H. PSC temperature region, and the N.H. polar vortex.

  6. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    PubMed

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
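    A minimal sketch of the Bandt-Pompe symbolization and the forbidden-pattern count, assuming a regularly sampled series and ignoring ties; it is not the authors' code, and the embedding dimension d = 4 is an arbitrary choice.

    ```python
    from collections import Counter
    from itertools import permutations
    import numpy as np

    def ordinal_pattern_counts(x, d=4):
        """Bandt-Pompe symbolization: each length-d window is replaced by the
        permutation of indices that sorts it; returns the pattern frequency table."""
        x = np.asarray(x, dtype=float)
        return Counter(tuple(int(v) for v in np.argsort(x[i:i + d]))
                       for i in range(len(x) - d + 1))

    def forbidden_pattern_fraction(x, d=4):
        """Fraction of the d! possible ordinal patterns that never occur in the series."""
        counts = ordinal_pattern_counts(x, d)
        patterns = list(permutations(range(d)))
        return sum(1 for p in patterns if counts[p] == 0) / len(patterns)

    # a deterministic (logistic-map) series keeps genuine forbidden patterns,
    # while white noise of the same length usually shows none
    logistic = np.empty(2000)
    logistic[0] = 0.4
    for i in range(1, len(logistic)):
        logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])
    print(forbidden_pattern_fraction(logistic), forbidden_pattern_fraction(np.random.rand(2000)))
    ```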

  7. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  8. Homogenising time series: Beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2010-09-01

For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool for quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approximate well those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: a) Can statistically detected change-points be accepted only with the confirmation of metadata information? b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? c) Is it good to limit the spatial comparison of a candidate series to up to five other series in the neighbourhood? Empirical results - those from the COST benchmark and from other experiments - show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities look like part of the climatic variability, so a pure application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes, and long-term fluctuations of time series are usually much closer to reality than in the raw time series. The developers and users of homogenisation methods have to bear in mind that the ultimate purpose of homogenisation is not to find change-points, but to obtain observed time series with statistical properties that characterise well the climate change and climate variability.

  9. Research on the remote sensing methods of drought monitoring in Chongqing

    NASA Astrophysics Data System (ADS)

    Yang, Shiqi; Tang, Yunhui; Gao, Yanghua; Xu, Yongjin

    2011-12-01

Chongqing experiences regional and periodic droughts, which seriously impact agricultural production and people's lives. This study attempted to monitor drought in Chongqing, an area with complex terrain, using MODIS data. First, we analyzed and compared three remote sensing methods for drought monitoring (time series of vegetation index, the temperature vegetation dryness index (TVDI), and the vegetation supply water index (VSWI)) for the severe drought of 2006. We then developed a remote sensing based drought monitoring model for Chongqing by combining soil moisture data and meteorological data. The results showed that the three remote sensing based drought monitoring models performed reasonably well in detecting the occurrence of drought in Chongqing. However, the time series of vegetation index is more sensitive in the temporal domain but weaker in the spatial domain. Although TVDI and VSWI can reproduce the whole course of the severe 2006 summer drought in the spatial domain, from onset through intensification, relief, renewed intensification, and complete remission, TVDI requires both extremely dry and extremely moist conditions to be present in the study area, which is difficult to satisfy in Chongqing. VSWI is simple and practicable, and the correlation coefficient between VSWI and the soil moisture data reaches significant levels. In summary, VSWI is the best model for summer drought monitoring in Chongqing.

  10. Historical Space Climate Data from Finland: Compilation and Analysis

    NASA Astrophysics Data System (ADS)

    Nevanlinna, Heikki

    2004-10-01

We have compiled archived geomagnetic observations from the Helsinki magnetic observatory as well as visual sightings of auroral occurrence in Finland. The magnetic database comprises about 2 000 000 observations of H- and D-components measured during 1844-1909 with a time resolution of 10 min to 1 h. In addition, magnetic observations carried out in the First and Second Polar Years in Finland have been recompiled. Magnetic activity indices (three-hour K- and daily Ak-figures) have been derived from the magnetic observations. Comparisons between the Finnish indices and the simultaneous global aa-index (starting in 1868) show a good mutual correlation. The Helsinki activity index series can be used as a (pseudo) extension of the aa-index series for about two solar cycles, 1844-1868. On the annual level the correlation coefficient is about 0.9 during the overlapping time interval 1868-1897. The auroral database consists of about 20 000 individual observations made in Finland since the year 1748. The database of visual auroras has been complemented by auroral occurrence (AO) index data derived from the Finnish all-sky camera recordings during 1973-1997 at several sites in Lapland. The AO-index reveals both spatial and temporal variations of auroras from diurnal to solar cycle time scales in different space weather conditions.

  11. Causal Inference and the Comparative Interrupted Time Series Design: Findings from Within-Study Comparisons

    ERIC Educational Resources Information Center

    St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.

    2014-01-01

    Researchers are increasingly using comparative interrupted time series (CITS) designs to estimate the effects of programs and policies when randomized controlled trials are not feasible. In a simple interrupted time series design, researchers compare the pre-treatment values of a treatment group time series to post-treatment values in order to…
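    A simple CITS specification can be written as an ordinary least-squares model with a time trend, a post-interruption level shift, a post-interruption slope change, and each of these interacted with a treatment-group indicator; the interaction coefficients estimate the treatment effect net of the comparison series. The sketch below is a textbook-style formulation under those assumptions, not the specification used in the cited article.

    ```python
    import numpy as np

    def cits_effects(time, y, treated, cutoff):
        """OLS fit of a simple comparative interrupted time series model; returns the
        treated group's post-period level change and slope change relative to the
        comparison group's own interruption."""
        time, y, treated = (np.asarray(a, dtype=float) for a in (time, y, treated))
        post = (time >= cutoff).astype(float)
        t_since = post * (time - cutoff)
        X = np.column_stack([np.ones_like(time), time, post, t_since,
                             treated, treated * time, treated * post, treated * t_since])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return {"level_change": beta[6], "slope_change": beta[7]}

    # simulated data: the treated series jumps by 2.0 at the interruption, the comparison does not
    rng = np.random.default_rng(1)
    t = np.tile(np.arange(20.0), 2)
    grp = np.repeat([0.0, 1.0], 20)
    y = 1.0 + 0.3 * t + 2.0 * grp * (t >= 10) + rng.normal(0, 0.3, 40)
    print(cits_effects(t, y, grp, cutoff=10))
    ```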

  12. Time Series Model Identification and Prediction Variance Horizon.

    DTIC Science & Technology

    1980-06-01

stationary time series Y(t). In terms of p(v), the three time series memory types are defined as follows: no memory, sum over v = 1 to infinity of |p(v)| = 0; short memory, sum over v = 1 to infinity of |p(v)| < infinity; long memory, sum over v = 1 to infinity of |p(v)| = infinity. Within short memory time series there are three types whose classification in terms of correlation functions is ... (1974) "Some Recent Advances in Time Series Modeling", IEEE Transactions on Automatic Control, Vol. AC-19, No. 6, December, 723-730. Parzen, E. (1976) "An

  13. Extending nonlinear analysis to short ecological time series.

    PubMed

    Hsieh, Chih-hao; Anderson, Christian; Sugihara, George

    2008-01-01

    Nonlinearity is important and ubiquitous in ecology. Though detectable in principle, nonlinear behavior is often difficult to characterize, analyze, and incorporate mechanistically into models of ecosystem function. One obvious reason is that quantitative nonlinear analysis tools are data intensive (require long time series), and time series in ecology are generally short. Here we demonstrate a useful method that circumvents data limitation and reduces sampling error by combining ecologically similar multispecies time series into one long time series. With this technique, individual ecological time series containing as few as 20 data points can be mined for such important information as (1) significantly improved forecast ability, (2) the presence and location of nonlinearity, and (3) the effective dimensionality (the number of relevant variables) of an ecological system.

  14. Introducing an operational method to forecast long-term regional drought based on the application of artificial intelligence capabilities

    NASA Astrophysics Data System (ADS)

    Kousari, Mohammad Reza; Hosseini, Mitra Esmaeilzadeh; Ahani, Hossein; Hakimelahi, Hemila

    2017-01-01

An effective drought forecast offers many advantages for the management of water resources used in agriculture, industry, and household consumption. To introduce such a model with simple data inputs, this study presents a regional drought forecast method for Fars Province of Iran based on artificial intelligence capabilities (artificial neural networks) and the Standardized Precipitation Index (SPI, in 3-, 6-, 9-, 12-, 18-, and 24-month series). Precipitation data from 41 rain gauge stations were used to compute the SPI values. In addition, weather signals including the Multivariate ENSO Index (MEI), North Atlantic Oscillation (NAO), Southern Oscillation Index (SOI), NINO1+2, anomaly NINO1+2, NINO3, anomaly NINO3, NINO4, anomaly NINO4, NINO3.4, and anomaly NINO3.4 were used as predictor variables to forecast the SPI time series for the next 12 months. Repeated testing and validation steps were applied to obtain the best artificial neural network (ANN) models. The forecasted values were mapped in the verification step and compared with the observed maps for the same dates. The results showed considerable spatial and temporal relationships, even among maps of different SPI time series. The forecasted maps for the first 6 months showed an average of 73% agreement with the observed ones. The most important finding, and the strong point of this study, is that although the drought forecasts for each station and time series were completely independent, the relationships between spatial and temporal predictions were preserved. This strength mainly derives from the repeated testing and validation steps used to select the best drought forecast models from a large set of candidate ANN models. Finally, wherever precipitation data are available, practical application of the presented method is possible.
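    The general forecasting setup described above (lagged values of a drought index feeding a small neural network that predicts the index 12 months ahead) can be sketched as follows. The toy index, the lag structure, and the network size are assumptions; the actual study computes SPI from station precipitation and also feeds the listed climate signals as predictors, none of which is reproduced here. The sketch uses scikit-learn's MLPRegressor.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def make_lagged(series, n_lags=12, horizon=12):
        """Rows of n_lags past values as predictors; the value `horizon` steps ahead as target."""
        X, y = [], []
        for i in range(n_lags, len(series) - horizon):
            X.append(series[i - n_lags:i])
            y.append(series[i + horizon - 1])
        return np.array(X), np.array(y)

    # toy SPI-like index: a standardized seasonal signal plus noise (illustrative only)
    rng = np.random.default_rng(0)
    months = np.arange(600)
    index = np.sin(2 * np.pi * months / 12) + 0.5 * rng.standard_normal(len(months))

    X, y = make_lagged(index)
    split = int(0.8 * len(X))                    # hold out the most recent 20% for validation
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
    print("validation RMSE:", rmse)
    ```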

  15. Completing the gaps in Kilauea's Father's Day InSAR displacement signature with ScanSAR

    NASA Astrophysics Data System (ADS)

    Bertran Ortiz, A.; Pepe, A.; Lanari, R.; Lundgren, P.; Rosen, P. A.

    2009-12-01

    Currently there are gaps in the known displacement signature obtained with InSAR at Kilauea between 2002 and 2009. InSAR data can be richer than GPS because of denser spatial coverage. However, for modeling rapidly varying and non-steady geophysical events, InSAR is limited by its sparser temporal sampling of the area under study. The ScanSAR mode currently available on several satellites mitigates this effect because the satellite may illuminate a given area more than once within an orbit cycle. The Kilauea displacement graph below from the Istituto per il Rilevamento Elettromagnetico dell'Ambiente (IREA) is a cut in space of the displacement signature obtained from a time series of several stripmap-to-stripmap interferograms. It shows that critical information is missing, especially between 2006 and 2007. The displacement is expected to be non-linear judging from the 2007-2008 displacement signature, thus simple interpolation would not suffice. The gap can be filled by incorporating Envisat stripmap-to-ScanSAR interferograms available during that time period. We propose leveraging JPL's new ROI-PAC ScanSAR module to create stripmap-to-ScanSAR interferograms. The new interferograms will be added to the stripmap ones in order to extend the existing stripmap time series generated using the Small BAseline Subset (SBAS) technique. At AGU we will present denser graphs that better capture Kilauea's displacement between 2003 and 2009.

  16. Maternal and Child Health Research Program. Completed Projects 1989, 1990, and 1991.

    ERIC Educational Resources Information Center

    National Center for Education in Maternal and Child Health, Arlington, VA.

    This publication describes 33 research projects supported by the federal Maternal and Child Health Bureau and completed in 1989, 1990, and 1991. It is the third edition in a series of collected abstracts of completed maternal and child health research projects. Each project abstract contains the name of the grantee, name and address of the…

  17. Pediatric and adolescent gynecology learned via a Web-based computerized case series.

    PubMed

    De Silva, Nirupama K; Dietrich, Jennifer E; Young, Amy E

    2010-04-01

    To increase resident knowledge in pediatric and adolescent gynecology via a Web-based self-tutorial. Prospective cohort involving 11 third- and fourth-year residents in a large university program. Residents were asked to complete a Web-based teaching series of cases involving common topics of pediatric and adolescent gynecology (PAG). A pretest and a posttest were completed to assess knowledge gained. Residents were asked to give feedback regarding improvements to the Web-based series for future case development. University-affiliated residency program in a major metropolitan area. Resident physicians in the Department of Obstetrics and Gynecology. Introduction of a Web-based teaching series to enhance resident education. Improvement of resident knowledge in PAG. All residents improved their knowledge in PAG after reviewing the series of cases. The pretest group mean score was 50%. The posttest group score was 69% (P < .05). All (100%) of participants said that this tool was an effective way to improve resident knowledge in PAG. A computer-based self-tutorial in pediatric and adolescent gynecology is a feasible and satisfactory teaching adjunct to PAG. Copyright 2010 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  18. Real-Time Non-Intrusive Assessment of Viewing Distance during Computer Use.

    PubMed

    Argilés, Marc; Cardona, Genís; Pérez-Cabré, Elisabet; Pérez-Magrané, Ramon; Morcego, Bernardo; Gispets, Joan

    2016-12-01

    To develop and test the sensitivity of an ultrasound-based sensor to assess the viewing distance of visual display terminals operators in real-time conditions. A modified ultrasound sensor was attached to a computer display to assess viewing distance in real time. Sensor functionality was tested on a sample of 20 healthy participants while they conducted four 10-minute randomly presented typical computer tasks (a match-three puzzle game, a video documentary, a task requiring participants to complete a series of sentences, and a predefined internet search). The ultrasound sensor offered good measurement repeatability. Game, text completion, and web search tasks were conducted at shorter viewing distances (54.4 cm [95% CI 51.3-57.5 cm], 54.5 cm [95% CI 51.1-58.0 cm], and 54.5 cm [95% CI 51.4-57.7 cm], respectively) than the video task (62.3 cm [95% CI 58.9-65.7 cm]). Statistically significant differences were found between the video task and the other three tasks (all p < 0.05). Range of viewing distances (from 22 to 27 cm) was similar for all tasks (F = 0.996; p = 0.413). Real-time assessment of the viewing distance of computer users with a non-intrusive ultrasonic device disclosed a task-dependent pattern.

  19. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

    Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australian and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, a declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical series demonstrated long and short (3-4 month) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A system approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.
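    The volatility modelling described above can be sketched as an autoregressive mean with a GARCH(1,1) conditional variance fitted to a differenced monthly series, as below. The synthetic series, the AR order, and the use of the `arch` package are illustrative assumptions; the study's actual seasonal ARMA/GARCH specification is not reproduced here.

        # Minimal sketch: AR mean + GARCH(1,1) conditional variance on a differenced monthly series.
        import numpy as np
        from arch import arch_model

        rng = np.random.default_rng(2)
        n = 168                                        # 14 years of monthly observations (synthetic)
        y = 10 + np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(scale=0.5, size=n)
        dy = np.diff(y)                                # first difference to approximate stationarity

        # AR(12) mean absorbs seasonal autocorrelation; GARCH(1,1) models volatility clustering.
        res = arch_model(dy, mean="AR", lags=12, vol="GARCH", p=1, q=1).fit(disp="off")
        print(res.summary())
        print("next-month variance forecast:", float(res.forecast(horizon=1).variance.iloc[-1, 0]))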

  20. Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium

    NASA Astrophysics Data System (ADS)

    Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank

    2013-09-01

    Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: The Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and on future research directions. Themes that emerged from these discussions included: (1) shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean climate and biogeochemistry; (2) the scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs, and future studies should be encouraged that seek mechanistic understanding of the ecological interactions underlying the biogeochemical dynamics at these sites; (3) detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements, so time-series programs must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials; (4) leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into the spatiotemporal variability underlying ecosystem changes; and (5) the value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.

  1. Microstructural evolution and mechanical properties of a low alloy high strength Ni-Cr-Mo-V steel during heat treatment process

    NASA Astrophysics Data System (ADS)

    Wu, C.; Han, S.

    2018-05-01

    In order to obtain an optimal heat treatment for a low alloy high strength Ni-Cr-Mo-V steel, the microstructural evolution and mechanical properties of the material were studied. For this purpose, a series of quenching and tempering experiments were carried out. The results showed that the effects of tempering temperature, time, and original microstructure on the microstructural evolution and final properties were significant. The martensite could be completely transformed into a tempered lath structure. With increasing temperature and time, the laths became wider and shorter, and the amount and size of the precipitates increased. The yield strength (YS), ultimate tensile strength (UTS), and hardness decreased with temperature and time, but the reduction in area (Z), elongation (E), and impact toughness displayed the opposite trend, which was related to the morphological evolution of the tempered lath structure.

  2. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
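    The Shannon-based complexity-entropy point that the Rényi curves generalize can be computed from ordinal (Bandt-Pompe) pattern probabilities, as in the sketch below; sweeping the Rényi parameter, which this illustration does not attempt, would trace out the curves described above. The embedding dimension and the test signals are illustrative choices.

        # Minimal sketch: ordinal-pattern entropy and Jensen-Shannon statistical complexity.
        import numpy as np
        from itertools import permutations
        from math import log

        def ordinal_probs(x, d=4):
            """Relative frequencies of ordinal patterns of embedding dimension d."""
            counts = {p: 0 for p in permutations(range(d))}
            for i in range(len(x) - d + 1):
                counts[tuple(np.argsort(x[i:i + d]))] += 1
            p = np.array(list(counts.values()), dtype=float)
            return p / p.sum()

        def shannon(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def complexity_entropy(x, d=4):
            p = ordinal_probs(x, d)
            n = len(p)
            h = shannon(p) / log(n)                          # normalized permutation entropy
            u = np.full(n, 1.0 / n)                          # uniform reference distribution
            js = shannon((p + u) / 2) - 0.5 * shannon(p) - 0.5 * shannon(u)
            js_max = -0.5 * ((n + 1) / n * log(n + 1) + log(n) - 2 * log(2 * n))
            return h, h * js / js_max                        # (entropy, statistical complexity)

        rng = np.random.default_rng(3)
        print("white noise :", complexity_entropy(rng.normal(size=5000)))
        x = [0.4]
        for _ in range(5000):                                # logistic map as a chaotic example
            x.append(4 * x[-1] * (1 - x[-1]))
        print("logistic map:", complexity_entropy(np.array(x)))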

  3. Long-term memory and volatility clustering in high-frequency price changes

    NASA Astrophysics Data System (ADS)

    Oh, Gabjin; Kim, Seunghwan; Eom, Cheoljun

    2008-02-01

    We studied the long-term memory in diverse stock market indices and foreign exchange rates using Detrended Fluctuation Analysis (DFA). For all high-frequency market data studied, no significant long-term memory property was detected in the return series, while a strong long-term memory property was found in the volatility time series. The possible causes of the long-term memory property were investigated using the return data filtered by the AR(1) model, reflecting the short-term memory property, the GARCH(1,1) model, reflecting the volatility clustering property, and the FIGARCH model, reflecting the long-term memory property of the volatility time series. The memory effect in the AR(1) filtered return and volatility time series remained unchanged, while the long-term memory property diminished significantly in the volatility series of the GARCH(1,1) filtered data. Notably, there is no long-term memory property, when we eliminate the long-term memory property of volatility by the FIGARCH model. For all data used, although the Hurst exponents of the volatility time series changed considerably over time, those of the time series with the volatility clustering effect removed diminish significantly. Our results imply that the long-term memory property of the volatility time series can be attributed to the volatility clustering observed in the financial time series.
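    A compact version of the DFA procedure used in the study is sketched below on synthetic data; with uncorrelated noise both the "return" and "volatility" exponents come out near 0.5, whereas the study reports exponents well above 0.5 for the volatility of real market data. The series, scales, and first-order detrending are illustrative choices.

        # Minimal sketch: Detrended Fluctuation Analysis (DFA-1) scaling exponent.
        import numpy as np

        def dfa(x, scales):
            """Fluctuation function F(s); the slope of log F vs log s is the DFA exponent."""
            y = np.cumsum(x - np.mean(x))                     # integrated profile
            F = []
            for s in scales:
                msq = []
                for k in range(len(y) // s):
                    seg = y[k * s:(k + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    msq.append(np.mean((seg - trend) ** 2))
                F.append(np.sqrt(np.mean(msq)))
            return np.array(F)

        rng = np.random.default_rng(4)
        returns = rng.normal(size=10000)                      # uncorrelated synthetic "returns"
        volatility = np.abs(returns)                          # crude volatility proxy
        scales = np.unique(np.logspace(1, 3, 20).astype(int))
        for name, series in [("returns   ", returns), ("volatility", volatility)]:
            alpha = np.polyfit(np.log(scales), np.log(dfa(series, scales)), 1)[0]
            print(f"{name}: DFA exponent ~ {alpha:.2f}")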

  4. Modeling seasonal variation of hip fracture in Montreal, Canada.

    PubMed

    Modarres, Reza; Ouarda, Taha B M J; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre

    2012-04-01

    The investigation of the association of climate variables with hip fracture incidence is important in social health issues. This study examined and modeled the seasonal variation of a monthly population-based hip fracture rate (HFr) time series. The seasonal ARIMA time series modeling approach was used to model monthly HFr time series for female and male patients aged 40-74 and 75+ in Montreal, Québec, Canada, over the period 1993-2004. The correlation coefficients between HFr and meteorological variables such as temperature, snow depth, rainfall depth, and day length are significant. The nonparametric Mann-Kendall test for trend assessment and the nonparametric Levene's test and Wilcoxon's test for checking the difference in HFr before and after a change point were also used. The seasonality in HFr indicated a sharp difference between winter and summer. The trend assessment showed decreasing trends in HFr for the female and male groups. The nonparametric tests also indicated a significant change in mean HFr. A seasonal ARIMA model was applied to HFr time series without trend, and a time trend ARIMA model (TT-ARIMA) was developed and fitted to HFr time series with a significant trend. The multi-criteria evaluation showed the adequacy of the SARIMA and TT-ARIMA models for modeling seasonal hip fracture time series without and with a significant trend, respectively. In the time series analysis of HFr in the Montreal region, the effects of the seasonal variation of climate variables on hip fracture are clear. The seasonal ARIMA model is useful for modeling HFr time series without trend; however, for time series with a significant trend, the TT-ARIMA model should be applied. Copyright © 2011 Elsevier Inc. All rights reserved.
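    A minimal sketch of the two model classes named above, fitted to a synthetic monthly series with statsmodels: a seasonal ARIMA without a deterministic trend, and a trend-augmented variant standing in for the TT-ARIMA model. The synthetic data and the (1,0,1)(1,0,1,12) orders are illustrative assumptions, not the study's specification.

        # Minimal sketch: seasonal ARIMA vs. seasonal ARIMA with a deterministic time trend.
        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(5)
        months = np.arange(144)                               # 12 years of monthly rates (synthetic)
        hfr = 50 - 0.05 * months + 5 * np.cos(2 * np.pi * months / 12) + rng.normal(scale=2, size=144)

        sarima = SARIMAX(hfr, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
        # Trend-augmented variant: the same seasonal ARIMA plus intercept and linear time trend.
        tt_arima = SARIMAX(hfr, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12), trend="ct").fit(disp=False)

        print("AIC without trend:", round(sarima.aic, 1), "| with trend:", round(tt_arima.aic, 1))
        print("12-month forecast:", tt_arima.forecast(steps=12).round(1))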

  5. Smoothing of climate time series revisited

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.

    2008-08-01

    We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850-2007 and to various surrogate global mean temperature series from 1850-2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically out-performs certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.
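    The adaptive idea described above can be illustrated with a deliberately simplified sketch: smooth the series under several boundary treatments and keep the one that fits the raw data best. The moving-average smoother and the padding modes used here are stand-ins for the lowpass filter and the specific boundary constraints of the published method, not a reproduction of it.

        # Simplified sketch: choose a boundary treatment adaptively by fit to the raw series.
        import numpy as np

        def smooth(y, window, pad_mode):
            """Moving-average smoother with a chosen boundary-padding rule."""
            half = window // 2
            ypad = np.pad(y, half, mode=pad_mode)
            return np.convolve(ypad, np.ones(window) / window, mode="valid")[: len(y)]

        rng = np.random.default_rng(6)
        t = np.arange(158)                                    # e.g. annual values, 1850-2007
        y = 0.005 * t + 0.2 * np.sin(t / 10) + rng.normal(scale=0.1, size=len(t))

        window, fits = 11, {}
        for mode in ("mean", "edge", "reflect"):              # stand-ins for the boundary constraints
            ys = smooth(y, window, mode)
            fits[mode] = np.mean((y - ys) ** 2)
        best = min(fits, key=fits.get)
        print("boundary treatment chosen adaptively:", best, "| MSE vs raw series:", round(fits[best], 4))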

  6. Seroprevalence of poliovirus antibodies in the Kansas City metropolitan area, 2012-2013.

    PubMed

    Wallace, Gregory S; Pahud, Barbara A; Weldon, William C; Curns, Aaron T; Oberste, M Steven; Harrison, Christopher J

    2017-04-03

    No indigenous cases of poliomyelitis have occurred in the US since 1979; however, the risk of importation persists until global eradication is achieved. The seropositivity rates for different age cohorts with exposures to different poliovirus vaccine types and wild virus in the US are not presently known. A convenience sample was enrolled in the Kansas City metropolitan area during 2012-2013, with approximately 100 participants for each of 5 age cohorts categorized based on vaccine policy changes over time in the US. Immunization records for poliovirus vaccination were required for participants <18 y of age. We evaluated the prevalence of serum antibodies to all 3 poliovirus serotypes. Seroprevalence was evaluated by demographics as well as between polio serotypes. The overall seroprevalence to poliovirus was 90.7%, 94.4%, and 83.3% for types 1, 2, and 3, respectively. Seroprevalence was high (88.6%-96.2%) for all 3 types of poliovirus in the 6-10 y old age group, which was likely to have received a complete schedule of IPV-only vaccination. Children 2-3 y of age, who had not yet completed their full IPV series, had lower seroprevalence compared with all older age groups for types 1 and 2 (p-value < 0.05). Seroprevalence was high for all 3 types of poliovirus in the population surveyed. Seroprevalence for subjects aged 2-3 y was lower than for all other age groups for serotypes 1 and 2, highlighting the importance of completing the recommended poliovirus vaccine series with a booster dose at age 4-6 y.

  7. Hunger induced changes in food choice. When beggars cannot be choosers even if they are allowed to choose.

    PubMed

    Hoefling, Atilla; Strack, Fritz

    2010-06-01

    The aim of the present work was to examine the influence of food deprivation on food choice. For this purpose, hungry versus satiated subjects were presented with a series of choices between two snacks in a complete block design of pairwise comparisons. Snacks systematically varied with respect to subjects' idiosyncratic taste preferences (preferred versus un-preferred snack), portion size (large portion versus very small portion), and availability in terms of time (immediately available versus available only after a substantial time delay). Food choices were analyzed with a conjoint analysis, which corroborated the assumption that food deprivation decreases the relative importance of taste preference and increases the importance of immediate availability of food. Copyright 2010 Elsevier Ltd. All rights reserved.

  8. Modern Era Retrospective-analysis for Research and Applications (MERRA) Global Water and Energy Budgets

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Robertson, Franklin R.; Chen, Junye

    2008-01-01

    The Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis has produced several years of data, on the way to completing the 1979-present modern satellite era. Here, we present a preliminary evaluation of those years currently available, including comparisons with the existing long reanalyses (ERA40, JRA25, and NCEP I and II) as well as with global data sets for the water and energy cycles. Time series show that the MERRA budgets can change with some of the variations in observing systems. We will present all terms of the budgets in MERRA, including the time rates of change and analysis increments (tendencies due to the analysis of observations).

  9. The Defense Acquisition University: Training Professionals for the Acquisition Workforce, 1992-2003

    DTIC Science & Technology

    2007-01-01

    Procurement Issues for the Coming Decade,” The PricewaterhouseCoopers Endowment for The Business of Government, New Ways to Manage Series (January 2002...led to the sixth in the series of reports. Led by David Packard, the co-founder of the electronics firm Hewlett-Packard and a former Deputy Secretary...occupational series, 67 percent had not completed their mandatory training. The deficiencies of DoD acquisition training found by the House

  10. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    NASA Astrophysics Data System (ADS)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = [C11 C12 C2,d1 - C12² C1,d1] / [C11² C22 - C11 C12²], where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming, on paleoclimate scales the cause-effect relation may be completely reversed. Key words: Causation, Information flow, Uncertainty Generation, El Niño, IOD, CO2/Global warming. Reference: Liang, 2014: Unraveling the cause-effect relation between time series. PRE 90, 052150. News Report: http://scitation.aip.org/content/aip/magazine/physicstoday/news/10.1063/PT.5.7124
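    The estimator quoted above can be implemented directly, as in the sketch below; the synthetic one-way-coupled series used to exercise it are an illustrative assumption, not data from the study.

        # Direct implementation of T2->1 = (C11*C12*C2d1 - C12^2*C1d1) / (C11^2*C22 - C11*C12^2).
        import numpy as np

        def liang_T21(x1, x2, dt=1.0):
            """Liang-style information flow from series x2 to series x1 (linear estimator)."""
            dx1 = (x1[1:] - x1[:-1]) / dt                     # Euler forward difference of X1
            x1, x2 = x1[:-1], x2[:-1]
            C = np.cov(np.vstack([x1, x2]))
            C11, C12, C22 = C[0, 0], C[0, 1], C[1, 1]
            C1d1 = np.cov(x1, dx1)[0, 1]
            C2d1 = np.cov(x2, dx1)[0, 1]
            return (C11 * C12 * C2d1 - C12 ** 2 * C1d1) / (C11 ** 2 * C22 - C11 * C12 ** 2)

        # Synthetic example with one-way forcing: x2 drives x1, but not the reverse.
        rng = np.random.default_rng(7)
        n = 20000
        x1, x2 = np.zeros(n), np.zeros(n)
        for t in range(n - 1):
            x2[t + 1] = 0.7 * x2[t] + rng.normal()
            x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + rng.normal()

        print("T 2->1 =", round(liang_T21(x1, x2), 4))        # expected clearly nonzero
        print("T 1->2 =", round(liang_T21(x2, x1), 4))        # expected near zero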

  11. Acceptance Inspection for Audio Cassette Recorders.

    ERIC Educational Resources Information Center

    Smith, Edgar A.

    A series of inspections for cassette recorders that can be performed to assure that the devices are acceptable is described. The inspections can be completed in 20 minutes and can be performed by instructional personnel. The series of inspection procedures includes tests of the intelligibility of audio, physical condition, tape speed, impulse…

  12. Upcoming Summer Programs for Students and Staff | Poster

    Cancer.gov

    By Robin Meckley, Contributing Writer This summer, the Scientific Library is hosting three programs for students and NCI at Frederick staff: the Summer Video Series, Mini Science Film & Discussion Series, and Eighth Annual Student Science Jeopardy Tournament. Complete information on the programs is available on the Scientific Library’s website.

  13. Forecasting and analyzing high O3 time series in educational area through an improved chaotic approach

    NASA Astrophysics Data System (ADS)

    Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md

    2017-08-01

    Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, thereby enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors identified by the diurnal variations investigation and the sensitivity analysis of past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
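    The chaotic approach described above (phase-space reconstruction followed by local linear approximation) can be sketched as follows; the logistic-map data, the embedding parameters, and the number of neighbours are illustrative assumptions, not the study's settings.

        # Minimal sketch: delay embedding + local linear one-step forecast.
        import numpy as np

        def embed(x, m, tau):
            n = len(x) - (m - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

        def local_linear_forecast(x, m=3, tau=1, k=12):
            """One-step forecast from the last embedded state using k nearest neighbours."""
            V = embed(x, m, tau)
            library, successors = V[:-1], x[(m - 1) * tau + 1:]
            current = V[-1]
            idx = np.argsort(np.linalg.norm(library - current, axis=1))[:k]
            A = np.column_stack([library[idx], np.ones(k)])   # neighbour states + intercept
            coef, *_ = np.linalg.lstsq(A, successors[idx], rcond=None)
            return np.r_[current, 1.0] @ coef

        rng = np.random.default_rng(8)
        x = [0.3]
        for _ in range(2000):                                 # noisy logistic map as a chaotic stand-in
            x.append(3.9 * x[-1] * (1 - x[-1]) + 0.001 * rng.normal())
        x = np.array(x)
        print("forecast:", round(local_linear_forecast(x[:-1]), 4), "| actual:", round(x[-1], 4))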

  14. Transformation-cost time-series method for analyzing irregularly sampled data

    NASA Astrophysics Data System (ADS)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  15. Transformation-cost time-series method for analyzing irregularly sampled data.

    PubMed

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations-with associated costs-to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
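    A deliberately simplified, assumption-laden illustration of the transformation-cost idea described above is sketched below: the record is sliced into fixed time windows, and the cost of turning each window's values into the next window's values (amplitude changes of rank-matched points plus a penalty for points that must be added or deleted) yields a regularly sampled cost series. The published TACTS operations and cost function are more elaborate than this.

        # Simplified sketch: transformation cost between consecutive windows of an irregular record.
        import numpy as np

        def segment_cost(seg_a, seg_b, add_delete_cost=1.0):
            """Crude cost of transforming one value segment into another."""
            a, b = np.sort(seg_a), np.sort(seg_b)
            m = min(len(a), len(b))
            amplitude_cost = np.sum(np.abs(a[:m] - b[:m]))    # match points greedily by rank
            return amplitude_cost + add_delete_cost * abs(len(a) - len(b))

        def cost_series(times, values, window):
            """Regularly sampled series of costs between consecutive fixed-width time windows."""
            edges = np.arange(times.min(), times.max(), window)
            segments = [values[(times >= e) & (times < e + window)] for e in edges]
            return np.array([segment_cost(segments[i], segments[i + 1])
                             for i in range(len(segments) - 1)])

        rng = np.random.default_rng(9)
        t = np.sort(rng.uniform(0, 100, size=400))            # irregular sampling times
        v = np.sin(t / 5) + 0.2 * rng.normal(size=len(t))
        print("first cost values:", np.round(cost_series(t, v, window=5.0)[:5], 3))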

  16. A Method of Time-Series Change Detection Using Full PolSAR Images from Different Sensors

    NASA Astrophysics Data System (ADS)

    Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.

    2018-04-01

    Most of the existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of a time-series PolSAR stack was calculated by an omnibus statistic test. Secondly, difference images between any two images at different times were acquired by the Rj statistic test. A generalized Gaussian mixture model (GGMM) was used in the last step of the proposed method to obtain the time-series change detection maps. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect time-series change from different sensors.
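    The final mixture-model step can be illustrated with the sketch below, which fits a two-component Gaussian mixture to the pixel values of a difference image and labels the high-valued component as change. The random stand-in data and the ordinary Gaussian mixture (in place of the generalized Gaussian mixture model named above) are assumptions for illustration.

        # Minimal sketch: binary change map from a difference image via a two-component mixture.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(10)
        h, w = 200, 200
        diff = rng.normal(0.0, 0.3, size=(h, w))                       # "no change" background
        diff[60:120, 80:150] += rng.normal(2.0, 0.4, size=(60, 70))    # simulated changed patch

        gmm = GaussianMixture(n_components=2, random_state=0).fit(diff.reshape(-1, 1))
        labels = gmm.predict(diff.reshape(-1, 1)).reshape(h, w)
        change_label = int(np.argmax(gmm.means_.ravel()))              # larger-mean component = change
        change_map = labels == change_label
        print("fraction of pixels flagged as change:", round(change_map.mean(), 3))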

  17. The method of trend analysis of parameters time series of gas-turbine engine state

    NASA Astrophysics Data System (ADS)

    Hvozdeva, I.; Myrhorod, V.; Derenh, Y.

    2017-10-01

    This research substantiates an approach to interval estimation of the trend component of a time series. Well-known methods of spectral and trend analysis are used for multidimensional data arrays. Interval estimation of the trend component is proposed for time series whose autocorrelation matrix possesses a prevailing eigenvalue. The properties of the time series autocorrelation matrix are identified.

  18. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
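    One common way to realize the mapping described above is to turn each channel of the multivariate series into one network layer via a horizontal visibility graph; whether this matches the authors' exact construction is an assumption of the sketch below, which then compares a simple layer statistic (mean degree) across channels.

        # Minimal sketch: one horizontal-visibility-graph layer per channel of a multivariate series.
        import numpy as np

        def hvg_edges(x):
            """Edges (i, j) of the horizontal visibility graph of a 1-D series x."""
            edges, n = [], len(x)
            for i in range(n - 1):
                edges.append((i, i + 1))                  # consecutive points always see each other
                top = x[i + 1]
                for j in range(i + 2, n):
                    if x[i] > top and x[j] > top:         # all intermediate values lie below both ends
                        edges.append((i, j))
                    top = max(top, x[j])
                    if top >= x[i]:                       # nothing further right can see point i
                        break
            return edges

        rng = np.random.default_rng(11)
        channels = {"noise": rng.normal(size=500),
                    "trend+noise": np.linspace(0, 3, 500) + rng.normal(size=500)}

        for name, series in channels.items():             # one layer per channel
            degree = np.zeros(len(series))
            for i, j in hvg_edges(series):
                degree[i] += 1
                degree[j] += 1
            print(f"layer '{name}': mean degree {degree.mean():.2f}")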

  19. Homogenising time series: beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2011-06-01

    In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics closely approximate those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability; thus the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as problems of time series comparisons within homogenisation procedures, are discussed briefly in the study.

  20. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of the repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
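    The kind of pre-processing the tool automates can be sketched with pandas as below: an irregular record is aggregated to a regular monthly series, short gaps are interpolated, and a simple linear trend is estimated. The data and the processing choices (monthly aggregation, short-gap interpolation) are hypothetical and are not taken from the application itself.

        # Minimal sketch: regularize an irregular record by aggregation and interpolation, then fit a trend.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(12)
        days = np.sort(rng.choice(3650, size=900, replace=False))     # irregular observation days
        dates = pd.to_datetime("2000-01-01") + pd.to_timedelta(days, unit="D")
        stage = pd.Series(5 + 0.0003 * np.arange(900) + rng.normal(scale=0.5, size=900), index=dates)

        monthly = stage.resample("MS").mean()              # aggregate to a monthly series
        monthly = monthly.interpolate(limit=2)             # fill short gaps only
        x = np.arange(len(monthly))[monthly.notna().values]
        trend = np.polyfit(x, monthly.dropna().values, 1)[0]
        print("months:", len(monthly), "| estimated linear trend per month:", round(trend, 4))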
