Jones, Matthew; Lewis, Sarah; Parrott, Steve; Wormall, Stephen; Coleman, Tim
2016-06-01
In pregnant smoking cessation trial participants, to estimate (1) among women abstinent at the end of pregnancy, the proportion who re-start smoking at time-points afterwards (primary analysis) and (2) among all trial participants, the proportion smoking at the end of pregnancy and at selected time-points during the postpartum period (secondary analysis). Trials were identified from two Cochrane reviews plus searches of Medline and EMBASE. Twenty-seven trials were included. The included trials were randomized or quasi-randomized trials of within-pregnancy cessation interventions given to smokers who reported abstinence both at end of pregnancy and at one or more defined time-points after birth. Outcomes were biochemically validated and self-reported continuous abstinence from smoking and 7-day point prevalence abstinence. The primary random-effects meta-analysis used longitudinal data to estimate mean pooled proportions of re-starting smoking; a secondary analysis used cross-sectional data to estimate the mean proportions smoking at different postpartum time-points. Subgroup analyses were performed on biochemically validated abstinence. The pooled mean proportion re-starting at 6 months postpartum was 43% [95% confidence interval (CI) = 16-72%, I² = 96.7%] (11 trials, 571 abstinent women). The pooled mean proportion smoking at the end of pregnancy was 87% (95% CI = 84-90%, I² = 93.2%) and 94% (95% CI = 92-96%, I² = 88%) at 6 months postpartum (23 trials, 9262 trial participants). Findings were similar when using biochemically validated abstinence. In clinical trials of smoking cessation interventions during pregnancy, only 13% are abstinent at term. Of these, 43% re-start by 6 months postpartum. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
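As a rough illustration of the random-effects pooling behind these figures (not the authors' code; the per-trial counts below are invented), a DerSimonian-Laird pool of logit-transformed proportions can be sketched as:

```python
import math

# Hypothetical per-trial data: (number re-starting, number abstinent at term)
trials = [(12, 30), (25, 80), (8, 15), (40, 70), (18, 52)]

# Logit-transform each proportion; the variance of a logit proportion is 1/x + 1/(n-x)
effects = [math.log(x / (n - x)) for x, n in trials]
variances = [1 / x + 1 / (n - x) for x, n in trials]

# DerSimonian-Laird estimate of between-trial variance tau^2
w = [1 / v for v in variances]
fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(trials) - 1)) / c)

# Random-effects weights, pooled logit, back-transform to a proportion
w_re = [1 / (v + tau2) for v in variances]
pooled_logit = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
pooled = 1 / (1 + math.exp(-pooled_logit))
i2 = max(0.0, (q - (len(trials) - 1)) / q) * 100  # heterogeneity statistic I^2

print(f"pooled proportion re-starting: {pooled:.2f}, I^2 = {i2:.1f}%")
```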
COREnet: The Fusion of Social Network Analysis and Target Audience Analysis
2014-12-01
Misunderstanding of MISO (PSYOP) not only in doctrine, but also in practice, is easily understood. MISO has a long history of name changes starting … TAA does not strictly adhere to any particular theory; studying dynamics is a valid starting point for analysis, and is naturally congruent with the … provides a starting point for further analysis. The PO is a pre-approved objective by the Office of the Secretary of Defense (OSD) (JP 3-53, 2003, V-1) …
[Evaluating public health in some zoos in Colombia. Phase 1: designing and validating instruments].
Agudelo-Suárez, Angela N; Villamil-Jiménez, Luis C
2009-10-01
Designing and validating instruments for identifying public health problems in some zoological parks in Colombia, thereby allowing them to be evaluated. Four instruments were designed and validated with the participation of five zoos. The instruments were validated for appearance and content, sensitivity to change, and reliability, and their usefulness was determined. An evaluation scale was created which assigned a maximum of 400 points, with the following evaluation intervals: 350-400 points meant good public health management, 100-349 points regular management, and 0-99 points deficient management. The instruments were applied to the five zoos as part of the validation, forming a baseline for future evaluation of public health in them. Four valid and useful instruments were obtained for evaluating public health in zoos in Colombia. The five zoos presented regular public health management. The baseline obtained when validating the instruments led to identifying strengths and weaknesses regarding public health management in the zoos. The resulting instruments evaluated public health management both generally and specifically; they led to diagnosing, identifying, quantifying and scoring zoos in Colombia in terms of public health. The baseline provided a starting point for making comparisons and enabling future follow-up of public health in Colombian zoos.
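The 400-point scale described above maps directly to a three-level classification. A trivial sketch of that rule (cut-points taken from the abstract; the function name is ours):

```python
def classify_zoo_management(score: int) -> str:
    """Map a 0-400 instrument score to the abstract's management categories."""
    if not 0 <= score <= 400:
        raise ValueError("score must be between 0 and 400")
    if score >= 350:
        return "good public health management"
    if score >= 100:
        return "regular management"
    return "deficient management"

print(classify_zoo_management(120))  # -> "regular management" (all five zoos fell here)
```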
ERIC Educational Resources Information Center
Testa, Italo; Galano, Silvia; Leccia, Silvio; Puddu, Emanuella
2015-01-01
In this paper, we report about the development and validation of a learning progression about the Celestial Motion big idea. Existing curricula, research studies on alternative conceptions about these phenomena, and students' answers to an open questionnaire were the starting point to develop initial learning progressions about change of seasons,…
Art Therapy and Dissociative Disorders.
ERIC Educational Resources Information Center
Engle, Patricia
1997-01-01
Demonstrates how art therapy helped a woman address her identity and memory difficulties while she managed her daily activities. The process helped her validate traumatic events in her history and provided a starting point for addressing internal conflicts. The client's artwork helped the therapist learn about the client's unconscious states. (MKA)
DEVELOPMENT AND VALIDATION OF AN AIR-TO-BEEF FOOD CHAIN MODEL FOR DIOXIN-LIKE COMPOUNDS
A model for predicting concentrations of dioxin-like compounds in beef is developed and tested. The key premise of the model is that concentrations of these compounds in air are the source term, or starting point, for estimating beef concentrations. Vapor-phase concentrations t...
A quantitative dynamic systems model of health-related quality of life among older adults
Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela
2015-01-01
Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722
Analyzing user-generated online content for drug discovery: development and use of MedCrawler.
Helfenstein, Andreas; Tammela, Päivi
2017-04-15
Ethnopharmacology, or the scientific validation of traditional medicine, is a respected starting point in drug discovery. Home remedies and traditional use of plants are still widespread, including in Western societies. Instead of perusing ancient pharmacopeias, we developed MedCrawler, a data mining tool for analyzing user-generated blog posts for mentions of home remedies and their applications, aiming to find modern 'traditional' medicine. The method is free and accessible from an office computer. MedCrawler searches user-generated blog posts and analyzes them for correlations between medically relevant terms. We also present examples and show that this method is capable of delivering both scientifically validated uses and less well documented applications, which might serve as starting points for follow-up research. Source code is available on GitHub at https://github.com/a-hel/medcrawler. Contact: paivi.tammela@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Accurate Grid-based Clustering Algorithm with Diagonal Grid Searching and Merging
NASA Astrophysics Data System (ADS)
Liu, Feng; Ye, Chengcheng; Zhu, Erzhou
2017-09-01
Due to the advent of big data, data mining technology has attracted more and more attention. As an important data analysis method, grid clustering is fast but has relatively low accuracy. This paper presents an improved clustering algorithm that combines grid and density parameters. The algorithm first divides the data space into valid and invalid meshes through grid parameters. Secondly, starting from the first point of the diagonal of the grids, the algorithm merges the valid meshes in the "horizontal right, vertical down" direction. Furthermore, through boundary grid processing, the invalid grids are searched and merged when the adjacent left, above, and diagonal-direction grids are all valid. By doing this, the accuracy of clustering is improved. The experimental results show that the proposed algorithm is accurate and relatively fast when compared with some popularly used algorithms.
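A minimal sketch of the grid-and-density idea described above, assuming 2D points: cells above a density threshold are "valid", and adjacent valid cells (including diagonal neighbours, as in the paper's boundary handling) are merged into clusters. This is a simplification of the authors' algorithm, not their implementation:

```python
from collections import defaultdict

def grid_cluster(points, cell_size=1.0, min_pts=3):
    """Assign 2D points to grid cells, keep dense ('valid') cells,
    and merge adjacent valid cells into clusters by flood fill."""
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))

    valid = {c for c, pts in cells.items() if len(pts) >= min_pts}

    clusters, seen = [], set()
    for start in valid:
        if start in seen:
            continue
        # Flood fill over the 8-neighbourhood (horizontal, vertical, diagonal)
        stack, members = [start], []
        seen.add(start)
        while stack:
            cx, cy = stack.pop()
            members.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in valid and nb not in seen:
                        seen.add(nb)
                        stack.append(nb)
        clusters.append(members)
    return clusters

pts = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4), (5.0, 5.1), (5.2, 5.3), (5.1, 4.9)]
print([len(c) for c in grid_cluster(pts, cell_size=1.0, min_pts=2)])
```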
Grigg, Josephine; Haakonssen, Eric; Rathbone, Evelyne; Orr, Robin; Keogh, Justin W L
2017-11-13
The aim of this study was to quantify the validity and intra-tester reliability of a novel method of kinematic measurement. The measurement target was the joint angles of an athlete performing a BMX Supercross (SX) gate start action through the first 1.2 s of movement in situ on a BMX SX ramp using a standard gate start procedure. The method employed GoPro® Hero 4 Silver (GoPro Inc., USA) cameras capturing data at 120 fps and 720p on a 'normal' lens setting. Kinovea 0.8.15 (Kinovea.org, France) was used for analysis. Tracking data were exported and angles computed in Matlab (Mathworks®, USA). The gold standard 3D method for joint angle measurement could not safely be employed in this environment, so a rigid angle was used. Validity was measured to be within 2°. Intra-tester reliability was measured by the same tester performing the analysis twice, with an average of 55 days between analyses. Intra-tester reliability was high, with an absolute error <6° and <9 frames (0.075 s) across all angles and time points for key positions, respectively. The methodology is valid within 2° and reliable within 6° for the calculation of joint angles in the first ~1.25 s.
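For context, the core computation in this kind of 2D video analysis, the angle at a joint from three tracked marker positions, is a short vector calculation. A sketch (not the authors' Matlab code; the coordinates are invented):

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c,
    e.g. hip-knee-ankle markers tracked from video."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

hip, knee, ankle = (0.42, 1.05), (0.55, 0.62), (0.48, 0.11)  # invented coordinates
print(f"knee angle: {joint_angle(hip, knee, ankle):.1f} deg")
```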
STOPP/START Medication Criteria Modified for US Nursing Home Setting
Khodyakov, Dmitry; Ochoa, Aileen; Olivieri-Mui, Brianne L.; Bouwmeester, Carla; Zarowitz, Barbara J.; Patel, Meenakshi; Ching, Diana; Briesacher, Becky
2016-01-01
BACKGROUND/OBJECTIVES: A barrier to assessing the quality of prescribing in nursing homes (NH) is the lack of explicit criteria for this setting. Our objective was to develop a set of prescribing indicators measurable with available data from electronic nursing home databases by adapting the European-based 2014 STOPP/START criteria of potentially inappropriate and underused medications for the US setting. DESIGN: A two-stage expert panel process. In the first stage, the investigator team reviewed 114 criteria for compatibility and measurability. In the second stage, we convened an online modified e-Delphi (OMD) panel to rate the validity of criteria and two webinars to identify criteria with the highest relevance to US NHs. PARTICIPANTS: Seventeen experts with recognized reputations in NH care participated in the e-Delphi panel and 12 in the webinar. MEASUREMENTS: Compatibility and measurability were assessed by comparing criteria to US terminology/setting standards and data elements in NH databases. Validity was rated with a 9-point Likert-type scale (1 = not valid at all, 9 = highly valid). Mean, median, interpercentile ranges, and agreement were determined for each criterion score. Relevance was determined by ranking the mean panel ratings on criteria that reached agreement; half of the criteria with the highest mean values were reviewed and approved by the webinar participants. RESULTS: Fifty-three STOPP/START criteria were deemed compatible with the US setting and measurable using data from electronic NH databases. E-Delphi panelists rated 48 criteria as valid for US NHs. Twenty-four criteria were deemed most relevant, consisting of 22 measures of potentially inappropriate medications and 2 measures of underused medications. CONCLUSION: This study created the first explicit criteria for assessing the quality of prescribing in US NHs. PMID:28008599
The Feminist Perspective: Searching the Cosmos for a Valid Voice
Sugarman, Roy
2009-01-01
The author explores the nature of what is valid in life and what is not. This is done with particular reference to the contention that most men suffer from the conflicts that the modern world throws their way, and that their psychological nature suffers from paradoxical inputs across the lifespan. Baby boomers in particular have learned of their father's heroism, but faced their mother's wrath as the latter half of the 20th century unwound and they found no refuge for failed heroism, but rather invalid fantasy in their choices as husbands and fathers. The author concludes with the realization that heroism demands that the starting point is a void, where all struggle is valid, and heroic, with no benchmarks. PMID:21836783
Development of Driver/Vehicle Steering Interaction Models for Dynamic Analysis
1988-12-01
[Figure list excerpt: Figure 5-10, The Linearized Single-Unit Vehicle Model; Figure 5-11, Interpretation of the Single-Unit Model.] … The starting point for the driver modelling research conducted under this project was a linear preview control model originally proposed by MacAdam … regardless of its origin, can pass at least the elementary validation test of exhibiting "cross-over model"-like behavior in the vicinity of its …
Approaches to Validation of Models for Low Gravity Fluid Behavior
NASA Technical Reports Server (NTRS)
Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad
2005-01-01
This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues are that: most of the data are described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.
VALUE: A framework to validate downscaling approaches for climate change studies
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.
2015-01-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. In this paper, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How good is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.
New Expression for Collisionless Magnetic Reconnection Rate
NASA Technical Reports Server (NTRS)
Klimas, Alexander J.
2014-01-01
For 2D, symmetric, anti-parallel, collisionless magnetic reconnection, a new expression for the reconnection rate in the electron diffusion region is introduced. It is shown that this expression can be derived in just a few simple steps from a physically intuitive starting point; the derivation is given in its entirety and the validity of each step is confirmed. The predictions of this expression are compared to the results of several long-duration, open-boundary PIC reconnection simulations to demonstrate excellent agreement.
Methods for Geometric Data Validation of 3d City Models
NASA Astrophysics Data System (ADS)
Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.
2015-12-01
Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, they might represent gaps between bounding polygons of the solids, overlaps, or violations of the 2-manifoldness. Not least due to the floating-point problem in digital numbers, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection. Effects of different tolerance values and their handling are discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable the data holder to get a better comprehension of the validation results and their consequences on the deployment fields of the validated data set.
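Two of the polygon-level checks mentioned above, closure of the bounding linear ring and planarity within a tolerance, are straightforward to sketch. The tolerance values below are placeholders, not the CityDoctor defaults:

```python
import numpy as np

def ring_is_closed(ring, tol=1e-6):
    """A linear ring must end where it starts (within tolerance)."""
    return np.linalg.norm(np.asarray(ring[0]) - np.asarray(ring[-1])) <= tol

def polygon_is_planar(vertices, tol=1e-3):
    """Fit a plane via SVD and check the maximum vertex distance to it."""
    pts = np.asarray(vertices, dtype=float)
    centroid = pts.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    distances = np.abs((pts - centroid) @ normal)
    return distances.max() <= tol

facade = [(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3), (0, 0, 0)]
print(ring_is_closed(facade), polygon_is_planar(facade[:-1]))  # True True
```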
NASA National Combustion Code Simulations
NASA Technical Reports Server (NTRS)
Iannetti, Anthony; Davoudzadeh, Farhad
2001-01-01
A systematic effort is in progress to further validate the National Combustion Code (NCC) that has been developed at NASA Glenn Research Center (GRC) for comprehensive modeling and simulation of aerospace combustion systems. The validation efforts include numerical simulation of the gas-phase combustor experiments conducted at the Center for Turbulence Research (CTR), Stanford University, followed by comparison and evaluation of the computed results against the experimental data. Presently, at GRC, a numerical model of the experimental gaseous combustor has been built to simulate the experimental configuration. The constructed numerical geometry includes the flow development sections for the air annulus and fuel pipe, the 24-channel air and fuel swirlers, the hub, the combustor, and the tail pipe. Furthermore, a three-dimensional multi-block grid (1.6 million grid points, 3 levels of multi-grid) has been generated. Computational simulation of the gaseous combustor flow field operating on methane fuel has started. The computational domain includes the whole flow regime, starting from the fuel pipe and the air annulus, through the 12 air and 12 fuel channels, into the combustion region and through the tail pipe.
Thermodynamic constraints on fluctuation phenomena
NASA Astrophysics Data System (ADS)
Maroney, O. J. E.
2009-12-01
The relationships among reversible Carnot cycles, the absence of perpetual motion machines, and the existence of a nondecreasing globally unique entropy function form the starting point of many textbook presentations of the foundations of thermodynamics. However, the thermal fluctuation phenomena associated with statistical mechanics have been argued to restrict the domain of validity of this basis of the second law of thermodynamics. Here we demonstrate that fluctuation phenomena can be incorporated into the traditional presentation, extending rather than restricting the domain of validity of the phenomenologically motivated second law. Consistency conditions lead to constraints upon the possible spectrum of thermal fluctuations. In a special case this uniquely selects the Gibbs canonical distribution and more generally incorporates the Tsallis distributions. No particular model of microscopic dynamics need be assumed.
Open access chemical probes for epigenetic targets
Brown, Peter J; Müller, Susanne
2015-01-01
Background: High attrition rates in drug discovery call for new approaches to improve target validation. Academia is filling gaps, but often lacks the experience and resources of the pharmaceutical industry, resulting in poorly characterized tool compounds. Discussion: The SGC has established an open access chemical probe consortium, currently encompassing ten pharmaceutical companies. One of its mandates is to create well-characterized inhibitors (chemical probes) for epigenetic targets to enable new biology and target validation for drug development. Conclusion: Epigenetic probe compounds have proven to be very valuable and have not only spurred a plethora of novel biological findings, but also provided starting points for clinical trials. These probes have proven to be a critical complement to traditional genetic targeting strategies and have provided sometimes surprising results. PMID:26397018
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-04-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis.
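A rough sketch of the kind of simulation used here: generate clustered binary GAM outcomes with intracluster correlation via a beta-binomial model, apply an LQAS decision rule, and count how often the rule accepts. The decision threshold and correlation below are illustrative, not the study's values:

```python
import random

def simulate_lqas(n_clusters=67, per_cluster=3, true_prev=0.12,
                  icc=0.1, decision_rule=9, n_sims=2000, seed=1):
    """Estimate how often an LQAS rule classifies prevalence as 'acceptable'
    (total positives <= decision_rule) when outcomes are clustered."""
    random.seed(seed)
    # Beta-binomial: cluster-level prevalence drawn from Beta(a, b) with mean
    # true_prev and intracluster correlation icc = 1 / (a + b + 1).
    t = (1 - icc) / icc
    a, b = true_prev * t, (1 - true_prev) * t
    accept = 0
    for _ in range(n_sims):
        positives = 0
        for _ in range(n_clusters):
            p = random.betavariate(a, b)
            positives += sum(random.random() < p for _ in range(per_cluster))
        accept += positives <= decision_rule
    return accept / n_sims

print("P(classified acceptable):", simulate_lqas())
```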
CFD study of some factors affecting performance of HAWT with swept blades
NASA Astrophysics Data System (ADS)
Khalafallah, M. G.; Ahmed, A. M.; Emam, M. K.
2017-05-01
Most modern high-power wind turbines are of the horizontal-axis type with straight twisted blades, and upgrading the power and performance of these turbines is considered a challenge. A recent trend towards improving horizontal axis wind turbine (HAWT) performance is to use swept blades or sweep-twist adaptive blades. In the present work, the effect of blade curvature, sweep starting point and sweep direction on wind turbine performance was investigated. The CFD simulation method was validated against available experimental data for a 0.9 m diameter HAWT. The wind turbine power and thrust coefficients at different tip speed ratios were calculated. Flow field, pressure distribution and local tangential and streamwise forces were also analysed. The results show that the downstream-swept blade has the highest Cp value at the design point compared with the straight blade profile. However, the improvement in power coefficient is accompanied by a thrust increase. Results also show that the best performance is obtained when the blade sweep starts at 25% of the blade radius, for either sweep direction.
Predicting the Operational Acceptability of Route Advisories
NASA Technical Reports Server (NTRS)
Evans, Antony; Lee, Paul
2017-01-01
NASA envisions a future Air Traffic Management system that allows safe, efficient growth in global operations, enabled by increasing levels of automation and autonomy. In a safety-critical system, the introduction of increasing automation and autonomy has to be done in stages, making human-system integrated concepts critical for the foreseeable future. One example where this is relevant is for tools that generate more efficient flight routings or reroute advisories. If these routes are not operationally acceptable, they will be rejected by human operators, and the associated benefits will not be realized. Operational acceptance is therefore required to enable the increased efficiency and reduced workload benefits associated with these tools. In this paper, the authors develop a predictor of operational acceptability for reroute advisories. Such a capability has applications in tools that identify more efficient routings around weather and congestion and that better meet airline preferences. The capability is based on applying data mining techniques to flight plan amendment data reported by the Federal Aviation Administration and data on requested reroutes collected from a field trial of the NASA-developed Dynamic Weather Routes tool, which advised efficient route changes to American Airlines dispatchers in 2014. Ten-fold cross-validation was used for feature, model and parameter selection, while nested cross-validation was used to validate the model. The model performed well in predicting controller acceptance or rejection of a route change, as indicated by the chosen performance metrics. Features identified as relevant to controller acceptance included the historical usage of the advised route, the location of the maneuver start point relative to the boundaries of the airspace sector containing the maneuver start (the maneuver start sector), the reroute deviation from the original flight plan, and the demand level in the maneuver start sector. A random forest with forty trees was the best performing of the five models evaluated in this paper.
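As a sketch of the modelling setup described (a 40-tree random forest scored with 10-fold cross-validation), using scikit-learn: the feature comments mirror those the paper reports as relevant, but the data below are entirely invented stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n = 500
# Invented stand-ins for the features the paper found relevant
X = np.column_stack([
    rng.random(n),            # historical usage rate of the advised route
    rng.random(n),            # distance of maneuver start to sector boundary
    rng.exponential(20, n),   # reroute deviation from original flight plan
    rng.integers(0, 30, n),   # demand level in the maneuver start sector
])
# Synthetic accept/reject label loosely tied to the features, for demonstration only
y = (X[:, 0] - 0.01 * X[:, 2] + 0.3 * rng.standard_normal(n) > 0.3).astype(int)

model = RandomForestClassifier(n_estimators=40, random_state=0)
scores = cross_val_score(model, X, y, scoring="roc_auc",
                         cv=KFold(10, shuffle=True, random_state=0))
print(f"10-fold CV AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```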
Prior probability and feature predictability interactively bias perceptual decisions
Dunovan, Kyle E.; Tremel, Joshua J.; Wheeler, Mark E.
2014-01-01
Anticipating a forthcoming sensory experience facilitates perception for expected stimuli but also hinders perception for less likely alternatives. Recent neuroimaging studies suggest that expectation biases arise from feature-level predictions that enhance early sensory representations and facilitate evidence accumulation for contextually probable stimuli while suppressing alternatives. Reasonably then, the extent to which prior knowledge biases subsequent sensory processing should depend on the precision of expectations at the feature level as well as the degree to which expected features match those of an observed stimulus. In the present study we investigated how these two sources of uncertainty modulated pre- and post-stimulus bias mechanisms in the drift-diffusion model during a probabilistic face/house discrimination task. We tested several plausible models of choice bias, concluding that predictive cues led to a bias in both the starting-point and rate of evidence accumulation favoring the more probable stimulus category. We further tested the hypotheses that prior bias in the starting-point was conditional on the feature-level uncertainty of category expectations and that dynamic bias in the drift-rate was modulated by the match between expected and observed stimulus features. Starting-point estimates suggested that subjects formed a constant prior bias in favor of the face category, which exhibits less feature-level variability, that was strengthened or weakened by trial-wise predictive cues. Furthermore, we found that the gain on face/house evidence was increased for stimuli with less ambiguous features and that this relationship was enhanced by valid category expectations. These findings offer new evidence that bridges psychological models of decision-making with recent predictive coding theories of perception. PMID:24978303
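A minimal simulation of the two bias mechanisms tested above, a shift in the diffusion starting point versus a change in drift rate, illustrates how each favours the expected category. The parameter values are arbitrary, not fitted estimates from the study:

```python
import random

def ddm_trial(drift, start_bias=0.0, threshold=1.0, dt=0.01, noise=1.0):
    """Simulate one drift-diffusion trial. Returns (choice, reaction_time):
    +1 = upper boundary (e.g. 'face'), -1 = lower boundary (e.g. 'house')."""
    x, t = start_bias, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0, 1)
        t += dt
    return (1 if x > 0 else -1), t

def accuracy(n=2000, **kwargs):
    random.seed(42)
    return sum(ddm_trial(**kwargs)[0] == 1 for _ in range(n)) / n

# Face stimulus (positive drift): a prior cue shifts the start point toward
# 'face', while a matched stimulus additionally increases the drift gain.
print("no bias:         ", accuracy(drift=0.8))
print("start-point bias:", accuracy(drift=0.8, start_bias=0.3))
print("drift-rate bias: ", accuracy(drift=1.2))
```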
Davydov, D M; Naliboff, B; Shahabi, L; Shapiro, D
2018-02-01
Objective measures of pain severity remain ill defined, although its accurate measurement is critical. Reciprocal baroreflex mechanisms of blood pressure (BP) control were found to impact differently on pain regulation, and thus their asymmetry was hypothesized to also connect to chronic pain duration and severity. Seventy-eight female patients with irritable bowel syndrome (IBS) and 27 healthy women were assessed for IBS severity and chronicity, negative affect, and various measures of resting autonomic function including BP, heart rate and its variability (HRV), baroreceptor-sensitivity to activations and inhibitions, gains of brady- and tachy-cardiac baro-responses, gains of BP falls/rises, and BP start points for these spontaneous baroreflexes. IBS directly and indirectly (through increased negative affect) was associated with asymmetry between baroreceptor activations/inhibitions compared to symmetrical baroreflex reciprocity in the healthy women. In the IBS group, independently of specific IBS symptoms, pain chronicity was associated with (i) decreased BP falls coupled with either (a) decreased tachycardia associated with lower disease severity (earlier "pain resilience" mechanism), or (b) decreased bradycardia associated with higher disease severity (later "pain decompensation" mechanism), or (ii) increased BP start point for baroreceptor activations coupled with either (a) BP increase (delayed "pain adaptation" mechanism) or (b) affect-related HRV decrease (delayed "pain aggravation" mechanism). We anticipate the findings to be a starting point for validating these autonomic metrics of pain suffering and pain coping mechanisms in other chronic pain syndromes to suggest them as biomarkers of its severity and duration for profiling and correct management of chronic pain patients. © 2017 John Wiley & Sons Ltd.
Starting Point: Pedagogic Resources for Teaching and Learning Economics
ERIC Educational Resources Information Center
Maier, Mark H.; McGoldrick, KimMarie; Simkins, Scott P.
2012-01-01
This article describes Starting Point: Teaching and Learning Economics, a Web-based portal that makes innovative pedagogic resources and effective teaching practices easily accessible to economists. Starting Point introduces economists to teaching innovations through 16 online modules, each containing a general description of a specific pedagogic…
Patients' perceived level of social isolation affects the prognosis of low back pain.
Oliveira, V C; Ferreira, M L; Morso, L; Albert, H B; Refshauge, K M; Ferreira, P H
2015-04-01
Perceived social isolation is prevalent among patients with low back pain (LBP) and could be a potential prognostic factor for clinical outcomes following an episode of LBP. A secondary analysis of an original prospective cohort study, which investigated the validity of the Danish version of the STarT Back Screening Tool (STarT), investigated whether social isolation predicts the clinical outcomes of disability, anxiety, depression and pain catastrophizing in people with LBP. Patients with LBP of any duration (N = 204) from Middelfart, Denmark, were included. Social isolation was measured at baseline using the friendship scale (score ranges from 0 to 24, with lower values meaning higher perceived social isolation), and outcomes were measured at baseline and at 6-month follow-up. Regression models investigated whether social isolation at baseline predicted the outcomes at 6-month follow-up. Some level of social isolation was reported by 39.2% of the participants (n = 80) with 5.9% (n = 12) being very socially isolated. One-point difference on social isolation predicted one point on a 100-point disability scale (adjusted unstandardized coefficient: -0.91; 95% confidence interval (CI): -1.56 to -0.26). Social isolation predicted anxiety; however, a change of one point on the social isolation scale represents a difference of only 0.08 points on a 22-point scale in anxiety (95% CI: 0.01-0.15) and is unlikely to denote clinical importance. Social isolation did not predict pain catastrophizing or depression. Patients' perceived social isolation predicts disability related to LBP. Further understanding of the role of social isolation in LBP is warranted. © 2014 European Pain Federation - EFIC®
1991-03-01
the array are used cyclically, that is, when the end of the array is reached, the pattern starts over at the beginning. Dashed lines wrap around curves … the dash pattern relative to the start of the path. It is interpreted as a distance into the dash pattern at which the pattern should be started … a cubic is seldom drawn using the four points specified. The curve starts at the first point and ends at the fourth point; the second and third points are
Desmarais, Sarah L.; Nicholls, Tonia L.; Wilson, Catherine M.; Brink, Johann
2012-01-01
The Short-Term Assessment of Risk and Treatability (START) is a relatively new structured professional judgment guide for the assessment and management of short-term risks associated with mental, substance use, and personality disorders. The scheme may be distinguished from other violence risk instruments by its inclusion of 20 dynamic factors that are rated in terms of both vulnerability and strength. This study examined the reliability and validity of START assessments in predicting inpatient aggression. Research assistants completed START assessments for 120 male forensic psychiatric patients through review of hospital files. They additionally completed Historical-Clinical-Risk Management-20 (HCR-20) and Hare Psychopathy Checklist: Screening Version (PCL:SV) assessments. Outcome data were coded from hospital files for a 12-month follow-up period using the Overt Aggression Scale (OAS). START assessments evidenced excellent interrater reliability and demonstrated both predictive and incremental validity over the HCR-20 Historical subscale scores and PCL:SV total scores. Overall, results support the reliability and validity of START assessments, and use of the structured professional judgment approach more broadly, as well as the value of using dynamic risk and protective factors to assess violence risk. PMID:22250595
Cappellari, Manuel; Turcato, Gianni; Forlivesi, Stefano; Zivelonghi, Cecilia; Bovi, Paolo; Bonetti, Bruno; Toni, Danilo
2018-02-01
Symptomatic intracerebral hemorrhage (sICH) is a rare but the most feared complication of intravenous thrombolysis for ischemic stroke. We aimed to develop and validate a nomogram for individualized prediction of sICH in intravenous thrombolysis-treated stroke patients included in the multicenter SITS-ISTR (Safe Implementation of Thrombolysis in Stroke-International Stroke Thrombolysis Register). All patients registered in the SITS-ISTR by 179 Italian centers between May 2001 and March 2016 were originally included. The main outcome measure was sICH per the European Cooperative Acute Stroke Study II definition (any type of intracerebral hemorrhage with an increase of ≥4 National Institutes of Health Stroke Scale score points from baseline, or death, within 7 days). On the basis of a multivariate logistic model, the nomogram was generated. We assessed discriminative performance using the area under the receiver-operating characteristic curve and calibration of the risk prediction model using the Hosmer-Lemeshow test. A total of 15 949 patients with complete data for generating the nomogram were randomly dichotomized into training (3/4; n = 12 030) and test (1/4; n = 3919) sets. After multivariate logistic regression, 10 variables remained independent predictors of sICH and were used to compose the STARTING-SICH (systolic blood pressure, age, onset-to-treatment time for thrombolysis, National Institutes of Health Stroke Scale score, glucose, aspirin alone, aspirin plus clopidogrel, anticoagulant with INR ≤1.7, current infarction sign, hyperdense artery sign) nomogram. The area under the receiver-operating characteristic curve of STARTING-SICH was 0.739. Calibration was good (P = 0.327, Hosmer-Lemeshow test). The STARTING-SICH is the first nomogram developed and validated in a large SITS-ISTR cohort for individualized prediction of sICH in intravenous thrombolysis-treated stroke patients. © 2018 American Heart Association, Inc.
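A nomogram is a graphical rendering of a multivariate logistic model. A compressed sketch of the underlying workflow, fit a logistic regression on a 3/4 training split, then check discrimination (AUC) on the held-out quarter, is shown below on invented data; none of this reproduces the SITS-ISTR variables or coefficients:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 4000
# Invented stand-ins for a few STARTING-SICH predictors
X = np.column_stack([
    rng.normal(150, 20, n),   # systolic blood pressure
    rng.normal(70, 12, n),    # age
    rng.normal(150, 50, n),   # onset-to-treatment time (min)
    rng.normal(10, 5, n),     # NIHSS score
])
# Synthetic sICH outcome generated from an arbitrary logistic relationship
logit = -7.0 + 0.01 * X[:, 0] + 0.02 * X[:, 1] + 0.003 * X[:, 2] + 0.05 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=7)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"validation AUC: {auc:.3f}")  # the paper reports 0.739 on real data
```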
Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study
ERIC Educational Resources Information Center
Ogilvie, Emily; McCrudden, Matthew T.
2017-01-01
An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents…
NASA Astrophysics Data System (ADS)
Kheimi, M.; Wang, D.
2017-12-01
Reservoir systems and natural catchments regulate water in similar ways: in both, the key function is to buffer water availability, storing excess supply and mitigating deficits during prolonged droughts. In this paper, the Budyko equation and a hedging rule are combined in a two-stage partitioning monthly water balance model. In the first stage, precipitation is partitioned into runoff (Q) and the sum of evapotranspiration (E) and future storage (S1); in the second stage, a hedging rule allocates this sum through a trade-off between evapotranspiration and future water storage. The model introduces two linear two-point hedging parameters: the starting point of water availability (y1) and the ending point of water availability (y2). The calibration of the model is based on five parameters: three derived from the Budyko equation (S0, ξ, and Yp) and two from the hedging rule (y1 and y2). The catchment climate zone, along with its physical properties, affects the degree of hedging. The y1 and y2 parameters indicate the amount of hedging in dry and wet zones, and the span between them indicates hedging against future evapotranspiration shortages. Observations from 187 catchments were examined with this model for a 21-year period from 1983 to 2003. After calibration and validation using a genetic algorithm, the results show that catchments hedge against future evapotranspiration shortages, and that the hedging effect is more abundant in dry areas than in wet areas.
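The linear two-point hedging rule at the heart of the second stage can be written compactly: below the starting availability y1 everything available is allocated, above the ending availability y2 the full demand is met, and in between the allocation ramps linearly. A sketch under those assumptions (the symbols y1 and y2 follow the abstract; the explicit demand term is our addition for illustration):

```python
def two_point_hedging(available, y1, y2, demand):
    """Linear two-point hedging: allocation as a function of water availability."""
    if available <= y1:
        return available                      # severe shortage: allocate everything
    if available >= y2:
        return demand                         # ample water: meet full demand
    # Linear ramp between the points (y1, y1) and (y2, demand)
    frac = (available - y1) / (y2 - y1)
    return y1 + frac * (demand - y1)

for w in (20, 60, 120, 200):
    print(w, "->", round(two_point_hedging(w, y1=40, y2=150, demand=100), 1))
```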
Validation metrics for turbulent plasma transport
Holland, C.
2016-06-22
Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.
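One common form of the local, uncertainty-aware comparison advocated here is a normalized discrepancy that weighs the simulation-experiment difference against the combined uncertainties. A sketch of such a metric follows; this particular form is a generic choice, not necessarily the one used in the DIII-D study, and the flux values are hypothetical:

```python
import math

def normalized_discrepancy(sim, sim_err, exp, exp_err):
    """Per-channel discrepancy in units of combined uncertainty; values
    near or below 1 indicate agreement within error bars."""
    return abs(sim - exp) / math.sqrt(sim_err ** 2 + exp_err ** 2)

# Hypothetical heat-flux comparison at three radial points (arbitrary units)
channels = [(1.2, 0.2, 1.0, 0.15), (2.4, 0.4, 3.1, 0.3), (0.8, 0.1, 0.85, 0.1)]
d = [normalized_discrepancy(*c) for c in channels]
print([round(x, 2) for x in d], "mean:", round(sum(d) / len(d), 2))
```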
Trujillo-Orrego, N; Pineda, D A; Uribe, L H
2012-03-01
The diagnostic criteria for attention deficit hyperactivity disorder (ADHD) were defined by the American Psychiatric Association in the Diagnostic and Statistical Manual of Mental Disorders, fourth edition (DSM-IV), and by the World Health Organization in the ICD-10. The American Psychiatric Association used an internal validity analysis to select specific behavioral symptoms associated with the disorder and to build five cross-cultural criteria for its use in categorical diagnosis. The DSM has been used by clinicians and researchers as a valid and stable approach since 1968. We conducted a systematic review of the scientific literature in Spanish and English, aimed at identifying the historical origin that supports ADHD as a psychiatric construct. This comprehensive review explores the concepts of minimal brain dysfunction, hyperactivity, inattention and impulsivity from 1932 to 2011. The paper summarizes all the DSM versions that include the definition of ADHD or its equivalent, and it points out the statistical and methodological approaches implemented for defining ADHD as a valid epidemiological and psychometric construct. Finally, the paper discusses some considerations and suggestions for new versions of the manual.
Band-edge positions in GW: Effects of starting point and self-consistency
NASA Astrophysics Data System (ADS)
Chen, Wei; Pasquarello, Alfredo
2014-10-01
We study the effect of starting point and self-consistency within GW on the band-edge positions of semiconductors and insulators. Compared to calculations based on a semilocal starting point, the use of a hybrid-functional starting point shows a larger quasiparticle correction for both band-edge states. When the self-consistent treatment is employed, the band-gap opening is found to result mostly from a shift of the valence-band edge. Within the non-self-consistent methods, we analyse the performance of empirical and nonempirical schemes in which the starting point is optimally tuned. We further assess the accuracy of the band-edge positions through the calculation of ionization potentials of surfaces. The ionization potentials for most systems are reasonably well described by one-shot calculations. However, in the case of TiO2, we find that the use of self-consistency is critical to obtain good agreement with experiment.
Reflections on self in relation to other: core community values of a moral/ethical foundation.
Rosa, William
2014-01-01
One of the first steps toward reaffirming the core community values of nursing as we see, feel, hear, and acknowledge them is the awareness of a moral/ethical foundation that preserves, promotes, and protects human dignity. This foundation serves as a starting point and evolutionary path for education, research, and practice (Watson, 2008). Nursing-specific malignancies of compassion fatigue, burnout, moral distress, and nurse-to-nurse bullying can metastasize throughout nursing communities in which caring environments are not nourished as priorities and starting points for being, doing, knowing, and belonging. An understanding that we all participate in holographic membership results in an ethical display of moral empathy, so that the complexities of nursing can be articulated and validated in safe environments. In addition, preparing for our deaths in a way that celebrates and honors life may potentially lead to peaceful relationships with self, other, and the community as a whole. The nature of such a community implies that nurses are invested in ensuring the integrity of the human experience, will serve as advocates of ethical/moral engagement, and are the embodiment of the sacred, if we so choose to honor it.
One-loop transition amplitudes in the D1D5 CFT
NASA Astrophysics Data System (ADS)
Carson, Zaq; Hampton, Shaun; Mathur, Samir D.
2017-01-01
We consider the issue of thermalization in the D1D5 CFT. Thermalization is expected to correspond to the formation of a black hole in the dual gravity theory. We start from the orbifold point, where the theory is essentially free, and does not thermalize. In earlier work it was noted that there was no clear thermalization effect when the theory was deformed off the orbifold point to first order in the relevant twist perturbation. In this paper we consider the deformation to second order in the twist, where we do find effects that can cause thermalization of an initial perturbation. We consider a 1-loop process where two untwisted copies of the CFT are twisted to one copy and then again untwisted to two copies. We start with a single oscillator excitation on the initial CFT, and compute the effect of the two twists on this state. We find simple approximate expressions for the Bogoliubov coefficients and the behavior of the single oscillator excitation in the continuum limit, where the mode numbers involved are taken to be much larger than unity. We also prove a number of useful relationships valid for processes with an arbitrary number of twist insertions.
Hrast, Martina; Turk, Samo; Sosič, Izidor; Knez, Damijan; Randall, Christopher P; Barreteau, Hélène; Contreras-Martel, Carlos; Dessen, Andréa; O'Neill, Alex J; Mengin-Lecreulx, Dominique; Blanot, Didier; Gobec, Stanislav
2013-08-01
Peptidoglycan is an essential component of the bacterial cell wall, and enzymes involved in its biosynthesis represent validated targets for antibacterial drug discovery. MurF catalyzes the final intracellular peptidoglycan biosynthesis step: the addition of D-Ala-D-Ala to the nucleotide precursor UDP-MurNAc-L-Ala-γ-D-Glu-meso-DAP (or L-Lys). As MurF has no human counterpart, it represents an attractive target for the development of new antibacterial drugs. Using recently published cyanothiophene inhibitors of MurF from Streptococcus pneumoniae as a starting point, we designed and synthesized a series of structurally related derivatives and investigated their inhibition of MurF enzymes from different bacterial species. Systematic structural modifications of the parent compounds resulted in a series of nanomolar inhibitors of MurF from S. pneumoniae and micromolar inhibitors of MurF from Escherichia coli and Staphylococcus aureus. Some of the inhibitors also show antibacterial activity against S. pneumoniae R6. These findings, together with two new co-crystal structures, represent an excellent starting point for further optimization toward effective novel antibacterials. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
An optimization program based on the method of feasible directions: Theory and users guide
NASA Technical Reports Server (NTRS)
Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.
1994-01-01
The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed with a focus on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
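The direction-finding subproblem in a feasible-directions method is itself a small optimization: find a direction d that decreases the objective while not violating active constraints. The sketch below uses the classical Zoutendijk linear-program formulation with SciPy rather than the interior-point quadratic program the report's code implements, so it illustrates the idea, not that code:

```python
import numpy as np
from scipy.optimize import linprog

def feasible_direction(grad_f, active_grads):
    """Solve: min theta  s.t.  grad_f.d <= theta,  grad_g_j.d <= theta
    for each active constraint j, with -1 <= d_i <= 1.
    A negative optimal theta yields a usable feasible direction d."""
    n = len(grad_f)
    # Variables: [d_1..d_n, theta]; objective: minimize theta
    c = np.zeros(n + 1)
    c[-1] = 1.0
    rows = [np.append(grad_f, -1.0)]
    rows += [np.append(g, -1.0) for g in active_grads]
    res = linprog(c, A_ub=np.array(rows), b_ub=np.zeros(len(rows)),
                  bounds=[(-1, 1)] * n + [(None, None)])
    return res.x[:n], res.x[-1]

# Toy example: one objective gradient, one active constraint gradient
d, theta = feasible_direction(np.array([1.0, 2.0]), [np.array([-1.0, 0.5])])
print("direction:", d.round(3), "theta:", round(theta, 3))
```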
Goodenough, Christopher J; Ko, Tien C; Kao, Lillian S; Nguyen, Mylan T; Holihan, Julie L; Alawadi, Zeinab; Nguyen, Duyen H; Flores, Juan R; Arita, Nestor T; Roth, J Scott; Liang, Mike K
2015-04-01
Ventral incisional hernias (VIH) develop in up to 20% of patients after abdominal surgery. No widely applicable preoperative risk-assessment tool exists. We aimed to develop and validate a risk-assessment tool to predict VIH after abdominal surgery. A prospective study of all patients undergoing abdominal surgery was conducted at a single institution from 2008 to 2010. Variables were defined in accordance with the National Surgical Quality Improvement Project, and VIH was determined through clinical and radiographic evaluation. A multivariate Cox proportional hazard model was built from a development cohort (2008 to 2009) to identify predictors of VIH. The HERNIAscore was created by converting the hazard ratios (HR) to points. The predictive accuracy was assessed on the validation cohort (2010) using a receiver operating characteristic curve and calculating the area under the curve (AUC). Of 625 patients followed for a median of 41 months (range 0.3 to 64 months), 93 (13.9%) developed a VIH. The training cohort (n = 428, VIH = 70, 16.4%) identified 4 independent predictors: laparotomy (HR 4.77, 95% CI 2.61 to 8.70) or hand-assisted laparoscopy (HAL, HR 4.00, 95% CI 2.08 to 7.70), COPD (HR 2.35, 95% CI 1.44 to 3.83), and BMI ≥ 25 kg/m² (HR 1.74, 95% CI 1.04 to 2.91). Factors that were not predictive included age, sex, American Society of Anesthesiologists (ASA) score, albumin, immunosuppression, previous surgery, and suture material or technique. The predictive score had an AUC = 0.77 (95% CI 0.68 to 0.86) using the validation cohort (n = 197, VIH = 23, 11.6%). Using the HERNIAscore, defined as HERNIAscore = 4 × Laparotomy + 3 × HAL + 1 × COPD + 1 × (BMI ≥ 25), 3 classes stratified the risk of VIH: class I (0 to 3 points), 5.2%; class II (4 to 5 points), 19.6%; and class III (≥6 points), 55.0%. The HERNIAscore accurately identifies patients at increased risk for VIH. Although external validation is needed, this provides a starting point to counsel patients and guide clinical decisions. Increasing the use of laparoscopy, weight-loss programs, community smoking prevention programs, and incisional reinforcement may help reduce rates of VIH. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
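The published score translates directly into a few lines of code. A sketch using the weights and class cut-points from the abstract (the class III lower bound is 6, the maximum attainable score being 9; the function names are ours):

```python
def hernia_score(laparotomy: bool, hand_assisted_lap: bool,
                 copd: bool, bmi_ge_25: bool) -> int:
    """HERNIAscore = 4*Laparotomy + 3*HAL + 1*COPD + 1*(BMI >= 25)."""
    return 4 * laparotomy + 3 * hand_assisted_lap + 1 * copd + 1 * bmi_ge_25

def hernia_risk_class(score: int) -> str:
    if score <= 3:
        return "class I (~5.2% VIH risk)"
    if score <= 5:
        return "class II (~19.6% VIH risk)"
    return "class III (~55.0% VIH risk)"

s = hernia_score(laparotomy=True, hand_assisted_lap=False, copd=True, bmi_ge_25=True)
print(s, hernia_risk_class(s))  # 6 -> class III
```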
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-01-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis. PMID:20011037
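A simulation in the spirit of this study can be sketched as follows: each cluster's prevalence is jittered around the true value to induce intracluster correlation, and a lot is classified by comparing the number of cases found against a decision rule. All parameter values here are illustrative stand-ins, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lqas(p_true, n_clusters=67, m=3, icc_spread=0.10,
                  decision_threshold=25, n_sims=10_000):
    """Estimate how often a 67x3 cluster LQAS design classifies a lot
    as 'high GAM prevalence' (cases > threshold)."""
    classify_high = 0
    for _ in range(n_sims):
        cluster_p = np.clip(rng.normal(p_true, icc_spread, n_clusters), 0, 1)
        cases = rng.binomial(m, cluster_p).sum()   # m children per cluster
        classify_high += cases > decision_threshold
    return classify_high / n_sims

print(simulate_lqas(p_true=0.10))  # classification rate at 10% true prevalence
```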
O'Shea, Laura E; Picchioni, Marco M; Dickens, Geoffrey L
2016-04-01
The Short-Term Assessment of Risk and Treatability (START) aims to assist mental health practitioners to estimate an individual's short-term risk for a range of adverse outcomes via structured consideration of their risk ("Vulnerabilities") and protective factors ("Strengths") in 20 areas. It has demonstrated predictive validity for aggression but this is less established for other outcomes. We collated START assessments for N = 200 adults in a secure mental health hospital and ascertained 3-month risk event incidence using the START Outcomes Scale. The specific risk estimates, which are the tool developers' suggested method of overall assessment, predicted aggression, self-harm/suicidality, and victimization, and had incremental validity over the Strength and Vulnerability scales for these outcomes. The Strength scale had incremental validity over the Vulnerability scale for aggressive outcomes; therefore, consideration of protective factors had demonstrable value in their prediction. Further evidence is required to support use of the START for the full range of outcomes it aims to predict. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Jun, LIU; Huang, Wei; Hongjie, Fan
2016-02-01
A novel method for finding the initial structure parameters of an optical system via the genetic algorithm (GA) is proposed in this research. Usually, optical designers start their designs from commonly used structures in a patent database; however, it is time-consuming to modify the patented structures to meet the specifications. A high-performance design result largely depends on the choice of the starting point. Accordingly, it would be highly desirable to be able to calculate the initial structure parameters automatically. In this paper, a method that combines a genetic algorithm and aberration analysis is used to determine an appropriate initial structure of an optical system. We use a three-mirror system as an example to demonstrate the validity and reliability of this method. On-axis and off-axis telecentric three-mirror systems are obtained based on this method.
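The overall logic, encoding structure parameters as genes and evolving them against an aberration-based merit function, can be sketched generically. The merit function below is a stand-in for the paper's aberration analysis, and all GA settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def merit(params):
    # Stand-in for an aberration-based merit function of the mirror
    # radii and separations; lower is better.
    return np.sum((params - np.array([1.0, -2.0, 0.5])) ** 2)

def genetic_search(lo, hi, pop=40, gens=100, mut=0.1):
    population = rng.uniform(lo, hi, size=(pop, lo.size))
    for _ in range(gens):
        scores = np.apply_along_axis(merit, 1, population)
        parents = population[np.argsort(scores)[: pop // 2]]      # selection
        children = (parents[rng.integers(0, pop // 2, pop // 2)]
                    + rng.normal(0, mut, (pop // 2, lo.size)))    # mutation
        population = np.vstack([parents, children])
    return population[np.argmin(np.apply_along_axis(merit, 1, population))]

best = genetic_search(np.array([-5., -5., -5.]), np.array([5., 5., 5.]))
print(best)  # candidate initial structure parameters for further optimization
```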
On the stochastic dissemination of faults in an admissible network
NASA Technical Reports Server (NTRS)
Kyrala, A.
1987-01-01
The dynamic distribution of faults in a general type of network is discussed. The starting point is a uniquely branched network in which each pair of nodes is connected by a single branch. Mathematical expressions for the uniquely branched network transition matrix are derived to show that sufficient stationarity exists to ensure the validity of using a Markov chain model to analyze networks. In addition, the conditions for the use of semi-Markov models are discussed. General mathematical expressions are derived in an examination of branch redundancy techniques commonly used to increase reliability.
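The Markov-chain analysis referred to above boils down to propagating a state distribution through a transition matrix. A minimal sketch with an illustrative three-state fault model (the states and probabilities are invented for illustration):

```python
import numpy as np

# Illustrative states: 0 = healthy, 1 = degraded, 2 = failed.
# P[i, j] is the one-step probability of moving from state i to state j.
P = np.array([[0.95, 0.04, 0.01],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])

state = np.array([1.0, 0.0, 0.0])   # start fully healthy
for _ in range(50):
    state = state @ P               # stationarity assumed, per the abstract
print(state)                        # long-run fault distribution after 50 steps
```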
Explicitly computing geodetic coordinates from Cartesian coordinates
NASA Astrophysics Data System (ADS)
Zeng, Huaien
2013-04-01
This paper presents a new form of quartic equation based on Lagrange's extremum law and a Groebner basis, under the constraint that the geodetic height is the shortest distance between a given point and the reference ellipsoid. A very explicit and concise formula for the quartic equation, derived by Ferrari's method, is found, which avoids the need for the good starting guess required by iterative methods. A new explicit algorithm is then proposed to compute geodetic coordinates from Cartesian coordinates. The convergence region of the algorithm is investigated and the corresponding correct solution is given. Lastly, the algorithm is validated with numerical experiments.
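The paper's coefficients are not reproduced in the abstract, so the sketch below only illustrates the numerical building block: a closed-form Ferrari-style solution can be checked against the roots of the same degree-4 polynomial extracted numerically. The coefficients here are hypothetical placeholders, not the geodetic-height quartic itself.

```python
import numpy as np

# Hypothetical quartic a4*t^4 + a3*t^3 + a2*t^2 + a1*t + a0 = 0,
# standing in for the geodetic-height equation of the paper.
coeffs = [1.0, -3.0, 2.5, -0.7, 0.05]          # highest degree first
roots = np.roots(coeffs)
real_roots = roots[np.isclose(roots.imag, 0)].real
print(real_roots)  # candidate solutions; the physically valid root is selected
```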
Morizot, Julien
2014-10-01
While there are a number of short personality trait measures that have been validated for use with adults, few are specifically validated for use with adolescents. To trust such measures, it must be demonstrated that they have adequate construct validity. According to the view of construct validity as a unifying form of validity requiring the integration of different complementary sources of information, this article reports the evaluation of content, factor, convergent, and criterion validities as well as reliability of adolescents' self-reported personality traits. Moreover, this study sought to address an inherent potential limitation of short personality trait measures, namely their limited conceptual breadth. In this study, starting with items from a known measure, after the language-level was adjusted for use with adolescents, items tapping fundamental primary traits were added to determine the impact of added conceptual breadth on the psychometric properties of the scales. The resulting new measure was named the Big Five Personality Trait Short Questionnaire (BFPTSQ). A group of expert judges considered the items to have adequate content validity. Using data from a community sample of early adolescents, the results confirmed the factor validity of the Big Five structure in adolescence as well as its measurement invariance across genders. More important, the added items did improve the convergent and criterion validities of the scales, but did not negatively affect their reliability. This study supports the construct validity of adolescents' self-reported personality traits and points to the importance of conceptual breadth in short personality measures. © The Author(s) 2014.
Calderón, Félix; Barros, David; Bueno, José María; Coterón, José Miguel; Fernández, Esther; Gamo, Francisco Javier; Lavandera, José Luís; León, María Luisa; Macdonald, Simon J F; Mallo, Araceli; Manzano, Pilar; Porras, Esther; Fiandor, José María; Castro, Julia
2011-10-13
In 2010, GlaxoSmithKline published the structures of 13533 chemical starting points for antimalarial lead identification. By using an agglomerative structural clustering technique followed by computational filters such as antimalarial activity, physicochemical properties, and dissimilarity to known antimalarial structures, we have identified 47 starting points for lead optimization. Their structures are provided. We invite potential collaborators to work with us to discover new clinical candidates.
Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle
NASA Technical Reports Server (NTRS)
Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.
2004-01-01
This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes the first two steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is the standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is the development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
Modeling the Object-Oriented Space Through Validated Measures
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
In order to truly understand software and the software development process, software measurement must be better understood. A beginning step toward a better understanding of software measurement is the categorization of the measurements by some meaningful taxonomy. The most meaningful taxonomy would capture the basic nature of the object-oriented (O-O) space. The interesting characteristics of object-oriented software offer a starting point for such a categorization of measures. A taxonomy has been developed based on fourteen characteristics of object-oriented software gathered from the literature. This taxonomy allows us to easily see gaps and redundancies in the O-O measures. The taxonomy also clearly differentiates among taxa so that there is no ambiguity as to the taxon to which a measure belongs. The taxonomy has been populated with thirty-two measures that have been validated in the narrow sense of Fenton, using measurement theory with Zuse's augmentation.
Amato, Anastasia; Lucas, Xavier; Bortoluzzi, Alessio; Wright, David; Ciulli, Alessio
2018-04-20
Plant homeodomain (PHD) zinc fingers are histone reader domains that are often associated with human diseases. Despite this, they constitute a poorly targeted class of readers, suggesting low ligandability. Here, we describe a successful fragment-based campaign targeting PHD fingers from the proteins BAZ2A and BAZ2B as model systems. We validated a pool of in silico fragments both biophysically and structurally and solved the first crystal structures of PHD zinc fingers in complex with fragments bound to an anchoring pocket at the histone binding site. The best-validated hits were found to displace a histone H3 tail peptide in competition assays. This work identifies new chemical scaffolds that provide suitable starting points for future ligand optimization using structure-guided approaches. The demonstrated ligandability of the PHD reader domains could pave the way for the development of chemical probes to drug this family of epigenetic readers.
Validation metrics for turbulent plasma transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, C., E-mail: chholland@ucsd.edu
Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations are reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. The utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)], as part of a multi-year transport model validation activity.
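A common building block for such metrics is a discrepancy normalized by the combined simulation and experimental uncertainties. The sketch below shows that idea in isolation; it is a generic construction under that assumption, not the specific metric developed in the paper, and the flux values are invented.

```python
import numpy as np

def normalized_discrepancy(sim, sim_err, obs, obs_err):
    """Per-channel discrepancy in units of combined uncertainty.
    Values near or below 1 indicate agreement within errors."""
    return np.abs(sim - obs) / np.sqrt(sim_err**2 + obs_err**2)

# Illustrative local fluxes (arbitrary units) with uncertainties
sim = np.array([1.2, 0.8, 2.1]); sim_err = np.array([0.2, 0.1, 0.4])
obs = np.array([1.0, 0.9, 3.0]); obs_err = np.array([0.3, 0.1, 0.5])
print(normalized_discrepancy(sim, sim_err, obs, obs_err))
```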
Validation of a novel virtual reality simulator for robotic surgery.
Schreuder, Henk W R; Persson, Jan E U; Wolswijk, Richard G H; Ihse, Ingmar; Schijven, Marlies P; Verheijen, René H M
2014-01-01
With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for use in training for robot-assisted surgery. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were "time to complete" and "economy of motion" (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, the visual graphics, the movements of instruments, the interaction with objects, and the depth perception were all rated as realistic. The simulator is considered to be a very useful training tool for residents and medical specialists starting with robotic surgery. Face and construct validity of the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery.
WEC-SIM Validation Testing Plan FY14 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley Michelle
2016-02-01
The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and formed working ideas on testing details such as scaling, device design, and test conditions.
One-loop transition amplitudes in the D1D5 CFT
Carson, Zaq; Hampton, Shaun; Mathur, Samir D.
2017-01-02
We consider the issue of thermalization in the D1D5 CFT. Thermalization is expected to correspond to the formation of a black hole in the dual gravity theory. We start from the orbifold point, where the theory is essentially free, and does not thermalize. In earlier work it was noted that there was no clear thermalization effect when the theory was deformed off the orbifold point to first order in the relevant twist perturbation. In this paper we consider the deformation to second order in the twist, where we do find effects that can cause thermalization of an initial perturbation. We consider a 1-loop process where two untwisted copies of the CFT are twisted to one copy and then again untwisted to two copies. We start with a single oscillator excitation on the initial CFT, and compute the effect of the two twists on this state. We find simple approximate expressions for the Bogoliubov coefficients and the behavior of the single oscillator excitation in the continuum limit, where the mode numbers involved are taken to be much larger than unity. We also prove a number of useful relationships valid for processes with an arbitrary number of twist insertions.
Concept mapping as an approach for expert-guided model building: The example of health literacy.
Soellner, Renate; Lenartz, Norbert; Rudinger, Georg
2017-02-01
Concept mapping served as the starting point for the aim of capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in a subsequent cluster analysis and resulted in a concept map of nine "clusters": (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. Subsequently, this concept map served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analyses (EFA) and then cross-validating the model with confirmatory factor analyses (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world". Copyright © 2016 Elsevier Ltd. All rights reserved.
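The analysis pipeline described, multidimensional scaling of card-sort data followed by clustering, can be sketched with scikit-learn. The co-occurrence matrix below is random stand-in data; in the study it would record how often each pair of the 105 statements was sorted together.

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Stand-in co-occurrence of 105 statements, converted to dissimilarities
co = rng.random((105, 105))
co = (co + co.T) / 2                      # symmetrize
dissimilarity = 1.0 - co
np.fill_diagonal(dissimilarity, 0.0)

coords = MDS(n_components=3, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(coords)
print(np.bincount(labels))   # statements per cluster, cf. the nine clusters above
```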
Displacement effect of the laminar boundary layer and the pressure drag
NASA Technical Reports Server (NTRS)
Gortler, H
1951-01-01
The displacement effect of the boundary layer on the outer frictionless flow is discussed for both steady and unsteady flows. The analysis is restricted to cases in which the potential flow pressure distribution remains valid for the boundary-layer calculation. Formulas are given for the dependence of the pressure drag, friction drag, and total drag of circular cylinders on the time from the start of motion for cases in which the velocity varies as a power of the time. Formulas for the locations and for the time of appearance of the separation point are given for two-dimensional bodies of arbitrary shape.
Behavioral Economics Applied to Energy Demand Analysis: A Foundation
2014-01-01
Neoclassical economics has shaped our understanding of human behavior for several decades. While still an important starting point for economic studies, neoclassical frameworks have generally imposed strong assumptions, for example regarding utility maximization, information, and foresight, while treating consumer preferences as given or external to the framework. In real life, however, such strong assumptions tend to be less than fully valid. Behavioral economics refers to the study and formalizing of theories regarding deviations from traditionally-modeled economic decision-making in the behavior of individuals. The U.S. Energy Information Administration (EIA) has an interest in behavioral economics as one influence on energy demand.
A starting point of an integrated optics concept for a space-based interferometer
NASA Astrophysics Data System (ADS)
Labadie, Lucas; Kern, Pierre; Schanen, Isabelle
2017-11-01
This article deals with the instrumentation challenges of the stellar interferometry mission IRSI-Darwin of the European Space Agency. The need for a reliable, high-performance beam-recombination system has highlighted the advantages of an integrated optics solution, which is already in use for ground-based interferometry in the near infrared. However, since Darwin will operate in the mid infrared, the integrated optics concept must be extended to this spectral range. This paper presents the guidelines of the characterization work that should validate a new integrated optics concept for the mid infrared. We also present one example of a characterization experiment we are working on.
Phase diagram of two-dimensional hard rods from fundamental mixed measure density functional theory
NASA Astrophysics Data System (ADS)
Wittmann, René; Sitta, Christoph E.; Smallenburg, Frank; Löwen, Hartmut
2017-10-01
A density functional theory for the bulk phase diagram of two-dimensional orientable hard rods is proposed and tested against Monte Carlo computer simulation data. In detail, an explicit density functional is derived from fundamental mixed measure theory and freely minimized numerically for hard discorectangles. The phase diagram, which involves stable isotropic, nematic, smectic, and crystalline phases, is obtained and shows good agreement with the simulation data. Our functional is valid for a multicomponent mixture of hard particles with arbitrary convex shapes and provides a reliable starting point to explore various inhomogeneous situations of two-dimensional hard rods and their Brownian dynamics.
ENKI - A tool for analysing the learning efficiency
NASA Astrophysics Data System (ADS)
Simona, Dudáková; Boris, Lacsný; Aba, Teleki
2017-01-01
Long-term memory plays a crucial role in learning mechanisms. We started to build a probability model of learning (ENKI) ten years ago, based on the findings of microgenetics published in [1]. We have carried out a number of experiments in our department that successfully tested the validity of the model. We describe ENKI in detail here, giving the general mathematical formula of the learning curve. This paper points out that the ENKI model can detect the brain's own strategy of learning, as well as simulate the process of learning, which will lead to the further development of the method.
Tanaka, Kanji; Watanabe, Katsumi
2016-02-01
The present study examined whether sequence learning led to more accurate and shorter performance time if people who are learning a sequence start over from the beginning when they make an error (i.e., practice the whole sequence) or only from the point of error (i.e., practice a part of the sequence). We used a visuomotor sequence learning paradigm with a trial-and-error procedure. In Experiment 1, we found fewer errors, and shorter performance time for those who restarted their performance from the beginning of the sequence as compared to those who restarted from the point at which an error occurred, indicating better learning of spatial and motor representations of the sequence. This might be because the learned elements were repeated when the next performance started over from the beginning. In subsequent experiments, we increased the occasions for the repetitions of learned elements by modulating the number of fresh start points in the sequence after errors. The results showed that fewer fresh start points were likely to lead to fewer errors and shorter performance time, indicating that the repetitions of learned elements enabled participants to develop stronger spatial and motor representations of the sequence. Thus, a single or two fresh start points in the sequence (i.e., starting over only from the beginning or from the beginning or midpoint of the sequence after errors) is likely to lead to more accurate and faster performance. Copyright © 2016 Elsevier B.V. All rights reserved.
Application of change-point problem to the detection of plant patches.
López, I; Gámez, M; Garay, J; Standovár, T; Varga, Z
2010-03-01
In ecology, if the considered area or space is large, the spatial distribution of individuals of a given plant species is never homogeneous; plants form distinct patches. The homogeneity change in space or in time (in particular, the related change-point problem) is an important research subject in mathematical statistics. In this paper, for a given data system along a straight line, two areas are considered, where the data of each area come from different discrete distributions with unknown parameters. A method is presented for estimating the distribution change-point between the two areas, and an estimate is given for the distributions separated by the obtained change-point. The solution of this problem is based on the maximum likelihood method. Furthermore, based on an adaptation of the well-known bootstrap resampling, a method for estimating the so-called change-interval is also given. The latter approach is very general, since it not only applies to the maximum-likelihood estimation of the change-point, but can also be used starting from any other change-point estimate known in the ecological literature. The proposed model is validated against typical ecological situations, providing at the same time a verification of the applied algorithms.
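The maximum-likelihood change-point estimate can be sketched for the simplest discrete case: counts that are Poisson with one mean before the change and another after. The bootstrap change-interval then follows by resampling within the estimated segments and re-estimating. All parameters here are illustrative, and the Poisson assumption is ours.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)

def ml_change_point(counts):
    """Return the split index k maximizing the two-segment Poisson likelihood."""
    counts = np.asarray(counts)
    best_k, best_ll = 1, -np.inf
    for k in range(1, len(counts)):
        left, right = counts[:k], counts[k:]
        ll = (poisson.logpmf(left, left.mean() + 1e-9).sum()
              + poisson.logpmf(right, right.mean() + 1e-9).sum())
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# Two simulated patches with different plant densities along a transect
data = np.concatenate([rng.poisson(2.0, 60), rng.poisson(6.0, 40)])
k_hat = ml_change_point(data)

# Bootstrap change-interval: resample within each estimated segment
boot = [ml_change_point(np.concatenate([rng.choice(data[:k_hat], k_hat),
                                        rng.choice(data[k_hat:], data.size - k_hat)]))
        for _ in range(200)]
print(k_hat, np.percentile(boot, [2.5, 97.5]))  # point estimate and change-interval
```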
Is there inter-procedural transfer of skills in intraocular surgery? A randomized controlled trial.
Thomsen, Ann Sofia Skou; Kiilgaard, Jens Folke; la Cour, Morten; Brydges, Ryan; Konge, Lars
2017-12-01
To investigate how experience in simulated cataract surgery impacts and transfers to the learning curves for novices in vitreoretinal surgery. Twelve ophthalmology residents without previous experience in intraocular surgery were randomized to (1) intensive training in cataract surgery on a virtual-reality simulator until passing a test with predefined validity evidence (cataract trainees) or to (2) no cataract surgery training (novices). Possible skill transfer was assessed using a test consisting of all 11 vitreoretinal modules on the EyeSi virtual-reality simulator. All participants repeated the test of vitreoretinal surgical skills until their performance curve plateaued. Three experienced vitreoretinal surgeons also performed the test to establish validity evidence. Analysis with independent samples t-tests was performed. The vitreoretinal test on the EyeSi simulator demonstrated evidence of validity, given statistically significant differences in mean test scores for the first repetition; experienced surgeons scored higher than novices (p = 0.023) and cataract trainees (p = 0.003). Internal consistency for the 11 modules of the test was acceptable (Cronbach's α = 0.73). Our findings did not indicate a transfer effect with no significant differences found between cataract trainees and novices in their starting scores (mean ± SD 381 ± 129 points versus 455 ± 82 points, p = 0.262), time to reach maximum performance level (10.7 ± 3.0 hr versus 8.7 ± 2.8 hr, p = 0.265), or maximum scores (785 ± 162 points versus 805 ± 73 points, p = 0.791). Pretraining in cataract surgery did not demonstrate any measurable effect on vitreoretinal procedural performance. The results of this study indicate that we should not anticipate extensive transfer of surgical skills when planning training programmes in intraocular surgery. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.
Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C
2015-01-01
Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.
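The generation step, vital signs drawn from regression models plus residual noise, can be sketched as follows. The coefficients and noise levels below are invented placeholders, not those fitted by the authors from their clinical database.

```python
import numpy as np

rng = np.random.default_rng(4)

def baseline_vitals(age_years: float, n_samples: int = 10):
    """Draw heart rate (bpm) and respiratory rate (breaths/min) from
    illustrative linear models plus Gaussian residual noise."""
    hr_mean = 80.0 - 0.15 * age_years     # placeholder regression coefficients
    rr_mean = 16.0 - 0.02 * age_years
    hr = rng.normal(hr_mean, 5.0, n_samples)
    rr = rng.normal(rr_mean, 1.5, n_samples)
    return hr, rr

hr, rr = baseline_vitals(age_years=45)
print(hr.round(1), rr.round(1))  # a synthetic vital-sign stream for MCPS testing
```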
A learning progression based teaching module on the causes of seasons
NASA Astrophysics Data System (ADS)
Galano, S.
2016-03-01
In this paper, we report on the design and validation of a teaching-learning module based on a learning progression and focused on the causes of seasons. An initial learning progression for the Celestial Motion big idea (causes of seasons, lunar and solar eclipses, and Moon phases) was developed and validated. Existing curricula, research studies on alternative conceptions about these phenomena, and students' answers to an open questionnaire were the starting points for developing the initial learning progression; a two-tier multiple-choice questionnaire was then designed to validate and improve it. The questionnaire was submitted to about 300 secondary-school students, whose answers were used to revise the hypothesized learning progression. This improved version of the learning progression was used to design a module focused on the causes of seasons, in which students were engaged in quantitative measurements with a photovoltaic panel to explain changes in the flow of the Sun's rays on the Earth's surface over the year. The efficacy of our module in improving students' understanding of the phenomenon of the seasons was tested using our questionnaire as pre- and post-test.
The intelligent OR: design and validation of a context-aware surgical working environment.
Franke, Stefan; Rockstroh, Max; Hofer, Mathias; Neumuth, Thomas
2018-05-24
Standards-based interoperability of medical devices is starting to become established in the operating room (OR). Devices share their data and control functionalities. Yet OR technology rarely implements cooperative, intelligent behavior, especially in terms of active cooperation with the OR team. Technical context-awareness will be an essential feature of the next generation of medical devices to address the increasing demands on clinicians in information seeking, decision making, and human-machine interaction in complex surgical working environments. The paper describes the technical validation of an intelligent surgical working environment for endoscopic ear-nose-throat surgery. We briefly summarize the design of our framework for context-aware system behavior in the integrated OR and present example realizations of novel assistance functionalities. In a study on patient phantoms, twenty-four procedures were carried out in the proposed intelligent surgical working environment based on recordings of real interventions. Subsequently, the whole processing pipeline for context-awareness, from workflow recognition to the final system behavior, is analyzed. Rule-based behavior that considers multiple perspectives on the procedure can partially compensate for recognition errors. Considerable robustness could be achieved with a reasonable quality of recognition. Overall, reliable reactive as well as proactive behavior of the surgical working environment can be implemented in the proposed environment. The obtained validation results indicate the suitability of the overall approach. The setup is a reliable starting point for a subsequent evaluation of the proposed context-aware assistance. The major challenge for future work will be to implement this complex approach in a cross-vendor setting.
Dickens, G L; O'Shea, L E
2015-08-01
The Short-Term Assessment of Risk and Treatability (START) is a tool used in some mental health services to assess patients to see if they are at risk of violence, self-harm, self-neglect or victimization. The recommended time between assessments is 3 months but there is currently no evidence to show that this is best practice. We have investigated whether assessing at 1- or 2-month intervals would be more accurate and therefore facilitate more individualized risk management interventions. We found that many patients who were rated as low risk had been involved in risk behaviours before 3 months had passed; some patients who were rated at increased risk did not get involved in risk behaviours at all. Results are mixed for different outcomes but on balance, we think that the recommendation to conduct START assessment every 3 months is supported by the evidence. However, reassessment should be considered if risk behaviours are not prevented and teams should always consider whether risk management practices are too restrictive. The Short-Term Assessment of Risk and Treatability (START) guides assessment of potential adverse outcomes. Assessment is recommended every 3 months but there is no evidence for this interval. We aimed to inform whether earlier reassessment was warranted. We collated START assessments for N = 217 adults in a secure mental health hospital, and subsequent aggressive, self-harm, self-neglect and victimization incidents. We used receiver operating characteristic analysis to assess predictive validity; survival function analysis to examine differences between low-, medium-, and high-risk groups; and hazard function analysis to determine the optimum interval for reassessment. The START predicted aggression and self-harm at 1, 2 and 3 months. At-risk individuals engaged in adverse outcomes earlier than low-risk patients. About half warranted reassessment before 3 months due to engagement in risk behaviour before that point despite a low-risk rating, or because of non-engagement by that point despite an elevated risk rating. Risk assessment should occur at appropriate intervals so that management strategies can be individually tailored. Assessment at 3-month intervals is supported by the evidence. START assessments should be revisited earlier if risk behaviours are not prevented; teams should constantly re-evaluate the need for restrictive practices. © 2015 John Wiley & Sons Ltd.
Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J
2011-04-05
A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for method development were derived from the HPLC conditions of a validated method. These starting conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting it to a gradient method. The resulting method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was thus obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.
Turning Points in Even Start Programs. Occasional Paper #4.
ERIC Educational Resources Information Center
Rasinski, Timothy; Padak, Nancy
To investigate the initial experiences of the various Even Start programs, a project developed a survey that was sent to program coordinators in Ohio. It asked open-ended questions to get descriptions and perceptions of situations that preceded turning point events and the turning point events themselves. Data from eight programs highlighted their…
Murphy, Conor C; Greiner, Kathrin; Plskova, Jarka; Frost, N Andrew; Forrester, John V; Dick, Andrew D
2007-01-01
Aim: To evaluate the responsiveness of the Vision core module 1 (VCM1) vision‐related quality of life (VR‐QOL) questionnaire to changes in visual acuity in patients with posterior and intermediate uveitis, and to validate its use as a clinical end point in uveitis. Methods: Logarithm of the minimum angle of resolution visual acuity and VR‐QOL using the VCM1 questionnaire were prospectively recorded in 37 patients with active posterior segment intraocular inflammation before starting systemic immunosuppression with ciclosporin, tacrolimus or the anti‐tumour necrosis factor (TNF) agent, p55TNFr‐Ig, and again 3 months later. Spearman analysis was used to correlate improvements in visual acuity and VR‐QOL between baseline and 3 months. Results: The correlation between changes in visual acuity and VR‐QOL was moderate to good for the worse eye (r = 0.47, p = 0.003), but poor for the better eye (r = −0.05, p = 0.91). The responsiveness indices effect size and standardised response mean were 0.57 and 0.59, respectively, showing that the VCM1 questionnaire is moderately responsive to immunosuppressive therapy for active uveitis. Conclusion: Changes in VR‐QOL measured with the VCM1 questionnaire correlated moderately well with changes in the worse eye visual acuity, suggesting that the VCM1 is a valid instrument for monitoring response to treatment in uveitis. PMID:16973657
ReSTART: A Novel Framework for Resource-Based Triage in Mass-Casualty Events.
Mills, Alex F; Argon, Nilay T; Ziya, Serhan; Hiestand, Brian; Winslow, James
2014-01-01
Current guidelines for mass-casualty triage do not explicitly use information about resource availability. Even though this limitation has been widely recognized, how it should be addressed remains largely unexplored. The authors present a novel framework developed using operations research methods to account for resource limitations when determining priorities for transportation of critically injured patients. To illustrate how this framework can be used, they also develop two specific example methods, named ReSTART and Simple-ReSTART, both of which extend the widely adopted triage protocol Simple Triage and Rapid Treatment (START) by using a simple calculation to determine priorities based on the relative scarcity of transportation resources. The framework is supported by three techniques from operations research: mathematical analysis, optimization, and discrete-event simulation. The authors' algorithms were developed using mathematical analysis and optimization and then extensively tested using 9,000 discrete-event simulations on three distributions of patient severity (representing low, random, and high acuity). For each incident, the expected number of survivors was calculated under START, ReSTART, and Simple-ReSTART. A web-based decision support tool was constructed to help providers make prioritization decisions in the aftermath of mass-casualty incidents based on ReSTART. In simulations, ReSTART resulted in significantly lower mortality than START regardless of which severity distribution was used (paired t test, p < .01). Mean decrease in critical mortality, the percentage of immediate and delayed patients who die, was 8.5% for the low-acuity distribution (range −2.2% to 21.1%), 9.3% for the random distribution (range −0.2% to 21.2%), and 9.1% for the high-acuity distribution (range −0.7% to 21.1%). Although the critical mortality improvement due to ReSTART was different for each of the three severity distributions, the variation was less than 1 percentage point, indicating that the ReSTART policy is relatively robust to different severity distributions. Taking resource limitations into account in mass-casualty triage has the potential to increase the expected number of survivors. Further validation is required before field implementation; however, the framework proposed here can serve as the foundation for future work in this area.
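The abstract does not give the ReSTART calculation itself, but the underlying operations-research idea, ordering patients by the survival benefit of rapid transport when transport slots are scarce, can be illustrated generically. Everything below, including the survival probabilities, is a hypothetical stand-in and not the published rule.

```python
def transport_plan(patients, transport_slots):
    """Greedy illustration: rank by survival gain from rapid transport,
    then fill however many transport slots are available."""
    ranked = sorted(
        patients,
        key=lambda p: p["p_survive_transported"] - p["p_survive_delayed"],
        reverse=True)
    return ranked[:transport_slots], ranked[transport_slots:]

casualties = [
    {"id": "A", "p_survive_transported": 0.90, "p_survive_delayed": 0.30},
    {"id": "B", "p_survive_transported": 0.60, "p_survive_delayed": 0.50},
    {"id": "C", "p_survive_transported": 0.95, "p_survive_delayed": 0.90},
]
go_now, wait = transport_plan(casualties, transport_slots=1)
print([p["id"] for p in go_now], [p["id"] for p in wait])  # ['A'] ['B', 'C']
```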
The Underrepresentation of African Americans in Army Combat Arms Branches
2014-12-04
a starting point for the Army to determine true causality. This monograph is simply reviewing data and identifying correlation, and based on...correlation, assigning causality based on historical information and scholarly literature. These potential causes are not fact, and provide a starting ...1988 is the starting point for the commissioning statistics. Subject matter experts hypothesized that the number African American officers
Stochastic oscillations in models of epidemics on a network of cities
NASA Astrophysics Data System (ADS)
Rozhnova, G.; Nunes, A.; McKane, A. J.
2011-11-01
We carry out an analytic investigation of stochastic oscillations in a susceptible-infected-recovered model of disease spread on a network of n cities. In the model a fraction fjk of individuals from city k commute to city j, where they may infect, or be infected by, others. Starting from a continuous-time Markov description of the model the deterministic equations, which are valid in the limit when the population of each city is infinite, are recovered. The stochastic fluctuations about the fixed point of these equations are derived by use of the van Kampen system-size expansion. The fixed point structure of the deterministic equations is remarkably simple: A unique nontrivial fixed point always exists and has the feature that the fraction of susceptible, infected, and recovered individuals is the same for each city irrespective of its size. We find that the stochastic fluctuations have an analogously simple dynamics: All oscillations have a single frequency, equal to that found in the one-city case. We interpret this phenomenon in terms of the properties of the spectrum of the matrix of the linear approximation of the deterministic equations at the fixed point.
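The deterministic limit described above is a set of coupled SIR equations for the n cities. A sketch of the commuter coupling follows, assuming frequency-dependent transmission and demographic turnover at rate mu; the exact functional form used in the paper may differ, and all rates are illustrative.

```python
import numpy as np
from scipy.integrate import odeint

beta, gamma, mu = 1.0, 0.1, 0.01     # infection, recovery, turnover (illustrative)
f = np.array([[0.9, 0.2],            # f[j, k]: fraction of city k's residents
              [0.1, 0.8]])           # present in city j (columns sum to 1)

def sir_cities(y, t):
    n = f.shape[0]
    S, I, R = y[:n], y[n:2*n], y[2*n:]
    prev = (f @ I) / (f @ (S + I + R))     # prevalence experienced in each city j
    force = beta * (f.T @ prev)            # force of infection on city k residents
    dS = mu - mu * S - force * S
    dI = force * S - (gamma + mu) * I
    dR = gamma * I - mu * R
    return np.concatenate([dS, dI, dR])

y0 = np.array([0.99, 0.99, 0.01, 0.01, 0.0, 0.0])    # S1 S2 I1 I2 R1 R2
t = np.linspace(0, 2000, 5000)
print(odeint(sir_cities, y0, t)[-1])   # endemic fixed point, equal fractions per city
```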
Brainstorming Themes that Connect Art and Ideas across the Curriculum
ERIC Educational Resources Information Center
Walling, Donovan R.
2006-01-01
Ideas are starting points-for thought, discussion, reading, viewing, writing, and making. The two "brainstorms on paper" presented in this article illustrate how taking an idea and examining it from an artistic point of view can generate thematic starting points to help teachers and students connect the visual arts to ideas that ripple across the…
Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-07-28
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry with a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
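The core step, turning a point cloud into a set of voxel (hexahedral) elements, can be sketched in a few lines. A real survey needs the section-stacking and inner/outer-surface logic described above, which this toy omits; the cloud here is random stand-in data.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map each 3-D point to an integer voxel index; occupied voxels
    become candidate hexahedral finite elements."""
    idx = np.floor((points - points.min(axis=0)) / voxel_size).astype(int)
    return np.unique(idx, axis=0)

cloud = np.random.default_rng(5).random((1000, 3)) * 10.0   # toy scan, metres
elements = voxelize(cloud, voxel_size=0.5)
print(len(elements), "voxel elements")   # ready for meshing / structural analysis
```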
Exact Closed-form Solutions for Lamb's Problem
NASA Astrophysics Data System (ADS)
Feng, Xi; Zhang, Haiming
2018-04-01
In this article, we report on an exact closed-form solution for the displacement at the surface of an elastic half-space elicited by a buried point source that acts at some point underneath that surface. This is commonly referred to as the 3-D Lamb's problem, for which previous solutions were restricted to sources and receivers placed at the free surface. By means of the reciprocity theorem, our solution should also be valid as a means to obtain the displacements at interior points when the source is placed at the free surface. We manage to obtain explicit results by expressing the solution in terms of elementary algebraic expressions as well as elliptic integrals. We anchor our developments on Poisson's ratio 0.25, starting from Johnson's (1974) integral solutions, which must be computed numerically. In the end, our closed-form results agree perfectly with the numerical results of Johnson (1974), which strongly confirms the correctness of our explicit formulas. It is hoped that in due time, these formulas may constitute a valuable canonical solution that will serve as a yardstick against which other numerical solutions can be compared and measured.
Cross Validation on the Equality of UAV-Based and Contour-Based DEMs
NASA Astrophysics Data System (ADS)
Ma, R.; Xu, Z.; Wu, L.; Liu, S.
2018-04-01
Unmanned Aerial Vehicles (UAV) have been widely used for Digital Elevation Model (DEM) generation in geographic applications. This paper proposes a novel framework for generating DEMs from UAV images. It starts with the generation of point clouds by image matching, where the flight control data are used as a reference for searching for the corresponding images, leading to significant time savings. A set of ground control points (GCP) obtained from field surveying is then used to transform the point clouds to the user's coordinate system. Following that, we use a multi-feature supervised classification method to discriminate non-ground points from ground ones. In the end, we generate the DEM by constructing triangular irregular networks and rasterizing. The experiments are conducted in the east of Jilin province, China, which has suffered from soil erosion for several years. The quality of the UAV-based DEM (UAV-DEM) is compared with that generated from contour interpolation (Contour-DEM). The comparison shows the higher resolution, as well as higher accuracy, of the UAV-DEMs, which contain more geographic information. In addition, the RMSE errors of the UAV-DEMs generated from point clouds with and without GCPs are ±0.5 m and ±20 m, respectively.
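The final rasterization step, interpolating classified ground points onto a regular grid, can be sketched with SciPy. The ground/non-ground classification is assumed already done, and the toy terrain below is invented; linear interpolation over the scattered points mimics TIN-based rasterization.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(6)
# Toy stand-in for classified ground points: x, y (m) and elevation z (m)
xy = rng.random((500, 2)) * 100.0
z = 200.0 + 0.1 * xy[:, 0] + rng.normal(0, 0.2, 500)

# Regular DEM grid at 1 m resolution; 'linear' interpolates over a triangulation
gx, gy = np.mgrid[0:100:1.0, 0:100:1.0]
dem = griddata(xy, z, (gx, gy), method="linear")
print(np.nanmin(dem), np.nanmax(dem))   # elevation range of the toy DEM
```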
Hielm-Björkman, Anna K; Kapatkin, Amy S; Rita, Hannu J
2011-05-01
To assess validity and reliability for a visual analogue scale (VAS) used by owners to measure chronic pain in their osteoarthritic dogs. 68, 61, and 34 owners who completed a questionnaire. Owners answered questionnaires at 5 time points. Criterion validity of the VAS was evaluated for all dogs in the intended-to-treat population by correlating scores for the VAS with scores for the validated Helsinki Chronic Pain Index (HCPI) and a relative quality-of-life scale. Intraclass correlation was used to assess repeatability of the pain VAS at 2 baseline evaluations. To determine sensitivity to change and face validity of the VAS, 2 blinded, randomized control groups (17 dogs receiving carprofen and 17 receiving a placebo) were analyzed over time. Significant correlations existed between the VAS score and the quality-of-life scale and HCPI scores. Intraclass coefficient (r = 0.72; 95% confidence interval, 0.57 to 0.82) for the VAS indicated good repeatability. In the carprofen and placebo groups, there was poor correlation between the 2 pain evaluation methods (VAS and HCPI items) at the baseline evaluation, but the correlation improved in the carprofen group over time. No correlation was detected for the placebo group over time. Although valid and reliable, the pain VAS was a poor tool for untrained owners because of poor face validity (ie, owners could not recognize their dogs' behavior as signs of pain). Only after owners had seen pain diminish and then return (after starting and discontinuing NSAID use) did the VAS have face validity.
Identifying the starting point of a spreading process in complex networks.
Comin, Cesar Henrique; Costa, Luciano da Fontoura
2011-11-01
When dealing with the dissemination of epidemics, one important question that can be asked is the location where the contamination began. In this paper, we analyze three spreading schemes and propose and validate an effective methodology for the identification of the source nodes. The method is based on the calculation of the centrality of the nodes on the sampled network, expressed here by degree, betweenness, closeness, and eigenvector centrality. We show that the source node tends to have the highest measurement values. The potential of the methodology is illustrated with respect to three theoretical complex network models as well as a real-world network, the email network of the University Rovira i Virgili.
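The method reduces to computing node centralities on the sampled network and flagging the maximum. A sketch with networkx follows, using the four measures named above on a standard toy graph in place of real contamination data.

```python
import networkx as nx

# Toy sampled network of contaminated nodes (edges are observed contacts)
G = nx.karate_club_graph()

measures = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G),
}
# The candidate source under each measure is the node of maximum centrality
sources = {name: max(c, key=c.get) for name, c in measures.items()}
print(sources)
```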
Science, marketing and wishful thinking in quantitative proteomics.
Hackett, Murray
2008-11-01
In a recent editorial (J. Proteome Res. 2007, 6, 1633) and elsewhere questions have been raised regarding the lack of attention paid to good analytical practice with respect to the reporting of quantitative results in proteomics. Using those comments as a starting point, several issues are discussed that relate to the challenges involved in achieving adequate sampling with MS-based methods in order to generate valid data for large-scale studies. The discussion touches on the relationships that connect sampling depth and the power to detect protein abundance change, conflict of interest, and strategies to overcome bureaucratic obstacles that impede the use of peer-to-peer technologies for transfer and storage of large data files generated in such experiments.
Loureiro, Luiz de França Bahia; de Freitas, Paulo Barbosa
2016-04-01
Badminton requires open and fast actions toward the shuttlecock, but there is no specific agility test for badminton players with sport-specific movements. To develop an agility test that simultaneously assesses perception and motor capacity and examine the test's concurrent and construct validity and its test-retest reliability. The Badcamp agility test consists of running as fast as possible to 6 targets placed at the corners and middle points of a rectangular area (5.6 × 4.2 m) from a start position located at its center, following visual stimuli presented on a luminous panel. The authors recruited 43 badminton players (17-32 y old) to evaluate concurrent (with the shuttle-run agility test--SRAT) and construct validity and test-retest reliability. Results revealed that Badcamp presents concurrent and construct validity, as its performance is strongly related to the SRAT (ρ = 0.83, P < .001), with the performance of experts being better than that of nonexpert players (P < .01). In addition, Badcamp is reliable, as no difference (P = .07) and a high intraclass correlation (ICC = .93) were found in the performance of the players on 2 different occasions. The findings indicate that Badcamp is an effective, valid, and reliable tool to measure agility, allowing coaches and athletic trainers to evaluate players' athletic condition and training effectiveness and possibly detect talented individuals in this sport.
ERIC Educational Resources Information Center
Sekino, Yumiko; Fantuzzo, John
2005-01-01
The study examined the validity of the Child Observation Record (COR). Participants were 242 children, a stratified, random sample of a large, urban Head Start program. Teachers trained to collect COR data provided assessments on the Cognitive, Social Engagement, and Coordinated Movement dimensions of the COR. Outcome data included cognitive and…
A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.
Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin
2016-01-01
The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probing the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are exploited to maximize hit rates and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and production-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick, and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation program, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other target for the identification of good chemical starting points, in combination with either structure-based or ligand-based virtual screening.
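To make the enumeration idea concrete, here is a deliberately tiny sketch of template-based library enumeration in Python. The scaffold string, substituent lists, and string-substitution "chemistry" are invented placeholders; a real implementation would use a cheminformatics toolkit and, as in the paper, similarity-based filtering afterwards.

```python
# Toy sketch of combinatorial enumeration: a scaffold with numbered
# placeholders is expanded with every combination of building blocks.
from itertools import product

scaffold = "c1ccc({R1})cc1{R2}"           # hypothetical SMILES template
r1_blocks = ["F", "Cl", "OC", "N"]        # hypothetical R1 building blocks
r2_blocks = ["C(=O)O", "C#N", "S(=O)(=O)N"]

def enumerate_library(template, r1s, r2s):
    """Yield one product string per combination of R1 and R2 fragments."""
    for r1, r2 in product(r1s, r2s):
        yield template.format(R1=r1, R2=r2)

library = list(enumerate_library(scaffold, r1_blocks, r2_blocks))
print(len(library), library[:3])          # 12 products for this tiny example
```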
NASA Astrophysics Data System (ADS)
Schultz, E.; Genuer, V.; Marcoux, P.; Gal, O.; Belafdil, C.; Decq, D.; Maurin, Max; Morales, S.
2018-02-01
Elastic Light Scattering (ELS) is an innovative technique to identify bacterial pathogens directly on culture plates. Compelling results have already been reported for agri-food applications. Here, we have developed ELS for clinical diagnosis, starting with Staphylococcus aureus early screening. Our goal is to bring a result (positive/negative) after only 6 h of growth to fight surgical-site infections. The method starts with the acquisition of the scattering pattern arising from the interaction between a laser beam and a single bacterial colony growing on a culture medium. The resulting image, considered as the bacterial species signature, is then analyzed using statistical learning techniques. We present a custom optical setup able to target bacterial colonies of various sizes (30-500 microns). This system was used to collect a reference dataset of 38 strains of S. aureus and other Staphylococcus species (5459 images) on ChromID SAID/MRSA bi-plates. A validation set from 20 patients was then acquired and clinically validated according to chromogenic enzymatic tests. The best correct-identification rate between S. aureus and S. non-aureus (94.7%) was obtained using a support vector machine classifier trained on a combination of Fourier-Bessel moments and Local-Binary-Pattern features. This statistical model applied to the validation set provided a sensitivity and a specificity of 90.0% and 56.9%, or alternatively, a positive predictive value of 47% and a negative predictive value of 93%. From a clinical point of view, the results head in the right direction and pave the way toward the WHO's requirements for rapid, low-cost, and automated diagnosis tools.
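A minimal sketch of the classification stage, assuming scikit-image and scikit-learn: Local-Binary-Pattern histograms feed a support vector machine. The Fourier-Bessel moments the authors combine with LBP are omitted, and the images below are synthetic stand-ins for scattering patterns.

```python
# LBP-histogram features + SVM classifier, as a stand-in for the pipeline
# sketched in the abstract (synthetic data, not the strain dataset).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(image, points=8, radius=1):
    """Uniform LBP codes take values 0..points+1, hence points+2 bins."""
    lbp = local_binary_pattern(image, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2),
                           density=True)
    return hist

rng = np.random.default_rng(0)
images = (rng.random((40, 64, 64)) * 255).astype(np.uint8)  # fake patterns
labels = rng.integers(0, 2, size=40)        # S. aureus vs. non-aureus
X = np.array([lbp_histogram(img) for img in images])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.score(X, labels))                 # training accuracy, sanity check
```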
Psychometric properties of the Florence CyberBullying-CyberVictimization Scales.
Palladino, Benedetta Emanuela; Nocentini, Annalaura; Menesini, Ersilia
2015-02-01
The present study addresses the need for empirically validated and theoretically grounded instruments to assess cyberbullying and cybervictimization. The psychometric properties of the Florence CyberBullying-CyberVictimization Scales (FCBVSs) were analyzed in a sample of 1,142 adolescents (Mage=15.18 years; SD=1.12 years; 54.5% male). For both cybervictimization and cyberbullying, results support a gender-invariant model involving 14 items and four factors covering four types of behavior (written-verbal, visual, impersonation, and exclusion). A second-order confirmatory factor analysis confirmed that a "global," second-order measure of cyberbullying and cybervictimization fits the data well. Overall, the scales showed good validity (construct, concurrent, and convergent) and reliability (internal consistency and test-retest). In addition, using the global key question measure as a criterion, ROC analyses, which determine the ability of a test to discriminate between groups, allowed us to identify cutoff points for classifying respondents as involved/not involved, starting from the continuum measure derived from the scales.
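The ROC-based cutoff selection mentioned at the end of the abstract can be illustrated in a few lines; the synthetic scores and the use of Youden's J as the optimality criterion are assumptions, not details taken from the paper.

```python
# Pick the continuum-scale cutoff that best separates involved from
# not-involved respondents by maximizing Youden's J = TPR - FPR.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(42)
involved = rng.normal(2.0, 0.6, 200)        # scale scores, involved group
not_involved = rng.normal(1.2, 0.4, 800)
scores = np.concatenate([involved, not_involved])
truth = np.concatenate([np.ones(200), np.zeros(800)])

fpr, tpr, thresholds = roc_curve(truth, scores)
cutoff = thresholds[np.argmax(tpr - fpr)]   # maximize Youden's J
print(f"suggested cutoff: {cutoff:.2f}")
```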
Validation of intermediate end points in cancer research.
Schatzkin, A; Freedman, L S; Schiffman, M H; Dawsey, S M
1990-11-21
Investigations using intermediate end points as cancer surrogates are quicker, smaller, and less expensive than studies that use malignancy as the end point. We present a strategy for determining whether a given biomarker is a valid intermediate end point between an exposure and incidence of cancer. Candidate intermediate end points may be selected from case series, ecologic studies, and animal experiments. Prospective cohort and sometimes case-control studies may be used to quantify the intermediate end point-cancer association. The most appropriate measure of this association is the attributable proportion. The intermediate end point is a valid cancer surrogate if the attributable proportion is close to 1.0, but not if it is close to 0. Usually, the attributable proportion is close to neither 1.0 nor 0; in this case, valid surrogacy requires that the intermediate end point mediate an established exposure-cancer relation. This would in turn imply that the exposure effect would vanish if adjusted for the intermediate end point. We discuss the relative advantages of intervention and observational studies for the validation of intermediate end points. This validation strategy also may be applied to intermediate end points for adverse reproductive outcomes and chronic diseases other than cancer.
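As a worked illustration of the attributable-proportion criterion above, here is a minimal sketch under one common formulation, AP = p(RR - 1) / (1 + p(RR - 1)), where p is the prevalence of the intermediate end point and RR its relative risk for cancer; the paper's exact estimator may differ, and the numbers are invented.

```python
# Attributable proportion: close to 1 suggests a plausible cancer
# surrogate; close to 0 argues against valid surrogacy.
def attributable_proportion(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Common marker, strongly associated with cancer -> AP near 1.
print(attributable_proportion(0.40, 20.0))  # ~0.88
# Rare marker, weak association -> AP near 0.
print(attributable_proportion(0.05, 1.5))   # ~0.02
```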
Development and validation of the intuitive exercise scale.
Reel, Justine J; Galli, Nick; Miyairi, Maya; Voelker, Dana; Greenleaf, Christy
2016-08-01
Up to 80% of individuals with eating disorders engage in dysfunctional exercise, which is characterized by exercising in excessive quantities often past the point of pain as well as compulsive feelings and negative affect when exercise is disrupted (Cook, Hausenblas, Crosby, Cao, & Wonderlich, 2015). Intuitive exercise involves an awareness of the senses while moving and attending to one's bodily cues for when to start and stop exercise, rather than feeling compelled to adhere to a rigid program (Reel, 2015). The purpose of this study was to design a measurement tool to evaluate the construct of intuitive exercise in research, treatment, and prevention settings. The 14-item Intuitive Exercise Scale (IEXS) was developed and validated in the current study with completed surveys from 518 female and male adult participants. Exploratory factor analysis was used to identify four latent constructs, including emotional exercise, exercise rigidity, body trust, and mindful exercise, which were supported via confirmatory factor analysis (CFI=0.96; SRMR=0.06). The IEXS demonstrated configural, metric, and scalar invariance across women and men. Correlations with measures of intuitive eating, exercise dependence, and exercise motivation supported convergent and discriminant validity. Published by Elsevier Ltd.
Viljoen, Jodi L; Cruise, Keith R; Nicholls, Tonia L; Desmarais, Sarah L; Webster, Christopher
2012-01-01
The field of violence risk assessment has matured considerably, possibly advancing beyond its own adolescence. At this point in the field's evolution, it is more important than ever for the development of any new device to be accompanied by a strong rationale and the capacity to provide a unique contribution. With this issue in mind, we first take stock of the field of adolescent risk assessment in order to describe the rapid progress that this field has made, as well as the gaps that led us to adapt the Short-Term Assessment of Risk and Treatability (START; Webster, Martin, Brink, Nicholls, & Desmarais, 2009) for use with adolescents. We view the Short-Term Assessment of Risk and Treatability: Adolescent Version (START:AV; Nicholls, Viljoen, Cruise, Desmarais, & Webster, 2010; Viljoen, Cruise, Nicholls, Desmarais, & Webster, in progress) as complementing other risk measures in four primary ways: 1) rather than focusing solely on violence risk, it examines broader adverse outcomes to which some adolescents are vulnerable (including self-harm, suicide, victimization, substance abuse, unauthorized leave, self-neglect, general offending); 2) it places a balanced emphasis on adolescents' strengths; 3) it focuses on dynamic factors that are relevant to short-term assessment, risk management, and treatment planning; and 4) it is designed for both mental health and justice populations. We describe the developmentally-informed approach we took in the adaptation of the START for adolescents, and outline future steps for the continuing validation and refinement of the START:AV.
34 CFR 200.16 - Starting points.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Using data from the 2001-2002 school year, each State must establish starting points in reading/language... of proficient students in the school that represents 20 percent of the State's total enrollment among all schools ranked by the percentage of students at the proficient level. The State must determine...
Starting geometry creation and design method for freeform optics.
Bauer, Aaron; Schiesser, Eric M; Rolland, Jannick P
2018-05-01
We describe a method for designing freeform optics based on the aberration theory of freeform surfaces that guides the development of a taxonomy of starting-point geometries with an emphasis on manufacturability. An unconventional approach to the optimization of these starting designs wherein the rotationally invariant 3rd-order aberrations are left uncorrected prior to unobscuring the system is shown to be effective. The optimal starting-point geometry is created for an F/3, 200 mm aperture-class three-mirror imager and is fully optimized using a novel step-by-step method over a 4 × 4 degree field-of-view to exemplify the design method. We then optimize an alternative starting-point geometry that is common in the literature but was quantified here as a sub-optimal candidate for optimization with freeform surfaces. A comparison of the optimized geometries shows the performance of the optimal geometry is at least 16× better, which underscores the importance of the geometry when designing freeform optics.
Validating a hydrodynamic framework for long-term modelling of the German Bight
NASA Astrophysics Data System (ADS)
Koesters, Frank; Pluess, Andreas; Heyer, Harro; Kastens, Marko; Sehili, Aissa
2010-05-01
The intention of the "AufMod" project is to set up a modelling framework for questions concerning the large-scale, long-term morphodynamic evolution of the German Bight. First, a hydrodynamic model was set up which includes the entire North Sea and a sophisticated representation of the German Bight. In a second step, simulations of sediment transport and morphodynamic changes will be performed. This paper deals with the calibration and validation process for the hydrodynamic model in detail. The starting point for "AufMod" was the aim to better understand the morphodynamic processes in the German Bight. Changes in bottom topography need to be predicted to ensure safe and easy transport through the German waterways leading to ports at the German coast such as Hamburg and Bremerhaven. Within "AufMod" this question is addressed through a combined effort of compiling a comprehensive sedimentological and bathymetric data set and running different numerical models. The model is based on the numerical method UnTRIM (Casulli and Zanolli, 2002). The model uses an unstructured grid in the horizontal to provide a good representation of the complex topography. The spatial resolution increases from about 20 km in the North Sea to 20 m within the estuaries. The model forcing represents conditions for the year 2006 and consists of wind stress at the surface, water level elevation and salinity at the open boundaries, as well as freshwater inflows. Temperature is not taken into account. For model validation, more than 40 hydrodynamic monitoring stations are used to compare modelled and measured data. The calibration process consists of adapting the tidal components at the open boundaries following the approach of Pluess (2003). The validation process includes the analysis of tidal components of water level elevation and current values, as well as an analysis of tidal characteristic values, e.g. tidal low and high water. Based on these numerical measures, the representation of the underlying physics is quantified using a skill score. The overall hydrodynamic structure is represented well by the model and will be the starting point for the subsequent morphodynamic experiments. Literature: Casulli, V. and Zanolli, P. (2002). Semi-implicit numerical modelling of non-hydrostatic free-surface flows for environmental problems. Mathematical and Computer Modelling, 36:1131-1149. Pluess, A. (2003). Das Nordseemodell der BAW zur Simulation der Tide in der Deutschen Bucht. Die Kueste, Heft 67, ISBN 3-8042-1058-9, pp. 83-128.
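The skill score used to quantify the model physics is not specified in the abstract; the sketch below uses a Murphy-style score, 1 - MSE(model)/MSE(reference), as one plausible choice, with synthetic water-level series standing in for station data.

```python
# Murphy-style skill score: 1 means perfect agreement with observations,
# 0 means no better than the reference (e.g., a climatology-like baseline).
import numpy as np

def skill_score(observed, modelled, reference):
    mse_model = np.mean((modelled - observed) ** 2)
    mse_ref = np.mean((reference - observed) ** 2)
    return 1.0 - mse_model / mse_ref

t = np.linspace(0, 4 * np.pi, 500)
observed = np.sin(t)                         # stand-in for measured water levels
modelled = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
reference = np.zeros_like(t)                 # baseline prediction
print(skill_score(observed, modelled, reference))  # close to 1 = good skill
```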
Cost-of-illness studies of atrial fibrillation: methodological considerations.
Becker, Christian
2014-10-01
Atrial fibrillation (AF) is the most common heart rhythm arrhythmia and has considerable economic consequences. This study aims to identify current cost-of-illness estimates of AF, with a focus on describing the studies' methodology. A literature review was conducted, and twenty-eight cost-of-illness studies were identified. Cost-of-illness estimates exist for health insurance members and for hospital and primary care populations. In addition, the cost of stroke in AF patients and the costs of post-operative AF were calculated. The methods used were heterogeneous; most studies calculated excess costs. The identified annual excess costs varied, even among studies from the USA (∼US$1900 to ∼US$19,000). While pointing toward considerable costs, the cost-of-illness studies' relevance could be improved by focusing on subpopulations and treatment mixes. As a starting point for subsequent economic studies, the methodology of cost-of-illness studies should be taken into account, using methods that allow stakeholders to find suitable studies and validate estimates.
GO1 Inert Test Article Captive Carry
2018-01-10
Generation Orbit Launch Services, Inc. (GO) completed the GO1 Inert Test Article captive carry flight test at NASA's Armstrong Flight Research Center in December. Under a public-private partnership with NASA, GO developed the GO1-ITA, a mass properties and outer mold line simulator for the GO1 hypersonic flight testbed, and earned NASA airworthiness approval for flight on NASA's C-20a. NASA's C-20a was originally modified to add a centerline hard point to carry the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) pod. Together with the NASA Armstrong team, a campaign of three flight tests was conducted, successfully completing all test objectives, including clearing the operational flight envelope of the C-20a with the GO1-ITA mounted to the centerline hard point and demonstrating the unique launch maneuver designed for air launch of the GO1 on operational flights starting in 2019. Data collected during the campaign will be used to validate models and inform the ongoing design and development of GO1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roper, J; Ghavidel, B; Godette, K
Purpose: To validate a knowledge-based algorithm for prostate LDR brachytherapy treatment planning. Methods: A dataset of 100 cases was compiled from an active prostate seed implant service. Cases were randomized into 10 subsets. For each subset, the 90 remaining library cases were registered to a common reference frame and then characterized on a point by point basis using principal component analysis (PCA). Each test case was converted to PCA vectors using the same process and compared with each library case using a Mahalanobis distance to evaluate similarity. Rank order PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Any subsequent modifications were recorded that required input from a treatment planner to achieve V100>95%, V150<60%, V200<20%. To simulate operating-room planning constraints, seed activity was held constant, and the seed count could not increase. Results: The computational time required to register test-case contours and evaluate PCA similarity across the library was 10 s. Preliminary analysis of 2 subsets shows that 9 of 20 test cases did not require any seed modifications to obtain an acceptable plan. Five test cases required fewer than 10 seed modifications or a grid shift. Another 5 test cases required approximately 20 seed modifications. An acceptable plan was not achieved for 1 outlier, which was substantially larger than its best match. Modifications took between 5 s and 6 min. Conclusion: A knowledge-based treatment planning algorithm for prostate LDR brachytherapy is being cross validated using 100 prior cases. Preliminary results suggest that for this size library, acceptable plans can be achieved without planner input in about half of the cases, while varying amounts of planner input are needed in the remaining cases. Computational time and planning time are compatible with clinical practice.
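A hedged sketch of the matching step described above: embed library cases with PCA and pick the library case with the smallest Mahalanobis distance to the test case in PCA space. The feature vectors are random stand-ins for registered contour points, and scikit-learn/NumPy are assumed tools, not the authors' implementation.

```python
# PCA embedding + Mahalanobis-distance matching of a test case against
# a 90-case library, mirroring the similarity evaluation in the abstract.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
library = rng.normal(size=(90, 200))   # 90 library cases, flattened contours
test_case = rng.normal(size=200)

pca = PCA(n_components=10).fit(library)
lib_z = pca.transform(library)
test_z = pca.transform(test_case.reshape(1, -1))[0]

cov_inv = np.linalg.inv(np.cov(lib_z, rowvar=False))

def mahalanobis(u, v):
    d = u - v
    return float(np.sqrt(d @ cov_inv @ d))

distances = [mahalanobis(test_z, z) for z in lib_z]
best = int(np.argmin(distances))       # seed arrangement taken from this case
print(f"best-matched library case: {best}, distance {distances[best]:.2f}")
```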
Real time validation of GPS TEC precursor mask for Greece
NASA Astrophysics Data System (ADS)
Pulinets, Sergey; Davidenko, Dmitry
2013-04-01
Earlier studies of pre-earthquake ionospheric variations established that, for every specific site, these variations show a definite stability in their temporal behavior within a time interval of a few days before the seismic shock. This self-similarity (characteristic of phenomena observed close to the critical point of a system) permits us to consider these variations as a good candidate for a short-term precursor. A physical mechanism for GPS TEC variations before earthquakes has been developed within the framework of the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model. Taking into account the different tectonic structure and the different source mechanisms of earthquakes in different regions of the globe, every site has its individual pre-earthquake behavior, which creates an individual "imprint" on the ionosphere at that point. This so-called "mask" of ionospheric variability before an earthquake makes it possible to detect anomalous behavior of the ionospheric electron concentration not only through statistical processing but also through pattern-recognition techniques, which facilitates the automatic recognition of short-term ionospheric precursors of earthquakes. Such a precursor mask was created using the GPS TEC variations around the times of 9 earthquakes with magnitudes from M6.0 to M6.9 which took place in Greece within the interval 2006-2011. The major anomaly revealed in the relative deviation of the vertical TEC was a positive anomaly appearing at ~04 PM UT one day before the seismic shock and lasting nearly 12 hours, until ~04 AM UT. To validate this approach, it was decided to check the mask in real-time monitoring of earthquakes in Greece with magnitude greater than 4.5, starting from 1 December 2012. During this period (until 9 January 2013), 4 seismic shocks were registered, including the largest one, M5.7, on 8 January. For all of them the mask confirmed its validity, and the 6 December event was predicted in advance.
Influence of Femoral Component Design on Retrograde Femoral Nail Starting Point.
Service, Benjamin C; Kang, William; Turnbull, Nathan; Langford, Joshua; Haidukewych, George; Koval, Kenneth J
2015-10-01
Our experience with retrograde femoral nailing after periprosthetic distal femur fractures was that femoral components with deep trochlear grooves posteriorly displace the nail entry point, resulting in recurvatum deformity. This study evaluated the influence of distal femoral prosthetic design on the starting point. One hundred lateral knee images were examined. The distal edge of Blumensaat's line was used to create a ratio of its location compared with the maximum anteroposterior condylar width, called the starting point ratio (SPR). Femoral trials from 6 manufacturers were analyzed to determine the location of the simulated nail position in the sagittal plane compared with the maximum anteroposterior prosthetic width. These measurements were used to create a second ratio, the femoral component ratio (FCR). The FCR was compared with the SPR to determine whether a femoral component would be at risk of a retrograde nail starting point posterior to Blumensaat's line. The mean SPR was 0.392 ± 0.03, and the mean FCR was 0.416 ± 0.05, which was significantly greater (P = 0.003). The mean FCR was 0.444 ± 0.06 for the cruciate retaining (CR) trials and 0.393 ± 0.04 for the posterior stabilized trials; this difference was significant (P < 0.001). The FCR for the femoral trials studied was significantly greater than the SPR for native knees and was significantly greater for CR femoral components compared with posterior stabilized components. These findings demonstrate that many total knee prostheses, particularly CR designs, are at risk of a starting point posterior to Blumensaat's line.
Building intelligent systems: Artificial intelligence research at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Friedland, P.; Lum, H.
1987-01-01
The basic components that make up the goal of building autonomous intelligent systems are discussed, and ongoing work at the NASA Ames Research Center is described. It is noted that a clear progression of systems can be seen through research settings (both within and external to NASA) to Space Station testbeds to systems which actually fly on the Space Station. The starting point for the discussion is a truly autonomous Space Station intelligent system, responsible for a major portion of Space Station control. Attention is given to research in fiscal 1987, including reasoning under uncertainty, machine learning, causal modeling and simulation, knowledge from design through operations, advanced planning work, validation methodologies, and hierarchical control of and distributed cooperation among multiple knowledge-based systems.
Rahm, Fredrik; Viklund, Jenny; Trésaugues, Lionel; Ellermann, Manuel; Giese, Anja; Ericsson, Ulrika; Forsblom, Rickard; Ginman, Tobias; Günther, Judith; Hallberg, Kenth; Lindström, Johan; Persson, Lars Boukharta; Silvander, Camilla; Talagas, Antoine; Díaz-Sáez, Laura; Fedorov, Oleg; Huber, Kilian V M; Panagakou, Ioanna; Siejka, Paulina; Gorjánácz, Mátyás; Bauser, Marcus; Andersson, Martin
2018-03-22
Recent literature has both suggested and questioned MTH1 as a novel cancer target. BAY-707 was recently published as a target-validation small-molecule probe for assessing the effects of pharmacological inhibition of MTH1 on tumor cell survival, both in vitro and in vivo (1). In this report, we describe the medicinal chemistry program that created BAY-707, in which fragment-based methods were used to develop a series of highly potent and selective MTH1 inhibitors. Using structure-based drug design and rational medicinal chemistry approaches, the potency was increased over 10,000-fold from the fragment starting point while maintaining high ligand efficiency and drug-like properties.
C-5M Super Galaxy Utilization with Joint Precision Airdrop System
2012-03-22
System / Notes: FireFly: 900-2,200, Steerable Parafoil; Screamer: 500-2,200, Steerable Parafoil w/additional chutes to slow touchdown; Dragonfly: ... setting. This initial feasible solution provides the Nonlinear Program algorithm a starting point to continue its calculations. The model continues... provides the NLP with a starting point of 1. This provides the NLP algorithm a point within the feasible region to begin its calculations in an attempt
NASA Astrophysics Data System (ADS)
Zhao, Xiuliang; Cheng, Yong; Wang, Limei; Ji, Shaobo
2017-03-01
Accurate combustion parameters are the foundation of effective closed-loop control of the engine combustion process. Some combustion parameters, including the start of combustion, the location of peak pressure, and the maximum pressure rise rate and its location, can be identified from engine block vibration signals. These signals often include non-combustion-related contributions, which limit the prompt computational acquisition of the combustion parameters. The main component of these non-combustion-related contributions is considered to be caused by the reciprocating inertia force excitation (RIFE) of the engine crank train. A mathematical model is established to describe the response of the RIFE. The parameters of the model are recognized with a pattern recognition algorithm, the response of the RIFE is predicted, and the related contributions are then removed from the measured vibration velocity signals. The combustion parameters are extracted from the feature points of the renovated vibration velocity signals. There are angle deviations between the feature points in the vibration velocity signals and those in the cylinder pressure signals. For the start of combustion, a systematic bias is adopted to correct the deviation, and the error bound of the predicted parameter is within 1.1°. To predict the location of the maximum pressure rise rate and the location of the peak pressure, algorithms based on the proportion of high-frequency components in the vibration velocity signals are introduced. Test results show that the two parameters can be predicted within 0.7° and 0.8° error bounds, respectively. The increase from the knee point preceding the peak value point to the peak value in the vibration velocity signals is used to predict the value of the maximum pressure rise rate. Finally, a monitoring framework is introduced to realize the combustion parameter prediction. Satisfactory prediction of combustion parameters in successive cycles is achieved, which validates the proposed methods.
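The RIFE-removal idea can be illustrated with a simplified stand-in for the paper's model: fit first- and second-order crank harmonics by least squares and subtract them, leaving the combustion-related content. The frequencies and amplitudes below are invented, and the least-squares harmonic fit is an assumed substitute for the authors' pattern-recognition model.

```python
# Remove a reciprocating-inertia-force-like contribution from a vibration
# signal by fitting crank-frequency harmonics and subtracting them.
import numpy as np

fs, f_crank = 20000.0, 25.0                  # sample rate (Hz), crank freq (Hz)
t = np.arange(0, 0.2, 1 / fs)
rife = 0.8 * np.sin(2 * np.pi * f_crank * t) + 0.3 * np.sin(4 * np.pi * f_crank * t)
combustion = 0.2 * np.sin(2 * np.pi * 900 * t) * np.exp(-30 * t)
signal = rife + combustion

# Design matrix: sin/cos at orders 1 and 2 of the crank frequency.
cols = []
for order in (1, 2):
    cols += [np.sin(2 * np.pi * order * f_crank * t),
             np.cos(2 * np.pi * order * f_crank * t)]
A = np.column_stack(cols)
coeffs, *_ = np.linalg.lstsq(A, signal, rcond=None)
renovated = signal - A @ coeffs              # "renovated", RIFE-free signal
print(np.max(np.abs(renovated - combustion)))  # small residual
```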
Kobashigawa, Jon; Patel, Jignesh; Azarbal, Babak; Kittleson, Michelle; Chang, David; Czer, Lawrence; Daun, Tiffany; Luu, Minh; Trento, Alfredo; Cheng, Richard; Esmailian, Fardad
2015-05-01
The endomyocardial biopsy (EMB) is considered the gold standard in rejection surveillance post cardiac transplant, but is invasive, with risk of complications. A previous trial suggested that the gene expression profiling (GEP) blood test was noninferior to EMB between 6 and 60 months post transplant. As most rejections occur in the first 6 months, we conducted a single-center randomized trial of GEP versus EMB starting at 55 days post transplant (when GEP is valid). Sixty heart transplant patients meeting inclusion criteria were randomized beginning at 55 days post transplant to either GEP or EMB arms. A positive GEP ≥30 between 2 and 6 months, or ≥34 after 6 months, prompted a follow-up biopsy. The primary end point included a composite of death/retransplant, rejection with hemodynamic compromise or graft dysfunction at 18 months post transplant. A coprimary end point included change in first-year maximal intimal thickness by intravascular ultrasound, a recognized surrogate for long-term outcome. Corticosteroid weaning was assessed in both groups. The composite end point was similar between the GEP and EMB groups (10% versus 17%; log-rank P=0.44). The coprimary end point of first-year intravascular ultrasound change demonstrated no difference in mean maximal intimal thickness (0.35±0.36 versus 0.36±0.26 mm; P=0.944). Steroid weaning was successful in both groups (91% versus 95%). In this pilot study, GEP starting at 55 days post transplant seems comparable with EMB for rejection surveillance in selected heart transplant patients and does not result in increased adverse outcomes. GEP also seems useful to guide corticosteroid weaning. Larger randomized trials are required to confirm these findings. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01418248. © 2015 American Heart Association, Inc.
An Alternative Starting Point for Fraction Instruction
ERIC Educational Resources Information Center
Cortina, José Luis; Višnovská, Jana; Zúñiga, Claudia
2015-01-01
We analyze the results of a study conducted for the purpose of assessing the viability of an alternative starting point for teaching fractions. The alternative is based on Freudenthal's insights about fraction as comparison. It involves portraying the entities that unit fractions quantify as always being apart from the reference unit, instead of…
33 CFR 165.915 - Security zones; Captain of the Port Detroit.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the starting point at 41°58.4′ N, 083°15.4′ W (NAD 83). (2) Davis Besse Nuclear Power Station. All... to the starting point 41°36.1′ N, 083°04.7′ W (NAD 83). (b) Regulations. (1) In accordance with § 165...
Length, Area, and Volume--or, Just Geometry Really
ERIC Educational Resources Information Center
Ball, Derek
2012-01-01
Many delegates at "conference" relish the opportunity, and the space, to "do some mathematics". Opportunity and space help to make the experience memorable, but how often is the quality of the starting point, or question acknowledged? Here is a set of starting points or problems that invite the reader to "do some mathematics". Deliberately, no…
Laibhen-Parkes, Natasha; Kimble, Laura P; Melnyk, Bernadette Mazurek; Sudia, Tanya; Codone, Susan
2018-06-01
Instruments used to assess evidence-based practice (EBP) competence in nurses have been subjective, unreliable, or invalid. The Fresno test was identified as the only instrument to measure all the steps of EBP with supportive reliability and validity data. However, the items and psychometric properties of the original Fresno test are only relevant to measure EBP with medical residents. Therefore, the purpose of this paper is to describe the development of the adapted Fresno test for pediatric nurses, and provide preliminary validity and reliability data for its use with Bachelor of Science in Nursing-prepared pediatric bedside nurses. General adaptations were made to the original instrument's case studies, item content, wording, and format to meet the needs of a pediatric nursing sample. The scoring rubric was also modified to complement changes made to the instrument. Content and face validity, and intrarater reliability of the adapted Fresno test were assessed during a mixed-methods pilot study conducted from October to December 2013 with 29 Bachelor of Science in Nursing-prepared pediatric nurses. Validity data provided evidence for good content and face validity. Intrarater reliability estimates were high. The adapted Fresno test presented here appears to be a valid and reliable assessment of EBP competence in Bachelor of Science in Nursing-prepared pediatric nurses. However, further testing of this instrument is warranted using a larger sample of pediatric nurses in diverse settings. This instrument can be a starting point for evaluating the impact of EBP competence on patient outcomes. © 2018 Sigma Theta Tau International.
The Servant Leadership Survey: Development and Validation of a Multidimensional Measure.
van Dierendonck, Dirk; Nuijten, Inge
2011-09-01
PURPOSE: The purpose of this paper is to describe the development and validation of a multi-dimensional instrument to measure servant leadership. DESIGN/METHODOLOGY/APPROACH: Based on an extensive literature review and expert judgment, 99 items were formulated. In three steps, using eight samples totaling 1571 persons from The Netherlands and the UK with a diverse occupational background, a combined exploratory and confirmatory factor analysis approach was used. This was followed by an analysis of the criterion-related validity. FINDINGS: The final result is an eight-dimensional measure of 30 items: the eight dimensions being: standing back, forgiveness, courage, empowerment, accountability, authenticity, humility, and stewardship. The internal consistency of the subscales is good. The results show that the Servant Leadership Survey (SLS) has convergent validity with other leadership measures, and also adds unique elements to the leadership field. Evidence for criterion-related validity came from studies relating the eight dimensions to well-being and performance. IMPLICATIONS: With this survey, a valid and reliable instrument to measure the essential elements of servant leadership has been introduced. ORIGINALITY/VALUE: The SLS is the first measure where the underlying factor structure was developed and confirmed across several field studies in two countries. It can be used in future studies to test the underlying premises of servant leadership theory. The SLS provides a clear picture of the key servant leadership qualities and shows where improvements can be made on the individual and organizational level; as such, it may also offer a valuable starting point for training and leadership development.
[Validation of SHI Claims Data Exemplified by Gender-specific Diagnoses].
Hartmann, J; Weidmann, C; Biehle, R
2016-10-01
Aim: The use of statutory health insurance (SHI) data in health services research is increasing steadily, and questions of validity are gaining importance. Using gender-specific diagnoses as an example, the aim of this study was to estimate the prevalence of implausible diagnoses and demonstrate an internal validation strategy. Method: The analysis is based on SHI data from Baden-Württemberg for 2012. The subject of validation is gender-specific outpatient diagnoses that do not match the gender of the insured. To resolve this implausibility, it is necessary to clarify whether the diagnosis or the recorded gender is wrong. The validation criteria used were the presence of further gender-specific diagnoses, the presence of gender-specific settlement items, the specialization of the physician in charge, and the gender typically associated with the first name of the insured. To review the quality of the validation, it was verified whether the recorded gender was changed during the following year. Results: Around 5.1% of all diagnoses were gender-specific, and there was a mismatch between diagnosis and gender in 0.04% of these cases. All validation criteria were useful for resolving implausibility, with the last being the most effective; only 14% of cases remained unresolved. Of the total of 1,145 insured with implausible gender-specific diagnoses, 128 had a new gender in the data one year later. 119 of these cases had rightly been classified as insured with a wrongly recorded gender, and 9 cases were in the unresolved group, which confirms that the validation works well. Conclusion: Implausibility in SHI data is relatively rare and can be resolved with appropriate validation criteria. When validating SHI data, it is advisable to question all data used critically, to use multiple validation criteria instead of just one, and to abandon the idea that reality and the associated data conform to standardized norms. Keeping these aspects in mind, analysis of SHI data is a good starting point for health services research. © Georg Thieme Verlag KG Stuttgart · New York.
Bean, Melanie K; Raynor, Hollie A; Thornton, Laura M; Sova, Alexandra; Dunne Stewart, Mary; Mazzeo, Suzanne E
2018-04-12
Scientifically sound methods for investigating dietary consumption patterns from self-serve salad bars are needed to inform school policies and programs. To examine the reliability and validity of digital imagery for determining starting portions and plate waste of self-serve salad bar vegetables (which have variable starting portions) compared with manual weights. In a laboratory setting, 30 mock salads with 73 vegetables were made, and consumption was simulated. Each component (initial and removed portion) was weighed; photographs of weighed reference portions and pre- and post-consumption mock salads were taken. Seven trained independent raters visually assessed images to estimate starting portions to the nearest ¼ cup and percentage consumed in 20% increments. These values were converted to grams for comparison with weighed values. Intraclass correlations between weighed and digital imagery-assessed portions and plate waste were used to assess interrater reliability and validity. Pearson's correlations between weights and digital imagery assessments were also examined. Paired samples t tests were used to evaluate mean differences (in grams) between digital imagery-assessed portions and measured weights. Interrater reliabilities were excellent for starting portions and plate waste with digital imagery. For accuracy, intraclass correlations were moderate, with lower accuracy for determining starting portions of leafy greens compared with other vegetables. However, accuracy of digital imagery-assessed plate waste was excellent. Digital imagery assessments were not significantly different from measured weights for estimating overall vegetable starting portions or waste; however, digital imagery assessments slightly underestimated starting portions (by 3.5 g) and waste (by 2.1 g) of leafy greens. This investigation provides preliminary support for use of digital imagery in estimating starting portions and plate waste from school salad bars. Results might inform methods used in empirical investigations of dietary intake in schools with self-serve salad bars. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Increasing large scale windstorm damage in Western, Central and Northern European forests, 1951-2010
NASA Astrophysics Data System (ADS)
Gregow, H.; Laaksonen, A.; Alper, M. E.
2017-04-01
Using reports of forest losses caused directly by large scale windstorms (or primary damage, PD) from the European forest institute database (comprising 276 PD reports from 1951-2010), total growing stock (TGS) statistics of European forests and the daily North Atlantic Oscillation (NAO) index, we identify a statistically significant change in storm intensity in Western, Central and Northern Europe (17 countries). Using the validated set of storms, we found that the year 1990 represents a change-point at which the average intensity of the most destructive storms indicated by PD/TGS > 0.08% increased by more than a factor of three. A likelihood ratio test provides strong evidence that the change-point represents a real shift in the statistical behaviour of the time series. All but one of the seven catastrophic storms (PD/TGS > 0.2%) occurred since 1990. Additionally, we detected a related decrease in September-November PD/TGS and an increase in December-February PD/TGS. Our analyses point to the possibility that the impact of climate change on the North Atlantic storms hitting Europe has started during the last two and a half decades.
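A minimal sketch of a likelihood-ratio change-point test of the kind the abstract invokes, here for a mean shift in a Gaussian series; the exact statistic used by the authors may differ, and the series below is synthetic, with a shift planted at index 40.

```python
# Scan all split points and keep the one maximizing a likelihood-ratio
# statistic for a mean shift (common known variance).
import numpy as np

def lrt_change_point(x):
    n = len(x)
    var = np.var(x)
    best_k, best_stat = None, -np.inf
    for k in range(5, n - 5):              # avoid tiny segments
        a, b = x[:k], x[k:]
        stat = (k * (a.mean() - x.mean()) ** 2
                + (n - k) * (b.mean() - x.mean()) ** 2) / var
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

rng = np.random.default_rng(3)
series = np.concatenate([rng.normal(0.02, 0.01, 40),   # pre-shift regime
                         rng.normal(0.07, 0.02, 20)])  # post-shift regime
print(lrt_change_point(series))            # expect a split near index 40
```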
Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-01-01
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
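The voxel discretization step can be sketched in a few lines of NumPy: bin the points into a regular 3D grid and keep the occupied cells as solid elements. The full procedure's section stacking and inner/outer-surface handling are not reproduced here, and the point cloud below is a random stand-in for a scan.

```python
# Bin a point cloud into a regular grid; each occupied cell becomes one
# solid (hexahedral) element of the finite element mesh.
import numpy as np

def voxelize(points, voxel_size):
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    occupied = np.unique(idx, axis=0)        # one entry per occupied voxel
    centers = origin + (occupied + 0.5) * voxel_size
    return occupied, centers

rng = np.random.default_rng(0)
cloud = rng.random((10000, 3)) * [10.0, 6.0, 12.0]  # fake building scan (m)
occupied, centers = voxelize(cloud, voxel_size=0.5)
print(f"{len(occupied)} voxel elements")
```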
The influence of plan modulation on the interplay effect in VMAT liver SBRT treatments.
Hubley, Emily; Pierce, Greg
2017-08-01
Volumetric modulated arc therapy (VMAT) uses multileaf collimator (MLC) leaves, gantry speed, and dose rate to modulate beam fluence, producing the highly conformal doses required for liver radiotherapy. When targets that move with respiration are treated with a dynamic fluence, there exists the possibility of interplay between the target and leaf motions. This study employs a novel motion simulation technique to determine whether VMAT liver SBRT plans with increased MLC leaf modulation are more susceptible to dosimetric differences in the GTV due to interplay effects. For ten liver SBRT patients, two VMAT plans with different amounts of MLC leaf modulation were created. Motion was simulated using a random starting point in the respiratory cycle for each fraction. To isolate the interplay effect, motion was also simulated using four specific starting points in the respiratory cycle. The dosimetric differences caused by different starting points were examined by subtracting the resultant dose distributions from each other. When motion was simulated using random starting points for each fraction, or with specific starting points, there were significantly more dose differences in the GTV (maximum 100 cGy) for the more highly modulated plans, but overall plan quality was not adversely affected. Plans with more MLC leaf modulation are more susceptible to interplay effects, but the dose differences in the GTV are clinically negligible in magnitude. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
33 CFR 165.911 - Security Zones; Captain of the Port Buffalo Zone.
Code of Federal Regulations, 2010 CFR
2010-07-01
...′ N, 076°23.2′ W; and then following the shoreline back to the point of origin (NAD 83). (2) Ginna... 43°16.7′ N, 077°18.3′ W; then following the shoreline back to starting point (NAD 83). (3) Moses... northwest to 45°00.36′ N, 074°48.16′ W; then northeast back to the starting point (NAD 83). (4) Long Sault...
ERIC Educational Resources Information Center
Shuell, Julie; Hanna, Jeff; Oterlei, Jannell; Kariger, Patricia
This National Head Start Association booklet outlines the main provisions of the Personal Responsibility and Work Opportunity Act and describes how it may affect local Head Start Programs. The document is intended to serve as a starting point for local programs, parents, administrators and policy workers to discuss and plan how Head Start will…
Di Nuovo, Alessandro G; Di Nuovo, Santo; Buono, Serafino
2012-02-01
The estimation of a person's intelligence quotient (IQ) by means of psychometric tests is indispensable in the application of psychological assessment to several fields. When complex tests such as the Wechsler scales, which are the most commonly used and universally recognized parameter for the diagnosis of degrees of retardation, are not applicable, it is necessary to use other psycho-diagnostic tools better suited to the subject's specific condition. But to ensure a homogeneous diagnosis it is necessary to reach a common metric; thus, the aim of our work is to build models able to estimate the Wechsler IQ accurately and reliably, starting from different psycho-diagnostic tools. Four different psychometric tests (the Leiter international performance scale; the coloured progressive matrices test; the mental development scale; the psycho-educational profile), along with the Wechsler scale, were administered to a group of 40 mentally retarded subjects with various pathologies, and to control persons. The resulting database is used to evaluate Wechsler IQ estimation models starting from the scores obtained in the other tests. Five modelling methods, two statistical and three from machine learning belonging to the family of artificial neural networks (ANNs), are employed to build the estimator. Several error metrics for the estimated IQ and for retardation-level classification are defined to compare the performance of the various models in univariate and multivariate analyses. Eight empirical studies show that, after ten-fold cross-validation, the best average estimation error is 3.37 IQ points and the best mental retardation level classification error is 7.5%. Furthermore, our experiments demonstrate the superior performance of ANN methods over statistical regression, because in all cases considered the ANN models show the lowest estimation error (from 0.12 to 0.9 IQ points) and the lowest classification error (from 2.5% to 10%). Since the estimation error is smaller than the confidence interval of the Wechsler scales (five IQ points), we consider the models accurate and reliable enough to support clinical diagnosis. A computer program based on the results of our work is currently used in a clinical center, and empirical trials confirm its validity. Furthermore, the positive results of our multivariate studies suggest new approaches for clinicians. Copyright © 2011 Elsevier B.V. All rights reserved.
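A hedged sketch of the evaluation protocol: predict Wechsler IQ from the other test scores with a small neural network under ten-fold cross-validation, reporting error in IQ points. The data are synthetic stand-ins for the 40-subject clinical sample, and scikit-learn's MLPRegressor is an assumed substitute for the ANN variants the authors compared.

```python
# Ten-fold cross-validated IQ estimation from four psychometric scores.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_predict, KFold

rng = np.random.default_rng(1)
scores = rng.normal(50, 10, size=(40, 4))   # fake Leiter, CPM, MDS, PEP scores
iq = 0.5 * scores.sum(axis=1) - 30 + rng.normal(0, 2, 40)  # fake Wechsler IQ

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
pred = cross_val_predict(model, scores, iq,
                         cv=KFold(10, shuffle=True, random_state=0))
print(f"mean absolute error: {np.mean(np.abs(pred - iq)):.2f} IQ points")
```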
An Examination of the Starting Point Approach to Design and Technology
ERIC Educational Resources Information Center
Good, Keith; Jarvinen, Esa-Matti
2007-01-01
This study examines the Starting Point Approach (SPA) to design and technology, which is intended to maximize creativity while being manageable for the teacher. The purpose of the study was to examine whether the children could do what the approach requires and in particular whether it promoted their innovative thinking. Data were collected during…
The Use of Mixed Methods in Randomized Control Trials
ERIC Educational Resources Information Center
White, Howard
2013-01-01
Evaluations should be issues driven, not methods driven. The starting point should be priority programs to be evaluated or policies to be tested. From this starting point, a list of evaluation questions is identified. For each evaluation question, the task is to identify the best available method for answering that question. Hence it is likely…
In the Heart of Teaching: A Two-Dimensional Conception of Teachers' Relational Competence
ERIC Educational Resources Information Center
Aspelin, Jonas
2017-01-01
Research reveals that teachers' relational competence is crucial for successful education. However, the field is still small and largely unexplored, and arguably needs a better and more precise theoretical starting point. This article seeks to help establish such a starting point, aiming to outline a relational framework based on the philosophies…
Computers Don't Byte. A Starting Point for Teachers Using Computers. A Resource Booklet.
ERIC Educational Resources Information Center
Lieberman, Michael; And Others
Designed to provide a starting point for the teacher without computer experience, this booklet deals with both the "how" and the "when" of computers in education. Educational applications described include classroom uses with the student as a passive or an active user and programs for the handicapped; the purpose of computers…
Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device
2016-06-01
out China, Pakistan, and India as having a minimalist point of view with regards to nuclear weapons. For those in favor of this approach, he does...Referee event graph The referee listens to the start and stops of the mover and determines whether or not the Polaris has entered or exited the...are highlighted in Figure 17: • Polaris start point • Polaris end point • Polaris original waypoints • Polaris ad hoc waypoints • Number of
Clark, Ross A; Paterson, Kade; Ritchie, Callan; Blundell, Simon; Bryant, Adam L
2011-03-01
Commercial timing light systems (CTLS) provide precise measurement of athletes' running velocity; however, they are often expensive and difficult to transport. In this study an inexpensive, wireless and portable timing light system was created using the infrared camera in Nintendo Wii hand controllers (NWHC). System creation with gold-standard validation. A Windows-based software program using NWHC to replicate a dual-beam timing gate was created. Firstly, data collected during 2 m walking and running trials were validated against a 3D kinematic system. Secondly, data recorded during 5 m running trials at various intensities from standing or flying starts were compared to a single-beam CTLS and to the independent and averaged scores of three handheld stopwatch (HS) operators. Intraclass correlation coefficients and Bland-Altman plots were used to assess validity. Absolute error quartiles and the percentage of trials within absolute error threshold ranges were used to determine accuracy. The NWHC system was valid when compared against the 3D kinematic system (ICC=0.99, median absolute error (MAR)=2.95%). For the flying 5 m trials the NWHC system possessed excellent validity and precision (ICC=0.97, MAR<3%) when compared with the CTLS. In contrast, the NWHC system and the HS values during standing-start trials possessed only modest validity (ICC<0.75) and accuracy (MAR>8%). A NWHC timing light system is inexpensive, portable and valid for assessing running velocity. Errors in the 5 m standing-start trials may have been due to erroneous event detection by either the commercial or the NWHC-based timing light systems. Copyright © 2010 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
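The agreement analysis reported above (Bland-Altman limits of agreement alongside ICC) can be sketched as follows on synthetic 5 m sprint times; the numbers are invented, not the study's data.

```python
# Bland-Altman agreement: bias and 95% limits of agreement between two
# timing systems measuring the same sprints.
import numpy as np

rng = np.random.default_rng(5)
commercial = rng.normal(1.05, 0.08, 50)           # seconds over 5 m
nwhc = commercial + rng.normal(0.0, 0.02, 50)     # small random disagreement

diff = nwhc - commercial
bias = diff.mean()                                # systematic offset
loa = 1.96 * diff.std(ddof=1)                     # half-width of 95% limits
print(f"bias {bias * 1000:.1f} ms, limits of agreement ±{loa * 1000:.1f} ms")
```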
Testing primary-school children's understanding of the nature of science.
Koerber, Susanne; Osterhaus, Christopher; Sodian, Beate
2015-03-01
Understanding the nature of science (NOS) is a critical aspect of scientific reasoning, yet few studies have investigated its developmental beginnings and initial structure. One contributing reason is the lack of an adequate instrument. Two studies assessed NOS understanding among third graders using a multiple-select (MS) paper-and-pencil test. Study 1 investigated the validity of the MS test by presenting the items to 68 third graders (9-year-olds) and subsequently interviewing them on their underlying NOS conception of the items. All items were significantly related between formats, indicating that the test was valid. Study 2 applied the same instrument to a larger sample of 243 third graders, and their performance was compared to a multiple-choice (MC) version of the test. Although the MC format inflated the guessing probability, there was a significant relation between the two formats. In summary, the MS format was a valid method revealing third graders' NOS understanding, thereby representing an economical test instrument. A latent class analysis identified three groups of children with expertise in qualitatively different aspects of NOS, suggesting that there is not a single common starting point for the development of NOS understanding; instead, multiple developmental pathways may exist. © 2014 The British Psychological Society.
WASP (Write a Scientific Paper) using Excel - 1: Data entry and validation.
Grech, Victor
2018-02-01
Data collection for the purposes of analysis, after the planning and execution of a research study, commences with data input and validation. The process of data entry and analysis may appear daunting to the uninitiated, but as pointed out in the 1970s in a series of papers by British Medical Journal Deputy Editor TDV Swinscow, modern hardware and software (he was then referring to the availability of hand calculators) permit the performance of statistical testing outside a computer laboratory. In this day and age, modern software, such as the ubiquitous and almost universally familiar Microsoft Excel™, greatly facilitates this process. This paper is the first of a collection that will emulate Swinscow's series, in his own words, "addressed to readers who want to start at the beginning, not to those who are already skilled statisticians." These papers will focus less on the actual arithmetic and more on how to implement simple statistics, step by step, using Excel, thereby constituting the equivalent of Swinscow's papers in the personal computer age. Data entry can be facilitated by several underutilised features in Excel. This paper explains Excel's little-known form function, data validation implementation at the input stage, simple coding tips and data cleaning tools. Copyright © 2018 Elsevier B.V. All rights reserved.
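The paper implements validation inside Excel itself; as a programmatic counterpart, this sketch builds the same kind of input-stage validation with the openpyxl library (a dropdown list and a numeric range check). The column names and ranges are invented for illustration.

```python
# Create a workbook whose sex and age columns reject invalid entries,
# mirroring Excel's built-in Data Validation feature.
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = Workbook()
ws = wb.active
ws.append(["id", "sex", "age"])             # header row

sex_dv = DataValidation(type="list", formula1='"M,F"', allow_blank=False)
age_dv = DataValidation(type="whole", operator="between",
                        formula1="0", formula2="120")
ws.add_data_validation(sex_dv)
ws.add_data_validation(age_dv)
sex_dv.add("B2:B1000")                      # restrict sex column to M/F
age_dv.add("C2:C1000")                      # restrict age column to 0-120
wb.save("data_entry.xlsx")
```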
DEVELOPMENT AND VALIDATION OF AN AIR-TO-BEEF ...
A model for predicting concentrations of dioxin-like compounds in beef is developed and tested. The key premise of the model is that concentrations of these compounds in air are the source term, or starting point, for estimating beef concentrations. Vapor-phase concentrations transfer to the vegetation cattle consume, and particle-bound concentrations deposit onto soils and this vegetation as well. Congener-specific bioconcentration parameters, coupled with assumptions on cattle diet, transform soil and vegetative concentrations into beef fat concentrations. The premise of the validation exercise is that a profile of typical air concentrations of dioxin-like compounds in a United States rural environment is an appropriate observed independent data set, and that a representative profile of United States beef concentrations of dioxin-like compounds is an appropriate observed dependent result. These data were developed for the validation exercise. An observed concentration of dioxin toxic equivalents in whole beef of 0.48 ng/kg is compared with a predicted 0.36 ng/kg. Principal uncertainties in the approach are identified and discussed. A major finding of this exercise was that vapor-phase transfers of dioxin-like compounds to the vegetation that cattle consume dominate the estimation of final beef concentrations: over 80% of the modeled beef concentration was attributed to such transfers.
Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S
2011-05-20
Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling this are mainly commercial, closed source, expensive, and heavyweight, and there is a need for a flexible, lightweight, open system for handling plate design, validation and preparation of data. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate layout point-and-click interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally a plate report with all relevant calculated data, including dose-response curves. Brunn is capable of handling the data from microplate assays. It can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts by starting out from an earlier step in the plate layout design process saves time and cuts down on error sources.
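Dose-response fitting of the kind Brunn automates is typically a four-parameter logistic. A minimal sketch (Python/SciPy; the plate readings are invented, and this is not Brunn's own code) that recovers an IC50:

    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, bottom, top, ic50, hill):
        # Standard 4-parameter logistic dose-response curve
        return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

    conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # µM, hypothetical
    resp = np.array([98.0, 90.0, 55.0, 12.0, 3.0])   # % response, hypothetical

    popt, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 1.0, 1.0])
    print(f"IC50 ≈ {popt[2]:.2f} µM")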
Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.
2017-01-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
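To make the XML encoding concrete, here is a minimal sketch (Python standard library) that emits the skeleton of an SBML Level 3 Version 1 document; the element and attribute names follow the specification, though real applications would normally use libsbml, which also applies the validation rules mentioned above:

    import xml.etree.ElementTree as ET

    NS = "http://www.sbml.org/sbml/level3/version1/core"
    sbml = ET.Element("sbml", xmlns=NS, level="3", version="1")
    model = ET.SubElement(sbml, "model", id="example")

    comps = ET.SubElement(model, "listOfCompartments")
    ET.SubElement(comps, "compartment", id="cell", constant="true")

    species = ET.SubElement(model, "listOfSpecies")
    ET.SubElement(species, "species", id="S1", compartment="cell",
                  hasOnlySubstanceUnits="false", boundaryCondition="false",
                  constant="false")

    print(ET.tostring(sbml, encoding="unicode"))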
Valerio, Stephane; Clark, Benjamin J.; Chan, Jeremy H. M.; Frost, Carlton P.; Harris, Mark J.; Taube, Jeffrey S.
2010-01-01
Previous studies have identified neurons throughout the rat limbic system that fire as a function of the animal's head direction (HD). This HD signal is particularly robust when rats locomote in the horizontal and vertical planes, but is severely attenuated when locomoting upside-down (Calton & Taube, 2005). Given the hypothesis that the HD signal represents an animal's sense of its directional heading, we evaluated whether rats could accurately navigate in an inverted (upside-down) orientation. The task required the animals to find an escape hole while locomoting inverted on a circular platform suspended from the ceiling. In Experiment 1, Long-Evans rats were trained to navigate to the escape hole by locomoting from either one or four start points. Interestingly, no animals from the 4-start-point group reached criterion, even after 30 days of training. Animals in the 1-start-point group reached criterion after about 6 training sessions. In Experiment 2, probe tests revealed that animals navigating from either 1 or 2 start points utilized distal visual landmarks for accurate orientation. However, subsequent probe tests revealed that their performance was markedly attenuated when required to navigate to the escape hole from a novel starting point. This absence of flexibility while navigating upside-down was confirmed in Experiment 3, where we show that the rats do not learn to reach a place, but instead learn separate trajectories to the target hole(s). Based on these results we argue that inverted navigation primarily involves a simple directional strategy based on visual landmarks. PMID:20109566
ERIC Educational Resources Information Center
Nicklas, Theresa A.; O'Neil, Carol E.; Stuff, Janice; Goodell, Lora Suzanne; Liu, Yan; Martin, Corby K.
2012-01-01
Objective: The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in "Head Start." Methods: Preschool children and their caregivers participated in validation (n = 22) and feasibility (n = 24) pilot studies. Validity was determined in the metabolic research unit using…
ERIC Educational Resources Information Center
Friedman, Dana E.
This brief paper was prepared as a starting point for employers considering the adoption of a new management initiative for working parents. It is not an exhaustive outline of all considerations in the decision-making process, nor does it provide solutions to all the known pitfalls. It does, however, suggest the potential scope and complexity of…
An Investigation of Starting Point Preferences in Human Performance on Traveling Salesman Problems
ERIC Educational Resources Information Center
MacGregor, James N.
2014-01-01
Previous studies have shown that people start traveling salesman problem tours significantly more often from boundary than from interior nodes. There are a number of possible reasons for such a tendency: first, it may arise as a direct result of the processes involved in tour construction; second, boundary points may be perceptually more salient than…
Sykes, Melissa L.; Jones, Amy J.; Shelper, Todd B.; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E.
2017-01-01
Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. PMID:28674055
Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M
2017-09-01
Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.
ERIC Educational Resources Information Center
Cohen, Arthur M.; Brawer, Florence B.; Kozeracki, Carol A.
This final report for the JumpStart III program presents a summary of the entrepreneurship training programs developed by each of the four JumpStart III partners selected in March 1997. Grants for the colleges totaled $354,546 over 2 years. The Jumpstart funding has been only a starting point for these and the other 12 Jumpstart partners in…
A test and re-estimation of Taylor's empirical capacity-reserve relationship
Long, K.R.
2009-01-01
In 1977, Taylor proposed a constant elasticity model relating capacity choice in mines to reserves. A test of this model using a very large (n = 1,195) dataset confirms its validity but obtains significantly different estimated values for the model coefficients. Capacity is somewhat inelastic with respect to reserves, with an elasticity of 0.65 estimated for open-pit plus block-cave underground mines and 0.56 for all other underground mines. These new estimates should be useful for capacity determinations in scoping studies and as a starting point for feasibility studies. The results are robust over a wide range of deposit types, deposit sizes, and time, consistent with physical constraints on mine capacity that are largely independent of technology. © 2009 International Association for Mathematical Geology.
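Taylor's relationship is a one-line power law, so applying the re-estimated coefficients is straightforward. A sketch (Python/NumPy; the constant k and the sample data are placeholders, since the abstract reports only the elasticities):

    import numpy as np

    def capacity(reserves, k, elasticity=0.65):
        # Constant-elasticity model: capacity = k * reserves**elasticity
        # (0.65 for open-pit plus block-cave mines, 0.56 for other underground)
        return k * reserves ** elasticity

    # Recovering an elasticity from (reserves, capacity) pairs: slope of a log-log fit
    reserves = np.array([1e6, 5e6, 2e7, 1e8])         # tonnes, hypothetical
    cap = np.array([900.0, 2500.0, 6500.0, 18000.0])  # t/day, hypothetical
    slope, intercept = np.polyfit(np.log(reserves), np.log(cap), 1)
    print(f"estimated elasticity ≈ {slope:.2f}")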
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherrer, Arne; UMR 8640 ENS-CNRS-UPMC, Département de Chimie, 24 rue Lhomond, École Normale Supérieure, 75005 Paris; UPMC Université Paris 06, 4, Place Jussieu, 75005 Paris
The nuclear velocity perturbation theory (NVPT) for vibrational circular dichroism (VCD) is derived from the exact factorization of the electron-nuclear wave function. This new formalism offers an exact starting point to include correction terms to the Born-Oppenheimer (BO) form of the molecular wave function, similar to the complete-adiabatic approximation. The corrections depend on a small parameter that, in a classical treatment of the nuclei, is identified as the nuclear velocity. Apart from proposing a rigorous basis for the NVPT, we show that the rotational strengths, related to the intensity of the VCD signal, contain a new contribution beyond-BO that can be evaluated with the NVPT and that only arises when the exact factorization approach is employed. Numerical results are presented for chiral and non-chiral systems to test the validity of the approach.
Program Model Checking: A Practitioner's Guide
NASA Technical Reports Server (NTRS)
Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.
2008-01-01
Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.
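At its core, explicit-state model checking is a reachability search over program states. The following toy sketch (Python; not taken from any tool covered by the guidebook) explores all interleavings of two threads doing an unsynchronized read-increment-write and returns a counterexample path when the invariant fails:

    from collections import deque

    def model_check(initial, successors, invariant):
        """Breadth-first search of the state space; returns a violating path or None."""
        frontier = deque([(initial, (initial,))])
        visited = {initial}
        while frontier:
            state, path = frontier.popleft()
            if not invariant(state):
                return path                      # counterexample trace
            for nxt in successors(state):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append((nxt, path + (nxt,)))
        return None                              # invariant holds on all reachable states

    # State: (counter, t1_local, t1_pc, t2_local, t2_pc); pc 0=read, 1=write, 2=done.
    def successors(s):
        c, l1, pc1, l2, pc2 = s
        out = []
        if pc1 == 0: out.append((c, c, 1, l2, pc2))       # thread 1 reads counter
        if pc1 == 1: out.append((l1 + 1, l1, 2, l2, pc2)) # thread 1 writes local+1
        if pc2 == 0: out.append((c, l1, pc1, c, 1))       # thread 2 reads counter
        if pc2 == 1: out.append((l2 + 1, l1, pc1, l2, 2)) # thread 2 writes local+1
        return out

    both_done_implies_two = lambda s: not (s[2] == 2 and s[4] == 2) or s[0] == 2
    print(model_check((0, 0, 0, 0, 0), successors, both_done_implies_two))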
Relating Land Use and Human Intra-City Mobility
Lee, Minjin; Holme, Petter
2015-01-01
Understanding human mobility patterns—how people move in their everyday lives—is an interdisciplinary research field. It is a question with roots back to the 19th century that has been dramatically revitalized with the recent increase in data availability. Models of human mobility often take the population distribution as a starting point. Another, sometimes more accurate, data source is land-use maps. In this paper, we discuss how the intra-city movement patterns, and consequently population distribution, can be predicted from such data sources. As a link between land use and mobility, we show that the purposes of people’s trips are strongly correlated with the land use of the trip’s origin and destination. We calibrate, validate and discuss our model using survey data. PMID:26445147
Anthranilate-Activating Modules from Fungal Nonribosomal Peptide Assembly Lines†
Ames, Brian D.; Walsh, Christopher T.
2010-01-01
Fungal natural products containing benzodiazepinone- and quinazolinone-fused ring systems can be assembled by nonribosomal peptide synthetases (NRPS) using the conformationally restricted β-amino acid anthranilate as one of the key building blocks. We validated that the first module of the acetylaszonalenin synthetase of Neosartorya fischeri NRRL 181 activates anthranilate to anthranilyl-AMP. With this as starting point, we then used bioinformatic predictions about fungal adenylation domain selectivities to identify and confirm an anthranilate-activating module in the fumiquinazoline A producer Aspergillus fumigatus Af293 as well as a second anthranilate-activating NRPS in N. fischeri. This establishes an anthranilate adenylation domain code for fungal NRPS and should facilitate detection and cloning of gene clusters for benzodiazepine- and quinazoline-containing polycyclic alkaloids with a wide range of biological activities. PMID:20225828
Althof, Stanley E; Brock, Gerald B; Rosen, Raymond C; Rowland, David L; Aquilina, Joseph W; Rothman, Margaret; Tesfaye, Fisseha; Bull, Scott
2010-06-01
The Clinical Global Impression of Change (CGIC) measures have high utility in clinical practice. However, it is unknown whether the CGIC is valid for assessing premature ejaculation (PE) symptoms and/or how CGIC ratings relate to other validated PE patient-reported measures. The study aims to assess the validity of the patient-reported CGIC measure in men with PE and to examine the relationship between CGIC ratings and assessments of control, satisfaction, personal distress, and interpersonal difficulty. Data from a randomized, double-blind, 24-week phase 3 trial in 1,162 men with PE who received dapoxetine (30 mg or 60 mg) or placebo on demand provided the basis for the analysis. Patients were ≥18 years, in a stable monogamous relationship for ≥6 months, met the Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition-Text Revision criteria for PE for ≥6 months, and had an intravaginal ejaculatory latency time (IELT) ≤2 minutes in ≥75% of intercourse episodes. The CGIC asked patients to rate improvement or worsening of their PE compared with the start of the study using a 7-point response scale; other patient-reported measures were control over ejaculation, satisfaction with sexual intercourse, interpersonal difficulty, and personal distress related to ejaculation. Stopwatch-measured IELT was recorded. Associations between CGIC and change in other measures at study end point were assessed. The magnitude of the IELT increase grew with each category of improvement on the CGIC: 1.63, 4.03, and 7.15 minutes for slightly better, better, and much better, respectively. Higher CGIC ratings were correlated with greater improvement in control (r = 0.73) and satisfaction (r = 0.62), and with greater reductions in distress (r = -0.52) and interpersonal difficulty (r = -0.39). Total variance accounted for was 57.4%: control (48.7%), satisfaction (4.5%), IELT (2.8%), and distress (1.15%). The analyses support the validity of the CGIC measure in men with PE. The CGIC can provide clinicians in practice with a valid and brief outcome assessment of their patient's condition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraga, Carlos G.; Bronk, Krys; Dockendorff, Brian P.
Chemical attribution signatures (CAS) are being investigated for the sourcing of chemical warfare (CW) agents and their starting materials that may be implicated in chemical attacks or CW proliferation. The work reported here demonstrates for the first time trace impurities produced during the synthesis of tris(2-chloroethyl)amine (HN3) that point to specific reagent stocks used in the synthesis of this CW agent. Thirty batches of HN3 were synthesized using different combinations of commercial stocks of triethanolamine (TEA), thionyl chloride, chloroform, and acetone. The HN3 batches and reagent stocks were then analyzed for impurities by gas chromatography/mass spectrometry. Reaction-produced impurities indicative of specific TEA and chloroform stocks were exclusively discovered in HN3 batches made with those reagent stocks. In addition, some reagent impurities were found in the HN3 batches that were presumably not altered during synthesis and believed to be indicative of reagent type regardless of stock. Supervised classification using partial least squares discriminant analysis (PLSDA) on the impurity profiles of chloroform samples from seven stocks resulted in an average classification error by cross-validation of 2.4%. A classification error of zero was obtained using the seven-stock PLSDA model on a validation set of samples from an arbitrarily selected chloroform stock. In a separate analysis, all samples from two of seven chloroform stocks that were purposely not modeled had their samples matched to a chloroform stock rather than assigned a "no class" classification.
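PLS-DA with cross-validation, as applied to the chloroform impurity profiles, can be prototyped in a few lines. A sketch (Python/scikit-learn, which has no dedicated PLS-DA class; the usual workaround regresses one-hot class labels with PLSRegression and assigns each sample to the class with the largest predicted score; the data below are synthetic):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)
    X = rng.normal(size=(70, 40))       # impurity profiles (samples x peaks), hypothetical
    y = np.repeat(np.arange(7), 10)     # 7 reagent stocks, 10 samples each
    Y = np.eye(7)[y]                    # one-hot class membership

    errors = 0
    for train, test in KFold(n_splits=7, shuffle=True, random_state=0).split(X):
        pls = PLSRegression(n_components=5).fit(X[train], Y[train])
        pred = pls.predict(X[test]).argmax(axis=1)  # class with largest predicted score
        errors += (pred != y[test]).sum()
    print(f"cross-validated classification error: {100 * errors / len(y):.1f}%")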
van den Berg, Margo J; Wu, Lora J; Gander, Philippa H
This study examined whether subjective measurements of in-flight sleep could be a reliable alternative to actigraphic measurements for monitoring pilot fatigue in a large-scale survey. Pilots (3-pilot crews) completed a 1-page survey on outbound and inbound long-haul flights crossing 1-7 time zones (N = 586 surveys) between 53 city pairs with 1-d layovers. Across each flight, pilots documented flight start and end times, break times, and in-flight sleep duration and quality if they attempted sleep. They also rated their fatigue (Samn-Perelli Crew Status Check) and sleepiness (Karolinska Sleepiness Scale) at top of descent (TOD). Mixed model ANCOVA was used to identify independent factors associated with sleep duration, quality, and TOD measures. Domicile time was used as a surrogate measure of circadian phase. Sleep duration increased by 10.2 min for every 1-h increase in flight duration. Sleep duration and quality varied by break start time, with significantly more sleep obtained during breaks starting between (domicile) 22:00-01:59 and 02:00-05:59 compared to earlier breaks. Pilots were more fatigued and sleepy at TOD on flights arriving between 02:00-05:59 and 06:00-09:59 domicile time compared to other flights. With every 1-h increase in sleep duration, sleepiness ratings at TOD decreased by 0.6 points and fatigue ratings decreased by 0.4 points. The present findings are consistent with previous actigraphic studies, suggesting that self-reported sleep duration is a reliable alternative to actigraphic sleep in this type of study, with use of validated measures, sufficiently large sample sizes, and where fatigue risk is expected to be low. van den Berg MJ, Wu LJ, Gander PH. Subjective measurements of in-flight sleep, circadian variation, and their relationship with fatigue. Aerosp Med Hum Perform. 2016; 87(10):869-875.
Chen, Derek E; Willick, Darryl L; Ruckel, Joseph B; Floriano, Wely B
2015-01-01
Directed evolution is a technique that enables the identification of mutants of a particular protein that carry a desired property by successive rounds of random mutagenesis, screening, and selection. This technique has many applications, including the development of G protein-coupled receptor-based biosensors and designer drugs for personalized medicine. Although effective, directed evolution is not without challenges and can greatly benefit from the development of computational techniques to predict the functional outcome of single-point amino acid substitutions. In this article, we describe a molecular dynamics-based approach to predict the effects of single amino acid substitutions on agonist binding (salicin) to a human bitter taste receptor (hT2R16). An experimentally determined functional map of single-point amino acid substitutions was used to validate the whole-protein molecular dynamics-based predictive functions. Molecular docking was used to construct a wild-type agonist-receptor complex, providing a starting structure for single-point substitution simulations. The effects of each single amino acid substitution in the functional response of the receptor to its agonist were estimated using three binding energy schemes with increasing inclusion of solvation effects. We show that molecular docking combined with molecular mechanics simulations of single-point mutants of the agonist-receptor complex accurately predicts the functional outcome of single amino acid substitutions in a human bitter taste receptor.
Compensatable muon collider calorimeter with manageable backgrounds
Raja, Rajendran
2015-02-17
A method and system for reducing background noise in a particle collider comprises identifying an interaction point among a plurality of particles within a particle collider associated with a detector element, defining a trigger start time for each of the pixels as the time taken for light to travel from the interaction point to the pixel and a trigger stop time as a selected time after the trigger start time, and collecting only detections that occur between the start trigger time and the stop trigger time in order to thereafter compensate the result from the particle collider to reduce unwanted background detection.
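The claimed trigger rule reduces to a per-pixel light-travel-time computation. A minimal sketch (Python; the gate width and geometry are invented for illustration):

    import math

    C = 299_792_458.0  # speed of light, m/s

    def trigger_window(interaction_xyz, pixel_xyz, gate_ns=2.0):
        """Start = light travel time from the interaction point to the pixel;
        stop = start plus a fixed gate. Detections outside are rejected."""
        dist = math.dist(interaction_xyz, pixel_xyz)
        t_start = dist / C
        return t_start, t_start + gate_ns * 1e-9

    def accept(hit_time, interaction_xyz, pixel_xyz):
        t0, t1 = trigger_window(interaction_xyz, pixel_xyz)
        return t0 <= hit_time <= t1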
Modelling of polymer photodegradation for solar cell modules
NASA Technical Reports Server (NTRS)
Somersall, A. C.; Guillet, J. E.
1981-01-01
A computer program developed to model and calculate, by numerical integration, the varying concentrations of chemical species formed during photooxidation of a polymeric material over time is evaluated; its inputs are a chosen set of elementary reactions, the corresponding rate constants, and a convenient set of starting conditions. Attempts were made to validate the proposed mechanism by experimentally monitoring the photooxidation products of small liquid alkanes, which are useful starting models for the ethylene segments of polymers like EVA. The model system proved inappropriate for the intended purposes. Another validation model is recommended.
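The numerical integration described can be reproduced with any ODE solver. A toy sketch (Python/SciPy; the two-step scheme and rate constants are invented and are not the paper's mechanism):

    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 1e-3, 5e-2   # hypothetical rate constants (1/s)

    def rhs(t, y):
        rh, rad, prod = y            # polymer sites, radicals, oxidation products
        return [-k1 * rh,            # initiation consumes RH under irradiation
                k1 * rh - k2 * rad,  # radicals form, then react on
                k2 * rad]            # stable products accumulate

    sol = solve_ivp(rhs, (0.0, 3600.0), [1.0, 0.0, 0.0], method="LSODA",
                    t_eval=np.linspace(0.0, 3600.0, 7))
    print(sol.y[2])  # product concentration at the sampled times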
Combinatorial and High Throughput Discovery of High Temperature Piezoelectric Ceramics
2011-10-10
the known candidate piezoelectric ferroelectric perovskites. Unlike most computational studies on crystal chemistry, where the starting point is some form of electronic structure calculation, we use a data driven approach to initiate our...experimental measurements reported in the literature. Given that our models are based solely on crystal and electronic structure data and did not
Uljevic, Ognjen; Spasic, Miodrag; Sekulic, Damir
2013-01-01
Sport-specific motor fitness tests are not often examined in water polo. In this study we examined the reliability, factorial and discriminative validity of 10 water-polo-specific motor-fitness tests, namely: three tests of in-water jumps (thrusts), two characteristic swimming sprints (10 and 20 metres from the water start), three ball-throws (shoots), one test of passing precision (accuracy), and a test of the dynamometric force produced while using the eggbeater kick. The sample of subjects consisted of 54 young male water polo players (15 to 17 years of age; 1.86 ± 0.07 m, and 83.1 ± 9.9 kg). All tests were applied over three testing trials. Reliability analyses included Cronbach Alpha coefficients (CA), inter-item correlations (IIR) and coefficients of variation (CV), while an analysis of variance was used to define any systematic bias between the testing trials. All tests except the test of accuracy (precision) were found to be reliable (CA ranged from 0.83 to 0.97; IIR from 0.62 to 0.91; CV from 2% to 21%), with small and irregular biases between the testing trials. Factor analysis revealed that jumping capacities as well as throwing and sprinting capacities should be observed as relatively independent latent dimensions among young water polo players. Discriminative validity of the applied tests is partially proven, since the playing positions significantly (p < 0.05) differed in some of the applied tests, with the points being superior in their fitness capacities in comparison to their teammates. This study included players from one of the world's best junior national leagues, and the reported values could be used as fitness standards for this age. Further studies are needed to examine the applicability of the proposed test procedures to older subjects and females. Key Points: The sport-specific water polo motor fitness tests presented and validated here were found to be reliable in a sample of young male water polo players. Factor analysis revealed the existence of three independent latent motor dimensions, namely in-water jumping capacity, throwing ability, and sprint swimming capacity. Points were found to be the most advanced in their fitness capacities, which is mainly related to their game duties, which have allowed them to develop a variety of fitness components. PMID:24421723
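The reliability statistics reported here are simple to compute directly. A sketch (Python/NumPy; the score matrix is hypothetical) of Cronbach's alpha and the mean coefficient of variation over three testing trials:

    import numpy as np

    def cronbach_alpha(trials):
        """trials: (n_subjects, k_trials) array of scores."""
        trials = np.asarray(trials, dtype=float)
        k = trials.shape[1]
        item_var = trials.var(axis=0, ddof=1).sum()
        total_var = trials.sum(axis=1).var(ddof=1)
        return k / (k - 1.0) * (1.0 - item_var / total_var)

    scores = np.array([[4.1, 4.3, 4.2],   # one row per player, hypothetical jump scores
                       [3.6, 3.5, 3.8],
                       [4.8, 4.9, 4.7],
                       [3.9, 4.0, 4.1]])
    cv = (scores.std(axis=1, ddof=1) / scores.mean(axis=1)).mean() * 100
    print(f"CA = {cronbach_alpha(scores):.2f}, mean CV = {cv:.1f}%")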
Fractional spectral and pseudo-spectral methods in unbounded domains: Theory and applications
NASA Astrophysics Data System (ADS)
Khosravian-Arab, Hassan; Dehghan, Mehdi; Eslahchi, M. R.
2017-06-01
This paper is intended to provide exponentially accurate Galerkin, Petrov-Galerkin and pseudo-spectral methods for fractional differential equations on a semi-infinite interval. We start our discussion by introducing two new non-classical Lagrange basis functions: NLBFs-1 and NLBFs-2, which are based on the two new families of the associated Laguerre polynomials: GALFs-1 and GALFs-2, obtained recently by the authors in [28]. With respect to the NLBFs-1 and NLBFs-2, two new non-classical interpolants based on the associated Laguerre-Gauss and Laguerre-Gauss-Radau points are introduced and then fractional (pseudo-spectral) differentiation (and integration) matrices are derived. Convergence and stability of the new interpolants are proved in detail. Several numerical examples are considered to demonstrate the validity and applicability of the basis functions to approximate fractional derivatives (and integrals) of some functions. Moreover, the pseudo-spectral, Galerkin and Petrov-Galerkin methods are successfully applied to solve some physical ordinary differential equations of either fractional or integer order. Some useful comments from the numerical point of view on the Galerkin and Petrov-Galerkin methods are listed at the end.
NASA Astrophysics Data System (ADS)
van Veenstra, Anne Fleur; Janssen, Marijn
One of the main challenges for e-government is to create coherent services for citizens and businesses. Realizing Integrated Service Delivery (ISD) requires government agencies to collaborate across their organizational boundaries. The coordination of processes across multiple organizations to realize ISD is called orchestration. One way of achieving orchestration is to formalize processes using architecture. In this chapter we identify architectural principles for orchestration by looking at three case studies of cross-organizational service delivery chain formation in the Netherlands. In total, six generic principles were formulated and subsequently validated in two workshops with experts. These principles are: (i) build an intelligent front office, (ii) give processes a clear starting point and end, (iii) build a central workflow application keeping track of the process, (iv) differentiate between simple and complex processes, (v) ensure that the decision-making responsibility and the overview of the process are not performed by the same process role, and (vi) create a central point where risk profiles are maintained. Further research should focus on how organizations can adapt these principles to their own situation.
Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study.
Ogilvie, Emily; McCrudden, Matthew T
2017-09-01
An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents viewed (a) the ESDM goals as appropriate for their children, (b) the intervention procedures as acceptable and appropriate, and (c) changes in their children's behavior as practically significant. Parents of four children who participated in the ESDM completed the TARF-R questionnaire and participated in a semi-structured interview. Both data sets indicated that parents rated their experiences with the ESDM positively and rated it as socially valid. The findings indicated that what was implemented in the intervention is complemented by how it was implemented and by whom.
Developing workshop module of realistic mathematics education: Follow-up workshop
NASA Astrophysics Data System (ADS)
Palupi, E. L. W.; Khabibah, S.
2018-01-01
Realistic Mathematics Education (RME) is a learning approach which fits the aim of the curriculum. The success of RME in teaching mathematics concepts, triggering students' interest in mathematics and teaching high-order thinking skills will make teachers start to learn RME. Hence, RME workshops are often offered and conducted. This study applied the development model proposed by Plomp. Based on the study by the RME team, there are three kinds of RME workshop: start-up workshop, follow-up workshop, and quality boost. However, there is no standardized or validated module used in those workshops. This study aims to develop a module for the RME follow-up workshop which is valid and usable. Plomp's development model includes materials analysis, design, realization, implementation, and evaluation. Based on the validation, the developed module is valid, and field testing shows that the module can be used effectively.
He, Pei
2014-07-01
The advancements in biotechnology and genetics have led to an increasing research interest in personalized medicine, where a patient's genetic profile or biological traits contribute to choosing the most effective treatment for the patient. The process starts with finding a specific biomarker among all possible candidates that can best predict the treatment effect. After a biomarker is chosen, identifying a cut point of the biomarker value that splits the patients into treatment-effective and non-effective subgroups becomes an important scientific problem. Numerous methods have been proposed to validate the predictive marker and select the appropriate cut points either prospectively or retrospectively using clinical trial data. In trials with survival outcomes, the current practice applies an interaction testing procedure and chooses the cut point that minimizes the p-values for the tests. This method assumes independence between the baseline hazard and the biomarker value. In reality, however, this assumption is often violated, as the chosen biomarker might also be prognostic in addition to its predictive nature for treatment effect. In this paper we propose a block-wise estimation and sequential testing approach to identify the cut point in biomarkers that can group the patients into subsets based on their distinct treatment outcomes without assuming independence between the biomarker and the baseline hazard. Numerical results based on simulated survival data show that the proposed method can accurately pinpoint the cut points in biomarker values that separate the patient population into subgroups with distinctive treatment outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.
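The "current practice" baseline that the paper improves on, scanning candidate cut points for the one minimizing the interaction p-value in a Cox model, can be sketched as follows (Python with the lifelines library; the column names are assumptions, and this is the baseline procedure, not the proposed block-wise method):

    from lifelines import CoxPHFitter

    def scan_cutpoints(df, biomarker, candidates):
        """Return (cut, p) minimizing the treatment-by-marker interaction p-value.
        df needs columns: time, event, treatment (0/1), and the biomarker."""
        best = None
        for cut in candidates:
            d = df[["time", "event", "treatment"]].copy()
            d["above"] = (df[biomarker] > cut).astype(int)
            d["interaction"] = d["treatment"] * d["above"]
            cph = CoxPHFitter().fit(d, duration_col="time", event_col="event")
            p = cph.summary.loc["interaction", "p"]
            if best is None or p < best[1]:
                best = (cut, p)
        return best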
NASA Astrophysics Data System (ADS)
Quattrini, R.; Battini, C.; Mammoli, R.
2018-05-01
Recent years have seen an increasing availability of HBIM models that are rich in geometric and informative terms. There is still, however, a lack of research implementing dedicated libraries, based on parametric intelligence and semantic awareness, related to the architectural heritage. Additional challenges come from their portability in non-desktop environments (such as VR). This research article demonstrates the validity of a workflow applied to the architectural heritage which, starting from semantic modeling, reaches visualization in a virtual reality environment, passing through the necessary phases of export, data migration and management. The three-dimensional modeling of the classical Doric order takes place in the BIM work environment and is configured as a necessary starting point for the implementation of data, parametric intelligence and the definition of ontologies that exclusively qualify the model. The study also enables an effective method for data migration from the BIM model to databases integrated into VR technologies for AH. Furthermore, the process proposes a methodology, applicable in a return path, suited to achieving appropriate data enrichment of each model and to enabling interaction with the model in a VR environment.
NASA Astrophysics Data System (ADS)
Colasante, Annarita
2017-02-01
This paper presents an investigation of cooperation in a Public Good Game using an Agent Based Model calibrated on experimental data. Starting from the experiment proposed in Colasante and Russo (2016), we analyze the dynamics of cooperation in a Public Good Game where agents receive a heterogeneous income and choose both the level of contribution and the distribution rule. The starting point is the calibration and output validation of the model using the experimental results. Having tested the goodness of fit of the Agent Based Model, we run some policy experiments in order to verify how each distribution rule, i.e. equidistribution, proportional to contribution and progressive, affects the level of contribution in the simulated model. We find that the share of cooperators decreases over time if we exogenously set the equidistribution rule. On the contrary, the share of cooperators converges to 100% if we impose the progressive rule. Finally, the most interesting result refers to the effect of the progressive rule: we observe that, in the case of high inequality, this rule is not able to reduce the heterogeneity of income.
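A stripped-down version of such a model fits in a few lines. The sketch below (Python; the multiplier, the imitation rule and the income levels are invented, and the endogenous choice of distribution rule in the actual model is omitted) shows the skeleton of heterogeneous endowments, a distribution rule and imitation dynamics:

    import random

    def round_payoffs(endow, contrib, rule, multiplier=1.6):
        pot = multiplier * sum(contrib)
        n = len(endow)
        if rule == "equal":           # equidistribution
            share = [pot / n] * n
        elif rule == "proportional":  # proportional to contribution
            total = sum(contrib) or 1.0
            share = [pot * c / total for c in contrib]
        else:                         # "progressive": inversely weighted by income
            inv = [1.0 / e for e in endow]
            share = [pot * w / sum(inv) for w in inv]
        return [e - c + s for e, c, s in zip(endow, contrib, share)]

    random.seed(1)
    endow = [random.choice([10.0, 20.0, 40.0]) for _ in range(20)]
    frac = [random.random() for _ in endow]   # contributed fraction of income
    for t in range(50):
        contrib = [f * e for f, e in zip(frac, endow)]
        pay = round_payoffs(endow, contrib, rule="progressive")
        i, j = random.randrange(20), random.randrange(20)
        if pay[j] > pay[i]:                   # imitate a better-performing agent
            frac[i] = frac[j]
    print(f"mean contributed fraction after 50 rounds: {sum(frac)/len(frac):.2f}")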
Where Do I Start (Beginning the Investigation)?
NASA Astrophysics Data System (ADS)
Kornacki, Jeffrey L.
No doubt some will open directly to this chapter, because your product is contaminated with an undesirable microbe, or perhaps you have been asked to do such an investigation for another company's facility not previously observed by you and naturally you want tips on how to find where the contaminant is getting into the product stream. This chapter takes the reader through the process of beginning the investigation including understanding the process including the production schedule and critically reviewing previously generated laboratory data. Understanding the critical control points and validity of their critical limits is also important. Scoping the extent of the problem is next. It is always a good idea for the factory to have a rigorously validated cleaning and sanitation procedure that provides a documented "sanitation breakpoint," which can be useful in the "scoping" process, although some contamination events may extend past these "break-points." Touring the facility is next wherein preliminary pre-selection of areas for future sampling can be done. Operational samples and observations in non-food contact areas can be taken at this time. Then the operations personnel need to be consulted and plans made for an appropriate amount of time to observe equipment break down for "post-operational" sampling and "pre-operational" investigational sampling. Hence the chapter further discusses preparing operations personnel for the disruptions that go along with these investigations and assembling the sampling team. The chapter concludes with a discussion of post-startup observations after an investigation and sampling.
Presence capture cameras - a new challenge to the image quality
NASA Astrophysics Data System (ADS)
Peltoketo, Veli-Tapani
2016-04-01
Commercial presence capture cameras are coming to the market, and a new era of visual entertainment is starting to take shape. Since true presence capture is still a very new technology, the technical solutions have only just passed the prototyping phase and they vary a lot. This work concentrates on the quality challenges of presence capture cameras, which face the same quality issues as previous phases of digital imaging but also numerous new ones. A camera system which can record 3D audio-visual reality as it is has to have several camera modules, several microphones and, especially, technology which can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features are still valid in presence capture cameras. Features like color fidelity, noise removal, resolution and dynamic range form the basis of virtual reality stream quality. However, the co-operation of several cameras brings a new dimension to these quality factors. New quality features can also be validated: for example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? The work describes the quality factors which are still valid in presence capture cameras and defines their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work contains considerations of how well current measurement methods can be used with presence capture cameras.
Predictors of early growth in academic achievement: the head-toes-knees-shoulders task
McClelland, Megan M.; Cameron, Claire E.; Duncan, Robert; Bowles, Ryan P.; Acock, Alan C.; Miao, Alicia; Pratt, Megan E.
2014-01-01
Children's behavioral self-regulation and executive function (EF; including attentional or cognitive flexibility, working memory, and inhibitory control) are strong predictors of academic achievement. The present study examined the psychometric properties of a measure of behavioral self-regulation called the Head-Toes-Knees-Shoulders (HTKS) by assessing construct validity, including relations to EF measures, and predictive validity to academic achievement growth between prekindergarten and kindergarten. In the fall and spring of prekindergarten and kindergarten, 208 children (51% enrolled in Head Start) were assessed on the HTKS, measures of cognitive flexibility, working memory (WM), and inhibitory control, and measures of emergent literacy, mathematics, and vocabulary. For construct validity, the HTKS was significantly related to cognitive flexibility, working memory, and inhibitory control in prekindergarten and kindergarten. For predictive validity in prekindergarten, a random effects model indicated that the HTKS significantly predicted growth in mathematics, whereas a cognitive flexibility task significantly predicted growth in mathematics and vocabulary. In kindergarten, the HTKS was the only measure to significantly predict growth in all academic outcomes. An alternative conservative analytical approach, a fixed effects analysis (FEA) model, also indicated that growth in both the HTKS and measures of EF significantly predicted growth in mathematics over four time points between prekindergarten and kindergarten. Results demonstrate that the HTKS involves cognitive flexibility, working memory, and inhibitory control, and is substantively implicated in early achievement, with the strongest relations found for growth in achievement during kindergarten and associations with emergent mathematics. PMID:25071619
Hayashi, Shuji; Yamada, Hirotsugu; Bando, Mika; Saijo, Yoshihito; Nishio, Susumu; Hirata, Yukina; Klein, Allan L; Sata, Masataka
2015-08-01
Left atrial (LA) strain analysis using speckle tracking echocardiography is useful for assessing LA function. However, there is no established procedure for this method. Most investigators have determined the electrocardiographic R-wave peak as the starting point for LA strain analysis. To test our hypothesis that P-wave onset should be used as the starting point, we measured LA strain using 2 different starting points and compared the strain values with the corresponding LA volume indices obtained by three-dimensional (3D) echocardiography. We enrolled 78 subjects (61 ± 17 years, 25 males) with and without various cardiac diseases in this study and assessed global longitudinal LA strain by two-dimensional speckle tracking strain echocardiography using EchoPac software. We used either R-wave peak or P-wave onset as the starting point for determining LA strains during the reservoir (Rres, Pres), conduit (Rcon, Pcon), and booster pump (Rpump, Ppump) phases. We determined the maximum, minimum, and preatrial contraction LA volumes, and calculated the LA total, passive, and active emptying fractions using 3D echocardiography. The correlation between Pres and LA total emptying fraction was better than the correlation between Rres and LA total emptying fraction (r = 0.458 vs. 0.308, P = 0.026). Pcon and Ppump exhibited better correlation with the corresponding 3D echocardiographic parameters than Rcon (r = 0.560 vs. 0.479, P = 0.133) and Rpump (r = 0.577 vs. 0.345, P = 0.003), respectively. LA strain in any phase should be analyzed using P-wave onset as the starting point rather than R-wave peak. © 2014, Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
van Setten, M. J.; Giantomassi, M.; Gonze, X.; Rignanese, G.-M.; Hautier, G.
2017-10-01
The search for new materials based on computational screening relies on methods that accurately predict, in an automatic manner, total energy, atomic-scale geometries, and other fundamental characteristics of materials. Many technologically important material properties directly stem from the electronic structure of a material, but the usual workhorse for total energies, namely density-functional theory, is plagued by fundamental shortcomings and errors from approximate exchange-correlation functionals in its prediction of the electronic structure. At variance, the GW method is currently the state-of-the-art ab initio approach for accurate electronic structure. It is mostly used to perturbatively correct density-functional theory results, but is, however, computationally demanding and also requires expert knowledge to give accurate results. Accordingly, it is not presently used in high-throughput screening: fully automatized algorithms for setting up the calculations and determining convergence are lacking. In this paper, we develop such a method and, as a first application, use it to validate the accuracy of G0W0 using the PBE starting point and the Godby-Needs plasmon-pole model (G0W0(GN)@PBE) on a set of about 80 solids. The results of the automatic convergence study utilized provide valuable insights. Indeed, we find correlations between computational parameters that can be used to further improve the automatization of GW calculations. Moreover, we find that G0W0(GN)@PBE shows a correlation between the PBE and the G0W0(GN)@PBE gaps that is much stronger than that between GW and experimental gaps. However, the G0W0(GN)@PBE gaps still describe the experimental gaps more accurately than a linear model based on the PBE gaps. With this paper, we hence show that GW can be made automatic and is more accurate than using an empirical correction of the PBE gap, but that, for accurate predictive results for a broad class of materials, an improved starting point or some type of self-consistency is necessary.
Automatic derivation of natural and artificial lineaments from ALS point clouds in floodplains
NASA Astrophysics Data System (ADS)
Mandlburger, G.; Briese, C.
2009-04-01
Water flow is one of the most important driving forces in geomorphology and river systems have ever since formed our landscapes. With increasing urbanisation fertile flood plains were more and more cultivated and the defence of valuable settlement areas by dikes and dams became an important issue. Today, we are dealing with landscapes built up by natural as well as man-made artificial forces. In either case the general shape of the terrain can be portrayed by lineaments representing discontinuities of the terrain slope. Our contribution, therefore, presents an automatic method for delineating natural and artificial structure lines based on randomly distributed point data with high density of more than one point/m2. Preferably, the last echoes of airborne laser scanning (ALS) point clouds are used, since the laser signal is able to penetrate vegetation through small gaps in the foliage. Alternatively, point clouds from (multi) image matching can be employed, but poor ground point coverage in vegetated areas is often the limiting factor. Our approach is divided into three main steps: First, potential 2D start segments are detected by analyzing the surface curvature in the vicinity of each data point, second, the detailed 3D progression of each structure line is modelled patch-wise by intersecting surface pairs (e.g. planar patch pairs) based on the detected start segments and by performing line growing and, finally, post-processing like line cleaning, smoothing and networking is carried out in a last step. For the initial detection of start segments a best fitting two dimensional polynomial surface (quadric) is computed in each data point based on a set of neighbouring points, from which the minimum and maximum curvature is derived. Patches showing high maximum and low minimum curvatures indicate linear discontinuities in the surface slope and serve as start segments for the subsequent 3D modelling. Based on the 2D location and orientation of the start segments, surface patches can be identified as to the left or the right of the structure line. For each patch pair the intersection line is determined by least squares adjustment. The stochastic model considers the planimetric accuracy of the start segments, and the vertical measurement errors in the data points. A robust estimation approach is embedded in the patch adjustment for elimination of off-terrain ALS last echo points. Starting from an initial patch pair, structure line modelling is continued in forward and backward direction as long as certain thresholds (e.g. minimum surface intersection angles) are fulfilled. In the final post-processing step the resulting line set is cleaned by connecting corresponding line parts, by removing short line strings of minor relevance, and by thinning the resulting line set with respect to a certain approximation tolerance in order to reduce the amount of line data. Thus, interactive human verification and editing is limited to a minimum. In a real-world example structure lines were computed for a section of the river Main (ALS, last echoes, 4 points/m2) demonstrating the high potential of the proposed method with respect to accuracy and completeness. Terrestrial control measurements have confirmed the high accuracy expectations both in planimetry (<0.4m) and height (<0.2m).
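The first step, flagging candidate start segments from the principal curvatures of a locally fitted quadric, can be sketched directly (Python/NumPy; neighbourhood selection and the thresholds are placeholders, and the surface is treated as a Monge patch z = f(x, y)):

    import numpy as np

    def principal_curvatures(neigh):
        """neigh: (n, 3) array of points translated so the query point is the origin.
        Fits z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f and returns the principal
        curvatures at the origin, ordered by magnitude (small, large)."""
        x, y, z = neigh[:, 0], neigh[:, 1], neigh[:, 2]
        A = np.column_stack([x**2, y**2, x*y, x, y, np.ones_like(x)])
        a, b, c, d, e, _ = np.linalg.lstsq(A, z, rcond=None)[0]
        fx, fy, fxx, fyy, fxy = d, e, 2*a, 2*b, c
        g = 1.0 + fx**2 + fy**2
        K = (fxx*fyy - fxy**2) / g**2                                   # Gaussian curvature
        H = ((1+fy**2)*fxx - 2*fx*fy*fxy + (1+fx**2)*fyy) / (2*g**1.5)  # mean curvature
        disc = max(H**2 - K, 0.0) ** 0.5
        k_small, k_large = sorted((H - disc, H + disc), key=abs)
        return k_small, k_large

    def is_start_segment(neigh, k_hi=0.5, k_lo=0.1):
        # high maximum and low minimum curvature indicate a slope discontinuity
        k_small, k_large = principal_curvatures(neigh)
        return abs(k_large) > k_hi and abs(k_small) < k_lo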
Modelling and Predicting Backstroke Start Performance Using Non-Linear and Linear Models.
de Jesus, Karla; Ayala, Helon V H; de Jesus, Kelly; Coelho, Leandro Dos S; Medeiros, Alexandre I A; Abraldes, José A; Vaz, Mário A P; Fernandes, Ricardo J; Vilas-Boas, João Paulo
2018-03-01
Our aim was to compare non-linear and linear mathematical model responses for backstroke start performance prediction. Ten swimmers randomly completed eight 15 m backstroke starts with feet over the wedge, four with hands on the highest horizontal handgrip and four on the vertical handgrip. Swimmers were videotaped using a dual-media camera set-up, with the starts being performed over an instrumented block with four force plates. Artificial neural networks were applied to predict the 5 m start time from kinematic and kinetic variables, with accuracy assessed by the mean absolute percentage error. Artificial neural networks predicted start time more robustly than the linear model with respect to changing the training to the validation dataset for the vertical handgrip (3.95 ± 1.67 vs. 5.92 ± 3.27%). Artificial neural networks obtained a smaller mean absolute percentage error than the linear model in the horizontal (0.43 ± 0.19 vs. 0.98 ± 0.19%) and vertical handgrip conditions (0.45 ± 0.19 vs. 1.38 ± 0.30%) using all input data. The best artificial neural network validation revealed a smaller mean absolute error than the linear model for the horizontal (0.007 vs. 0.04 s) and vertical handgrip (0.01 vs. 0.03 s). Artificial neural networks should be used for backstroke 5 m start time prediction given the quite small differences among elite-level performances.
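For readers who want to reproduce the comparison in spirit, the sketch below contrasts a small neural network with a linear model on synthetic data, scored by the mean absolute percentage error. It assumes scikit-learn (0.24 or later for `mean_absolute_percentage_error`) and fabricated stand-in variables, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))                      # stand-in kinematic/kinetic variables
y = 2.0 + 0.1 * (X @ rng.normal(size=6)) + 0.05 * np.sin(X[:, 0])  # 5 m start times (s)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)
for model in (LinearRegression(),
              MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)):
    model.fit(X_tr, y_tr)
    mape = mean_absolute_percentage_error(y_va, model.predict(X_va))
    print(type(model).__name__, f"validation MAPE = {100 * mape:.2f}%")
```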
Riedl, Janet; Esslinger, Susanne; Fauhl-Hassek, Carsten
2015-07-23
Food fingerprinting approaches are expected to become a very potent tool in authentication processes aiming at a comprehensive characterization of complex food matrices. By non-targeted spectrometric or spectroscopic chemical analysis with a subsequent (multivariate) statistical evaluation of acquired data, food matrices can be investigated in terms of their geographical origin, species variety or possible adulterations. Although many successful research projects have already demonstrated the feasibility of non-targeted fingerprinting approaches, their uptake and implementation into routine analysis and food surveillance is still limited. In many proof-of-principle studies, the prediction ability of only one data set was explored, measured within a limited period of time using one instrument within one laboratory. Thorough validation strategies that guarantee reliability of the respective data basis and that allow conclusion on the applicability of the respective approaches for its fit-for-purpose have not yet been proposed. Within this review, critical steps of the fingerprinting workflow were explored to develop a generic scheme for multivariate model validation. As a result, a proposed scheme for "good practice" shall guide users through validation and reporting of non-targeted fingerprinting results. Furthermore, food fingerprinting studies were selected by a systematic search approach and reviewed with regard to (a) transparency of data processing and (b) validity of study results. Subsequently, the studies were inspected for measures of statistical model validation, analytical method validation and quality assurance measures. In this context, issues and recommendations were found that might be considered as an actual starting point for developing validation standards of non-targeted metabolomics approaches for food authentication in the future. Hence, this review intends to contribute to the harmonization and standardization of food fingerprinting, both required as a prior condition for the authentication of food in routine analysis and official control. Copyright © 2015 Elsevier B.V. All rights reserved.
33 CFR 165.704 - Safety Zone; Tampa Bay, Florida.
Code of Federal Regulations, 2011 CFR
2011-07-01
... safety zone starts at Tampa Bay Cut “F” Channel from Lighted Buoys “3F” and “4F” and proceeds north ending at Gadsden Point Cut Lighted Buoys “3” and “4”. The safety zone starts again at Gadsden Point Cut Lighted Buoys “7” and “8” and proceeds north through Hillsborough Cut “C”, Port Sutton Entrance Channel...
[Disputes and history of fetal heart monitoring].
Dueñas-García, Omar Felipe; Díaz-Sotomayor, Maricela
2011-01-01
The concept of fetal heart monitoring to assess fetal wellbeing has been employed for almost 300 years, but in the last 50 years the field has undergone drastic changes with the incorporation of electronic devices, which have generated controversy from the moment of their introduction. The purpose of this article is to review the key points and controversial moments in the history of cardiotocography.
ERIC Educational Resources Information Center
Wellard, Ian
2014-01-01
This paper provides a response to questions which emerged when reading Gilbourne et al's paper, questions it is suggested which compel us to go back to the very heart of what critical social science is (or can be) about. Central to this debate is the extent to which a perceived starting point in any investigation has implications upon the…
Breakout Reconnection Observed by the TESIS EUV Telescope
NASA Astrophysics Data System (ADS)
Reva, A. A.; Ulyanov, A. S.; Shestov, S. V.; Kuzin, S. V.
2016-01-01
We present experimental evidence of coronal mass ejection (CME) breakout reconnection, observed by the TESIS EUV telescope. The telescope could observe the solar corona up to 2 R⊙ from the Sun's center in the Fe 171 Å line. Starting from 2009 April 8, TESIS observed an active region (AR) that had a quadrupolar structure with an X-point 0.5 R⊙ above the photosphere. A magnetic field reconstructed from the Michelson Doppler Imager data also has a multipolar structure with an X-point above the AR. At 21:45 UT on April 9, the loops near the X-point started to move away from each other with a velocity of ≈7 km s-1. At 01:15 UT on April 10, a bright stripe appeared between the loops, and the flux in the GOES 0.5-4 Å channel increased. We interpret the loops' sideways motion and the bright stripe as evidence of breakout reconnection. At 01:45 UT, the loops below the X-point started to slowly move up. At 15:10 UT, the CME started to accelerate impulsively, while at the same time a flare arcade formed below the CME. After 15:50 UT, the CME moved with constant velocity. The CME evolution precisely followed the breakout model scenario.
PowerPoint Workshop for Teachers[TM].
ERIC Educational Resources Information Center
Caughlin, Janet
This guide for teachers to the Microsoft PowerPoint multimedia presentation program begins with a section that introduces what PowerPoint is and why teachers should use it, Windows 95/98 basics, Macintosh basics, getting started, PowerPoint toolbars, and presentation tips. The next section discusses learning PowerPoint, including creating a…
Development and validation of a prognostic index for 4-year mortality in older adults.
Lee, Sei J; Lindquist, Karla; Segal, Mark R; Covinsky, Kenneth E
2006-02-15
Both comorbid conditions and functional measures predict mortality in older adults, but few prognostic indexes combine both classes of predictors. Combining easily obtained measures into an accurate predictive model could be useful to clinicians advising patients, as well as to policy makers and epidemiologists interested in risk adjustment. To develop and validate a prognostic index for 4-year mortality using information that can be obtained from patient report. Using the 1998 wave of the Health and Retirement Study (HRS), a population-based study of community-dwelling US adults older than 50 years, we developed the prognostic index from 11,701 individuals and validated the index with 8,009. Individuals were asked about their demographic characteristics, whether they had specific diseases, and whether they had difficulty with a series of functional measures. We identified variables independently associated with mortality and weighted the variables to create a risk index. The outcome was death by December 31, 2002. The overall response rate was 81%. During the 4-year follow-up, there were 1,361 deaths (12%) in the development cohort and 1,072 deaths (13%) in the validation cohort. Twelve independent predictors of mortality were identified: 2 demographic variables (age: 60-64 years, 1 point; 65-69 years, 2 points; 70-74 years, 3 points; 75-79 years, 4 points; 80-84 years, 5 points; ≥85 years, 7 points; and male sex, 2 points), 6 comorbid conditions (diabetes, 1 point; cancer, 2 points; lung disease, 2 points; heart failure, 2 points; current tobacco use, 2 points; and body mass index <25, 1 point), and difficulty with 4 functional variables (bathing, 2 points; walking several blocks, 2 points; managing money, 2 points; and pushing large objects, 1 point). Scores on the risk index were strongly associated with 4-year mortality in the validation cohort, with 0 to 5 points predicting a less than 4% risk, 6 to 9 points predicting a 15% risk, 10 to 13 points predicting a 42% risk, and 14 or more points predicting a 64% risk. The risk index showed excellent discrimination, with a c-statistic of 0.84 in the development cohort and 0.82 in the validation cohort. This prognostic index, incorporating age, sex, self-reported comorbid conditions, and functional measures, accurately stratifies community-dwelling older adults into groups at varying risk of mortality.
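The published point weights translate directly into a scoring function. The sketch below transcribes the weights and risk bands from the abstract; the age-band edges (reading the top band as 85 years or older, consistent with the 80-84 band) follow the corrected text above.

```python
# Point weights and risk bands transcribed from the abstract.

def lee_index(age, male, diabetes, cancer, lung_disease, heart_failure,
              smokes, bmi_lt_25, diff_bathing, diff_walking,
              diff_money, diff_pushing):
    points = 0
    # Age bands: first matching cutoff wins; under 60 scores 0.
    for cutoff, pts in [(85, 7), (80, 5), (75, 4), (70, 3), (65, 2), (60, 1)]:
        if age >= cutoff:
            points += pts
            break
    points += 2 * male
    points += diabetes + 2 * (cancer + lung_disease + heart_failure + smokes) + bmi_lt_25
    points += 2 * (diff_bathing + diff_walking + diff_money) + diff_pushing
    return points

def four_year_risk(points):
    if points <= 5:
        return "<4%"
    if points <= 9:
        return "15%"
    if points <= 13:
        return "42%"
    return "64%"

# A 78-year-old man with diabetes, BMI < 25, and difficulty walking: 10 points.
print(four_year_risk(lee_index(78, True, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0)))  # 42%
```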
A new design approach to innovative spectrometers. Case study: TROPOLITE
NASA Astrophysics Data System (ADS)
Volatier, Jean-Baptiste; Baümer, Stefan; Kruizinga, Bob; Vink, Rob
2014-05-01
Designing a novel optical system is a nested iterative process. The optimization loop, from a starting point to the final system, is already mostly automated; however, this loop is part of a wider loop which is not. This wider loop starts with an optical specification and ends with a manufacturability assessment. When designing a new spectrometer with emphasis on weight and cost, numerous iterations between the optical and mechanical designers are inevitable. The optical designer must then be able to reliably produce optical designs based on new input gained from multidisciplinary studies. This paper presents a procedure that can automatically generate new starting points based on any kind of input or new constraint that might arise. These starting points can then be handed over to a generic optimization routine, making the design tasks extremely efficient. The optical designer's job is then not to design optical systems, but to meta-design a procedure that produces optical systems, paving the way for system-level optimization. We present this procedure and its application to the design of TROPOLITE, a lightweight push-broom imaging spectrometer.
Oral desensitization to milk: how to choose the starting dose!
Mori, Francesca; Pucci, Neri; Rossi, Maria Elisabetta; de Martino, Maurizio; Azzari, Chiara; Novembre, Elio
2010-01-01
A renewed interest in oral desensitization as a treatment for food allergy has been observed in the last few years. We studied a novel method based on the end-point skin prick test procedure to establish the starting dose for oral desensitization in a group of 30 children highly allergic to milk. The results (in terms of reactions to the first dose administered) were compared with those of a control group of 20 children also allergic to milk. The control group started by swallowing the same dose of 0.015 mg/ml of milk. None of the children reacted to the first dose when it was administered according to the end-point skin prick test. By contrast, ten of the 20 children (50%) in the control group showed mild allergic reactions to the first dose of milk. In conclusion, the end-point skin prick test procedure proved safe and easy to perform in each individual child in order to determine the starting dose for oral desensitization to milk, also taking into account individual variability. © 2009 John Wiley & Sons A/S. PMID:19624618
UpStart Parent Survey-Prenatal: A New Tool for Evaluating Prenatal Education Programs.
Benzies, Karen M; Barker, Leslie; Churchill, Jocelyn; Smith, Jennifer; Horn, Sarah
2016-09-01
To evaluate a new prenatal education program evaluation tool, the UpStart Parent Survey - Prenatal, in terms of: (a) reliability and validity; (b) sensitivity to change over time; (c) whether results differed for mothers versus fathers; and (d) whether results differed when using an electronic tablet-computer versus a paper survey. Psychometric study. Participants were 277 expectant mothers (n = 161) and fathers (n = 106) enrolled in Childbirth Essentials, a 6-week prenatal education program. The UpStart Parent Survey - Prenatal is a retrospective pretest/posttest survey with three scales: Parenting Knowledge, Parenting Experience, and Program Satisfaction, and three open-ended questions. The UpStart Parent Survey - Prenatal is sensitive to change and demonstrated significant positive differences in parenting knowledge and parenting experience. There was no difference in results whether the survey was completed by mothers or fathers. Results were similar whether paper or electronic formats were used. The survey was easy to complete. The UpStart Parent Survey - Prenatal holds promise as a reliable and valid evaluation tool to capture outcomes of brief prenatal education programs that target the general population of expectant parents. © 2016 Wiley Periodicals, Inc.
Pettersson, David; Bottai, Matteo; Mathiesen, Tiit; Prochazka, Michaela; Feychting, Maria
2015-01-01
The possible effect of radiofrequency exposure from mobile phones on tumor risk has been studied since the late 1990s. Yet, empirical information about recall of the start of mobile phone use among adult cases and controls has never been reported. Limited knowledge about recall errors hampers interpretations of the epidemiological evidence. We used network operator data to validate the self-reported start year of mobile phone use in a case-control study of mobile phone use and acoustic neuroma risk. The answers of 96 (29%) cases and 111 (22%) controls could be included in the validation. The larger proportion of cases reflects a more complete and detailed reporting of subscription history. Misclassification was substantial, with large random errors, small systematic errors, and no significant differences between cases and controls. The average difference between self-reported and operator start year was -0.62 (95% confidence interval: -1.42, 0.17) years for cases and -0.71 (-1.50, 0.07) years for controls, standard deviations were 3.92 and 4.17 years, respectively. Agreement between self-reported and operator-recorded data categorized into short, intermediate and long-term use was moderate (kappa statistic: 0.42). Should an association exist, dilution of risk estimates and distortion of exposure-response patterns for time since first mobile phone use could result from the large random errors in self-reported start year. Retrospective collection of operator data likely leads to a selection of "good reporters", with a higher proportion of cases. Thus, differential recall cannot be entirely excluded.
40 CFR 86.535-90 - Dynamometer procedure.
Code of Federal Regulations, 2010 CFR
2010-07-01
... run consists of two tests, a “cold” start test and a “hot” start test following the “cold” start by 10... Administrator. (d) Practice runs over the prescribed driving schedule may be performed at test points, provided... the proper speed-time relationship, or to permit sampling system adjustments. (e) The drive wheel...
Molecular Dynamic Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2010-11-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable numbers of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principles electrostatic simulations, possibly including magnetic fields. We are using the particle-particle particle-mesh (P^3M) code ddcMD to perform these simulations. As a starting point in our study, we examined the wake of a particle passing through a plasma. In this poster, we compare the wake observed in 3D ddcMD simulations with that predicted by Vlasov theory and with those observed in the electrostatic PIC code BEPS where the cell size was reduced to 0.03 λD.
Detection of degenerative change in lateral projection cervical spine x-ray images
NASA Astrophysics Data System (ADS)
Jebri, Beyrem; Phillips, Michael; Knapp, Karen; Appelboam, Andy; Reuben, Adam; Slabaugh, Greg
2015-03-01
Degenerative changes to the cervical spine can be accompanied by neck pain, which can result from narrowing of the intervertebral disc space and growth of osteophytes. In a lateral x-ray image of the cervical spine, degenerative changes are characterized by vertebral bodies that have indistinct boundaries and limited spacing between vertebrae. In this paper, we present a machine learning approach to detect and localize degenerative changes in lateral x-ray images of the cervical spine. Starting from a user-supplied set of points in the center of each vertebral body, we fit a central spline, from which a region of interest is extracted and image features are computed. A Random Forest classifier labels regions as degenerative change or normal. Leave-one-out cross-validation studies performed on a dataset of 103 patients demonstrate accuracy above 95%.
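The classification and validation scheme can be sketched in a few lines with scikit-learn; the feature matrix below is synthetic stand-in data, not the paper's image features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(103, 20))   # stand-in region-of-interest features, one row per patient
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=103) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())  # one held-out patient per fold
print(f"LOOCV accuracy: {scores.mean():.3f}")
```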
Frau, Juan; Glossman-Mitnik, Daniel
2017-01-01
Amino acids and peptides have the potential to perform as corrosion inhibitors. The chemical reactivity descriptors that arise from Conceptual DFT have been calculated for the twenty natural amino acids using the latest Minnesota family of density functionals. In order to verify the validity of calculating the descriptors directly from the HOMO and LUMO, a comparison has been performed with those obtained through ΔSCF results. Moreover, the active sites for nucleophilic and electrophilic attacks have been identified through Fukui function indices, the dual descriptor Δf(r), and the electrophilic and nucleophilic Parr functions. The results could be of interest as a starting point for the study of large peptides, where the calculation of the radical cation and anion of each system may be computationally harder and more costly.
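A common route to the global descriptors directly from the frontier orbitals is the frozen-orbital (Koopmans-like) approximation. The sketch below uses the standard textbook formulas for electronegativity, hardness and electrophilicity; the paper may employ refinements beyond these, and the example energies are invented.

```python
def global_descriptors(e_homo, e_lumo):
    """Frozen-orbital (Koopmans-like) estimates of global Conceptual DFT
    descriptors from frontier orbital energies (all values in eV)."""
    ionization = -e_homo                      # I ~ -E_HOMO
    affinity = -e_lumo                        # A ~ -E_LUMO
    chi = (ionization + affinity) / 2         # electronegativity
    eta = ionization - affinity               # chemical hardness
    omega = chi**2 / (2 * eta)                # electrophilicity index
    return {"chi": chi, "eta": eta, "omega": omega}

# Example with plausible frontier energies for a small molecule.
print(global_descriptors(e_homo=-6.8, e_lumo=-1.2))
```

The ΔSCF comparison mentioned above replaces -E_HOMO and -E_LUMO with explicit total-energy differences of the neutral, cation, and anion, which is exactly the step that becomes expensive for large peptides.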
A bootstrap based Neyman-Pearson test for identifying variable importance.
Ditzler, Gregory; Polikar, Robi; Rosen, Gail
2015-04-01
Selection of the most informative features, leading to a small loss on future data, is arguably one of the most important steps in classification, data analysis and model selection. Several feature selection (FS) algorithms are available; however, due to the noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining whether a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
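The flavor of such a test, fixing a false-alarm rate α in Neyman-Pearson style and asking whether a feature's inclusion measurably reduces the loss, can be imitated with a paired bootstrap. The sketch below is a loose stand-in, not the brief's exact construction, and all names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def bootstrap_relevance(X, y, feature, n_boot=100, alpha=0.05, seed=0):
    """Loose sketch of a bootstrap relevance test: resample the data,
    compare cross-validated accuracy with and without `feature`, and
    declare the feature relevant if including it wins in at least
    (1 - alpha) of the resamples."""
    rng = np.random.default_rng(seed)
    n = len(y)
    others = [j for j in range(X.shape[1]) if j != feature]
    wins = 0
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)               # bootstrap resample
        Xb, yb = X[idx], y[idx]
        with_f = cross_val_score(LogisticRegression(max_iter=1000), Xb, yb, cv=3).mean()
        without = cross_val_score(LogisticRegression(max_iter=1000), Xb[:, others], yb, cv=3).mean()
        wins += with_f > without
    return wins / n_boot >= 1 - alpha

# Toy check: feature 0 drives the labels, feature 3 is pure noise.
X = np.random.default_rng(1).normal(size=(120, 5))
y = (X[:, 0] > 0).astype(int)
print(bootstrap_relevance(X, y, feature=0), bootstrap_relevance(X, y, feature=3))
```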
Harraghy, Niamh; Homerova, Dagmar; Herrmann, Mathias; Kormanec, Jan
2008-01-01
Mapping the transcription start points of the eap, emp, and vwb promoters revealed a conserved octanucleotide sequence (COS). Deleting this sequence abolished the expression of eap, emp, and vwb. However, electrophoretic mobility shift assays gave no evidence that this sequence was a binding site for SarA or SaeR, known regulators of eap and emp.
Investigations of magnesium, histamine and immunoglobulins dynamics in acute urticaria.
Mureşan, D; Oană, A; Nicolae, I; Alecu, M; Moşescu, L; Benea, V; Flueraş, M
1990-01-01
In 42 urticaria patients, magnesium, histamine and IgE were measured. Variations in magnesium, IgE and histamine were followed over the course of the disease, during the acute phase and at clinical remission, and depended on disease evolution and the applied therapeutic scheme. At disease onset, histamine values were 3.5 times higher than normal; they then decreased along a curve tending towards normal values during clinical remission. At disease onset, magnesium values were below the lower limit of normal, 0.5 mmol/L on average, and increased towards the normal limit during clinical remission. IgE followed a curve similar to that of histamine, with values of 1,250 U/L at onset that decreased under medication to within normal limits (800 U/L) during clinical remission. On the basis of these variations in biochemical parameters, the authors emphasize magnesium substitution treatment in urticaria.
Quantitation of monomers in poly(glycerol-co-diacid) gels using gas chromatography
USDA-ARS?s Scientific Manuscript database
The validation of a gas chromatography (GC) method developed to quantify amounts of starting material from the synthesis of hyperbranched polymers made from glycerol and either succinic acid, glutaric acid, or azelaic acid is described. The GC response to concentration was linear for all starting r...
Assessing Mastery Motivation in a Head Start Sample.
ERIC Educational Resources Information Center
MacPhee, David; Fritz, Janet J.; Miller-Heyl, Jan; Hite, Judy
Although mastery motivation appears to predict school success, individual assessment of mastery motivation is too time consuming and limits the application of this research. This study examined the psychometric properties of the Dimensions of Mastery Questionnaire (DMQ). The study focused on the validity of the measure for Head Start parents,…
Standardized set-point acupuncture for migraines.
Plank, Sharon; Goodard, Janet Lee; Pasierb, Lisa; Simunich, Thomas Jason; Croner, Jeanette Renee
2013-01-01
Migraine headaches are common, debilitating, underdiagnosed, and undertreated, and medications are not always effective. Research has shown that acupuncture may be an effective and safe adjuvant or alternative migraine treatment. The purpose of the current study was to evaluate whether a standardized set of acupuncture points, used to deliver treatment over a predefined period of time, could reduce the frequency and intensity of migraines. This is a prospective interventional study using set-point acupuncture for migraines. The study took place at Conemaugh Memorial Medical Center in Johnstown, PA, USA. Participants were 59 individuals with a diagnosis of migraine. Acupuncture was administered twice weekly for 4 weeks, followed by once weekly for 4 more weeks, using one set of acupoints. Participants kept daily headache diaries and migraine quality-of-life measurements on a personal digital assistant for the 12 weeks before starting the acupuncture intervention, and continued to record the frequency and intensity of their migraines during the intervention and for an additional 12 weeks beyond it. The Migraine Disability Assessment (MIDAS), Headache Impact Test (HIT-6), and Beck Depression Inventory (BDI-II) were completed four times during the study: 12 weeks prior to the start of the intervention, immediately prior to the first acupuncture treatment, at the end of treatment, and 12 weeks after the end of treatment. When preintervention measurements were compared to postintervention measurements, migraine frequency and pain intensity showed a significant decrease (α = 0.05) after the acupuncture intervention, and results had not returned to the preintervention baseline even 12 weeks after the last acupuncture session. These results indicate that acupuncture decreased both the frequency and intensity of migraines and that the benefit had not subsided 12 weeks after the final acupuncture session. Validated survey measurements of migraine impact on quality of life also showed statistically significant improvement over baseline.
Fu, Sau Nga; Chin, Weng Yee; Wong, Carlos King Ho; Yeung, Vincent Tok Fai; Yiu, Ming Pong; Tsui, Hoi Yee; Chan, Ka Hung
2013-01-01
To develop and evaluate the psychometric properties of a Chinese questionnaire which assesses the barriers and enablers to commencing insulin in primary care patients with poorly controlled Type 2 diabetes. Questionnaire items were identified using literature review. Content validation was performed and items were further refined using an expert panel. Following translation, back translation and cognitive debriefing, the translated Chinese questionnaire was piloted on target patients. Exploratory factor analysis and item-scale correlations were performed to test the construct validity of the subscales and items. Internal reliability was tested by Cronbach's alpha. Twenty-seven identified items underwent content validation, translation and cognitive debriefing. The translated questionnaire was piloted on 303 insulin naïve (never taken insulin) Type 2 diabetes patients recruited from 10 government-funded primary care clinics across Hong Kong. Sufficient variability in the dataset for factor analysis was confirmed by Bartlett's Test of Sphericity (P<0.001). Using exploratory factor analysis with varimax rotation, 10 factors were generated onto which 26 items loaded with loading scores > 0.4 and Eigenvalues >1. Total variance for the 10 factors was 66.22%. Kaiser-Meyer-Olkin measure was 0.725. Cronbach's alpha coefficients for the first four factors were ≥0.6 identifying four sub-scales to which 13 items correlated. Remaining sub-scales and items with poor internal reliability were deleted. The final 13-item instrument had a four scale structure addressing: 'Self-image and stigmatization'; 'Factors promoting self-efficacy; 'Fear of pain or needles'; and 'Time and family support'. The Chinese Attitudes to Starting Insulin Questionnaire (Ch-ASIQ) appears to be a reliable and valid measure for assessing barriers to starting insulin. This short instrument is easy to administer and may be used by healthcare providers and researchers as an assessment tool for Chinese diabetic primary care patients, including the elderly, who are unwilling to start insulin.
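Internal reliability of sub-scales of the kind reported here is commonly computed as Cronbach's alpha. A minimal implementation on toy Likert data, illustrative only and unrelated to the Ch-ASIQ item set, might look as follows.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy 5-point Likert responses for a 4-item sub-scale driven by one latent trait.
rng = np.random.default_rng(0)
latent = rng.normal(size=100)
scores = np.clip(np.round(3 + latent[:, None] + rng.normal(scale=0.8, size=(100, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```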
Modelling of individual subject ozone exposure response kinetics.
Schelegle, Edward S; Adams, William C; Walby, William F; Marion, M Susan
2012-06-01
A better understanding of individual subject ozone (O3) exposure response kinetics will provide insight into how to improve models used in the risk assessment of ambient ozone exposure. To develop a simple two-compartment exposure-response model that describes individual subject decrements in forced expiratory volume in one second (FEV1) induced by the acute inhalation of O3 lasting up to 8 h. FEV1 measurements of 220 subjects who participated in 14 previously completed studies were fit to the model using both particle swarm and nonlinear least-squares optimization techniques to identify three subject-specific coefficients producing minimum "global" and local errors, respectively. Observed and predicted decrements in FEV1 of the 220 subjects were used for validation of the model. Further validation was provided by comparing the observed O3-induced FEV1 decrements in an additional eight studies with predicted values obtained using model coefficients estimated from the 220 subjects used in cross-validation. Overall, the individual subject measured and modeled FEV1 decrements were highly correlated (mean R² of 0.69 ± 0.24). In addition, it was shown that a matrix of individual subject model coefficients can be used to predict the mean and variance of group decrements in FEV1. This modeling approach provides insight into individual subject O3 exposure response kinetics and provides a potential starting point for improving the risk assessment of environmental O3 exposure.
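The fitting strategy can be illustrated with a generic two-compartment model of the same shape: an accumulating "dose" compartment with first-order removal driving the FEV1 decrement once it exceeds a subject-specific threshold. The model form and all coefficients below are illustrative assumptions, not the authors' published equations.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def model(params, t, uptake):
    """Illustrative two-compartment response: a dose compartment x is filled
    at rate uptake(t) and emptied with rate constant k; the FEV1 decrement
    grows only once x exceeds a subject-specific threshold."""
    a, k, thr = params
    x = odeint(lambda x, t: uptake(t) - k * x, 0.0, t).ravel()
    return a * np.maximum(0.0, x - thr)

t = np.linspace(0, 8, 40)                 # hours of exposure
uptake = lambda time: 1.0                 # constant inhaled dose rate (toy)
true = model([8.0, 0.6, 0.7], t, uptake)
observed = true + np.random.default_rng(0).normal(scale=0.4, size=t.size)

# Estimate the three subject-specific coefficients by nonlinear least squares.
fit = least_squares(lambda p: model(p, t, uptake) - observed,
                    x0=[5.0, 0.3, 0.5], bounds=([0, 0, 0], [50, 5, 5]))
print("a, k, threshold =", np.round(fit.x, 2))
```

The particle swarm step mentioned in the abstract plays the role of the global search that supplies a good starting point for this local refinement.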
Impact of External Cue Validity on Driving Performance in Parkinson's Disease
Scally, Karen; Charlton, Judith L.; Iansek, Robert; Bradshaw, John L.; Moss, Simon; Georgiou-Karistianis, Nellie
2011-01-01
This study sought to investigate the impact of external cue validity on simulated driving performance in 19 Parkinson's disease (PD) patients and 19 healthy age-matched controls. Braking points and the distance between deceleration point and braking point were analysed for red traffic signals preceded by Valid Cues (correctly predicting the signal), Invalid Cues (incorrectly predicting the signal), or No Cues. Results showed that PD drivers braked significantly later and travelled significantly further between deceleration and braking points compared with controls in the Invalid and No-Cue conditions. No significant group differences were observed for driving performance in response to Valid Cues. The benefit of Valid Cues relative to Invalid Cues and No Cues was significantly greater for PD drivers compared with controls. Trail Making Test (B-A) scores correlated with driving performance for PD drivers only. These results highlight the importance of external cues and higher cognitive functioning for driving performance in mild to moderate PD. PMID:21789275
Ma, Hon Ming; Ip, Margaret; Woo, Jean; Hui, David S C
2014-05-01
Health care-associated pneumonia (HCAP) and drug-resistant bacterial pneumonia may not share identical risk factors. We have shown that bronchiectasis, recent hospitalization and severe pneumonia (CURB-65 score ≥ 3, based on confusion, blood urea level, respiratory rate, low blood pressure and age ≥ 65 years) were independent predictors of pneumonia caused by potentially drug-resistant (PDR) pathogens. This study aimed to develop and validate a clinical risk score for predicting drug-resistant bacterial pneumonia in older patients. We derived a risk score by assigning a weighting to each of these risk factors as follows: 14 points for bronchiectasis; 5 for recent hospitalization; 2 for severe pneumonia. Half a point was assigned for the presence of other risk factors for HCAP. We compared the areas under the receiver-operating characteristic curve (AUROC) of our risk score and the HCAP definition in predicting PDR pathogens in two cohorts of older patients hospitalized with non-nosocomial pneumonia. The derivation and validation cohorts consisted of 354 and 96 patients with bacterial pneumonia, respectively. PDR pathogens were isolated in 48 and 21 patients in the derivation and validation cohorts, respectively. The AUROCs of our risk score and the HCAP definition were 0.751 and 0.650, respectively, in the derivation cohort, and were 0.782 and 0.671, respectively, in the validation cohort. The differences between our risk score and the HCAP definition reached statistical significance. A score ≥ 2.5 had the best balance between sensitivity and specificity. Our risk score outperformed the HCAP definition in predicting pneumonia caused by PDR pathogens. A history of bronchiectasis or recent hospitalization is the major indication for starting empirical broad-spectrum antibiotics. © 2014 Asian Pacific Society of Respirology.
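The score itself is a weighted sum, and its discrimination is summarized by the AUROC. The sketch below transcribes the published weights (reading the half point as applying per additional HCAP factor, which is an interpretation of the abstract) and evaluates a purely fabricated toy cohort with scikit-learn.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def pdr_score(bronchiectasis, recent_hosp, severe, n_other_hcap_factors=0):
    """Weights transcribed from the abstract: 14 for bronchiectasis, 5 for
    recent hospitalization, 2 for severe pneumonia (CURB-65 >= 3), plus
    0.5 per additional HCAP risk factor (an interpretive assumption)."""
    return 14 * bronchiectasis + 5 * recent_hosp + 2 * severe + 0.5 * n_other_hcap_factors

# Toy validation cohort: score each patient and check discrimination.
rng = np.random.default_rng(0)
has_pdr = rng.random(96) < 0.22              # fabricated outcome prevalence
scores = []
for p in has_pdr:
    b = rng.random() < (0.4 if p else 0.05)  # bronchiectasis, enriched in PDR cases
    h = rng.random() < (0.5 if p else 0.2)   # recent hospitalization
    s = rng.random() < 0.5                   # severe pneumonia
    scores.append(pdr_score(b, h, s))
print(f"toy AUROC = {roc_auc_score(has_pdr, scores):.2f}")
```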
Balluerka, Nekane; Gorostiaga, Arantxa; Ulacia, Imanol
2014-11-14
Personal initiative characterizes people who are proactive, persistent and self-starting when facing the difficulties that arise in achieving goals. Despite its importance in the educational field, there is a scarcity of measures to assess students' personal initiative. Thus, the aim of the present study was to develop a questionnaire to assess this variable in the academic environment and to validate it for adolescents and young adults. The sample comprised 244 vocational training students. The questionnaire showed a factor structure including three factors (Proactivity-Prosocial behavior, Persistence and Self-Starting) with acceptable indices of internal consistency (ranging between α = .57 and α = .73) and good convergent validity with respect to the Self-Reported Initiative scale. Evidence of external validity was also obtained based on the relationships between personal initiative and variables such as self-efficacy, enterprising attitude, responsibility and control aspirations, conscientiousness, and academic achievement. The results indicate that this new measure is very useful for assessing personal initiative among vocational training students.
2016-06-10
viewed as the panacea for all military problems. Politicians view SOF as a low risk minimalist investment that produces results; even for problems...of published work that has been dedicated to discussing special operations theory as an element of military strategy. A good starting point to...utility. Doctrine As a starting point for framing understanding of special operations, Joint Publication (JP) 3-05 Special Operations, provides the basis
[A set of quality and safety indicators for hospitals of the "Agencia Valenciana de Salud"].
Nebot-Marzal, C M; Mira-Solves, J J; Guilabert-Mora, M; Pérez-Jover, V; Pablo-Comeche, D; Quirós-Morató, T; Cuesta Peredo, D
2014-01-01
To prepare a set of quality and safety indicators for hospitals of the «Agencia Valenciana de Salud». The qualitative technique Metaplan® was applied in order to gather proposals on sustainability and nursing. The catalogue of the «Spanish Society of Quality in Healthcare» was adopted as a starting point for clinical indicators. Using the Delphi technique, 207 professionals were invited to participate in selecting the most reliable and feasible indicators. Lastly, the resulting proposal was validated with the managers of 12 hospitals, taking into account the variability, objectivity, feasibility, reliability and sensitivity of the indicators. Participation rates varied between 66.67% and 80.71%. Of the 159 initial indicators, 68 were prioritized and selected (21 economic or management indicators, 22 nursing indicators, and 25 clinical or hospital indicators). Three of them were common to all three categories and two did not meet the specified criteria during the validation phase, giving a final catalogue of 63 indicators. A set of quality and safety indicators for hospitals was prepared; they are currently being monitored using the hospital information systems. Copyright © 2013 SECA. Published by Elsevier España. All rights reserved.
NASA Astrophysics Data System (ADS)
Freeman, A. J.; Yu, Jaejun
1990-04-01
For years, there has been controversy over whether the normal state of the Cu-oxide superconductors is a Fermi liquid or some other exotic ground state. However, experimentalists are clarifying the nature of the normal state of the high-Tc superconductors by surmounting the experimental difficulties in producing clean, well characterized surfaces so as to obtain meaningful, highly resolved photoemission data, which agree with earlier positron-annihilation experiments. The experimental work on high-resolution angle-resolved photoemission by Campuzano et al. and the positron-annihilation studies by Smedskjaer et al. have verified the calculated Fermi surfaces in YBa2Cu3O7 superconductors and have provided evidence for the validity of the energy band approach. Similarly good agreement was found for Bi2Sr2CaCu2O8 by Olson et al. As the Fermi liquid (metallic) nature of the normal state of the high-Tc superconductors becomes evident, these experimental observations have served to confirm the predictions of local density functional calculations, and hence the energy band approach as a valid natural starting point for further studies of their superconductivity.
Creating experimental color harmony map
NASA Astrophysics Data System (ADS)
Chamaret, Christel; Urban, Fabrice; Lepinel, Josselin
2014-02-01
Starting in the 17th century with Newton, color harmony is a topic that has not yet reached a consensus on definition, representation or modeling. Previous work highlighted specific characteristics of color harmony for combinations of color doublets or triplets by means of human ratings on a harmony scale. However, there has been no investigation involving complex stimuli or pointing out how harmony is spatially located within a picture. A model of this concept, as well as a reliable ground truth, would be of high value to the community, since the applications are wide and concern several fields, from psychology to computer graphics. We propose a protocol for creating color harmony maps from a controlled experiment. Through an eye-tracking protocol, we focus on the identification of disharmonious colors in pictures. The experiment comprised a free-viewing pass, to let the observer become familiar with the content, followed by a second pass in which we asked observers "to search for the most disharmonious areas in the picture". Twenty-seven observers participated in the experiment, which comprised a total of 30 different stimuli. The high inter-observer agreement as well as a cross-validation confirm the validity of the proposed ground truth.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
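A model of the kind the specification describes can be built and validated programmatically, for instance with the python-libsbml bindings (assumed installed; exact required attributes follow the Level 3 rules). A minimal sketch:

```python
import libsbml  # python-libsbml bindings, assumed installed

# Build a minimal Level 3 Version 2 document: one compartment, one species.
doc = libsbml.SBMLDocument(3, 2)
model = doc.createModel()
model.setId("minimal_example")

comp = model.createCompartment()
comp.setId("cell")
comp.setConstant(True)

sp = model.createSpecies()
sp.setId("glucose")
sp.setCompartment("cell")
sp.setConstant(False)
sp.setBoundaryCondition(False)
sp.setHasOnlySubstanceUnits(False)

# Apply the specification's validation rules to the in-memory document.
n_problems = doc.checkConsistency()
print(f"{n_problems} consistency problems")
print(libsbml.writeSBMLToString(doc)[:200])  # first lines of the serialized XML
```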
On the behavior of the leading eigenvalue of Eigen's evolutionary matrices.
Semenov, Yuri S; Bratus, Alexander S; Novozhilov, Artem S
2014-12-01
We study general properties of the leading eigenvalue w̄(q) of Eigen's evolutionary matrices depending on the replication fidelity q. This is a linear algebra problem that has various applications in theoretical biology, including such diverse fields as the origin of life, evolution of cancer progression, and virus evolution. We present the exact expressions for w̄(q), w̄′(q), w̄″(q) for q = 0, 0.5, 1 and prove that the absolute minimum of w̄(q), which always exists, belongs to the interval (0, 0.5]. For the specific case of a single-peaked landscape we also find lower and upper bounds on w̄(q), which are used to estimate the critical mutation rate, after which the distribution of the types of individuals in the population becomes almost uniform. This estimate is used as a starting point to conjecture another estimate, valid for any fitness landscape, which is checked by numerical calculations. The last estimate stresses the fact that the inverse dependence of the critical mutation rate on the sequence length is not a generally valid fact. Copyright © 2014 Elsevier Inc. All rights reserved.
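For small sequence lengths the object of study can be computed directly: build the mutation matrix over binary sequences, place a single fitness peak, and take the leading eigenvalue of the Eigen matrix. A numpy sketch, assuming per-site copy fidelity q and an illustrative peak height:

```python
import numpy as np
from itertools import product

def leading_eigenvalue(L, q, peak_fitness=10.0):
    """Leading eigenvalue of the Eigen matrix Q * diag(f) for binary
    sequences of length L on a single-peaked landscape (peak at 0...0)."""
    genotypes = list(product([0, 1], repeat=L))
    d = np.array([[sum(a != b for a, b in zip(g, h)) for h in genotypes]
                  for g in genotypes])          # Hamming distances
    Q = q ** (L - d) * (1 - q) ** d             # per-site copy fidelity q
    f = np.ones(len(genotypes))
    f[0] = peak_fitness                          # single fitness peak
    return np.linalg.eigvals(Q * f).real.max()   # Perron root is real

for q in (0.0, 0.5, 0.75, 0.9, 1.0):
    print(f"q = {q:.2f}  leading eigenvalue = {leading_eigenvalue(4, q):.3f}")
```

Scanning q in this way exhibits the interior minimum described above: the eigenvalue equals the peak fitness at q = 1, drops as fidelity falls, and rises again toward q = 0.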
Detecting Brain State Changes via Fiber-Centered Functional Connectivity Analysis
Li, Xiang; Lim, Chulwoo; Li, Kaiming; Guo, Lei; Liu, Tianming
2013-01-01
Diffusion tensor imaging (DTI) and functional magnetic resonance imaging (fMRI) have been widely used to study structural and functional brain connectivity in recent years. A common assumption used in many previous functional brain connectivity studies is the temporal stationarity. However, accumulating literature evidence has suggested that functional brain connectivity is under temporal dynamic changes in different time scales. In this paper, a novel and intuitive approach is proposed to model and detect dynamic changes of functional brain states based on multimodal fMRI/DTI data. The basic idea is that functional connectivity patterns of all fiber-connected cortical voxels are concatenated into a descriptive functional feature vector to represent the brain’s state, and the temporal change points of brain states are decided by detecting the abrupt changes of the functional vector patterns via the sliding window approach. Our extensive experimental results have shown that meaningful brain state change points can be detected in task-based fMRI/DTI, resting state fMRI/DTI, and natural stimulus fMRI/DTI data sets. Particularly, the detected change points of functional brain states in task-based fMRI corresponded well to the external stimulus paradigm administered to the participating subjects, thus partially validating the proposed brain state change detection approach. The work in this paper provides novel perspective on the dynamic behaviors of functional brain connectivity and offers a starting point for future elucidation of the complex patterns of functional brain interactions and dynamics. PMID:22941508
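The core idea, summarizing each sliding window by a descriptive feature vector and flagging abrupt pattern changes between consecutive windows, can be sketched as follows; the dissimilarity measure and toy data are illustrative choices, not the paper's exact fiber-centered descriptor.

```python
import numpy as np

def change_points(feature_ts, window=20, threshold=0.3):
    """Sketch of sliding-window change detection: summarize each window by
    its mean feature vector and flag times where the correlation between
    consecutive windows drops (dissimilarity 1 - r exceeds `threshold`)."""
    cps = []
    for t in range(window, len(feature_ts) - window):
        a = feature_ts[t - window:t].mean(axis=0)
        b = feature_ts[t:t + window].mean(axis=0)
        if 1 - np.corrcoef(a, b)[0, 1] > threshold:
            cps.append(t)
    return cps

# Toy "functional state" series: the mean pattern shifts abruptly at t = 100.
rng = np.random.default_rng(0)
p1, p2 = rng.normal(size=50), rng.normal(size=50)
ts = np.vstack([p1 + rng.normal(0, 0.5, (100, 50)),
                p2 + rng.normal(0, 0.5, (100, 50))])
print(change_points(ts))   # flags a band of time points around t = 100
```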
NASA Astrophysics Data System (ADS)
Arrighi, Chiara; Campo, Lorenzo
2017-04-01
In recent years, concern about the economic losses and loss of life caused by urban floods has grown hand in hand with the numerical capability to simulate such events. The large amount of computational power needed to address the problem (simulating a flood in complex terrain such as a medium-to-large city) is only one of the issues. Others include the general lack of exhaustive observations during the event (exact extension, dynamics, water levels reached in different parts of the involved area) needed for calibration and validation of the model, the need to consider sewer effects, and the availability of a correct and precise description of the geometry of the problem. In large cities, topographic surveys generally provide a number of measured points, but a complete hydraulic simulation needs a detailed description of the terrain over the whole computational domain. LIDAR surveys can achieve this goal, providing a comprehensive description of the terrain, although they often lack precision. In this work, an optimal merging of these two sources of geometric information, measured elevation points and a LIDAR survey, is proposed, taking into account the error variance of both. The procedure is applied to a flood-prone city over an area of approximately 35 square km, starting from a LIDAR DTM with a spatial resolution of 1 m and 13,000 measured points. The spatial pattern of the error (LIDAR vs. points) is analysed, and the merging method is tested with a series of jackknife procedures that consider different densities of the available points. A discussion of the results is provided.
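One standard way to merge two unbiased elevation estimates by their error variances is inverse-variance weighting; the sketch below illustrates the principle on a single co-located cell, though the authors' exact scheme may differ.

```python
import numpy as np

def merge_elevations(z_lidar, var_lidar, z_points, var_points):
    """Minimum-variance linear merge of two unbiased elevation estimates:
    weights are the inverse error variances. Inputs may be scalars or
    arrays of co-located values."""
    w_l = 1.0 / var_lidar
    w_p = 1.0 / var_points
    z = (w_l * z_lidar + w_p * z_points) / (w_l + w_p)
    var = 1.0 / (w_l + w_p)                 # variance of the merged estimate
    return z, var

# LIDAR cell at 41.30 m (sigma 0.15 m) vs. surveyed point at 41.18 m (sigma 0.03 m):
# the merged value is pulled strongly toward the more precise survey point.
z, var = merge_elevations(41.30, 0.15**2, 41.18, 0.03**2)
print(f"merged z = {z:.3f} m, sigma = {var**0.5:.3f} m")
```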
The Preschool Learning Behaviors Scale: Dimensionality and External Validity in Head Start
ERIC Educational Resources Information Center
McDermott, Paul A.; Rikoon, Samuel H.; Waterman, Clare; Fantuzzo, John W.
2012-01-01
Given the importance of accurately gauging early childhood approaches to learning, this study reports evidence for the dimensionality and utility of the Preschool Learning Behaviors Scale for use with disadvantaged preschool children. Data from a large (N = 1,666) sample representative of urban Head Start classrooms revealed three reliable…
Investigation of the influence of sampling schemes on quantitative dynamic fluorescence imaging
Dai, Yunpeng; Chen, Xueli; Yin, Jipeng; Wang, Guodong; Wang, Bo; Zhan, Yonghua; Nie, Yongzhan; Wu, Kaichun; Liang, Jimin
2018-01-01
Dynamic optical data from a series of sampling intervals can be used for quantitative analysis to obtain meaningful kinetic parameters of a probe in vivo. The sampling scheme may affect the quantification results of dynamic fluorescence imaging. Here, we investigate the influence of different sampling schemes on the quantification of binding potential (BP) with theoretically simulated and experimentally measured data. Three groups of sampling schemes are investigated: the sampling starting point, sampling sparsity, and sampling uniformity. In investigating the influence of the sampling starting point, we further distinguish two cases according to whether the timing sequence between probe injection and the start of sampling is retained or discarded. Results show that the mean value of BP exhibits an obvious growth trend with increasing delay of the sampling starting point and has a strong correlation with sampling sparsity; the growth trend is much more pronounced when the missing timing sequence is discarded. The standard deviation of BP is inversely related to sampling sparsity, and independent of sampling uniformity and of the delay of the sampling starting time. Moreover, the mean value of BP obtained by uniform sampling is significantly higher than that obtained by non-uniform sampling. Our results collectively suggest that a suitable sampling scheme can help compartmental modeling of dynamic fluorescence imaging deliver more accurate results with simpler operation. PMID:29675325
Stijkel, A; van Eijndhoven, J C; Bal, R
1996-12-01
The Dutch procedure for standard setting for occupational exposure to chemicals, just like the European Union (EU) procedure, is characterized by an organizational separation between considerations of health on the one side, and of technology, economics, and policy on the other side. Health considerations form the basis for numerical guidelines. These guidelines are next combined with technical-economical considerations. Standards are then proposed, and are finally set by the Ministry of Social Affairs and Employment. An analysis of this procedure might be of relevance to the US, where other procedures are used and criticized. In this article we focus on the first stage of the standard-setting procedure. In this stage, the Dutch Expert Committee on Occupational Standards (DECOS) drafts a criteria document in which a health-based guideline is proposed. The drafting is based on a set of starting points for assessing toxicity. We raise the questions, "Does DECOS limit itself only to health considerations? And if not, what are the consequences of such a situation?" We discuss DECOS' starting points and analyze the relationships between those starting points, and then explore eight criteria documents where DECOS was considering reproductive risks as a possible critical effect. For various reasons, it will be concluded that the starting points leave much interpretative space, and that this space is widened further by the manner in which DECOS utilizes it. This is especially true in situations involving sex-specific risks and uncertainties in knowledge. Consequently, even at the first stage, where health considerations alone are intended to play a role, there is much room for other than health-related factors to influence decision making, although it is unavoidable that some interpretative space will remain. We argue that separating the various types of consideration should not be abandoned. Rather, through adjustments in the starting points and aspects of the procedure, clarity should be guaranteed about the way the interpretative space is being employed.
Code of Federal Regulations, 2010 CFR
2010-01-01
... means a group of validity screening tests that were made from the same starting material. ... past 24 hours. Adulterated specimen means a urine specimen that has been altered, as evidenced by test... whole specimen. Analytical run means the process of testing a group of urine specimens for validity or...
Advances in the regionalization approach: geostatistical techniques for estimating flood quantiles
NASA Astrophysics Data System (ADS)
Chiarello, Valentina; Caporali, Enrica; Matthies, Hermann G.
2015-04-01
The knowledge of peak flow discharges and associated floods is of primary importance in engineering practice for water resources planning and risk assessment. Streamflow characteristics are usually estimated starting from measurements of river discharge at stream gauging stations. However, the lack of observations at the site of interest, as well as measurement inaccuracies, inevitably lead to the need for predictive models. Regional analysis is a classical approach to estimating river flow characteristics at sites where little or no data exist, and specific techniques are needed to regionalize the hydrological variables over the considered area. Top-kriging, or topological kriging, is a kriging interpolation procedure that takes into account the geometric organization and structure of the hydrographic network, the catchment area and the nested nature of catchments. The continuous processes in space defined for the point variables are represented by a variogram; in Top-kriging, however, the measurements are not point values but are defined over a non-zero catchment area. Top-kriging is applied here over the geographical space of the Tuscany Region, in Central Italy. The analysis is carried out on the discharge data of 57 consistent runoff gauges, recorded from 1923 to 2014. Top-kriging also gives an estimate of the prediction uncertainty in addition to the prediction itself. The results are validated using a cross-validation procedure implemented in the package rtop of the open-source statistical environment R, and compared using different error measures. Top-kriging seems to perform better for nested and larger-scale catchments, but not for headwater catchments or where variability between neighbouring catchments is high.
Bancone, Germana; Gornsawun, Gornpan; Chu, Cindy S; Porn, Pen; Pal, Sampa; Bansil, Pooja; Domingo, Gonzalo J; Nosten, Francois
2018-01-01
Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most common enzymopathy in the human population, affecting an estimated 8% of the world population, especially those living in areas of past and present malaria endemicity. Decreased G6PD enzymatic activity is associated with drug-induced hemolysis and increased risk of severe neonatal hyperbilirubinemia leading to brain damage. The G6PD gene is on the X chromosome; mutations therefore cause enzymatic deficiency in hemizygous males and homozygous females, while the majority of heterozygous females have an intermediate activity (between 30-80% of normal) with a large distribution into the ranges of deficiency and normality. Current G6PD qualitative tests are unable to diagnose intermediate G6PD activities, which could hinder wide use of 8-aminoquinolines for Plasmodium vivax elimination. The aim of the study was to assess the diagnostic performance of the new CareStart G6PD quantitative biosensor. A total of 150 samples of venous blood with G6PD deficient, intermediate and normal phenotypes were collected among healthy volunteers living along the north-western Thailand-Myanmar border. Samples were analyzed by complete blood count, by the gold standard spectrophotometric assay using Trinity kits, and by the latest model of the CareStart G6PD biosensor, which analyzes both G6PD and hemoglobin. Bland-Altman comparison of the CareStart normalized G6PD values to those of the gold standard assay showed a strong bias in values, resulting in poor area-under-the-curve values for both the 30% and 80% thresholds. A receiver operating characteristic analysis identified threshold values for the CareStart product equivalent to the 30% and 80% gold standard values with good sensitivity and specificity: 100% and 92% (for 30% G6PD activity) and 92% and 94% (for 80% activity), respectively. The CareStart G6PD biosensor represents a significant improvement for quantitative diagnosis of G6PD deficiency over previous versions. Further improvements and validation studies are required to assess its utility for informing radical cure decisions in malaria endemic settings.
Gornsawun, Gornpan; Chu, Cindy S.; Porn, Pen; Pal, Sampa; Bansil, Pooja
2018-01-01
Introduction Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most common enzymopathy in humans, affecting an estimated 8% of the world population, especially those living in areas of past and present malaria endemicity. Decreased G6PD enzymatic activity is associated with drug-induced hemolysis and an increased risk of severe neonatal hyperbilirubinemia leading to brain damage. The G6PD gene is on the X chromosome; mutations therefore cause enzymatic deficiency in hemizygous males and homozygous females, while the majority of heterozygous females have intermediate activity (between 30% and 80% of normal) with a broad distribution into the deficient and normal ranges. Current qualitative G6PD tests cannot diagnose intermediate G6PD activity, which could hinder wide use of 8-aminoquinolines for Plasmodium vivax elimination. The aim of the study was to assess the diagnostic performance of the new CareStart G6PD quantitative biosensor. Methods A total of 150 venous blood samples with G6PD-deficient, intermediate and normal phenotypes were collected from healthy volunteers living along the north-western Thailand-Myanmar border. Samples were analyzed by complete blood count, by the gold-standard spectrophotometric assay using Trinity kits, and by the latest model of the CareStart G6PD biosensor, which measures both G6PD and hemoglobin. Results Bland-Altman comparison of the normalized CareStart G6PD values with those of the gold-standard assay showed a strong bias, resulting in poor area-under-the-curve values for both the 30% and 80% thresholds. A receiver operating characteristic analysis identified threshold values for the CareStart product equivalent to the 30% and 80% gold-standard values with good sensitivity and specificity: 100% and 92% (for 30% G6PD activity) and 92% and 94% (for 80% activity), respectively. Conclusion The CareStart G6PD biosensor represents a significant improvement for the quantitative diagnosis of G6PD deficiency over previous versions. Further improvements and validation studies are required to assess its utility for informing radical cure decisions in malaria-endemic settings. PMID:29738562
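The threshold search described above can be reproduced in outline with a receiver operating characteristic curve and Youden's J statistic; the sketch below runs on synthetic data (the simulated bias, noise and sample values are assumptions for illustration, not the study's measurements):

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Synthetic stand-ins for 150 samples: gold-standard activity (% of normal)
# and a biosensor reading with a hypothetical proportional bias plus noise.
gold = rng.uniform(0.0, 150.0, 150)
biosensor = 0.7 * gold + rng.normal(0.0, 8.0, 150)

deficient = (gold < 30.0).astype(int)   # below the 30% activity cut-off
# Low readings indicate deficiency, so negate the reading as the score.
fpr, tpr, thr = roc_curve(deficient, -biosensor)
best = thr[np.argmax(tpr - fpr)]        # Youden's J = sensitivity + specificity - 1
print(f"device threshold equivalent to the 30% cut-off: {-best:.1f}")
```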
Mähringer-Kunz, Aline; Kloeckner, Roman; Pitton, Michael B; Düber, Christoph; Schmidtmann, Irene; Galle, Peter R; Koch, Sandra; Weinmann, Arndt
2017-07-01
Several scoring systems that guide patients' treatment regimen for transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) have been introduced, but none has gained widespread acceptance in clinical practice. The purpose of this study was to externally validate the Selection for TrAnsarterial chemoembolization TrEatment (STATE)-score and START-strategy [i.e., sequential use of the STATE-score and Assessment for Retreatment with TACE (ART)-score]. From January 2000 to September 2015, 933 patients with HCC underwent TACE at our institution. All variables needed to calculate the STATE-score and implement the START-strategy were determined. STATE comprised serum albumin, up-to-seven criteria, and C-reactive protein (CRP). ART comprised an increase in aspartate aminotransferase, the Child-Pugh score, and radiological tumor response. Overall survival was calculated, and multivariate analysis was performed. In addition, the STATE-score and START-strategy were validated using Harrell's C-index and the integrated Brier score (IBS). The STATE-score was calculated in 228 patients. Low and high STATE-scores corresponded to median survival of 14.3 and 20.2 months, respectively. Harrell's C was 0.558 and the IBS 0.133. For the STATE-score, significant predictors of survival were the up-to-seven criteria (p = 0.006) and albumin (p = 0.022). CRP values were not predictive (p = 0.367). The ART-score was calculated in 207 patients. Combining the STATE-score and ART-score led to a Harrell's C of 0.580 and an IBS of 0.132. The STATE-score was unable to reliably determine suitability for initial TACE. The START-strategy only slightly improved the predictive ability compared to the ART-score alone. Therefore, neither the STATE-score nor the START-strategy alone provides sufficient certainty for clear-cut clinical decisions.
Advanced Cooling for High Power Electric Actuators
1993-01-01
heat and heat transfer rates. At point B, the fluid temperature reaches the melting temperature of the PCM and it starts to melt, storing energy in the...working fluid through the duty cycle represented by the square wave in the upper half of the figure. Starting at point A, the actuator goes to peak load...form of latent heat. As the solid material melts, the coolant temperature continues to rise, but at a much lower rate, as the heat conducts through the
The role of the optimization process in illumination design
NASA Astrophysics Data System (ADS)
Gauvin, Michael A.; Jacobsen, David; Byrne, David J.
2015-07-01
This paper examines the role of the optimization process in illumination design. We will discuss why the starting point of the optimization process is crucial to a better design and why it is also important that the user understands the basic design problem and implements the correct merit function. Both a brute-force method and the Downhill Simplex method will be used to demonstrate optimization methods, with a focus on using interactive design tools to create better starting points to streamline the optimization process.
Matsuda, F; Lan, W C; Tanimura, R
1999-02-01
In Matsuda's 1996 study, 4- to 11-yr.-old children (N = 133) watched two cars running on two parallel tracks on a CRT display and judged whether their durations and distances were equal and, if not, which was larger. In the present paper, the relative contributions of the four critical stimulus attributes (whether temporal starting points, temporal stopping points, spatial starting points, and spatial stopping points were the same or different between two cars) to the production of errors were quantitatively estimated based on the data for rates of errors obtained by Matsuda. The present analyses made it possible not only to understand numerically the findings about qualitative characteristics of the critical attributes described by Matsuda, but also to add more detailed findings about them.
Malaria and global change: Insights, uncertainties and possible surprises
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, P.H.; Steel, A.
Malaria may change with global change. Indeed, global change may affect malaria risk and malaria epidemiology. Malaria risk may change in response to a greenhouse warming; malaria epidemiology, in response to the social, economic, and political developments which a greenhouse warming may trigger. To date, malaria receptivity and epidemiology futures have been explored within the context of equilibrium studies. Equilibrium studies of climate change postulate an equilibrium present climate (the starting point) and a doubled-carbon dioxide climate (the end point), simulate conditions in both instances, and compare the two. What happens while climate changes, i.e., between the starting point and the end point, is ignored. The present paper focuses on malaria receptivity and addresses what equilibrium studies miss, namely transient malaria dynamics.
Peeters, Patrick; Van Biesen, Wim; Veys, Nic; Lemahieu, Wim; De Moor, Bart; De Meester, Johan
2016-04-07
Shared decision making is nowadays acknowledged as an essential step when deciding on starting renal replacement therapy. Valid risk stratification of prognosis is, besides discussing quality of life, crucial in this regard. We intended to validate a recently published risk stratification model in a large cohort of incident patients starting renal replacement therapy in Flanders. During 3 years (2001-2003), the data set collected for the Nederlandstalige Belgische Vereniging voor Nefrologie (NBVN) registry was expanded with parameters of comorbidity. For all incident patients, the abbreviated REIN score (aREIN), i.e. the REIN score without the parameter "mobility", was calculated, and prognostication of mortality at 3, 6 and 12 months after the start of renal replacement therapy (RRT) was evaluated. A total of 3,472 patients started RRT in Flanders during the observation period (mean age 67.6 ± 14.3 years, 56.7% men, 33.6% diabetes). The mean aREIN score was 4.1 ± 2.8, and 56.8, 23.1, 12.6 and 7.4% of patients had a score of ≤4, 5-6, 7-8 or ≥9, respectively. Mortality at 3, 6 and 12 months was 8.6, 14.1 and 19.6% overall, and 13.2, 21.5 and 31.9% in the group aged >75 years, respectively. In ROC analysis, the aREIN score had an AUC of 0.74 for prediction of survival at 3, 6 and 12 months. There was an incremental increase in mortality with the aREIN score, from 5.6 to 45.8% mortality at 6 months for those with a score ≤4 or ≥9, respectively. The aREIN score is a useful tool to predict the short-term prognosis of patients starting renal replacement therapy based on comorbidity and age, and delivers meaningful discrimination between low- and high-risk populations. As such, it can be a useful instrument to incorporate into shared decision making on whether or not starting dialysis is worthwhile.
The Orbiting Carbon Observatory: NASA's First Dedicated Carbon Dioxide Mission
NASA Technical Reports Server (NTRS)
Crisp, D.
2008-01-01
The Orbiting Carbon Observatory is scheduled for launch from Vandenberg Air Force Base in California in January 2009. This Earth System Science Pathfinder (ESSP) mission carries and points a single instrument that incorporates 3 high-resolution grating spectrometers designed to measure the absorption of reflected sunlight by near-infrared carbon dioxide (CO2) and molecular oxygen bands. These spectra will be analyzed to retrieve estimates of the column-averaged CO2 dry air mole fraction, XCO2. Pre-flight qualification and calibration tests completed in early 2008 indicate that the instrument will provide high quality XCO2 data. The instrument was integrated into the spacecraft, and the completed Observatory was qualified and tested during the spring and summer of 2008, in preparation for delivery to the launch site in the fall of this year. The Observatory will initially be launched into a 635 km altitude, near-polar orbit. The on-board propulsion system will then raise the orbit to 705 km and insert OCO into the Earth Observing System Afternoon Constellation (A-Train). The first routine science observations are expected about 45 days after launch. Calibrated spectral radiances will be archived starting about 6 months later. An exploratory XCO2 product will be validated and then archived starting about 3 months after that.
The Orbiting Carbon Observatory: NASA's first dedicated carbon dioxide mission
NASA Astrophysics Data System (ADS)
Crisp, D.
2008-10-01
The Orbiting Carbon Observatory is scheduled for launch from Vandenberg Air Force Base in California in January 2009. This Earth System Science Pathfinder (ESSP) mission carries and points a single instrument that incorporates 3 high-resolution grating spectrometers designed to measure the absorption of reflected sunlight by near-infrared carbon dioxide (CO2) and molecular oxygen bands. These spectra will be analyzed to retrieve estimates of the column-averaged CO2 dry air mole fraction, XCO2. Pre-flight qualification and calibration tests completed in early 2008 indicate that the instrument will provide high quality XCO2 data. The instrument was integrated into the spacecraft, and the completed Observatory was qualified and tested during the spring and summer of 2008, in preparation for delivery to the launch site in the fall of this year. The Observatory will initially be launched into a 635 km altitude, near-polar orbit. The on-board propulsion system will then raise the orbit to 705 km and insert OCO into the Earth Observing System Afternoon Constellation (A-Train). The first routine science observations are expected about 45 days after launch. Calibrated spectral radiances will be archived starting about 6 months later. An exploratory XCO2 product will be validated and then archived starting about 3 months after that.
Setting the scene for SWOT: global maps of river reach hydrodynamic variables
NASA Astrophysics Data System (ADS)
Schumann, Guy J.-P.; Durand, Michael; Pavelsky, Tamlin; Lion, Christine; Allen, George
2017-04-01
Credible and reliable characterization of discharge from the Surface Water and Ocean Topography (SWOT) mission using the Manning-based algorithms needs prior estimates constraining reach-scale channel roughness, base flow and river bathymetry. In some places, any one of those variables may exist as a measurement, often only at a station, or sometimes as a basin-wide model estimate; to date, however, none of them exists at the scale required for SWOT, and they therefore need to be mapped at a continental scale. The prior estimates will be employed to produce initial discharge estimates, which will be used as starting guesses for the various Manning-based algorithms and refined using the SWOT measurements themselves. A multitude of reach-scale variables were derived, including Landsat-based width, SRTM slope and accumulation area. As a possible starting point for building the prior database of low flow, river bathymetry and channel roughness estimates, we employed a variety of sources, including data from all GRDC records, simulations from long-time runs of the global water balance model (WBM), and reach-based calculations from hydraulic geometry relationships as well as Manning's equation. Here, we present the first global maps of this prior database with some initial validation, caveats and prospective uses.
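Manning's equation, used above for the reach-based calculations, relates discharge to roughness, channel geometry and slope; a minimal sketch in SI units (the reach values below are hypothetical, not from the prior database):

```python
import math

def manning_discharge(n, area_m2, hydraulic_radius_m, slope):
    """Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical reach: n = 0.035, A = 250 m^2, R = 3.2 m, S = 0.0002
print(f"Q ≈ {manning_discharge(0.035, 250.0, 3.2, 2e-4):.0f} m^3/s")
```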
PCC Framework for Program-Generators
NASA Technical Reports Server (NTRS)
Kong, Soonho; Choi, Wontae; Yi, Kwangkeun
2009-01-01
In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached to the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
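The consumer-side check amounts to verifying, in one pass, that the shipped solution maps to itself under the analysis' transfer function. A toy sketch, with a made-up reachability domain standing in for the paper's abstract-parsing domain:

```python
def is_fixed_point(transfer, solution):
    """Consumer-side certificate check: a supplied analysis result is
    valid if applying the transfer function once reproduces it exactly,
    avoiding recomputing the fixed point from scratch by iteration."""
    return transfer(solution) == solution

# Toy transfer function over reachable nonterminals of a grammar
# (a hypothetical stand-in for the abstract-parsing domain).
PRODUCTIONS = {"S": {"A", "B"}, "A": {"B"}, "B": set()}

def transfer(reachable):
    out = set(reachable)
    for nt in reachable:
        out |= PRODUCTIONS.get(nt, set())
    return frozenset(out)

certificate = frozenset({"S", "A", "B"})   # shipped with the generator
assert is_fixed_point(transfer, certificate)
```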
Discriminantly Valid Personality Measures: Some Propositions. Research Bulletin No. 339.
ERIC Educational Resources Information Center
Jackson, Douglas N.
Starting with the premise that the construct-oriented approach is the only viable approach to personality assessment, this paper considers five propositions. First, a prerequisite to generalizable and valid psychometric measurement of personality rests on the choice of broad-based constructs with systematic univocal definitions. Next, measures…
Validating an Asthma Case Detection Instrument in a Head Start Sample
ERIC Educational Resources Information Center
Bonner, Sebastian; Matte, Thomas; Rubin, Mitchell; Sheares, Beverley J.; Fagan, Joanne K.; Evans, David; Mellins, Robert B.
2006-01-01
Although specific tests screen children in preschool programs for vision, hearing, and dental conditions, there are no published validated instruments to detect preschool-age children with asthma, one of the most common pediatric chronic conditions affecting children in economically disadvantaged communities of color. As part of an asthma…
2010-11-01
peer, racoon (IKE-daemon) will start authenticating using certificates. After a successful authentication, IPSec security associations will be set up...colour had credentials from one CA. Racoon and ipsec-tools are open-source software, implementing IKE and IPSec. Validation of the PCN Concept; Mobility
76 FR 22308 - Airworthiness Directives; Airbus Model A340-541 and -642 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-21
... airworthiness information (MCAI) originated by an aviation authority of another country to identify and correct... PB201 were de-validated starting from the SRM revision issued on January 2009. The terminology ``De... ``de-validated SRM'' repairs and, if necessary, to apply the associated corrective actions [repair...
Ecological validity of virtual environments to assess human navigation ability
van der Ham, Ineke J. M.; Faber, Annemarie M. E.; Venselaar, Matthijs; van Kreveld, Marc J.; Löffler, Maarten
2015-01-01
Route memory is frequently assessed in virtual environments. These environments can be presented in a fully controlled manner and are easy to use. Yet they lack the physical involvement that participants have when navigating real environments. For some aspects of route memory this may result in reduced performance in virtual environments. We assessed route memory performance in four different environments: real, virtual, virtual with directional information (compass), and hybrid. In the hybrid environment, participants walked the route outside on an open field, while all route information (i.e., path, landmarks) was shown simultaneously on a handheld tablet computer. Results indicate that performance in the real-life environment was better than in the virtual conditions for tasks relying on survey knowledge, such as pointing to the start and end points and map drawing. Performance in the hybrid condition, however, hardly differed from real-life performance. Performance in the virtual environment did not benefit from directional information. Given these findings, the hybrid condition may offer the best of both worlds: the performance level is comparable to that of real life for route memory, yet it offers full control of visual input during route learning. PMID:26074831
Chemical genetics of Plasmodium falciparum
Guiguemde, W. Armand; Shelat, Anang A.; Bouck, David; Duffy, Sandra; Crowther, Gregory J.; Davis, Paul H.; Smithson, David C.; Connelly, Michele; Clark, Julie; Zhu, Fangyi; Jiménez-Díaz, María B; Martinez, María S; Wilson, Emily B.; Tripathi, Abhai K.; Gut, Jiri; Sharlow, Elizabeth R.; Bathurst, Ian; El Mazouni, Farah; Fowble, Joseph W; Forquer, Isaac; McGinley, Paula L; Castro, Steve; Angulo-Barturen, Iñigo; Ferrer, Santiago; Rosenthal, Philip J.; DeRisi, Joseph L; Sullivan, David J.; Lazo, John S.; Roos, David S.; Riscoe, Michael K.; Phillips, Margaret A.; Rathod, Pradipsinh K.; Van Voorhis, Wesley C.; Avery, Vicky M; Guy, R. Kiplin
2010-01-01
Malaria caused by Plasmodium falciparum is a catastrophic disease worldwide (880,000 deaths yearly). Vaccine development has proved difficult and resistance has emerged for most antimalarials. In order to discover new antimalarial chemotypes, we have employed a phenotypic forward chemical genetic approach to assay 309,474 chemicals. Here we disclose structures and biological activity of the entire library, many of which exhibited potent in vitro activity against drug resistant strains, and detailed profiling of 172 representative candidates. A reverse chemical genetic study identified 19 new inhibitors of 4 validated drug targets and 15 novel binders among 61 malarial proteins. Phylochemogenetic profiling in multiple organisms revealed similarities between Toxoplasma gondii and mammalian cell lines and dissimilarities between P. falciparum and related protozoans. One exemplar compound displayed efficacy in a murine model. Overall, our findings provide the scientific community with new starting points for malaria drug discovery. PMID:20485428
Critical phenomena in active matter
NASA Astrophysics Data System (ADS)
Paoluzzi, M.; Maggi, C.; Marini Bettolo Marconi, U.; Gnan, N.
2016-11-01
We investigate the effect of self-propulsion on a mean-field order-disorder transition. Starting from a φ4 scalar field theory subject to exponentially correlated noise, we exploit the unified colored-noise approximation to map the nonequilibrium active dynamics onto an effective equilibrium one. This allows us to follow the evolution of the second-order critical point as a function of the noise parameters: the correlation time τ and the noise strength D. Our results suggest that the universality class of the model remains unchanged. We also estimate the effect of Gaussian fluctuations on the mean-field approximation, finding an Ornstein-Zernike-like expression for the static structure factor at long wavelengths. Finally, to assess the validity of our predictions, we compare the mean-field theoretical results with numerical simulations of active Lennard-Jones particles in two and three dimensions, finding good qualitative agreement at small τ values.
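For reference, a standard form of the two ingredients named above, the φ4 free-energy functional and exponentially correlated (Ornstein-Uhlenbeck) noise; the coefficients and conventions here are the textbook ones and may differ from those used in the paper:

```latex
\mathcal{F}[\varphi] = \int \mathrm{d}^d x \,\Big[ \tfrac{1}{2}(\nabla\varphi)^2
  + \tfrac{a}{2}\,\varphi^2 + \tfrac{u}{4}\,\varphi^4 \Big],
\qquad
\langle \eta(t)\,\eta(t') \rangle = \frac{D}{\tau}\, e^{-|t-t'|/\tau}
```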
Verification of adolescent self-reported smoking.
Kentala, Jukka; Utriainen, Pekka; Pahkala, Kimmo; Mattila, Kari
2004-02-01
Smoking and the validity of information obtained on it are often questioned in view of the widespread belief that adolescents tend to under- or over-report the habit. The aim here was to verify smoking habits as reported in a questionnaire given in conjunction with dental examinations, by asking participants directly whether they smoked or not and by performing biochemical measurements of thiocyanate in the saliva and carbon monoxide in the expired air. The series consisted of 150 pupils in the ninth grade (age 15 years). The reports in the questionnaires seemed to provide a reliable estimate of adolescent smoking, the sensitivity of the method being 81-96% and the specificity 77-95%. Biochemical verification or control of smoking proved unnecessary in normal dental practice. Accepting the information offered by the patient provides a good starting point for health education and for work motivating and supporting self-directed breaking of the habit.
Simple model of hydrophobic hydration.
Lukšič, Miha; Urbic, Tomaz; Hribar-Lee, Barbara; Dill, Ken A
2012-05-31
Water is an unusual liquid in its solvation properties. Here, we model the process of transferring a nonpolar solute into water. Our goal was to capture the physical balance between water's hydrogen bonding and van der Waals interactions in a model that is simple enough to be nearly analytical and not heavily computational. We develop a 2-dimensional Mercedes-Benz-like model of water with which we compute the free energy, enthalpy, entropy, and the heat capacity of transfer as a function of temperature, pressure, and solute size. As validation, we find that this model gives the same trends as Monte Carlo simulations of the underlying 2D model and gives qualitative agreement with experiments. The advantages of this model are that it gives simple insights and that computational time is negligible. It may provide a useful starting point for developing more efficient and more realistic 3D models of aqueous solvation.
The Multidimensional Assessment of Interoceptive Awareness (MAIA)
Mehling, Wolf E.; Price, Cynthia; Daubenmier, Jennifer J.; Acree, Mike; Bartmess, Elizabeth; Stewart, Anita
2012-01-01
This paper describes the development of a multidimensional self-report measure of interoceptive body awareness. The systematic mixed-methods process involved reviewing the current literature, specifying a multidimensional conceptual framework, evaluating prior instruments, developing items, and analyzing focus group responses to scale items by instructors and patients of body awareness-enhancing therapies. Following refinement by cognitive testing, items were field-tested in students and instructors of mind-body approaches. Final item selection was achieved by submitting the field test data to an iterative process using multiple validation methods, including exploratory cluster and confirmatory factor analyses, comparison between known groups, and correlations with established measures of related constructs. The resulting 32-item multidimensional instrument assesses eight concepts. The psychometric properties of these final scales suggest that the Multidimensional Assessment of Interoceptive Awareness (MAIA) may serve as a starting point for research and further collaborative refinement. PMID:23133619
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pooley, G.R.
In the aftermath of the Cold War it becomes necessary to explore the validity of nuclear deterrence as the cornerstone of the United States National Military Strategy for the upcoming period of transition in international relations. Using the current world situation as a starting point, the evolving trends in international relations, arms control and nuclear proliferation, the strategic threat and the evolution of technology will be analyzed in an effort to forecast the complexion of international relations twenty years hence. Then, within this context, nuclear deterrence and a non-nuclear alternative, nonoffensive defense, proposed by the Danish political scientist Bjorn Moller, will be examined. In the final analysis, this project will suggest an appropriate direction for the evolution of the United States' National Military Strategy which, in the opinion of the author, provides the best probability for long-term world peace.
Pilger, Beatrice D; Cui, Can; Coen, Donald M
2004-05-01
The interaction between the catalytic subunit Pol and the processivity subunit UL42 of herpes simplex virus DNA polymerase has been characterized structurally and mutationally and is a potential target for novel antiviral drugs. We developed and validated an assay for small molecules that could disrupt the interaction of UL42 and a Pol-derived peptide and used it to screen approximately 16,000 compounds. Of 37 "hits" identified, four inhibited UL42-stimulated long-chain DNA synthesis by Pol in vitro, of which two exhibited little inhibition of polymerase activity by Pol alone. One of these specifically inhibited the physical interaction of Pol and UL42 and also inhibited viral replication at concentrations below those that caused cytotoxic effects. Thus, a small molecule can inhibit this protein-protein interaction, which provides a starting point for the discovery of new antiviral drugs.
Studies of particle wake potentials in plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian N.; Graziani, Frank R.; Glosli, James N.; Strozzi, David J.; Surh, Michael P.; Richards, David F.; Decyk, Viktor K.; Mori, Warren B.
2011-09-01
A detailed understanding of electron stopping and scattering in plasmas with variable values for the number of particles within a Debye sphere is still not at hand. Presently, there is some disagreement in the literature concerning the proper description of these processes. Theoretical models assume electrostatic (Coulomb force) interactions between particles and neglect magnetic effects. Developing and validating proper descriptions requires studying the processes using first-principle plasma simulations. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and BEPS. In this paper, we compare the wakes observed in these simulations with each other and predictions from collisionless kinetic theory. The relevance of the work to Fast Ignition is discussed.
Roy-Steiner equations for pion-nucleon scattering
NASA Astrophysics Data System (ADS)
Ditsche, C.; Hoferichter, M.; Kubis, B.; Meißner, U.-G.
2012-06-01
Starting from hyperbolic dispersion relations, we derive a closed system of Roy-Steiner equations for pion-nucleon scattering that respects analyticity, unitarity, and crossing symmetry. We work out analytically all kernel functions and unitarity relations required for the lowest partial waves. In order to suppress the dependence on the high-energy regime we also consider once- and twice-subtracted versions of the equations, where we identify the subtraction constants with subthreshold parameters. Assuming Mandelstam analyticity we determine the maximal range of validity of these equations. As a first step towards the solution of the full system we cast the equations for the $\pi\pi \to \bar{N}N$ partial waves into the form of a Muskhelishvili-Omnès problem with finite matching point, which we solve numerically in the single-channel approximation. We investigate in detail the role of individual contributions to our solutions and discuss some consequences for the spectral functions of the nucleon electromagnetic form factors.
NASA Astrophysics Data System (ADS)
Dias, R. G.; Gouveia, J. D.
2015-11-01
We present a method of construction of exact localized many-body eigenstates of the Hubbard model in decorated lattices, both for U = 0 and U → ∞. These states are localized with respect to both hole and particle movement. The starting point of the method is the construction of a plaquette or a set of plaquettes with a higher symmetry than that of the whole lattice. Using a simple set of rules, the tight-binding localized state in such a plaquette can be divided, folded and unfolded to new plaquette geometries. This set of rules is also valid for the construction of a localized state for one hole in the U → ∞ limit of the same plaquette, assuming a spin configuration which is a uniform linear combination of all possible permutations of the set of spins in the plaquette.
Frau, Juan; Glossman-Mitnik, Daniel
2017-01-01
Amino acids and peptides have the potential to act as corrosion inhibitors. The chemical reactivity descriptors that arise from Conceptual DFT for the twenty natural amino acids have been calculated using the latest Minnesota family of density functionals. In order to verify the validity of calculating the descriptors directly from the HOMO and LUMO, a comparison has been performed with those obtained through ΔSCF results. Moreover, the active sites for nucleophilic and electrophilic attacks have been identified through Fukui function indices, the dual descriptor Δf(r) and the electrophilic and nucleophilic Parr functions. The results could be of interest as a starting point for the study of large peptides, where the calculation of the radical cation and anion of each system may be computationally harder and more costly. PMID:28361050
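The descriptors computed "directly from the HOMO and LUMO" usually rest on the standard Koopmans-type working definitions, with ΔSCF total-energy differences as the finite-difference alternative; a sketch of the usual expressions (conventions may differ from the paper's):

```latex
\chi \approx -\tfrac{1}{2}\left(\varepsilon_{\mathrm{HOMO}} + \varepsilon_{\mathrm{LUMO}}\right),
\qquad
\eta \approx \varepsilon_{\mathrm{LUMO}} - \varepsilon_{\mathrm{HOMO}},
\qquad
\omega = \frac{\chi^{2}}{2\eta}
```

In the ΔSCF route, the vertical ionization potential and electron affinity are $I = E(N-1) - E(N)$ and $A = E(N) - E(N+1)$, from which $\chi = (I+A)/2$ and $\eta = I - A$; this is the radical cation/anion calculation the abstract describes as costly for large peptides.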
Chemical probes targeting epigenetic proteins: Applications beyond oncology
Ackloo, Suzanne; Brown, Peter J.; Müller, Susanne
2017-01-01
Epigenetic chemical probes are potent, cell-active, small-molecule inhibitors or antagonists of specific domains in a protein; they have been indispensable for studying bromodomains and protein methyltransferases. The Structural Genomics Consortium (SGC), comprising scientists from academic and pharmaceutical laboratories, has generated most of the current epigenetic chemical probes. Moreover, the SGC has shared about 4,000 aliquots of these probes, which have been used primarily for phenotypic profiling or to validate targets in cell lines or primary patient samples cultured in vitro. Epigenetic chemical probes have been critical tools in oncology research and have uncovered mechanistic insights into well-established targets, as well as identified new therapeutic starting points. Indeed, the literature primarily links epigenetic proteins to oncology, but applications in inflammation, viral, metabolic and neurodegenerative diseases are now being reported. We summarize the literature of these emerging applications and provide examples where existing probes might be used. PMID:28080202
A Population Where Men Live As Long As Women: Villagrande Strisaili, Sardinia
Poulain, Michel; Pes, Gianni; Salaris, Luisa
2011-01-01
Usually women live longer than men and female centenarians largely outnumber male centenarians. The findings of previous studies identifying a population with a femininity ratio close to 1.0 among centenarians in the mountainous region of Sardinia was the starting point of an in-depth investigation in order to compare mortality trajectories between men and women in that population. The exceptional survival of men compared to women emerges from the comparison with similar Italian data. Age exaggeration for men has been strictly excluded as a result of the age validation procedure. The discussion suggests that besides biological/genetic factors, the behavioral factors including life style, demographic behavior, family support, and community characteristics may play an important role. No single explanation is likely to account for such an exceptional situation and a fully integrated multidisciplinary approach is urgently needed. PMID:22132327
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsai, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
The Quality of the Head Start Planned Variation Data. Volume II.
ERIC Educational Resources Information Center
Walker, Debbie Klein; And Others
This publication continues the descriptions of the cognitive, psychomotor, and socioemotional measures used in all years of the Head Start Planned Variation Evaluation study. Included is a detailed examination of each measure, a discussion of the theory behind it, and a review of the available data on the measure's reliability, validity and other…
Evaluating the Validity of Classroom Observations in the Head Start Designation Renewal System
ERIC Educational Resources Information Center
Mashburn, Andrew J.
2017-01-01
Classroom observations are increasingly common in education policies as a means to assess the quality of teachers and/or education programs for purposes of making high-stakes decisions. This article considers one policy, the Head Start Designation Renewal System (DRS), which involves classroom observations to assess the quality of Head Start…
NASA Astrophysics Data System (ADS)
Sato, Daiki; Ohdaira, Keisuke
2018-04-01
We succeeded in crystallizing hydrogenated amorphous silicon (a-Si:H) films by flash lamp annealing (FLA) at a low fluence by intentionally creating starting points that trigger explosive crystallization (EC). We confirm that a partly thickened a-Si region can induce the crystallization of a-Si films. A periodic wavy structure is observed on the surface of the polycrystalline silicon (poly-Si) on and near the thick parts, a clear indication of the emergence of EC. Creating partly thick a-Si parts can thus be effective for controlling the starting point of FLA-induced crystallization and can realize the crystallization of a-Si with high reproducibility. We also compare the effects of creating thick parts at the center and along the edge of the substrates, and find that a thick part along the edge of the substrate leads to the initiation of crystallization at a lower fluence.
Nearby Search Indekos Based Android Using A Star (A*) Algorithm
NASA Astrophysics Data System (ADS)
Siregar, B.; Nababan, EB; Rumahorbo, JA; Andayani, U.; Fahmi, F.
2018-03-01
An Indekos, or rented room, is a temporary residence occupied for months or years. Members of the academic community who come from out of town need such temporary housing during their education, teaching, or duties, but they often have difficulty finding an Indekos because of a lack of information about them. Moreover, newcomers do not know the area around the campus and want the shortest path from the Indekos to the campus. This problem can be solved by implementing the A Star (A*) algorithm, a shortest-path algorithm used here to find the shortest path from the campus to an Indekos, with the faculties on campus serving as the starting points of the search; this choice of starting points allows students to begin the search from their own faculty. The mobile-based application facilitates the search anytime and anywhere. Based on the experimental results, the A* algorithm can find the shortest path with 86.67% accuracy.
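A minimal sketch of the A* search at the core of such an application, on a hypothetical 4-connected grid map (the grid, coordinates and unit cost model are illustrative assumptions, not the paper's road network):

```python
import heapq

def a_star(grid, start, goal):
    """A* shortest path on a 4-connected grid; 0 = walkable, 1 = blocked.
    Manhattan distance is an admissible heuristic here, so the first
    time the goal is popped from the queue, the path found is optimal."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]   # (f, g, node, path)
    seen = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                    and grid[nr][nc] == 0 and (nr, nc) not in seen:
                heapq.heappush(open_set, (g + 1 + h((nr, nc)), g + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None

# Hypothetical map: campus at (0, 0), an Indekos at (3, 3).
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(a_star(grid, (0, 0), (3, 3)))
```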
Carmona-Bayonas, Alberto; Jiménez-Fonseca, Paula; Virizuela Echaburu, Juan; Antonio, Maite; Font, Carme; Biosca, Mercè; Ramchandani, Avinash; Martínez, Jerónimo; Hernando Cubero, Jorge; Espinosa, Javier; Martínez de Castro, Eva; Ghanem, Ismael; Beato, Carmen; Blasco, Ana; Garrido, Marcelo; Bonilla, Yaiza; Mondéjar, Rebeca; Arcusa Lanza, María Ángeles; Aragón Manrique, Isabel; Manzano, Aránzazu; Sevillano, Elena; Castañón, Eduardo; Cardona, Mercé; Gallardo Martín, Elena; Pérez Armillas, Quionia; Sánchez Lasheras, Fernando; Ayala de la Peña, Francisco
2015-02-10
To validate a prognostic score predicting major complications in patients with solid tumors and seemingly stable episodes of febrile neutropenia (FN). The definition of clinical stability implies the absence of organ dysfunction, abnormalities in vital signs, and major infections. We developed the Clinical Index of Stable Febrile Neutropenia (CISNE), with six explanatory variables associated with serious complications: Eastern Cooperative Oncology Group performance status ≥ 2 (2 points), chronic obstructive pulmonary disease (1 point), chronic cardiovascular disease (1 point), mucositis of grade ≥ 2 (National Cancer Institute Common Toxicity Criteria; 1 point), monocytes < 200 per μL (1 point), and stress-induced hyperglycemia (2 points). We integrated these factors into a score ranging from 0 to 8, which classifies patients into three prognostic classes: low (0 points), intermediate (1 to 2 points), and high risk (≥ 3 points). We present a multicenter validation of CISNE. We prospectively recruited 1,133 patients with seemingly stable FN from 25 hospitals. Complication rates in the training and validation subsets, respectively, were 1.1% and 1.1% in low-, 6.1% and 6.2% in intermediate-, and 32.5% and 36% in high-risk patients; mortality rates within each class were 0% in low-, 1.6% and 0% in intermediate-, and 4.3% and 3.1% in high-risk patients. Areas under the receiver operating characteristic curves in the validation subset were 0.652 (95% CI, 0.598 to 0.703) for Talcott, 0.721 (95% CI, 0.669 to 0.768) for Multinational Association for Supportive Care in Cancer (MASCC), and 0.868 (95% CI, 0.827 to 0.903) for CISNE (P = .002 for comparison between CISNE and MASCC). CISNE is a valid model for accurately classifying patients with cancer with seemingly stable FN episodes. © 2015 by American Society of Clinical Oncology.
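The CISNE scoring rule is fully specified in the abstract and translates directly into code; a sketch:

```python
def cisne_score(ecog_ge_2, copd, chronic_cv_disease,
                mucositis_grade_ge_2, monocytes_per_ul, stress_hyperglycemia):
    """CISNE score as defined in the abstract: ECOG PS >= 2 (2 points),
    COPD (1), chronic cardiovascular disease (1), mucositis grade >= 2 (1),
    monocytes < 200/uL (1), stress-induced hyperglycemia (2)."""
    score = (2 * ecog_ge_2 + copd + chronic_cv_disease
             + mucositis_grade_ge_2 + (monocytes_per_ul < 200)
             + 2 * stress_hyperglycemia)
    if score == 0:
        risk = "low"
    elif score <= 2:
        risk = "intermediate"
    else:
        risk = "high"
    return score, risk

# Example: ECOG 2, monocytes 150/uL, no other factors -> 3 points, high risk.
print(cisne_score(True, False, False, False, 150, False))
```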
Inferring pregnancy episodes and outcomes within a network of observational databases
Ryan, Patrick; Fife, Daniel; Gifkins, Dina; Knoll, Chris; Friedman, Andrew
2018-01-01
Administrative claims and electronic health records are valuable resources for evaluating pharmaceutical effects during pregnancy. However, direct measures of gestational age are generally not available. Establishing a reliable approach to infer the duration and outcome of a pregnancy could improve pharmacovigilance activities. We developed and applied an algorithm to define pregnancy episodes in four observational databases: three US-based claims databases: Truven MarketScan® Commercial Claims and Encounters (CCAE), Truven MarketScan® Multi-state Medicaid (MDCD), and the Optum ClinFormatics® (Optum) database and one non-US database, the United Kingdom (UK) based Clinical Practice Research Datalink (CPRD). Pregnancy outcomes were classified as live births, stillbirths, abortions and ectopic pregnancies. Start dates were estimated using a derived hierarchy of available pregnancy markers, including records such as last menstrual period and nuchal ultrasound dates. Validation included clinical adjudication of 700 electronic Optum and CPRD pregnancy episode profiles to assess the operating characteristics of the algorithm, and a comparison of the algorithm's Optum pregnancy start estimates to starts based on dates of assisted conception procedures. Distributions of pregnancy outcome types were similar across all four data sources, and pregnancy episode lengths were as expected for all outcomes, except term lengths in episodes that used amenorrhea and urine pregnancy tests for start estimation. The validation survey found the highest agreement between reviewer-chosen and algorithm-determined characteristics for questions assessing pregnancy status and accuracy of the outcome category, with 99–100% agreement for Optum and CPRD. Outcome date agreement within seven days in either direction ranged from 95–100%, while start date agreement within seven days in either direction ranged from 90–97%. In the Optum validation sensitivity analysis, 73% of algorithm-estimated starts for live births agreed with fertility-procedure-estimated starts within two weeks in either direction; for ectopic pregnancy 77%, stillbirth 47%, and abortion 36%. An algorithm to infer live birth and ectopic pregnancy episodes and outcomes can be applied to multiple observational databases with acceptable accuracy for further epidemiologic research. Less accuracy was found for start date estimation in stillbirth and abortion outcomes in our sensitivity analysis, which may be expected given the nature of these outcomes. PMID:29389968
ERIC Educational Resources Information Center
Kramer, David C.
1983-01-01
Describes a procedure for starting tree cuttings from woody plants, explaining "lag time," recommending materials, and giving step-by-step instructions for rooting and planting. Points out species which are likely candidates for cuttings and provides tips for teachers for developing a unit. (JM)
Robotics virtual rail system and method
Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID; Walton, Miles C [Idaho Falls, ID
2011-07-05
A virtual track or rail system and method is described for execution by a robot. A user, through a user interface, generates a desired path comprised of at least one segment representative of the virtual track for the robot. Start and end points are assigned to the desired path and velocities are also associated with each of the at least one segment of the desired path. A waypoint file is generated including positions along the virtual track representing the desired path with the positions beginning from the start point to the end point including the velocities of each of the at least one segment. The waypoint file is sent to the robot for traversing along the virtual track.
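A minimal sketch of generating such a waypoint file from path segments with per-segment velocities (the data format and the waypoint spacing are assumptions for illustration; the patent text does not specify them):

```python
import math

def waypoint_file(segments, spacing=0.1):
    """Expand a desired path into (x, y, velocity) waypoints.
    Each segment is ((x0, y0), (x1, y1), velocity); positions run from
    the start point of the first segment to the end point of the last."""
    waypoints = []
    for (x0, y0), (x1, y1), v in segments:
        length = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(length / spacing))
        for i in range(steps + 1):
            t = i / steps
            waypoints.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), v))
    return waypoints

# Hypothetical two-segment virtual track with per-segment velocities.
path = [((0.0, 0.0), (1.0, 0.0), 0.5), ((1.0, 0.0), (1.0, 2.0), 0.3)]
print(len(waypoint_file(path)))
```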
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... period will be addressed in July 2011, prior to the start of the 2011 Winter II fishery. Per the quota... inappropriate years of very low harvest as the foundation of the constant catch starting point. The commenters... summer flounder quota at the start of the fishing year that begins January 1, 2011, is required by the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rades, Dirk, E-mail: Rades.Dirk@gmx.net; Dziggel, Liesa; Haatanen, Tiina
2011-07-15
Purpose: To create and validate scoring systems for intracerebral control (IC) and overall survival (OS) of patients irradiated for brain metastases. Methods and Materials: In this study, 1,797 patients were randomly assigned to the test (n = 1,198) or the validation group (n = 599). Two scoring systems were developed, one for IC and another for OS. The scores included prognostic factors found significant on multivariate analyses. Age, performance status, extracerebral metastases, interval from tumor diagnosis to RT, and number of brain metastases were associated with OS. Tumor type, performance status, interval, and number of brain metastases were associated with IC. The score for each factor was determined by dividing the 6-month IC or OS rate (given in percent) by 10. The total score represented the sum of the scores for each factor. The score groups of the test group were compared with the corresponding score groups of the validation group. Results: In the test group, 6-month IC rates were 17% for 14-18 points, 49% for 19-23 points, and 77% for 24-27 points (p < 0.0001). IC rates in the validation group were 19%, 52%, and 77%, respectively (p < 0.0001). In the test group, 6-month OS rates were 9% for 15-19 points, 41% for 20-25 points, and 78% for 26-30 points (p < 0.0001). OS rates in the validation group were 7%, 39%, and 79%, respectively (p < 0.0001). Conclusions: Patients irradiated for brain metastases can be given scores to estimate OS and IC. IC and OS rates of the validation group were similar to those of the test group, demonstrating the validity and reproducibility of both scores.
2013-01-01
Background A scale validated in one language is not automatically valid in another language or culture. The purpose of this study was to validate the English version of the UNESP-Botucatu multidimensional composite pain scale (MCPS) to assess postoperative pain in cats. The English version was developed using translation, back-translation, and review by individuals with expertise in feline pain management. In sequence, validity and reliability tests were performed. Results Of the three domains identified by factor analysis, the internal consistency was excellent for ‘pain expression’ and ‘psychomotor change’ (0.86 and 0.87) but not for ‘physiological variables’ (0.28). Relevant changes in pain scores at clinically distinct time points (e.g., post-surgery, post-analgesic therapy), confirmed the construct validity and responsiveness (Wilcoxon test, p < 0.001). Favorable correlation with the IVAS scores (p < 0.001) and moderate to very good agreement between blinded observers and ‘gold standard’ evaluations, supported criterion validity. The cut-off point for rescue analgesia was > 7 (range 0–30 points) with 96.5% sensitivity and 99.5% specificity. Conclusions The English version of the UNESP-Botucatu-MCPS is a valid, reliable and responsive instrument for assessing acute pain in cats undergoing ovariohysterectomy, when used by anesthesiologists or anesthesia technicians. The cut-off point for rescue analgesia provides an additional tool for guiding analgesic therapy. PMID:23867090
Brondani, Juliana T; Mama, Khursheed R; Luna, Stelio P L; Wright, Bonnie D; Niyom, Sirirat; Ambrosio, Jennifer; Vogel, Pamela R; Padovani, Carlos R
2013-07-17
A scale validated in one language is not automatically valid in another language or culture. The purpose of this study was to validate the English version of the UNESP-Botucatu multidimensional composite pain scale (MCPS) to assess postoperative pain in cats. The English version was developed using translation, back-translation, and review by individuals with expertise in feline pain management. In sequence, validity and reliability tests were performed. Of the three domains identified by factor analysis, the internal consistency was excellent for 'pain expression' and 'psychomotor change' (0.86 and 0.87) but not for 'physiological variables' (0.28). Relevant changes in pain scores at clinically distinct time points (e.g., post-surgery, post-analgesic therapy), confirmed the construct validity and responsiveness (Wilcoxon test, p < 0.001). Favorable correlation with the IVAS scores (p < 0.001) and moderate to very good agreement between blinded observers and 'gold standard' evaluations, supported criterion validity. The cut-off point for rescue analgesia was > 7 (range 0-30 points) with 96.5% sensitivity and 99.5% specificity. The English version of the UNESP-Botucatu-MCPS is a valid, reliable and responsive instrument for assessing acute pain in cats undergoing ovariohysterectomy, when used by anesthesiologists or anesthesia technicians. The cut-off point for rescue analgesia provides an additional tool for guiding analgesic therapy.
Collette, Laurence; Burzykowski, Tomasz; Carroll, Kevin J; Newling, Don; Morris, Tom; Schröder, Fritz H
2005-09-01
The long duration of phase III clinical trials of overall survival (OS) slows down the treatment-development process. It could be shortened by using surrogate end points. Prostate-specific antigen (PSA) is the most studied biomarker in prostate cancer (PCa). This study attempts to validate PSA end points as surrogates for OS in advanced PCa. Individual data from 2,161 advanced PCa patients treated in studies comparing bicalutamide to castration were used in a meta-analytic approach to surrogate end-point validation. PSA response, PSA normalization, time to PSA progression, and longitudinal PSA measurements were considered. The known association between PSA and OS at the individual patient level was confirmed. The association between the effect of intervention on any PSA end point and on OS was generally low (determination coefficient < 0.69). It is a common misconception that high correlation between a biomarker and the true end point justifies the use of the former as a surrogate. To statistically validate surrogate end points, a high correlation between the treatment effects on the surrogate and true end points needs to be established across groups of patients treated with two alternative interventions. The levels of association observed in this study indicate that the effect of hormonal treatment on OS cannot be predicted with a high degree of precision from observed treatment effects on PSA end points, and thus statistical validity is unproven. In practice, non-null treatment effects on OS can be predicted only from precisely estimated large effects on time to PSA progression (TTPP; hazard ratio < 0.50).
A calibration protocol for population-specific accelerometer cut-points in children.
Mackintosh, Kelly A; Fairclough, Stuart J; Stratton, Gareth; Ridgers, Nicola D
2012-01-01
To test a field-based protocol using intermittent activities representative of children's physical activity behaviours, in order to generate behaviourally valid, population-specific accelerometer cut-points for sedentary behaviour and moderate and vigorous physical activity. Twenty-eight children (46% boys) aged 10-11 years wore a hip-mounted uniaxial GT1M ActiGraph and engaged in 6 activities representative of children's play. A validated direct observation protocol was used as the criterion measure of physical activity. Receiver Operating Characteristic (ROC) curve analyses were conducted with four semi-structured activities to determine the accelerometer cut-points. To examine classification differences, cut-points were cross-validated with free-play and DVD-viewing activities. Cut-points of ≤372, >2160 and >4806 counts·min⁻¹, representing sedentary, moderate and vigorous intensity thresholds, respectively, provided the optimal balance between the related needs for sensitivity (accurately detecting activity) and specificity (limiting misclassification of the activity). Cross-validation data demonstrated that these values yielded the best overall kappa scores (0.97; 0.71; 0.62) and high classification agreement (98.6%; 89.0%; 87.2%), respectively. Specificity values of 96-97% showed that the developed cut-points accurately detected physical activity, and sensitivity values (89-99%) indicated that minutes of activity were seldom incorrectly classified as inactivity. The development of an inexpensive and replicable field-based protocol to generate behaviourally valid and population-specific accelerometer cut-points may improve the classification of physical activity levels in children, which could enhance subsequent intervention and observational studies.
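The reported cut-points translate directly into a per-epoch classifier; a sketch (the "light" band between the sedentary and moderate thresholds is an inference from the gaps in the reported thresholds, not a category validated in the study):

```python
def classify_intensity(counts_per_min):
    """Classify uniaxial accelerometer output using the cut-points
    reported in the abstract (GT1M ActiGraph, 10-11-year-olds)."""
    if counts_per_min <= 372:
        return "sedentary"
    if counts_per_min > 4806:
        return "vigorous"
    if counts_per_min > 2160:
        return "moderate"
    return "light"   # between the sedentary and moderate thresholds (assumed)
```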
Design and validation of a model to predict early mortality in haemodialysis patients.
Mauri, Joan M; Clèries, Montse; Vela, Emili
2008-05-01
Mortality and morbidity rates are higher in patients receiving haemodialysis therapy than in the general population. Detection of risk factors related to early death in these patients could aid clinical and administrative decision making. The aims of this study were (1) to identify risk factors (comorbidity and variables specific to haemodialysis) associated with death in the first year following the start of haemodialysis and (2) to design and validate a prognostic model to quantify the probability of death for each patient. An analysis was carried out on all patients starting haemodialysis treatment in Catalonia during the period 1997-2003 (n = 5738). The data source was the Renal Registry of Catalonia, a mandatory population registry. Patients were randomly divided into two samples: 60% (n = 3455) of the total were used to develop the prognostic model and the remaining 40% (n = 2283) to validate it. Logistic regression analysis was used to construct the model. One-year mortality in the total study population was 16.5%. The predictive model included the following variables: age, sex, primary renal disease, grade of functional autonomy, chronic obstructive pulmonary disease, malignant processes, chronic liver disease, cardiovascular disease, initial vascular access and malnutrition. The analyses showed adequate calibration for both the model-development sample and the validation sample (Hosmer-Lemeshow statistic 0.97 and P = 0.49, respectively) as well as adequate discrimination (area under the ROC curve 0.78 in both cases). Risk factors implicated in mortality at one year following the start of haemodialysis have been determined and a prognostic model designed. The validated, easy-to-apply model quantifies individual patient risk attributable to various factors, some of them amenable to correction by directed interventions.
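The development/validation design (60/40 random split, logistic regression, ROC discrimination) can be sketched as follows on synthetic stand-in data (all covariates and outcomes below are simulated, not registry data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for registry covariates (age, comorbidities, ...).
rng = np.random.default_rng(42)
X = rng.normal(size=(5738, 10))
y = (X[:, 0] + rng.normal(size=5738) > 1.5).astype(int)

# 60/40 split mirroring the development/validation design in the abstract.
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.4,
                                              random_state=0)
model = LogisticRegression().fit(X_dev, y_dev)
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC: {auc:.2f}")
```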
Development and Validation of a Photonumeric Scale for Assessment of Chin Retrusion.
Sykes, Jonathan M; Carruthers, Alastair; Hardas, Bhushan; Murphy, Diane K; Jones, Derek; Carruthers, Jean; Donofrio, Lisa; Creutz, Lela; Marx, Ann; Dill, Sara
2016-10-01
A validated scale is needed for objective and reproducible comparisons of chin appearance before and after chin augmentation in practice and clinical studies. To describe the development and validation of the 5-point photonumeric Allergan Chin Retrusion Scale. The Allergan Chin Retrusion Scale was developed to include an assessment guide, verbal descriptors, morphed images, and real subject images for each scale grade. The clinical significance of a 1-point score difference was evaluated in a review of multiple image pairs representing varying differences in severity. Interrater and intrarater reliability was evaluated in a live-subject validation study (N = 298) completed during 2 sessions occurring 3 weeks apart. A difference of ≥1 point on the scale was shown to reflect a clinically meaningful difference (mean [95% confidence interval] absolute score difference, 1.07 [0.94-1.20] for clinically different image pairs and 0.51 [0.39-0.63] for not clinically different pairs). Intrarater agreement between the 2 live-subject validation sessions was substantial (mean weighted kappa = 0.79). Interrater agreement was substantial during the second rating session (0.68, primary end point). The Allergan Chin Retrusion Scale is a validated and reliable scale for physician rating of severity of chin retrusion.
Fu, Sau Nga; Chin, Weng Yee; Wong, Carlos King Ho; Yeung, Vincent Tok Fai; Yiu, Ming Pong; Tsui, Hoi Yee; Chan, Ka Hung
2013-01-01
Objectives To develop and evaluate the psychometric properties of a Chinese questionnaire which assesses the barriers and enablers to commencing insulin in primary care patients with poorly controlled Type 2 diabetes. Research Design and Method Questionnaire items were identified using literature review. Content validation was performed and items were further refined using an expert panel. Following translation, back translation and cognitive debriefing, the translated Chinese questionnaire was piloted on target patients. Exploratory factor analysis and item-scale correlations were performed to test the construct validity of the subscales and items. Internal reliability was tested by Cronbach’s alpha. Results Twenty-seven identified items underwent content validation, translation and cognitive debriefing. The translated questionnaire was piloted on 303 insulin naïve (never taken insulin) Type 2 diabetes patients recruited from 10 government-funded primary care clinics across Hong Kong. Sufficient variability in the dataset for factor analysis was confirmed by Bartlett’s Test of Sphericity (P<0.001). Using exploratory factor analysis with varimax rotation, 10 factors were generated onto which 26 items loaded with loading scores > 0.4 and Eigenvalues >1. Total variance for the 10 factors was 66.22%. Kaiser-Meyer-Olkin measure was 0.725. Cronbach’s alpha coefficients for the first four factors were ≥0.6 identifying four sub-scales to which 13 items correlated. Remaining sub-scales and items with poor internal reliability were deleted. The final 13-item instrument had a four scale structure addressing: ‘Self-image and stigmatization’; ‘Factors promoting self-efficacy; ‘Fear of pain or needles’; and ‘Time and family support’. Conclusion The Chinese Attitudes to Starting Insulin Questionnaire (Ch-ASIQ) appears to be a reliable and valid measure for assessing barriers to starting insulin. This short instrument is easy to administer and may be used by healthcare providers and researchers as an assessment tool for Chinese diabetic primary care patients, including the elderly, who are unwilling to start insulin. PMID:24236071
NASA Astrophysics Data System (ADS)
Wang, Jinhu; Lindenbergh, Roderik; Menenti, Massimo
2017-06-01
Urban road environments contain a variety of objects, including different types of lamp poles and traffic signs. Monitoring these objects is traditionally conducted by visual inspection, which is time consuming and expensive. Mobile laser scanning (MLS) systems sample the road environment efficiently by acquiring large and accurate point clouds. This work proposes a methodology for urban road object recognition from MLS point clouds. The proposed method uses, for the first time, shape descriptors of complete objects to match repetitive objects in large point clouds. To do so, a novel 3D multi-scale shape descriptor is introduced that is embedded in a workflow that efficiently and automatically identifies different types of lamp poles and traffic signs. The workflow starts by tiling the raw point clouds along the scanning trajectory and by identifying non-ground points. After voxelization of the non-ground points, connected voxels are clustered to form candidate objects. For automatic recognition of lamp poles and street signs, a 3D significant-eigenvector-based shape descriptor using voxels (SigVox) is introduced. The 3D SigVox descriptor is constructed by first subdividing the points with an octree into several levels. Next, significant eigenvectors of the points in each voxel are determined by principal component analysis (PCA) and mapped onto the appropriate triangle of a sphere-approximating icosahedron. This step is repeated for different scales. By determining the similarity of 3D SigVox descriptors between candidate point clusters and training objects, street furniture is automatically identified. The feasibility and quality of the proposed method are verified on two point clouds obtained in opposite directions along a 4 km stretch of road. Six types of lamp pole and four types of road sign were selected as objects of interest. Ground truth validation showed that the overall accuracy of the ∼170 automatically recognized objects is approximately 95%. The results demonstrate that the proposed method is able to recognize street furniture in a practical scenario. Remaining difficult cases are touching objects, such as a lamp pole close to a tree.
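The heart of the SigVox descriptor is a per-voxel principal component analysis whose dominant eigenvector is binned on an icosahedron. A minimal sketch of the PCA step follows; the voxel grid, point format and function names are simplifications, not the authors' implementation:

```python
import numpy as np

def voxelize(points, voxel_size):
    """Group 3D points (N x 3 array) into a dictionary {voxel index: points}."""
    keys = np.floor(points / voxel_size).astype(int)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)
    return {k: np.asarray(v) for k, v in voxels.items()}

def significant_eigenvector(voxel_points):
    """Dominant eigenvector of a voxel's point covariance (PCA)."""
    centered = voxel_points - voxel_points.mean(axis=0)
    cov = centered.T @ centered / max(len(voxel_points) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, -1]                   # eigenvector of the largest eigenvalue
```

Mapping each such eigenvector onto the nearest triangle of a subdivided icosahedron (not shown) turns the set of directions into a fixed-length histogram that can be compared between candidate clusters and training objects.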
ERIC Educational Resources Information Center
Coolahan, Kathleen; McWayne, Christine; Fantuzzo, John; Grim, Suzanne
2002-01-01
Examined the construct and concurrent validity of the Parenting Behavior Questionnaire-Head Start (PBQ-HS) with low-income African-American families with preschoolers, and whether parenting styles differed by caregiver characteristics. Derived Active-Responsive, Active-Restrictive, and Passive-Permissive parenting dimensions; the last differed…
USDA-ARS?s Scientific Manuscript database
Young children are not meeting recommendations for vegetable intake. Our objective is to provide evidence of validity and reliability for a pictorial vegetable behavioral assessment for use by federally funded community nutrition programs. Parent/child pairs (n=133) from Head Start and the Special S...
The Pepsi Challenge: Building a Leader-Driven Organization.
ERIC Educational Resources Information Center
Tichy, Noel M.; DeRose, Christopher
1996-01-01
PepsiCo's change-leadership model starts with a teachable point of view, showing trainees how to think in different terms, develop a point of view, test it, crystallize the vision, and implement it. The human resources department plays an important role in articulating the point of view. (SK)
Kolich, Mike
2009-04-01
This article describes a new and more repeatable, reproducible, and valid test method for characterizing lumbar support in automotive seating. Lumbar support is important because it affects occupant accommodation and perceptions of seat comfort. Assessing only the lumbar mechanism, particularly in terms of travel, is inadequate because it does not consider the effects of trim and foam. The Society of Automotive Engineers' next-generation H-Point machine and associated loading protocol were used as the basis for the new test. The method was found to satisfy minimum gage repeatability and reproducibility requirements. Validity was demonstrated through a regression model that revealed that 93.9% of the variance in subjective ratings of poor, uncomfortable lumbar support can be explained by two objective indicators: (1) lumbar support prominence in the full-off position and (2) effective travel. The method can be used to differentiate between seats offering two-way adjustable lumbar support. The best two-way adjustable lumbar seat systems are those that couple little to no lumbar support in the starting or off position (i.e., they are nonintrusive) with a considerable amount of effective or perceptible travel. The automotive industry has long needed a way to address the fact that consumers want more lumbar support than their seats currently supply. This contribution offers a method to objectify an important aspect of automotive seating comfort, namely lumbar support. This should help human factors professionals produce, but not necessarily guarantee, better consumer ratings.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-27
..., transmission control ECU, electronic control module (ECM) and security indicator. The Prius v wagon will... located on the instrument panel to start the vehicle. The correct key has to be recognized by the ECM in... receive confirmation of the valid key and allows the ECM to start the engine. On the Prius v model, the...
ERIC Educational Resources Information Center
Hatala, John-Paul
2005-01-01
The present study sought to add to our knowledge about forces that negatively affect an individual's decision to start a business by identifying barriers they encounter. By identifying barriers to starting a business, we stand to learn much about how an individual identifies, confronts, and responds to decisions which may seem to be beyond their…
Travelling Randomly on the Poincaré Half-Plane with a Pythagorean Compass
NASA Astrophysics Data System (ADS)
Cammarota, V.; Orsingher, E.
2008-02-01
A random motion on the Poincaré half-plane is studied. A particle runs on the geodesic lines changing direction at Poisson-paced times. The hyperbolic distance is analyzed, also in the case where returns to the starting point are admitted. The main results concern the mean hyperbolic distance (and also the conditional mean distance) in all versions of the motion envisaged. Also an analogous motion on orthogonal circles of the sphere is examined and the evolution of the mean distance from the starting point is investigated.
Saad, E D; Katz, A; Hoff, P M; Buyse, M
2010-01-01
Significant achievements in the systemic treatment of both advanced breast cancer and advanced colorectal cancer over the past 10 years have led to a growing number of drugs, combinations, and sequences to be tested. The choice of surrogate and true end points has become a critical issue and one that is currently the subject of much debate. Many recent randomized trials in solid tumor oncology have used progression-free survival (PFS) as the primary end point. PFS is an attractive end point because it is available earlier than overall survival (OS) and is not influenced by second-line treatments. PFS is now undergoing validation as a surrogate end point in various disease settings. The question of whether PFS can be considered an acceptable surrogate end point depends not only on formal validation studies but also on a standardized definition and unbiased ascertainment of disease progression in clinical trials. In advanced breast cancer, formal validation of PFS as a surrogate for OS has so far been unsuccessful. In advanced colorectal cancer, in contrast, current evidence indicates that PFS is a valid surrogate for OS after first-line treatment with chemotherapy. The other question is whether PFS sufficiently reflects clinical benefit to be considered a true end point in and of itself.
Validation of a dynamic linked segment model to calculate joint moments in lifting.
de Looze, M P; Kingma, I; Bussmann, J B; Toussaint, H M
1992-08-01
A two-dimensional dynamic linked segment model was constructed and applied to a lifting activity. Reactive forces and moments were calculated by an instantaneous approach involving the application of Newtonian mechanics to individual adjacent rigid segments in succession. The analysis started once at the feet and once at a hands/load segment. The model was validated by comparing predicted external forces and moments at the feet or at a hands/load segment to actual values, which were simultaneously measured (ground reaction force at the feet) or assumed to be zero (external moments at feet and hands/load and external forces, besides gravitation, at hands/load). In addition, results of both procedures, in terms of joint moments, including the moment at the intervertebral disc between the fifth lumbar and first sacral vertebra (L5-S1), were compared. A correlation of r = 0.88 between calculated and measured vertical ground reaction forces was found. The calculated external forces and moments at the hands showed only minor deviations from the expected zero level. The moments at L5-S1, calculated starting from the feet compared to starting from the hands/load, yielded a coefficient of correlation of r = 0.99. However, moments calculated from the hands/load were 3.6% (averaged values) and 10.9% (peak values) higher. This difference is assumed to be due mainly to erroneous estimations of the positions of centres of gravity and joint rotation centres. The estimation of the location of the L5-S1 rotation axis can affect the results significantly. Despite the numerous studies estimating the load on the low back during lifting on the basis of linked segment models, only a few attempts to validate these models have been made. This study is concerned with the validity of the presented linked segment model. The results support the model's validity. Effects of several sources of error threatening the validity are discussed. Copyright © 1992. Published by Elsevier Ltd.
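The instantaneous segment-by-segment computation described above amounts to a Newton-Euler recursion over adjacent rigid segments. A minimal 2D sketch under assumed sign conventions (variable names and the data layout are illustrative, not the paper's code):

```python
import numpy as np

G = np.array([0.0, -9.81])  # gravitational acceleration, m/s^2

def proximal_load(m, I, a_com, alpha, r_dist, r_prox, F_dist, M_dist):
    """Reaction force and moment at the proximal joint of one rigid segment.

    m, I           : segment mass and moment of inertia about its centre of mass
    a_com, alpha   : linear COM acceleration (2-vector) and angular acceleration
    r_dist, r_prox : vectors from the COM to the distal and proximal joints
    F_dist, M_dist : force/moment exerted on this segment at the distal joint
    """
    cross = lambda r, f: r[0] * f[1] - r[1] * f[0]  # scalar 2D cross product
    F_prox = m * a_com - m * G - F_dist             # Newton: sum of forces = m*a
    M_prox = (I * alpha - M_dist                    # Euler: sum of moments = I*alpha
              - cross(r_dist, F_dist) - cross(r_prox, F_prox))
    return F_prox, M_prox
```

Chaining this from the feet upward and, separately, from the hands/load downward yields the two independent L5-S1 moment estimates whose agreement the study uses as a validity check.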
77 FR 65815 - Special Local Regulations; Marine Events in the Seventh Coast Guard District
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-31
... imaginary line connecting the following points: Starting at Point 1 in position 24°32'08'' N, 81°50'19'' W; thence east to Point 2 in position 24°32'23'' N, 81°48'58'' W; thence northeast to Point 3 in position 24°33'14'' N, 81°48'47'' W; thence northeast to Point 4 in position 24°...
NASA Astrophysics Data System (ADS)
Lachat, E.; Landes, T.; Grussenmeyer, P.
2018-05-01
Terrestrial and airborne laser scanning, photogrammetry and, more generally, 3D recording techniques are used in a wide range of applications. After recording several individual 3D datasets known in local systems, one of the first crucial processing steps is the registration of these data into a common reference frame. To perform such a 3D transformation, commercial and open source software as well as programs from the academic community are available. Due to shortcomings of these solutions in terms of computation transparency and quality assessment, it was decided to develop the open source algorithm presented in this paper. It is dedicated to the simultaneous registration of multiple point clouds as well as their georeferencing. The idea is to use this algorithm as a starting point for further implementations, involving the possibility of combining 3D data from different sources. Parallel to the presentation of the global registration methodology which has been employed, the aim of this paper is to confront the results achieved this way with the above-mentioned existing solutions. For this purpose, first results obtained with the proposed algorithm to perform the global registration of ten laser scanning point clouds are presented. An analysis of the quality criteria delivered by two selected software packages used in this study, and a reflection on these criteria, is also performed to complete the comparison of the obtained results. The final aim of this paper is to validate the current efficiency of the proposed method through these comparisons.
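The abstract does not spell out the transformation estimation itself, but the basic building block of any such registration is a least-squares rigid alignment between corresponding points. A sketch of the standard Kabsch/Horn solution is given below for orientation; it is not the authors' algorithm, which additionally registers many scans simultaneously and georeferences them:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q.

    P, Q: (N x 3) arrays of corresponding points from two scans.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```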
CwicStart - a proof-of-concept client for the CEOSWGISS Integrated Catalog (CWIC)
NASA Astrophysics Data System (ADS)
Newman, D. J.; Mitchell, A. E.
2012-12-01
Keywords - Earth Science, data discovery, agile development, ruby on rails, catalog, OGC
Audience - Earth Science application developers
What is CwicStart: CwicStart is a prototypical earth science data discovery web application designed, developed and hosted by NASA's Earth Observing System (EOS) Clearinghouse (ECHO). CwicStart searches the CEOS WGISS Integrated Catalog (CWIC) to provide users with dataset and granule level metadata from sources as diverse as NASA, NOAA, INPE and AOE. CwicStart demonstrates the ease with which it is possible to stand up a functioning client against the CWIC. From start to finish, CwicStart was designed, developed and deployed in one month.
Built from the OGC getCapabilities document of CWIC: The CwicStart application takes the OGC getCapabilities (http://www.opengeospatial.org/standards/is) document describing CWIC as its starting point for providing a user interface suitable for interrogating CWIC. Consequently, it can allow the user to constrain their search by the following criteria: generic search terms; spatial bounding box; start date/time and end date/time; ISO-queryable key-value pairs.
User interface inspired by Reverb: ECHO's state-of-the-art earth science discovery tool, Reverb (http://reverb.echo.nasa.gov), was used as a guideline for the user interface components of CwicStart. It incorporates OpenLayers to provide point-and-click spatial constraint specification and calendar input for temporal constraints. Discovery involves two phases, dataset discovery and granule discovery, with full pagination support for large result sets. CwicStart supports graceful degradation across multiple browsers and accessibility requirements.
Implemented in Ruby on Rails for agile development: CwicStart is implemented in Ruby on Rails, a dynamic, rapid development language and environment that facilitates agile development and is resilient to changing requirements. Using an agile development methodology, ECHO was able to stand up the first iteration of CwicStart in an iteration lasting only one week. Three subsequent week-long iterations delivered the current functionality. CwicStart can be found at the following location: https://testbed.echo.nasa.gov/cwic-start/
About CWIC: The WGISS team provides an application, the CEOS WGISS Integrated Catalog (CWIC), with the following capabilities: provide an access point for major CEOS agency catalog systems; interface to user interface clients by using the GEO standards; send directory/collection searches to the International Directory Network; distribute inventory/product searches to the CEOS agency inventory systems using the agency systems' native protocols; offered as the CEOS community catalog as part of the GEO common infrastructure.
CWIC partners: Committee on Earth Observing Satellites (CEOS); International Directory Network (IDN); U.S. National Aeronautics and Space Administration (NASA) - Earth Observing System (EOS) Clearinghouse (ECHO); U.S. National Oceanic and Atmospheric Administration (NOAA) - Comprehensive Large Array Data Stewardship System (CLASS); U.S. Geological Survey (USGS) - Landsat Catalog System; U.S. Geological Survey (USGS) - LSI Portal; National Institute for Space Research (INPE), Brazil; Academy of Opto-Electronics (AOE), Chinese Academy of Sciences (CAS).
NASA Astrophysics Data System (ADS)
Kuriyama, M.; Kumamoto, T.; Fujita, M.
2005-12-01
The 1995 Hyogo-ken Nambu Earthquake near Kobe, Japan, spurred research on strong motion prediction. To mitigate damage caused by large earthquakes, a highly precise method of predicting future strong motion waveforms is required. In this study, we applied the empirical Green's function method to forward modeling in order to simulate strong ground motion in the Noubi Fault zone and examine issues related to strong motion prediction for large faults. Source models for the scenario earthquakes were constructed using the recipe of strong motion prediction (Irikura and Miyake, 2001; Irikura et al., 2003). To calculate the asperity area ratio of a large fault zone, the results of a scaling model, a scaling model with 22% asperity by area, and a cascade model were compared, and several rupture points and segmentation parameters were examined for certain cases. A small earthquake (Mw 4.6) that occurred in northern Fukui Prefecture in 2004 was used as the empirical Green's function, and the source spectrum of this small event was found to agree with the omega-square scaling law. The Nukumi, Neodani, and Umehara segments of the 1891 Noubi Earthquake were targeted in the present study. The positions of the asperity area and rupture starting points were based on the horizontal displacement distributions reported by Matsuda (1974) and the fault branching pattern and rupture direction model proposed by Nakata and Goto (1998). Asymmetry in the damage maps for the Noubi Earthquake was then examined. We compared the maximum horizontal velocities for each case that had a different rupture starting point. In one case, rupture started at the center of the Nukumi Fault; in another, rupture started on the southeastern edge of the Umehara Fault. The scaling model showed an approximately 2.1-fold difference between these cases at observation point FKI005 of K-NET. This difference is considered to relate to the directivity effect associated with the direction of rupture propagation. Moreover, the horizontal velocities obtained with the cascade model were underestimated by more than one standard deviation relative to the empirical relation of Si and Midorikawa (1999). The scaling and cascade models showed an approximately 6.4-fold difference for the case in which the rupture started along the southeastern edge of the Umehara Fault at observation point GIF020. This difference is significantly large in comparison with the effect of different rupture starting points, and shows that it is important to base scenario earthquake assumptions on active fault datasets before establishing the source characterization model. The distribution map of seismic intensity for the 1891 Noubi Earthquake also suggests that the synthetic waveforms in the southeastern Noubi Fault zone may be underestimated. Our results indicate that outer fault parameters (e.g., earthquake moment) related to the construction of scenario earthquakes influence strong motion prediction more than inner fault parameters such as the rupture starting point. Based on these methods, we will predict strong motion for the approximately 140 to 150 km extent of the Itoigawa-Shizuoka Tectonic Line.
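For reference, the omega-square source model against which the small event's spectrum was checked has the standard textbook form (quoted here for context, not taken from the paper):

```latex
% Omega-square (Brune-type) moment-rate spectrum: flat at the seismic
% moment M_0 below the corner frequency f_c, decaying as f^{-2} above it.
\dot{M}(f) = \frac{M_0}{1 + \left( f / f_c \right)^{2}}
```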
Automatic seed picking for brachytherapy postimplant validation with 3D CT images.
Zhang, Guobin; Sun, Qiyuan; Jiang, Shan; Yang, Zhiyong; Ma, Xiaodong; Jiang, Haisong
2017-11-01
Postimplant validation is an indispensable part of the brachytherapy technique. It provides the necessary feedback to ensure the quality of the operation. The ability to pick implanted seeds relates directly to the accuracy of validation. To address this, an automatic approach is proposed for picking implanted brachytherapy seeds in 3D CT images. In order to pick seed configurations (location and orientation) efficiently, the approach starts with the segmentation of seeds from CT images using a thresholding filter based on the gray-level histogram. Through filtering and denoising, touching seeds and single seeds are classified. The true novelty of this approach is the application of Canny edge detection and an improved concave-point matching algorithm to separate touching seeds. Through the computation of image moments, the seed configuration can be determined efficiently. Finally, two different experiments were designed to verify the performance of the proposed approach: (1) a physical phantom with 60 model seeds, and (2) patient data with 16 cases. Through assessment of the validated results by a medical physicist, the proposed method exhibited promising results. The phantom experiment demonstrates that the error of seed location and orientation is within ([Formula: see text]) mm and ([Formula: see text])[Formula: see text], respectively. In addition, most seed location and orientation errors are within 0.8 mm and 3.5[Formula: see text], respectively, in all cases. The average processing time for seed picking is 8.7 s per 100 seeds. In this paper, an automatic, efficient and robust approach, performed on CT images, is proposed to determine the implanted seed location as well as orientation in a 3D workspace. Through the experiments with phantom and patient data, this approach successfully exhibits good performance.
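Once a single seed has been isolated as a binary region, its pose follows from image moments. A minimal 2D sketch of that step (the paper works on 3D CT voxels; names here are illustrative):

```python
import numpy as np

def seed_pose(mask):
    """Centroid and in-plane orientation of a binary seed mask via image moments."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()                    # centroid (first moments)
    mu20 = ((xs - cx) ** 2).mean()                   # second-order central moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # major-axis angle, radians
    return (cx, cy), theta
```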
Validated MicroRNA Target Databases: An Evaluation.
Lee, Yun Ji Diana; Kim, Veronica; Muth, Dillon C; Witwer, Kenneth W
2015-11-01
Positive findings from preclinical and clinical studies involving depletion or supplementation of microRNA (miRNA) engender optimism about miRNA-based therapeutics. However, off-target effects must be considered. Predicting these effects is complicated. Each miRNA may target many gene transcripts, and the rules governing imperfectly complementary miRNA:target interactions are incompletely understood. Several databases provide lists of the relatively small number of experimentally confirmed miRNA:target pairs. Although incomplete, this information might allow assessment of at least some of the off-target effects. We evaluated the performance of four databases of experimentally validated miRNA:target interactions (miRWalk 2.0, miRTarBase, miRecords, and TarBase 7.0) using a list of 50 alphabetically consecutive genes. We examined the provided citations to determine the degree to which each interaction was experimentally supported. To assess stability, we tested at the beginning and end of a five-month period. Results varied widely by database. Two of the databases changed significantly over the course of 5 months. Most reported evidence for miRNA:target interactions was indirect or otherwise weak, and relatively few interactions were supported by more than one publication. Some returned results appear to arise from simplistic text searches that offer no insight into the relationship of the search terms, may not even include the reported gene or miRNA, and may thus be invalid. We conclude that validation databases provide important information, but not all information in all extant databases is up-to-date or accurate. Nevertheless, the more comprehensive validation databases may provide useful starting points for investigation of off-target effects of proposed small RNA therapies. © 2015 Wiley Periodicals, Inc.
A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level
DOT National Transportation Integrated Search
2017-03-01
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
Avalanche of entanglement and correlations at quantum phase transitions.
Krutitsky, Konstantin V; Osterloh, Andreas; Schützhold, Ralf
2017-06-16
We study the ground-state entanglement in the quantum Ising model with nearest neighbor ferromagnetic coupling J and find a sequential increase of entanglement depth d with growing J. This entanglement avalanche starts with two-point entanglement, as measured by the concurrence, and continues via the three-tangle and four-tangle, until finally, deep in the ferromagnetic phase for J = ∞, arriving at a pure L-partite (GHZ type) entanglement of all L spins. Comparison with the two, three, and four-point correlations reveals a similar sequence and shows strong ties to the above entanglement measures for small J. However, we also find a partial inversion of the hierarchy, where the four-point correlation exceeds the three- and two-point correlations, well before the critical point is reached. Qualitatively similar behavior is also found for the Bose-Hubbard model, suggesting that this is a general feature of a quantum phase transition. This should be taken into account in the approximations starting from a mean-field limit.
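For context, the two-point measure used here, the concurrence of the two-spin reduced density matrix ρ, is the standard Wootters expression (quoted as background, not derived in the abstract):

```latex
% Wootters concurrence: lambda_i are the square roots of the eigenvalues,
% in decreasing order, of rho (sigma_y x sigma_y) rho^* (sigma_y x sigma_y).
C(\rho) = \max\left\{ 0,\; \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4 \right\}
```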
Understanding the Budget Battle.
ERIC Educational Resources Information Center
Hritz, Townley
1996-01-01
Describes Head Start's financial uncertainty for the future due to the government's budget battle. Presents information on the key points in the budget process, how that process got off track in fiscal year 1996, the resulting government shutdowns, and how Head Start can prepare for the 1997 budget debates. (MOK)
A burner for plasma-coal starting of a boiler
NASA Astrophysics Data System (ADS)
Peregudov, V. S.
2008-04-01
Advanced schemes of a plasma-coal burner with single- and two-stage chambers for thermochemical preparation of fuel are described. The factors that cause the burner to become contaminated with slag during oil-free starting of a boiler are considered, and methods for preventing this phenomenon are pointed out.
A systematic approach to sound decision making starts with financial reporting.
Taylor, R B
1989-11-01
Managers and supervisors need information to measure departmental performance. Designing a reporting system requires managers to obtain needed information without being flooded by extraneous data. A reporting framework designed to examine five control points is a necessary tool, and a good place to start.
Reveles, Kelly R; Mortensen, Eric M; Koeller, Jim M; Lawson, Kenneth A; Pugh, Mary Jo V; Rumbellow, Sarah A; Argamany, Jacqueline R; Frei, Christopher R
2018-03-01
Prior studies have identified risk factors for recurrent Clostridium difficile infection (CDI), but few studies have integrated these factors into a clinical prediction rule that can aid clinical decision-making. The objectives of this study were to derive and validate a CDI recurrence prediction rule to identify patients at risk for first recurrence in a national cohort of veterans. Retrospective cohort study. Veterans Affairs Informatics and Computing Infrastructure. A total of 22,615 adult Veterans Health Administration beneficiaries with first-episode CDI between October 1, 2002, and September 30, 2014; of these patients, 7538 were assigned to the derivation cohort and 15,077 to the validation cohort. A 60-day CDI recurrence prediction rule was created in a derivation cohort using backward logistic regression. Those variables significant at p<0.01 were assigned an integer score proportional to the regression coefficient. The model was then validated in the derivation cohort and a separate validation cohort. Patients were then split into three risk categories, and rates of recurrence were described for each category. The CDI recurrence prediction rule included the following predictor variables with their respective point values: prior third- and fourth-generation cephalosporins (1 point), prior proton pump inhibitors (1 point), prior antidiarrheals (1 point), nonsevere CDI (2 points), and community-onset CDI (3 points). In the derivation cohort, the 60-day CDI recurrence risk for each score ranged from 7.5% (0 points) to 57.9% (8 points). The risk score was strongly correlated with recurrence (R² = 0.94). Patients were split into low-risk (0-2 points), medium-risk (3-5 points), and high-risk (6-8 points) classes and had the following recurrence rates: 8.9%, 20.2%, and 35.0%, respectively. Findings were similar in the validation cohort. Several CDI and patient-specific factors were independently associated with 60-day CDI recurrence risk. When integrated into a clinical prediction rule, higher risk scores and risk classes were strongly correlated with CDI recurrence. This clinical prediction rule can be used by providers to identify patients at high risk for CDI recurrence and help guide preventive strategy decisions, while accounting for clinical judgment. © 2018 Pharmacotherapy Publications, Inc.
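The published point values translate directly into a bedside scoring function; a minimal sketch (argument names are illustrative):

```python
def cdi_recurrence_score(prior_cephalosporin, prior_ppi, prior_antidiarrheal,
                         nonsevere_cdi, community_onset):
    """60-day CDI recurrence risk score from the derived prediction rule."""
    score = (1 * prior_cephalosporin    # prior 3rd-/4th-generation cephalosporins
             + 1 * prior_ppi            # prior proton pump inhibitors
             + 1 * prior_antidiarrheal  # prior antidiarrheals
             + 2 * nonsevere_cdi        # nonsevere CDI
             + 3 * community_onset)     # community-onset CDI
    # Risk classes and observed derivation-cohort recurrence rates.
    risk_class = ("low (8.9%)" if score <= 2
                  else "medium (20.2%)" if score <= 5
                  else "high (35.0%)")
    return score, risk_class

print(cdi_recurrence_score(True, True, False, True, False))  # (4, 'medium (20.2%)')
```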
Water-sanitation-hygiene mapping: an improved approach for data collection at local level.
Giné-Garriga, Ricard; de Palencia, Alejandro Jiménez-Fernández; Pérez-Foguet, Agustí
2013-10-01
Strategic planning and appropriate development and management of water and sanitation services are strongly supported by accurate and accessible data. If adequately exploited, these data might assist water managers with performance monitoring, benchmarking comparisons, policy progress evaluation, resources allocation, and decision making. A variety of tools and techniques are in place to collect such information. However, some methodological weaknesses arise when developing an instrument for routine data collection, particularly at local level: i) comparability problems due to heterogeneity of indicators, ii) poor reliability of collected data, iii) inadequate combination of different information sources, and iv) statistical validity of produced estimates when disaggregated into small geographic subareas. This study proposes an improved approach for water, sanitation and hygiene (WASH) data collection at decentralised level in low income settings, as an attempt to overcome previous shortcomings. The ultimate aim is to provide local policymakers with strong evidences to inform their planning decisions. The survey design takes the Water Point Mapping (WPM) as a starting point to record all available water sources at a particular location. This information is then linked to data produced by a household survey. Different survey instruments are implemented to collect reliable data by employing a variety of techniques, such as structured questionnaires, direct observation and water quality testing. The collected data is finally validated through simple statistical analysis, which in turn produces valuable outputs that might feed into the decision-making process. In order to demonstrate the applicability of the method, outcomes produced from three different case studies (Homa Bay District-Kenya-; Kibondo District-Tanzania-; and Municipality of Manhiça-Mozambique-) are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
Physics of singularities in pressure-impulse theory
NASA Astrophysics Data System (ADS)
Krechetnikov, R.
2018-05-01
The classical solution in the pressure-impulse theory for the inviscid, incompressible, and zero-surface-tension water impact of a flat plate at zero dead-rise angle exhibits both a singular-in-time initial fluid acceleration, ∂v/∂t|_{t=0} ∼ δ(t), and a near-plate-edge spatial singularity in the velocity distribution, v ∼ r^{-1/2}, where r is the distance from the plate edge. The latter velocity divergence also implies that the interface is stretched infinitely right after the impact, which is another nonphysical artifact. From the point of view of matched asymptotic analysis, this classical solution is a singular limit in which three physical quantities achieve limiting values: sound speed c_0 → ∞, fluid kinematic viscosity ν → 0, and surface tension σ → 0. This leaves open the question of how to resolve these singularities mathematically by including the neglected physical effects (compressibility, viscosity, and surface tension), first one by one and then culminating in the local compressible viscous solution valid for t → 0 and r → 0, which demonstrates a nontrivial flow structure that changes with the degree of bulk compressibility. In the course of this study, by starting with the general physically relevant formulation of compressible viscous flow, we clarify the parameter range(s) of validity of the key analytical solutions, including classical ones (inviscid incompressible and compressible, etc.), and elucidate the solution structure, its nature as an intermediate asymptotics, the characteristics influencing physical processes, and the role of potential and rotational flow components. In particular, it is pointed out that sufficiently close to the plate edge surface tension must be taken into account. Overall, the idea is to highlight the interesting physics behind the singularities in the pressure-impulse theory.
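For context, the pressure-impulse formulation behind the classical solution is standard (Cooker-Peregrine form, quoted as background): the impulsive pressure field is harmonic and the velocity jumps by its gradient,

```latex
P(\mathbf{x}) = \int_{0^-}^{0^+} p(\mathbf{x},t)\,\mathrm{d}t, \qquad
\nabla^{2} P = 0, \qquad
\mathbf{u}(\mathbf{x},0^{+}) - \mathbf{u}(\mathbf{x},0^{-}) = -\tfrac{1}{\rho}\,\nabla P ,
```

and it is the gradient of the harmonic P near the plate edge that produces the r^{-1/2} velocity singularity quoted above.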
Al-Shafei, Ahmad I M; Wise, R G; Gresham, G A; Bronns, G; Carpenter, T A; Hall, L D; Huang, Christopher L-H
2002-01-01
A non-invasive cine magnetic resonance imaging (MRI) technique was developed to allow, for the first time, detection and characterization of chronic changes in myocardial tissue volume and the effects upon these of treatment by the angiotensin-converting enzyme (ACE) inhibitor captopril in streptozotocin (STZ)-diabetic male Wistar rats. Animals that had been made diabetic at the ages of 7, 10 and 13 weeks and a captopril-treated group of animals made diabetic at the age of 7 weeks were scanned. The findings were compared with the results from age-matched controls. All animal groups (n = 4 animals in each) were consistently scanned at 16 weeks. Left and right ventricular myocardial volumes were reconstructed from complete data sets of left and right ventricular transverse sections which covered systole and most of diastole using twelve equally incremented time points through the cardiac cycle. The calculated volumes remained consistent through all twelve time points of the cardiac cycle in all five experimental groups and agreed with the corresponding post-mortem determinations. These gave consistent myocardial densities whose values could additionally be corroborated by previous reports, confirming the validity of the quantitative MRI results and analysis. The myocardial volumes were conserved in animals whose diabetes was induced at 13 weeks but were significantly increased relative to body weight in animals made diabetic at 7 and 10 weeks. Captopril treatment, which was started immediately after induction of diabetes, prevented the development of this relative hypertrophy in both the left and right ventricles. We have thus introduced and validated quantitative MRI methods in a demonstration, for the first time, of chronic myocardial changes in both the right and left ventricles of STZ-diabetic rats and their prevention by the ACE inhibitor captopril. PMID:11790818
Active versus Passive Proprioceptive Straight-Ahead Pointing in Normal Subjects
ERIC Educational Resources Information Center
Chokron, Sylvie; Colliot, Pascale; Atzeni, Thierry; Bartolomeo, Paolo; Ohlmann, Theophile
2004-01-01
Eighty blindfolded healthy female subjects participated in an active and a passive straight-ahead pointing task to study the estimation of the subjective sagittal middle in the presence or absence of an active haptic exploration. Subjects were to point straight-ahead with their left or right index finger starting from different right- or…
Validation of laboratory-scale recycling test method of paper PSA label products
Carl Houtman; Karen Scallon; Richard Oldack
2008-01-01
Starting with test methods and a specification developed by the U.S. Postal Service (USPS) Environmentally Benign Pressure Sensitive Adhesive Postage Stamp Program, a laboratory-scale test method and a specification were developed and validated for pressure-sensitive adhesive labels. By comparing results from this new test method and pilot-scale tests, which have been...
Validating the Hierarchy of the iStartSmart® Academic Content
ERIC Educational Resources Information Center
McManis, Perry, W.; McManis, Mark, H.
2016-01-01
The purpose of this analysis was to investigate the validity of skill groupings in an instructional technology learning system designed for use by children in early childhood education classrooms. A Principal Component Analysis was performed to measure the fit of 18 skill games to their 5 assigned groupings in the system, covering a range of…
Comprehension of Multiple Documents with Conflicting Information: A Two-Step Model of Validation
ERIC Educational Resources Information Center
Richter, Tobias; Maier, Johanna
2017-01-01
In this article, we examine the cognitive processes that are involved when readers comprehend conflicting information in multiple texts. Starting from the notion of routine validation during comprehension, we argue that readers' prior beliefs may lead to a biased processing of conflicting information and a one-sided mental model of controversial…
Gas-injection-start and shutdown characteristics of a 2-kilowatt to 15-kilowatt Brayton power system
NASA Technical Reports Server (NTRS)
Cantoni, D. A.
1972-01-01
Two methods of starting the Brayton power system have been considered: (1) using the alternator as a motor to spin the Brayton rotating unit (BRU), and (2) spinning the BRU by forced gas injection. The first method requires the use of an auxiliary electrical power source. An alternating voltage is applied to the terminals of the alternator to drive it as an induction motor. Only gas-injection starts are discussed in this report. The gas-injection starting method requires high-pressure gas storage and valves to route the gas flow to provide correct BRU rotation. An analog computer simulation was used to size hardware and to determine safe start and shutdown procedures. The simulation was also used to define the range of conditions for successful startups. Experimental data were also obtained under various test conditions. These data verify the validity of the start and shutdown procedures.
Why are natural disasters not 'natural' for victims?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumagai, Yoshitaka; Edwards, John; Carroll, Matthew S.
Some type of formal or informal social assessment is often carried out in the wake of natural disasters. One often-observed phenomenon in such situations is that disaster victims and their sympathizers tend to focus on those elements of disasters that might have been avoided or mitigated by human intervention and thus assign 'undue' levels of responsibility to human agents. Often the responsibility or blame is directed at the very government agencies charged with helping people cope with and recover from the event. This phenomenon presents particular challenges for those trying to understand the social impacts of such events because of the reflexive nature of such analysis. Often the social analyst or even the government agency manager must sort through such perceptions and behavior and (at least implicitly) make judgments about which assignments of responsibility may have some validity and which are largely the result of the psychology of the disaster itself. This article presents a conceptual framework derived largely from social psychology to help develop a better understanding of such perceptions and behavior. While no 'magic bullet' formula for evaluating the validity of disaster victims' claims is presented, the conceptual framework is offered as a starting point for understanding this particular aspect of the psychology of natural disasters.
Validation of "AW3D" Global Dsm Generated from Alos Prism
NASA Astrophysics Data System (ADS)
Takaku, Junichi; Tadono, Takeo; Tsutsui, Ken; Ichikawa, Mayumi
2016-06-01
Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM), one of the onboard sensors carried by the Advanced Land Observing Satellite (ALOS), was designed to generate worldwide topographic data with its optical stereoscopic observation. It has the exclusive ability to perform triplet stereo observation, viewing forward, nadir, and backward along the satellite track at 2.5 m ground resolution, and it collected derived images all over the world during the mission life of the satellite from 2006 through 2011. A new project, which generates global elevation datasets from the image archives, was started in 2014. The data are processed at an unprecedented 5 m grid spacing utilizing the original triplet stereo images at 2.5 m resolution. As the number of processed datasets has grown steadily, so that the global land areas are almost covered, a trend in global data quality has become apparent. This paper reports up-to-date results of validations of the accuracy of the data products as well as the status of data coverage in global areas. The accuracies and error characteristics of the datasets are analyzed by comparison with existing global datasets such as Ice, Cloud, and land Elevation Satellite (ICESat) data, as well as ground control points (GCPs) and reference Digital Elevation Models (DEMs) derived from airborne Light Detection and Ranging (LiDAR).
Calibration and accuracy analysis of a focused plenoptic camera
NASA Astrophysics Data System (ADS)
Zeller, N.; Quint, F.; Stilla, U.
2014-08-01
In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression of the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated by using a method already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve-fitting approach, which is based on a Taylor series approximation. Both model-based methods show significant advantages compared to the curve-fitting method. They need fewer reference points for calibration than the curve-fitting method and, moreover, supply a function which is valid beyond the range of calibration. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and is compared to the analytical evaluation.
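The curve-fitting baseline that the model-based methods are compared against can be sketched as a simple polynomial fit of reference distances against the camera's raw (virtual) depth values. The arrays and polynomial degree below are illustrative placeholders, not measured data:

```python
import numpy as np

# Placeholder calibration pairs: raw virtual depth vs. known metric distance.
virtual_depth = np.array([2.1, 2.6, 3.2, 3.9, 4.7, 5.6])  # camera units (illustrative)
true_distance = np.array([0.5, 0.8, 1.2, 1.8, 2.6, 3.7])  # metres (illustrative)

# Taylor-series-style calibration: a low-order polynomial mapping.
calibrate = np.poly1d(np.polyfit(virtual_depth, true_distance, deg=3))
print(calibrate(3.5))  # metric estimate for a new virtual depth reading
```

The model-based alternatives instead invert the camera's projection geometry, which is why they remain valid beyond the calibrated range, whereas a fitted polynomial generally does not extrapolate reliably.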
Wang, QuanQiu; Li, Li; Xu, Rong
2018-04-18
Colorectal cancer (CRC) is the second leading cause of cancer-related deaths. It is estimated that about half the cases of CRC occurring today are preventable. Recent studies showed that human gut microbiota and their collective metabolic outputs play important roles in CRC. However, the mechanisms by which human gut microbial metabolites interact with host genetics in contributing to CRC remain largely unknown. We hypothesize that computational approaches that integrate and analyze vast amounts of publicly available biomedical data have great potential in better understanding how human gut microbial metabolites are mechanistically involved in CRC. Leveraging a vast amount of publicly available data, we developed a computational algorithm to predict human gut microbial metabolites for CRC. We validated the prediction algorithm by showing that previously known CRC-associated gut microbial metabolites ranked highly (mean ranking: top 10.52%; median ranking: 6.29%; p-value: 3.85E-16). Moreover, we identified new gut microbial metabolites likely associated with CRC. Through computational analysis, we propose potential roles for tartaric acid, the top-ranked metabolite, in CRC etiology. In summary, our data-driven computational study generated a large number of associations that could serve as a starting point for further experiments to refute or validate these microbial metabolite associations in CRC.
Gluons and gravitons at one loop from ambitwistor strings
NASA Astrophysics Data System (ADS)
Geyer, Yvonne; Monteiro, Ricardo
2018-03-01
We present new and explicit formulae for the one-loop integrands of scattering amplitudes in non-supersymmetric gauge theory and gravity, valid for any number of particles. The results exhibit the colour-kinematics duality in gauge theory and the double-copy relation to gravity, in a form that was recently observed in supersymmetric theories. The new formulae are expressed in a particular representation of the loop integrand, with only one quadratic propagator, which arises naturally from the framework of the loop-level scattering equations. The starting point for our work is the set of expressions based on the scattering equations that were recently derived from ambitwistor string theory. We turn these expressions into explicit formulae depending only on the loop momentum, the external momenta and the external polarisations. These formulae are valid in any number of spacetime dimensions for pure Yang-Mills theory (gluon) and its natural double copy, NS-NS gravity (graviton, dilaton, B-field), and we also present formulae in four spacetime dimensions for pure gravity (graviton). We perform several tests of our results, such as checking gauge invariance and directly matching our four-particle formulae to previously known expressions. While these tests would be elaborate in a Feynman-type representation of the loop integrand, they become straightforward in the representation we use.
Distribution of immunodeficiency fact files with XML--from Web to WAP.
Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno
2005-06-26
Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables the creation of open source databases for storage, maintenance and delivery across different platforms. Here we present a new data model called the fact file and an XML-based specification, the Inherited Disease Markup Language (IDML), which were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. IDML and fact files were used to build a comprehensive Web and WAP accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at http://bioinf.uta.fi/idr/. A fact file is a user-oriented interface, which serves as a starting point to explore information on hereditary diseases. IDML enables the seamless integration and presentation of genetic and disease information resources in the Internet. IDML can be used to build information services for all kinds of inherited diseases. The open source specification and related programs are available at http://bioinf.uta.fi/idml/.
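A fact file in an IDML-like structure can be consumed with standard XML tooling; a minimal sketch follows, in which the element and attribute names are invented for illustration since the abstract does not reproduce the actual IDML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical IDML-style fact file (element names are illustrative only).
IDML_SAMPLE = """
<factfile disease="X-linked agammaglobulinemia">
  <gene symbol="BTK" locus="Xq22"/>
  <clinical>Recurrent bacterial infections; markedly reduced B cells</clinical>
  <reference pmid="12345678"/>  <!-- placeholder identifier -->
</factfile>
"""

root = ET.fromstring(IDML_SAMPLE)
print(root.get("disease"))              # disease name from the root attribute
print(root.find("gene").get("symbol"))  # BTK
print(root.findtext("clinical"))        # clinical summary text
```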
Status of acute systemic toxicity testing requirements and data uses by U.S. regulatory agencies.
Strickland, Judy; Clippinger, Amy J; Brown, Jeffrey; Allen, David; Jacobs, Abigail; Matheson, Joanna; Lowit, Anna; Reinke, Emily N; Johnson, Mark S; Quinn, Michael J; Mattie, David; Fitzpatrick, Suzanne C; Ahir, Surender; Kleinstreuer, Nicole; Casey, Warren
2018-04-01
Acute systemic toxicity data are used by a number of U.S. federal agencies, most commonly for hazard classification and labeling and/or risk assessment for acute chemical exposures. To identify opportunities for the implementation of non-animal approaches to produce these data, the regulatory needs and uses for acute systemic toxicity information must first be clarified. Thus, we reviewed acute systemic toxicity testing requirements for six U.S. agencies (Consumer Product Safety Commission, Department of Defense, Department of Transportation, Environmental Protection Agency, Food and Drug Administration, Occupational Safety and Health Administration) and noted whether there is flexibility in satisfying data needs with methods that replace or reduce animal use. Understanding the current regulatory use and acceptance of non-animal data is a necessary starting point for future method development, optimization, and validation efforts. The current review will inform the development of a national strategy and roadmap for implementing non-animal approaches to assess potential hazards associated with acute exposures to industrial chemicals and medical products. The Acute Toxicity Workgroup of the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), U.S. agencies, non-governmental organizations, and other stakeholders will work to execute this strategy. Copyright © 2018 Elsevier Inc. All rights reserved.
[Practical experiences in legal counseling of foreign workers].
Pestalozzi-Seger, G
1992-09-01
When foreign workers ask for legal advice, their questions very often concern insurance rights for disability. Most uncertainties relate to specific clauses in the legislation on disability insurance and to the measurement of disability. Discussions arise primarily from controversy over claims made to the state disability insurance. The legislation on disability insurance establishes strict requirements for foreigners claiming disability insurance rights. However, because the Agreements on Social Security signed with over 20 nations are more tolerant in terms of disability insurance, Swiss legislation in its strict form can be applied only to a minority of foreigners. That is why the system of legislation has become so complex. Two major points must be strictly observed: on one hand, the process of reintegration measures can start only if the prescribed minimum duration of contributions is guaranteed; on the other, proceedings for disability pensions can be initiated only after the currently valid waiting period. In both cases, it is considerably important that the patient has a domicile in Switzerland or a valid residence permit. Numerous disagreements can arise during the evaluation of the degree of disability, as certain factors that are not directly linked to the disability, such as language problems, lack of education, or the labour market situation, are not taken into consideration.
Strategy and Grand Strategy: What Students and Practitioners Need to Know
2015-12-01
available, acceptable, and well-suited to the purpose for which they will be used. These questions, though crucial, are only a starting place... starting point (rarely an ideal one) and then constantly reassessing the situation in light of changing conditions. This requires an ongoing... him. In the end, Britain and France reluctantly decided that they had to stand up to Hitler's challenge to the international system. At the start
Starting Performance Analysis for Universal Motors by FEM
NASA Astrophysics Data System (ADS)
Kurihara, Kazumi; Sakamoto, Shin-Ichi
This paper presents a novel transient analysis of universal motors taking into account the time-varying brush-contact resistance and mechanical loss. The transient current, torque and speed during the starting process are computed by solving the electromagnetic, circuit and dynamic motion equations simultaneously. The computed performances have been validated by tests on a 500 W, 2-pole, 50 Hz, 100 V universal motor.
ERIC Educational Resources Information Center
Desmarais, Sarah L.; Nicholls, Tonia L.; Wilson, Catherine M.; Brink, Johann
2012-01-01
The Short-Term Assessment of Risk and Treatability (START; C. D. Webster, M. L. Martin, J. Brink, T. L. Nicholls, & S. L. Desmarais, 2009; C. D. Webster, M. L. Martin, J. Brink, T. L. Nicholls, & C. Middleton, 2004) is a relatively new structured professional judgment guide for the assessment and management of short-term risks associated…
Proof of Concept for the Trajectory-Level Validation Framework for Traffic Simulation Models
DOT National Transportation Integrated Search
2017-10-30
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
NASA Astrophysics Data System (ADS)
Bowley, Dean K.; Gaertner, Paul S.
2003-07-01
In this paper the argument is made that the offensive fire support organisation and doctrine, born of the "indirect fire revolution" of the First World War, is the start point for distributed sensors, shooters and deciders that may be transferred to a joint force; that the culture of directive control and mission orders developed by the German Army in 1918 and then adopted by most western armies is the start point for the culture required to achieve "self synchronisation"; and that the network developed for the air defence of carrier battle groups is the start point for developing a networked ground manoeuvre force. We discuss the strategic expectations of network centric warfare, a "virtual war" scenario and the inherent vulnerabilities. The current level of understanding and implementation in specific areas is analysed, lessons for general application are developed, and the potential payoff is identified. Three broad operational domains are investigated: networked platform-versus-platform warfare between states, guerrilla/counter-insurgency operations, and the emerging domain of "netwars" (terror organisations and criminal gangs).
A hierarchical wavefront reconstruction algorithm for gradient sensors
NASA Astrophysics Data System (ADS)
Bharmal, Nazim; Bitenc, Urban; Basden, Alastair; Myers, Richard
2013-12-01
ELT-scale extreme adaptive optics systems will require new approaches to compute the wavefront suitably quickly, when the computational burden of applying a MVM is no longer practical. An approach is demonstrated here which is hierarchical in transforming wavefront slopes from a WFS into a wavefront, and then to actuator values. First, simple integration in 1D is used to create 1D-wavefront estimates with unknown starting points at the edges of independent spatial domains. Second, these starting points are estimated globally. Because these starting points are a sub-set of the overall grid where wavefront values are to be estimated, sparse representations are produced, and the numerical complexity can be chosen by the spacing of the starting-point grid relative to the overall grid. Using a combination of algebraic expressions, sparse representation, and a conjugate gradient solver, the number of non-parallelized operations for reconstruction on a 100x100 sub-aperture sized problem is ~600,000 or O(N^(3/2)), which is approximately the same as for each thread of a MVM solution parallelized over 100 threads. To reduce the effects of noise propagation within each domain, a noise reduction algorithm can be applied which ensures the continuity of the wavefront. Applying this additional step has a cost of ~1,200,000 operations. We conclude by briefly discussing how the final step of converting from wavefront to actuator values can be achieved.
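The two-stage idea (integrate slopes within independent 1D domains, then fix the unknown domain offsets globally) can be sketched as follows. The domain layout and the chaining of offsets are simplified assumptions; the paper solves the offsets with a sparse least-squares/conjugate-gradient step, which is where the O(N^(3/2)) operation count arises:

```python
import numpy as np

def reconstruct_1d(slopes, domain_size):
    """Two-stage 1D reconstruction: per-domain integration + global offsets."""
    slopes = np.asarray(slopes, dtype=float)
    # Stage 1: integrate slopes inside each domain, starting from zero.
    pieces = []
    for s in range(0, len(slopes), domain_size):
        local = np.concatenate(([0.0], np.cumsum(slopes[s:s + domain_size])))
        pieces.append(local)
    # Stage 2: choose offsets so each domain starts where the previous one ends
    # (in 2D this becomes a sparse least-squares problem over the coarse grid).
    offsets = [0.0]
    for local in pieces[:-1]:
        offsets.append(offsets[-1] + local[-1])
    # Stitch, dropping each subsequent domain's duplicated boundary point.
    return np.concatenate([pieces[0] + offsets[0]] +
                          [p[1:] + o for p, o in zip(pieces[1:], offsets[1:])])

print(reconstruct_1d([1, 1, -1, 2, 0, 1], domain_size=3))  # [0. 1. 2. 1. 3. 3. 4.]
```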
Effects of placement point of background music on shopping website.
Lai, Chien-Jung; Chiang, Chia-Chi
2012-01-01
Consumer on-line behaviors are more important than ever due to the strong growth of on-line shopping. The purposes of this study were to design placement methods for background music on a shopping website and to examine their effect on browsers' emotional and cognitive responses. Three placement points for background music (2, 4, and 6 min from the start of browsing) were considered as entry points. Browsing without music (no music) and browsing with a constant music volume (full music) were treated as control groups. Participants' emotional state, approach-avoidance behavior intention, and actions to adjust the music volume were collected. Results showed that participants had higher levels of pleasure, arousal and approach behavior intention for the three placement points than for no music and full music. Most of the participants in the full-music condition (5/6) adjusted the background music, whereas only 16.7% (3/18) of participants in the other conditions turned off the background music. The results indicate that playing background music some time after the start of browsing benefits the on-line shopping atmosphere, and that it is inappropriate to place background music at the very start of browsing a shopping website. Marketers should therefore manipulate the placement of background music in a web store carefully.
78 FR 18475 - Special Local Regulations; Stuart Sailfish Regatta, Indian River; Stuart, FL
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-27
... Cove that are encompassed within an imaginary line connecting the following points: Starting at Point 1 in position 27°12'46'' N, 80°11'10'' W; thence southeast to Point 2 in position 27°12'41'' N, 80°11'09'' W; thence southwest to Point 3 in position 27°12'37'' N, 80°11'11'' W...
78 FR 1792 - Special Local Regulations, Stuart Sailfish Regatta, Indian River; Stuart, FL
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-09
... that are encompassed within an imaginary line connecting the following points: starting at Point 1 in position 27[deg]12'46'' N, 80[deg]11'09'' W; thence southeast to Point 2 in position 27[deg]12'41'' N, 80[deg]11'08'' W; thence southwest to Point 3 in position 27[deg]12'37'' N, 80[deg]11'11'' W; thence...
Development and Validation of a Photonumeric Scale for Evaluation of Volume Deficit of the Hand
Donofrio, Lisa; Hardas, Bhushan; Murphy, Diane K.; Carruthers, Jean; Carruthers, Alastair; Sykes, Jonathan M.; Creutz, Lela; Marx, Ann; Dill, Sara
2016-01-01
BACKGROUND A validated scale is needed for objective and reproducible comparisons of hand appearance before and after treatment in practice and clinical studies. OBJECTIVE To describe the development and validation of the 5-point photonumeric Allergan Hand Volume Deficit Scale. METHODS The scale was developed to include an assessment guide, verbal descriptors, morphed images, and real-subject images for each grade. The clinical significance of a 1-point score difference was evaluated in a review of image pairs representing varying differences in severity. Interrater and intrarater reliability was evaluated in a live-subject validation study (N = 296) completed during 2 sessions occurring 3 weeks apart. RESULTS A score difference of ≥1 point was shown to reflect a clinically significant difference (mean [95% confidence interval] absolute score difference, 1.12 [0.99–1.26] for clinically different image pairs and 0.45 [0.33–0.57] for not clinically different pairs). Intrarater agreement between the 2 validation sessions was almost perfect (mean weighted kappa = 0.83). Interrater agreement was almost perfect during the second session (0.82, primary end point). CONCLUSION The Allergan Hand Volume Deficit Scale is a validated and reliable scale for physician rating of hand volume deficit. PMID:27661741
Development and Validation of a Photonumeric Scale for Evaluation of Facial Skin Texture
Carruthers, Alastair; Hardas, Bhushan; Murphy, Diane K.; Carruthers, Jean; Jones, Derek; Sykes, Jonathan M.; Creutz, Lela; Marx, Ann; Dill, Sara
2016-01-01
BACKGROUND A validated scale is needed for objective and reproducible comparisons of facial skin roughness before and after aesthetic treatment in practice and in clinical studies. OBJECTIVE To describe the development and validation of the 5-point photonumeric Allergan Skin Roughness Scale. METHODS The scale was developed to include an assessment guide, verbal descriptors, morphed images, and real subject images for each grade. The clinical significance of a 1-point score difference was evaluated in a review of image pairs representing varying differences in severity. Interrater and intrarater reliability was evaluated in a live-subject validation study (N = 290) completed during 2 sessions occurring 3 weeks apart. RESULTS A score difference of ≥1 point was shown to reflect a clinically meaningful difference (mean [95% confidence interval] absolute score difference 1.09 [0.96–1.23] for clinically different image pairs and 0.53 [0.38–0.67] for not clinically different pairs). Intrarater agreement between the 2 validation sessions was almost perfect (weighted kappa = 0.83). Interrater agreement was almost perfect during the second rating session (0.81, primary end point). CONCLUSION The Allergan Skin Roughness Scale is a validated and reliable scale for physician rating of midface skin roughness. PMID:27661744
ERIC Educational Resources Information Center
Boohan, Richard
2014-01-01
This article describes an approach to teaching about the energy concept that aims to be accessible to students starting in early secondary school, while being scientifically rigorous and forming the foundation for later work. It discusses how exploring thermal processes is a good starting point for a more general consideration of the ways that…
40 CFR 86.135-90 - Dynamometer procedure.
Code of Federal Regulations, 2012 CFR
2012-07-01
... startup and operation over the first 505 seconds of the driving schedule complete the hot start test. The... 505 seconds of the driving schedule complete the hot start test. The exhaust emissions are diluted... over the prescribed driving schedule may be performed at test point, provided an emission sample is not...
40 CFR 86.135-90 - Dynamometer procedure.
Code of Federal Regulations, 2013 CFR
2013-07-01
... startup and operation over the first 505 seconds of the driving schedule complete the hot start test. The... 505 seconds of the driving schedule complete the hot start test. The exhaust emissions are diluted... over the prescribed driving schedule may be performed at test point, provided an emission sample is not...
Getting started with package sampSurf
Jeffrey H. Gove
2014-01-01
The sampSurf package is designed to facilitate the comparison of new and existing areal sampling methods through simulation. The package is thoroughly documented in several vignettes as mentioned below. This document is meant to point you in the right direction in finding the needed information to get started using sampSurf.
Stevenson, Douglass E; Feng, Ge; Zhang, Runjie; Harris, Marvin K
2005-08-01
Scirpophaga incertulas (Walker) (Lepidoptera: Pyralidae) is autochthonous and monophagous on rice, Oryza spp., which favors the development of a physiological time model using degree-days (degrees C) to establish a well defined window during which adults will be present in fields. Model development of S. incertulas adult flight phenology used climatic data and historical field observations of S. incertulas from 1962 through 1988. Analysis of variance was used to evaluate 5,203 prospective models with starting dates ranging from 1 January (day 1) to 30 April (day 121) and base temperatures ranging from -3 through 18.5 degrees C. From six candidate models, which shared the lowest standard deviation of prediction error, a model with a base temperature of 10 degrees C starting on 19 January was selected for validation. Validation with linear regression evaluated the differences between predicted and observed events and showed the model consistently predicted phenological events of 10 to 90% cumulative flight activity within a 3.5-d prediction interval regarded as acceptable for pest management decision making. The degree-day phenology model developed here is expected to find field application in Guandong Province. Expansion to other areas of rice production will require field validation. We expect the degree-day characterization of the activity period will remain essentially intact, but the start day may vary based on climate and geographic location. The development and validation of the phenology model of the S. incertulas by using procedures originally developed for pecan nut casebearer, Acrobasis nuxvorella Neunzig, shows the fungibility of this approach to developing prediction models for other insects.
The inverse-square law and quantum gravity
NASA Technical Reports Server (NTRS)
Nieto, Michael Martin; Goldman, T.; Hughes, Richard J.
1989-01-01
A program is described which measures the gravitational acceleration of antiprotons. This idea was approached from a particle physics point of view. That point of view is examined starting with some history of physics over the last 200 years.
41 CFR 301-11.9 - When does per diem or actual expense entitlement start/stop?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES ALLOWABLE TRAVEL EXPENSES 11-PER... authorized point and ends on the day you return to your home, office or other authorized point. ...
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
Black start research of the wind and storage system based on the dual master-slave control
NASA Astrophysics Data System (ADS)
Leng, Xue; Shen, Li; Hu, Tian; Liu, Li
2018-02-01
Black start is key to solving the problem of large-scale power failure, and the introduction of new renewable clean energy as a black-start power supply is a new research hotspot. Based on the dual master-slave control strategy, the wind-and-storage system was taken as a reliable black-start power source, with energy storage and wind power combined to ensure the stability of the microgrid system and realize the black start. The aims were to obtain the capacity ratio of the storage in a small system based on the dual master-slave control strategy and the black-start constraint conditions of the combined wind-and-storage system, to identify the key points of a black start of such a system, and to provide reference and guidance for subsequent large-scale wind-and-storage black-start projects.
2013-07-31
pedals and releasing the hand brake), the ROS 'actionlib' is started. Atlas starts driving to the received way points until it reaches the last gate... grip at angle 0... FIGURE 3.5.6: MATLAB 'Steering wheel angle' output membership function schematic... repeatability of the motion sequence. We added an optional motion to press the gas pedal at the end of the sequence in order to secure additional points
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potts, T.T.; Hylko, J.M.; Almond, D.
2007-07-01
A company's overall safety program becomes an important consideration to continue performing work and for procuring future contract awards. When injuries or accidents occur, the employer ultimately loses on two counts - increased medical costs and employee absences. This paper summarizes the human and organizational components that contributed to successful safety programs implemented by WESKEM, LLC's Environmental, Safety, and Health Departments located in Paducah, Kentucky, and Oak Ridge, Tennessee. The philosophy of 'safety, compliance, and then production' and programmatic components implemented at the start of the contracts were qualitatively identified as contributing factors resulting in a significant accumulation of safe work hours and an Experience Modification Rate (EMR) of <1.0. Furthermore, a study by the Associated General Contractors of America quantitatively validated components, already found in the WESKEM, LLC programs, as contributing factors to prevent employee accidents and injuries. Therefore, an investment in the human and organizational components now can pay dividends later by reducing the EMR, which is the key to reducing Workers' Compensation premiums. Also, knowing your employees' demographics and taking an active approach to evaluate and prevent fatigue may help employees balance work and non-work responsibilities. In turn, this approach can assist employers in maintaining a healthy and productive workforce. For these reasons, it is essential that safety needs be considered as the starting point when performing work. (authors)
Estimating corresponding locations in ipsilateral breast tomosynthesis views
NASA Astrophysics Data System (ADS)
van Schie, Guido; Tanner, Christine; Karssemeijer, Nico
2011-03-01
To improve cancer detection in mammography, breast exams usually consist of two views per breast. To combine information from both views, radiologists and multiview computer-aided detection (CAD) systems need to match corresponding regions in the two views. In digital breast tomosynthesis (DBT), finding corresponding regions in ipsilateral volumes may be a difficult and time-consuming task for radiologists, because many slices have to be inspected individually. In this study we developed a method to quickly estimate corresponding locations in ipsilateral tomosynthesis views by applying a mathematical transformation. First a compressed breast model is matched to the tomosynthesis view containing a point of interest. Then we decompress, rotate and compress again to estimate the location of the corresponding point in the ipsilateral view. In this study we use a simple elastically deformable sphere model to obtain an analytical solution for the transformation in a given DBT case. The model is matched to the volume by using automatic segmentation of the pectoral muscle, breast tissue and nipple. For validation we annotated 181 landmarks in both views and applied our method to each location. Results show a median 3D distance between the actual location and estimated location of 1.5 cm; a good starting point for a feature based local search method to link lesions for a multiview CAD system. Half of the estimated locations were at most 1 slice away from the actual location, making our method useful as a tool in mammographic workstations to interactively find corresponding locations in ipsilateral tomosynthesis views.
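A heavily simplified stand-in for the described transformation is sketched below. The real method fits an elastically deformable sphere model using automatic segmentation of the pectoral muscle, breast tissue and nipple; here the decompress-rotate-recompress chain is reduced to affine maps, and the compression factor and rotation angle are arbitrary illustrative values.

```python
import numpy as np

# Map a point of interest from one compressed view to the other by undoing a
# uniaxial compression, rotating the uncompressed breast, and re-compressing.
def estimate_corresponding_point(p, compression=0.6, angle_deg=45.0):
    decompress = np.diag([1.0, 1.0, 1.0 / compression])   # undo compression in z
    t = np.radians(angle_deg)
    rotate = np.array([[1, 0, 0],
                       [0, np.cos(t), -np.sin(t)],
                       [0, np.sin(t),  np.cos(t)]])        # rotate about x-axis
    recompress = np.diag([1.0, 1.0, compression])          # compress along new z
    return recompress @ rotate @ decompress @ np.asarray(p, float)

point_view1 = [12.0, 30.0, 18.0]   # mm, in the first view's coordinates
print(estimate_corresponding_point(point_view1))
```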
Hydrodynamic interaction of two particles in confined linear shear flow at finite Reynolds number
NASA Astrophysics Data System (ADS)
Yan, Yiguang; Morris, Jeffrey F.; Koplik, Joel
2007-11-01
We discuss the hydrodynamic interactions of two solid bodies placed in linear shear flow between parallel plane walls in a periodic geometry at finite Reynolds number. The computations are based on the lattice Boltzmann method for particulate flow, validated here by comparison to previous results for a single particle. Most of our results pertain to cylinders in two dimensions but some examples are given for spheres in three dimensions. Either one mobile and one fixed particle or else two mobile particles are studied. The motion of a mobile particle is qualitatively similar in both cases at early times, exhibiting either trajectory reversal or bypass, depending upon the initial vector separation of the pair. At longer times, if a mobile particle does not approach a periodic image of the second, its trajectory tends to a stable limit point on the symmetry axis. The effect of interactions with periodic images is to produce nonconstant asymptotic long-time trajectories. For one free particle interacting with a fixed second particle within the unit cell, the free particle may either move to a fixed point or take up a limit cycle. Pairs of mobile particles starting from symmetric initial conditions are shown to asymptotically reach either fixed points, or mirror image limit cycles within the unit cell, or to bypass one another (and periodic images) indefinitely on a streamwise periodic trajectory. The limit cycle possibility requires finite Reynolds number and arises as a consequence of streamwise periodicity when the system length is sufficiently short.
Avoiding Terminations, Single Offer Competition, and Costly Change Orders with Fixed-Price Contracts
2015-04-30
deriving performance outputs from FPDS.6 To ensure reproducibility of this analysis and to provide a starting point for future research, the entirety...available in bulk from USAspending.gov starting in FY2000. However, data quality steadily improves over that decade, particularly in the commonly...available prior to FY2007, the study team chose to set FY2007 as the start date rather than risk sample bias by including only those earlier
A user-targeted synthesis of the VALUE perfect predictor experiment
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutierrez, Jose; Kotlarski, Sven; Hertig, Elke; Wibig, Joanna; Rössler, Ole; Huth, Radan
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. We consider different aspects: (1) marginal aspects such as mean, variance and extremes; (2) temporal aspects such as spell-length characteristics; (3) spatial aspects such as the de-correlation length of precipitation extremes; and (4) multivariate aspects such as the interplay of temperature and precipitation or scale interactions. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur. Experiment 1 (perfect predictors): what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Experiment 2 (global climate model predictors): how good is the overall representation of regional climate, including errors inherited from global climate models? Experiment 3 (pseudo reality): do methods fail in representing regional climate change? Here, we present a user-targeted synthesis of the results of the first VALUE experiment. In this experiment, downscaling methods are driven with ERA-Interim reanalysis data to eliminate global climate model errors, over the period 1979-2008. As reference data we use, depending on the question addressed, (1) observations from 86 meteorological stations distributed across Europe; (2) gridded observations at the corresponding 86 locations; or (3) gridded, spatially extended observations for selected European regions. With more than 40 contributing methods, this study is the most comprehensive downscaling inter-comparison project so far. The results clearly indicate that for several aspects the downscaling skill varies considerably between different methods. For specific purposes, some methods can therefore clearly be excluded.
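The index-based validation idea can be made concrete with a small sketch. Below, one marginal index (relative mean bias) and one temporal index (mean wet-spell length) are computed for synthetic daily precipitation series; the indices mirror the categories listed above, but the 1 mm wet-day threshold and the data are illustrative assumptions.

```python
import numpy as np

def mean_wet_spell_length(precip, wet_threshold=1.0):
    """Mean length of consecutive runs of wet days (precip >= threshold)."""
    wet = (precip >= wet_threshold).astype(np.int8)
    edges = np.diff(np.concatenate(([0], wet, [0])))   # run boundaries
    starts = np.where(edges == 1)[0]
    ends = np.where(edges == -1)[0]
    return (ends - starts).mean() if starts.size else 0.0

rng = np.random.default_rng(2)
observed = rng.gamma(0.4, 6.0, size=10957)     # ~30 years of synthetic daily data
downscaled = rng.gamma(0.5, 5.0, size=10957)

print("relative mean bias:", downscaled.mean() / observed.mean() - 1.0)
print("wet-spell length (obs, ds):",
      mean_wet_spell_length(observed), mean_wet_spell_length(downscaled))
```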
The Role of Testing in Affirmative Action.
ERIC Educational Resources Information Center
Manning, Winton H.
Graphs and charts pertaining to testing in affirmative action are presented. Data concern the following: the predictive validity of College Board admissions tests using freshman grade point average as the criterion; validity coefficients of undergraduate grade point average (UGPA) alone, Law School Admission Test (LSAT) scores, and undergraduate…
Rades, Dirk; Dziggel, Liesa; Nagy, Viorica; Segedin, Barbara; Lohynska, Radka; Veninga, Theo; Khoa, Mai T; Trang, Ngo T; Schild, Steven E
2013-07-01
Survival scores for patients with brain metastasis exist. However, the treatment regimens used to create these scores were heterogeneous. This study aimed to develop and validate a survival score in homogeneously treated patients. Eight hundred and eighty-two patients receiving 10 × 3 Gy of WBRT alone were randomly assigned to a test group (N=441) or a validation group (N=441). In the multivariate analysis of the test group, age, performance status, extracranial metastasis, and systemic treatment prior to WBRT were independent predictors of survival. The score for each factor was determined by dividing the 6-month survival rate (in %) by 10. Scores were summed and total scores ranged from 6 to 19 points. Patients were divided into four prognostic groups. The 6-month survival rates were 4% for 6-9 points, 29% for 10-14 points, 62% for 15-17 points, and 93% for 18-19 points (p<0.001) in the test group. The survival rates were 3%, 28%, 54% and 96%, respectively (p<0.001) in the validation group. Since the 6-month survival rates in the validation group were very similar to those in the test group, this new score (WBRT-30) appears valid and reproducible. It can help in making treatment choices and in stratifying patients in future trials. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
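The scoring rule reported above (each factor contributes its 6-month survival rate in % divided by 10, and totals from 6 to 19 points are mapped to four groups) can be sketched as follows. The per-level point values below are placeholders chosen only to reproduce the stated 6-19 range; they are not the published WBRT-30 values.

```python
# Hypothetical per-level factor scores (6-month survival % / 10), placeholders.
FACTOR_POINTS = {
    "age": {"<65": 4, ">=65": 2},
    "performance_status": {"ECOG 0-1": 5, "ECOG 2-4": 1},
    "extracranial_metastasis": {"no": 5, "yes": 2},
    "systemic_treatment_prior_to_wbrt": {"yes": 5, "no": 1},
}

def wbrt30_score(patient):
    """Sum the factor scores for a patient described by categorical levels."""
    return sum(FACTOR_POINTS[f][patient[f]] for f in FACTOR_POINTS)

def prognostic_group(total):
    # Group boundaries follow the abstract: 6-9, 10-14, 15-17, 18-19 points.
    if total <= 9:
        return "6-9 points (poorest survival)"
    if total <= 14:
        return "10-14 points"
    if total <= 17:
        return "15-17 points"
    return "18-19 points (best survival)"

patient = {"age": "<65", "performance_status": "ECOG 0-1",
           "extracranial_metastasis": "no",
           "systemic_treatment_prior_to_wbrt": "yes"}
total = wbrt30_score(patient)
print(total, "->", prognostic_group(total))
```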
Software Patching Lessons Learned - Video Text Version | Energy Systems
eyes of an OEM, several different OEMs for many years. Then also in 2013, we were awarded a cooperative... Validating for OEM solutions and those products is very different from validating with end users. When we started reaching out to vendors, I was a bit taken aback at the discrepancy and how different
NASA Astrophysics Data System (ADS)
Lobl, E. S.
2003-12-01
AMSR and AMSR-E are passive microwave radiometers built by NASDA in Japan. AMSR flies on ADEOS II, launched December 14, 2002, and AMSR-E flies on NASA's Aqua satellite, launched May 4, 2002. The science teams in both countries have developed algorithms to retrieve different atmospheric parameters from the data obtained by these radiometers. The US science team has developed a validation plan that involves several campaigns; in fact, most of these campaigns took place this year, 2003, nicknamed the "Golden Year" for AMSR validation. The first campaign started in January 2003 with the extra-tropical precipitation campaign, followed by IOP3 of the Cold Land Processes Experiment (CLPX) in Colorado. After the change-out of some of the instruments, the validation program continued with the Arctic sea ice campaign based in Alaska, followed by CLPX IOP4, back in Colorado. The Soil Moisture EXperiment 03 (SMEX03) started in late June in Alabama and Georgia, and was then completed in Oklahoma in mid-July. The last campaign in this series is AMSR Antarctic Sea Ice (AASI)/SMEX in Brazil. The major goals of each campaign, and very preliminary data, will be shown. Most of these campaigns were conducted in collaboration with the Japanese AMSR scientists.
Interacting charges and the classical electron radius
NASA Astrophysics Data System (ADS)
De Luca, Roberto; Di Mauro, Marco; Faella, Orazio; Naddeo, Adele
2018-03-01
The equation of the motion of a point charge q repelled by a fixed point-like charge Q is derived and studied. In solving this problem useful concepts in classical and relativistic kinematics, in Newtonian mechanics and in non-linear ordinary differential equations are revised. The validity of the approximations is discussed from the physical point of view. In particular the classical electron radius emerges naturally from the requirement that the initial distance is large enough for the non-relativistic approximation to be valid. The relevance of this topic for undergraduate physics teaching is pointed out.
Ducheyne, Els; Tran Minh, Nhu Nguyen; Haddad, Nabil; Bryssinckx, Ward; Buliva, Evans; Simard, Frédéric; Malik, Mamunur Rahman; Charlier, Johannes; De Waele, Valérie; Mahmoud, Osama; Mukhtar, Muhammad; Bouattour, Ali; Hussain, Abdulhafid; Hendrickx, Guy; Roiz, David
2018-02-14
Aedes-borne diseases such as dengue, zika, chikungunya and yellow fever are an emerging problem worldwide; they are transmitted by Aedes aegypti and Aedes albopictus. Lack of up-to-date information about the distribution of Aedes species hampers surveillance and control. Global databases have been compiled, but these did not capture data in the WHO Eastern Mediterranean Region (EMR), and any models built using these datasets fail to identify highly suitable areas where one or both species may occur. The first objective of this study was therefore to update the existing Ae. aegypti (Linnaeus, 1762) and Ae. albopictus (Skuse, 1895) compendia, and the second objective was to generate species distribution models targeted to the EMR. A final objective was to engage the WHO points of contact within the region to provide feedback and hence validate all model outputs. The Ae. aegypti and Ae. albopictus compendia provided by Kraemer et al. (Sci Data 2:150035, 2015; Dryad Digit Repos, 2015) were used as starting points. These datasets were extended with more recent species and disease data. In the next step, these sets were filtered using the Köppen-Geiger classification and the Mahalanobis distance. The occurrence data were supplemented with pseudo-absence data as input to Random Forests. The resulting suitability and maximum-risk-of-establishment maps were combined into hard-classified maps per country for expert validation. The EMR datasets consisted of 1995 presence locations for Ae. aegypti and 2868 presence locations for Ae. albopictus. The resulting suitability maps indicated that there exist areas with high suitability and/or maximum risk of establishment for these disease vectors, in contrast with previous model output. Precipitation and host availability, expressed as population density and night-time lights, were the most important variables for Ae. aegypti. Host availability was the most important predictor in the case of Ae. albopictus. Internal validation was assessed geographically. External validation showed high agreement between the predicted maps and the experts' extensive knowledge of the terrain. Maps of distribution and maximum risk of establishment were created for Ae. aegypti and Ae. albopictus for the WHO EMR. These region-specific maps highlighted data gaps, and these gaps will be filled using targeted monitoring and surveillance. This will increase the awareness and preparedness of the different countries for Aedes-borne diseases.
Thompson, R.S.; Anderson, K.H.; Bartlein, P.J.
2008-01-01
The method of modern analogs is widely used to obtain estimates of past climatic conditions from paleobiological assemblages, and despite its frequent use, this method involves so-far untested assumptions. We applied four analog approaches to a continental-scale set of bioclimatic and plant-distribution presence/absence data for North America to assess how well this method works under near-optimal modern conditions. For each point on the grid, we calculated the similarity between its vegetation assemblage and those of all other points on the grid (excluding nearby points). The climate of the points with the most similar vegetation was used to estimate the climate at the target grid point. Estimates based on the use of the Jaccard similarity coefficient had smaller errors than those based on the use of a new similarity coefficient, although the latter may be more robust because it does not assume that the "fossil" assemblage is complete. The results of these analyses indicate that presence/absence vegetation assemblages provide a valid basis for estimating bioclimates on the continental scale. However, the accuracy of the estimates is strongly tied to the number of species in the target assemblage, and the analog method is necessarily constrained to produce estimates that fall within the range of observed values. We applied the four modern analog approaches and the mutual overlap (or "mutual climatic range") method to estimate the bioclimatic conditions represented by the plant macrofossil assemblage from a packrat midden of Last Glacial Maximum age from southern Nevada. In general, the estimation approaches produced similar results in regard to moisture conditions, but there was a greater range of estimates for growing degree-days. Despite its limitations, the modern analog technique can provide paleoclimatic reconstructions that serve as the starting point for the interpretation of past climatic conditions.
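The analog estimation step can be sketched compactly. In the snippet below, the Jaccard coefficient is computed on synthetic presence/absence assemblages and the climate of the k most similar grid points is averaged; the grid, the climate variable, and the use of simple self-exclusion instead of the paper's nearby-point exclusion are illustrative simplifications.

```python
import numpy as np

# With random data no skill is expected; the snippet shows the mechanics only.
rng = np.random.default_rng(3)
n_points, n_species = 500, 80
presence = rng.random((n_points, n_species)) < 0.15   # boolean assemblages
july_temp = rng.normal(20, 5, n_points)               # "climate" at each point

def jaccard_similarity(a, b):
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def analog_estimate(target, k=5):
    sims = np.array([jaccard_similarity(presence[target], presence[j])
                     if j != target else -1.0          # exclude the point itself
                     for j in range(n_points)])
    best = np.argsort(sims)[-k:]                       # k most similar assemblages
    return july_temp[best].mean()

print("estimate:", analog_estimate(0), "actual:", july_temp[0])
```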
Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias
2017-01-31
Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.
Mares-García, Emma; Palazón-Bru, Antonio; Folgado-de la Rosa, David Manuel; Pereira-Expósito, Avelino; Martínez-Martín, Álvaro; Cortés-Castell, Ernesto; Gil-Guillén, Vicente Francisco
2017-01-01
Other studies have assessed nonadherence to proton pump inhibitors (PPIs), but none has developed a screening test for its detection. To construct and internally validate a predictive model for nonadherence to PPIs. This prospective observational study with a one-month follow-up was carried out in 2013 in Spain, and included 302 patients with a prescription for PPIs. The primary variable was nonadherence to PPIs (pill count). Secondary variables were gender, age, antidepressants, type of PPI, non-guideline-recommended prescription (NGRP) of PPIs, and total number of drugs. With the secondary variables, a binary logistic regression model to predict nonadherence was constructed and adapted to a points system. The ROC curve, with its area (AUC), was calculated and the optimal cut-off point was established. The points system was internally validated through 1,000 bootstrap samples and implemented in a mobile application (Android). The points system had three prognostic variables: total number of drugs, NGRP of PPIs, and antidepressants. The AUC was 0.87 (95% CI [0.83-0.91], p < 0.001). The test yielded a sensitivity of 0.80 (95% CI [0.70-0.87]) and a specificity of 0.82 (95% CI [0.76-0.87]). The three parameters were very similar in the bootstrap validation. A points system to predict nonadherence to PPIs has been constructed, internally validated and implemented in a mobile application. Provided similar results are obtained in external validation studies, we will have a screening tool to detect nonadherence to PPIs.
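A hedged sketch of how a logistic model is typically adapted to a points system follows. The coefficients below are invented for illustration; only the three predictor names come from the abstract above.

```python
# Invented logistic coefficients for the three predictors named above; the
# published model's actual coefficients and point values are not reproduced.
coefs = {"total_drugs_per_drug": 0.22, "ngrp_ppi": 1.4, "antidepressant": 0.9}

def to_points(beta, unit=None):
    """Scale coefficients by the smallest one and round to integer points."""
    unit = unit or min(abs(v) for v in beta.values())
    return {k: int(round(v / unit)) for k, v in beta.items()}

POINTS = to_points(coefs)

def risk_points(n_drugs, ngrp, antidepressant):
    return (POINTS["total_drugs_per_drug"] * n_drugs
            + POINTS["ngrp_ppi"] * ngrp
            + POINTS["antidepressant"] * antidepressant)

print(POINTS, "->", risk_points(n_drugs=8, ngrp=1, antidepressant=0), "points")
```

In practice the cut-off point would then be chosen from the ROC curve to trade off sensitivity (0.80 in the study) against specificity (0.82).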
Nanocrystalline ceramic materials
Siegel, Richard W.; Nieman, G. William; Weertman, Julia R.
1994-01-01
A method for preparing a treated nanocrystalline metallic material. The method of preparation includes providing a starting nanocrystalline metallic material with a grain size less than about 35 nm, compacting the starting nanocrystalline metallic material in an inert atmosphere and annealing the compacted metallic material at a temperature less than about one-half the melting point of the metallic material.
Transforming the structure of a health system.
Reilly, Patrick
2012-06-01
In starting the planning process for an organization's transformation or restructuring, healthcare finance leaders should: identify strategic imperatives for the organization and physicians; remember the organization's core area of business; define the starting point and create clear objectives; and develop a strategy that engages front-line employees to change the culture of the organization.
Validation of a dew-point generator for pressures up to 6 MPa using nitrogen and air
NASA Astrophysics Data System (ADS)
Bosma, R.; Mutter, D.; Peruzzi, A.
2012-08-01
A new primary humidity standard was developed at VSL that, in addition to ordinary operation with air and nitrogen at atmospheric pressure, can be operated with other carrier gases such as natural gas at pressures up to 6 MPa and SF6 at pressures up to 1 MPa. The temperature range of the standard is from -80 °C to +20 °C. In this paper, we report the validation of the new primary dew-point generator in the temperature range -41 °C to +5 °C and the pressure range 0.1 MPa to 6 MPa using nitrogen and air. For the validation the flow through the dew-point generator was varied up to 10 l min-1 (at 23 °C and 1013 hPa) and the dew point of the gas entering the generator was varied up to 15 °C above the dew point exiting the generator. The validation results showed that the new generator, over the tested temperature and pressure range, can be used with a standard uncertainty of 0.02 °C frost/dew point. The measurements used for the validation at -41 °C and -20 °C with nitrogen and at +5 °C with air were also used to calculate the enhancement factor at pressures up to 6 MPa. For +5 °C the differences between the measured and literature values were compatible with the respective uncertainties. For -41 °C and -20 °C they were compatible only up to 3 MPa. At 6 MPa a discrepancy was observed.
Implications of dynamic changes in miR-192 expression in ischemic acute kidney injury.
Zhang, Lulu; Xu, Yuan; Xue, Song; Wang, Xudong; Dai, Huili; Qian, Jiaqi; Ni, Zhaohui; Yan, Yucheng
2017-03-01
Ischemia-reperfusion injury (IRI) is a major cause of acute kidney injury (AKI) with poor outcomes. While many important functions of microRNAs (miRNAs) have been identified in various diseases, few studies have reported on miRNAs in acute kidney IRI, especially the dynamic changes in their expression and their implications during disease progression. The expression of miR-192, a kidney-enriched miRNA, was assessed in both the plasma and kidney of IRI rats at different time points after kidney injury and compared to renal function and kidney histological changes. The results were validated in the plasma of selected patients with AKI after cardiac surgery, compared with matched patients without AKI. The performance characteristics of miR-192 were summarized using the area under the receiver operator characteristic (ROC) curve (AUC-ROC). MiRNA profiling in plasma led to the identification of 42 differentially expressed miRNAs in the IRI group compared to the sham group. MiR-192 was kidney-enriched and chosen for further validation. Real-time PCR showed that miR-192 levels increased fourfold in the plasma and decreased by about 40% in the kidney of IRI rats. Plasma miR-192 expression started increasing at 3 h and peaked at 12 h, while kidney miR-192 expression started decreasing at 6 h and remained at a low level for 7 days after reperfusion. The plasma miR-192 level in patients with AKI increased at the time of ICU admission, was stable for 2 h and decreased after 24 h. The AUC-ROC was 0.673 (95% CI: 0.540-0.806, p = 0.014). Plasma miR-192 expression was induced in a time-dependent manner after IRI in rats and in patients with AKI after cardiac surgery, in a manner comparable to the development and recovery of the kidney injury, and may be useful for the detection of AKI.
2016-12-01
United States v. Knotts.9 Analysis of previous cases provides a starting point for discussion between government officials and the public they...authorized under this ordinance to deploy this technology. At the start of the legislative session in the fall of 2015, Senator John Kavanagh, a...2015, through April 30, 2016 (a span of approximately 15 months). The start date of this query is intended to correspond with the operational
Recommendations and Privacy Requirements for a Bring-Your-Own-Device User Policy and Agreement
2015-03-01
manipulate data from non-traditional workplaces to support mission requirements. The United States Marine Corps (USMC) has started a pilot BYOD program, but...contrasted to obtain a starting point to develop a user agreement for the USMC. The security controls identified within these case studies were also...participating in a BYOD program. A. MARINE CORPS PILOT PROGRAM Starting in January 2015 and at the behest of the USMC, the Marine Corps Network Operations and
NASA Astrophysics Data System (ADS)
Testa, Italo; Galano, Silvia; Leccia, Silvio; Puddu, Emanuella
2015-12-01
In this paper, we report on the development and validation of a learning progression for the Celestial Motion big idea. Existing curricula, research studies on alternative conceptions about these phenomena, and students' answers to an open questionnaire were the starting point for developing initial learning progressions about the change of seasons, solar and lunar eclipses, and Moon phases; then, a two-tier multiple-choice questionnaire was designed to validate and improve them. The questionnaire was submitted to about 300 secondary students of different school levels (14 to 18 years old). Item response analysis and the curve integral method were used to revise the hypothesized learning progressions. Findings support that spatial reasoning is a key cognitive factor for building an explanatory framework for the Celestial Motion big idea, but also suggest that causal reasoning based on the physics mechanisms underlying the phenomena, such as light-flux laws or energy transfers, may significantly impact a student's understanding. As an implication of the study, we propose that the teaching of the three discussed astronomy phenomena should follow a single teaching-learning path along the following sequence: (i) emphasize from the beginning the geometrical aspects of the Sun-Moon-Earth system motion; (ii) clarify consequences of the motion of the Sun-Moon-Earth system, such as the changing solar radiation flow on the surface of Earth during the revolution around the Sun; (iii) help students move between different reference systems (Earth and space observer's perspectives) to understand how Earth's rotation and revolution can change the appearance of the Sun and Moon. Instructional and methodological implications are also briefly discussed.
Analytical Plan for Roman Glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.
Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.
Cadorin, Lucia; Bagnasco, Annamaria; Tolotti, Angela; Pagnucci, Nicola; Sasso, Loredana
2016-09-01
To identify, evaluate and describe the psychometric properties of instruments that measure learning outcomes in healthcare students. Meaningful learning is an active process that enables a wider and deeper understanding of concepts. It is the result of an interaction between new and prior knowledge and produces a long-standing change in knowledge and skills. In the field of education, validated and reliable instruments for assessing meaningful learning are needed. A psychometric systematic review. MEDLINE, CINAHL, SCOPUS, ERIC, the Cochrane Library, and the Psychology & Behavioural Sciences Collection database from 1990 to December 2013. Using pre-determined inclusion criteria, three reviewers independently identified studies for full-text review. They then extracted data for quality appraisal and graded instrument validity using the Consensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist and the Psychometric Grading Framework. Of the 57 studies identified for full-text review, 16 met the inclusion criteria and 13 different instruments were assessed. Following quality assessment, only one instrument was considered of good quality, but it measured meaningful learning only in part; the others were either fair or poor. The Psychometric Grading Framework indicated that one instrument was weak, while the others were very weak. No instrument displayed adequate validity. The systematic review produced a synthesis of the psychometric properties of tools that measure learning outcomes in students of healthcare disciplines. Measuring learning outcomes is very important when educating health professionals. The identified tools may constitute a starting point for the development of other assessment tools. © 2016 John Wiley & Sons Ltd.
Alassaad, Anna; Melhus, Håkan; Hammarlund-Udenaes, Margareta; Bertilsson, Maria; Gillespie, Ulrika; Sundström, Johan
2015-01-01
Objectives To construct and internally validate a risk score, the ‘80+ score’, for revisits to hospital and mortality for older patients, incorporating aspects of pharmacotherapy. Our secondary aim was to compare the discriminatory ability of the score with that of three validated tools for measuring inappropriate prescribing: Screening Tool of Older Person's Prescriptions (STOPP), Screening Tool to Alert doctors to Right Treatment (START) and Medication Appropriateness Index (MAI). Setting Two acute internal medicine wards at Uppsala University hospital. Patient data were used from a randomised controlled trial investigating the effects of a comprehensive clinical pharmacist intervention. Participants Data from 368 patients, aged 80 years and older, admitted to one of the study wards. Primary outcome measure Time to rehospitalisation or death during the year after discharge from hospital. Candidate variables were selected among a large number of clinical and drug-specific variables. After a selection process, a score for risk estimation was constructed. The 80+ score was internally validated, and the discriminatory ability of the score and of STOPP, START and MAI was assessed using C-statistics. Results Seven variables were selected. Impaired renal function, pulmonary disease, malignant disease, living in a nursing home, being prescribed an opioid or being prescribed a drug for peptic ulcer or gastroesophageal reflux disease were associated with an increased risk, while being prescribed an antidepressant drug (tricyclic antidepressants not included) was linked to a lower risk of the outcome. These variables made up the components of the 80+ score. The C-statistics were 0.71 (80+), 0.57 (STOPP), 0.54 (START) and 0.63 (MAI). Conclusions We developed and internally validated a score for prediction of risk of rehospitalisation and mortality in hospitalised older people. The score discriminated risk better than available tools for inappropriate prescribing. Pending external validation, this score can aid in clinical identification of high-risk patients and targeting of interventions. PMID:25694461
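As a sketch of the discrimination comparison reported above, a C-statistic for time-to-event data can be computed with the lifelines package; everything below is synthetic, including the invented distribution of '80+' points, and the availability of lifelines is an assumption.

```python
import numpy as np
from lifelines.utils import concordance_index

# lifelines' concordance_index expects higher predictions to mean longer
# time-to-event, so a risk score (higher = worse) is negated.
rng = np.random.default_rng(4)
n = 368
risk_score = rng.integers(0, 8, n)                       # hypothetical 80+ points
time_to_event = rng.exponential(365 / (1 + risk_score))  # higher risk -> earlier event
observed = rng.random(n) < 0.8                           # True = event, False = censored

print("C-statistic:", concordance_index(time_to_event, -risk_score, observed))
```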
Computational approach to compact Riemann surfaces
NASA Astrophysics Data System (ADS)
Frauendiener, Jörg; Klein, Christian
2017-01-01
A purely numerical approach to compact Riemann surfaces starting from plane algebraic curves is presented. The critical points of the algebraic curve are computed via a two-dimensional Newton iteration. The starting values for this iteration are obtained from the resultants with respect to both coordinates of the algebraic curve and a suitable pairing of their zeros. A set of generators of the fundamental group for the complement of these critical points in the complex plane is constructed from circles around these points and connecting lines obtained from a minimal spanning tree. The monodromies are computed by solving the defining equation of the algebraic curve on collocation points along these contours and by analytically continuing the roots. The collocation points are chosen to correspond to Chebychev collocation points for an ensuing Clenshaw-Curtis integration of the holomorphic differentials which gives the periods of the Riemann surface with spectral accuracy. At the singularities of the algebraic curve, Puiseux expansions computed by contour integration on the circles around the singularities are used to identify the holomorphic differentials. The Abel map is also computed with the Clenshaw-Curtis algorithm and contour integrals. As an application of the code, solutions to the Kadomtsev-Petviashvili equation are computed on non-hyperelliptic Riemann surfaces.
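One ingredient of the pipeline, the two-dimensional Newton iteration for the critical points, is easy to sketch. The elliptic curve below is an illustrative choice, not one from the paper, and the handful of hard-coded starting values stand in for the resultant-derived starting values described above.

```python
import numpy as np

# Critical points of an algebraic curve f(x, y) = 0 are points where df/dy
# also vanishes; both conditions are solved simultaneously by Newton's method.
def f(x, y):  return y**2 - x**3 + x
def fy(x, y): return 2.0 * y

def jac(x, y):
    # Jacobian of (f, fy) with respect to (x, y)
    return np.array([[-3.0 * x**2 + 1.0, 2.0 * y],
                     [0.0, 2.0]])

def newton(x, y, tol=1e-12, maxit=50):
    for _ in range(maxit):
        F = np.array([f(x, y), fy(x, y)])
        if np.linalg.norm(F) < tol:
            break
        dx, dy = np.linalg.solve(jac(x, y), -F)
        x, y = x + dx, y + dy
    return x, y

for x0 in (-1.2, 0.1, 0.9):     # converges to (x, y) = (-1, 0), (0, 0), (1, 0)
    print(newton(x0, 0.3))
```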
The new Guidance in AQUATOX Setup and Application provides a quick-start guide that introduces major model features, as well as a cookbook to guide basic model setup, calibration, and validation.
Fractal Clustering and Knowledge-driven Validation Assessment for Gene Expression Profiling.
Wang, Lu-Yong; Balasubramanian, Ammaiappan; Chakraborty, Amit; Comaniciu, Dorin
2005-01-01
DNA microarray experiments generate a substantial amount of information about global gene expression. Gene expression profiles can be represented as points in multi-dimensional space. It is essential to identify relevant groups of genes in biomedical research. Clustering is helpful in pattern recognition in gene expression profiles. A number of clustering techniques have been introduced. However, these traditional methods mainly utilize shape-based assumptions or some distance metric to cluster the points in multi-dimensional linear Euclidean space. Their results show poor consistency with the functional annotation of genes in previous validation studies. From a novel, different perspective, we propose a fractal clustering method to cluster genes using the intrinsic (fractal) dimension from modern geometry. This method clusters points in such a way that points in the same cluster are more self-affine among themselves than they are to points in other clusters. We assess this method using annotation-based validation for gene clusters. The results show that this method is superior in identifying functionally related gene groups to other traditional methods.
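The intrinsic-dimension idea can be illustrated with a correlation-sum estimate of fractal dimension (the Grassberger-Procaccia slope). This is a generic sketch of the underlying quantity, not the paper's algorithm; the synthetic data are points on a one-dimensional curve embedded in a 10-dimensional "expression" space, so the estimate should land near 1.

```python
import numpy as np

rng = np.random.default_rng(5)
t = rng.random(800)
# 1D curve embedded in 10 dimensions: two nonlinear coordinates plus 8 linear ones.
X = np.stack([np.sin(3 * t), np.cos(5 * t)]
             + [t * c for c in rng.normal(size=8)], axis=1)

def correlation_dimension(X, radii):
    """Slope of log C(r) vs log r, where C(r) counts pairs closer than r."""
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    counts = [(d < r).sum() - len(X) for r in radii]   # exclude self-pairs
    logs = np.log(np.maximum(counts, 1))
    slope, _ = np.polyfit(np.log(radii), logs, 1)
    return slope

radii = np.geomspace(0.05, 0.5, 8)
print("estimated intrinsic dimension:", correlation_dimension(X, radii))
```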
ERIC Educational Resources Information Center
Bohman, Benjamin; Nyberg, Gisela; Sundblom, Elinor; Schäfer Elinder, Liselotte
2014-01-01
Introduction: Measures of parental self-efficacy (PSE) for healthy dietary or physical activity (PA) behaviors in children have been used in several studies; however, further psychometric validation of PSE for these behaviors is needed. The purpose of the present study was to evaluate the psychometric properties of a new PSE instrument. Methods:…
DOE Office of Scientific and Technical Information (OSTI.GOV)
BRISC is a developmental prototype for a next-generation systems-level integrated performance and safety code (IPSC) for nuclear reactors. Its development served to demonstrate how a lightweight multi-physics coupling approach can be used to tightly couple the physics models in several different physics codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled burner nuclear reactor. For example, the RIO Fluid Flow and Heat Transfer code developed at Sandia (SNL: Chris Moen, Dept. 08005) is used in BRISC to model fluid flow and heat transfer, as well as conduction heat transfer in solids. Because BRISC is a prototype, its most practical application is as a foundation or starting point for developing a true production code. The sub-codes and the associated models and correlations currently employed within BRISC were chosen to cover the required application space and demonstrate feasibility, but were not optimized or validated against experimental data within the context of their use in BRISC.
Spatial averaging of a dissipative particle dynamics model for active suspensions
NASA Astrophysics Data System (ADS)
Panchenko, Alexander; Hinz, Denis F.; Fried, Eliot
2018-03-01
Starting from a fine-scale dissipative particle dynamics (DPD) model of self-motile point particles, we derive meso-scale continuum equations by applying a spatial averaging version of the Irving-Kirkwood-Noll procedure. Since the method does not rely on kinetic theory, the derivation is valid for highly concentrated particle systems. Spatial averaging yields stochastic continuum equations similar to those of Toner and Tu. However, our theory also involves a constitutive equation for the average fluctuation force. According to this equation, both the strength and the probability distribution vary with time and position through the effective mass density. The statistics of the fluctuation force also depend on the fine scale dissipative force equation, the physical temperature, and two additional parameters which characterize fluctuation strengths. Although the self-propulsion force entering our DPD model contains no explicit mechanism for aligning the velocities of neighboring particles, our averaged coarse-scale equations include the commonly encountered cubically nonlinear (internal) body force density.
Boundary-layer equations in generalized curvilinear coordinates
NASA Technical Reports Server (NTRS)
Panaras, Argyris G.
1987-01-01
A set of higher-order boundary-layer equations is derived, valid for three-dimensional compressible flows. The equations are written in a generalized curvilinear coordinate system, in which the surface coordinates are nonorthogonal; the third axis is restricted to be normal to the surface. Also, the higher-order viscous terms which are retained depend on the surface curvature of the body. Thus, the equations are suitable for the calculation of the boundary layer about arbitrary vehicles. As a starting point, the Navier-Stokes equations are derived in tensor notation. Then, by means of an order-of-magnitude analysis, the boundary-layer equations are developed. To provide an interface between the analytical partial-differentiation notation and the compact tensor notation, a brief review of the most essential theorems of tensor analysis related to the equations of fluid dynamics is given. Many useful quantities, such as the contravariant and covariant metrics and the physical velocity components, are written in both notations.
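As a hedged illustration of the stated starting point (not the paper's exact equations), the compressible Navier-Stokes equations in tensor notation, with a semicolon denoting covariant differentiation and g^{ij} the contravariant metric, can be written as

```latex
\begin{align}
  \frac{\partial \rho}{\partial t} + \left(\rho v^{i}\right)_{;i} &= 0, \\
  \rho\left(\frac{\partial v^{i}}{\partial t} + v^{j}\, v^{i}_{\ ;j}\right)
    &= -\, g^{ij} p_{,j} + \tau^{ij}_{\ \ ;j},
\end{align}
```

where tau^{ij} is the viscous stress tensor. The boundary-layer equations then follow by rescaling the surface-normal coordinate in the order-of-magnitude analysis and retaining the curvature-dependent higher-order viscous terms.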
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beiu, V.; Makaruk, H.E.
1997-09-01
The starting points of this paper are two size-optimal solutions: (1) one for implementing arbitrary Boolean functions; and (2) another one for implementing certain subclasses of Boolean functions. Because VLSI implementations do not cope well with highly interconnected nets -- the area of a chip grows with the cube of the fan-in -- this paper analyzes the influence of limited fan-in on the size optimality of the two solutions mentioned. First, the authors extend a result from Horne and Hush valid for fan-in Δ = 2 to arbitrary fan-in. Second, they prove that size-optimal solutions are obtained for small constant fan-ins for both constructions, while relative minimum-size solutions can be obtained for fan-ins strictly lower than linear. These results are in agreement with similar ones proving that for small constant fan-ins (Δ = 6...9) there exist VLSI-optimal (i.e., minimizing AT²) solutions, while there are similar small constants relating to the capacity of processing information.
Tumpa, Anja; Stajić, Ana; Jančić-Stojanović, Biljana; Medenica, Mirjana
2017-02-05
This paper deals with the development of a hydrophilic interaction liquid chromatography (HILIC) method with gradient elution, in accordance with Analytical Quality by Design (AQbD) methodology, for the first time. The method is developed for olanzapine and its seven related substances. Following the AQbD methodology step by step, the temperature, the starting content of the aqueous phase, and the duration of the linear gradient are first identified as critical process parameters (CPPs), and the separation criterion S of critical pairs of substances is investigated as the critical quality attribute (CQA). A Rechtschaffner design is used to create models that describe the dependence between the CPPs and the CQAs. The design space obtained at the end is used to choose the optimal conditions (set point). The method is fully validated to verify the adequacy of the chosen optimal conditions and applied to real samples. Copyright © 2016 Elsevier B.V. All rights reserved.
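The model-building step of such an AQbD workflow can be sketched with a quadratic response surface fitted by least squares. The design points, responses, and the S >= 1.5 acceptance criterion below are invented; a Rechtschaffner-type design would supply the real runs.

```python
import numpy as np

# Fit a quadratic response-surface model linking three coded CPPs (temperature,
# starting aqueous content, gradient duration) to a CQA (separation criterion S).
rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, size=(15, 3))           # coded CPP levels, 15 runs
y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 0] * X[:, 2] \
    - 0.4 * X[:, 1]**2 + rng.normal(0, 0.05, 15)   # synthetic CQA responses

def quad_design(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)

# "Design space": grid-search the coded region for conditions meeting S >= 1.5.
grid = np.stack(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3), -1).reshape(-1, 3)
ok = quad_design(grid) @ beta >= 1.5
print(f"{ok.mean():.1%} of the explored region satisfies the CQA criterion")
```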
A reconfigurable visual-programming library for real-time closed-loop cellular electrophysiology
Biró, István; Giugliano, Michele
2015-01-01
Most of the software platforms for cellular electrophysiology are limited in terms of flexibility, hardware support, ease of use, or re-configuration and adaptation for non-expert users. Moreover, advanced experimental protocols requiring real-time closed-loop operation to investigate excitability, plasticity, and dynamics are largely inaccessible to users without moderate to substantial computer proficiency. Here we present an approach based on MATLAB/Simulink, exploiting the benefits of LEGO-like visual programming and configuration, combined with a small but easily extendible library of functional software components. We provide and validate several examples, implementing conventional and more sophisticated experimental protocols such as dynamic clamp or the combined use of intracellular and extracellular methods involving closed-loop real-time control. The functionality of each of these examples is demonstrated with relevant experiments. These can be used as a starting point to create and support a larger variety of electrophysiological tools and methods, hopefully extending the range of default techniques and protocols currently employed in experimental labs across the world. PMID:26157385
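The core of a dynamic-clamp protocol of the kind the library implements can be sketched as a closed loop in which the measured membrane potential sets the injected current through a virtual conductance. The sketch below replaces the real cell and data-acquisition hardware with a leaky integrate-and-fire stand-in, and all parameter values are illustrative.

```python
# Leaky integrate-and-fire stand-in for a patched neuron under dynamic clamp;
# in a real setup the loop body would read V from the amplifier and write the
# command current back, all within one real-time tick.
dt, T = 1e-4, 1.0                       # time step and duration (s)
C, g_leak, E_leak = 200e-12, 10e-9, -70e-3
g_virtual, E_virtual = 5e-9, 0.0        # virtual excitatory conductance (S, V)
V, spikes = -70e-3, []

for step in range(int(T / dt)):
    I_clamp = -g_virtual * (V - E_virtual)          # closed-loop command current
    V += dt * (-g_leak * (V - E_leak) + I_clamp) / C
    if V > -50e-3:                                  # threshold: spike and reset
        spikes.append(step * dt)
        V = -70e-3

print(f"{len(spikes)} spikes in {T:.0f} s of simulated dynamic clamp")
```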
Haug, Tobias
2011-01-01
There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired in preschool- and school-aged children (4-8 years old) is urgently needed. Using the British Sign Language Receptive Skills Test, that has been standardized and has sound psychometric properties, as a template for adaptation thus provides a starting point for tests of a sign language that is less documented, such as DGS. This article makes a novel contribution to the field by examining linguistic, cultural, and methodological issues in the process of adapting a test from the source language to the target language. The adapted DGS test has sound psychometric properties and provides the basis for revision prior to standardization. © The Author 2011. Published by Oxford University Press. All rights reserved.
Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2011-10-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.
False confessions, expert testimony, and admissibility.
Watson, Clarence; Weiss, Kenneth J; Pouncey, Claire
2010-01-01
The confession of a criminal defendant serves as a prosecutor's most compelling piece of evidence during trial. Courts must preserve a defendant's constitutional right to a fair trial while upholding the judicial interests of presenting competent and reliable evidence to the jury. When a defendant seeks to challenge the validity of that confession through expert testimony, the prosecution often contests the admissibility of the expert's opinion. Depending on the content and methodology of the expert's opinion, testimony addressing the phenomenon of false confessions may or may not be admissible. This article outlines the scientific and epistemological bases of expert testimony on false confession, notes the obstacles facing its admissibility, and provides guidance to the expert in formulating opinions that will reach the judge or jury. We review the 2006 New Jersey Superior Court decision in State of New Jersey v. George King to illustrate what is involved in the admissibility of false-confession testimony and use the case as a starting point in developing a best-practice approach to working in this area.
Video game addiction and college performance among males: results from a 1 year longitudinal study.
Schmitt, Zachary L; Livingston, Michael G
2015-01-01
This study explored the pattern of video game usage and video game addiction among male college students and examined how video game addiction was related to expectations of college engagement, college grade point average (GPA), and on-campus drug and alcohol violations. Participants were 477 male, first year students at a liberal arts college. In the week before the start of classes, participants were given two surveys: one of expected college engagement, and the second of video game usage, including a measure of video game addiction. Results suggested that video game addiction is (a) negatively correlated with expected college engagement, (b) negatively correlated with college GPA, even when controlling for high school GPA, and (c) negatively correlated with drug and alcohol violations that occurred during the first year in college. Results are discussed in terms of implications for male students' engagement and success in college, and in terms of the construct validity of video game addiction.
Clustering and Network Analysis of Reverse Phase Protein Array Data.
Byron, Adam
2017-01-01
Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
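As a rough companion to the hierarchical clustering step described above, the sketch below clusters a synthetic samples-by-targets RPPA matrix with SciPy. The chapter itself uses freely available GUI software, so the specific choices here (correlation distance, average linkage, a three-cluster cut) are illustrative assumptions.

```python
# Sketch of hierarchical clustering of an RPPA expression matrix, assuming
# rows are samples and columns are antibody targets (synthetic data here).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
expr = rng.normal(size=(20, 50))            # 20 samples x 50 protein targets

dist = pdist(expr, metric="correlation")    # 1 - Pearson r between samples
tree = linkage(dist, method="average")      # UPGMA agglomeration
labels = fcluster(tree, t=3, criterion="maxclust")  # cut tree into 3 clusters
print(labels)
```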
[The name in schizophrenia: a study of 60 patients].
Rafrafi, Rim; Bram, Nesrine; Bergaoui, Haifa; Ben Romdhane, Imene; El Hechmi, Zouhaier
2014-03-01
Clinical aspects of schizophrenia suggest a unique relationship with the proper name. Aim: to discuss the validity of the hypothesis that non-transmission of the surname may be a vulnerability factor in schizophrenia. Methods: descriptive cross-sectional study conducted among 60 patients with schizophrenia and their families; data were collected using a semi-structured interview. Results: seven patients carried a surname different from their father's (11.6% of participants). The disparity concerned only the child with schizophrenia. Family characteristics (birth rank, desired character of the pregnancy, family history of schizophrenia) and the clinical course of the disease were comparable between patients bearing their father's family name and those with a different surname. It appears that patients with schizophrenia maintain a special relationship with the proper name, which could be involved in the genesis of schizophrenia. Our initial hypothesis, supported by psychoanalytic, transgenerational and behavioral theories, would be a plausible starting point for studies with a broader scope including controls from the general and psychiatric populations.
Gupta, Veer; Henriksen, Kim; Edwards, Melissa; Jeromin, Andreas; Lista, Simone; Bazenet, Chantal; Soares, Holly; Lovestone, Simon; Hampel, Harald; Montine, Thomas; Blennow, Kaj; Foroud, Tatiana; Carrillo, Maria; Graff-Radford, Neill; Laske, Christoph; Breteler, Monique; Shaw, Leslie; Trojanowski, John Q.; Schupf, Nicole; Rissman, Robert A.; Fagan, Anne M.; Oberoi, Pankaj; Umek, Robert; Weiner, Michael W.; Grammas, Paula; Posner, Holly; Martins, Ralph
2015-01-01
The lack of readily available biomarkers is a significant hindrance towards progressing to effective therapeutic and preventative strategies for Alzheimer’s disease (AD). Blood-based biomarkers have potential to overcome access and cost barriers and greatly facilitate advanced neuroimaging and cerebrospinal fluid biomarker approaches. Despite the fact that preanalytical processing is the largest source of variability in laboratory testing, there are no currently available standardized preanalytical guidelines. The current international working group provides the initial starting point for such guidelines for standardized operating procedures (SOPs). It is anticipated that these guidelines will be updated as additional research findings become available. The statement provides (1) a synopsis of selected preanalytical methods utilized in many international AD cohort studies, (2) initial draft guidelines/SOPs for preanalytical methods, and (3) a list of required methodological information and protocols to be made available for publications in the field in order to foster cross-validation across cohorts and laboratories. PMID:25282381
Large eddy simulation of shock train in a convergent-divergent nozzle
NASA Astrophysics Data System (ADS)
Mousavi, Seyed Mahmood; Roohi, Ehsan
2014-12-01
This paper discusses the suitability of Large Eddy Simulation (LES) turbulence modeling for the accurate simulation of the shock train phenomenon in a convergent-divergent nozzle. To this aim, we selected an experimentally tested geometry and performed an LES simulation for the same geometry. The structure and pressure recovery inside the shock train captured by the LES model are compared with the experimental data, analytical expressions, and numerical solutions obtained using various alternative turbulence models, including k-ɛ RNG, k-ω SST, and the Reynolds stress model (RSM). Compared with the experimental data, the LES solution not only predicts the location of the first shock precisely, but is also quite accurate before and after the shock train. After validating the LES solution, we investigate the effects of the inlet total pressure on the shock train starting point and length. The effects of changes in the back pressure, nozzle inlet angle (NIA), and wall temperature on the behavior of the shock train are investigated in detail.
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
Law, Jason M; Stark, Sebastian C; Liu, Ke; Liang, Norah E; Hussain, Mahmud M; Leiendecker, Matthias; Ito, Daisuke; Verho, Oscar; Stern, Andrew M; Johnston, Stephen E; Zhang, Yan-Ling; Dunn, Gavin P; Shamji, Alykhan F; Schreiber, Stuart L
2016-10-13
Evidence suggests that specific mutations of isocitrate dehydrogenases 1 and 2 (IDH1/2) are critical for the initiation and maintenance of certain tumor types and that inhibiting these mutant enzymes with small molecules may be therapeutically beneficial. In order to discover mutant allele-selective IDH1 inhibitors with chemical features distinct from existing probes, we screened a collection of small molecules derived from diversity-oriented synthesis. The assay identified compounds that inhibit the IDH1-R132H mutant allele commonly found in glioma. Here, we report the discovery of a potent (IC50 = 50 nM) series of IDH1-R132H inhibitors having 8-membered ring sulfonamides as exemplified by the compound BRD2879. The inhibitors suppress (R)-2-hydroxyglutarate production in cells without apparent toxicity. Although the solubility and pharmacokinetic properties of the specific inhibitor BRD2879 prevent its use in vivo, the scaffold presents a validated starting point for the synthesis of future IDH1-R132H inhibitors having improved pharmacological properties.
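As context for the reported potency, the sketch below shows how an IC50 such as the 50 nM figure is commonly estimated: a four-parameter logistic (Hill) fit to dose-response data. The data are synthetic and the fitting choices are assumptions, not the authors' assay pipeline.

```python
# Sketch of estimating an IC50 from dose-response data with a
# four-parameter logistic fit (synthetic data, illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

conc = np.logspace(-9, -5, 10)                        # molar concentrations
resp = hill(conc, 0.05, 1.0, 5e-8, 1.2)               # "true" curve, IC50 = 50 nM
resp += np.random.default_rng(1).normal(0, 0.02, 10)  # assay noise

popt, _ = curve_fit(hill, conc, resp, p0=(0.0, 1.0, 1e-7, 1.0))
print(f"fitted IC50 = {popt[2] * 1e9:.1f} nM")
```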
de Sequeira, Danielly C M; Peixoto, Mariana L P; De Luca, Paula M; Oliveira-Ferreira, Joseli; Antas, Paulo R Z; Borba, Cintia M
2013-10-31
Purpureocillium lilacinum is an emerging pathogenic fungus that can cause different clinical manifestations ranging from cutaneous and sub-cutaneous infections to severe oculomycosis. In this study, using both conventional indirect immunofluorescence and non-conventional flow cytometry approaches, IgG antibodies were readily detected in both C57BL/6 immunocompetent and immunosuppressed mice after i.v. infection with P. lilacinum. The humoral immune response was specific, since virtually no antibodies were detected in the serum of control mice. Flow cytometry assays also showed both quantitative and qualitative differences in total IgG and its isotypes in sera of immunocompetent and immunosuppressed infected mice. Although a good starting point, it is clear that the effectiveness of serological assays for P. lilacinum hyalohyphomycosis identification in clinical studies still requires further standardization. Upon further validation in humans, these techniques have the potential to be suitable to detect P. lilacinum infection in patients, thereby avoiding current laborious and time-consuming culture techniques. © 2013.
Investigation of starting transients in the thermally choked ram accelerator
NASA Technical Reports Server (NTRS)
Burnham, E. A.; Hinkey, J. B.; Bruckner, A. P.
1992-01-01
An experimental investigation of the starting transients of the thermally choked ram accelerator is presented in this paper. Construction of a highly instrumented tube section and instrumentation inserts provide high resolution experimental pressure, luminosity, and electromagnetic data of the starting transients. Data obtained prior to and following the entrance diaphragm show detailed development of shock systems in both combustible and inert mixtures. With an evacuated launch tube, starting the diffuser is possible at any Mach number above the Kantrowitz Mach number. The detrimental effects and possible solutions of higher launch tube pressures and excessive obturator leakage (blow-by) are discussed. Ignition of a combustible mixture is demonstrated with both perforated and solid obturators. The relative advantages and disadvantages of each are discussed. Data obtained from these starting experiments enhance the understanding of the ram accelerator, as well as assist in the validation of unsteady, chemically reacting CFD codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katanin, A. A., E-mail: katanin@mail.ru
We consider formulations of the functional renormalization-group (fRG) flow for correlated electronic systems with the dynamical mean-field theory as a starting point. We classify the corresponding renormalization-group schemes into those neglecting one-particle irreducible six-point vertices (with respect to the local Green's functions) and those neglecting one-particle reducible six-point vertices. The former class is represented by the recently introduced DMF²RG approach [31], but also by the scale-dependent generalization of the one-particle irreducible representation (with respect to local Green's functions, 1PI-LGF) of the generating functional [20]. The second class is represented by the fRG flow within the dual fermion approach [16, 32]. We compare formulations of the fRG approach in each of these cases and suggest their further application to study 2D systems within the Hubbard model.
NASA Astrophysics Data System (ADS)
Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin
2017-06-01
Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a proposed two-step rule for updating the design point. This part finishes after a small number of samples have been generated. Then RSM starts to work using Bucher's experimental design, with the last design point and a proposed effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
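The importance-sampling half of the algorithm can be illustrated in a few lines: sample around a current design-point estimate rather than the origin, and reweight each sample by the density ratio. This is a generic sketch under an assumed limit-state function; the paper's specific two-step design-point updating rule and Bucher-based RSM stage are not reproduced.

```python
# Sketch of importance sampling for failure probability: draw samples
# centered at an assumed design point and reweight by f(x)/h(x).
import numpy as np
from scipy.stats import multivariate_normal as mvn

def g(x):                      # assumed limit-state function; failure when g(x) < 0
    return 3.0 - x[:, 0] - x[:, 1]

design_point = np.array([1.5, 1.5])     # assumed current design-point estimate
rng = np.random.default_rng(2)
x = rng.multivariate_normal(design_point, np.eye(2), size=2000)

f = mvn(mean=np.zeros(2)).pdf(x)        # true standard-normal density
h = mvn(mean=design_point).pdf(x)       # sampling density at the design point
pf = np.mean((g(x) < 0) * f / h)        # reweighted failure probability
print(f"estimated Pf = {pf:.4f}")
```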
Brigo, Alessandro; Lee, Keun Woo; Iurcu Mustata, Gabriela; Briggs, James M.
2005-01-01
HIV-1 integrase (IN) is an essential enzyme for the viral replication and an interesting target for the design of new pharmaceuticals for multidrug therapy of AIDS. Single and multiple mutations of IN at residues T66, S153, or M154 confer degrees of resistance to several inhibitors that prevent the enzyme from performing its normal strand transfer activity. Four different conformations of IN were chosen from a prior molecular dynamics (MD) simulation on the modeled IN T66I/M154I catalytic core domain as starting points for additional MD studies. The aim of this article is to understand the dynamic features that may play roles in the catalytic activity of the double mutant enzyme in the absence of any inhibitor. Moreover, we want to verify the influence of using different starting points on the MD trajectories and associated dynamical properties. By comparison of the trajectories obtained from these MD simulations we have demonstrated that the starting point does not affect the conformational space explored by this protein and that the time of the simulation is long enough to achieve convergence for this system. PMID:15764656
Validation of psychoanalytic theories: towards a conceptualization of references.
Zachrisson, Anders; Zachrisson, Henrik Daae
2005-10-01
The authors discuss criteria for the validation of psychoanalytic theories and develop a heuristic and normative model of the references needed for this. Their core question in this paper is: can psychoanalytic theories be validated exclusively from within psychoanalytic theory (internal validation), or are references to sources of knowledge other than psychoanalysis also necessary (external validation)? They discuss aspects of the classic truth criteria correspondence and coherence, both from the point of view of contemporary psychoanalysis and of contemporary philosophy of science. The authors present arguments for both external and internal validation. Internal validation has to deal with the problems of subjectivity of observations and circularity of reasoning, external validation with the problem of relevance. They recommend a critical attitude towards psychoanalytic theories, which, by carefully scrutinizing weak points and invalidating observations in the theories, reduces the risk of wishful thinking. The authors conclude by sketching a heuristic model of validation. This model combines correspondence and coherence with internal and external validation into a four-leaf model for references for the process of validating psychoanalytic theories.
Group Cohesion DEOCS 4.1 Construct Validity Summary
2017-08-01
...See Table 4 for more information regarding item reliabilities. The relationship between the original four-point scale (Organizational Cohesion) and...future analyses, including those using the seven-point scale. Tables 4 and 5 provide additional information regarding the reliability and descriptive
Comprehensive overview of the Point-by-Point model of prompt emission in fission
NASA Astrophysics Data System (ADS)
Tudora, A.; Hambsch, F.-J.
2017-08-01
The investigation of prompt emission in fission is very important for understanding the fission process and for improving the quality of evaluated nuclear data required for new applications. In the last decade remarkable efforts were made both in the development of prompt emission models and in the experimental investigation of the properties of fission fragments and of prompt neutron and γ-ray emission. The accurate experimental data concerning prompt neutron multiplicity as a function of fragment mass and total kinetic energy for 252Cf(SF) and 235U(n,f) recently measured at JRC-Geel (as well as other prompt emission data) allow a consistent and very detailed validation of the Point-by-Point (PbP) deterministic model of prompt emission. The PbP model results describe very well a large variety of experimental data, starting from the multi-parametric matrices of prompt neutron multiplicity ν(A,TKE) and γ-ray energy Eγ(A,TKE), which validate the model itself, passing through different average prompt emission quantities as a function of A (e.g., ν(A), Eγ(A), ⟨ε⟩(A), etc.) and as a function of TKE (e.g., ν(TKE), Eγ(TKE)), up to the prompt neutron distribution P(ν) and the total average prompt neutron spectrum. The PbP model does not use free or adjustable parameters. To calculate the multi-parametric matrices it needs only data included in the Reference Input Parameter Library (RIPL) of the IAEA. To provide average prompt emission quantities as a function of A and of TKE, and total average quantities, the multi-parametric matrices are averaged over reliable experimental fragment distributions. The PbP results are also in agreement with the results of the Monte Carlo prompt emission codes FIFRELIN, CGMF and FREYA. The good description of a large variety of experimental data proves the capability of the PbP model to be used in nuclear data evaluations and its reliability in predicting prompt emission data for fissioning nuclei and incident energies for which experimental information is completely missing. The PbP treatment can also provide input parameters for the improved Los Alamos model with non-equal residual temperature distributions recently reported by Madland and Kahler, especially for fissioning nuclei without any experimental information concerning prompt emission.
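The averaging step mentioned above is a simple weighted reduction of the multi-parametric matrices. The sketch below shows it on toy arrays; in the PbP model the matrices come from RIPL-based calculations and the weights from experimental fragment yield distributions Y(A,TKE).

```python
# Sketch of averaging a multi-parametric matrix nu(A, TKE) over a fragment
# yield Y(A, TKE) to obtain nu(A), nu(TKE), and the total average
# multiplicity (toy arrays, not RIPL or experimental data).
import numpy as np

rng = np.random.default_rng(3)
nu = rng.uniform(0.5, 4.0, size=(60, 40))   # nu(A, TKE) on a toy A x TKE grid
Y = rng.uniform(0.0, 1.0, size=(60, 40))    # fragment yield Y(A, TKE)

nu_of_A = (nu * Y).sum(axis=1) / Y.sum(axis=1)     # nu(A): average over TKE
nu_of_TKE = (nu * Y).sum(axis=0) / Y.sum(axis=0)   # nu(TKE): average over A
nu_bar = (nu * Y).sum() / Y.sum()                  # total average multiplicity
print(f"total average nu = {nu_bar:.3f}")
```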
Vegetation in transition: the Southwest's dynamic past century
Raymond M. Turner
2005-01-01
Monitoring long-term vegetation change often requires selection of a temporal baseline. Any such starting point is to some degree artificial, but in some instances there are aids that can be used as guides to baseline selection. Matched photographs duplicating scenes first recorded on film a century or more ago reveal changes that help select the starting...
ERIC Educational Resources Information Center
Tovstiga, George; Fantner, Ernest J.
2000-01-01
Examines implications of the networked economy for e-commerce business start-ups. Revisits the notion of "value" and "value creation" in a network context. Examines "value" relative to technological innovation. Looks at implications of the network environment for the organization and transformation of the enterprise's…
Increased Coal Plant Flexibility Can Improve Renewables Integration
Practices that enable lower turndowns, faster starts and stops, and faster ramping between load set-points ... faster ramp rates and faster and less expensive starts. Flexible Load - Demand Response Resources: demand response (DR) is a load management practice of deliberately reducing or adding load to balance the system.
Nanocrystalline ceramic materials
Siegel, R.W.; Nieman, G.W.; Weertman, J.R.
1994-06-14
A method is disclosed for preparing a treated nanocrystalline metallic material. The method of preparation includes providing a starting nanocrystalline metallic material with a grain size less than about 35 nm, compacting the starting nanocrystalline metallic material in an inert atmosphere and annealing the compacted metallic material at a temperature less than about one-half the melting point of the metallic material. 19 figs.
Let's Start in Our Own Backyard: Children's Engagement with Science through the Natural Environment
ERIC Educational Resources Information Center
Alexander, Athalie; Russo, Sharon
2010-01-01
Capitalising on areas in which teachers feel most comfortable, the teaching of Biology, environmental education or nature to young children can be an alternative way of introducing and understanding Science. A "Citizen Science" program currently being run by the University of South Australia (UniSA) may be an appropriate starting point.…
ERIC Educational Resources Information Center
Avraamidou, Lucy
2017-01-01
Given reform recommendations emphasizing scientific inquiry and empirical evidence pointing to the difficulties beginning teachers face in enacting inquiry-based science, this study explores a well-started beginning elementary teacher's (Sofia) beliefs about inquiry-based science and related instructional practices. In order to explore Sofia's…
Improving Head Start Students' Early Literacy Skills Using Technology
ERIC Educational Resources Information Center
Shamir, Haya; Yoder, Erik; Feehan, Kathryn; Pocklington, David
2018-01-01
While recent literature has pointed to the efficacy of computer-assisted instruction (CAI) in providing educational intervention at an early age, there is a lack of research exploring its use in a pre-kindergarten setting. The current study assessed the efficacy of CAI on students at the start of their academic careers. The Waterford Early…
CVD facility electrical system captor/dapper study
DOE Office of Scientific and Technical Information (OSTI.GOV)
SINGH, G.
1999-10-28
Project W-441, CVD Facility Electrical System CAPTOR/DAPPER Study, validates Meier's hand calculations. This study includes load flow, short circuit, voltage drop, protective device coordination, and transient motor starting (TMS) analyses.
Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.
Kolossa, Antonio; Kopp, Bruno
2016-01-01
The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
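A stripped-down version of such a synthetic validity test can be sketched as follows: generate data from a known model at a chosen noise level and sample size, then ask which candidate model a selection criterion prefers. BIC is used here as a simple stand-in for the Bayesian exceedance probabilities of the study, and the linear models are illustrative assumptions.

```python
# Sketch of a synthetic validity test: simulate from a "true" model, add
# measurement noise, and compare candidate models by BIC (a stand-in for
# the exceedance probabilities used in the study).
import numpy as np

def fit_bic(X, y):
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, k = X.shape
    rss = float(res[0]) if res.size else float(((y - X @ beta) ** 2).sum())
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(4)
n_trials, sigma = 100, 0.5              # vary these to probe validity limits
x1, x2 = rng.normal(size=(2, n_trials))
y = 1.0 + 2.0 * x1 + sigma * rng.normal(size=n_trials)   # data-generating model

ones = np.ones(n_trials)
models = {
    "null":      np.column_stack([ones]),
    "true (x1)": np.column_stack([ones, x1]),
    "x1 + x2":   np.column_stack([ones, x1, x2]),
}
for name, X in models.items():
    print(name, round(fit_bic(X, y), 1))   # lowest BIC should be "true (x1)"
```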
John Stanturf; Brian J. Palik; Mary I. Williams; R. Kasten Dumroese
2014-01-01
An estimated 2 billion ha of forests are degraded globally and global change suggests even greater need for forest restoration. Four forest restoration paradigms are identified and discussed: revegetation, ecological restoration, functional restoration, and forest landscape restoration. Restoration is examined in terms of a degraded starting point and an ending point...
27 CFR 9.108 - Ozark Mountain.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., revised 1969); (2) Jefferson City, Missouri (1955, revised 1970); (3) Springfield, Missouri (1954, revised... following boundary description is the point at which the Missouri River joins the Mississippi River north of... the starting point westward along the Missouri River until it meets the Osage River; (ii) Then further...
Validation of a prognostic score for hidden cancer in unprovoked venous thromboembolism
Otero, Remedios; Jimenez, David; Praena-Fernandez, Juan Manuel; Font, Carme; Falga, Conxita; Soler, Silvia; Riesco, David; Verhamme, Peter; Monreal, Manuel
2018-01-01
The usefulness of a diagnostic workup for occult cancer in patients with venous thromboembolism (VTE) is controversial. We used the RIETE (Registro Informatizado Enfermedad Trombo Embólica) database to perform a nested case-control study to validate a prognostic score that identifies patients with unprovoked VTE at increased risk for cancer. We dichotomized patients as having low- (≤2 points) or high (≥3 points) risk for cancer, and tried to validate the score at 12 and 24 months. From January 2014 to October 2016, 11,695 VTE patients were recruited. Of these, 1,360 with unprovoked VTE (11.6%) were eligible for the study. At 12 months, 52 patients (3.8%; 95%CI: 2.9–5%) were diagnosed with cancer. Among 905 patients (67%) scoring ≤2 points, 22 (2.4%) had cancer. Among 455 scoring ≥3 points, 30 (6.6%) had cancer (hazard ratio 2.8; 95%CI 1.6–5; p<0.01). C-statistic was 0.63 (95%CI 0.55–0.71). At 24 months, 58 patients (4.3%; 95%CI: 3.3–5.5%) were diagnosed with cancer. Among 905 patients scoring ≤2 points, 26 (2.9%) had cancer. Among 455 patients scoring ≥3 points, 32 (7%) had cancer (hazard ratio 2.6; 95%CI 1.5–4.3; p<0.01). C-statistic was 0.61 (95%CI, 0.54–0.69). We validated our prognostic score at 12 and 24 months, although prospective cohort validation is needed. This may help to identify patients for whom more extensive screening workup may be required. PMID:29558509
Tympanic thermometer performance validation by use of a body-temperature fixed point blackbody
NASA Astrophysics Data System (ADS)
Machin, Graham; Simpson, Robert
2003-04-01
The use of infrared tympanic thermometers within the medical community (and more generically in the public domain) has recently grown rapidly, displacing more traditional forms of thermometry such as mercury-in-glass. Besides the obvious health concerns over mercury, the increase in the use of tympanic thermometers is related to a number of factors such as their speed and relatively non-invasive method of operation. The calibration and testing of such devices is covered by a number of international standards (ASTM [1], prEN [2], JIS [3]) which specify the design of calibration blackbodies. However, these calibration sources are impractical for day-to-day in situ validation purposes. In addition, several studies (e.g., Modell et al. [4], Craig et al. [5]) have thrown doubt on the accuracy of tympanic thermometers in clinical use. With this in mind the NPL is developing a practical, portable and robust primary reference fixed point source for tympanic thermometer validation. The aim of this simple device is to give the clinician a rapid way of validating the performance of their tympanic thermometer, enabling the detection of malfunctioning thermometers and giving confidence in the measurement to the clinician (and patient!) at the point of use. The reference fixed point operates at a temperature of 36.3 °C (97.3 °F) with a repeatability of approximately ±20 mK. The fixed-point design has taken into consideration the optical characteristics of tympanic thermometers, enabling wide-angled field-of-view devices to be successfully tested. The overall uncertainty of the device is estimated to be less than 0.1 °C. The paper gives a description of the fixed point, its design and construction, as well as the results to date of validation tests.
A General Method for Solving Systems of Non-Linear Equations
NASA Technical Reports Server (NTRS)
Nachtsheim, Philip R.; Deiss, Ron (Technical Monitor)
1995-01-01
The method of steepest descent is modified so that accelerated convergence is achieved near a root. It is assumed that the function of interest can be approximated near a root by a quadratic form. An eigenvector of the quadratic form is found by evaluating the function and its gradient at an arbitrary point and another suitably selected point. The terminal point of the eigenvector is chosen to lie on the line segment joining the two points. The terminal point found lies on an axis of the quadratic form. The selection of a suitable step size at this point leads directly to the root in the direction of steepest descent in a single step. Newton's root finding method not infrequently diverges if the starting point is far from the root. However, the current method in these regions merely reverts to the method of steepest descent with an adaptive step size. The current method's performance should match that of the Levenberg-Marquardt root finding method since they both share the ability to converge from a starting point far from the root and both exhibit quadratic convergence near a root. The Levenberg-Marquardt method requires storage for coefficients of linear equations. The current method which does not require the solution of linear equations requires more time for additional function and gradient evaluations. The classic trade off of time for space separates the two methods.
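For orientation, the sketch below implements plain steepest descent on f(x) = ||F(x)||²/2 with the kind of adaptive step size the abstract alludes to. The eigenvector-based acceleration near a root is not reproduced, so this is an illustration of the fallback behavior rather than the paper's full method, and the test system is an assumption.

```python
# Sketch of a steepest-descent root finder with an adaptive step size,
# minimizing f(x) = 0.5 * ||F(x)||^2 for an example nonlinear system.
import numpy as np

def F(x):
    return np.array([x[0] ** 2 + x[1] - 3.0, x[0] - x[1]])

def jacobian(x):
    return np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])

def steepest_descent_root(x, step=0.1, tol=1e-10, max_iter=5000):
    for _ in range(max_iter):
        r = F(x)
        grad = jacobian(x).T @ r             # gradient of 0.5 * ||F||^2
        x_new = x - step * grad
        if 0.5 * F(x_new) @ F(x_new) < 0.5 * r @ r:
            x, step = x_new, step * 1.2      # accept the move and grow the step
        else:
            step *= 0.5                      # reject the move and shrink the step
        if np.linalg.norm(grad) < tol:
            return x
    return x

print(steepest_descent_root(np.array([5.0, -3.0])))  # starting far from the root
```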
Bham, Ghulam H; Leu, Ming C; Vallati, Manoj; Mathur, Durga R
2014-06-01
This study is aimed at validating a driving simulator (DS) for the study of driver behavior in work zones. A validation study requires field data collection. For studies conducted in highway work zones, the availability of safe vantage points for data collection at critical locations can be a significant challenge. A validation framework is therefore proposed in this paper, demonstrated using a fixed-based DS that addresses the issue by using a global positioning system (GPS). The validation of the DS was conducted using objective and subjective evaluations. The objective validation was divided into qualitative and quantitative evaluations. The DS was validated by comparing the results of simulation with the field data, which were collected using a GPS along the highway and video recordings at specific locations in a work zone. The constructed work zone scenario in the DS was subjectively evaluated with 46 participants. The objective evaluation established the absolute and relative validity of the DS. The mean speeds from the DS data showed excellent agreement with the field data. The subjective evaluation indicated realistic driving experience by the participants. The use of GPS showed that continuous data collected along the highway can overcome the challenges of unavailability of safe vantage points especially at critical locations. Further, a validated DS can be used for examining driver behavior in complex situations by replicating realistic scenarios. Copyright © 2014 Elsevier Ltd. All rights reserved.
STRUM: structure-based prediction of protein stability changes upon single-point mutation.
Quan, Lijun; Lv, Qiang; Zhang, Yang
2016-10-01
Mutations in the human genome occur mainly through single nucleotide polymorphism, some of which can affect the stability and function of proteins, causing human disease. Several methods have been proposed to predict the effect of mutations on protein stability, but most require features from an experimental structure. Given the fast progress in protein structure prediction, this work explores the possibility of improving mutation-induced stability change prediction using low-resolution structure modeling. We developed a new method (STRUM) for predicting the stability change caused by single-point mutations. Starting from wild-type sequences, 3D models are constructed by iterative threading assembly refinement (I-TASSER) simulations, where physics- and knowledge-based energy functions are derived from the I-TASSER models and used to train STRUM models through gradient boosting regression. STRUM was assessed by 5-fold cross validation on 3421 experimentally determined mutations from 150 proteins. The Pearson correlation coefficient (PCC) between predicted and measured changes of the Gibbs free-energy gap, ΔΔG, upon mutation reaches 0.79, with a root-mean-square error of 1.2 kcal/mol in the mutation-based cross-validations. The PCC is reduced if training and test mutations are separated by non-homologous proteins, which reflects inherent correlations in the current mutation sample. Nevertheless, the results significantly outperform other state-of-the-art methods, including those built on experimental protein structures. Detailed analyses show that the most sensitive features in STRUM are the physics-based energy terms on I-TASSER models and the conservation scores from multiple-threading template alignments. However, the ΔΔG prediction accuracy has only a marginal dependence on the accuracy of the protein structure models as long as the global fold is correct. These data demonstrate the feasibility of using low-resolution structure modeling for high-accuracy stability change prediction upon point mutations. Availability and implementation: http://zhanglab.ccmb.med.umich.edu/STRUM/. Contact: qiang@suda.edu.cn and zhng@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
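The regression stage is standard gradient boosting and is easy to sketch. Below, random features stand in for the I-TASSER-derived energy terms and conservation scores, and scikit-learn's GradientBoostingRegressor is scored by 5-fold cross validation; hyperparameters and the toy target are illustrative assumptions, not the published STRUM configuration.

```python
# Sketch of the regression step: gradient boosting trained on model-derived
# features to predict ddG, scored by 5-fold cross validation (random
# features stand in for the actual energy and conservation terms).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 12))                            # 12 toy features
ddg = X[:, 0] * 1.5 - X[:, 1] + rng.normal(0, 0.5, 500)   # toy ddG target

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05)
r = cross_val_score(model, X, ddg, cv=5, scoring="r2")
print(f"mean 5-fold R^2 = {r.mean():.2f}")
```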
Agreeing on Validity Arguments
ERIC Educational Resources Information Center
Sireci, Stephen G.
2013-01-01
Kane (this issue) presents a comprehensive review of validity theory and reminds us that the focus of validation is on test score interpretations and use. In reacting to his article, I support the argument-based approach to validity and all of the major points regarding validation made by Dr. Kane. In addition, I call for a simpler, three-step…
NASA Astrophysics Data System (ADS)
Draghici, Sorin; Cumberland, Lonnie T., Jr.; Kovari, Ladislau C.
2000-04-01
This paper presents some results of data mining HIV genotypic and structural data. Our aim is to relate structural features of HIV enzymes essential to its reproductive abilities to the drug resistance phenomenon. This paper concentrates on the HIV protease enzyme and Indinavir, one of the FDA-approved protease inhibitors. Our starting point was the current list of HIV mutations related to drug resistance. We used the fact that some molecular structures determined through high-resolution X-ray crystallography were available for the protease-Indinavir complex. Starting with these structures and the known mutations, we modelled the mutant proteases and studied the pattern of atomic contacts between the protease and the drug. After suitable pre-processing, these patterns have been used as the input of our data mining process. We have used both supervised and unsupervised learning techniques with the aim of understanding the relationship between structural features at a molecular level and resistance to Indinavir. The supervised learning was aimed at predicting IC90 values for arbitrary mutants. The self-organizing feature map (SOFM) was aimed at identifying those structural features that are important for drug resistance and discovering a classifier based on such features. We have used validation and cross-validation to test the generalization abilities of the learning paradigm we have designed. The straightforward supervised learning was able to learn very successfully, but validation results are less than satisfactory. This is due to the insufficient number of patterns in the training set, which in turn is due to the scarcity of the available data. The data mining using the SOFM was very successful. We have managed to distinguish between resistant and non-resistant mutants using structural features. We have been able to divide all reported HIV mutants into several categories based on their 3-dimensional molecular structures and the pattern of contacts between the mutant protease and Indinavir. Our classifier shows reasonably good prediction performance, being able to predict the drug resistance of previously unseen mutants with an accuracy of between 60% and 70%. We believe that this performance can be greatly improved once more data becomes available. The results presented here support the hypothesis that structural features of the molecular structure can be used in antiviral drug treatment selection and drug design.
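The unsupervised half of the pipeline is a Kohonen self-organizing map. The sketch below shows the two core operations (best-matching-unit search and neighborhood-weighted weight update) on toy contact-pattern vectors; map size, decay schedules, and the input encoding are all assumptions rather than the authors' configuration.

```python
# Minimal self-organizing feature map (SOFM) sketch: best-matching-unit
# search plus a Gaussian-neighborhood weight update (toy input vectors).
import numpy as np

rng = np.random.default_rng(6)
data = rng.random((200, 16))                 # toy contact-pattern vectors
grid = rng.random((8, 8, 16))                # 8x8 map of weight vectors
iy, ix = np.indices((8, 8))                  # unit coordinates on the map

for t in range(2000):
    lr = 0.5 * (1 - t / 2000)                # decaying learning rate
    radius = 4.0 * (1 - t / 2000) + 0.5      # decaying neighborhood radius
    x = data[rng.integers(len(data))]
    d = ((grid - x) ** 2).sum(axis=2)        # distance from x to every unit
    by, bx = np.unravel_index(d.argmin(), d.shape)   # best matching unit
    h = np.exp(-((iy - by) ** 2 + (ix - bx) ** 2) / (2 * radius ** 2))
    grid += lr * h[:, :, None] * (x - grid)  # pull the BMU's neighbors toward x
```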
Janssen, Meriam M; Mathijssen, Jolanda J P; van Bon-Martens, Marja J H; van Oers, Hans A M; Garretsen, Henk F L
2014-05-24
An earlier study using social marketing and audience segmentation distinguished five segments of Dutch adolescents aged 12-18 years based on their attitudes towards alcohol. The present, qualitative study focuses on two of these five segments ('ordinaries' and 'ordinary sobers') and explores the attitudes of these two segments towards alcohol, and the role of parents and peers in their alcohol use, in more detail. This qualitative study was conducted in the province of North-Brabant, the Netherlands. With a 28-item questionnaire, segments of adolescents were identified. From the ordinaries and ordinary sobers who were willing to participate in a focus group, 55 adolescents (30 ordinaries and 25 ordinary sobers) were selected and invited to participate. Finally, six focus groups were conducted with 12-17-year-olds, i.e., three interviews with 17 ordinaries and three interviews with 20 ordinary sobers at three different high schools. The ordinaries thought that drinking alcohol was fun and relaxing. Curiosity was an important factor in starting to drink alcohol. Peer pressure played a role, e.g., it was difficult not to drink when peers were drinking. Most parents advised their child to drink a small amount only. The attitude of ordinary sobers towards alcohol was that drinking alcohol was stupid; moreover, they did not feel the need to drink. Most parents set strict rules and prohibited the use of alcohol before the age of 16. Qualitative insight into the attitudes towards alcohol and the role played by parents and peers revealed differences between ordinaries and ordinary sobers. Based on these differences and on health education theories, starting points for the development of interventions, for both parents and adolescents, are formulated. Important starting points for interventions targeting ordinaries are reducing perceived peer pressure and learning to make one's own choices. For the ordinary sobers, an important starting point is enabling them to express to others that they do not feel the need to drink alcohol. Starting points for parents include setting strict rules, restricting alcohol availability at home and monitoring their child's alcohol use.
Bailey, Beth A
2013-10-01
Measurement of carbon monoxide in expired air samples (ECO) is a non-invasive, cost-effective biochemical marker for smoking. Cut-points of 6-10 ppm have been established, though appropriate cut-points for pregnant women have been debated due to metabolic changes. This study assessed whether an ECO cut-point identifying at least 90% of pregnant smokers, and misidentifying fewer than 10% of non-smokers, could be established. Pregnant women (N=167) completed a validated self-report smoking assessment, a urine drug screen for cotinine (UDS), and provided an expired air sample twice during pregnancy. Half of the women reported non-smoking status early (51%) and late (53%) in pregnancy, confirmed by UDS. Using a traditional cut-point of 8 ppm or higher for the early pregnancy reading, only 1% of non-smokers were incorrectly identified as smokers, but only 56% of all smokers, and 67% of those who smoked 5+ cigarettes in the previous 24 h, were identified. However, at 4 ppm or higher, only 8% of non-smokers were misclassified as smokers, and 90% of all smokers and 96% of those who smoked 5+ cigarettes in the previous 24 h were identified. False positives were explained by heavy second-hand smoke exposure and marijuana use. Results were similar for late pregnancy ECO, with ROC analysis revealing an area under the curve of .95 for early pregnancy and .94 for late pregnancy readings. A lower 4 ppm ECO cut-point may be necessary to identify pregnant smokers using expired air samples, and this cut-point appears valid throughout pregnancy. Work is ongoing to validate findings in larger samples, but it appears that, if an appropriate cut-point is used, ECO is a valid method for determining smoking status in pregnancy. Copyright © 2013 Elsevier Ltd. All rights reserved.
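The cut-point analysis reduces to sweeping a threshold over labeled readings. The sketch below does this on synthetic ppm values with scikit-learn's ROC utilities, mirroring the reported 8 ppm versus 4 ppm trade-off; the simulated distributions are assumptions, not the study data.

```python
# Sketch of choosing an expired-air CO cut-point from labeled readings
# with an ROC curve (synthetic ppm values).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
smoker = np.r_[np.ones(80), np.zeros(80)]
ppm = np.r_[rng.gamma(6, 2, 80), rng.gamma(1.5, 1, 80)]   # toy ECO readings

fpr, tpr, thr = roc_curve(smoker, ppm)
best = thr[np.argmax(tpr - fpr)]          # Youden-optimal threshold
print(f"AUC = {roc_auc_score(smoker, ppm):.2f}, Youden cut ~ {best:.1f} ppm")
for cut in (8, 4):
    sens = (ppm[smoker == 1] >= cut).mean()
    spec = (ppm[smoker == 0] < cut).mean()
    print(f"cut {cut} ppm: sensitivity {sens:.2f}, specificity {spec:.2f}")
```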
Radar Image Simulation: Validation of the Point Scattering Method. Volume 2
1977-09-01
the Engineer Topographic Laboratory (ETL), Fort Belvoir, Virginia. This Radar Simulation Study was performed to validate the point scattering radar ... For radar, the number of independent samples in a given resolution cell is given by N = 2wL/(λ cos θ) (16), where θ is the radar incidence angle and w
Saucedo-Molina, T J; Gómez-Peresmitré, G
1998-01-01
To determine the diagnostic validity of the nutritional index (NI) in a sample of Mexican preadolescents. A total of 256 preadolescents, between 10 and 12 years old, male and female, students from Mexico City, were used to establish the diagnostic validity of the NI using the sensitivity and specificity method. The findings show that the conventional NI cut-off points had good sensitivity and specificity for the diagnosis of low weight, normality and obesity, but not for overweight. When the cut-off points of the NI were normalized, the sensitivity, specificity and prediction potency values were more suitable in all categories. When working with preadolescents, it is better to use the new cut-off points of the NI to obtain a more reliable diagnosis.
Fragmentation Point Detection of JPEG Images at DHT Using Validator
NASA Astrophysics Data System (ADS)
Mohamad, Kamaruddin Malik; Deris, Mustafa Mat
File carving is an important, practical technique for data recovery in digital forensics investigations and is particularly useful when filesystem metadata is unavailable or damaged. Research on the reassembly of JPEG files with RST markers fragmented within the scan area has been done before. However, fragmentation within the Define Huffman Table (DHT) segment is yet to be resolved. This paper analyzes fragmentation within the DHT area and lists all the fragmentation possibilities. Two main contributions are made in this paper. Firstly, three fragmentation points within the DHT area are identified. Secondly, several novel validators are proposed to detect these fragmentations. The results obtained from tests done on manually fragmented JPEG files showed that all three fragmentation points within the DHT are successfully detected using the validators.
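The paper's validators themselves are not reproduced here, but their flavor can be sketched from the public JPEG (ITU-T T.81) layout of a DHT segment: check the 0xFFC4 marker, the declared length, the table class/ID byte, and the consistency of the 16 code-length counts with the symbol bytes. The sketch below is an illustrative validator, not the authors' code.

```python
# Sketch of a DHT-segment validator: any structural inconsistency is a
# candidate fragmentation point.
def validate_dht(segment: bytes) -> bool:
    if len(segment) < 4 or segment[0:2] != b"\xff\xc4":
        return False                   # not a DHT marker
    declared = int.from_bytes(segment[2:4], "big")
    if declared != len(segment) - 2:
        return False                   # length field disagrees with the data
    pos = 4
    while pos < len(segment):
        tc_th = segment[pos]
        if (tc_th >> 4) > 1 or (tc_th & 0x0F) > 3:
            return False               # invalid table class or identifier
        counts = segment[pos + 1:pos + 17]
        if len(counts) < 16:
            return False               # truncated inside the count block
        pos += 17 + sum(counts)        # skip past the symbol bytes
    return pos == len(segment)         # symbols must end exactly at the boundary

print(validate_dht(b"\xff\xc4\x00\x13\x10" + bytes(16)))  # minimal valid table -> True
```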
Stoll, Dwight R; Sajulga, Ray W; Voigt, Bryan N; Larson, Eli J; Jeong, Lena N; Rutan, Sarah C
2017-11-10
An important research direction in the continued development of two-dimensional liquid chromatography (2D-LC) is to improve the detection sensitivity of the method. This is especially important in applications where injection of large volumes of effluent from the first-dimension (1D) column into the second-dimension (2D) column leads to severe 2D peak broadening and peak shape distortion. For example, this is common when coupling two reversed-phase columns and the organic solvent content of the 1D mobile phase overwhelms the 2D column with each injection of 1D effluent, leading to low resolution in the second dimension. In a previous study we validated a simulation approach based on the Craig distribution model and adapted from the work of Czok and Guiochon [1] that enabled accurate simulation of simple isocratic and gradient separations with very small injection volumes, and isocratic separations with mismatched injection and mobile phase solvents [2]. In the present study we have extended this simulation approach to simulate separations relevant to 2D-LC. Specifically, we have focused on simulating 2D separations where gradient elution conditions are used, there is mismatch between the sample solvent and the starting point in the gradient elution program, injection volumes approach or even exceed the dead volume of the 2D column, and the extent of sample loop filling is varied. To validate this simulation we have compared results from simulations and experiments for 101 different conditions, including variation in injection volume (0.4-80 μL), loop filling level (25-100%), and degree of mismatch between the sample organic solvent and the starting point in the gradient elution program (-20 to +20% ACN). We find that the simulation is accurate enough (median errors in retention time and peak width of -1.0 and -4.9%, without corrections for extra-column dispersion) to be useful in guiding optimization of 2D-LC separations. However, this requires that real injection profiles obtained from 2D-LC interface valves are used to simulate the introduction of samples into the 2D column. These profiles are highly asymmetric; simulation using simple rectangular pulses leads to peak widths that are far too narrow under many conditions. We believe the simulation approach developed here will be useful for addressing practical questions in the development of 2D-LC methods. Copyright © 2017 Elsevier B.V. All rights reserved.
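The Craig distribution model at the heart of the simulation is a discrete plate model: at each transfer, the mobile-phase fraction 1/(1+k) of the solute in every plate advances one plate. The toy sketch below uses a constant retention factor k; the validated simulator additionally handles gradients, solvent mismatch, and measured injection profiles, none of which are reproduced here.

```python
# Sketch of a Craig (plate) model of a chromatographic column with a
# constant retention factor k (toy parameters, illustrative only).
import numpy as np

n_plates, k = 200, 3.0
mobile_frac = 1.0 / (1.0 + k)           # fraction advanced per transfer

band = np.zeros(n_plates)
band[0] = 1.0                           # unit injection into plate 0
eluted = []
for _ in range(1500):                   # transfer steps
    moving = band * mobile_frac         # mobile-phase portion in each plate
    band -= moving
    eluted.append(moving[-1])           # solute leaving the last plate
    band[1:] += moving[:-1]             # advance the rest by one plate
print(f"peak elutes at transfer {int(np.argmax(eluted))}")  # ~ n_plates*(1+k)
```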
Mancilla-Martinez, Jeannette; Gámez, Perla B; Vagh, Shaher Banu; Lesaux, Nonie K
2016-01-01
This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension (Fenson et al., 2000, 2007; Jackson-Maldonado, Marchman, & Fernald, 2013) and the Spanish Vocabulary Extension for use with parents from low-income homes and their 24- to 48-month-old Spanish-English bilingual children. Study participants were drawn from Early Head Start and Head Start collaborative programs in the Northeastern United States in which English was the primary language used in the classroom. All families reported Spanish or Spanish-English as their home language(s). The MacArthur Communicative Development Inventories as well as the researcher-designed Spanish Vocabulary Extension were used as measures of children's English and Spanish productive vocabularies. Findings revealed the forms' concurrent and discriminant validity, on the basis of standardized measures of vocabulary, as measures of productive vocabulary for this growing bilingual population. These findings suggest that parent reports, including our researcher-designed form, represent a valid, cost-effective mechanism for vocabulary monitoring purposes in early childhood education settings.
Mandillo, Silvia; Tucci, Valter; Hölter, Sabine M.; Meziane, Hamid; Banchaabouchi, Mumna Al; Kallnik, Magdalena; Lad, Heena V.; Nolan, Patrick M.; Ouagazzal, Abdel-Mouttalib; Coghill, Emma L.; Gale, Karin; Golini, Elisabetta; Jacquot, Sylvie; Krezel, Wojtek; Parker, Andy; Riet, Fabrice; Schneider, Ilka; Marazziti, Daniela; Auwerx, Johan; Brown, Steve D. M.; Chambon, Pierre; Rosenthal, Nadia; Tocchini-Valentini, Glauco; Wurst, Wolfgang
2008-01-01
Establishing standard operating procedures (SOPs) as tools for the analysis of behavioral phenotypes is fundamental to mouse functional genomics. It is essential that the tests designed provide reliable measures of the process under investigation but most importantly that these are reproducible across both time and laboratories. For this reason, we devised and tested a set of SOPs to investigate mouse behavior. Five research centers were involved across France, Germany, Italy, and the UK in this study, as part of the EUMORPHIA program. All the procedures underwent a cross-validation experimental study to investigate the robustness of the designed protocols. Four inbred reference strains (C57BL/6J, C3HeB/FeJ, BALB/cByJ, 129S2/SvPas), reflecting their use as common background strains in mutagenesis programs, were analyzed to validate these tests. We demonstrate that the operating procedures employed, which includes open field, SHIRPA, grip-strength, rotarod, Y-maze, prepulse inhibition of acoustic startle response, and tail flick tests, generated reproducible results between laboratories for a number of the test output parameters. However, we also identified several uncontrolled variables that constitute confounding factors in behavioral phenotyping. The EUMORPHIA SOPs described here are an important start-point for the ongoing development of increasingly robust phenotyping platforms and their application in large-scale, multicentre mouse phenotyping programs. PMID:18505770
Lactation in the Human Breast From a Fluid Dynamics Point of View.
Negin Mortazavi, S; Geddes, Donna; Hassanipour, Fatemeh
2017-01-01
This study is a collaborative effort among lactation specialists and fluid dynamics engineers. The paper presents clinical results for the suckling pressure pattern in the lactating human breast as well as 3D computational fluid dynamics (CFD) modeling of milk flow using these clinical inputs. The investigation starts with a careful, statistically representative measurement of suckling vacuum pressure, milk flow rate, and milk intake in a group of infants. The results from the clinical data show that suckling does not occur at a constant rate but changes in a rhythmic manner. These pressure profiles are then used as the boundary condition for the CFD study using the commercial ANSYS Fluent software. For the geometric model of the ductal system of the human breast, this work takes advantage of a recent advance in the development of a validated phantom that has been produced as a ground truth for imaging applications for the breast. The geometric model is introduced into the CFD simulations with the aforementioned boundary conditions. The results for milk intake from the CFD simulation and the clinical data were compared and cross-validated. Also, the variation of milk intake versus suckling pressure is presented and analyzed. Both the clinical data and the CFD simulation show that the maximum milk flow rate is not related to the largest vacuum pressure or the longest feeding duration, indicating that other factors influence milk intake by infants.
Automated classification and quantitative analysis of arterial and venous vessels in fundus images
NASA Astrophysics Data System (ADS)
Alam, Minhaj; Son, Taeyoon; Toslak, Devrim; Lim, Jennifer I.; Yao, Xincheng
2018-02-01
It is known that retinopathies may affect arteries and veins differently. Therefore, reliable differentiation of arteries and veins is essential for computer-aided analysis of fundus images. The purpose of this study is to validate an automated method for robust classification of arteries and veins (A-V) in digital fundus images. We combine optical density ratio (ODR) analysis and a blood vessel tracking algorithm to classify arteries and veins. A matched filtering method is used to enhance retinal blood vessels. Bottom-hat filtering and global thresholding are used to segment the vessels and skeletonize individual blood vessels. The vessel tracking algorithm is used to locate the optic disk and to identify the source nodes of blood vessels in the optic disk area. Each node can be identified as vein or artery using ODR information. Using the source nodes as starting points, each whole vessel trace is then tracked and classified as vein or artery using vessel curvature and angle information. 50 color fundus images from diabetic retinopathy patients were used to test the algorithm. Sensitivity, specificity, and accuracy metrics were measured to assess the validity of the proposed classification method compared to ground truths created by two independent observers. The algorithm demonstrated 97.52% accuracy in identifying blood vessels as vein or artery. A quantitative analysis following A-V classification showed that the average A-V width ratio for NPDR subjects with hypertension decreased significantly (43.13%).
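The segmentation steps named in the abstract map directly onto standard image-processing calls. The sketch below applies bottom-hat (black top-hat) filtering, Otsu global thresholding, and skeletonization to a synthetic image with scikit-image; the matched filtering, vessel tracking, and ODR-based A-V labeling stages are omitted, so this is only a fragment of the described pipeline.

```python
# Sketch of the named segmentation steps on a synthetic "fundus" image:
# bottom-hat enhancement of dark vessels, global threshold, skeleton.
import numpy as np
from skimage.morphology import black_tophat, disk, skeletonize
from skimage.filters import threshold_otsu

rng = np.random.default_rng(8)
img = rng.normal(0.6, 0.05, (128, 128))
img[60:68, :] = 0.3                          # a dark, vessel-like stripe

enhanced = black_tophat(img, disk(7))        # bright response where vessels are dark
mask = enhanced > threshold_otsu(enhanced)   # global threshold
skeleton = skeletonize(mask)                 # one-pixel-wide centerlines
print(f"skeleton pixels: {int(skeleton.sum())}")
```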
Schorer, Jörg; Rienhoff, Rebecca; Fischer, Lennart; Baker, Joseph
2017-01-01
In most sports, the development of elite athletes is a long-term process of talent identification and support. Typically, talent selection systems administer a multi-faceted strategy including national coach observations and varying physical and psychological tests when deciding who is chosen for talent development. The aim of this exploratory study was to evaluate the prognostic validity of talent selections by varying groups 10 years after they had been conducted. This study used a unique, multi-phased approach. Phase 1 involved players (n = 68) in 2001 completing a battery of general and sport-specific tests of handball ‘talent’ and performance. In Phase 2, national and regional coaches (n = 7) in 2001 who attended training camps identified the most talented players. In Phase 3, current novice and advanced handball players (n = 12 in each group) selected the most talented from short videos of matches played during the talent camp. Analyses compared predictions among all groups with a best model-fit derived from the motor tests. Results revealed little difference between regional and national coaches in the prediction of future performance and little difference in forecasting performance between novices and players. The best model-fit regression by the motor-tests outperformed all predictions. While several limitations are discussed, this study is a useful starting point for future investigations considering athlete selection decisions in talent identification in sport. PMID:28744238
Annotated Chemical Patent Corpus: A Gold Standard for Text Mining
Akhondi, Saber A.; Klenner, Alexander G.; Tyrchan, Christian; Manchala, Anil K.; Boppana, Kiran; Lowe, Daniel; Zimmermann, Marc; Jagarlapudi, Sarma A. R. P.; Sayle, Roger; Kors, Jan A.; Muresan, Sorel
2014-01-01
Exploring the chemical and biological space covered by patent applications is crucial in early-stage medicinal chemistry activities. Patent analysis can provide understanding of compound prior art, novelty checking, validation of biological assays, and identification of new starting points for chemical exploration. Extracting chemical and biological entities from patents through manual extraction by expert curators can take a substantial amount of time and resources. Text mining methods can help to ease this process. To validate the performance of such methods, a manually annotated patent corpus is essential. In this study we have produced a large gold standard chemical patent corpus. We developed annotation guidelines and selected 200 full patents from the World Intellectual Property Organization, United States Patent and Trademark Office, and European Patent Office. The patents were pre-annotated automatically and made available to four independent annotator groups, each consisting of two to ten annotators. The annotators marked chemicals in different subclasses, diseases, targets, and modes of action. Spelling mistakes and spurious line breaks due to optical character recognition errors were also annotated. A subset of 47 patents was annotated by at least three annotator groups, from which harmonized annotations and inter-annotator agreement scores were derived. One group annotated the full set. The patent corpus includes 400,125 annotations for the full set and 36,537 annotations for the harmonized set. All patents and annotated entities are publicly available at www.biosemantics.org. PMID:25268232
Development and Validation of a Model for Hydrogen Reduction of JSC-1A
NASA Technical Reports Server (NTRS)
Hegde, U.; Balasubramaniam, R.; Gokoglu, S.
2009-01-01
Hydrogen reduction of lunar regolith has been proposed as a viable technology for oxygen production on the moon. Hydrogen reduces FeO present in the lunar regolith to form metallic iron and water. The water may be electrolyzed to recycle the hydrogen and produce oxygen. Depending upon the regolith composition, FeO may be bound to TiO2 as ilmenite or it may be dispersed in glassy substrates. Some testing of hydrogen reduction has been conducted with Apollo-returned lunar regolith samples. However, due to the restricted amount of lunar material available for testing, detailed understanding and modeling of the reduction process in regolith have not yet been developed. As a step in this direction, hydrogen reduction studies have been carried out in more detail with lunar regolith simulants such as JSC-1A by NASA and other organizations. While JSC-1A has some similarities with lunar regolith, it does not duplicate the wide variety of regolith types on the moon; for example, it contains almost no ilmenite. Nonetheless, it is a good starting point for developing an understanding of the hydrogen reduction process with regolith-like material. In this paper, a model utilizing a shrinking core formulation coupled with the reactor flow is described and validated against experimental data on hydrogen reduction of JSC-1A.
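The shrinking-core formulation mentioned above has a standard closed form; a minimal sketch for the surface-reaction-controlled regime follows, with a purely hypothetical time-for-complete-conversion constant (not a value from the paper).

```python
import numpy as np

def conversion(t, tau):
    """Shrinking-core model, surface-reaction-controlled regime:
    t/tau = 1 - (1 - X)**(1/3)  =>  X = 1 - (1 - t/tau)**3."""
    frac = np.clip(t / tau, 0.0, 1.0)
    return 1.0 - (1.0 - frac) ** 3

tau = 3600.0  # [s] hypothetical time for complete FeO conversion
for t in (600, 1800, 3600):
    print(f"t = {t:5d} s  ->  X = {conversion(t, tau):.2f}")
```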
Trends: Bearding the Proverbial Lion.
ERIC Educational Resources Information Center
Greckel, Wil
1989-01-01
Describes the use of television commercials to teach classical music. Points out that a large number of commercials use classical selections which can serve as a starting point for introducing students to this form. Urges music educators to broaden their views and use these truncated selections to transmit our cultural heritage. (KO)
The Ideology of Certainty in Mathematics Education.
ERIC Educational Resources Information Center
Borba, Marcelo C.; Skovsmose, Ole
1997-01-01
Presents one aspect that makes mathematics the final word in many discussions, the ideology of certainty. Argues that one way of challenging the ideology of certainty is to change classroom practice by introducing a landscape of discussion on chaotic nature where relativity, provisional starting points, different points of view, and uncertainty…
Music Cultural Pedagogy in the "Network Society"
ERIC Educational Resources Information Center
Sakai, Winfried
2014-01-01
The present contribution to theory construction in music educational research focuses on the contemporary requirements for general music education. One starting point is the normative claims of a democratic liberal education, as found in the fields of critical pedagogy and the sociology of education. Another point of departure is provided by…
Using Data Visualization to Examine an Academic Library Collection
ERIC Educational Resources Information Center
Finch, Jannette L.; Flenner, Angela R.
2016-01-01
The authors generated data visualizations to compare sections of the library book collection, expenditures in those areas, student enrollment in majors and minors, and number of courses. The visualizations resulting from the entered data provide an excellent starting point for conversations about possible imbalances in the collection and point to…
Development of the PEBLebl Traveling Salesman Problem Computerized Testbed
ERIC Educational Resources Information Center
Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew
2015-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…
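Since the TSP requires a closed tour back to the starting point, a common near-optimal heuristic is nearest neighbour; a minimal sketch follows (an illustration of the problem, not the PEBL testbed's implementation).

```python
import math

def nearest_neighbour_tour(cities):
    """Greedy heuristic: from the start city, repeatedly visit the nearest
    unvisited city, then return to the starting point (closed tour)."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(0)  # close the loop at the starting point
    return tour

points = [(0, 0), (2, 1), (1, 3), (4, 2)]
print(nearest_neighbour_tour(points))  # e.g. [0, 1, 3, 2, 0]
```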
Subramoney, Sreevidya; Björkelund, Cecilia; Guo, Xinxin; Skoog, Ingmar; Bosaeus, Ingvar; Lissner, Lauren
2014-12-01
To investigate validity of widely recommended anthropometric and total fat percentage cut-off points in screening for cardiovascular risk factors in women of different ages. A population-based sample of 1002 Swedish women aged 38, 50, 75 (younger, middle-aged and elderly, respectively) underwent anthropometry, health examinations and blood tests. Total fat was estimated (bioimpedance) in 670 women. Sensitivity and specificity of body mass index (BMI; ≥25 and ≥30), waist circumference (WC; ≥80 cm and ≥88 cm) and total fat percentage (TF; ≥35%) cut-off points for cardiovascular risk factors (dyslipidaemias, hypertension and hyperglycaemia) were calculated for each age. Cut-off points yielding high sensitivity together with modest specificity were considered valid. Women reporting hospital admission for cardiovascular disease were excluded. The sensitivity of WC ≥80 cm for one or more risk factors was ~60% in younger and middle-aged women, and 80% in elderly women. The specificity of WC ≥80 cm for one or more risk factors was 69%, 57% and 40% at the three ages (p < .05 for age trends). WC ≥80 cm yielded ~80% sensitivity for two or more risk factors across all ages. However, specificity decreased with increasing age (p < .0001), being 33% in the elderly. WC ≥88 cm provided better specificity in elderly women. BMI and TF % cut-off points were not better than WC. Validity of recommended anthropometric cut-off points in screening asymptomatic women varies with age. In younger and middle-aged women, WC ≥80 cm yielded high sensitivity and modest specificity for two or more risk factors; however, sensitivity for one or more risk factors was less than optimal. WC ≥88 cm showed better validity than WC ≥80 cm in elderly women. Our results support age-specific screening cut-off points for women. © 2014 the Nordic Societies of Public Health.
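For readers wanting to reproduce the kind of screening statistics reported above, a minimal sketch of sensitivity and specificity for a cut-off follows; the data are invented, not the study's.

```python
import numpy as np

def sens_spec(measure, cutoff, has_risk_factor):
    """Sensitivity/specificity of a screening cut-off against a binary outcome."""
    screened_pos = np.asarray(measure) >= cutoff
    outcome = np.asarray(has_risk_factor, dtype=bool)
    sens = (screened_pos & outcome).sum() / outcome.sum()
    spec = (~screened_pos & ~outcome).sum() / (~outcome).sum()
    return sens, spec

# Hypothetical toy data: waist circumference (cm) and >=1 risk factor present
wc   = [75, 83, 91, 78, 88, 95, 72, 81]
risk = [0, 1, 1, 0, 0, 1, 0, 1]
print(sens_spec(wc, 80.0, risk))  # -> (1.0, 0.75) on this toy data
```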
One Third More: Maine Head Start Expansion with State Funds.
ERIC Educational Resources Information Center
Weil, Jane
The expansion of Project Head Start in Maine to the point of serving nearly 25 percent of eligible children is detailed in this report. Section I describes the expansion and some of its benefits, such as equalization of services across county boundaries and the establishment of a uniform unit cost-per-child for use in appropriating state funds.…
ERIC Educational Resources Information Center
Wright, Tanya S.; Gotwals, Amelia Wenk
2017-01-01
In this article, the authors first review the research literature to show why supporting talk from the start of school is important for students' long-term literacy development. The authors then define and describe disciplinary talk and argue that it is an important entry point into science and disciplinary literacy learning for young students.…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-10
... the event's start and finish points. The Coast Guard received information about the Riverhead Rocks Triathlon from the event sponsor, Event Power, on May 2, 2013. Event Power held the Riverhead Rocks... difficulty of rescheduling the early morning start of the swim event with the desired high tide cycle. While...
ERIC Educational Resources Information Center
Carter, Carolyn J.
2011-01-01
Charter school administrators weathered start-up woes, among them challenging students with unmet needs, verbal assaults, dwindling student enrollment, and a first-year budget deficit exceeding $200,000 to implement a reading improvement strategy that holds promise for helping at-risk students read better, a point particularly underscored…
Cosmological viability conditions for f(T) dark energy models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setare, M.R.; Mohammadipour, N., E-mail: rezakord@ipm.ir, E-mail: N.Mohammadipour@uok.ac.ir
2012-11-01
Recently f(T) modified teleparallel gravity, where T is the torsion scalar, has been proposed as a natural gravitational alternative for dark energy. We perform a detailed dynamical analysis of these models and find conditions for the cosmological viability of f(T) dark energy models as geometrical constraints on the derivatives of these models. We show that in the phase space there exist two cosmologically viable trajectories: (i) the universe starts from an unstable radiation point, then passes a saddle standard-matter point, which is followed by an accelerated-expansion de Sitter point; (ii) the universe starts from a saddle radiation epoch, then falls onto the stable matter era, and the system cannot evolve to the dark-energy-dominated epoch. Finally, for a number of f(T) dark energy models proposed in the literature, the viability conditions are investigated.
NASA Technical Reports Server (NTRS)
Appelbaum, Joseph; Singer, S.
1989-01-01
Direct current (dc) motors are used in terrestrial photovoltaic (PV) systems such as water-pumping systems for irrigation and water supply. Direct current motors may also be used for space applications. Simple and low-weight systems including dc motors may be of special interest in space where the motors are directly coupled to the solar cell array (with no storage). The system will operate only during times when sufficient insolation is available. An important performance characteristic of electric motors is the starting to rated torque ratio. Different types of dc motors have different starting torque ratios. These ratios are dictated by the size of the solar cell array, and the developed motor torque may not be sufficient to overcome the load starting torque. By including a maximum power point tracker (MPPT) in the PV system, the starting to rated torque ratio will increase, the amount of which depends on the motor type. The starting torque ratio is calculated for the permanent magnet, series and shunt excited dc motors when powered by solar cell arrays for two cases: with and without MPPTs. Defining a motor torque magnification by the ratio of the motor torque with an MPPT to the motor torque without an MPPT, a magnification of 3 was obtained for the permanent magnet motor and a magnification of 7 for both the series and shunt motors. The effect of the variation of solar insolation on the motor starting torque was covered. All motor types are less sensitive to insolation variation in systems including MPPTs as compared to systems without MPPTs. The analysis of this paper will assist the PV system designer in determining whether or not to include an MPPT in the system for a specific motor type.
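A minimal numerical sketch of the torque-magnification idea follows: a stalled permanent-magnet motor loaded directly onto a toy single-diode array versus through an ideal MPPT. All parameters are hypothetical, and the magnification obtained is illustrative rather than the paper's value of 3.

```python
import numpy as np

# Hypothetical single-diode array and permanent-magnet motor parameters.
ISC, I0, VT = 5.0, 1e-6, 1.8  # short-circuit current [A], diode saturation, thermal voltage
RA = 0.8                      # armature resistance [ohm]

def array_current(v):
    return ISC - I0 * np.expm1(v / VT)

# Without MPPT: a stalled motor (no back-EMF) loads the array with V = I*RA,
# so the operating point is the intersection of the I-V curve and the load line.
v = np.linspace(0.0, 30.0, 30001)
i_direct = v[np.argmin(np.abs(array_current(v) - v / RA))] / RA

# With MPPT: the full array maximum power is converted into the armature at
# stall, so P_mp = I**2 * RA (ideal, lossless converter assumed).
p = v * array_current(v)
i_mppt = np.sqrt(p.max() / RA)

# Permanent-magnet motor starting torque is proportional to armature current.
print(f"torque magnification ~ {i_mppt / i_direct:.1f}")
```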
A population-based study of stimulant drug treatment of ADHD and academic progress in children.
Zoëga, Helga; Rothman, Kenneth J; Huybrechts, Krista F; Ólafsson, Örn; Baldursson, Gísli; Almarsdóttir, Anna B; Jónsdóttir, Sólveig; Halldórsson, Matthías; Hernández-Diaz, Sonia; Valdimarsdóttir, Unnur A
2012-07-01
We evaluated the hypothesis that later start of stimulant treatment of attention-deficit/hyperactivity disorder adversely affects academic progress in mathematics and language arts among 9- to 12-year-old children. We linked nationwide data from the Icelandic Medicines Registry and the Database of National Scholastic Examinations. The study population comprised 11,872 children born in 1994-1996 who took standardized tests in both fourth and seventh grade. We estimated the probability of academic decline (drop of ≥ 5.0 percentile points) according to drug exposure and timing of treatment start between examinations. To limit confounding by indication, we concentrated on children who started treatment either early or later, but at some point between fourth-grade and seventh-grade standardized tests. In contrast with nonmedicated children, children starting stimulant treatment between their fourth- and seventh-grade tests were more likely to decline in test performance. The crude probability of academic decline was 72.9% in mathematics and 42.9% in language arts for children with a treatment start 25 to 36 months after the fourth-grade test. Compared with those starting treatment earlier (≤ 12 months after tests), the multivariable adjusted risk ratio (RR) for decline was 1.7 (95% confidence interval [CI]: 1.2-2.4) in mathematics and 1.1 (95% CI: 0.7-1.8) in language arts. The adjusted RR of mathematics decline with later treatment was higher among girls (RR, 2.7; 95% CI: 1.2-6.0) than boys (RR, 1.4; 95% CI: 0.9-2.0). Later start of stimulant drug treatment of attention-deficit/hyperactivity disorder is associated with academic decline in mathematics.
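The adjusted risk ratios above come from multivariable models; as a minimal sketch, a crude risk ratio with a Wald-type 95% CI can be computed from 2x2 counts (invented numbers, not the study's data):

```python
import math

def risk_ratio(events_exposed, n_exposed, events_ref, n_ref, z=1.96):
    """Crude risk ratio with a Wald 95% CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_ref / n_ref)
    se = math.sqrt(1/events_exposed - 1/n_exposed + 1/events_ref - 1/n_ref)
    lo, hi = (rr * math.exp(s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical counts: academic decline among late vs early treatment starters
print(risk_ratio(51, 70, 43, 100))  # -> (RR, lower CI, upper CI)
```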
Validating a Fidelity Scale to Understand Intervention Effects in Classroom-Based Studies
ERIC Educational Resources Information Center
Buckley, Pamela; Moore, Brooke; Boardman, Alison G.; Arya, Diana J.; Maul, Andrew
2017-01-01
K-12 intervention studies often include fidelity of implementation (FOI) as a mediating variable, though most do not report the validity of fidelity measures. This article discusses the critical need for validated FOI scales. To illustrate our point, we describe the development and validation of the Implementation Validity Checklist (IVC-R), an…
Cozzarini, Cesare; Rancati, Tiziana; Palorini, Federica; Avuzzi, Barbara; Garibaldi, Elisabetta; Balestrini, Damiano; Cante, Domenico; Munoz, Fernando; Franco, Pierfrancesco; Girelli, Giuseppe; Sini, Carla; Vavassori, Vittorio; Valdagni, Riccardo; Fiorino, Claudio
2017-10-01
Urinary incontinence following radiotherapy (RT) for prostate cancer (PCa) has a relevant impact on patients' quality of life. The aim of the study was to assess the unknown dose-effect relationship for late patient-reported urinary incontinence (LPRUI). Patients were enrolled within the multi-centric study DUE01. Clinical and dosimetry data, including the prescribed 2 Gy equivalent dose (EQD2), were prospectively collected. LPRUI was evaluated through the ICIQ-SF questionnaire filled in by the patients at RT start/end and thereafter every 6 months. Patients were treated with conventional (74-80 Gy, 1.8-2 Gy/fr) or moderately hypo-fractionated RT (65-75.2 Gy, 2.2-2.7 Gy/fr) in 5 fractions/week with intensity-modulated radiotherapy. Six different end-points for 3-year LPRUI, including or excluding the patient's perception (subjective and objective end-points, respectively), were considered. Multivariable logistic models were developed for each end-point. Data of 298 patients were analyzed. The incidence of the most severe end-point (ICIQ-SF > 12) was 5.1%. EQD2 calculated with α/β = 0.8 Gy showed the best performance in fitting the data: the risk of LPRUI markedly increased for EQD2 > 80 Gy. Previous abdominal/pelvic surgery and previous TURP were the clinical factors most significantly predictive of LPRUI. Models showed excellent performance in terms of goodness-of-fit and calibration, confirmed by bootstrap-based internal validation. When included in the analyses, baseline symptoms were a major predictor for five of the six end-points. LPRUI after RT for PCa strongly depends on EQD2 and a few clinical factors. Results are consistent with a larger than expected impact of moderate hypo-fractionation on the risk of LPRUI. As expected, baseline symptoms, as captured by the ICIQ-SF, are associated with an increased risk of LPRUI. Copyright © 2017 Elsevier B.V. All rights reserved.
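The EQD2 used above follows the standard linear-quadratic conversion; a minimal sketch with the abstract's α/β = 0.8 Gy and two schedules drawn from the stated dose ranges:

```python
def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy=0.8):
    """2 Gy-equivalent dose from the linear-quadratic model:
    EQD2 = D * (d + a/b) / (2 + a/b), with a/b = 0.8 Gy as in the abstract."""
    return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

# A conventional and a moderately hypo-fractionated schedule from the ranges above:
print(f"{eqd2(78.0, 2.0):.1f} Gy")   # 78 Gy at 2.0 Gy/fr  -> 78.0 Gy
print(f"{eqd2(70.2, 2.7):.1f} Gy")   # 70.2 Gy at 2.7 Gy/fr -> 87.8 Gy (> 80 Gy threshold)
```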
LESTO: an Open Source GIS-based toolbox for LiDAR analysis
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino
2015-04-01
During the last five years different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them also produced publicly available software. In the field of forestry there are several examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), which means that the source code is not shared with the public for anyone to look at or make changes to. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters over large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to: 1) preprocessing of LiDAR raw data mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of the intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of the vegetation parameters. We decided to follow the single-tree-based approach, starting with the implementation of some of the most used algorithms in the literature. These have been tweaked and applied on LiDAR-derived raster datasets (DTM, DSM) as well as point clouds of raw data. The methods range from the simple extraction of tops and crowns from local maxima, through the region growing method and the watershed method, to individual tree segmentation on point clouds. The validation procedure consists of matching field and LiDAR-derived measurements at the individual tree and plot level. An automatic validation procedure has been developed based on a Particle Swarm (PS) optimizer and a matching procedure which takes the position and height of the extracted trees with respect to the measured ones and iteratively tries to improve the candidate solution by changing the models' parameters. Examples of application of the LESTO tools will be presented on test sites. The test area consists of a series of circular sampling plots randomly selected from a 50x50 m regular grid within a buffer zone of 150 m from the forest road. Other studies on the same sites provide reference measurements of position, diameter, species and height and proposed allometric relationships. These allometric relationships were obtained for each species by deriving the stem volume of single trees based on height and diameter at breast height. LESTO is integrated in the JGrassTools project and available for download at www.jgrasstools.org. A simple and easy to use graphical interface to run the models is available at https://github.com/moovida/STAGE/releases.
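As a minimal sketch of the local-maxima step in the single-tree approach described above (not LESTO's own implementation), tree tops can be detected on a canopy height model with a moving-window maximum filter; the window size and height threshold are assumptions.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def tree_tops(chm, window=5, min_height=2.0):
    """Single-tree detection by local maxima on a canopy height model (CHM).
    A pixel is a candidate tree top if it equals the maximum of its
    neighbourhood and exceeds a minimum vegetation height."""
    local_max = maximum_filter(chm, size=window) == chm
    return np.argwhere(local_max & (chm > min_height))

chm = np.random.default_rng(0).gamma(2.0, 3.0, size=(100, 100))  # toy CHM [m]
print(len(tree_tops(chm)), "candidate tops")
```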
NASA Technical Reports Server (NTRS)
Appelbaum, J.; Singer, S.
1989-01-01
A calculation of the starting torque ratio of permanent magnet, series, and shunt-excited dc motors powered by solar cell arrays is presented for two cases, i.e., with and without a maximum-power-point tracker (MPPT). Defining motor torque magnification by the ratio of the motor torque with an MPPT to the motor torque without an MPPT, a magnification of 3 for the permanent magnet motor and a magnification of 7 for both the series and shunt motors are obtained. The study also shows that all motor types are less sensitive to solar insolation variation in systems including MPPTs as compared to systems without MPPTs.
[The forgotten capitulation of evidence-based medicine].
Schoemaker, Casper G; Smulders, Yvo M
2015-01-01
In 1992, the Canadian physician Gordon Guyatt wrote an article that is generally regarded as the starting point of evidence-based medicine (EBM). He described the ideas behind the McMaster residency programme for 'evidence-based practitioners', founded by David Sackett. Eight years later, in 2000, Guyatt concluded that this programme was too ambitious. In a new publication he described most doctors as 'evidence-users'. This editorial marks the transition from an individual to a collective form of EBM, emphasizing the use of evidence-based guidelines. The starting point of this collective form of EBM is not the well-known 1992 paper, but the forgotten editorial in 2000, which was described by Guyatt's colleagues as the capitulation of EBM.
D'Autry, Ward; Zheng, Chao; Bugalama, John; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Wang, Bochu; Van Schepdael, Ann
2011-07-15
Residual solvents are volatile organic compounds which can be present in pharmaceutical substances. A generic static headspace-gas chromatography analysis method for the identification and control of residual solvents is described in the European Pharmacopoeia. Although this method has proved to be suitable for the majority of samples and residual solvents, it may lack sensitivity for high boiling point residual solvents such as N,N-dimethylformamide, N,N-dimethylacetamide, dimethyl sulfoxide and benzyl alcohol. In this study, liquid paraffin was investigated as a new dilution medium for the analysis of these residual solvents. The headspace-gas chromatography method was developed and optimized taking the official Pharmacopoeia method as a starting point. The optimized method was validated according to ICH criteria. It was found that the detection limits were below 1 μg/vial for each compound, indicating a drastically increased sensitivity compared to the Pharmacopoeia method, which failed to detect the compounds at their respective limit concentrations. Linearity was evaluated based on the R² values, which were above 0.997 for all compounds, and inspection of residual plots. Instrument and method precision were examined by calculating the relative standard deviations (RSD) of repeated analyses within the linearity and accuracy experiments, respectively. It was found that all RSD values were below 10%. Accuracy was checked by a recovery experiment at three different levels. Mean recovery values were all in the range 95-105%. Finally, the optimized method was applied to residual DMSO analysis in four different Kollicoat(®) sample batches. Copyright © 2011 Elsevier B.V. All rights reserved.
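A minimal sketch of the ICH-style figures of merit reported above (linearity R², mean recovery, RSD), computed here from invented calibration and recovery data:

```python
import numpy as np

def validation_metrics(conc, response, found, added):
    """Linearity (R^2 of a straight-line fit), mean recovery [%] and RSD [%].
    All inputs below are toy numbers, not the study's measurements."""
    conc, response = np.asarray(conc), np.asarray(response)
    slope, intercept = np.polyfit(conc, response, 1)
    pred = slope * conc + intercept
    r2 = 1.0 - np.sum((response - pred) ** 2) / np.sum((response - response.mean()) ** 2)
    recovery = 100.0 * np.mean(np.asarray(found) / np.asarray(added))
    rsd = 100.0 * np.std(found, ddof=1) / np.mean(found)
    return r2, recovery, rsd

conc     = [0.5, 1.0, 2.0, 4.0, 8.0]        # hypothetical [ug/vial]
response = [11.8, 24.1, 47.9, 96.4, 193.0]  # hypothetical peak areas
print(validation_metrics(conc, response, found=[1.96, 2.02, 1.99], added=[2.0] * 3))
```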
Method and apparatus for high speed data acquisition and processing
Ferron, J.R.
1997-02-11
A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.
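A minimal sketch of the mask-bit idea described in this patent record: prestored sign/exponent bits are OR-ed with a 14-bit sample so the resulting 32-bit word parses as a valid IEEE-754 float. The specific bit layout used here is an assumption for illustration, not the patent's actual layout.

```python
import struct

MASK = 0x3F800000  # IEEE-754 pattern of 1.0f: prestored sign/exponent bits

def to_float_word(sample14: int) -> float:
    """Concatenate prestored mask bits with a 14-bit sample so the 32-bit
    word is a valid float (here in [1.0, 2.0)). Illustrative layout only."""
    word = MASK | (sample14 & 0x3FFF)  # 14 data bits land in the mantissa
    return struct.unpack("<f", struct.pack("<I", word))[0]

def from_float_word(f: float) -> int:
    """Recover the 14-bit sample from the float word's low mantissa bits."""
    word = struct.unpack("<I", struct.pack("<f", f))[0]
    return word & 0x3FFF

print(from_float_word(to_float_word(12345)))  # -> 12345
```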
40 CFR 1065.514 - Cycle-validation criteria for operation over specified duty cycles.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Cycle-validation criteria for... Over Specified Duty Cycles § 1065.514 Cycle-validation criteria for operation over specified duty...-validation criteria. You must compare the original reference duty cycle points generated as described in...
40 CFR 1065.514 - Cycle-validation criteria for operation over specified duty cycles.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Cycle-validation criteria for... Over Specified Duty Cycles § 1065.514 Cycle-validation criteria for operation over specified duty...-validation criteria. You must compare the original reference duty cycle points generated as described in...
40 CFR 1065.514 - Cycle-validation criteria for operation over specified duty cycles.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Cycle-validation criteria for... Over Specified Duty Cycles § 1065.514 Cycle-validation criteria for operation over specified duty...-validation criteria. You must compare the original reference duty cycle points generated as described in...
40 CFR 1065.514 - Cycle-validation criteria for operation over specified duty cycles.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Cycle-validation criteria for... Over Specified Duty Cycles § 1065.514 Cycle-validation criteria for operation over specified duty...-validation criteria. You must compare the original reference duty cycle points generated as described in...
NASA Astrophysics Data System (ADS)
Famiglietti, C.; Fisher, J.; Halverson, G. H.
2017-12-01
This study validates a method of remote sensing near-surface meteorology that vertically interpolates MODIS atmospheric profiles to surface pressure level. The extraction of air temperature and dew point observations at a two-meter reference height from 2001 to 2014 yields global moderate- to fine-resolution near-surface temperature distributions that are compared to geographically and temporally corresponding measurements from 114 ground meteorological stations distributed worldwide. This analysis is the first robust, large-scale validation of the MODIS-derived near-surface air temperature and dew point estimates, both of which serve as key inputs in models of energy, water, and carbon exchange between the land surface and the atmosphere. Results show strong linear correlations between remotely sensed and in-situ near-surface air temperature measurements (R2 = 0.89), as well as between dew point observations (R2 = 0.77). Performance is relatively uniform across climate zones. The extension of mean climate-wise percent errors to the entire remote sensing dataset allows for the determination of MODIS air temperature and dew point uncertainties on a global scale.
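A minimal sketch of the headline comparison above: the coefficient of determination between paired satellite and station temperatures (toy values, not the study's 114-station dataset). For a simple linear fit, R² equals the squared Pearson correlation.

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination of a linear fit between paired series."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

station = np.array([21.3, 14.8, 2.1, 28.9, 9.4, 17.6])  # hypothetical degC
modis   = np.array([20.1, 15.9, 3.5, 27.2, 10.8, 16.9])
print(f"R^2 = {r_squared(station, modis):.2f}")
```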
Pedestrian Pathfinding in Urban Environments: Preliminary Results
NASA Astrophysics Data System (ADS)
López-Pazos, G.; Balado, J.; Díaz-Vilariño, L.; Arias, P.; Scaioni, M.
2017-12-01
With the rise of urban population, many initiatives focus on the smart city concept, in which mobility of citizens arises as one of the main components. Updated and detailed spatial information about outdoor environments is needed for accurate path planning for pedestrians, especially for people with reduced mobility, for whom physical barriers should be considered. This work presents a methodology to use point clouds for direct path planning. The starting point is a classified point cloud in which ground elements have been previously classified as roads, sidewalks, crosswalks, curbs and stairs. The remaining points compose the obstacle class. The methodology starts by individualizing ground elements and simplifying them into representative points, which are used as nodes in the graph creation. The region of influence of obstacles is used to refine the graph. Edges of the graph are weighted according to the distance between nodes and according to their accessibility for wheelchairs. As a result, we obtain a very accurate graph representing the as-built environment. The methodology has been tested in a couple of real case studies, and the Dijkstra algorithm was used for pathfinding. The resulting paths represent the optimum according to motor skills and safety.
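A minimal sketch of Dijkstra pathfinding over such an accessibility-weighted graph (toy graph; the stairs edge carries an inflated weight for a wheelchair profile):

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a weighted graph; edge weights can encode both
    distance and wheelchair accessibility (e.g. inflated cost for stairs)."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], goal
    while node != start:  # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Toy graph: the C-D edge represents stairs and is heavily penalized.
g = {"A": [("B", 10), ("C", 4)], "B": [("D", 2)],
     "C": [("B", 3), ("D", 100)], "D": []}
print(dijkstra(g, "A", "D"))  # -> ['A', 'C', 'B', 'D']
```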
78 FR 42592 - Proposed Collection; Comment Request for Form 1120-ND
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-16
... the collection of information displays a valid OMB control number. Books or records relating to a... collection techniques or other forms of information technology; and (e) estimates of capital or start-up...
78 FR 60378 - Proposed Collection; Comment Request for Form 1023
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... displays a valid OMB control number. Books or records relating to a collection of information must be... information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance...
78 FR 36638 - Proposed Collection; Comment Request for Form 8916-A
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... the collection of information displays a valid OMB control number. Books or records relating to a... collection techniques or other forms of information technology; and (e) estimates of capital or start-up...
Aguilar-Navarro, Sara Gloria; Fuentes-Cantú, Alejandro; Avila-Funes, José Alberto; García-Mayo, Emilio José
2007-01-01
To assess the validity and reliability of a geriatric depression questionnaire used in the Mexican Health and Aging Study (MHAS). The study was conducted at the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán (INCMNSZ) clinic from May 2005 to March 2006. This nine-item depression screening questionnaire was validated against the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR, fourth revised version) and Yesavage's 15-item Geriatric Depression Scale (GDS-15) criteria. The instrument belongs to the MHAS, a prospective panel study of health and aging in Mexico. A total of 199 subjects 65 years of age and older participated in the validation process (median age = 79.5 years). The MHAS questionnaire result was significantly correlated with the clinical depression diagnosis (p < 0.001) and with the GDS-15 score (p < 0.001). Internal consistency was adequate (alpha coefficient: 0.74). The cutoff point ≥ 5/9 points yielded 80.7% sensitivity and 68.7% specificity. Test-retest reliability was excellent (intra-class correlation coefficient = 0.933). Finally, the Bland and Altman agreement analysis indicated a difference of 0.22 percentage points between test and retest. The MHAS questionnaire is valid and reliable, and allows screening in the research field for the presence of depression in the elderly.
University Students' Grasp of Inflection Points
ERIC Educational Resources Information Center
Tsamir, Pessia; Ovodenko, Regina
2013-01-01
This paper describes university students' grasp of inflection points. The participants were asked what inflection points are, to mark inflection points on graphs, to judge the validity of related statements, and to find inflection points by investigating (1) a function, (2) the derivative, and (3) the graph of the derivative. We found four…
Vacca, Davide; Cancila, Valeria; Gulino, Alessandro; Lo Bosco, Giosuè; Belmonte, Beatrice; Di Napoli, Arianna; Florena, Ada Maria; Tripodo, Claudio; Arancio, Walter
2018-02-01
The MinION is a miniaturized high-throughput next generation sequencing platform of novel conception. The use of nucleic acids derived from formalin-fixed paraffin-embedded samples is highly desirable, but their adoption for molecular assays is hindered by the high degree of fragmentation and by the chemically induced mutations stemming from the fixation protocols. In order to investigate the suitability of MinION sequencing for formalin-fixed paraffin-embedded samples, the presence and frequency of the BRAF c.1799T>A mutation was investigated in two archival tissue specimens of Hairy cell leukemia and Hairy cell leukemia Variant. Despite the poor quality of the starting DNA, the BRAF mutation was successfully detected in the Hairy cell leukemia sample, with around 50% of the reads obtained within 2 h of the sequencing start. Notably, the mutational burden of the Hairy cell leukemia sample as derived from nanopore sequencing proved to be comparable to that obtained with a sensitive method for the detection of point mutations, namely digital PCR, using a validated assay. Nanopore sequencing can thus be adopted for targeted sequencing of genetic lesions in critical DNA samples such as those extracted from archival routine formalin-fixed paraffin-embedded samples. This result suggests that nanopore sequencing could be reliably adopted for the real-time targeted sequencing of genetic lesions. Our report opens the way for the adoption of nanopore sequencing in molecular pathology for research and diagnostics.
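A minimal sketch of the read-level quantification implied above: the fraction of base calls supporting the c.1799T>A change at a single pileup position (invented calls, roughly matching the ~50% mutant reads reported).

```python
from collections import Counter

def allele_fraction(base_calls, alt="A", ref="T"):
    """Fraction of reads supporting a point mutation (e.g. BRAF c.1799T>A)
    at one position of a read pileup. Toy pileup, not real nanopore data."""
    counts = Counter(base_calls)
    covered = counts[ref] + counts[alt]
    return counts[alt] / covered if covered else 0.0

pileup = list("TATATAATTTAATATAATAT")  # hypothetical base calls at c.1799
print(f"mutant fraction = {allele_fraction(pileup):.2f}")  # -> 0.50
```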
Analysis of a Channeled Centerbody Supersonic Inlet for F-15B Flight Research
NASA Technical Reports Server (NTRS)
Ratnayake, Nalin A.
2010-01-01
The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on the NASA F-15B airplane, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. The first experiment that is to be flown on the test fixture is the Channeled Centerbody Inlet Experiment. The objectives of this project at Dryden are twofold: 1) flight evaluation of an innovative new approach to variable geometry for high-speed inlets, and 2) flight validation of channeled inlet performance prediction by complex computational fluid dynamics codes. The inlet itself is a fixed-geometry version of a mixed-compression, variable-geometry, supersonic inlet developed by TechLand Research, Inc. (North Olmsted, Ohio) to improve the efficiency of supersonic flight at off-nominal conditions. The concept utilizes variable channels in the centerbody section to vary the mass flow of the inlet, enabling efficient operation at a range of flight conditions. This study is particularly concerned with the starting characteristics of the inlet. Computational fluid dynamics studies were shown to align well with analytical predictions, showing the inlet to remain unstarted as designed at the primary test point of Mach 1.5 at an equivalent pressure altitude of 29,500 ft local conditions. Mass-flow-related concerns such as the inlet start problem, as well as inlet efficiency in terms of total pressure loss, are assessed using the flight test geometry.
Gupta, Shaloo; Wang, Hongwei; Skolnik, Neil; Tong, Liyue; Liebert, Ryan M; Lee, Lulu K; Stella, Peter; Cali, Anna; Preblick, Ronald
2018-01-01
Usage patterns and effectiveness of a longer-acting formulation of insulin glargine at a strength of 300 units per milliliter (Gla-300) have not been studied in real-world clinical practice. This study evaluated differences in dosing and clinical outcomes before and after Gla-300 treatment initiation in patients with type 2 diabetes starting or switching to treatment with Gla-300 to assess whether the benefits observed in clinical trials translate into real-world settings. This was a retrospective observational study using medical record data obtained by physician survey for patients starting treatment with insulin glargine at a strength of 100 units per milliliter (Gla-100) or Gla-300, or switching to treatment with Gla-300 from treatment with another basal insulin (BI). Differences in dosing and clinical outcomes before versus after treatment initiation or switching were examined by generalized linear mixed-effects models. Among insulin-naive patients starting BI treatment, no difference in the final titrated dose was observed in patients starting Gla-300 treatment versus those starting Gla-100 treatment [least-squares (LS) mean 0.43 units per kilogram vs 0.44 units per kilogram; P = 0.77]. Both groups had significant hemoglobin A1c level reductions (LS mean 1.21 percentage points for Gla-300 and 1.12 percentage points for Gla-100; both P < 0.001). The relative risk of hypoglycemic events after Gla-300 treatment initiation was lower than that after Gla-100 treatment initiation [0.31, 95% confidence interval (CI) 0.12-0.81; P = 0.018] at similar daily doses. The daily dose of BI was significantly lower after switching to treatment with Gla-300 from treatment with another BI (0.73 units per kilogram before switch vs 0.58 units per kilogram after switch; P = 0.02). The mean hemoglobin A1c level was significantly lower after switching than before switching (adjusted difference -0.95 percentage points, 95% CI -1.13 to -0.78 percentage points; P < 0.0001). Hypoglycemic events per patient-year were significantly lower (relative risk 0.17, 95% CI 0.11-0.26; P < 0.0001). Insulin-naive patients starting Gla-300 treatment had fewer hypoglycemic events, a similar hemoglobin A1c level reduction, and no difference in insulin dose versus patients starting Gla-100 treatment. Patients switching to Gla-300 treatment from treatment with other BIs had significantly lower daily doses of BI, with fewer hypoglycemic events, without compromise of hemoglobin A1c level reduction. These findings suggest Gla-300 in a real-world setting provides benefits in terms of dosing, with improved hemoglobin A1c level and hypoglycemia rates. Sanofi US Inc. (Bridgewater, NJ, USA).
Bakris, George L; Pitt, Bertram; Weir, Matthew R; Freeman, Mason W; Mayo, Martha R; Garza, Dahlia; Stasiv, Yuri; Zawadzki, Rezi; Berman, Lance; Bushinsky, David A
2015-07-14
Hyperkalemia is a potentially life-threatening condition predominantly seen in patients treated with renin-angiotensin-aldosterone system (RAAS) inhibitors with stage 3 or greater chronic kidney disease (CKD) who may also have diabetes, heart failure, or both. To select starting doses for a phase 3 study and to evaluate the long-term safety and efficacy of a potassium-binding polymer, patiromer, in outpatients with hyperkalemia. Phase 2, multicenter, open-label, dose-ranging, randomized clinical trial (AMETHYST-DN), conducted at 48 sites in Europe from June 2011 to June 2013 evaluating patiromer in 306 outpatients with type 2 diabetes (estimated glomerular filtration rate, 15 to <60 mL/min/1.73 m2 and serum potassium level >5.0 mEq/L). All patients received RAAS inhibitors prior to and during study treatment. Patients were stratified by baseline serum potassium level into mild or moderate hyperkalemia groups and received 1 of 3 randomized starting doses of patiromer (4.2 g [n = 74], 8.4 g [n = 74], or 12.6 g [n = 74] twice daily [mild hyperkalemia] or 8.4 g [n = 26], 12.6 g [n = 28], or 16.8 g [n = 30] twice daily [moderate hyperkalemia]). Patiromer was titrated to achieve and maintain serum potassium level 5.0 mEq/L or lower. The primary efficacy end point was mean change in serum potassium level from baseline to week 4 or prior to initiation of dose titration. The primary safety end point was adverse events through 52 weeks. Secondary efficacy end points included mean change in serum potassium level through 52 weeks. A total of 306 patients were randomized. The least squares mean reduction from baseline in serum potassium level at week 4 or time of first dose titration in patients with mild hyperkalemia was 0.35 (95% CI, 0.22-0.48) mEq/L for the 4.2 g twice daily starting-dose group, 0.51 (95% CI, 0.38-0.64) mEq/L for the 8.4 g twice daily starting-dose group, and 0.55 (95% CI, 0.42-0.68) mEq/L for the 12.6 g twice daily starting-dose group. In those with moderate hyperkalemia, the reduction was 0.87 (95% CI, 0.60-1.14) mEq/L for the 8.4 g twice daily starting-dose group, 0.97 (95% CI, 0.70-1.23) mEq/L for the 12.6 g twice daily starting-dose group, and 0.92 (95% CI, 0.67-1.17) mEq/L for the 16.8 g twice daily starting-dose group (P < .001 for all changes vs baseline by hyperkalemia starting-dose groups within strata). From week 4 through week 52, statistically significant mean decreases in serum potassium levels were observed at each monthly point in patients with mild and moderate hyperkalemia. Over the 52 weeks, hypomagnesemia (7.2%) was the most common treatment-related adverse event, mild to moderate constipation (6.3%) was the most common gastrointestinal adverse event, and hypokalemia (<3.5 mEq/L) occurred in 5.6% of patients. Among patients with hyperkalemia and diabetic kidney disease, patiromer starting doses of 4.2 to 16.8 g twice daily resulted in statistically significant decreases in serum potassium level after 4 weeks of treatment, lasting through 52 weeks. clinicaltrials.gov Identifier:NCT01371747.
Acoustic and Perceptual Effects of Left–Right Laryngeal Asymmetries Based on Computational Modeling
Samlan, Robin A.; Story, Brad H.; Lotto, Andrew J.; Bunton, Kate
2015-01-01
Purpose: Computational modeling was used to examine the consequences of 5 different laryngeal asymmetries on acoustic and perceptual measures of vocal function. Method: A kinematic vocal fold model was used to impose 5 laryngeal asymmetries: adduction, edge bulging, nodal point ratio, amplitude of vibration, and starting phase. Thirty /a/ and /I/ vowels were generated for each asymmetry and analyzed acoustically using cepstral peak prominence (CPP), harmonics-to-noise ratio (HNR), and 3 measures of spectral slope (H1*-H2*, B0-B1, and B0-B2). Twenty listeners rated voice quality for a subset of the productions. Results: Increasingly asymmetric adduction, bulging, and nodal point ratio explained significant variance in perceptual rating (R2 = .05, p < .001). The same factors resulted in generally decreasing CPP, HNR, and B0-B2 and in increasing B0-B1. Of the acoustic measures, only CPP explained significant variance in perceived quality (R2 = .14, p < .001). Increasingly asymmetric amplitude of vibration or starting phase minimally altered vocal function or voice quality. Conclusion: Asymmetries of adduction, bulging, and nodal point ratio drove acoustic measures and perception in the current study, whereas asymmetric amplitude of vibration and starting phase demonstrated minimal influence on the acoustic signal or voice quality. PMID:24845730
NASA Astrophysics Data System (ADS)
Qiao, Y.
2013-12-01
With China's economic development, water pollution incidents have happened frequently. For example, cyanobacterial bloom events repeatedly occur in Taihu Lake. In this research, we investigate pollutant solute transport starting at different points, for eutrophication substances such as nitrogen and phosphorus, with the Lattice Boltzmann Method (LBM) performed on real pore geometries. The LBM has emerged as a powerful tool for simulating the behaviour of multi-component fluid systems in complex pore networks. We will build a quick-response simulation system based on high-resolution GIS maps, using the LBM numerical method. When the transport starts from two different points in Meiliang Bay near Wuxi City, it is shown that the pollutant solute cannot be transported out of the bay to influence the rest of Taihu Lake, and the diffusion areas are similar. On the other hand, when the start point is in the central region of Taihu Lake, it is found that the pollutant solute covers almost the whole area of the lake and cyanobacteria bloom under favorable conditions. In the same way, if a cyanobacterial bloom is transported in the central area, it will pollute the whole of Taihu Lake. Therefore, when we monitor and deal with eutrophication substances, we need to focus on the central area of the lake.
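A minimal D2Q9 lattice-Boltzmann sketch for a passive pollutant released at a single start point and advected by a uniform flow; this is a toy illustration of the method only, with no real lake geometry or GIS input.

```python
import numpy as np

# D2Q9 lattice weights and velocities.
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
NX, NY, TAU = 200, 100, 0.8          # grid size and relaxation time
U = np.array([0.05, 0.0])            # prescribed uniform velocity (lattice units)

def feq(c):
    # Equilibrium for an advection-diffusion LBM: w_i * C * (1 + 3 e_i . u)
    return W[:, None, None] * c * (1.0 + 3.0 * (E @ U))[:, None, None]

conc = np.zeros((NX, NY))
conc[20, 50] = 1.0                   # point release ("start point")
f = feq(conc)
for _ in range(500):
    conc = f.sum(axis=0)             # macroscopic concentration
    f += (feq(conc) - f) / TAU       # BGK collision
    for i, (ex, ey) in enumerate(E): # streaming step (periodic boundaries)
        f[i] = np.roll(np.roll(f[i], ex, axis=0), ey, axis=1)
print("plume centre ->", np.unravel_index(conc.argmax(), conc.shape))
```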
Effort, symptom validity testing, performance validity testing and traumatic brain injury.
Bigler, Erin D
2014-01-01
To understand the neurocognitive effects of brain injury, valid neuropsychological test findings are paramount. This review examines the research on what has been referred to as symptom validity testing (SVT). Performance above a designated cut-score signifies a 'passing' SVT performance, which is likely the best indicator of valid neuropsychological test findings. Likewise, performance substantially below the cut-point, at or near chance, signifies invalid test performance. Performance significantly below chance is the sine qua non neuropsychological indicator of malingering. However, the interpretative problems with SVT performance below the cut-point yet far above chance are substantial, as pointed out in this review. This intermediate, border-zone performance on SVT measures is where substantial interpretative challenges exist. Case studies are used to highlight the many areas where additional research is needed. Historical perspectives are reviewed along with the neurobiology of effort. Reasons why performance validity testing (PVT) may be a better term than SVT are reviewed. Advances in neuroimaging techniques may be key to better understanding the meaning of border-zone SVT failure. The review demonstrates the problems with rigidity in interpretation of established cut-scores. A better understanding is needed of how certain types of neurological, neuropsychiatric and/or test conditions may affect SVT performance.
Development and Validation of a Photonumeric Scale for Evaluation of Volume Deficit of the Temple
Jones, Derek; Hardas, Bhushan; Murphy, Diane K.; Donofrio, Lisa; Sykes, Jonathan M.; Carruthers, Alastair; Creutz, Lela; Marx, Ann; Dill, Sara
2016-01-01
BACKGROUND A validated scale is needed for objective and reproducible comparisons of temple appearance before and after aesthetic treatment in practice and clinical studies. OBJECTIVE To describe the development and validation of the 5-point photonumeric Allergan Temple Hollowing Scale. METHODS The scale was developed to include an assessment guide, verbal descriptors, morphed images, and real subject images for each grade. The clinical significance of a 1-point score difference was evaluated in a review of image pairs representing varying differences in severity. Interrater and intrarater reliability was evaluated in a live-subject validation study (N = 298) completed during 2 sessions occurring 3 weeks apart. RESULTS A score difference of ≥1 point was shown to reflect a clinically significant difference (mean [95% confidence interval] absolute score difference, 1.1 [0.94–1.26] for clinically different image pairs and 0.67 [0.51–0.83] for not clinically different pairs). Intrarater agreement between the 2 validation sessions was almost perfect (mean weighted kappa = 0.86). Interrater agreement was almost perfect during the second session (0.81, primary endpoint). CONCLUSION The Allergan Temple Hollowing Scale is a validated and reliable scale for physician rating of temple volume deficit. PMID:27661742
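A minimal sketch of the agreement statistic used above, a weighted kappa on ordinal 5-point scale scores, using scikit-learn; the ratings below are invented, not the validation study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores (grades 0-4) given by one rater at two sessions;
# linear weights penalize larger ordinal disagreements more heavily.
rater_session_1 = [0, 1, 2, 2, 3, 4, 1, 3, 2, 0]
rater_session_2 = [0, 1, 2, 3, 3, 4, 1, 2, 2, 0]
kappa = cohen_kappa_score(rater_session_1, rater_session_2, weights="linear")
print(f"weighted kappa = {kappa:.2f}")
```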
Preparing for Workplace Numeracy: A Modelling Perspective
ERIC Educational Resources Information Center
Wake, Geoff
2015-01-01
The starting point of this article is the question, "how might we inform an epistemology of numeracy from the point of view of better preparing young people for workplace competence?" To inform thinking, illustrative data are drawn from two projects that researched mathematics in workplace activity and the teaching and learning of modelling in…
Replacement Planning: A Starting Point for Succession Planning and Talent Management
ERIC Educational Resources Information Center
Rothwell, William J.
2011-01-01
Replacement planning is a process of identifying short-term or long-term backups so that organizations have people who can assume responsibility for critical positions during emergencies. Individuals identified as "replacements" are not promised promotions; rather, they are prepared to the point where they can assume a critical position long…
40 CFR 1054.145 - Are there interim provisions that apply only for a limited time?
Code of Federal Regulations, 2010 CFR
2010-07-01
... scheduled emission-related maintenance falls within 10 hours of a test point, delay the maintenance until the engine reaches the test point. Measure emissions before and after performing the maintenance. Use... example, for the fuel line permeation standards starting in 2012, equipment manufacturers may order a...
Stabilizing Crystal Oscillators With Melting Metals
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Miller, C. G.
1984-01-01
Heat of fusion provides extended period of constant temperature and frequency. Crystal surrounded by metal in spherical container. As outside temperature rises to melting point of metal, metal starts to liquefy; but temperature stays at melting point until no solid metal remains. Potential terrestrial applications include low-power environmental telemetering transmitters and instrumentation transmitters for industrial processes.
Fractured Connections: Migration and Holistic Models of Counselling
ERIC Educational Resources Information Center
Wright, Jeannie; Lang, Steve K. W.; Cornforth, Sue
2011-01-01
In this article we aim to explore those points at which migrant identity and landscape intersect. We also consider implications for holistic models of counselling with migrant groups. The New Zealand migration literature was the starting point to consider how and why the experience of migration has been studied. We asked how collective biography…
Indentations and Starting Points in Traveling Sales Tour Problems: Implications for Theory
ERIC Educational Resources Information Center
MacGregor, James N.
2012-01-01
A complete, non-trivial, traveling sales tour problem contains at least one "indentation", where nodes in the interior of the point set are connected between two adjacent nodes on the boundary. Early research reported that human tours exhibited fewer such indentations than expected. A subsequent explanation proposed that this was because…
Homogeneity of Moral Judgment? Apprentices Solving Business Conflicts.
ERIC Educational Resources Information Center
Beck, Klaus; Heinrichs, Karin; Minnameier, Gerhard; Parche-Kawik, Kirsten
In an ongoing longitudinal study that started in 1994, the moral development of business apprentices is being studied. The focal point of this project is a critical analysis of L. Kohlberg's thesis of homogeneity, according to which people should judge every moral issue from the point of view of their "modal" stage (the most frequently…
Eckberg, E.E.
1960-09-27
A multiple molecular vacuum pump capable of producing a vacuum of the order of 10/sup -9/ mm Hg is described. The pump comprises a casing of an aggregate of paired and matched cylindrical plates, a recessed portion on one face of each plate concentrically positioned formed by a radially extending wall and matching the similarly recessed portion of its twin plate of that pair of plates and for all paired and matched plates; a plurality of grooves formed in the radially extending walls of each and all recesses progressing in a spiral manner from their respective starting points out at the periphery of the recess inwardly to the central area; a plurality of rotors rotatably mounted to closely occupy the spaces as presented by the paired and matched recesses between all paired plates; a hollowed drive-shaft perforated at points adjacent to the termini of all spiral grooves; inlet ports at the starting points of all grooves and through all plates at common points to each respectively; and a common outlet passage presented by the hollow portion of the perforated hollowed drive-shaft of the molecular pump. (AEC)
ERIC Educational Resources Information Center
Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis
2013-01-01
A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…
ERIC Educational Resources Information Center
Kershaw, Amy
To address the growing demand for high-quality child care, many communities are seeking to develop specialized child care facilities funds to build new, and improve the quality of existing, child care programs. This toolkit is designed for policymakers, nonprofit leaders, child care providers, and others interested in increasing access to…
Starting School at a Disadvantage: The School Readiness of Poor Children. The Social Genome Project
ERIC Educational Resources Information Center
Isaacs, Julia B.
2012-01-01
Poor children in the United States start school at a disadvantage in terms of their early skills, behaviors, and health. Fewer than half (48 percent) of poor children are ready for school at age five, compared to 75 percent of children from families with moderate and high income, a 27 percentage point gap. This paper examines the reasons why poor…
Passmore, Erin; Shepherd, Brooke; Milat, Andrew; Maher, Louise; Hennessey, Kiel; Havrlant, Rachael; Maxwell, Michelle; Hodge, Wendy; Christian, Fiona; Richards, Justin; Mitchell, Jo
2017-12-13
Aboriginal people in Australia experience significant health burden from chronic disease. There has been limited research to identify effective healthy lifestyle programs to address risk factors for chronic disease among Aboriginal people. The Knockout Health Challenge is a community-led healthy lifestyle program for Aboriginal communities across New South Wales, Australia. An evaluation of the 2013 Knockout Health Challenge was undertaken. Participants' self-reported physical activity and diet were measured at four time points - at the start and end of the Challenge (via paper form), and 5 and 9 months after the Challenge (via telephone survey). Participants' weight was measured objectively at the start and end of the Challenge, and self-reported (via telephone survey) 5 and 9 months after the Challenge. Changes in body composition, physical activity and diet between time points were analysed using linear mixed models. As part of the telephone survey participants were also asked to identify other impacts of the Challenge; these were analysed descriptively (quantitative items) and thematically (qualitative items). A total of 586 people registered in 22 teams to participate in the Challenge. The mean weight at the start was 98.54 kg (SD 22.4), and 94% of participants were overweight or obese. Among participants who provided data at all four time points (n=122), the mean weight loss from the start to the end of the Challenge was 2.3 kg (95% CI -3.0 to -1.9, p<0.001), and from the start to 9 months after the Challenge was 2.3 kg (95% CI -3.3 to -1.3, p<0.001). Body mass index decreased by an average of 0.9 kg/m2 (95% CI -1.0 to -0.7, p<0.001) from the start to the end of the Challenge, and 0.8 kg/m2 (95% CI -1.2 to -0.4, p<0.001) 9 months after. At the end of the Challenge, participants reported they were more physically active and had increased fruit and vegetable consumption compared with the start of the Challenge, and identified a range of other positive impacts. The Challenge was effective in reducing weight and promoting healthy lifestyles among Aboriginal people across New South Wales, and has potential to contribute to closing the health gap between Aboriginal and non-Aboriginal people.
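The analysis named above fits linear mixed models to repeated measures. As a minimal sketch of that kind of model (a random intercept per participant and a fixed effect of time), here is an illustrative statsmodels fit on invented data; the evaluation's actual variables and model specification may differ.

```python
# Random-intercept mixed model for repeated weight measurements.
# The five-participant dataset is invented for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "id":   sum([[i] * 4 for i in range(1, 6)], []),   # participant IDs
    "time": [0, 1, 2, 3] * 5,                          # four time points
    "weight": [99, 98, 97, 97,
               105, 103, 102, 103,
               91, 90, 89, 88,
               110, 107, 106, 107,
               95, 94, 93, 92],
})

# groups= gives each participant their own random intercept
model = smf.mixedlm("weight ~ time", data, groups=data["id"]).fit()
print(model.params)   # fixed effect of time = mean within-person change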
FireBird - a small satellite fire monitoring mission: Status and first results
NASA Astrophysics Data System (ADS)
Lorenz, Eckehard; Rücker, Gernot; Terzibaschian, Thomas; Klein, Doris; Tiemann, Joachim
2014-05-01
The scientific mission FireBird is operated by the German Aerospace Center (DLR) and consists of two small satellites. The first satellite - TET-1 - was successfully launched from Baikonur, Russia in July 2012. Its first year in orbit was dedicated to a number of experiments within the framework of the DLR On Orbit Verification (OOV) program, which is dedicated to technology testing in space. After successful completion of its OOV phase, TET-1 was handed over to the DLR FireBird mission and is now a dedicated Earth Observation mission. Its primary goal is sensing of hot phenomena such as wildfires, volcanoes, gas flares and industrial hotspots. The second satellite, BiROS, is scheduled for launch in the second or third quarter of 2015. The satellite builds on the heritage of the DLR BIRD (BIspectral Infrared Detection) mission and delivers quantitative information (such as Fire Radiative Power, FRP) at a spatial resolution of 350 m, superior to any current fire-enabled satellite system such as NPP VIIRS, MODIS or Meteosat SEVIRI. The satellite is undergoing a four-month validation phase during which satellite operations are adapted to the new mission goals of FireBird and processing capacities are established to guarantee swift processing and delivery of high-quality data. The validation phase started with an informal Operational Readiness Review and will be completed with a formal review covering all aspects of the space and ground segments. The satellite is equipped with a camera with a 42 m ground pixel size in the red, green and near infrared spectral range, and a 370 m ground pixel size camera in the mid and thermal infrared with a swath of 185 km. The satellite can be pointed towards a target in order to enhance observation frequency. First results of the FireBird mission include a ground validation experiment and acquisitions over fires across the world. Once the validation phase is finished the data will be made available to a wide scientific community.
Measurement Properties of Instruments for Measuring of Lymphedema: Systematic Review.
Hidding, Janine T; Viehoff, Peter B; Beurskens, Carien H G; van Laarhoven, Hanneke W M; Nijhuis-van der Sanden, Maria W G; van der Wees, Philip J
2016-12-01
Lymphedema is a common complication of cancer treatment, resulting in swelling and subjective symptoms. Reliable and valid measurement of this side effect of medical treatment is important. The purpose of this study was to provide best evidence regarding which measurement instruments are most appropriate in measuring lymphedema in its different stages. The PubMed and Web of Science databases were used, and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Clinical studies on measurement instruments assessing lymphedema were reviewed using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) scoring instrument for quality assessment. Data on reliability, concurrent validity, convergent validity, sensitivity, specificity, applicability, and costs were extracted. Pooled data showed good intrarater intraclass correlation coefficients (ICCs) (.89) for bioimpedance spectroscopy (BIS) in the lower extremities and high intrarater and interrater ICCs for water volumetry, tape measurement, and perometry (.98-.99) in the upper extremities. In the upper extremities, the standard error of measurement was 3.6% (σ=0.7%) for water volumetry, 5.6% (σ=2.1%) for perometry, and 6.6% (σ=2.6%) for tape measurement. Sensitivity of tape measurement in the upper extremities, using different cutoff points, varied from 0.73 to 0.90, and specificity values varied from 0.72 to 0.78. No uniform definition of lymphedema was available, and a gold standard as a reference test was lacking. Items concerning risk of bias were study design, patient selection, description of lymphedema, blinding of test outcomes, and number of included participants. Measurement instruments with evidence for good reliability and validity were BIS, water volumetry, tape measurement, and perometry, where BIS can detect alterations in extracellular fluid in stage 1 lymphedema and the other measurement instruments can detect alterations in volume starting from stage 2. In research, water volumetry is indicated as a reference test for measuring lymphedema in the upper extremities. © 2016 American Physical Therapy Association.
NASA Astrophysics Data System (ADS)
Lana, Arancha; Fernández, Vicente; Orfila, Alejandro; Troupin, Charles; Tintoré, Joaquín
2015-04-01
SOCIB High Frequency (HF) radar is one component of a multi-platform system located in the Balearic Islands and made up of Lagrangian platforms (profilers and drifting buoys), fixed stations (sea-level, weather, mooring and coastal), beach monitoring (camera), gliders, a research vessel as well as an ocean forecast system (waves and hydrodynamics). The HF radar system overlooks the Ibiza Channel, known as a 'choke point' where Atlantic and Mediterranean water masses interact and where meridional exchanges of water mass properties between the Balearic and the Algerian sub-basins take place. In order to determine the reliability of surface velocity measurements in this area, a quality assessment of the HF radar is essential. We present the results of several validation experiments performed in the Ibiza Channel in 2013 and 2014. Of particular interest is an experiment started in September 2014, when a set of 13 surface drifters with different shapes and drogue lengths were released in the area covered by the HF radar. The drifter trajectories can be examined following the SOCIB Deployment Application (DAPP): http://apps.socib.es/dapp. Additionally, a 1-year long time series of surface currents obtained from a moored surface current-meter located in the Ibiza Channel, inside the area covered by the HF radar, was also used as a useful complementary validation exercise. Direct comparison of both the radial surface currents from each radar station and the total derived velocities against drifter and moored current-meter velocities provides an assessment of the HF radar data quality at different temporal periods and geographical areas. Statistics from these comparisons give good correlation and low root-mean-square deviation. The results will be discussed for different months, geographical areas and types of surface drifters and wind exposure. Moreover, autonomous underwater gliders constitute an additional source of information for the validation of the observed velocity structures, and some statistics will be presented.
NASA Astrophysics Data System (ADS)
Aditya Parikesit, Arli; Nurdiansyah, Rizki
2018-01-01
Research into a cure for breast cancer is currently entering an interesting phase driven by transcriptomics-based methods. With the application of Next Generation Sequencing (NGS), molecular information on breast cancer can be gathered. Both in silico and wet-lab research have determined that the role of the lincRNA-RoR/miR-145/ARF6 expression pathway cannot be ignored as one of the cardinal starting points for Triple-Negative Breast Cancer (TNBC). As the most hazardous type of breast cancer, TNBC should be treated with the most advanced approaches available to the scientific community. Bioinformatics approaches have identified possible siRNA-based drug candidates for TNBC: siRNAs that interfere with lincRNA-RoR and ARF6 mRNA could be feasible drug candidates. However, this claim should be validated with a more thorough thermodynamic and kinetic computational analysis in order to comprehend their molecular repertoire. In this respect, the claim was validated using various tools: the RNAfold server to determine the 2D structure, the Barriers server to analyze RNA folding kinetics, and the RNAeval server to validate the siRNA-target interaction. It was found that the thermodynamic and kinetic properties of the siRNAs are indeed rational and feasible. In the end, our computational approach showed that the designed siRNAs could interact with the lincRNA-RoR/miR-145/ARF6 expression pathway.
Udo, Renate; Tcherny-Lessenot, Stéphanie; Brauer, Ruth; Dolin, Paul; Irvine, David; Wang, Yunxun; Auclert, Laurent; Juhaeri, Juhaeri; Kurz, Xavier; Abenhaim, Lucien; Grimaldi, Lamiae; De Bruin, Marie L
2016-03-01
To examine the robustness of findings of case-control studies on the association between acute liver injury (ALI) and antibiotic use in the following different situations: (i) Replication of a protocol in different databases, with different data types, as well as replication in the same database, but performed by a different research team. (ii) Varying algorithms to identify cases, with and without manual case validation. (iii) Different exposure windows for time at risk. Five case-control studies in four different databases were performed with a common study protocol as a starting point to harmonize study outcome definitions, exposure definitions and statistical analyses. All five studies showed an increased risk of ALI associated with antibiotic use, ranging from OR 2.6 (95% CI 1.3-5.4) to 7.7 (95% CI 2.0-29.3). Comparable trends could be observed in the five studies: (i) without manual validation, the use of the narrowest definition for ALI showed higher risk estimates; (ii) narrow and broad algorithm definitions followed by manual validation of cases resulted in similar risk estimates; and (iii) the use of a larger window (30 days vs 14 days) to define time at risk led to a decrease in risk estimates. Reproduction of a study using a predefined protocol in different database settings is feasible, although assumptions had to be made and amendments to the protocol were inevitable. Despite differences, the strength of association was comparable between the studies. In addition, the impact of varying outcome definitions and time windows showed similar trends within the data sources. Copyright © 2015 John Wiley & Sons, Ltd.
Henderson, Valerie C; Kimmelman, Jonathan; Fergusson, Dean; Grimshaw, Jeremy M; Hackam, Dan G
2013-01-01
The vast majority of medical interventions introduced into clinical development prove unsafe or ineffective. One prominent explanation for the dismal success rate is flawed preclinical research. We conducted a systematic review of preclinical research guidelines and organized recommendations according to the type of validity threat (internal, construct, or external) or programmatic research activity they primarily address. We searched MEDLINE, Google Scholar, Google, and the EQUATOR Network website for all preclinical guideline documents published up to April 9, 2013 that addressed the design and conduct of in vivo animal experiments aimed at supporting clinical translation. To be eligible, documents had to provide guidance on the design or execution of preclinical animal experiments and represent the aggregated consensus of four or more investigators. Data from included guidelines were independently extracted by two individuals for discrete recommendations on the design and implementation of preclinical efficacy studies. These recommendations were then organized according to the type of validity threat they addressed. A total of 2,029 citations were identified through our search strategy. From these, we identified 26 guidelines that met our eligibility criteria--most of which were directed at neurological or cerebrovascular drug development. Together, these guidelines offered 55 different recommendations. Some of the most common recommendations included performance of a power calculation to determine sample size, randomized treatment allocation, and characterization of disease phenotype in the animal model prior to experimentation. By identifying the most recurrent recommendations among preclinical guidelines, we provide a starting point for developing preclinical guidelines in other disease domains. We also provide a basis for the study and evaluation of preclinical research practice. Please see later in the article for the Editors' Summary.
PENN Biomarker Core of the Alzheimer’s Disease Neuroimaging Initiative
Shaw, Leslie M.
2009-01-01
There is a pressing need to develop effective prevention and disease-modifying treatments for Alzheimer’s disease (AD), a dreaded affliction whose incidence increases almost logarithmically with age starting at about 65 years. A key need in the field of AD research is the validation of imaging and biochemical biomarkers. Biomarker tests that are shown to reliably predict the disease before it is clinically expressed would permit testing of new therapeutics at the earliest time point possible in order to give the best chance for delaying the onset of dementia in these patients. In this review the current state of AD biochemical biomarker research is discussed. A new set of guidelines for the diagnosis of AD in the research setting places emphasis on the inclusion of selected imaging and biochemical biomarkers, in addition to neuropsychological behavioral testing. Importantly, the revised guidelines were developed to identify patients at the earliest stages prior to full-blown dementia as well as patients with the full spectrum of the disease. The Alzheimer’s Disease Neuroimaging Initiative is a multicenter consortium study that includes as one of its primary goals the development of standardized neuroimaging and biochemical biomarker methods for AD clinical trials, as well as using these to measure changes over time in mildly cognitively impaired patients who convert to AD as compared to the natural variability of these in control subjects and their further change over time in AD patients. Validation of the biomarker results by correlation analyses with neuropsychological and neurobehavioral test data is one of the primary outcomes of this study. This validation data will hopefully provide biomarker test performance needed for effective measurement of the efficacy of new treatment and prevention therapeutic agents. PMID:18097156
Selection, calibration, and validation of models of tumor growth.
Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C
2016-11-01
This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth. First, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous, macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes, and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Then representative classes are identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory animals while demonstrating successful implementations of OPAL.
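Among the model classes compared above, the reaction-diffusion family is the simplest to illustrate. Below is a minimal sketch, under assumed parameters, of a 1-D Fisher-KPP tumor-growth model, u_t = D u_xx + rho u (1 - u), integrated with explicit finite differences; OPAL itself (model selection, calibration, and validation) is not reproduced here.

```python
# 1-D Fisher-KPP reaction-diffusion sketch of tumor cell fraction u(x, t).
# D, RHO, the grid, and the initial condition are illustrative assumptions.
import numpy as np

D, RHO = 0.01, 1.0          # diffusion and proliferation rates (assumed)
DX, DT = 0.1, 0.01          # grid spacing / time step (stable: D*DT/DX**2 < 0.5)
N = 200

u = np.zeros(N)
u[N // 2 - 5: N // 2 + 5] = 0.5      # small initial tumor cell fraction

for _ in range(5000):
    lap = (np.roll(u, 1) + np.roll(u, -1) - 2 * u) / DX**2   # u_xx (periodic)
    u = u + DT * (D * lap + RHO * u * (1 - u))               # explicit Euler

print("tumor burden (integral of u):", round(u.sum() * DX, 3))
```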
Content validation of an interprofessional learning video peer assessment tool.
Nisbet, Gillian; Jorm, Christine; Roberts, Chris; Gordon, Christopher J; Chen, Timothy F
2017-12-16
Large-scale models of interprofessional learning (IPL) where outcomes are assessed are rare within health professional curricula. To date, there is sparse research describing robust assessment strategies to support such activities. We describe the development of an IPL assessment task based on peer rating of a student-generated video evidencing collaborative interprofessional practice. We provide content validation evidence of an assessment rubric in the context of large-scale IPL. Two established approaches to scale development in an educational setting were combined. A literature review was undertaken to develop a conceptual model of the relevant domains and issues pertaining to assessment of student-generated videos within IPL. Starting with a prototype rubric developed from the literature, a series of staff and student workshops were undertaken to integrate expert opinion and user perspectives. Participants assessed five-minute videos produced in a prior pilot IPL activity. Outcomes from each workshop informed the next version of the rubric until agreement was reached on anchoring statements and criteria. At this point the rubric was declared fit to be used in the upcoming mandatory large-scale IPL activity. The assessment rubric consisted of four domains: patient issues; interprofessional negotiation; interprofessional management plan in action; and effective use of the video medium to engage the audience. The first three domains reflected topic content relevant to the underlying construct of interprofessional collaborative practice. The fourth domain was consistent with the broader video assessment literature calling for greater emphasis on creativity in education. We have provided evidence for the content validity of a video-based peer assessment task portraying interprofessional collaborative practice in the context of large-scale IPL activities for healthcare professional students. Further research is needed to establish the reliability of such a scale.
Accelerometer-based measures in physical activity surveillance: current practices and issues.
Pedišić, Željko; Bauman, Adrian
2015-02-01
Self-reports of physical activity (PA) have been the mainstay of measurement in most non-communicable disease (NCD) surveillance systems. To these, other measures are added to summate to a comprehensive PA surveillance system. Recently, some national NCD surveillance systems have started using accelerometers as a measure of PA. The purpose of this paper was specifically to appraise the suitability and role of accelerometers for population-level PA surveillance. A thorough literature search was conducted to examine aspects of the generalisability, reliability, validity, comprehensiveness and between-study comparability of accelerometer estimates, and to gauge the simplicity, cost-effectiveness, adaptability and sustainability of their use in NCD surveillance. Accelerometer data collected in PA surveillance systems may not provide estimates that are generalisable to the target population. Accelerometer-based estimates have adequate reliability for PA surveillance, but there are still several issues associated with their validity. Accelerometer-based prevalence estimates are largely dependent on the investigators' choice of intensity cut-off points. Maintaining standardised accelerometer data collections in long-term PA surveillance systems is difficult, which may cause discontinuity in time-trend data. The use of accelerometers does not necessarily produce useful between-study and international comparisons due to lack of standardisation of data collection and processing methods. To conclude, it appears that accelerometers still have limitations regarding generalisability, validity, comprehensiveness, simplicity, affordability, adaptability, between-study comparability and sustainability. Therefore, given the current evidence, it seems that the widespread adoption of accelerometers specifically for large-scale PA surveillance systems may be premature. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.
This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team has then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost for making the equivalent part in steel, has been determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.
Cluster Cooperation in Wireless-Powered Sensor Networks: Modeling and Performance Analysis.
Zhang, Chao; Zhang, Pengcheng; Zhang, Weizhan
2017-09-27
A wireless-powered sensor network (WPSN) consisting of one hybrid access point (HAP), a near cluster and the corresponding far cluster is investigated in this paper. The sensors are wireless-powered and transmit information by consuming energy harvested from signals emitted by the HAP. Sensors are able to harvest energy as well as store it. We propose that if sensors in the near cluster do not have their own information to transmit, they can act as relays and help the sensors in the far cluster to forward information to the HAP in an amplify-and-forward (AF) manner. We use a finite Markov chain to model the dynamic variation of the relay battery, and give a general analytical model for a WPSN with cluster cooperation. Through the model, we deduce a closed-form expression for the outage probability as the performance metric of this network. Finally, simulation results validate the design rationale of this paper and the correctness of the theoretical analysis, and show how the parameters affect system performance. Moreover, the results show that the outage probability of sensors in the far cluster can be drastically reduced without sacrificing the performance of sensors in the near cluster if the transmit power of the HAP is fairly high. Furthermore, in terms of the outage performance of the far cluster, the proposed scheme significantly outperforms the direct transmission scheme without cooperation.
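As a toy illustration of the modeling approach named above, the sketch below simulates a finite-state relay battery and estimates an outage probability by Monte Carlo. The capacity, harvest probability, and per-relay energy cost are invented parameters; the paper instead derives the outage probability in closed form from the Markov chain.

```python
# Toy finite-state battery model for a wireless-powered relay.
# All parameters are illustrative assumptions, not the paper's values.
import random

CAPACITY = 10          # battery levels 0..CAPACITY
P_HARVEST = 0.6        # probability of harvesting one energy unit per slot
COST = 3               # units needed to relay a far-cluster packet

def simulate(slots=100_000, seed=1):
    rng = random.Random(seed)
    level, outages = 0, 0
    for _ in range(slots):
        if rng.random() < P_HARVEST:      # energy arrival from the HAP
            level = min(CAPACITY, level + 1)
        if level >= COST:                 # enough energy: relay succeeds
            level -= COST
        else:                             # otherwise the slot is an outage
            outages += 1
    return outages / slots

print("estimated outage probability:", simulate())
```

Raising P_HARVEST (i.e., a higher HAP transmit power) drives the estimated outage probability down, mirroring the trend reported in the abstract.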
Maravall, Darío; de Lope, Javier; Fuentes, Juan P
2017-01-01
We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing) in indoor environments. These visual landmarks are defined from images of relevant objects or characteristic scenes in the environment. The entropy of an image is directly related to the presence of a unique object or the presence of several different objects inside it: the lower the entropy the higher the probability of containing a single object inside it and, conversely, the higher the entropy the higher the probability of containing several objects inside it. Consequently, we propose the use of the entropy of images captured by the robot not only for the landmark searching and detection but also for obstacle avoidance. If the detected object corresponds to a landmark, the robot uses the suggestions stored in the visual topological map to reach the next landmark or to finish the mission. Otherwise, the robot considers the object as an obstacle and starts a collision avoidance maneuver. In order to validate the proposal we have defined an experimental framework in which the visual bug algorithm is used by an Unmanned Aerial Vehicle (UAV) in typical indoor navigation tasks.
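The entropy cue described above is straightforward to compute. Below is a minimal sketch of the Shannon entropy of a grey-level histogram; the random array stands in for a camera frame, and the 256-bin histogram is an assumption.

```python
# Shannon entropy of an image's grey-level histogram: low entropy suggests
# one dominant object/background, high entropy a cluttered scene.
import numpy as np

def image_entropy(img, bins=256):
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins; 0*log(0) := 0
    return float(-(p * np.log2(p)).sum())

frame = np.random.randint(0, 256, size=(240, 320))   # stand-in camera frame
print("entropy (bits):", round(image_entropy(frame), 2))
```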
Segmentation of 3D Models for Cultural Heritage Structural Analysis - Some Critical Issues
NASA Astrophysics Data System (ADS)
Gonizzi Barsanti, S.; Guidi, G.; De Luca, L.
2017-08-01
Cultural Heritage documentation and preservation have become a fundamental concern in this historical period. 3D modelling offers a perfect aid to record ancient buildings and artefacts and can be used as a valid starting point for restoration, conservation and structural analysis, which can be performed by using Finite Element Analysis (FEA). The models derived from reality-based techniques, made up of the exterior surfaces of the objects captured at high resolution, are - for this reason - made of millions of polygons. Such meshes are not directly usable in structural analysis packages and need to be properly pre-processed in order to be transformed into volumetric meshes suitable for FEA. In addition, dealing with ancient objects, a proper segmentation of 3D volumetric models is needed to analyse the behaviour of the structure with the most suitable level of detail for the different sections of the structure under analysis. Segmentation of 3D models is still an open issue, especially when dealing with ancient, complicated and geometrically complex objects that imply the presence of anomalies and gaps, due to environmental agents such as earthquakes, pollution, wind and rain, or human factors. The aim of this paper is to critically analyse some of the different methodologies and algorithms available to segment a 3D point cloud or a mesh, identifying difficulties and problems by showing examples on different structures.
Fizeau interferometric cophasing of segmented mirrors: experimental validation.
Cheetham, Anthony; Cvetojevic, Nick; Norris, Barnaby; Sivaramakrishnan, Anand; Tuthill, Peter
2014-06-02
We present an optical testbed demonstration of the Fizeau Interferometric Cophasing of Segmented Mirrors (FICSM) algorithm. FICSM allows a segmented mirror to be phased with a science imaging detector and three filters (selected among the normal science complement). It requires no specialised, dedicated wavefront sensing hardware. Applying random piston and tip/tilt aberrations of more than 5 wavelengths to a small segmented mirror array produced an initial unphased point spread function with an estimated Strehl ratio of 9% that served as the starting point for our phasing algorithm. After using the FICSM algorithm to cophase the pupil, we estimated a Strehl ratio of 94% based on a comparison between our data and simulated encircled energy metrics. Our final image quality is limited by the accuracy of our segment actuation, which yields a root mean square (RMS) wavefront error of 25 nm. This is the first hardware demonstration of coarse and fine phasing an 18-segment pupil with the James Webb Space Telescope (JWST) geometry using a single algorithm. FICSM can be implemented on JWST using any of its scientific imaging cameras, making it useful as a fall-back in the event that accepted phasing strategies encounter problems. We present an operational sequence that would cophase such an 18-segment primary in 3 sequential iterations of the FICSM algorithm. Similar sequences can be readily devised for any segmented mirror.
Berhenke, Amanda; Miller, Alison L.; Brown, Eleanor; Seifer, Ronald; Dickstein, Susan
2011-01-01
Emotions and behaviors observed during challenging tasks are hypothesized to be valuable indicators of young children's motivation, the assessment of which may be particularly important for children at risk for school failure. The current study demonstrated reliability and concurrent validity of a new observational assessment of motivation in young children. Head Start graduates completed challenging puzzle and trivia tasks during their kindergarten year. Children's emotion expression and task engagement were assessed based on their observed facial and verbal expressions and behavioral cues. Hierarchical regression analyses revealed that observed persistence and shame predicted teacher ratings of children's academic achievement, whereas interest, anxiety, pride, shame, and persistence predicted children's social skills and learning-related behaviors. Children's emotional and behavioral responses to challenge thus appeared to be important indicators of school success. Observation of such responses may be a useful and valid alternative to self-report measures of motivation at this age. PMID:21949599
Aerodynamic side-force alleviator means
NASA Technical Reports Server (NTRS)
Rao, D. M. (Inventor)
1980-01-01
An apparatus for alleviating high angle-of-attack side force on slender pointed cylindrical forebodies such as fighter aircraft, missiles and the like is described. A symmetrical pair of helical separation trips was employed to disrupt the leeside vortices normally attained. The symmetrical pair of trips starts at either a common point or at spaced points on the upper surface of the forebody and extends along separate helical paths around the circumference of the forebody.
A new prognostic model for chemotherapy-induced febrile neutropenia.
Ahn, Shin; Lee, Yoon-Seon; Lee, Jae-Lyun; Lim, Kyung Soo; Yoon, Sung-Cheol
2016-02-01
The objective of this study was to develop and validate a new prognostic model for febrile neutropenia (FN). This study comprised 1001 episodes of FN: 718 for the derivation set and 283 for the validation set. Multivariate logistic regression analysis was performed with unfavorable outcome as the primary endpoint and bacteremia as the secondary endpoint. In the derivation set, risk factors for adverse outcomes comprised age ≥ 60 years (2 points), procalcitonin ≥ 0.5 ng/mL (5 points), ECOG performance score ≥ 2 (2 points), oral mucositis grade ≥ 3 (3 points), systolic blood pressure <90 mmHg (3 points), and respiratory rate ≥ 24 breaths/min (3 points). The model stratified patients into three severity classes, with adverse event rates of 6.0 % in class I (score ≤ 2), 27.3 % in class II (score 3-8), and 67.9 % in class III (score ≥ 9). Bacteremia was present in 1.1, 11.5, and 29.8 % of patients in class I, II, and III, respectively. The outcomes of the validation set were similar in each risk class. When the derivation and validation sets were integrated, unfavorable outcomes occurred in 5.9 % of the low-risk group classified by the new prognostic model and in 12.2 % classified by the Multinational Association for Supportive Care in Cancer (MASCC) risk index. With the new prognostic model, we can classify patients with FN into three classes of increasing adverse outcomes and bacteremia. Early discharge would be possible for class I patients, short-term observation could safely manage class II patients, and inpatient admission is warranted for class III patients.
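The published point assignments translate directly into code. The sketch below transcribes the thresholds, points, and class cut-offs from the abstract; the argument names are my own.

```python
# Prognostic score for chemotherapy-induced febrile neutropenia,
# transcribed from the abstract's derivation-set point assignments.
def fn_risk(age, procalcitonin, ecog, mucositis_grade, sbp, resp_rate):
    score = 0
    score += 2 if age >= 60 else 0
    score += 5 if procalcitonin >= 0.5 else 0      # ng/mL
    score += 2 if ecog >= 2 else 0
    score += 3 if mucositis_grade >= 3 else 0
    score += 3 if sbp < 90 else 0                  # mmHg
    score += 3 if resp_rate >= 24 else 0           # breaths/min
    if score <= 2:
        return score, "class I (adverse events 6.0%)"
    if score <= 8:
        return score, "class II (adverse events 27.3%)"
    return score, "class III (adverse events 67.9%)"

print(fn_risk(age=67, procalcitonin=0.8, ecog=1,
              mucositis_grade=0, sbp=110, resp_rate=20))  # -> (7, class II)
```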
[Validity of AUDIT test for detection of disorders related with alcohol consumption in women].
Pérula-de Torres, Luis Angel; Fernández-García, José Angel; Arias-Vega, Raquel; Muriel-Palomino, María; Márquez-Rebollo, Encarnación; Ruiz-Moral, Roger
2005-11-26
Early detection of patients with alcohol problems is important in clinical practice. The AUDIT (Alcohol Use Disorders Identification Test) questionnaire is a valid tool for this aim, especially in the male population. The objective of this study was to validate the usefulness of this questionnaire in female patients and to assess its cut-off point for the diagnosis of alcohol problems in women. 414 women were recruited in 2 health centers and a specialized addiction treatment center. The AUDIT test and a semistructured interview (SCAN, as gold standard) were administered to all patients. Internal consistency and criterion validity were assessed. Cronbach's alpha was 0.93 (95% confidence interval [CI], 0.921-0.941). When the DSM-IV was taken as reference, the most useful cut-off point was 6 points, with 89.6% (95% CI, 76.11-96.02) sensitivity and 95.07% (95% CI, 92.18-96.97) specificity. When the ICD-10 was taken as reference, sensitivity was 89.58% (95% CI, 76.56-96.10) and specificity was 95.33% (95% CI, 92.48-97.17). The AUDIT is a questionnaire with good psychometric properties and is valid for detecting dependence and at-risk alcohol consumption in women.
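For illustration, the sketch below shows how a cut-off such as the 6-point threshold above is evaluated against a gold-standard diagnosis; the score and label arrays are invented, not study data.

```python
# Sensitivity and specificity of "AUDIT score >= cut" versus a
# gold-standard diagnosis (here, invented labels standing in for SCAN).
def sens_spec(scores, has_disorder, cut=6):
    tp = sum(s >= cut and d for s, d in zip(scores, has_disorder))
    fn = sum(s < cut and d for s, d in zip(scores, has_disorder))
    tn = sum(s < cut and not d for s, d in zip(scores, has_disorder))
    fp = sum(s >= cut and not d for s, d in zip(scores, has_disorder))
    return tp / (tp + fn), tn / (tn + fp)

scores       = [2, 9, 5, 12, 4, 3, 7, 6]
has_disorder = [False, True, False, True, True, False, False, True]
sensitivity, specificity = sens_spec(scores, has_disorder)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```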
Émond, Marcel; Guimont, Chantal; Chauny, Jean-Marc; Daoust, Raoul; Bergeron, Éric; Vanier, Laurent; Moore, Lynne; Plourde, Miville; Kuimi, Batomen; Boucher, Valérie; Allain-Boulé, Nadine; Le Sage, Natalie
2017-01-01
Background: About 75% of patients with minor thoracic injury are discharged after an emergency department visit. However, complications such as delayed hemothorax can occur. We sought to derive and validate a clinical decision rule to predict hemothorax in patients discharged from the emergency department. Methods: We conducted a 6-year prospective cohort study in 4 university-affiliated emergency departments. Patients aged 16 years or older presenting with a minor thoracic injury were assessed at 5 time points (initial visit and 7, 14, 30 and 90 d after the injury). Radiologists' reports were reviewed for the presence of hemothorax. We used log-binomial regression models to identify predictors of hemothorax. Results: A total of 1382 patients were included: 830 in the derivation phase and 552 in the validation phase. Of these, 151 (10.9%) had hemothorax at the 14-day follow-up. Patients 65 years of age or older represented 25.3% (210/830) and 23.7% (131/552) of the derivation and validation cohorts, respectively. The final clinical decision rule included a combination of age (> 70 yr, 2 points; 45-70 yr, 1 point), fracture of any high to mid thorax rib (ribs 3-9, 2 points) and presence of 3 or more rib fractures (1 point). Twenty (30.8%) of the 65 high-risk patients (score ≥ 4) experienced hemothorax during the follow-up period. The clinical decision rule had a high specificity (90.7%, 95% confidence interval 87.7%-93.1%) in this high-risk group, thus guiding appropriate post-emergency care. Interpretation: One patient out of every 10 presented with delayed hemothorax after discharge from the emergency department. Implementation of this validated clinical decision rule for minor thoracic injury could guide emergency discharge plans. PMID:28611156
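The derived rule can likewise be expressed as a small scoring function. The points and the high-risk threshold (score >= 4) follow the abstract; the argument names are mine.

```python
# Clinical decision rule for delayed hemothorax after minor thoracic
# injury, transcribed from the abstract's point assignments.
def hemothorax_score(age, has_rib_3_to_9_fracture, n_rib_fractures):
    score = 0
    if age > 70:
        score += 2
    elif age >= 45:
        score += 1
    if has_rib_3_to_9_fracture:      # any fracture of ribs 3-9
        score += 2
    if n_rib_fractures >= 3:
        score += 1
    return score, score >= 4         # (score, high risk of delayed hemothorax)

print(hemothorax_score(age=72, has_rib_3_to_9_fracture=True, n_rib_fractures=3))
# -> (5, True): high risk, so closer follow-up after discharge is warranted
```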
Sellers, Brian G; Viljoen, Jodi L.; Cruise, Keith R.; Nicholls, Tonia L.; Dvoskin, Joel A.
2012-01-01
The Short-Term Assessment of Risk and Treatability: Adolescent Version (START:AV) is a new structured professional judgment guide for assessing short-term risks in adolescents. The scheme may be distinguished from other youth risk assessment and treatment planning instruments by its inclusion of 23 dynamic factors that are each rated for both vulnerability and strength. In addition, START:AV is also unique in that it focuses on multiple adverse outcomes—namely, violence, self-harm, suicide, unauthorized leave, substance abuse, self-neglect, victimization, and general offending—over the short-term (i.e., weeks to months) rather than long-term (i.e., years). This paper describes a pilot implementation and preliminary evaluation of START:AV in three secure juvenile correctional facilities in the southern United States. Specifically, we examined the descriptive characteristics and psychometric properties of START:AV assessments completed by 21 case managers on 291 adolescent offenders (250 boys and 41 girls) at the time of admission. Results provide preliminary support for the feasibility of completing START:AV assessments as part of routine practice. Findings also highlight differences in the characteristics of START:AV assessments for boys and girls and differential associations between the eight START:AV risk domains. Though results are promising, further research is needed to establish the reliability and validity of START:AV assessments completed in the field. PMID:23316116
Ockhuijsen, Henrietta D L; van Smeden, Maarten; van den Hoogen, Agnes; Boivin, Jacky
2017-06-01
To examine construct and criterion validity of the Dutch SCREENIVF among women and men undergoing a fertility treatment. A prospective longitudinal study nested in a randomized controlled trial. University hospital. Couples, 468 women and 383 men, undergoing an IVF/intracytoplasmic sperm injection (ICSI) treatment in a fertility clinic, completed the SCREENIVF. Construct and criterion validity of the SCREENIVF. The comparative fit index and root mean square error of approximation for women and men show a good fit of the factor model. Across time, the sensitivity for the Hospital Anxiety and Depression Scale subscale in women ranged from 61%-98%, specificity 53%-65%, predictive value of a positive test (PVP) 13%-56%, and predictive value of a negative test (PVN) 70%-99%. The sensitivity scores for men ranged from 38%-100%, specificity 71%-75%, PVP 9%-27%, PVN 92%-100%. A prediction model revealed that for women 68.7% of the variance in the Hospital Anxiety and Depression Scale at time 1, 42.5% at time 2 and 38.9% at time 3 was explained by the predictors, the sum score scales of the SCREENIVF. For men, 58.1% of the variance in the Hospital Anxiety and Depression Scale at time 1, 46.5% at time 2 and 37.3% at time 3 was explained by the predictors, the sum score scales of the SCREENIVF. The SCREENIVF has good construct validity, but the concurrent validity is better than the predictive validity. The SCREENIVF will be most effectively used in fertility clinics at the start of treatment and should not be used as a predictive tool. Copyright © 2017 American Society for Reproductive Medicine. All rights reserved.
Intratester Reliability and Construct Validity of a Hip Abductor Eccentric Strength Test.
Brindle, Richard A; Ebaugh, David; Milner, Clare E
2018-06-06
Side-lying hip abductor strength tests are commonly used to evaluate muscle strength. In a "break" test, the tester applies sufficient force to lower the limb to the table while the patient resists. The peak force is postulated to occur while the leg is lowering, thus representing the participant's eccentric muscle strength. However, it is unclear whether peak force occurs before or after the leg begins to lower. To determine intrarater reliability and construct validity of a hip abductor eccentric strength test. Intrarater reliability and construct validity study. Twenty healthy adults (26 [6] y; 1.66 [0.06] m; 62.2 [8.0] kg) made 2 visits to the laboratory at least 1 week apart. During the hip abductor eccentric strength test, a handheld dynamometer recorded peak force and time to peak force, and limb position was recorded via a motion capture system. Intrarater reliability was determined using intraclass correlation, SEM, and minimal detectable difference. Construct validity was assessed by determining if peak force occurred after the start of the lowering phase using a 1-sample t test. The hip abductor eccentric strength test had substantial intrarater reliability (intraclass correlation (3,3) = .88; 95% confidence interval, .65-.95), SEM of 0.9 %BWh, and a minimal detectable difference of 2.5 %BWh. Construct validity was established as peak force occurred 2.1 (0.6) seconds (range: 0.7-3.7 s) after the start of the lowering phase of the test (P ≤ .001). The hip abductor eccentric strength test is a valid and reliable measure of eccentric muscle strength. This test may be used clinically to assess changes in eccentric muscle strength over time.
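The reliability statistics reported above are mutually consistent under the standard formulas SEM = SD x sqrt(1 - ICC) and MDD = 1.96 x sqrt(2) x SEM, as the small check below shows; the between-subject SD is back-computed because the abstract does not report it.

```python
# Consistency check of the reported SEM and minimal detectable difference
# using the standard reliability formulas.
import math

icc = 0.88
sem = 0.9                              # %BWh, as reported
sd = sem / math.sqrt(1 - icc)          # implied between-subject SD (assumed)
mdd = 1.96 * math.sqrt(2) * sem        # minimal detectable difference (95%)

print(f"implied SD = {sd:.1f} %BWh, MDD = {mdd:.1f} %BWh")   # MDD ~ 2.5
```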
ERIC Educational Resources Information Center
Vanderlinden, Loren
2002-01-01
Discusses concerns about polluted breast milk. Offers research results which support concerns about lead exposure and human beings. Points out that often the benefits of breastfeeding outweigh the risks. (DDR)
From Pong to Pokemon Go, catching the essence of the Internet Gaming Disorder diagnosis
Carbonell, Xavier
2017-01-01
Taking Kuss et al. (2016) as a starting point, this commentary raises some issues regarding the diagnosis of Internet Gaming Disorder. In our opinion, the confusion in the DSM-5 diagnosis could be due to the weak starting point used in building the criteria. Criteria such as functional impairment and stability of the dysfunctional behavior are considered. It is suggested that avatar identification, playing motivations, and types of video games should be considered for diagnosis. The diagnostic process is highly influenced by social context and the rapid development of the video game industry. The commentary ends by considering the distinction between online and offline video gaming and the critical consideration of everyday behaviors as being addictive. PMID:28301965
A philosophy for big-bang cosmology.
McCrea, W H
1970-10-03
According to recent developments in cosmology we seem bound to find a model universe like the observed universe, almost independently of how we suppose it started. Such ideas, if valid, provide fresh justification for the procedures of current cosmological theory.
Gunaydin, Gurkan; Citaker, Seyit; Meray, Jale; Cobanoglu, Gamze; Gunaydin, Ozge Ece; Hazar Kanik, Zeynep
2016-11-01
Validation of a self-report questionnaire. The purpose of this study was to investigate the adaptation, validity, and reliability of the Turkish version of the Bournemouth Questionnaire. Low back pain is one of the most frequent disorders leading to activity limitation, affecting most people at some point in their lives. Precise administration of assessment questionnaires is essential for evaluating patients' functional abilities and planning successful therapy. One hundred ten patients with chronic low back pain were included in the present study. To assess reliability, test-retest and internal consistency analyses were applied. Test-retest results were assessed using the intraclass correlation coefficient method (95% confidence interval). For internal consistency, the Cronbach alpha value was calculated. Validity of the questionnaire was assessed in terms of construct validity; for this, factor analysis and convergent validity were tested. For convergent validity, total points of the Bournemouth Questionnaire were compared with the total points of the Quebec Back Pain Disability Scale and the Roland Morris Disability Questionnaire using Pearson correlation coefficient analysis. The Cronbach alpha value was 0.914, showing that this questionnaire has high internal consistency. Test-retest coefficients varied between 0.851 and 0.927, showing that test-retest results are highly correlated. Factor analysis indicated that this questionnaire had one factor. The Pearson correlation coefficient of the Bournemouth Questionnaire was 0.703 with the Roland Morris Disability Questionnaire and 0.659 with the Quebec Back Pain Disability Scale, showing that it correlates well with both. The Turkish version of the Bournemouth Questionnaire is valid and reliable. Level of evidence: 3.
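For reference, the Cronbach's alpha reported above is computed as alpha = k/(k-1) x (1 - sum of item variances / variance of the total score). The sketch below applies this formula to an invented respondents-by-items matrix.

```python
# Cronbach's alpha for internal consistency; the answer matrix is invented.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of questionnaire scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)

answers = [[3, 4, 3, 4],
           [1, 1, 2, 1],
           [4, 5, 4, 4],
           [2, 2, 3, 2],
           [5, 5, 4, 5],
           [2, 3, 2, 2]]
print(round(cronbach_alpha(answers), 3))
```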
Silva, Adriana Lucia Pastore E; Croci, Alberto Tesconi; Gobbi, Riccardo Gomes; Hinckel, Betina Bremer; Pecora, José Ricardo; Demange, Marco Kawamura
2017-01-01
Translation, cultural adaptation, and validation of the new version of the Knee Society Score - The 2011 KS Score - into Brazilian Portuguese and verification of its measurement properties, reproducibility, and validity. In 2012, the new version of the Knee Society Score was developed and validated. This scale comprises four separate subscales: (a) objective knee score (seven items: 100 points); (b) patient satisfaction score (five items: 40 points); (c) patient expectations score (three items: 15 points); and (d) functional activity score (19 items: 100 points). A total of 90 patients aged 55-85 years were evaluated in a clinical cross-sectional study. The pre-operative translated version was applied to patients referred for total knee arthroplasty (TKA), and the post-operative translated version was applied to patients who underwent TKA. Each patient answered the same questionnaire twice and was evaluated by two experts in orthopedic knee surgery. Evaluations were performed pre-operatively and three, six, or 12 months post-operatively. The reliability of the questionnaire was evaluated using the intraclass correlation coefficient (ICC) between the two applications. Internal consistency was evaluated using Cronbach's alpha. The ICC analysis showed no difference between the means of the pre-operative, three-month, and six-month post-operative evaluations for the subscale items. The Brazilian Portuguese version of The 2011 KS Score is a valid and reliable instrument for objective and subjective evaluation of the functionality of Brazilian patients who undergo TKA and revision TKA.
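To make the reliability statistic concrete, here is a minimal Python sketch, under stated assumptions rather than the study's actual code, of a two-way random-effects, absolute-agreement, single-measure ICC (often written ICC(2,1)), one common formulation for test-retest agreement between two applications of a questionnaire. The score matrix is a hypothetical placeholder.

import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    # Two-way random-effects, absolute-agreement, single-measure ICC.
    # scores: (n_subjects, k_ratings), e.g. first and second application.
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical test-retest data: 90 patients, two applications.
rng = np.random.default_rng(1)
true_score = rng.normal(70, 15, size=90)
scores = np.column_stack([true_score + rng.normal(0, 5, size=90),
                          true_score + rng.normal(0, 5, size=90)])
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")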
Ursodeoxycholic Acid for Treatment of Enlarged Polycystic Liver.
Iijima, Takashi; Hoshino, Junichi; Suwabe, Tatsuya; Sumida, Keiichi; Mise, Koki; Kawada, Masahiro; Imafuku, Aya; Hayami, Noriko; Hiramatsu, Rikako; Hasegawa, Eiko; Sawa, Naoki; Takaichi, Kenmei; Ubara, Yoshifumi
2016-02-01
Patients with autosomal dominant polycystic kidney disease and polycystic liver disease (PLD) often have elevated serum levels of alkaline phosphatase (ALP) and gamma-glutamyl transpeptidase (GGT). Ursodeoxycholic acid (UDCA) is used to treat biliary tract diseases, but its effect on PLD remains unclear. UDCA was administered for 1 year at a dose of 300 mg daily to seven PLD patients with elevated ALP or GGT levels who were selected for this treatment by experienced clinicians. Laboratory data and liver volumes were compared among three time points: 1 year before UDCA treatment, at the start of UDCA therapy, and 1 year after the start of therapy. Median GGT did not show a significant change between 1 year before UDCA (180 IU/L) and the start of UDCA therapy (209 IU/L), but it decreased significantly to 98 IU/L after 1 year of UDCA therapy (P = 0.015 vs. the start of therapy). ALP showed a significant increase from 1 year before UDCA (456 IU/L) to the start of UDCA therapy (561 IU/L), and then decreased significantly after 1 year of UDCA therapy (364 IU/L). Median liver volume did not show any significant changes among these three time points of assessment. UDCA may be effective for reducing biliary enzyme levels and inhibiting the growth of liver cysts in patients with PLD. © 2015 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.
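The abstract reports medians and a P value for n = 7 paired measurements but does not name the statistical test; a Wilcoxon signed-rank comparison is one plausible choice for paired, non-normal data of this size. The sketch below uses hypothetical GGT values, not the study data.

from scipy import stats

# Hypothetical paired GGT values (IU/L) for 7 patients: at the start
# of UDCA therapy and after 1 year. Not the study's data.
ggt_start = [209, 180, 340, 150, 260, 198, 410]
ggt_1year = [98, 90, 160, 95, 120, 101, 210]

stat, p = stats.wilcoxon(ggt_start, ggt_1year)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")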
Starting and Managing a Retail Flower Shop. The Starting and Managing Series, Volume 18.
ERIC Educational Resources Information Center
Krone, Paul R.
This booklet is intended to give a general idea of what is required to set up and manage a flower shop, to point out some of the problems and rewards, and to tell where to find more detailed information. First, an overview of the business is provided, telling the background required in education and experience as well as the amount of profit that…