ERIC Educational Resources Information Center
Wage and Labor Standards Administration (DOL), Washington, DC.
Definitions of terms used in the Fair Labor Standards Act and statistical tables compiled from a survey of agricultural processing firms comprise this appendix, which is the second volume of a two volume report. Volume I is available as VT 012 247. (BH)
Bottle, Alex; Darzi, Ara W; Athanasiou, Thanos; Vale, Justin A
2010-01-01
Objectives To investigate the relation between volume and mortality after adjustment for case mix for radical cystectomy in the English healthcare setting using improved statistical methodology, taking into account the institutional and surgeon volume effects and institutional structural and process of care factors. Design Retrospective analysis of hospital episode statistics using multilevel modelling. Setting English hospitals carrying out radical cystectomy in the seven financial years 2000/1 to 2006/7. Participants Patients with a primary diagnosis of cancer undergoing an inpatient elective cystectomy. Main outcome measure Mortality within 30 days of cystectomy. Results Compared with low volume institutions, medium volume ones had significantly higher odds of in-hospital and total mortality: odds ratio 1.72 (95% confidence interval 1.00 to 2.98, P=0.05) and 1.82 (1.08 to 3.06, P=0.02). This was only seen in the final model, which included adjustment for structural and process of care factors. The surgeon volume-mortality relation showed weak evidence of reduced odds of in-hospital mortality (by 35%) for the high volume surgeons, although this did not reach statistical significance at the 5% level. Conclusions The relation between case volume and mortality after radical cystectomy for bladder cancer became evident only after adjustment for structural and process of care factors, including staffing levels of nurses and junior doctors, in addition to case mix. At least for this relatively uncommon procedure, adjusting for these confounders when examining the volume-outcome relation is critical before considering centralisation of care to a few specialist institutions. Outcomes other than mortality, such as functional morbidity and disease recurrence, may ultimately influence the case for centralising care. PMID:20305302
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Research in Stochastic Processes.
1984-10-01
Handbook of Statistics, Volume 5: Time Series in Time Domain, E.J. Hannan, P.R. Krishnaiah and M.M. Rao, eds., North Holland, 1984, to appear. 5. J.A...designs for time series." S. Cambanis, Handbook of Statistics. Volume 5: Time Series in Time Domain, E.J. Hannan, P.R. Krishnaiah and M.M. Rao, eds... Krishnaiah and M.M. Rao, eds., North Holland, 1984, to appear. 59. "Ergodic properties of stationary stable processes." S. Cambanis, C.D. Hardin, and A
1993-08-01
subtitled "Simulation Data," consists of detailed infonrnation on the design parmneter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or nmaximurn pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
NASA Technical Reports Server (NTRS)
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.
Brain tissues volume measurements from 2D MRI using parametric approach
NASA Astrophysics Data System (ADS)
L'vov, A. A.; Toropova, O. A.; Litovka, Yu. V.
2018-04-01
The purpose of this paper is to propose a fully automated method for assessing the volume of structures within the human brain. Our statistical approach uses the maximum interdependency principle in the decision-making process for identifying inconsistent measurements and unequal observations. Outliers are detected using the maximum normalized residual test. We propose a statistical model that utilizes knowledge of the tissue distribution in the human brain and applies partial data restoration to improve precision. The approach is computationally efficient and independent of the segmentation algorithm used in the application.
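For readers unfamiliar with the maximum normalized residual (Grubbs') test named above, the following sketch shows one generic, single-pass version of it in Python. It is a minimal illustration of the test itself, not the authors' implementation, and the sample values and significance level are hypothetical.

```python
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha=0.05):
    """One pass of the maximum normalized residual (Grubbs') test.

    Returns (index, is_outlier): the index of the most extreme value and
    whether it exceeds the critical value at significance level alpha.
    """
    x = np.asarray(values, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    residuals = np.abs(x - mean) / sd          # normalized residuals
    idx = int(np.argmax(residuals))            # most extreme observation
    g = residuals[idx]                         # Grubbs' statistic

    # Two-sided critical value based on the t distribution
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return idx, g > g_crit

# Example: a suspiciously large slice-wise volume estimate
volumes = [101.2, 99.8, 100.5, 98.9, 100.1, 112.7]
print(grubbs_outlier(volumes))
```

In practice the test is applied iteratively, removing one flagged value at a time and recomputing the statistic on the remaining data.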
Schwarz, Daniel A.; Arman, Krikor G.; Kakwan, Mehreen S.; Jamali, Ameen M.; Elmeligy, Ayman A.; Buchman, Steven R.
2015-01-01
Background The authors’ goal was to ascertain regenerate bone-healing metrics using quantitative histomorphometry at a single consolidation period. Methods Rats underwent either mandibular distraction osteogenesis (n=7) or partially reduced fractures (n=7); their contralateral mandibles were used as controls (n=11). External fixators were secured and unilateral osteotomies performed, followed by either mandibular distraction osteogenesis (4 days’ latency, then 0.3 mm every 12 hours for 8 days; 5.1 mm) or partially reduced fractures (fixed immediately postoperatively; 2.1 mm); both groups underwent 4 weeks of consolidation. After tissue processing, bone volume/tissue volume ratio, osteoid volume/tissue volume ratio, and osteocyte count per high-power field were analyzed by means of quantitative histomorphometry. Results Contralateral mandibles had statistically greater bone volume/tissue volume ratio and osteocyte count per high-power field compared with both mandibular distraction osteogenesis and partially reduced fractures by almost 50 percent, whereas osteoid volume/tissue volume ratio was statistically greater in both mandibular distraction osteogenesis specimens and partially reduced fractures compared with contralateral mandibles. No statistical difference in bone volume/tissue volume ratio, osteoid volume/tissue volume ratio, or osteocyte count per high-power field was found between mandibular distraction osteogenesis specimens and partially reduced fractures. Conclusions The authors’ findings demonstrate significantly decreased bone quantity and maturity in mandibular distraction osteogenesis specimens and partially reduced fractures compared with contralateral mandibles using the clinically analogous protocols. If these results are extrapolated clinically, treatment strategies may require modification to ensure reliable, predictable, and improved outcomes. PMID:20463629
Carro, N; García, I; Ignacio, M-C; Llompart, M; Yebra, M-C; Mouteira, A
2002-10-01
A sample-preparation procedure (extraction and saponification) using microwave energy is proposed for determination of organochlorine pesticides in oyster samples. A Plackett-Burman factorial design has been used to optimize the microwave-assisted extraction and mild saponification on a freeze-dried sample spiked with a mixture of aldrin, endrin, dieldrin, heptachlor, heptachlor epoxide, isodrin, trans-nonachlor, p,p'-DDE, and p,p'-DDD. Six variables: solvent volume, extraction time, extraction temperature, amount of acetone (%) in the extractant solvent, amount of sample, and volume of NaOH solution were considered in the optimization process. The results show that the amount of sample is statistically significant for dieldrin, aldrin, p,p'-DDE, heptachlor, and trans-nonachlor, and solvent volume for dieldrin, aldrin, and p,p'-DDE. The volume of NaOH solution is statistically significant for aldrin and p,p'-DDE only. Extraction temperature and extraction time seem to be the main factors determining the efficiency of the extraction process for isodrin and p,p'-DDE, respectively. The optimized procedure was compared with conventional Soxhlet extraction.
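As an illustration of how main effects are read off a two-level screening design such as the Plackett-Burman design mentioned above, the sketch below estimates each factor's effect as the difference between the mean response at its high and low levels. The design matrix, the choice of three factors, and the recovery values are hypothetical and are not taken from the study.

```python
import numpy as np

# Illustrative two-level screening analysis: each column of X codes one
# factor at -1 (low) / +1 (high) across the runs; y holds the measured
# recoveries. The main effect of a factor is the mean response at the
# high level minus the mean response at the low level.
def main_effects(X, y):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.array([y[X[:, j] > 0].mean() - y[X[:, j] < 0].mean()
                     for j in range(X.shape[1])])

# Hypothetical 8-run design for 3 of the 6 factors (solvent volume,
# extraction time, extraction temperature) and made-up recoveries (%).
X = np.array([[-1, -1, -1],
              [ 1, -1, -1],
              [-1,  1, -1],
              [ 1,  1, -1],
              [-1, -1,  1],
              [ 1, -1,  1],
              [-1,  1,  1],
              [ 1,  1,  1]])
y = np.array([78, 85, 80, 88, 82, 90, 84, 93])
print(main_effects(X, y))   # estimated effect of each factor on recovery
```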
Expert Systems on Multiprocessor Architectures. Volume 4. Technical Reports
1991-06-01
Floated-Current-Time0 -> The time that this function is called in user time units, expressed as a floating point number. Halt-Poligono Arrests the...default a statistics file will be printed out, if it can be. To prevent this make No-Statistics true. Unhalt-Poligono Unarrests the process in which the
Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.
Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S
2018-05-05
Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data across a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from the assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.
Hernández-Martin, Estefania; Marcano, Francisco; Casanova, Oscar; Modroño, Cristian; Plata-Bello, Julio; González-Mora, Jose Luis
2017-01-01
Abstract. Diffuse optical tomography (DOT) measures concentration changes in both oxy- and deoxyhemoglobin providing three-dimensional images of local brain activations. A pilot study, which compares both DOT and functional magnetic resonance imaging (fMRI) volumes through t-maps given by canonical statistical parametric mapping (SPM) processing for both data modalities, is presented. The DOT series were processed using a method that is based on a Bayesian filter application on raw DOT data to remove physiological changes and minimum description length application index to select a number of singular values, which reduce the data dimensionality during image reconstruction and adaptation of DOT volume series to normalized standard space. Therefore, statistical analysis is performed with canonical SPM software in the same way as fMRI analysis is done, accepting DOT volumes as if they were fMRI volumes. The results show the reproducibility and ruggedness of the method to process DOT series on group analysis using cognitive paradigms on the prefrontal cortex. Difficulties are considered, such as the fact that scalp–brain distances vary between subjects and that cerebral activations are difficult to reproduce owing to strategies used by the subjects to solve arithmetic problems. T-images given by fMRI and DOT volume series analyzed in SPM show that at the functional level, both DOT and fMRI measures detect the same areas, although DOT provides complementary information to fMRI signals about cerebral activity. PMID:28386575
Self-assessed performance improves statistical fusion of image labels
Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.
2014-01-01
Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance. Statistical fusion resulted in statistically indistinguishable performance from self-assessed weighted voting. The authors developed a new theoretical basis for using self-assessed performance in the framework of statistical fusion and demonstrated that the combined sources of information (both statistical assessment and self-assessment) yielded statistically significant improvement over the methods considered separately. Conclusions: The authors present the first systematic characterization of self-assessed performance in manual labeling. The authors demonstrate that self-assessment and statistical fusion yield similar, but complementary, benefits for label fusion. Finally, the authors present a new theoretical basis for combining self-assessments with statistical label fusion. PMID:24593721
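A minimal sketch of the contrast between simple majority voting and voting weighted by self-assessed confidence, which is the comparison described above, might look as follows. The binary labels and confidence scores are randomly generated placeholders, and this is not the authors' statistical fusion implementation.

```python
import numpy as np

# labels[r, v] = binary label assigned by rater r to voxel v
# confidence[r, v] = that rater's self-assessed confidence in [0, 1]
def majority_vote(labels):
    return (labels.mean(axis=0) > 0.5).astype(int)

def self_assessed_weighted_vote(labels, confidence):
    # Each rater's vote counts in proportion to their stated confidence.
    weights = confidence / confidence.sum(axis=0, keepdims=True)
    return ((labels * weights).sum(axis=0) > 0.5).astype(int)

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=(5, 10))         # 5 raters, 10 voxels
confidence = rng.uniform(0.2, 1.0, size=(5, 10))  # hypothetical self-assessments
print(majority_vote(labels))
print(self_assessed_weighted_vote(labels, confidence))
```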
Load research manual. Volume 1. Load research procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandenburg, L.; Clarkson, G.; Grund, Jr., C.
1980-11-01
This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.
A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation
NASA Astrophysics Data System (ADS)
Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.
2017-02-01
Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating to fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line for cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using a finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed the binder cracking in resin-bonded silica sand.
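A weakest-link Weibull formulation is one standard way to relate fracture stress and effective volume to a cracking probability, as the abstract describes. The sketch below shows that generic formula; the scale, modulus, and reference-volume values are placeholders rather than the parameters fitted from the three-point bending tests.

```python
import numpy as np

def weibull_fracture_probability(stress, eff_volume, sigma0, m, v0=1.0):
    """Weakest-link Weibull estimate of cracking probability.

    stress      : applied (peak) tensile stress in the sand core/mold
    eff_volume  : effective volume under stress
    sigma0, m   : Weibull scale and modulus from bending-test calibration
    v0          : reference volume used when fitting sigma0
    """
    return 1.0 - np.exp(-(eff_volume / v0) * (stress / sigma0) ** m)

# Placeholder parameters for illustration only
print(weibull_fracture_probability(stress=4.5, eff_volume=2.0e3,
                                   sigma0=6.0, m=8.0, v0=1.0e3))
```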
A PROPOSED CHEMICAL INFORMATION AND DATA SYSTEM. VOLUME I.
CHEMICAL COMPOUNDS, *DATA PROCESSING, *INFORMATION RETRIEVAL, *CHEMICAL ANALYSIS, INPUT OUTPUT DEVICES, COMPUTER PROGRAMMING, CLASSIFICATION...CONFIGURATIONS, DATA STORAGE SYSTEMS, ATOMS, MOLECULES, PERFORMANCE (ENGINEERING), MAINTENANCE, SUBJECT INDEXING, MAGNETIC TAPE, AUTOMATIC, MILITARY REQUIREMENTS, TYPEWRITERS, OPTICS, TOPOLOGY, STATISTICAL ANALYSIS, FLOW CHARTING.
Load research manual. Volume 2. Fundamentals of implementing load research procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandenburg, L.; Clarkson, G.; Grund, Jr., C.
This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.
Maritime dynamic traffic generator : Volume II. Electronic data processing program.
DOT National Transportation Integrated Search
1975-06-01
The processor program is designed to move 18,000 merchant vessels along standard routes to their destination and keep statistical records of the ports visited, the five degree squares passed through and the occurrence of casualties. This document pre...
[Laser's biostimulation in healing or crural ulcerations].
Król, P; Franek, A; Huńka-Zurawińska, W; Bil, J; Swist, D; Polak, A; Bendkowski, W
2001-11-01
The objective of this paper was to evaluate the effect of laser biostimulation on the healing of crural ulcerations. Three comparative groups of patients, A, B and C, were formed at random from patients with venous crural ulcerations. Group A consisted of 17 patients, group B of 15, and group C of 17. Patients in all comparative groups were treated pharmacologically and received compression therapy. Ulcerations in group A were additionally irradiated with light from a biostimulation laser (810 nm) such that each ulceration received an energy dose of 4 J/cm2. Patients in group B additionally received a blind trial (placebo in the form of quasi-laserotherapy). The evaluated factors were the changes in ulcer size and in the volume of the tissue defect caused by laser biostimulation. The rate of change of size and volume of the tissue defect per week was calculated. After treatment there was a statistically significant decrease in ulcer size in all comparative groups, while no statistically significant difference between the groups was observed. After treatment there was a statistically significant decrease in ulcer volume only in groups A and C, but no statistically significant difference between the groups was observed.
Studies in Mathematics Education: The Teaching of Statistics, Volume 7.
ERIC Educational Resources Information Center
Morris, Robert, Ed.
This volume examines the teaching of statistics in the whole range of education, but concentrates on primary and secondary schools. It is based upon selected topics from the Second International Congress on Teaching Statistics (ICOTS 2), convened in Canada in August 1986. The contents of this volume divide broadly into four parts: statistics in…
Vartanian, Oshin; Wertz, Christopher J; Flores, Ranee A; Beatty, Erin L; Smith, Ingrid; Blackler, Kristen; Lam, Quan; Jung, Rex E
2018-04-15
Openness/Intellect (i.e., openness to experience) is the Big Five personality factor most consistently associated with individual differences in creativity. Recent psychometric evidence has demonstrated that this factor consists of two distinct aspects: Intellect and Openness. Whereas Intellect reflects perceived intelligence and intellectual engagement, Openness reflects engagement with fantasy, perception, and aesthetics. We investigated the extent to which Openness and Intellect are associated with variations in brain structure as measured by cortical thickness, area, and volume (N = 185). Our results demonstrated that Openness was correlated inversely with cortical thickness and volume in left middle frontal gyrus (BA 6), middle temporal gyrus (MTG, BA 21), and superior temporal gyrus (BA 41), and exclusively with cortical thickness in left inferior parietal lobule (BA 40), right inferior frontal gyrus (IFG, BA 45), and MTG (BA 37). When age and sex were statistically controlled for, the inverse correlations between Openness and cortical thickness remained statistically significant for all regions except left MTG, whereas the correlations involving cortical volume remained statistically significant only for left middle frontal gyrus. There was no statistically significant correlation between Openness and cortical area, and no statistically significant correlation between Intellect and cortical thickness, area, or volume. Our results demonstrate that individual differences in Openness are correlated with variation in brain structure, particularly as indexed by cortical thickness. Given the involvement of the above regions in processes related to memory and cognitive control, we discuss the implications of our findings for the possible contribution of personality to creative cognition. © 2018 Her Majesty the Queen in Right of Canada 2018. Reproduced with permission of the Minister of Health, Canada. Human Brain Mapping.
ERIC Educational Resources Information Center
Hida, Takeyuki; Shimizu, Akinobu
This volume contains the papers and comments from the Workshop on Mathematics Education, a special session of the 15th Conference on Stochastic Processes and Their Applications, held in Nagoya, Japan, July 2-5, 1985. Topics covered include: (1) probability; (2) statistics; (3) deviation; (4) Japanese mathematics curriculum; (5) statistical…
Estimating the volume and age of water stored in global lakes using a geo-statistical approach
Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver
2016-01-01
Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 10⁶ km² (1.8% of global land area), a total shoreline length of 7.2 × 10⁶ km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 10³ km³ (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively. PMID:27976671
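The abstract describes predicting lake volume from surrounding terrain. As a rough illustration of that kind of geo-statistical estimate, the sketch below fits a log-log regression of volume on surface area and mean terrain slope; the choice of predictors, the training values, and the coefficients are assumptions for demonstration and do not reproduce the published model.

```python
import numpy as np

# Hypothetical training data: lake surface area (km^2), mean slope of the
# surrounding terrain (degrees), and surveyed volume (km^3).
area   = np.array([0.15, 1.2, 5.0, 40.0, 300.0])
slope  = np.array([2.0, 5.0, 8.0, 12.0, 20.0])
volume = np.array([0.0004, 0.006, 0.05, 1.2, 40.0])

# Fit log10(volume) ~ a + b*log10(area) + c*log10(slope) by least squares.
X = np.column_stack([np.ones_like(area), np.log10(area), np.log10(slope)])
coef, *_ = np.linalg.lstsq(X, np.log10(volume), rcond=None)

# Predict the volume of an unsurveyed lake from its area and terrain slope.
x_new = np.array([1.0, np.log10(10.0), np.log10(9.0)])
print(10 ** (x_new @ coef))
```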
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
Application of Tube Dynamics to Non-Statistical Reaction Processes
NASA Astrophysics Data System (ADS)
Gabern, F.; Koon, W. S.; Marsden, J. E.; Ross, S. D.; Yanao, T.
2006-06-01
A technique based on dynamical systems theory is introduced for the computation of lifetime distributions and rates of chemical reactions and scattering phenomena, even in systems that exhibit non-statistical behavior. In particular, we merge invariant manifold tube dynamics with Monte Carlo volume determination for accurate rate calculations. This methodology is applied to a three-degree-of-freedom model problem and some ideas on how it might be extended to higher-degree-of-freedom systems are presented.
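Monte Carlo volume determination, one ingredient of the rate calculation described above, can be sketched generically as a hit-or-miss estimate. The indicator function below is a placeholder region (a unit ball) rather than the invariant-manifold tube of the model problem.

```python
import numpy as np

def monte_carlo_volume(inside, bounds, n_samples=100_000, seed=0):
    """Hit-or-miss estimate of the volume of the region where inside(x) is True.

    bounds: sequence of (low, high) per dimension defining the sampling box.
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    low, high = bounds[:, 0], bounds[:, 1]
    pts = rng.uniform(low, high, size=(n_samples, len(bounds)))
    frac = np.mean([inside(p) for p in pts])   # fraction of hits
    box_volume = np.prod(high - low)
    return frac * box_volume

# Placeholder region: unit ball in 3 dimensions (true volume 4*pi/3 ≈ 4.19)
print(monte_carlo_volume(lambda p: p @ p <= 1.0, [(-1, 1)] * 3))
```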
Research on Secure Systems and Automatic Programming. Volume I
1977-10-14
for the enforcement of adherence to authorization; they include physical limitations, legal codes, social pressures, and the psychological makeup of... systems job statistics and possibly indications of abnormal termination... support instructions. The criteria for their inclusion were high execution... interrupt processes, for the output data page. Jobs may also terminate abnormally, however, by executing the standard SWITCH PROCESS instruction...
Eu, Byung Chan
2008-09-07
In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been interchangeably used for pure fluids, but in this work we show that they should be distinguished from each other and given distinctive statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of molecular domain (volume or space) by using the Voronoi volume and its mean value that may be regarded as molar domain (volume) and also the statistical mechanical representation of volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provide kinetic theory formulas for the molecular domain, the constitutive equations for molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed fresh light on, and insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in the generalized hydrodynamics will be considered in the sequel.
Flood type specific construction of synthetic design hydrographs
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Viviroli, Daniel; Sikorska, Anna E.; Vannier, Olivier; Favre, Anne-Catherine; Seibert, Jan
2017-02-01
Accurate estimates of flood peaks, corresponding volumes, and hydrographs are required to design safe and cost-effective hydraulic structures. In this paper, we propose a statistical approach for the estimation of the design variables peak and volume by constructing synthetic design hydrographs for different flood types such as flash-floods, short-rain floods, long-rain floods, and rain-on-snow floods. Our approach relies on the fitting of probability density functions to observed flood hydrographs of a certain flood type and accounts for the dependence between peak discharge and flood volume. It makes use of the statistical information contained in the data and retains the process information of the flood type. The method was tested based on data from 39 mesoscale catchments in Switzerland and provides catchment specific and flood type specific synthetic design hydrographs for all of these catchments. We demonstrate that flood type specific synthetic design hydrographs are meaningful in flood-risk management when combined with knowledge on the seasonality and the frequency of different flood types.
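As a rough sketch of how a synthetic design hydrograph can be built by scaling a fitted density shape to prescribed design values of peak and volume, the example below uses a gamma-density shape. The gamma form, its parameters, and the design numbers are illustrative assumptions, not the distributions fitted to the Swiss catchments.

```python
import numpy as np
from scipy import stats

def synthetic_hydrograph(peak, volume, shape=3.0, scale=5.0, n=200):
    """Scale a gamma-density hydrograph shape to a design peak and volume.

    peak   : design peak discharge (m^3/s)
    volume : design flood volume (m^3)
    shape, scale : gamma parameters controlling the hydrograph form (hours)
    """
    t = np.linspace(0.01, stats.gamma.ppf(0.999, shape, scale=scale), n)
    q = stats.gamma.pdf(t, shape, scale=scale)       # unit-area shape
    q = q / q.max() * peak                           # match the design peak
    current_volume = np.trapz(q, t * 3600.0)         # t in hours -> seconds
    t = t * (volume / current_volume)                # stretch time to match volume
    return t, q

t, q = synthetic_hydrograph(peak=250.0, volume=8.0e6)
print(q.max(), np.trapz(q, t * 3600.0))              # ≈ peak, ≈ volume
```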
Invariance in the recurrence of large returns and the validation of models of price dynamics
NASA Astrophysics Data System (ADS)
Chang, Lo-Bin; Geman, Stuart; Hsieh, Fushing; Hwang, Chii-Ruey
2013-08-01
Starting from a robust, nonparametric definition of large returns (“excursions”), we study the statistics of their occurrences, focusing on the recurrence process. The empirical waiting-time distribution between excursions is remarkably invariant to year, stock, and scale (return interval). This invariance is related to self-similarity of the marginal distributions of returns, but the excursion waiting-time distribution is a function of the entire return process and not just its univariate probabilities. Generalized autoregressive conditional heteroskedasticity (GARCH) models, market-time transformations based on volume or trades, and generalized (Lévy) random-walk models all fail to fit the statistical structure of excursions.
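A minimal sketch of extracting the waiting times between excursions from a return series is shown below. The excursion rule here (absolute returns above a high empirical quantile) is a simple stand-in for the paper's nonparametric definition, and the heavy-tailed returns are simulated.

```python
import numpy as np

def excursion_waiting_times(returns, quantile=0.98):
    """Waiting times (in steps) between returns exceeding a high quantile.

    The quantile threshold is a stand-in for the paper's nonparametric
    definition of an excursion.
    """
    r = np.abs(np.asarray(returns, dtype=float))
    threshold = np.quantile(r, quantile)
    excursion_idx = np.flatnonzero(r >= threshold)
    return np.diff(excursion_idx)

rng = np.random.default_rng(1)
returns = rng.standard_t(df=3, size=10_000) * 0.01   # heavy-tailed toy returns
waits = excursion_waiting_times(returns)
print(np.median(waits), waits.mean())
```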
Statistical self-similarity of hotspot seamount volumes modeled as self-similar criticality
Tebbens, S.F.; Burroughs, S.M.; Barton, C.C.; Naar, D.F.
2001-01-01
The processes responsible for hotspot seamount formation are complex, yet the cumulative frequency-volume distribution of hotspot seamounts in the Easter Island/Salas y Gomez Chain (ESC) is found to be well-described by an upper-truncated power law. We develop a model for hotspot seamount formation where uniform energy input produces events initiated on a self-similar distribution of critical cells. We call this model Self-Similar Criticality (SSC). By allowing the spatial distribution of magma migration to be self-similar, the SSC model recreates the observed ESC seamount volume distribution. The SSC model may have broad applicability to other natural systems.
Characteristics and quality of intra-operative cell salvage in paediatric scoliosis surgery.
Perez-Ferrer, A; Gredilla-Díaz, E; de Vicente-Sánchez, J; Navarro-Suay, R; Gilsanz-Rodríguez, F
2016-02-01
To determine the haematological and microbiological characteristics of blood recovered by using a cell saver with a rigid centrifuge bowl (100 ml) in paediatric scoliosis surgery and to determine whether it conforms to the standard expected in adult patients. A cross-sectional, descriptive cohort study was performed on 24 consecutive red blood cell (RBC) units recovered from the surgical field and processed by a Haemolite® 2+ (Haemonetics Corp., Braintree, MA, USA) cell saver. Data were collected regarding age, weight, surgical approach (anterior or posterior), processed shed volume and volume of autologous RBC recovered, full blood count, and blood culture obtained from the RBC concentrate, and incidence of fever after reinfusion. The processed shed volume was very low (939±569 ml) with high variability (coefficient of variation=0.6), unlike the recovered volume of 129±50 ml (coefficient of variation=0.38). A statistically significant correlation between the processed shed volume and the recovered RBC concentrate haematocrit was found (Pearson, r=.659, P=.001). Haematological parameters in the recovered concentrate were: Hb 11±5.3 g dl⁻¹; haematocrit: 32.1±15.4% (lower than expected); white cells 5.34±4.22×10³ μl⁻¹; platelets 37.88±23.5×10³ μl⁻¹ (mean±SD). Blood culture was positive in the RBC concentrate recovered in 13 cases (54.2%), in which coagulase-negative Staphylococcus was isolated. Cell salvage machines with rigid centrifuge bowls (including paediatric small-volume bowls) do not obtain the expected haematocrit if low volumes are processed, and therefore they are not the best choice in paediatric surgery. Copyright © 2015 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España, S.L.U. All rights reserved.
Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.
Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N
2011-04-15
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed. PMID:21434685
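To illustrate the kind of power calculation involved in choosing a sampling volume, the sketch below assumes organism counts in the sampled volume are Poisson and computes the probability of flagging a discharge whose true concentration exceeds the limit. The limit, concentrations, and significance level are illustrative and do not reproduce the paper's statistical model.

```python
import numpy as np
from scipy import stats

def detection_power(true_conc, sample_volume_m3, limit_per_m3, alpha=0.05):
    """Power to reject compliance when the true concentration exceeds the limit.

    Counts in the sampled volume are assumed Poisson. The test rejects when
    the observed count exceeds the (1 - alpha) quantile expected under a
    discharge exactly at the regulatory limit.
    """
    mean_at_limit = limit_per_m3 * sample_volume_m3
    critical_count = stats.poisson.ppf(1 - alpha, mean_at_limit)
    mean_true = true_conc * sample_volume_m3
    return 1.0 - stats.poisson.cdf(critical_count, mean_true)

# Example: limit of 10 organisms/m^3, true discharge at 30 organisms/m^3
for sample_volume in (0.5, 1.0, 3.0, 7.0):
    print(sample_volume, detection_power(30, sample_volume, 10))
```

Increasing the sampled volume raises the expected count under both hypotheses and therefore raises the power to detect a noncompliant discharge, which is why a rigorous lower limit on sampling volume is needed.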
The Condition of Education, 1990. Volume 2: Postsecondary Education.
ERIC Educational Resources Information Center
Alsalam, Nabeel, Ed.; Rogers, Gayle Thompson, Ed.
The National Center for Education Statistics' annual statistical report on the condition of education in the United States is presented in two volumes for 1990. This volume covers postsecondary education, while the first volume addresses elementary and secondary education. Condition of education indicators (CEIs)--key data that measure the health…
Bernick, Charles; Banks, Sarah J; Shin, Wanyong; Obuchowski, Nancy; Butler, Sam; Noback, Michael; Phillips, Michael; Lowe, Mark; Jones, Stephen; Modic, Michael
2015-01-01
Objectives Cumulative head trauma may alter brain structure and function. We explored the relationship between exposure variables, cognition and MRI brain structural measures in a cohort of professional combatants. Methods 224 fighters (131 mixed martial arts fighters and 93 boxers) participating in the Professional Fighters Brain Health Study, a longitudinal cohort study of licensed professional combatants, were recruited, as were 22 controls. Each participant underwent computerised cognitive testing and volumetric brain MRI. Fighting history including years of fighting and fights per year was obtained from self-report and published records. Statistical analyses of the baseline evaluations were applied cross-sectionally to determine the relationship between fight exposure variables and volumes of the hippocampus, amygdala, thalamus, caudate, and putamen. Moreover, the relationships of exposure and brain volumes with cognitive function were assessed. Results Increasing exposure to repetitive head trauma measured by number of professional fights, years of fighting, or a Fight Exposure Score (FES) was associated with lower brain volumes, particularly the thalamus and caudate. In addition, speed of processing decreased with decreased thalamic volumes and with increasing fight exposure. Higher scores on a FES used to reflect exposure to repetitive head trauma were associated with greater likelihood of having cognitive impairment. Conclusions Greater exposure to repetitive head trauma is associated with lower brain volumes and lower processing speed in active professional fighters. PMID:25633832
Automated objective characterization of visual field defects in 3D
NASA Technical Reports Server (NTRS)
Fink, Wolfgang (Inventor)
2006-01-01
A method and apparatus for electronically performing a visual field test for a patient. A visual field test pattern is displayed to the patient on an electronic display device and the patient's responses to the visual field test pattern are recorded. A visual field representation is generated from the patient's responses. The visual field representation is then used as an input into a variety of automated diagnostic processes. In one process, the visual field representation is used to generate a statistical description of the rapidity of change of a patient's visual field at the boundary of a visual field defect. In another process, the area of a visual field defect is calculated using the visual field representation. In another process, the visual field representation is used to generate a statistical description of the volume of a patient's visual field defect.
The Condition of Education, 1990. Volume 1: Elementary and Secondary Education.
ERIC Educational Resources Information Center
Ogle, Laurence T., Ed.; Alsalam, Nabeel, Ed.
This is the first of two volumes of the National Center for Education Statistics' annual statistical report on the condition of education in the United States for 1990. This volume addresses elementary and secondary education, while the second volume covers postsecondary education (PE). Condition of education indicators (CEIs)--key data that…
Automatic identification of bacterial types using statistical imaging methods
NASA Astrophysics Data System (ADS)
Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon
2003-05-01
The objective of the current study is to develop an automatic tool to identify bacterial types using computer-vision and statistical modeling techniques. Bacteriophage (phage)-typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology is enabling the increase in the number of phages used for typing. The statistical methodology presented in this work provides for an automated, objective and robust analysis of visual data, along with the ability to cope with increasing data volumes.
IVHS Countermeasures for Rear-End Collisions, Task 1; Vol. II: Statistical Analysis
DOT National Transportation Integrated Search
1994-02-25
This report is from the NHTSA sponsored program, "IVHS Countermeasures for Rear-End Collisions". This Volume, Volume II, Statistical Analysis, presents the statistical analysis of rear-end collision accident data that characterizes the accidents with...
Forest statistics for Arkansas' Ouachita counties - 1988
F. Dee Hines
1988-01-01
Tabulated results were derived from data obtained during a recent inventory of 10 counties comprising the Ouachita Unit of Arkansas. Data on forest acreage and timber volume were secured by a three-step process. A forest-nonforest classification using aerial photographs was accomplished for points representing approximately 230 acres. These photo classifications were...
Forest statistics for Southwest Mississippi counties - 1987
John F. Kelly; F. Dee Hines
1987-01-01
The tables in this report were derived from data obtained from an inventory of 11 counties comprising the Southwest Mississippi Unit (fig. 1). The data on forest acreage and timber volume were secured by a three-step process. A forest-nonforest classification was accomplished on aerial photographs for points representing approximately 230 acres. These photo...
Mesial Temporal Sclerosis: Accuracy of NeuroQuant versus Neuroradiologist.
Azab, M; Carone, M; Ying, S H; Yousem, D M
2015-08-01
We sought to compare the accuracy of a volumetric fully automated computer assessment of hippocampal volume asymmetry versus neuroradiologists' interpretations of the temporal lobes for mesial temporal sclerosis. Detecting mesial temporal sclerosis (MTS) is important for the evaluation of patients with temporal lobe epilepsy as it often guides surgical intervention. One feature of MTS is hippocampal volume loss. Electronic medical record and researcher reports of scans of patients with proved mesial temporal sclerosis were compared with volumetric assessment using an FDA-approved software package, NeuroQuant, for detection of mesial temporal sclerosis in 63 patients. The degree of volumetric asymmetry was analyzed to determine the neuroradiologists' threshold for detecting right-left asymmetry in temporal lobe volumes. Thirty-six patients had left-lateralized MTS, 25 had right-lateralized MTS, and 2 had bilateral MTS. The estimated accuracy of the neuroradiologist was 72.6% with a κ statistic of 0.512 (95% CI, 0.315-0.710; moderate agreement, P < 3 × 10⁻⁶), whereas the estimated accuracy of NeuroQuant was 79.4% with a κ statistic of 0.588 (95% CI, 0.388-0.787; moderate agreement, P < 2 × 10⁻⁶). This discrepancy in accuracy was not statistically significant. When at least a 5%-10% volume discrepancy between temporal lobes was present, the neuroradiologists detected it 75%-80% of the time. As a stand-alone fully automated software program that can process temporal lobe volume in 5-10 minutes, NeuroQuant compares favorably with trained neuroradiologists in predicting the side of mesial temporal sclerosis. Neuroradiologists can often detect even small temporal lobe volumetric changes visually. © 2015 by American Journal of Neuroradiology.
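For reference, the κ statistic quoted above measures chance-corrected agreement. A minimal sketch of computing accuracy and Cohen's kappa against known laterality is shown below; the laterality calls are made-up examples, not the study's data.

```python
import numpy as np

def cohens_kappa(truth, rated):
    """Accuracy and Cohen's kappa for categorical agreement with ground truth."""
    truth, rated = np.asarray(truth), np.asarray(rated)
    categories = np.unique(np.concatenate([truth, rated]))
    p_observed = np.mean(truth == rated)
    p_expected = sum(np.mean(truth == c) * np.mean(rated == c) for c in categories)
    kappa = (p_observed - p_expected) / (1.0 - p_expected)
    return p_observed, kappa

# Hypothetical laterality calls: L = left, R = right MTS
truth = ["L", "L", "R", "L", "R", "R", "L", "R"]
calls = ["L", "R", "R", "L", "R", "L", "L", "R"]
print(cohens_kappa(truth, calls))   # (accuracy, kappa)
```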
ERIC Educational Resources Information Center
Byrne, Eileen M.
This volume is to be used in conjunction with volume I (Final Research Report) of the Women in Science and Technology in Australia (WISTA) research project. This document contains the main statistical tables of grade 12 and higher education enrollments used as the basis for the statistical element of the WISTA research report. The document is…
NASA Technical Reports Server (NTRS)
Seasholtz, R. G.
1977-01-01
A laser Doppler velocimeter (LDV) built for use in the Lewis Research Center's turbine stator cascade facilities is described. The signal processing and self-contained data processing are based on a computing counter. A procedure is given for mode matching the laser to the probe volume. An analysis is presented of biasing errors that were observed in turbulent flow when the mean flow was not normal to the fringes.
United States Air Force Summer Faculty Research Program (1987). Program Technical Report. Volume 2.
1987-12-01
the area of statistical inference, distribution theory and stochastic processes. I have taught courses in random processes and sample functions...controlled phase separation of isotropic, binary mixtures, the theory of spinodal decomposition has been developed by Cahn and Hilliard [5,6]. This theory is...peak and its initial rate of growth at a given temperature are predicted by the spinodal theory. The angle of maximum intensity is then determined by
Physical properties of wild mango fruit and nut
NASA Astrophysics Data System (ADS)
Ehiem, J.; Simonyan, K.
2012-02-01
Physical properties of two wild mango varieties were studied at 81.9 and 24.5% moisture content (w.b.) for the fruits and nuts, respectively. The shape and size of the fruits of the two varieties are the same, while those of the nuts differ at P = 0.05. The mass, density, and bulk density of the fruits are statistically different at P = 0.05, but the volume is the same. The shape, size, volume, and bulk density of the nuts are statistically the same at P = 0.05. The nuts of both varieties are also the same at P = 0.05 in terms of mass and density. The packing factor for both the fruits and nuts of the two varieties is the same at 0.95. The relevant data obtained for the two varieties would be useful for the design and development of machines and equipment for processing and handling operations.
Lifelong Learning NCES Task Force: Final Report, Volume I. Working Paper Series.
ERIC Educational Resources Information Center
Binkley, Marilyn; Hudson, Lisa; Knepper, Paula; Kolstad, Andy; Stowe, Peter; Wirt, John
In September 1998, the National Center for Education Statistics (NCES) established a 1-year task force to review the NCES's role concerning lifelong learning. The eight-member task force established a working definition of lifelong learning ("a process or system through which individuals are able and willing to learn at all stages of life,…
Forest statistics for Mississippi counties - 1987
Bryan L. Donner; F. Dee Hines
1987-01-01
The tables and figures in this report were derived from data obtained through a multi-resource inventory of 82 counties and five survey regions comprising the state of Mississippi (fig. 1). The data on forest acreage and timber volume were secured by a three-step process. First, a forest-non-forest classification was accomplished on aerial photographs for points...
Mato Abad, Virginia; Quirós, Alicia; García-Álvarez, Roberto; Loureiro, Javier Pereira; Alvarez-Linera, Juan; Frank, Ana; Hernández-Tamames, Juan Antonio
2014-01-01
1H-MRS variability increases due to normal aging and also as a result of atrophy in grey and white matter caused by neurodegeneration. In this work, an automatic process was developed to integrate data from spectra and high-resolution anatomical images to quantify metabolites, taking into account tissue partial volumes within the voxel of interest and avoiding the additional spectra acquisitions otherwise required for partial volume correction. To evaluate this method, we used a cohort of 135 subjects (47 male and 88 female, aged between 57 and 99 years) classified into 4 groups: 38 healthy participants, 20 amnesic mild cognitive impairment patients, 22 multi-domain mild cognitive impairment patients, and 55 Alzheimer's disease patients. Our findings suggest that knowing the voxel composition of white and grey matter and cerebrospinal fluid is necessary to avoid partial volume variations in a single-voxel study and to decrease part of the variability found in metabolite quantification, particularly in studies involving elderly patients and neurodegenerative diseases. The proposed method facilitates the use of 1H-MRS techniques in statistical studies in Alzheimer's disease, because it provides more accurate quantitative measurements, reduces the inter-subject variability, and improves statistical results when performing group comparisons.
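One simple form of the partial-volume adjustment discussed above is to rescale the observed metabolite concentration by the non-CSF fraction of the spectroscopy voxel. The sketch below shows that minimal correction under the assumption that metabolites reside only in grey and white matter; the authors' full tissue-dependent correction may differ, and the numbers are illustrative.

```python
def csf_corrected_concentration(observed_conc, f_gm, f_wm, f_csf):
    """Simple partial-volume adjustment of a metabolite concentration.

    Metabolites are assumed to reside only in grey and white matter, so the
    observed value is rescaled by the non-CSF fraction of the voxel.
    The tissue fractions come from segmenting the anatomical image within
    the spectroscopy voxel and should sum to approximately 1.
    """
    assert abs((f_gm + f_wm + f_csf) - 1.0) < 0.05, "fractions should sum to ~1"
    return observed_conc / (1.0 - f_csf)

# Example: NAA estimate of 8.0 (institutional units) in a voxel that is 20% CSF
print(csf_corrected_concentration(8.0, f_gm=0.45, f_wm=0.35, f_csf=0.20))
```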
Robust statistical reconstruction for charged particle tomography
Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W
2013-10-08
Systems and methods for charged particle detection including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data to determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of object volume scattering density using a maximum likelihood/expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
Luque-Oliveros, Manuel; Garcia-Carpintero, Maria Angeles; Cauli, Omar
2017-01-01
Patients undergoing cardiac surgery with extracorporeal circulation (ECC) frequently present haemorrhages as a complication associated with high morbidity and mortality. One of the factors that influences this risk is the volume of blood infused during surgery. The objective of this study was to determine the optimal volume of autologous blood that can be processed during cardiac surgery with ECC. We also determined the number of salvaged red blood cells to be reinfused into the patient in order to minimize the risk of haemorrhage in the postoperative period. This was an observational retrospective cross-sectional study performed in 162 ECC cardiac surgery patients. Data regarding the sociodemographic profiles of the patients, their pathologies and surgical treatments, and the blood volume recovered, processed, and reinfused after cell salvage were collected. We also evaluated the occurrence of postoperative haemorrhage. The volume of blood infused after cell salvage had a statistically significant effect (p < 0.01) on the risk of post-operative haemorrhage; the receiver operating characteristic sensitivity was 0.813 and the optimal blood volume cut-off was 1800 ml. The best clinical outcome (16.7% of patients presenting haemorrhages) was in patients that had received less than 1800 ml of recovered and processed autologous blood, which represented a volume of up to 580 ml reinfused red blood cells. The optimum thresholds for autologous processed blood and red blood cells reinfused into the patient were 1800 and 580 ml, respectively. Increasing these thresholds augmented the risk of haemorrhage as an immediate postoperative period complication. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Bregant, Tina; Rados, Milan; Vasung, Lana; Derganc, Metka; Evans, Alan C; Neubauer, David; Kostovic, Ivica
2013-11-01
A severe form of perinatal hypoxic-ischaemic encephalopathy (HIE) carries a high risk of perinatal death and severe neurological sequelae, whereas in mild HIE only discrete cognitive disorders may occur. To compare total brain volumes and region-specific cortical measurements between young adults with mild-moderate perinatal HIE and a healthy control group of the same age. MR imaging was performed in a cohort of 14 young adults (9 males, 5 females) with a history of mild or moderate perinatal HIE. The control group consisted of healthy participants, matched with the HIE group by age and gender. Volumetric analysis was done after the processing of MR images using a fully automated CIVET pipeline. We measured gyrification indexes, total brain volume, volume of grey and white matter, and of cerebrospinal fluid. We also measured volume, thickness and area of the cerebral cortex in the parietal, occipital, frontal, and temporal lobe, and of the isthmus cinguli, parahippocampal and cingulate gyrus, and insula. The HIE patient group showed smaller absolute volumetric data. Statistically significant (p < 0.05) reductions of gyrification index in the right hemisphere, of cortical areas in the right temporal lobe and parahippocampal gyrus, of cortical volumes in the right temporal lobe and of cortical thickness in the right isthmus of the cingulate gyrus were found. Comparison between the healthy group and the HIE group of the same gender showed statistically significant changes in the male HIE patients, where a significant reduction was found in whole brain volume; left parietal, bilateral temporal, and right parahippocampal gyrus cortical areas; and bilateral temporal lobe cortical volume. Our analysis of total brain volumes and region-specific corticometric parameters suggests that mild-moderate forms of perinatal HIE lead to reductions in whole brain volumes. In this study, reductions were most pronounced in the temporal lobe and parahippocampal gyrus. Copyright © 2013 European Paediatric Neurology Society. All rights reserved.
Du, Yiping P; Jin, Zhaoyang
2009-10-01
To develop a robust algorithm for tissue-air segmentation in magnetic resonance imaging (MRI) using the statistics of phase and magnitude of the images. A multivariate measure based on the statistics of phase and magnitude was constructed for tissue-air volume segmentation. The standard deviation of first-order phase difference and the standard deviation of magnitude were calculated in a 3 x 3 x 3 kernel in the image domain. To improve differentiation accuracy, the uniformity of phase distribution in the kernel was also calculated and linear background phase introduced by field inhomogeneity was corrected. The effectiveness of the proposed volume segmentation technique was compared to a conventional approach that uses the magnitude data alone. The proposed algorithm was shown to be more effective and robust in volume segmentation in both synthetic phantom and susceptibility-weighted images of human brain. Using our proposed volume segmentation method, veins in the peripheral regions of the brain were well depicted in the minimum-intensity projection of the susceptibility-weighted images. Using the additional statistics of phase, tissue-air volume segmentation can be substantially improved compared to that using the statistics of magnitude data alone. (c) 2009 Wiley-Liss, Inc.
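The abstract above describes a multivariate tissue-air measure built from local phase and magnitude statistics. The sketch below is a strongly simplified stand-in, assuming a complex-valued 3D volume and combining the local mean magnitude with the local standard deviation of the wrapped first-order phase difference; the published multivariate measure, its phase-uniformity term, and the background-phase correction are not reproduced, and the thresholds are arbitrary.

```python
import numpy as np
from scipy.ndimage import uniform_filter, generic_filter

def local_std(vol, size=3):
    # Standard deviation within a size**3 neighbourhood of each voxel.
    return generic_filter(vol, np.std, size=size, mode="nearest")

def tissue_air_mask(complex_img):
    """Rough tissue/air segmentation of a 3D complex MRI volume (illustrative only).

    Air voxels contain only noise, so their phase is essentially random and the
    local std of the first-order phase difference is high, while their magnitude
    is low.  Tissue shows the opposite pattern.
    """
    mag = np.abs(complex_img)
    phase = np.angle(complex_img)

    # Wrapped first-order phase difference along one axis.
    dphi = np.angle(np.exp(1j * np.diff(phase, axis=0, append=phase[-1:])))

    mag_smooth = uniform_filter(mag, size=3)     # local mean magnitude
    phase_spread = local_std(dphi, size=3)       # local phase-difference std

    mag_score = mag_smooth / mag_smooth.max()
    phase_score = phase_spread / phase_spread.max()

    # Tissue: relatively bright and phase-coherent (thresholds are arbitrary).
    return (mag_score > 0.2) & (phase_score < 0.7)

# Synthetic example: a bright, phase-coherent cube inside complex Gaussian noise.
rng = np.random.default_rng(1)
vol = rng.normal(0, 1, (32, 32, 32)) + 1j * rng.normal(0, 1, (32, 32, 32))
vol[8:24, 8:24, 8:24] += 20 * np.exp(1j * 0.3)
print("tissue fraction:", tissue_air_mask(vol).mean())
```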
NASA Technical Reports Server (NTRS)
Merril, R. B.
1977-01-01
Solar system processes are considered along with the origin and evolution of the moon, planetary geophysics, lunar basins and crustal layering, lunar magnetism, the lunar surface as a planetary probe, remote observations of lunar and planetary surfaces, earth-based measurements, integrated studies, physical properties of lunar materials, and asteroids, meteorites, and the early solar system. Attention is also given to studies of mare basalts, the kinetics of basalt crystallization, topical studies of mare basalts, highland rocks, experimental studies of highland rocks, geochemical studies of highland rocks, studies of materials of KREEP composition, a consortium study of lunar breccia 73215, topical studies on highland rocks, Venus, and regional studies of the moon. Studies of surface processes are reported, taking into account cratering mechanics and fresh crater morphology, crater statistics and surface dating, effects of exposure and gardening, and the chemistry of surfaces.
NASA Technical Reports Server (NTRS)
Ruedger, W. H.; Aanstoos, J. V.; Snyder, W. E.
1982-01-01
The NASA NEEDS program goals present a requirement for on-board signal processing to achieve user-compatible, information-adaptive data acquisition. This volume addresses the impact of data set selection on the data formatting required for efficient telemetering of the acquired satellite sensor data. More specifically, the FILE algorithm developed by Martin Marietta provides a means for determining which pixels from the data stream should be retained, effecting an improvement in the achievable system throughput. It will be seen that, because cloud cover lacks statistical stationarity in its spatial distribution, periods exist where data acquisition rates exceed the throughput capability. The study therefore addresses various approaches to data compression and truncation as applicable to this sensor mission.
Grassi, Hilda Cristina; García, Lisbette C; Lobo-Sulbarán, María Lorena; Velásquez, Ana; Andrades-Grassi, Francisco A; Cabrera, Humberto; Andrades-Grassi, Jesús E; Andrades, Efrén D J
2016-12-01
In this paper we report a quantitative laser Biospeckle method using VDRL plates to monitor the activity of Trypanosoma cruzi, together with the calibration conditions, including three image processing algorithms and three programs (ImageJ and two programs designed in this work). Benznidazole was used as a test drug. Variable volume (constant density) and variable density (constant volume) were used for the quantitative evaluation of parasite activity in calibrated wells of the VDRL plate. The desiccation process within the well was monitored as a function of volume and of the activity of the Biospeckle pattern of the parasites, as well as the quantitative effect of the surface parasite quantity (proportion of the object plane). A statistical analysis was performed with ANOVA, Tukey post hoc tests, and descriptive statistics using R and R Commander. Conditions of volume (100 μl), parasite density (2-4 × 10^4 parasites/well, in exponential growth phase), assay time (up to 204 min), frame number (11 frames), and algorithm and program (R Commander/SAGA) for image processing were selected to test the effect of variable concentrations of benznidazole (0.0195 to 20 μg/mL; 0.075 to 76.8 μM) at various times (1, 61, 128, and 204 min) on the activity of the Biospeckle pattern. The flat wells of the VDRL plate were found to be suitable for the quantitative calibration of the activity of Trypanosoma cruzi using the appropriate algorithm and program. Under these conditions, benznidazole produces at 1 min an instantaneous effect on the activity of the Biospeckle pattern of T. cruzi, which remains with a similar profile for up to 1 hour. A second effect, which is dependent on concentrations above 1.25 μg/mL and is statistically different from the effect at lower concentrations, causes a decrease in the activity of the Biospeckle pattern. This effect is better detected after 1 hour of drug action. This behavior may be explained by an instantaneous effect on a membrane protein of Trypanosoma cruzi that could mediate the translocation of benznidazole. At longer times, the effect may be explained by the required transformation of the pro-drug into the active drug.
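The study above ran its ANOVA and Tukey post hoc tests in R/R Commander; the snippet below sketches the equivalent analysis in Python (SciPy/statsmodels) on made-up Biospeckle activity values, purely to illustrate the statistical workflow.

```python
import numpy as np
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical Biospeckle activity readings (arbitrary units) for four
# benznidazole concentrations; all values are invented for illustration.
rng = np.random.default_rng(0)
groups = {
    "0 ug/mL": rng.normal(1.00, 0.05, 12),
    "0.02 ug/mL": rng.normal(0.98, 0.05, 12),
    "1.25 ug/mL": rng.normal(0.90, 0.05, 12),
    "20 ug/mL": rng.normal(0.75, 0.05, 12),
}

# One-way ANOVA across concentration groups.
f_stat, p_val = f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4g}")

# Tukey HSD post hoc comparisons between every pair of concentrations.
df = pd.DataFrame(
    [(conc, y) for conc, ys in groups.items() for y in ys],
    columns=["concentration", "activity"],
)
print(pairwise_tukeyhsd(df["activity"], df["concentration"]).summary())
```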
Lee, Sun Jin; Chong, Semin; Kang, Kyung Ho; Hur, Joonho; Hong, Byung-Woo; Kim, Hyun Jung; Kim, Soo Jin
2014-11-01
The objective of our study was to measure thyroid volumes using semiautomated 3D CT and to compare the 3D CT volumes with volumes measured using 2D ultrasound, 2D CT, and the water displacement method. In 47 patients, 2D ultrasound volumes and 2D CT volumes of the thyroid gland were estimated using the ellipsoid volume formula, and 3D CT volumes were calculated using semiautomated reconstructive techniques. All volume data were compared with thyroid specimen volumes obtained using the water displacement method and were statistically analyzed using the one-way ANOVA, the Pearson correlation coefficient (R), linear regression, and the concordance correlation coefficient (CCC). The processing time of semiautomated 3D CT thyroid volumetry was measured. The paired mean differences ± SD between the three imaging-determined volumes and the specimen volumes were 0.8 ± 3.1 mL for 2D ultrasound, 4.0 ± 4.7 mL for 2D CT, and 0.2 ± 2.5 mL for 3D CT. A significant difference in the mean thyroid volume was found between 2D CT and specimen volumes (p = 0.016) compared with the other pairs (p = 0.937 for 2D ultrasound mean volume vs specimen mean volume, and p = 0.999 for 3D CT mean volume vs specimen mean volume). Between specimen volume and 2D ultrasound volume, specimen volume and 2D CT volume, and specimen volume and 3D CT volume, R values were 0.885, 0.724, and 0.929, respectively, and CCC values were 0.876, 0.598, and 0.925, respectively. The mean processing time of semiautomated 3D CT thyroid volumetry was 7.0 minutes. Thyroid volumes measured using 2D ultrasound or semiautomated 3D CT are substantially close to thyroid specimen volumes measured using the water displacement method. Semiautomated 3D CT thyroid volumetry can provide a more reliable measure of thyroid volume than 2D ultrasound.
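A small sketch of the two quantities named above: the ellipsoid volume estimate used for 2D ultrasound/CT (V = π/6 × L × W × D; the exact correction factor varies between studies) and Lin's concordance correlation coefficient. The measurements in the example are invented.

```python
import numpy as np

def ellipsoid_volume(length_cm, width_cm, depth_cm):
    """Ellipsoid approximation for 2D thyroid volumetry:
    V = pi/6 * L * W * D (mL when the three diameters are in cm)."""
    return np.pi / 6.0 * length_cm * width_cm * depth_cm

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient (CCC) between two volume series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical per-lobe diameters (cm) and reference water-displacement volumes (mL).
lengths, widths, depths = [4.8, 5.2, 4.5], [1.9, 2.1, 1.8], [1.7, 2.0, 1.6]
us_volumes = [ellipsoid_volume(L, W, D) for L, W, D in zip(lengths, widths, depths)]
specimen_volumes = [8.3, 11.6, 7.1]

print("ellipsoid volumes (mL):", np.round(us_volumes, 1))
print("Pearson R:", round(np.corrcoef(us_volumes, specimen_volumes)[0, 1], 3))
print("CCC      :", round(concordance_correlation(us_volumes, specimen_volumes), 3))
```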
Forest statistics of western Kentucky
The Forest Survey Organization Central States Forest Experiment Station
1950-01-01
This Survey Release presents the more significant preliminary statistics on the forest area and timber volume for the western region of Kentucky. Similar reports for the remainder of the state will be published as soon as statistical tabulations are completed. Later, an analytical report for the state will be published which will interpret forest area, timber volume,...
Forest statistics of southern Indiana
The Forest Survey Organization Central States Forest Experiment Station
1951-01-01
This Survey Release presents the more significant preliminary statistics on the forest area and timber volume for each of the three regions of southern Indiana. A similar report will be published for the two northern Indiana regions. Later, an analytical report for the state will be published which will interpret statistics on forest area, timber volume, growth, and...
Minerals Yearbook, volume II, Area Reports—Domestic
,
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Minerals Yearbook, volume I, Metals and Minerals
,
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Minerals Yearbook, volume III, Area Reports—International
,
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
A trait-based test for habitat filtering: Convex hull volume
Cornwell, W.K.; Schwilk, D.W.; Ackerly, D.D.
2006-01-01
Community assembly theory suggests that two processes affect the distribution of trait values within communities: competition and habitat filtering. Within a local community, competition leads to ecological differentiation of coexisting species, while habitat filtering reduces the spread of trait values, reflecting shared ecological tolerances. Many statistical tests for the effects of competition exist in the literature, but measures of habitat filtering are less well-developed. Here, we present convex hull volume, a construct from computational geometry, which provides an n-dimensional measure of the volume of trait space occupied by species in a community. Combined with ecological null models, this measure offers a useful test for habitat filtering. We use convex hull volume and a null model to analyze California woody-plant trait and community data. Our results show that observed plant communities occupy less trait space than expected from random assembly, a result consistent with habitat filtering. © 2006 by the Ecological Society of America.
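A minimal illustration of convex hull volume as a habitat-filtering test, assuming a species-by-trait matrix and a simple random-draw null model; SciPy's ConvexHull provides the n-dimensional hull volume. The trait data and community definition are synthetic.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_volume(traits):
    """Convex hull volume of an (n_species x n_traits) trait matrix."""
    return ConvexHull(traits).volume

# Hypothetical regional species pool: 60 species x 3 standardized traits.
rng = np.random.default_rng(42)
pool = rng.normal(size=(60, 3))

# Observed community: 15 species drawn from a restricted part of trait space,
# mimicking habitat filtering.
observed = pool[pool[:, 0] < 0][:15]
obs_vol = hull_volume(observed)

# Null model: communities of equal richness assembled at random from the pool.
null_vols = np.array([
    hull_volume(pool[rng.choice(len(pool), size=len(observed), replace=False)])
    for _ in range(999)
])

# Small p indicates the observed hull is smaller than expected by chance.
p_value = (np.sum(null_vols <= obs_vol) + 1) / (len(null_vols) + 1)
print(f"observed hull volume = {obs_vol:.2f}, "
      f"null mean = {null_vols.mean():.2f}, P(null <= observed) = {p_value:.3f}")
```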
Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.
2016-01-01
The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/. PMID:27375472
Cargo/Logistics Airlift System Study (CLASS), Volume 1
NASA Technical Reports Server (NTRS)
Norman, J. M.; Henderson, R. D.; Macey, F. C.; Tuttle, R. P.
1978-01-01
Current and advanced air cargo systems are evaluated using industrial and consumer statistics. Market and commodity characteristics that influence the use of the air mode are discussed along with a comparison of air and surface mode on typical routes. Results of on-site surveys of cargo processing facilities at airports are presented, and institutional controls and influences on air cargo operations are considered.
ERIC Educational Resources Information Center
Grebler, Leo; And Others
This preliminary report describes that phase of the UCLA Mexican-American Study Project which concerns the immigration process of Mexicans to the United States. Statistics are presented about (1) the volume of immigration over the years, (2) the socio-economic characteristics of immigrating Mexicans, (3) the geographic distribution of migrants…
Quantitative analysis of the renal aging in rats. Stereological study.
Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins, Eduardo Lopes; Fraga, Rogério de
2016-05-01
To evaluate renal function and renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. Seventy-two Wistar rats were divided into six groups, and each group was sacrificed at a different age: 3, 6, 9, 12, 18, and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glom]) of the renal glomeruli, and mean glomerular volume (Vol[glom])); renal function was also evaluated by measuring serum creatinine and urea. There was a significant decrease of renal function in the oldest rats. The renal volume increased gradually during the development of the rats, with the largest values registered in the group of animals at 12 months of age and a significant progressive decrease in older animals. Vv[glom] presented a statistically significant gradual reduction between the groups, and Nv[glom] also decreased significantly. Renal function proved to be inferior in senile rats when compared to young rats. The morphometric and stereological analysis evidenced renal atrophy and a gradual reduction of the volume density and numerical density of the renal glomeruli associated with the aging process.
Recurrence interval analysis of trading volumes
NASA Astrophysics Data System (ADS)
Ren, Fei; Zhou, Wei-Xing
2010-06-01
We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q . The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
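A compact sketch of the recurrence-interval calculation described above: intervals between volumes exceeding a threshold q, plus a Hill-type estimate of a power-law tail exponent. The synthetic volume series is i.i.d. and therefore lacks the memory effects reported in the paper; it only illustrates the mechanics.

```python
import numpy as np

def recurrence_intervals(volume, q):
    """Intervals (in trading periods) between successive volumes exceeding threshold q."""
    idx = np.flatnonzero(volume > q)
    return np.diff(idx)

def tail_exponent(intervals, tau_min):
    """Hill (maximum-likelihood) estimate of the exponent alpha in
    P(tau) ~ tau**(-alpha), using only intervals >= tau_min."""
    tail = intervals[intervals >= tau_min].astype(float)
    return 1.0 + len(tail) / np.sum(np.log(tail / tau_min))

# Synthetic daily trading volumes (illustration only; no volatility clustering).
rng = np.random.default_rng(7)
volumes = rng.lognormal(mean=0.0, sigma=1.0, size=20000)

q = np.quantile(volumes, 0.95)          # threshold: top 5% of volumes
tau = recurrence_intervals(volumes, q)

print("mean recurrence interval:", round(tau.mean(), 2))
print("estimated tail exponent alpha:", round(tail_exponent(tau, tau_min=20), 2))
```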
Jimenez-Jimenez, E; Mateos, P; Aymar, N; Roncero, R; Ortiz, I; Gimenez, M; Pardo, J; Salinas, J; Sabater, S
2018-05-02
Evidence supporting the use of 18F-FDG-PET/CT in the segmentation process of oesophageal cancer for radiotherapy planning is limited. Our aim was to compare the volumes and tumour lengths defined by fused PET/CT vs. CT simulation. Twenty-nine patients were analyzed. All patients underwent a single PET/CT simulation scan. Two separate GTVs were defined: one based on CT data alone and another based on fused PET/CT data. Volume sizes for both data sets were compared and the spatial overlap was assessed by the Dice similarity coefficient (DSC). The gross tumour volume (GTVtumour) and maximum tumour diameter were greater by PET/CT, and length of primary tumour was greater by CT, but differences were not statistically significant. However, the gross node volume (GTVnode) was significantly greater by PET/CT. The DSC analysis showed excellent agreement for GTVtumour, 0.72, but was very low for GTVnode, 0.25. Our study shows that the volume definition by PET/CT and CT data differs. CT simulation, without taking into account PET/CT information, might leave cancer-involved nodes out of the radiotherapy-delineated volumes.
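The Dice similarity coefficient (DSC) used above for spatial overlap is straightforward to compute from binary masks; a toy sketch with hypothetical CT-only and PET/CT GTV masks follows.

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary volumes:
    DSC = 2 * |A intersect B| / (|A| + |B|)."""
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy example: two overlapping "GTV" masks on a small voxel grid.
ct_gtv = np.zeros((40, 40, 40), bool)
petct_gtv = np.zeros_like(ct_gtv)
ct_gtv[10:26, 10:26, 10:26] = True        # GTV delineated on CT alone
petct_gtv[13:29, 12:28, 10:26] = True     # GTV delineated on fused PET/CT

print(f"DSC = {dice_coefficient(ct_gtv, petct_gtv):.2f}")
```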
The psychophysiology of real-time financial risk processing.
Lo, Andrew W; Repin, Dmitry V
2002-04-01
A longstanding controversy in economics and finance is whether financial markets are governed by rational forces or by emotional responses. We study the importance of emotion in the decision-making process of professional securities traders by measuring their physiological characteristics (e.g., skin conductance, blood volume pulse, etc.) during live trading sessions while simultaneously capturing real-time prices from which market events can be detected. In a sample of 10 traders, we find statistically significant differences in mean electrodermal responses during transient market events relative to no-event control periods, and statistically significant mean changes in cardiovascular variables during periods of heightened market volatility relative to normal-volatility control periods. We also observe significant differences in these physiological responses across the 10 traders that may be systematically related to the traders' levels of experience.
NASA historical data book. Volume 4: NASA resources 1969-1978
NASA Technical Reports Server (NTRS)
Gawdiak, Ihor Y.; Fedor, Helen
1994-01-01
This is Volume 4, NASA Resources 1969-1978, of a series providing a 20-year statistical summary of NASA programs. This series is an important component of NASA published historical reference works, used by NASA personnel, managers, external researchers, and other government agencies. This volume combines statistical data of the component facilities with the data of the parent installation.
ERIC Educational Resources Information Center
Reshad, Rosalind S.
One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fifth volume details nationwide statistics on the employment status of minorities and women working in township governments. Data from 299 actual units of government in fourteen states were…
ERIC Educational Resources Information Center
Skinner, Alice W.
One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fourth volume details the employment status of minorities and women in municipal governments. Based on reports filed by 2,230 municipalities, statistics in this study are designed to…
Chen, Yantian; Bloemen, Veerle; Impens, Saartje; Moesen, Maarten; Luyten, Frank P; Schrooten, Jan
2011-12-01
Cell seeding into scaffolds plays a crucial role in the development of efficient bone tissue engineering constructs. Hence, it becomes imperative to identify the key factors that quantitatively predict reproducible and efficient seeding protocols. In this study, the optimization of a cell seeding process was investigated using design of experiments (DOE) statistical methods. Five seeding factors (cell type, scaffold type, seeding volume, seeding density, and seeding time) were selected and investigated by means of two response parameters, critically related to the cell seeding process: cell seeding efficiency (CSE) and cell-specific viability (CSV). In addition, cell spatial distribution (CSD) was analyzed by Live/Dead staining assays. Analysis identified a number of statistically significant main factor effects and interactions. Among the five seeding factors, only seeding volume and seeding time significantly affected CSE and CSV. Also, cell and scaffold type were involved in the interactions with other seeding factors. Within the investigated ranges, optimal conditions in terms of CSV and CSD were obtained when seeding cells in a regular scaffold with an excess of medium. The results of this case study contribute to a better understanding and definition of optimal process parameters for cell seeding. A DOE strategy can identify and optimize critical process variables to reduce the variability and assists in determining which variables should be carefully controlled during good manufacturing practice production to enable a clinically relevant implant.
Illinois forest statistics, 1985.
Jerold T. Hahn
1987-01-01
The third inventory of the timber resource of Illinois shows a 1% increase in commercial forest area and a 40% gain in growing-stock volume between 1962 and 1985. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Wisconsin forest statistics, 1983.
Gerhard K. Raile
1983-01-01
The fourth inventory of the timber resource of Wisconsin shows a 2% increase in commercial forest area and a 39% gain in growing-stock volume between 1968 and 1983. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Michigan forest statistics, 1980.
Gerhard K. Raile; W. Brad Smith
1983-01-01
The fourth inventory of the timber resource of Michigan shows a 7% decline in commercial forest area and a 27% gain in growing-stock volume between 1966 and 1980. Highlights and statistics are presented on area, volume, growth, mortality, removals, utilization, and biomass.
Plackett-Burman experimental design to facilitate syntactic foam development
Smith, Zachary D.; Keller, Jennie R.; Bello, Mollie; ...
2015-09-14
An eight-experiment Plackett–Burman design can assess six experimental variables and eight responses in a polysiloxane-glass microsphere syntactic foam. The approach aims to decrease the time required to develop a tunable polymer composite by identifying a reduced set of variables and responses suitable for future predictive modeling. The statistical design assesses the main effects of mixing process parameters, polymer matrix composition, microsphere density and volume loading, and the blending of two grades of microspheres, using a dummy factor in statistical calculations. Responses cover rheological, physical, thermal, and mechanical properties. The cure accelerator content of the polymer matrix and the volume loading of the microspheres have the largest effects on foam properties. These factors are the most suitable for controlling the gel point of the curing foam and the density of the cured foam. The mixing parameters introduce widespread variability and therefore should be fixed at effective levels during follow-up testing. Some responses may require greater contrast in microsphere-related factors. As a result, compared to other possible statistical approaches, the run economy of the Plackett–Burman method makes it a valuable tool for rapidly characterizing new foams.
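For reference, an 8-run Plackett-Burman design of the kind referred to above can be built from the standard cyclic generator (+ + + - + - -); the sketch below constructs it and estimates main effects from a response vector. The generator is the textbook construction, and the response values and factor assignment are invented, not those of the study.

```python
import numpy as np

def plackett_burman_8():
    """Standard 8-run Plackett-Burman design for up to 7 two-level factors.

    Built from the usual N = 8 generator row (+ + + - + - -) by cyclic shifts,
    with a final row of all minus signs.  Columns are mutually orthogonal.
    """
    gen = np.array([1, 1, 1, -1, 1, -1, -1])
    rows = [np.roll(gen, k) for k in range(7)]
    rows.append(-np.ones(7, dtype=int))
    design = np.array(rows)
    # Orthogonality check: X.T @ X equals 8 * I for a valid PB design.
    assert np.array_equal(design.T @ design, 8 * np.eye(7, dtype=int))
    return design

def main_effects(design, response):
    """Main effect of each column: mean response at +1 minus mean at -1."""
    return np.array([response[design[:, j] == 1].mean()
                     - response[design[:, j] == -1].mean()
                     for j in range(design.shape[1])])

X = plackett_burman_8()
# Hypothetical gel-time responses (minutes) for the 8 runs, for illustration.
y = np.array([42.0, 55.0, 47.0, 60.0, 39.0, 52.0, 44.0, 58.0])
print(X)
print("estimated main effects:", main_effects(X, y).round(1))
```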
Kulesz, Paulina A.; Tian, Siva; Juranek, Jenifer; Fletcher, Jack M.; Francis, David J.
2015-01-01
Objective: Weak structure-function relations for brain and behavior may stem from problems in estimating these relations in small clinical samples with frequently occurring outliers. In the current project, we focused on the utility of using alternative statistics to estimate these relations. Method: Fifty-four children with spina bifida meningomyelocele performed attention tasks and received MRI of the brain. Using a bootstrap sampling process, the Pearson product moment correlation was compared with four robust correlations: the percentage bend correlation, the Winsorized correlation, the skipped correlation using the Donoho-Gasko median, and the skipped correlation using the minimum volume ellipsoid estimator. Results: All methods yielded similar estimates of the relations between measures of brain volume and attention performance. The similarity of estimates across correlation methods suggested that the weak structure-function relations previously found in many studies are not readily attributable to the presence of outlying observations and other factors that violate the assumptions behind the Pearson correlation. Conclusions: Given the difficulty of assembling large samples for brain-behavior studies, estimating correlations using multiple, robust methods may enhance the statistical conclusion validity of studies yielding small, but often clinically significant, correlations. PMID:25495830
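Of the robust estimators listed above, the Winsorized correlation is the simplest to sketch: Winsorize each variable, then correlate the results. The example below uses SciPy; note that the reported p-value is the ordinary Pearson p-value rather than the adjusted test normally used for Winsorized data, and the brain-volume/attention values are simulated.

```python
import numpy as np
from scipy.stats import pearsonr
from scipy.stats.mstats import winsorize

def winsorized_correlation(x, y, proportion=0.2):
    """Winsorized correlation: Pearson correlation computed after replacing the
    most extreme `proportion` of values in each tail with the nearest retained value."""
    xw = np.asarray(winsorize(np.asarray(x, float), limits=(proportion, proportion)))
    yw = np.asarray(winsorize(np.asarray(y, float), limits=(proportion, proportion)))
    return pearsonr(xw, yw)

# Hypothetical brain-volume and attention scores with two gross outliers.
rng = np.random.default_rng(3)
volume = rng.normal(100, 10, 54)
attention = 0.3 * (volume - 100) + rng.normal(0, 8, 54)
volume[:2] += 60          # outlying observations
attention[:2] -= 40

r, p = pearsonr(volume, attention)
rw, pw = winsorized_correlation(volume, attention)
print(f"Pearson    r = {r:.3f} (p = {p:.3g})")
print(f"Winsorized r = {rw:.3f} (p = {pw:.3g})")
```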
Kansas forest statistics, 1981.
Gerhard K. Raile; John S. Spencer, Jr.
1984-01-01
The third inventory of the timber resources of Kansas shows a 1.4% increase in commercial forest area and a 42% gain in growing-stock volume between 1965 and 1981. Highlights and statistics are presented on area, volume, growth, mortality, removals, utilization and biomass.
Minerals Yearbook, volume III, Area Reports—International—Africa and the Middle East
,
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Minerals Yearbook, volume III, Area Reports—International—Asia and the Pacific
Geological Survey, U.S.
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Minerals Yearbook, volume III, Area Reports—International—Latin America and Canada
,
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Minerals Yearbook, volume III, Area Reports—International—Europe and Central Eurasia
Geological Survey, U.S.
2018-01-01
The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows: Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included. Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals. Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section. The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.
Cerebral gray matter volume in patients with chronic migraine: correlations with clinical features.
Coppola, Gianluca; Petolicchio, Barbara; Di Renzo, Antonio; Tinelli, Emanuele; Di Lorenzo, Cherubino; Parisi, Vincenzo; Serrao, Mariano; Calistri, Valentina; Tardioli, Stefano; Cartocci, Gaia; Ambrosini, Anna; Caramia, Francesca; Di Piero, Vittorio; Pierelli, Francesco
2017-12-08
To date, few MRI studies have been performed in patients affected by chronic migraine (CM), especially in those without medication overuse. Here, we performed magnetic resonance imaging (MRI) voxel-based morphometry (VBM) analyses to investigate the gray matter (GM) volume of the whole brain in patients affected by CM. Our aim was to investigate whether fluctuations in the GM volumes were related to the clinical features of CM. Twenty untreated patients with CM without a past medical history of medication overuse underwent 3-Tesla MRI scans and were compared to a group of 20 healthy controls (HCs). We used SPM12 and the CAT12 toolbox to process the MRI data and to perform VBM analyses of the structural T1-weighted MRI scans. The GM volume of patients was compared to that of HCs with various corrected and uncorrected thresholds. To check for possible correlations, patients' clinical features and GM maps were regressed. Initially, we did not find significant differences in the GM volume between patients with CM and HCs (p < 0.05 corrected for multiple comparisons). However, using more-liberal uncorrected statistical thresholds, we noted that compared to HCs, patients with CM exhibited clusters of regions with lower GM volumes including the cerebellum, left middle temporal gyrus, left temporal pole/amygdala/hippocampus/pallidum/orbitofrontal cortex, and left occipital areas (Brodmann areas 17/18). The GM volume of the cerebellar hemispheres was negatively correlated with the disease duration and positively correlated with the number of tablets taken per month. No gross morphometric changes were observed in patients with CM when compared with HCs. However, using more-liberal uncorrected statistical thresholds, we observed that CM is associated with subtle GM volume changes in several brain areas known to be involved in nociception/antinociception, multisensory integration, and analgesic dependence. We speculate that these slight morphometric impairments could lead, at least in a subgroup of patients, to the development and continuation of maladaptive acute medication usage.
The importance of topographically corrected null models for analyzing ecological point processes.
McDowall, Philip; Lynch, Heather J
2017-07-01
Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
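One way to build the "topographically corrected" null model described above is to weight each grid cell of a digital elevation model by its true surface area before placing points; below is a sketch under that assumption (synthetic terrain, uniform-on-surface point placement).

```python
import numpy as np

def surface_area_weights(dem, cell_size=1.0):
    """Per-cell ratio of true surface area to planimetric area for a gridded DEM:
    sqrt(1 + (dz/dx)**2 + (dz/dy)**2)."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.sqrt(1.0 + dzdx ** 2 + dzdy ** 2)

def simulate_csr_on_surface(dem, n_points, cell_size=1.0, rng=None):
    """Complete spatial randomness *on the terrain surface*: cells are chosen with
    probability proportional to their true surface area, then a point is placed
    uniformly within the chosen cell (x-y coordinates returned)."""
    rng = np.random.default_rng() if rng is None else rng
    w = surface_area_weights(dem, cell_size).ravel()
    cells = rng.choice(w.size, size=n_points, p=w / w.sum())
    rows, cols = np.unravel_index(cells, dem.shape)
    x = (cols + rng.random(n_points)) * cell_size
    y = (rows + rng.random(n_points)) * cell_size
    return np.column_stack([x, y])

# Hypothetical DEM: a steep ridge crossing an otherwise flat 100 x 100 grid.
xx, yy = np.meshgrid(np.arange(100), np.arange(100))
dem = 40.0 * np.exp(-((xx - 50) / 10.0) ** 2)

points = simulate_csr_on_surface(dem, n_points=500, rng=np.random.default_rng(0))
# Steep cells cover more true surface area, so they receive proportionally more
# points than a naive x-y uniform null model would place there.
print("points near the ridge (40 <= x < 60):",
      np.sum((points[:, 0] >= 40) & (points[:, 0] < 60)))
```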
Dose-mass inverse optimization for minimally moving thoracic lesions
NASA Astrophysics Data System (ADS)
Mihaylov, I. B.; Moros, E. G.
2015-05-01
In the past decade, several different radiotherapy treatment plan evaluation and optimization schemes have been proposed as viable approaches, aiming at dose escalation or increased healthy tissue sparing. In particular, it has been argued that dose-mass plan evaluation and treatment plan optimization might be viable alternatives to the standard of care, which is realized through dose-volume evaluation and optimization. The purpose of this investigation is to apply dose-mass optimization to a cohort of lung cancer patients and compare the achievable healthy tissue sparing to that achievable through dose-volume optimization. Fourteen non-small cell lung cancer (NSCLC) patient plans were studied retrospectively. The range of tumor motion was less than 0.5 cm, and motion management in the treatment planning process was not considered. For each case, dose-volume (DV)-based and dose-mass (DM)-based optimization was performed. Nine-field step-and-shoot IMRT was used, with all of the optimization parameters kept the same between DV and DM optimizations. Commonly used dosimetric indices (DIs), such as dose to 1% of the spinal cord volume, dose to 50% of the esophageal volume, and doses to 20 and 30% of healthy lung volumes, were used for cross-comparison. Similarly, mass-based indices (MIs), such as doses to 20 and 30% of healthy lung masses, 1% of spinal cord mass, and 33% of heart mass, were also tallied. Statistical equivalence tests were performed to quantify the findings for the entire patient cohort. Both DV and DM plans for each case were normalized such that 95% of the planning target volume received the prescribed dose. DM optimization resulted in more organ-at-risk (OAR) sparing than DV optimization. The average sparing of cord, heart, and esophagus was 23, 4, and 6%, respectively. For the majority of the DIs, DM optimization resulted in lower lung doses. On average, the doses to 20 and 30% of healthy lung were lower by approximately 3 and 4%, whereas lung volumes receiving 2000 and 3000 cGy were lower by 3 and 2%, respectively. The behavior of the MIs was very similar. The statistical analyses of the results again indicated better healthy anatomical structure sparing with DM optimization. The presented findings indicate that dose-mass-based optimization results in statistically significant OAR sparing as compared to dose-volume-based optimization for NSCLC. However, the sparing is case-dependent and it is not observed for all tallied dosimetric endpoints.
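The distinction above between dose-volume and dose-mass indices reduces, for evaluation purposes, to whether voxels are counted by volume or weighted by density when tallying how much of a structure exceeds a dose level. A toy sketch (synthetic dose and density arrays, not a planning-system calculation):

```python
import numpy as np

def volume_fraction_at_dose(dose, mask, threshold):
    """Fraction of the structure's *volume* receiving at least `threshold` dose."""
    d = dose[mask]
    return np.mean(d >= threshold)

def mass_fraction_at_dose(dose, density, mask, threshold):
    """Fraction of the structure's *mass* receiving at least `threshold` dose
    (voxel mass taken as proportional to its density)."""
    d, rho = dose[mask], density[mask]
    return rho[d >= threshold].sum() / rho.sum()

# Toy lung-like example: low-density (aerated) voxels happen to receive higher dose.
rng = np.random.default_rng(0)
shape = (50, 50, 50)
lung_mask = np.ones(shape, bool)
density = rng.uniform(0.1, 0.4, shape)                              # g/cm^3
dose = 40.0 * (0.45 - density) / 0.35 + rng.normal(0, 2, shape)     # Gy

v20 = volume_fraction_at_dose(dose, lung_mask, 20.0)
m20 = mass_fraction_at_dose(dose, density, lung_mask, 20.0)
print(f"V20 (volume fraction >= 20 Gy): {v20:.2%}")
print(f"M20 (mass fraction   >= 20 Gy): {m20:.2%}")
```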
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yi; Chen, Wei; Xu, Hongyi
To provide a seamless integration of manufacturing processing simulation and fiber microstructure modeling, two new stochastic 3D microstructure reconstruction methods are proposed for two types of random fiber composites: random short fiber composites, and Sheet Molding Compound (SMC) chopped fiber composites. A Random Sequential Adsorption (RSA) algorithm is first developed to embed statistical orientation information into 3D RVE reconstruction of random short fiber composites. For the SMC composites, an optimized Voronoi diagram based approach is developed for capturing the substructure features of SMC chopped fiber composites. The proposed methods are distinguished from other reconstruction works by providing a way of integrating statistical information (fiber orientation tensor) obtained from material processing simulation, as well as capturing the multiscale substructures of the SMC composites.
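Random Sequential Adsorption, mentioned above, can be illustrated in a much reduced form: propose random positions and accept only non-overlapping placements until a target fraction is reached. The sketch below places equal disks in 2D as stand-ins for fiber cross-sections; it omits the orientation-tensor matching and 3D RVE aspects of the published method, and all parameter values are arbitrary.

```python
import numpy as np

def rsa_disks(target_fraction, radius, box=1.0, max_attempts=200000, rng=None):
    """Random Sequential Adsorption of equal, non-overlapping disks in a square box.

    Disks (standing in for fiber cross-sections) are proposed at uniformly random
    positions and accepted only if they do not overlap any previously placed disk;
    the process stops when the target area fraction is reached or the attempt
    budget is exhausted.
    """
    rng = np.random.default_rng() if rng is None else rng
    centers = []
    disk_area = np.pi * radius ** 2
    needed = int(np.ceil(target_fraction * box ** 2 / disk_area))
    for _ in range(max_attempts):
        if len(centers) >= needed:
            break
        p = rng.uniform(radius, box - radius, size=2)   # keep disks inside the box
        if all(np.hypot(*(p - c)) >= 2 * radius for c in centers):
            centers.append(p)
    return np.array(centers)

centers = rsa_disks(target_fraction=0.30, radius=0.03, rng=np.random.default_rng(0))
achieved = len(centers) * np.pi * 0.03 ** 2
print(f"placed {len(centers)} disks, area fraction ~ {achieved:.2f}")
```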
Development of superalloys by powder metallurgy for use at 1000 - 1400 F
NASA Technical Reports Server (NTRS)
Calhoun, C. D.
1971-01-01
Consolidated powders of four nickel-base superalloys were studied for potential application as compressor and turbine discs in jet engines. All of the alloys were based on the René 95 chemistry. Three of these had variations in carbon and Al2O3 contents, and the fourth alloy was chemically modified to a higher volume fraction. The Al2O3 was added by preoxidation of the powders prior to extrusion. Four experimental factors were evaluated at various levels by tensile and stress rupture testing at 1200 F: (1) alloy composition, (2) grain size, (3) thermomechanical processing, and (4) room temperature deformation plus final age. Various levels of the four factors were assumed in order to construct the statistically designed experiment, but the actual levels investigated were established in preliminary studies that preceded the statistical process development study.
Influence of signal intensity non-uniformity on brain volumetry using an atlas-based method.
Goto, Masami; Abe, Osamu; Miyati, Tosiaki; Kabasawa, Hiroyuki; Takao, Hidemasa; Hayashi, Naoto; Kurosu, Tomomi; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi; Mori, Harushi; Kunimatsu, Akira; Aoki, Shigeki; Ino, Kenji; Yano, Keiichi; Ohtomo, Kuni
2012-01-01
Many studies have reported pre-processing effects for brain volumetry; however, no study has investigated whether non-parametric non-uniform intensity normalization (N3) correction processing results in reduced system dependency when using an atlas-based method. To address this shortcoming, the present study assessed whether N3 correction processing provides reduced system dependency in atlas-based volumetry. Contiguous sagittal T1-weighted images of the brain were obtained from 21 healthy participants, by using five magnetic resonance protocols. After image preprocessing using the Statistical Parametric Mapping 5 software, we measured the structural volume of the segmented images with the WFU-PickAtlas software. We applied six different bias-correction levels (Regularization 10, Regularization 0.0001, Regularization 0, Regularization 10 with N3, Regularization 0.0001 with N3, and Regularization 0 with N3) to each set of images. The structural volume change ratio (%) was defined as the change ratio (%) = (100 × [measured volume - mean volume of five magnetic resonance protocols] / mean volume of five magnetic resonance protocols) for each bias-correction level. A low change ratio was synonymous with lower system dependency. The results showed that the images with the N3 correction had a lower change ratio compared with those without the N3 correction. The present study is the first atlas-based volumetry study to show that the precision of atlas-based volumetry improves when using N3-corrected images. Therefore, correction for signal intensity non-uniformity is strongly advised for multi-scanner or multi-site imaging trials.
Forest statistics for Ohio, 1991
Douglas M. Griffith; Dawn M. DiGiovanni; Teresa L. Witzel; Eric H. Wharton
1993-01-01
A statistical report on the fourth forest inventory of Ohio conducted in 1988-90. Findings are displayed in tables containing estimates of forest area, number of trees, sawtimber volume, growing-stock volume, biomass, growth, and removals. Data are presented at three levels: state, geographic unit, and county.
Interleukin-6 -174 and -572 genotypes and the volume of deep gray matter in preterm infants.
Reiman, Milla; Parkkola, Riitta; Lapinleimu, Helena; Lehtonen, Liisa; Haataja, Leena
2009-01-01
Preterm infants have smaller cerebral and cerebellar volumes at term compared with term-born infants. Perinatal factors leading to the reduction in volumes are not well known. IL-6 -174 and -572 genotypes partly regulate individual immunologic responses and have also been connected with deviant neurologic development in preterm infants. Our hypothesis was that IL-6 -174 and -572 genetic polymorphisms are associated with brain lesions and regional brain volumes in very low birth weight or in very preterm infants. DNA was genotyped for IL-6 -174 and -572 polymorphisms (GG/GC/CC). Study infants (n = 175) were categorized into three groups according to the most pathologic brain finding in ultrasound examinations until term. The brain MRI performed at term was analyzed for regional brain volumes. The analyzed IL-6 genotypes did not show a statistically significant association with structural brain lesions. However, IL-6 -174 CC and -572 GG genotypes were associated with reduced volume of one brain region, the combined volume of the basal ganglia and thalami, in both univariate and multivariate analyses (p = 0.009 and 0.009, respectively). The association of IL-6 -174 and -572 genetic polymorphisms with smaller volumes in deep gray matter provides new ways to understand the processes leading to neurologic impairments in preterm infants.
Moon, Chung-Man; Jeong, Gwang-Woo
2015-11-01
Only a few morphological studies have focused on changes in white matter (WM) volume in patients with generalized anxiety disorder (GAD). We evaluated alterations in WM volume and its correlation with symptom severity and duration of illness in adults with GAD. The 44 subjects were comprised of 22 patients with GAD (13 males and nine females) diagnosed using the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) and 22 age-matched healthy controls (13 males and nine females). High-resolution magnetic resonance imaging (MRI) data were processed by voxel-based morphometry (VBM) analysis based on diffeomorphic anatomical registration using the exponentiated Lie algebra (DARTEL) algorithm in SPM8. Patients with GAD showed significantly reduced WM volume, particularly in the dorsolateral prefrontal cortex (DLPFC), anterior limb of the internal capsule (ALIC), and midbrain. In addition, DLPFC volume was negatively correlated with GAD-7 score and illness duration. ALIC volume was negatively correlated with GAD-7 score. Female patients had significantly less orbitofrontal cortex volume compared to that in male patients. The findings demonstrate localized changes in WM volume associated with cognitive and emotional dysfunction in patients with GAD. The finding will be helpful for understanding the neuropathology in patients with GAD.
Dalwani, Manish S; McMahon, Mary Agnes; Mikulich-Gilbertson, Susan K; Young, Susan E; Regner, Michael F; Raymond, Kristen M; McWilliams, Shannon K; Banich, Marie T; Tanabe, Jody L; Crowley, Thomas J; Sakai, Joseph T
2015-01-01
Structural neuroimaging studies have demonstrated lower regional gray matter volume in adolescents with severe substance and conduct problems. These research studies, including ours, have generally focused on male-only or mixed-sex samples of adolescents with conduct and/or substance problems. Here we compare gray matter volume between female adolescents with severe substance and conduct problems and female healthy controls of similar ages. We hypothesized that female adolescents with severe substance and conduct problems would show significantly less gray matter volume in frontal regions critical to inhibition (i.e. dorsolateral prefrontal cortex and ventrolateral prefrontal cortex), conflict processing (i.e., anterior cingulate), valuation of expected outcomes (i.e., medial orbitofrontal cortex) and the dopamine reward system (i.e. striatum). We conducted a whole-brain voxel-based morphometric comparison of structural MR images of 22 patients (14-18 years) with severe substance and conduct problems and 21 controls of similar age using statistical parametric mapping (SPM) and the voxel-based morphometry (VBM8) toolbox. We tested group differences in regional gray matter volume with analyses of covariance, adjusting for age and IQ at p<0.05, corrected for multiple comparisons at a whole-brain cluster-level threshold. Female adolescents with severe substance and conduct problems compared to controls showed significantly less gray matter volume in right dorsolateral prefrontal cortex, left ventrolateral prefrontal cortex, medial orbitofrontal cortex, anterior cingulate, bilateral somatosensory cortex, left supramarginal gyrus, and bilateral angular gyrus. Considering the entire brain, patients had 9.5% less overall gray matter volume compared to controls. Female adolescents with severe substance and conduct problems in comparison to similarly aged female healthy controls showed substantially lower gray matter volume in brain regions involved in inhibition, conflict processing, valuation of outcomes, decision-making, reward, risk-taking, and rule-breaking antisocial behavior.
Statistical Analyses of Brain Surfaces Using Gaussian Random Fields on 2-D Manifolds
Staib, Lawrence H.; Xu, Dongrong; Zhu, Hongtu; Peterson, Bradley S.
2008-01-01
Interest in the morphometric analysis of the brain and its subregions has recently intensified because growth or degeneration of the brain in health or illness affects not only the volume but also the shape of cortical and subcortical brain regions, and new image processing techniques permit detection of small and highly localized perturbations in shape or localized volume, with remarkable precision. An appropriate statistical representation of the shape of a brain region is essential, however, for detecting, localizing, and interpreting variability in its surface contour and for identifying differences in volume of the underlying tissue that produce that variability across individuals and groups of individuals. Our statistical representation of the shape of a brain region is defined by a reference region for that region and by a Gaussian random field (GRF) that is defined across the entire surface of the region. We first select a reference region from a set of segmented brain images of healthy individuals. The GRF is then estimated as the signed Euclidean distances between points on the surface of the reference region and the corresponding points on the corresponding region in images of brains that have been coregistered to the reference. Correspondences between points on these surfaces are defined through deformations of each region of a brain into the coordinate space of the reference region using the principles of fluid dynamics. The warped, coregistered region of each subject is then unwarped into its native space, simultaneously bringing into that space the map of corresponding points that was established when the surfaces of the subject and reference regions were tightly coregistered. The proposed statistical description of the shape of surface contours makes no assumptions, other than smoothness, about the shape of the region or its GRF. The description also allows for the detection and localization of statistically significant differences in the shapes of the surfaces across groups of subjects at both a fine and coarse scale. We demonstrate the effectiveness of these statistical methods by applying them to study differences in shape of the amygdala and hippocampus in a large sample of normal subjects and in subjects with attention deficit/hyperactivity disorder (ADHD). PMID:17243583
A semi-automatic method for left ventricle volume estimate: an in vivo validation study
NASA Technical Reports Server (NTRS)
Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.
2001-01-01
This study aims to validate the left ventricular (LV) volume estimates obtained by processing volumetric data utilizing a segmentation model based on the level set technique. The validation was performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance (MRI) data. A validation protocol was defined and applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Assuming MRI estimates (x) as a reference, an excellent correlation was found with volume measured by utilizing the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable on human hearts in clinical practice.
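The reported validation statistics amount to a linear regression against the MRI reference plus mean errors on the derived clinical parameters. A minimal sketch with scipy, using hypothetical paired volume estimates (not the study's measurements):

```python
import numpy as np
from scipy import stats

# Hypothetical paired LV volume estimates (mL): MRI reference (x) and RT3DE
# level-set segmentation (y); values are illustrative only.
mri = np.array([61.0, 95.0, 120.0, 180.0, 230.0, 310.0, 467.0])
rt3de = np.array([68.0, 99.0, 118.0, 172.0, 221.0, 295.0, 430.0])

# Regression of segmentation estimates on the MRI reference,
# analogous to the reported fit y = 0.89x + 13.78, r = 0.98
res = stats.linregress(mri, rt3de)
print(f"y = {res.slope:.2f}x + {res.intercept:.2f}, r = {res.rvalue:.2f}")

# Mean error (bias), the quantity quoted for stroke volume and ejection fraction
print("mean volume error (mL):", np.mean(rt3de - mri))
```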
Temporal processing and adaptation in the songbird auditory forebrain.
Nagel, Katherine I; Doupe, Allison J
2006-09-21
Songbird auditory neurons must encode the dynamics of natural sounds at many volumes. We investigated how neural coding depends on the distribution of stimulus intensities. Using reverse-correlation, we modeled responses to amplitude-modulated sounds as the output of a linear filter and a nonlinear gain function, then asked how filters and nonlinearities depend on the stimulus mean and variance. Filter shape depended strongly on mean amplitude (volume): at low mean, most neurons integrated sound over many milliseconds, while at high mean, neurons responded more to local changes in amplitude. Increasing the variance (contrast) of amplitude modulations had less effect on filter shape but decreased the gain of firing in most cells. Both filter and gain changes occurred rapidly after a change in statistics, suggesting that they represent nonlinearities in processing. These changes may permit neurons to signal effectively over a wider dynamic range and are reminiscent of findings in other sensory systems.
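The linear-filter part of such a linear-nonlinear model can be estimated by reverse correlation, i.e. a spike-triggered average of the stimulus envelope. A minimal sketch assuming a synthetic Gaussian amplitude-modulation stimulus and a simulated rectifying neuron (all names and values are illustrative, not the authors' analysis code):

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.001                                    # 1 ms bins
stim = rng.normal(size=20000)                 # Gaussian amplitude modulations
true_filter = np.exp(-np.arange(50) / 10.0)   # assumed 50 ms decaying filter
drive = np.convolve(stim, true_filter, mode="full")[:stim.size]
rate = np.maximum(drive, 0.0)                 # static rectifying nonlinearity
spikes = rng.poisson(rate * dt * 20.0)        # spike counts per bin

# Reverse correlation: the spike-triggered average of the preceding stimulus
# recovers the linear filter up to a scale factor for Gaussian stimuli.
lags = 50
sta = np.zeros(lags)
for t in range(lags, stim.size):
    sta += spikes[t] * stim[t - lags + 1:t + 1][::-1]
sta /= spikes[lags:].sum()

print(sta[:5])    # early-lag filter weights; compare their shape with true_filter
```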
Estimating total maximum daily loads with the Stochastic Empirical Loading and Dilution Model
Granato, Gregory; Jones, Susan Cheung
2017-01-01
The Massachusetts Department of Transportation (DOT) and the Rhode Island DOT are assessing and addressing roadway contributions to total maximum daily loads (TMDLs). Example analyses for total nitrogen, total phosphorus, suspended sediment, and total zinc in highway runoff were done by the U.S. Geological Survey in cooperation with FHWA to simulate long-term annual loads for TMDL analyses with the stochastic empirical loading and dilution model known as SELDM. Concentration statistics from 19 highway runoff monitoring sites in Massachusetts were used with precipitation statistics from 11 long-term monitoring sites to simulate long-term pavement yields (loads per unit area). Highway sites were stratified by traffic volume or surrounding land use to calculate concentration statistics for rural roads, low-volume highways, high-volume highways, and ultraurban highways. The median of the event mean concentration statistics in each traffic volume category was used to simulate annual yields from pavement for a 29- or 30-year period. Long-term average yields for total nitrogen, phosphorus, and zinc from rural roads are lower than yields from the other categories, but yields of sediment are higher than for the low-volume highways. The average yields of the selected water quality constituents from high-volume highways are 1.35 to 2.52 times the associated yields from low-volume highways. The average yields of the selected constituents from ultraurban highways are 1.52 to 3.46 times the associated yields from high-volume highways. Example simulations indicate that both concentration reduction and flow reduction by structural best management practices are crucial for reducing runoff yields.
Metal Matrix Composite Material by Direct Metal Deposition
NASA Astrophysics Data System (ADS)
Novichenko, D.; Marants, A.; Thivillon, L.; Bertrand, P. H.; Smurov, I.
Direct Metal Deposition (DMD) is a laser cladding process for producing a protective coating on the surface of a metallic part or manufacturing layer-by-layer parts in a single-step process. The objective of this work is to demonstrate the possibility to create carbide-reinforced metal matrix composite objects. Powders of steel 16NCD13 with different volume contents of titanium carbide are tested. On the base of statistical analysis, a laser cladding processing map is constructed. Relationships between the different content of titanium carbide in a powder mixture and the material microstructure are found. Mechanism of formation of various precipitated titanium carbides is investigated.
Similarity in Bilateral Isolated Internal Orbital Fractures.
Chen, Hung-Chang; Cox, Jacob T; Sanyal, Abanti; Mahoney, Nicholas R
2018-04-13
In evaluating patients sustaining bilateral isolated internal orbital fractures, the authors have observed both similar fracture locations and similar expansion of orbital volumes. In this study, we aim to investigate if there is a propensity for the 2 orbits to fracture in symmetrically similar patterns when sustaining similar trauma. A retrospective chart review was performed studying all cases at our institution of bilateral isolated internal orbital fractures involving the medial wall and/or the floor at the time of presentation. The similarity of the bilateral fracture locations was evaluated using Fisher's exact test. The bilateral expanded orbital volumes were analyzed using the Wilcoxon signed-rank test to assess for orbital volume similarity. Twenty-four patients with bilateral internal orbital fractures were analyzed for fracture location similarity. Seventeen patients (70.8%) had 100% concordance in the orbital subregion fractured, and the association between the right and the left orbital fracture subregion locations was statistically significant (P < 0.0001). Fifteen patients were analyzed for orbital volume similarity. The average orbital cavity volume was 31.2 ± 3.8 cm³ on the right and 32.0 ± 3.7 cm³ on the left. There was a statistically significant difference between right and left orbital cavity volumes (P = 0.0026). The data from this study suggest that an individual who suffers isolated bilateral internal orbital fractures has a statistically significant similarity in the location of their orbital fractures. However, there does not appear to be statistically significant similarity in the expansion of the orbital volumes in these patients.
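Both tests used here are available in scipy. A minimal sketch with hypothetical data (a 2x2 concordance table and paired orbital volumes; the actual study used its own subregion categories and 15 volume pairs):

```python
import numpy as np
from scipy import stats

# Hypothetical right/left fracture-location concordance table
# (rows = right orbit: floor / medial wall; columns = left orbit likewise)
table = np.array([[12, 2],
                  [1, 9]])
print("Fisher exact p:", stats.fisher_exact(table)[1])

# Hypothetical paired right/left orbital cavity volumes (cm^3)
right = np.array([30.1, 31.5, 29.8, 33.0, 32.2, 30.7, 31.9, 28.9])
left = np.array([30.9, 32.3, 30.5, 33.8, 33.1, 31.2, 32.6, 29.7])
print("Wilcoxon signed-rank p:", stats.wilcoxon(right, left).pvalue)
```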
First passage times in homogeneous nucleation: Dependence on the total number of particles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yvinec, Romain; Bernard, Samuel; Pujo-Menjouet, Laurent
2016-01-21
Motivated by nucleation and molecular aggregation in physical, chemical, and biological settings, we present an extension to a thorough analysis of the stochastic self-assembly of a fixed number of identical particles in a finite volume. We study the statistics of times required for maximal clusters to be completed, starting from a pure-monomeric particle configuration. For finite volumes, we extend previous analytical approaches to the case of arbitrary size-dependent aggregation and fragmentation kinetic rates. For larger volumes, we develop a scaling framework to study the first assembly time behavior as a function of the total quantity of particles. We find that the mean time to first completion of a maximum-sized cluster may have a surprisingly weak dependence on the total number of particles. We highlight how higher statistics (variance, distribution) of the first passage time may nevertheless help to infer key parameters, such as the size of the maximum cluster. Finally, we present a framework to quantify formation of macroscopic sized clusters, which are (asymptotically) very unlikely and occur as a large deviation phenomenon from the mean-field limit. We argue that this framework is suitable to describe phase transition phenomena, as inherent infrequent stochastic processes, in contrast to classical nucleation theory.
First passage times in homogeneous nucleation: Dependence on the total number of particles
NASA Astrophysics Data System (ADS)
Yvinec, Romain; Bernard, Samuel; Hingant, Erwan; Pujo-Menjouet, Laurent
2016-01-01
Motivated by nucleation and molecular aggregation in physical, chemical, and biological settings, we present an extension to a thorough analysis of the stochastic self-assembly of a fixed number of identical particles in a finite volume. We study the statistics of times required for maximal clusters to be completed, starting from a pure-monomeric particle configuration. For finite volumes, we extend previous analytical approaches to the case of arbitrary size-dependent aggregation and fragmentation kinetic rates. For larger volumes, we develop a scaling framework to study the first assembly time behavior as a function of the total quantity of particles. We find that the mean time to first completion of a maximum-sized cluster may have a surprisingly weak dependence on the total number of particles. We highlight how higher statistics (variance, distribution) of the first passage time may nevertheless help to infer key parameters, such as the size of the maximum cluster. Finally, we present a framework to quantify formation of macroscopic sized clusters, which are (asymptotically) very unlikely and occur as a large deviation phenomenon from the mean-field limit. We argue that this framework is suitable to describe phase transition phenomena, as inherent infrequent stochastic processes, in contrast to classical nucleation theory.
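The central object, the first time a maximum-sized cluster is completed starting from monomers, can be illustrated with a stochastic simulation. A minimal Gillespie-style sketch assuming the simplest constant-kernel aggregation (the paper treats general size-dependent aggregation and fragmentation rates, which this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(1)

def first_assembly_time(n_monomers=30, n_max=8, rate=1.0):
    """Return the first time a cluster of size >= n_max appears, starting from
    a purely monomeric configuration, under irreversible constant-kernel
    aggregation (any two clusters coalesce at the same rate)."""
    clusters = [1] * n_monomers
    t = 0.0
    while max(clusters) < n_max:
        k = len(clusters)
        total_rate = rate * k * (k - 1) / 2.0          # number of cluster pairs
        t += rng.exponential(1.0 / total_rate)         # waiting time to next merge
        i, j = rng.choice(k, size=2, replace=False)    # pick the merging pair
        clusters[i] += clusters[j]
        del clusters[j]
    return t

times = [first_assembly_time() for _ in range(500)]
print("mean first assembly time:", np.mean(times), "std:", np.std(times))
```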
Transportation statistics annual report 1995
DOT National Transportation Integrated Search
1995-01-01
The summary of transportation statistics programs and many of the tables and graphs pioneered in last year's Transportation Statistics Annual Report have been incorporated into the companion volume, National Transportation Statistics. The...
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
NASA Astrophysics Data System (ADS)
Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo
2018-01-01
A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution with explicit buildings, urban structures such as sea front walls and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of the available hard and soft data and on conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore conditions scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagation inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential hazard evolution. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in terms of flooding, such as +13.5 % in water volumes propagating inland or +11.3 % in affected surfaces. In some areas, flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty of the results. Considering a 100-year scenario with mean sea level rise (0.6 m), hazard characteristics are dramatically changed, with an evolution of the overtopping / overflowing process ratio and an increase by a factor of 4.84 in the volume of water propagating inland and of 3.47 in flooded surfaces.
ARL Academic Health Sciences Library Statistics, 2000-01.
ERIC Educational Resources Information Center
Young, Mark, Comp.; Kyrillidou, Martha, Comp.
This document presents results of the 2000-01 Association of Research Libraries (ARL) Medical Library Statistics Questionnaire. Of 113 ARL university libraries, 63 responded to the survey. Results for each library are presented in the following data tables: (1) collections, including volumes in library, volumes added, monographs purchased, current…
ARL Academic Law Library Statistics, 2007-2008
ERIC Educational Resources Information Center
Kyrillidou, Martha, Comp.; Bland, Les, Comp.
2009-01-01
This document presents results of the 2007-2008 Association of Research Libraries (ARL) Law Library Statistics Questionnaire. Of 113 ARL university libraries, 74 responded to the survey. Results for each library are presented in the following data tables: (1) collections (2-parts), including volumes in library, volumes added, monographs purchased,…
ARL Academic Law Library Statistics 2006-2007
ERIC Educational Resources Information Center
Kyrillidou, Martha, Comp.; Bland, Les, Comp.
2008-01-01
This document presents results of the 2006-2007 Association of Research Libraries (ARL) Law Library Statistics Questionnaire. Of 113 ARL university libraries, 74 responded to the survey. Results for each library are presented in the following data tables: (1) collections (2-parts), including volumes in library, volumes added, monographs purchased,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michels-Clark, Tara M.; Savici, Andrei T.; Lynch, Vickie E.
Evidence is mounting that potentially exploitable properties of technologically and chemically interesting crystalline materials are often attributable to local structure effects, which can be observed as modulated diffuse scattering (mDS) next to Bragg diffraction (BD). BD forms a regular sparse grid of intense discrete points in reciprocal space. Traditionally, the intensity of each Bragg peak is extracted by integration of each individual reflection first, followed by application of the required corrections. In contrast, mDS is weak and covers expansive volumes of reciprocal space close to, or between, Bragg reflections. For a representative measurement of the diffuse scattering, multiple sample orientations are generally required, where many points in reciprocal space are measured multiple times and the resulting data are combined. The common post-integration data reduction method is not optimal with regard to counting statistics. A general and inclusive data processing method is needed. In this contribution, a comprehensive data analysis approach is introduced to correct and merge the full volume of scattering data in a single step, while correctly accounting for the statistical weight of the individual measurements. Lastly, development of this new approach required the exploration of a data treatment and correction protocol that includes the entire collected reciprocal space volume, using neutron time-of-flight or wavelength-resolved data collected at TOPAZ at the Spallation Neutron Source at Oak Ridge National Laboratory.
Michels-Clark, Tara M.; Savici, Andrei T.; Lynch, Vickie E.; ...
2016-03-01
Evidence is mounting that potentially exploitable properties of technologically and chemically interesting crystalline materials are often attributable to local structure effects, which can be observed as modulated diffuse scattering (mDS) next to Bragg diffraction (BD). BD forms a regular sparse grid of intense discrete points in reciprocal space. Traditionally, the intensity of each Bragg peak is extracted by integration of each individual reflection first, followed by application of the required corrections. In contrast, mDS is weak and covers expansive volumes of reciprocal space close to, or between, Bragg reflections. For a representative measurement of the diffuse scattering, multiple sample orientations are generally required, where many points in reciprocal space are measured multiple times and the resulting data are combined. The common post-integration data reduction method is not optimal with regard to counting statistics. A general and inclusive data processing method is needed. In this contribution, a comprehensive data analysis approach is introduced to correct and merge the full volume of scattering data in a single step, while correctly accounting for the statistical weight of the individual measurements. Lastly, development of this new approach required the exploration of a data treatment and correction protocol that includes the entire collected reciprocal space volume, using neutron time-of-flight or wavelength-resolved data collected at TOPAZ at the Spallation Neutron Source at Oak Ridge National Laboratory.
Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2015-01-01
Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
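The headline metric is the area under the ROC curve for separating converters from non-converters by corrected hippocampal volume. A minimal sketch with scikit-learn and hypothetical values (not ADNI data); volume is negated so that smaller volumes score as higher conversion risk:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical corrected hippocampal grey matter volumes (arbitrary units) and
# conversion labels (1 = MCI-to-probable-AD converter within the follow-up window)
volume = np.array([2.1, 2.4, 2.0, 2.6, 2.3, 1.9, 2.5, 2.2, 1.8, 2.7])
converter = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 0])

auc = roc_auc_score(converter, -volume)   # smaller volume -> higher risk score
print(f"AUC = {auc:.2f}")
```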
Knowledge Discovery and Data Mining in Iran's Climatic Researches
NASA Astrophysics Data System (ADS)
Karimi, Mostafa
2013-04-01
Advances in measurement technology and data collection have made databases ever larger, and large databases require powerful tools for data analysis. The iterative process of acquiring knowledge from the information obtained by data processing is carried out, in various forms, in all scientific fields; however, when the data volume is large, traditional methods cannot cope with many of the problems. In recent years the use of databases in various scientific fields, especially atmospheric databases in climatology, has expanded, and the growing amount of data generated by climate models poses a challenge for extracting hidden patterns and knowledge from it. The approach developed for this problem in recent years is knowledge discovery and data mining, which draws on concepts from machine learning, artificial intelligence, and expert systems. Data mining is an analytical process for mining massive volumes of data; its ultimate goal is access to information and, finally, knowledge. Climatology is a field that uses varied and massive data, and the goal of climate data mining is to obtain information from varied and massive atmospheric and non-atmospheric data. Knowledge discovery performs these activities as a logical, predetermined, and largely automatic process. The goal of this research is to study the use of knowledge discovery and data mining techniques in Iranian climate research. To achieve this goal, a descriptive content analysis was carried out and the studies were classified by method and by issue. The results show that in Iranian climatic research clustering, k-means, and Ward's method are applied most often, and that precipitation and atmospheric circulation patterns are the issues most frequently addressed. Although several studies on geographic and climatic issues have used statistical techniques such as clustering and pattern extraction, given the nature of statistics and data mining it cannot be said that data mining and knowledge discovery techniques are genuinely established in domestic climate studies. It is nevertheless necessary to use the KDD approach and data mining techniques in climatic studies, in particular for interpreting climate modelling results.
Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
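For a centered normal process the quoted Cpk benchmarks translate directly into defect rates and acceptance probabilities. A minimal sketch of that arithmetic (per-unit, two-sided specification; the paper's stage-wise criteria for dosage uniformity and dissolution are more involved):

```python
from scipy.stats import norm

def defect_rate_ppm(cpk):
    """Two-sided defect rate (parts per million) for a centered normal process,
    where Cpk = (specification limit - mean) / (3 * sigma)."""
    return 2.0 * norm.cdf(-3.0 * cpk) * 1e6

def acceptance_probability(cpk):
    """Probability that a single unit falls inside the specification limits."""
    return 1.0 - 2.0 * norm.cdf(-3.0 * cpk)

# Cpk = 4/3 gives roughly 63 defective units per million, as cited in the text
print(f"Cpk = 1.33: {defect_rate_ppm(4.0 / 3.0):.1f} ppm, "
      f"P(accept) = {acceptance_probability(4.0 / 3.0):.5f}")
print(f"Cpk = 1.00: {defect_rate_ppm(1.0):.0f} ppm")
```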
D'Ambrosio, Alessandro; Pagani, Elisabetta; Riccitelli, Gianna C; Colombo, Bruno; Rodegher, Mariaemma; Falini, Andrea; Comi, Giancarlo; Filippi, Massimo; Rocca, Maria A
2017-08-01
To investigate the role of cerebellar sub-regions on motor and cognitive performance in multiple sclerosis (MS) patients. Whole and sub-regional cerebellar volumes, brain volumes, T2 hyperintense lesion volumes (LV), and motor performance scores were obtained from 95 relapse-onset MS patients and 32 healthy controls (HC). MS patients also underwent an evaluation of working memory and processing speed functions. Cerebellar anterior and posterior lobes were segmented using the Spatially Unbiased Infratentorial Toolbox (SUIT) from Statistical Parametric Mapping (SPM12). Multivariate linear regression models assessed the relationship between magnetic resonance imaging (MRI) measures and motor/cognitive scores. Compared to HC, only secondary progressive multiple sclerosis (SPMS) patients had lower cerebellar volumes (total and posterior cerebellum). In MS patients, lower anterior cerebellar volume and brain T2 LV predicted worse motor performance, whereas lower posterior cerebellar volume and brain T2 LV predicted poor cognitive performance. Global measures of brain volume and infratentorial T2 LV were not selected by the final multivariate models. Cerebellar volumetric abnormalities are likely to play an important contribution to explain motor and cognitive performance in MS patients. Consistently with functional mapping studies, cerebellar posterior-inferior volume accounted for variance in cognitive measures, whereas anterior cerebellar volume accounted for variance in motor performance, supporting the assessment of cerebellar damage at sub-regional level.
Metalorganic chemical vapor deposition of AlGaAs and InGaP heterojunction bipolar transistors
NASA Astrophysics Data System (ADS)
Pan, N.; Welser, R. E.; Lutz, C. R.; DeLuca, P. M.; Han, B.; Hong, K.
2001-05-01
Heterojunction bipolar transistors (HBT) are now beginning to be widely incorporated as power amplifiers, laser drivers, multiplexers, clock data recovery circuits, as well as transimpedance and broadband amplifiers in high performance millimeter wave circuits (MMICs). The increasing acceptance of this device is principally due to advancements in metalorganic chemical vapor deposition (MOCVD), device processing, and circuit design technologies. Many of the DC electrical characteristics of large area devices can be directly correlated to the DC performance of small area RF devices. A precise understanding of the growth parameters and their relationship to device characteristics is critical for ensuring the high degree of reproducibility required for low cost high-yield volume manufacturing. Significant improvements in the understanding of the MOCVD growth process have been realized through the implementation of statistical process control on the key HBT device parameters. This tool has been successfully used to maintain the high quality of the device characteristics in high-volume production of 4″ GaAs-based HBTs. There is a growing demand to migrate towards 6″ diameter wafer size due to the potential cost reductions and increased volume production that can be realized. Preliminary results, indicating good heterostructure layer characteristics, demonstrate the feasibility of 6″ InGaP-based HBT devices.
Zhu, Bo; Liu, Jianli; Gao, Weidong
2017-09-01
This paper reports on the process optimization of ultrasonic-assisted alcoholic-alkaline treatment to prepare granular cold water swelling (GCWS) starches. In this work, three statistical approaches, Plackett-Burman design, steepest ascent path analysis, and Box-Behnken design, were successfully combined to investigate the effects on cold-water solubility of the major treatment process variables, including starch concentration, ethanol volume fraction, sodium hydroxide dosage, ultrasonic power and treatment time, and the drying operation (vacuum degree and drying time). Results revealed that ethanol volume fraction, sodium hydroxide dosage, applied power and ultrasonic treatment time were significant factors that affected the cold-water solubility of GCWS starches. The maximum cold-water solubility was obtained when treated at 400 W of applied power for 27.38 min. The optimum volume fraction of ethanol and sodium hydroxide dosage were 66.85% and 53.76 mL, respectively. The theoretical values (93.87%) and the observed values (93.87%) were in reasonably good agreement and the deviation was less than 1%. Verification and repeated trial results indicated that the ultrasound-assisted alcoholic-alkaline treatment could be successfully used for the preparation of granular cold water swelling starches at room temperature and substantially improved the cold-water solubility of GCWS starches. Copyright © 2016. Published by Elsevier B.V.
Finite size effects in phase transformation kinetics in thin films and surface layers
NASA Astrophysics Data System (ADS)
Trofimov, Vladimir I.; Trofimov, Ilya V.; Kim, Jong-Il
2004-02-01
In studies of phase transformation kinetics in thin films, e.g. crystallization of amorphous films, the familiar Kolmogorov-Johnson-Mehl-Avrami (KJMA) statistical model of crystallization has until recently been widely used, even though it is applicable only to an infinite medium. In this paper a model of transformation kinetics in thin films is presented, based on the survival probability of a randomly chosen point during the transformation process. Two model versions are studied: volume-induced transformation (VIT), in which the second-phase grains nucleate throughout the film volume, and surface-induced transformation (SIT), in which they form on an interface, each with two nucleation modes: instantaneous nucleation at the transformation onset and continuous nucleation throughout the process. In the VIT case, owing to finite film thickness effects, the transformation profile has a maximum in the middle of the film, whereas the grain population reaches a minimum there; the grain density is always higher than in a bulk material, and the thinner the film, the slower it transforms. The transformation kinetics in a thin film obeys a generalized KJMA equation with parameters that depend on the film thickness, and in the limiting cases of extremely thin and thick films it reduces to the classical KJMA equation for 2D and 3D systems, respectively.
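A minimal statement of the classical limit referenced above, assuming the textbook Avrami exponents (these values are standard results, not taken from the paper itself):

```latex
% Classical KJMA transformed fraction in an infinite medium, to which the
% thin-film model reduces in the extremely thin (2D) and thick (3D) limits
X(t) = 1 - \exp\left(-K t^{\,n}\right), \qquad
n =
\begin{cases}
d,   & \text{instantaneous nucleation with growth in } d \text{ dimensions},\\
d+1, & \text{continuous nucleation with growth in } d \text{ dimensions}.
\end{cases}
```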
Relaxation mechanisms in glassy dynamics: the Arrhenius and fragile regimes.
Hentschel, H George E; Karmakar, Smarajit; Procaccia, Itamar; Zylberg, Jacques
2012-06-01
Generic glass formers exhibit at least two characteristic changes in their relaxation behavior, first to an Arrhenius-type relaxation at some characteristic temperature and then, at a lower characteristic temperature, to a super-Arrhenius (fragile) behavior. We address these transitions by studying the statistics of free energy barriers for different systems at different temperatures and in different space dimensions. We present clear evidence for changes in the dynamical behavior at the transition to Arrhenius and then to super-Arrhenius behavior. A simple model is presented, based on the idea of competition between single-particle and cooperative dynamics. We argue that Arrhenius behavior can take place as long as there is enough free volume for the completion of a simple T1 relaxation process. Once free volume is absent, one needs a cooperative mechanism to "collect" enough free volume. We show that this model captures all the qualitative behavior observed in simulations throughout the considered temperature range.
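For reference, the two relaxation regimes named above are conventionally written as follows (standard functional forms assumed here; the paper itself works with barrier statistics rather than these fits):

```latex
% Arrhenius (strong) versus super-Arrhenius (fragile, Vogel-Fulcher-Tammann) relaxation
\tau(T) = \tau_0 \exp\!\left(\frac{\Delta E}{k_B T}\right)
\quad\text{(Arrhenius)},
\qquad
\tau(T) = \tau_0 \exp\!\left(\frac{A}{T - T_0}\right)
\quad\text{(super-Arrhenius, VFT)}.
```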
Recent progress in simulating galaxy formation from the largest to the smallest scales
NASA Astrophysics Data System (ADS)
Faucher-Giguère, Claude-André
2018-05-01
Galaxy formation simulations are an essential part of the modern toolkit of astrophysicists and cosmologists alike. Astrophysicists use the simulations to study the emergence of galaxy populations from the Big Bang, as well as the formation of stars and supermassive black holes. For cosmologists, galaxy formation simulations are needed to understand how baryonic processes affect measurements of dark matter and dark energy. Owing to the extreme dynamic range of galaxy formation, advances are driven by novel approaches using simulations with different tradeoffs between volume and resolution. Large-volume but low-resolution simulations provide the best statistics, while higher-resolution simulations of smaller cosmic volumes can be evolved with self-consistent physics and reveal important emergent phenomena. I summarize recent progress in galaxy formation simulations, including major developments in the past five years, and highlight some key areas likely to drive further advances over the next decade.
PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)
NASA Astrophysics Data System (ADS)
Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu
2013-12-01
The close relationship between probability-based inference and statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses (ICSG2013)', was held in Sapporo 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25 in order to offer a forum where concerned researchers can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and information sciences. Financial support from Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan 'Exploring the Limits of Computation (ELC)' is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima Jun-ichi Inoue (Local Chair of ICSG2013) Yoshiyuki Kabashima (Editor-in-Chief) Toshiyuki Tanaka Osamu Watanabe (General Chair of ICSG2013)
Brain volumetric changes and cognitive ageing during the eighth decade of life
Dickie, David Alexander; Cox, Simon R.; Valdes Hernandez, Maria del C.; Corley, Janie; Royle, Natalie A.; Pattie, Alison; Aribisala, Benjamin S.; Redmond, Paul; Muñoz Maniega, Susana; Taylor, Adele M.; Sibbett, Ruth; Gow, Alan J.; Starr, John M.; Bastin, Mark E.; Wardlaw, Joanna M.; Deary, Ian J.
2015-01-01
Later‐life changes in brain tissue volumes—decreases in the volume of healthy grey and white matter and increases in the volume of white matter hyperintensities (WMH)—are strong candidates to explain some of the variation in ageing‐related cognitive decline. We assessed fluid intelligence, memory, processing speed, and brain volumes (from structural MRI) at mean age 73 years, and at mean age 76 in a narrow‐age sample of older individuals (n = 657 with brain volumetric data at the initial wave, n = 465 at follow‐up). We used latent variable modeling to extract error‐free cognitive levels and slopes. Initial levels of cognitive ability were predictive of subsequent brain tissue volume changes. Initial brain volumes were not predictive of subsequent cognitive changes. Brain volume changes, especially increases in WMH, were associated with declines in each of the cognitive abilities. All statistically significant results were modest in size (absolute r‐values ranged from 0.114 to 0.334). These results build a comprehensive picture of macrostructural brain volume changes and declines in important cognitive faculties during the eighth decade of life. Hum Brain Mapp 36:4910–4925, 2015. © 2015 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc PMID:26769551
NASA Astrophysics Data System (ADS)
Mottyll, S.; Skoda, R.
2015-12-01
A compressible inviscid flow solver with a barotropic cavitation model is applied to two different ultrasonic horn set-ups and compared with hydrophone, shadowgraphy, and erosion test data. The statistical analysis of single collapse events in wall-adjacent flow regions allows the determination of the flow aggressiveness via load collectives (cumulative event rate vs collapse pressure), which show an exponential decrease in agreement with studies on hydrodynamic cavitation [1]. A post-processing projection of event rate and collapse pressure on a reference grid reduces the grid dependency significantly. In order to evaluate the erosion-sensitive areas, a statistical analysis of transient wall loads is utilised. Predicted erosion-sensitive areas as well as the temporal pressure and vapour volume evolution are in good agreement with the experimental data.
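A load collective of this kind can be built directly from the detected collapse events: count the cumulative rate of events exceeding each collapse-pressure threshold and fit the exponential decrease in log space. A minimal sketch with synthetic event data (illustrative only, not the simulation output):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic collapse peak pressures (MPa) detected during an observation time T (s)
pressures = rng.exponential(scale=20.0, size=2000)
T = 1.0

# Load collective: cumulative event rate for events exceeding each pressure threshold
thresholds = np.linspace(5.0, 100.0, 40)
cum_rate = np.array([(pressures > p).sum() / T for p in thresholds])

# An exponential decrease is a straight line in log space:
# log(rate) ~ a - p / p0, so a linear fit yields the reference pressure p0
mask = cum_rate > 0
slope, intercept = np.polyfit(thresholds[mask], np.log(cum_rate[mask]), deg=1)
print("fitted reference pressure p0 (MPa):", -1.0 / slope)
```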
Effect of spatial coherence of light on the photoregulation processes in cells
NASA Astrophysics Data System (ADS)
Budagovsky, A. V.; Solovykh, N. V.; Yankovskaya, M. B.; Maslova, M. V.; Budagovskaya, O. N.; Budagovsky, I. A.
2016-07-01
The effect of the statistical properties of light on the magnitude of the photoinduced response of biological objects differing in morphological and physiological characteristics, optical properties, and cell size was studied. The fruit of apple trees, the pollen of cherries, the microcuttings of blackberries in vitro, and the spores and the mycelium of fungi were irradiated by quasimonochromatic light fluxes with identical energy parameters but different values of coherence length and radius of correlation. In all cases, the greatest stimulation effect occurred when the cells fit completely within the coherence volume of the field, and both temporal and spatial coherence had a significant and statistically reliable impact on the physiological activity of the cells. It was concluded that not only the spectral, but also the statistical (coherence) properties of the acting light play an important role in the photoregulation process.
NASA Astrophysics Data System (ADS)
Leung, Juliana Y.; Srinivasan, Sanjay
2016-09-01
Modeling transport process at large scale requires proper scale-up of subsurface heterogeneity and an understanding of its interaction with the underlying transport mechanisms. A technique based on volume averaging is applied to quantitatively assess the scaling characteristics of effective mass transfer coefficient in heterogeneous reservoir models. The effective mass transfer coefficient represents the combined contribution from diffusion and dispersion to the transport of non-reactive solute particles within a fluid phase. Although treatment of transport problems with the volume averaging technique has been published in the past, application to geological systems exhibiting realistic spatial variability remains a challenge. Previously, the authors developed a new procedure where results from a fine-scale numerical flow simulation reflecting the full physics of the transport process albeit over a sub-volume of the reservoir are integrated with the volume averaging technique to provide effective description of transport properties. The procedure is extended such that spatial averaging is performed at the local-heterogeneity scale. In this paper, the transport of a passive (non-reactive) solute is simulated on multiple reservoir models exhibiting different patterns of heterogeneities, and the scaling behavior of effective mass transfer coefficient (Keff) is examined and compared. One such set of models exhibit power-law (fractal) characteristics, and the variability of dispersion and Keff with scale is in good agreement with analytical expressions described in the literature. This work offers an insight into the impacts of heterogeneity on the scaling of effective transport parameters. A key finding is that spatial heterogeneity models with similar univariate and bivariate statistics may exhibit different scaling characteristics because of the influence of higher order statistics. More mixing is observed in the channelized models with higher-order continuity. It reinforces the notion that the flow response is influenced by the higher-order statistical description of heterogeneity. An important implication is that when scaling-up transport response from lab-scale results to the field scale, it is necessary to account for the scale-up of heterogeneity. Since the characteristics of higher-order multivariate distributions and large-scale heterogeneity are typically not captured in small-scale experiments, a reservoir modeling framework that captures the uncertainty in heterogeneity description should be adopted.
Favre-Averaged Turbulence Statistics in Variable Density Mixing of Buoyant Jets
NASA Astrophysics Data System (ADS)
Charonko, John; Prestridge, Kathy
2014-11-01
Variable density mixing of a heavy fluid jet with lower density ambient fluid in a subsonic wind tunnel was experimentally studied using Particle Image Velocimetry and Planar Laser Induced Fluorescence to simultaneously measure velocity and density. Flows involving the mixing of fluids with large density ratios are important in a range of physical problems including atmospheric and oceanic flows, industrial processes, and inertial confinement fusion. Here we focus on buoyant jets with coflow. Results from two different Atwood numbers, 0.1 (Boussinesq limit) and 0.6 (non-Boussinesq case), reveal that buoyancy is important for most of the turbulent quantities measured. Statistical characteristics of the mixing important for modeling these flows such as the PDFs of density and density gradients, turbulent kinetic energy, Favre averaged Reynolds stress, turbulent mass flux velocity, density-specific volume correlation, and density power spectra were also examined and compared with previous direct numerical simulations. Additionally, a method for directly estimating Reynolds-averaged velocity statistics on a per-pixel basis is extended to Favre-averages, yielding improved accuracy and spatial resolution as compared to traditional post-processing of velocity and density fields.
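The density-weighted (Favre) statistics named here follow directly from simultaneous per-pixel density and velocity fields. A minimal sketch with synthetic snapshot stacks (shapes and values are assumptions, not the experimental data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic per-pixel time series: (n_snapshots, ny, nx), as from simultaneous PIV/PLIF
rho = 1.0 + 0.3 * rng.random((500, 64, 64))        # density
u = 2.0 + 0.5 * rng.normal(size=(500, 64, 64))     # streamwise velocity

rho_bar = rho.mean(axis=0)                         # Reynolds-averaged density
u_bar = u.mean(axis=0)                             # Reynolds-averaged velocity
u_tilde = (rho * u).mean(axis=0) / rho_bar         # Favre-averaged velocity

# Turbulent mass flux velocity a = <rho' u'> / <rho> = u_tilde - u_bar
a = u_tilde - u_bar

# Favre-averaged normal Reynolds stress: <rho u'' u''> / <rho>, with u'' = u - u_tilde
u_pp = u - u_tilde
R_uu = (rho * u_pp * u_pp).mean(axis=0) / rho_bar

print(a.mean(), R_uu.mean())
```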
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
Scaling Limits and Generic Bounds for Exploration Processes
NASA Astrophysics Data System (ADS)
Bermolen, Paola; Jonckheere, Matthieu; Sanders, Jaron
2017-12-01
We consider exploration algorithms of the random sequential adsorption type both for homogeneous random graphs and random geometric graphs based on spatial Poisson processes. At each step, a vertex of the graph becomes active and its neighboring nodes become blocked. Given an initial number of vertices N growing to infinity, we study statistical properties of the proportion of explored (active or blocked) nodes in time using scaling limits. We obtain exact limits for homogeneous graphs and prove an explicit central limit theorem for the final proportion of active nodes, known as the jamming constant, through a diffusion approximation for the exploration process which can be described as a unidimensional process. We then focus on bounding the trajectories of such exploration processes on random geometric graphs, i.e., random sequential adsorption. As opposed to exploration processes on homogeneous random graphs, these do not allow for such a dimensional reduction. Instead we derive a fundamental relationship between the number of explored nodes and the discovered volume in the spatial process, and we obtain generic bounds for the fluid limit and jamming constant: bounds that are independent of the dimension of space and the detailed shape of the volume associated to the discovered node. Lastly, using coupling techniques, we give trajectorial interpretations of the generic bounds.
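The exploration process on a homogeneous random graph is easy to simulate directly, which also gives a Monte Carlo estimate of the jamming constant. A minimal sketch on an Erdos-Renyi graph (graph size, mean degree, and the dense adjacency representation are all assumptions made for brevity):

```python
import numpy as np

rng = np.random.default_rng(4)

def jamming_fraction(n=2000, mean_degree=3.0):
    """Random sequential adsorption on G(n, p): vertices are explored in uniformly
    random order; a vertex becomes active if no neighbour is already active,
    otherwise it is blocked. Returns the final proportion of active vertices."""
    p = mean_degree / n
    adj = rng.random((n, n)) < p
    adj = np.triu(adj, 1)
    adj = adj | adj.T                       # symmetric adjacency matrix, no self-loops
    active = np.zeros(n, dtype=bool)
    blocked = np.zeros(n, dtype=bool)
    for v in rng.permutation(n):
        if not blocked[v]:
            active[v] = True
            blocked[adj[v]] = True          # neighbours of an active vertex are blocked
    return active.mean()

samples = [jamming_fraction() for _ in range(20)]
print("estimated jamming constant:", np.mean(samples), "+/-", np.std(samples))
```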
The effect of process parameters on audible acoustic emissions from high-shear granulation.
Hansuld, Erin M; Briens, Lauren; Sayani, Amyn; McCann, Joe A B
2013-02-01
Product quality in high-shear granulation is easily compromised by minor changes in raw material properties or process conditions. It is desired to develop a process analytical technology (PAT) that can monitor the process in real-time and provide feedback for quality control. In this work, the application of audible acoustic emissions (AAEs) as a PAT tool was investigated. A condenser microphone was placed at the top of the air exhaust on a PMA-10 high-shear granulator to collect AAEs for a design of experiment (DOE) varying impeller speed, total binder volume and spray rate. The results showed that the 10 Hz total power spectral densities (TPSDs) between 20 and 250 Hz were significantly affected by the changes in process conditions. Impeller speed and spray rate were shown to have statistically significant effects on granulation wetting, and impeller speed and total binder volume were significant in terms of process end-point. The DOE results were confirmed by a multivariate PLS model of the TPSDs. The scores plot showed separation based on impeller speed in the first component and spray rate in the second component. The findings support the use of AAEs to monitor changes in process conditions in real-time and achieve consistent product quality.
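The monitoring signal described here is a band-limited power spectral density of the microphone signal. A minimal sketch of that computation with scipy (sampling rate, record length, and the synthetic signal are assumptions for illustration):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)

fs = 44100                              # assumed microphone sampling rate (Hz)
x = rng.normal(size=30 * fs)            # synthetic 30 s stand-in for the AAE signal

# Welch power spectral density at roughly 10 Hz resolution, then the total power
# in the 20-250 Hz band used as the monitoring feature
f, pxx = signal.welch(x, fs=fs, nperseg=fs // 10)
band = (f >= 20.0) & (f <= 250.0)
total_power = pxx[band].sum() * (f[1] - f[0])   # integrate the PSD over the band
print("20-250 Hz band power:", total_power)
```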
NASA Technical Reports Server (NTRS)
Rockwell, T. H.; Griffin, W. C.
1981-01-01
Critical in-flight events (CIFE) that threaten the aircraft were studied. The scope of the CIFE was described and defined with emphasis on characterizing event development, detection, and assessment; pilot information requirements, sources, acquisition, and interpretation; pilot response options and decision processes; and decision implementation and event outcome. Detailed scenarios were developed for use in simulators and in paper-and-pencil testing, both for developing relationships between pilot performance and background information and for an analysis of pilot reaction, decision, and feedback processes. Statistical relationships between pilot characteristics and observed responses to CIFEs were developed.
Petroleum supply annual, 1990. [Contains Glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-05-30
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1990 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections, Summary Statistics, Detailed Statistics, and Refinery Capacity, each with final annual data. The second volume contains final statistics for each month of 1990, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.
Petroleum supply annual 1992. [Contains glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-05-27
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1992 through annual and monthly surveys. The PSA is divided into two volumes. The first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity, each with final annual data. This second volume contains final statistics for each month of 1992, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary.
Estimation of truck volumes and flows
DOT National Transportation Integrated Search
2004-08-01
This research presents a statistical approach for estimating truck volumes, based primarily on classification counts and information on roadway functionality, employment, sales volume, and number of establishments within the state. Models have bee...
Objective Analysis of Poly-L-Lactic Acid Injection Efficacy in Different Settings.
Byun, Sang-Young; Seo, Koo-Il; Shin, Jung-Won; Kwon, Soon-Hyo; Park, Mi-Sook; Lee, Joshua; Park, Kyoung-Chan; Na, Jung-Im; Huh, Chang-Hun
2015-12-01
Poly-L-lactic acid (PLLA) filler is known to have a continuous volume effect. The objective of this study was to analyze the objective volume effect of PLLA on the cheek under different injection schedules. A split-face, evaluator-blind, randomized study in 24 volunteers was conducted. One side was injected 3 times with a 4 cc dose and the other side was injected 2 times with a 6 cc dose per visit. The facial volume loss scale (FVLS) and Vectra imaging were evaluated. Measured average FVLS showed statistically significant improvement on both the 3-injection and 2-injection sides, and efficacy was maintained until 12 months. Vectra showed the volume difference (cc) between before and after injection. On the 3-injection side, the increase was 2.12 cc (after 1 month) to 3.17 cc (after 12 months). On the 2-injection side, the increase was 2.26 cc (after 1 month) to 3.19 cc (after 12 months). Gradual volume improvement over 12 months was statistically significant on both sides. There was no statistically significant difference between the 3- and 2-injection schedules in FVLS or Vectra. There was no severe adverse event. Poly-L-lactic acid has a continuous volume effect, and there was no significant difference by number of injections at the same total injection volume.
Bridging stylized facts in finance and data non-stationarities
NASA Astrophysics Data System (ADS)
Camargo, Sabrina; Duarte Queirós, Sílvio M.; Anteneodo, Celia
2013-04-01
Employing a recent technique which allows the representation of nonstationary data by means of a juxtaposition of locally stationary patches of different length, we introduce a comprehensive analysis of the key observables in a financial market: the trading volume and the price fluctuations. From the segmentation procedure we are able to introduce a quantitative description of statistical features of these two quantities, which are often named stylized facts, namely the tails of the distributions of trading volume and price fluctuations, a dynamics compatible with the U-shaped profile of the volume in a trading session, and the slow decay of the autocorrelation function. The segmentation of the trading volume series provides evidence of slow evolution of the fluctuating parameters of each patch, pointing to the mixing scenario. Assuming that long-term features are the outcome of a statistical mixture of simple local forms, we test and compare different probability density functions to provide the long-term distribution of the trading volume, concluding that the log-normal gives the best agreement with the empirical distribution. Moreover, the segmentation of the magnitude of price fluctuations is quite different from the results for the trading volume, indicating that changes in the statistics of price fluctuations occur at a faster scale than in the case of trading volume.
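As a sketch of the kind of distributional test described, a log-normal can be fitted to a series of trading volumes and compared with the empirical distribution; the data below are synthetic, and fixing the location at zero is an assumption of this example rather than a detail of the paper.

import numpy as np
from scipy import stats

volumes = np.random.lognormal(mean=10.0, sigma=1.2, size=5000)   # synthetic trading volumes

# Fit a log-normal (location fixed at zero) and compare with a Kolmogorov-Smirnov test;
# the p-value is only indicative because the parameters were estimated from the same data
shape, loc, scale = stats.lognorm.fit(volumes, floc=0)
ks_stat, p_value = stats.kstest(volumes, 'lognorm', args=(shape, loc, scale))
print(f"sigma = {shape:.3f}, median = {scale:.1f}, KS p-value = {p_value:.3f}")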
Effect of crowd size on patient volume at a large, multipurpose, indoor stadium.
De Lorenzo, R A; Gray, B C; Bennett, P C; Lamparella, V J
1989-01-01
A prediction of patient volume expected at "mass gatherings" is desirable in order to provide optimal on-site emergency medical care. While several methods of predicting patient loads have been suggested, a reliable technique has not been established. This study examines the frequency of medical emergencies at the Syracuse University Carrier Dome, a 50,500-seat indoor stadium. Patient volume and level of care at collegiate basketball and football games, as well as rock concerts, over a 7-year period were examined and tabulated. This information was analyzed using simple regression and nonparametric statistical methods to determine the level of correlation between crowd size and patient volume. These analyses demonstrated no statistically significant increase in patient volume with increasing crowd size for basketball and football events. There was a small but statistically significant increase in patient volume with increasing crowd size for concerts. A comparison of similar crowd sizes for each of the three event types showed that patient frequency is greatest for concerts and smallest for basketball. The study suggests that crowd size alone has only a minor influence on patient volume at any given event. Structuring medical services based solely on expected crowd size, without considering other influences such as event type and duration, may give poor results.
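A minimal sketch of the kind of analysis described, relating patient counts to crowd size with simple regression and a nonparametric check; the attendance figures and patient counts below are hypothetical.

import numpy as np
from scipy import stats

crowd_size = np.array([18000, 25000, 32000, 41000, 50000])    # hypothetical event attendances
patients = np.array([3, 4, 3, 6, 5])                          # hypothetical patient counts

res = stats.linregress(crowd_size, patients)                  # simple linear regression
rho, p_spearman = stats.spearmanr(crowd_size, patients)       # nonparametric correlation
print(f"slope = {res.slope:.2e}, r^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.3f}; Spearman p = {p_spearman:.3f}")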
Froum, Stuart J; Wallace, Stephen; Cho, Sang-Choon; Khouly, Ismael; Rosenberg, Edwin; Corby, Patricia; Froum, Scott; Mascarenhas, Patrick; Tarnow, Dennis P
2014-01-01
The purpose of this study was to radiographically evaluate, then analyze, bone height, volume, and density with reference to the percentage of vital bone after maxillary sinuses were grafted using two different doses of recombinant human bone morphogenetic protein 2/acellular collagen sponge (rhBMP-2/ACS) combined with mineralized cancellous bone allograft (MCBA) and a control sinus grafted with MCBA only. A total of 18 patients (36 sinuses) were used for analysis of height and volume measurements, each having two of three graft combinations (one in each sinus): (1) control, MCBA only; (2) test 1, MCBA + 5.6 mL of rhBMP-2/ACS (containing 8.4 mg of rhBMP-2); and (3) test 2, MCBA + 2.8 mL of rhBMP-2/ACS (containing 4.2 mg of rhBMP-2). The study was completed with 16 patients who also had bilateral cores removed 6 to 9 months following sinus augmentation. A computer software system was used to evaluate 36 computed tomography scans. Two time points were selected for measurements of height. The results indicated that the height of the grafted sinus was significantly greater in the treatment groups compared with the control. However, by the second time point, there were no statistically significant differences. Three weeks post-surgery, bone volume measurements showed similar statistically significant differences between the test groups and the control. However, prior to core removal, test group 1, with the greater dose of rhBMP-2, showed a statistically significant greater increase compared with test group 2 and the control. There was no statistically significant difference between the latter two groups. All three groups had similar volume and shrinkage. Density measurements varied from the above results, with the control showing statistically significant greater density at both time points. By contrast, the density increase over time in both rhBMP groups was similar and statistically higher than in the control group. There were strong associations between height and volume in all groups, and between volume and new vital bone only in the control group. There were no statistically significant relationships observed between height and bone density or between volume and bone density for any parameter measured. More cases, and monitoring of the future survival of implants placed in these augmented sinuses, are needed to verify these results.
Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D
2014-02-01
Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
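One simple way such Bayesian reasoning about non-detects can be framed is sketched below: assuming a Poisson detection model and an effective assayed volume, a qPCR non-detect still yields a posterior over the virus concentration. The flat prior, the concentration grid, and the assay volume are illustrative assumptions, not the authors' model.

import numpy as np

assay_volume_l = 0.001                        # assumed effective volume analyzed per reaction (L)
conc = np.linspace(0.0, 5000.0, 2001)         # candidate concentrations (genome copies per litre)
dc = conc[1] - conc[0]

# Flat prior; Poisson likelihood of observing zero copies (a non-detect) in the assayed volume
prior = np.ones_like(conc)
likelihood = np.exp(-conc * assay_volume_l)
posterior = prior * likelihood
posterior /= posterior.sum() * dc

# Even with a non-detect, the posterior provides an upper credible bound on concentration
cdf = np.cumsum(posterior) * dc
upper95 = conc[np.searchsorted(cdf, 0.95)]
print(f"95% credible upper bound: {upper95:.0f} copies/L")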
"Geo-statistics methods and neural networks in geophysical applications: A case study"
NASA Astrophysics Data System (ADS)
Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.
2008-12-01
The study is focused on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated from the application of multiattribute and neural network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective of this step is to derive a relationship between the set of attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs; in this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case showed a continuous improvement in the prediction of the porosity from the multiattribute to the neural network analysis. The improvement is in the training and the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.
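The forward stepwise selection of attributes can be illustrated with the following sketch, which selects seismic attributes by cross-validated linear regression against a porosity log; the data are synthetic and the stopping rule is an assumption of this example.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                                         # 8 candidate seismic attributes at well locations
y = 0.6 * X[:, 2] - 0.4 * X[:, 5] + rng.normal(scale=0.3, size=200)   # synthetic target porosity log

selected, remaining, best_score = [], list(range(X.shape[1])), -np.inf
while remaining:
    scores = {j: cross_val_score(LinearRegression(), X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    j_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:                                          # stop when an added attribute no longer helps
        break
    selected.append(j_best)
    remaining.remove(j_best)
    best_score = s_best
print("selected attributes:", selected, "cross-validated R^2:", round(best_score, 3))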
Dykes, Thomas M; Bhargavan-Chatfield, Mythreyi; Dyer, Raymond B
2015-02-01
Establish 3 performance benchmarks for intravenous contrast extravasation during CT examinations: extravasation frequency, distribution of extravasation volumes, and severity of injury. Evaluate the effectiveness of implementing practice quality improvement (PQI) methodology in improving performance for these 3 benchmarks. The Society of Abdominal Radiology and ACR developed a registry collecting data for contrast extravasation events. The project includes a PQI initiative allowing for process improvement. As of December 2013, a total of 58 radiology practices have participated in this project, and 32 practices have completed the 2-cycle PQI. There were a total of 454,497 contrast-enhanced CT exams and 1,085 extravasation events. The average extravasation rate is 0.24%. The median extravasation rate is 0.21%. Most extravasations (82.9%) were between 10 mL and 99 mL. The majority of injuries, 94.6%, are mild in severity, with 4.7% having moderate and 0.8% having severe injuries. Data from practices that completed the PQI process showed a change in the average extravasation rate from 0.28% in the first 6 months to 0.23% in the second 6 months, and the median extravasation rate dropped from 0.25% to 0.16%, neither statistically significant. The distribution of extravasation volumes and the severity of injury did not change between the first and second measurement periods. National performance benchmarks for contrast extravasation rate, distribution of volumes of extravasate, and distribution of severity of injury are established through this multi-institutional practice registry. The application of PQI failed to have a statistically significant positive impact on any of the 3 benchmarks. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Barchuk, A A; Podolsky, M D; Tarakanov, S A; Kotsyuba, I Yu; Gaidukov, V S; Kuznetsov, V I; Merabishvili, V M; Barchuk, A S; Levchenko, E V; Filochkina, A V; Arseniev, A I
2015-01-01
This review article analyzes the literature devoted to the description, interpretation, and classification of focal (nodular) changes in the lungs detected by computed tomography of the chest cavity. Possible criteria are discussed for determining their most likely character: primary and metastatic tumor processes, inflammation, scarring, autoimmune changes, tuberculosis, and others. Identification of the most characteristic, reliable, and statistically significant signs of the various pathological processes in the lungs, including the use of modern computer-aided detection and diagnosis of foci, will optimize diagnostic measures and ensure processing of a large volume of medical data in a short time.
Brisson, Romain; Bianchi, Renzo
2015-11-01
The aim of this study is twofold: first, to assess the statistical significance of the data used by Pierre Bourdieu in Distinction; second, to test the hypothesis that the volume of capital (i.e., the global amount of capital) allows for a finer discrimination of dispositional differences than the composition of capital (i.e., the respective weight of the different types of capital in the global amount of capital). To these ends, five data samples were submitted to two-sided tests comparing proportions. The findings (1) reveal that about two-thirds of the differences reported by P. Bourdieu are significant and (2) support the view that the volume of capital prevails over its composition. © 2015 Canadian Sociological Association/La Société canadienne de sociologie.
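For reference, the between-proportion comparisons mentioned can be reproduced with a standard two-sided two-proportion z-test; the counts below are hypothetical and serve only to show the mechanics.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts of respondents endorsing an item in two capital-volume groups
successes = [62, 41]          # endorsements in the high- and low-volume groups
totals = [180, 175]           # group sizes

z, p = proportions_ztest(successes, totals, alternative='two-sided')
print(f"z = {z:.2f}, two-sided p = {p:.4f}")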
Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual
1974-10-31
Report number: DNA 3336F-1. Title: Failure Analysis by Statistical Techniques (FAST), Volume I, User's Manual. ... (SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the...
Forest statistics for Maine: 1971 and 1982
Douglas S. Powell; David R. Dickson
1984-01-01
A statistical report on the third forest survey of Maine (1982) and reprocessed data from the second survey (1971). Results of the surveys are displayed in 169 tables containing estimates of forest and timberland area, numbers of trees, timber volume, tree biomass, timber products output, and components of average annual net change in growing-stock volume for the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango A detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
The Forest Survey Organization Central States Forest Experiment Station
1949-01-01
This Survey Release presents the more significant statistics on forest area and timber volume for the state of Illinois and for each of the three regions into which the state has been divided. Later an analytical report for the state will be published which will interpret forest area, timber volume, growth, and drain statistics in the light of existing and anticipated...
The 1985 Army Experience Survey: Tabular Descriptions of First-Term Separatees. Volume 1
1986-01-01
Survey data were processed through survey receipt control and sample management systems. Data were also keyed, edited, coded, and weighted. (The remainder of this entry consists of codebook and tabulation fragments, e.g., age of oldest child and region of residence when the respondent joined the Army.)
The 1985 Army Experience Survey: Tabular Descriptions of Mid-Career Separatees. Volume 2
1986-01-01
Survey data were processed through survey receipt control and sample management systems. Data were also keyed, edited, coded, and weighted. (The remainder of this entry consists of codebook and tabulation fragments, e.g., number of terms of active enlistment and region of residence when the respondent joined the Army.)
1976-08-01
foreign policy dynamics, the structure of a theory cannot in general be derived from statistical analysis of time series data (Brunner (1971), Thorson...) and where such scientific knowledge is applicable. Recent attention in theory and research on the bureaucratic handling of foreign policy...process. Some of the elements of these concerns can be made explicit if we introduce modern systems theories which seek to treat organizations as...
Li, Wen; Chen, Fei; Zhang, Feng; Ding, Wanghui; Ye, Qingsong; Shi, Jiejun; Fu, Baiping
2013-01-01
Molar intrusion by mini-screw implantation can cause different degrees of root resorption. However, most methods (2-D and 3-D) used for evaluating root resorption have focused on the root length without considering 3-D resorption. The purpose of this study was to volumetrically evaluate root resorption using cone beam computed tomography (CBCT) after mini-screw implant intrusion. 1. The volumes of 32 teeth were measured using CBCT and laser scanning to verify the accuracy of CBCT. 2. Twelve overerupted molars from adult patients were investigated in this study. After mini-screw implants were inserted into the buccal and palatal alveolar bones, 150 g of force was applied to the mini-screw implants on each side to intrude the molars. CBCT images of all patients were taken immediately prior to intrusion and after intrusion. The volumes of the roots were calculated using the Mimics software program. The differences between the pre-intrusion and post-intrusion root volumes were statistically evaluated with a paired-samples t-test. In addition, the losses of the roots were statistically compared with each other using one-way analysis of variance at the P<0.05 level. No statistically significant volume differences were observed between the physical (laser scanning) and CBCT measurements (P>0.05). The overerupted molars were significantly intruded (P<0.05), and the average intrusion was 3.30±1.60 mm. The differences between the pre-intrusion and post-intrusion root volumes were statistically significant for all of the roots investigated (P<0.05). The roots were sorted by volume loss in descending order as follows: mesiobuccal, palatal, and distobuccal. Statistical significance was achieved among the three roots. The average total resorption for each tooth was 58.39±1.54 mm3. Volume measurement using CBCT was able to effectively evaluate root resorption caused by mini-screw intrusion. The highest volume loss was observed in the mesiobuccal root among the three roots of the investigated first molar teeth.
Sargos, P; Charleux, T; Haas, R L; Michot, A; Llacer, C; Moureau-Zabotto, L; Vogin, G; Le Péchoux, C; Verry, C; Ducassou, A; Delannes, M; Mervoyer, A; Wiazzane, N; Thariat, J; Sunyach, M P; Benchalal, M; Laredo, J D; Kind, M; Gillon, P; Kantor, G
2018-04-01
The purpose of this study was to evaluate, during a national workshop, the inter-observer variability in target volume delineation for primary extremity soft tissue sarcoma radiation therapy. Six expert sarcoma radiation oncologists (members of the French Sarcoma Group) received two extremity soft tissue sarcoma radiation therapy cases: one preoperative and one postoperative. They were provided with instructions to contour the gross tumour volume or reconstructed gross tumour volume and the clinical target volume, and to propose a planning target volume. The preoperative radiation therapy case was a patient with a grade 1 extraskeletal myxoid chondrosarcoma of the thigh. The postoperative case was a patient with a grade 3 pleomorphic undifferentiated sarcoma of the thigh. Contour agreement analysis was performed using kappa statistics. For the preoperative case, contouring agreement regarding the gross tumour volume (GTV), clinical target volume and planning target volume was substantial (kappa between 0.68 and 0.77). In the postoperative case, the agreement was only fair for the reconstructed gross tumour volume (kappa: 0.38) but moderate for the clinical target volume and planning target volume (kappa: 0.42). During the workshop discussion, consensus was reached on most of the contour divergences, especially the clinical target volume longitudinal extension. The determination of a limited cutaneous cover was also discussed. Accurate delineation of target volumes appears to be a crucial element to ensure multicenter clinical trial quality assessment, reproducibility and homogeneity in delivering radiation therapy (RT). A quality assessment process should be proposed in this setting. We have shown in our study that preoperative radiation therapy of extremity soft tissue sarcoma has less inter-observer contouring variability. Copyright © 2018 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
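Agreement between two observers' delineations, of the kind summarized by the kappa values above, can be computed voxel-wise on binary masks as in the following sketch; the masks are synthetic and the 5% disagreement rate is an arbitrary illustration.

import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
observer_a = rng.integers(0, 2, size=(64, 64, 32))     # synthetic binary target volume mask, observer A
observer_b = observer_a.copy()
flip = rng.random(observer_a.shape) < 0.05              # introduce 5% voxel-wise disagreement
observer_b[flip] = 1 - observer_b[flip]

kappa = cohen_kappa_score(observer_a.ravel(), observer_b.ravel())
print(f"voxel-wise Cohen's kappa = {kappa:.2f}")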
Tsunoda, A; Mitsuoka, H; Sato, K; Kanayama, S
2000-06-01
Our purpose was to quantify the intracranial cerebrospinal fluid (CSF) volume components using an original MRI-based segmentation technique and to investigate whether a CSF volume index is useful for diagnosis of normal pressure hydrocephalus (NPH). We studied 59 subjects: 16 patients with NPH, 14 young and 13 elderly normal volunteers, and 16 patients with cerebrovascular disease. Images were acquired on a 1.5-T system, using a 3D-fast asymmetrical spin-echo (FASE) method. A region-growing method (RGM) was used to extract the CSF spaces from the FASE images. Ventricular volume (VV) and intracranial CSF volume (ICV) were measured, and a VV/ICV ratio was calculated. Mean VV and VV/ICV ratio were higher in the NPH group than in the other groups, and the differences were statistically significant, whereas the mean ICV value in the NPH group was not significantly increased. Of the 16 patients in the NPH group, 13 had VV/ICV ratios above 30%. In contrast, no subject in the other groups had a VV/ICV ratio higher than 30%. We conclude that these CSF volume parameters, especially the VV/ICV ratio, are useful for the diagnosis of NPH.
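Once the ventricular and total intracranial CSF spaces have been segmented, the VV/ICV ratio reduces to a voxel count weighted by the voxel volume, as in this minimal sketch; the masks and the 1 mm^3 voxel size are illustrative assumptions.

import numpy as np

voxel_volume_ml = 0.001                                      # assumed 1 mm^3 voxels, expressed in mL
ventricle_mask = np.zeros((128, 128, 90), dtype=bool)        # toy ventricular segmentation
csf_mask = np.zeros_like(ventricle_mask)                     # toy total intracranial CSF segmentation
ventricle_mask[50:70, 50:78, 30:60] = True
csf_mask[30:100, 30:100, 10:80] = True

vv = ventricle_mask.sum() * voxel_volume_ml
icv = csf_mask.sum() * voxel_volume_ml
print(f"VV = {vv:.1f} mL, ICV = {icv:.1f} mL, VV/ICV = {100 * vv / icv:.1f}%")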
NASA Astrophysics Data System (ADS)
Jongkreangkrai, C.; Vichianin, Y.; Tocharoenchai, C.; Arimura, H.; Alzheimer's Disease Neuroimaging Initiative
2016-03-01
Several studies have differentiated Alzheimer's disease (AD) using cerebral image features derived from MR brain images. In this study, we were interested in combining hippocampus and amygdala volumes and entorhinal cortex thickness to improve the performance of AD differentiation. Thus, our objective was to investigate the useful features obtained from MRI for classification of AD patients using a support vector machine (SVM). T1-weighted MR brain images of 100 AD patients and 100 normal subjects were processed using the FreeSurfer software to measure hippocampus and amygdala volumes and entorhinal cortex thicknesses in both brain hemispheres. Relative volumes of the hippocampus and amygdala were calculated to correct for variation in individual head size. SVM was employed with five combinations of features (H: hippocampus relative volumes, A: amygdala relative volumes, E: entorhinal cortex thicknesses, HA: hippocampus and amygdala relative volumes, and ALL: all features). Receiver operating characteristic (ROC) analysis was used to evaluate the method. AUC values of the five combinations were 0.8575 (H), 0.8374 (A), 0.8422 (E), 0.8631 (HA) and 0.8906 (ALL). Although “ALL” provided the highest AUC, there were no statistically significant differences among them except for the “A” feature. Our results showed that all suggested features may be feasible for computer-aided classification of AD patients.
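A minimal sketch of this kind of feature-combination comparison with an SVM and ROC analysis is shown below; the features and labels are synthetic, and the column groupings only stand in for the H, A, E, HA and ALL sets rather than reproducing the study's data.

import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
y = np.repeat([0, 1], n // 2)                                 # 0 = normal, 1 = AD (synthetic labels)
X = np.column_stack([rng.normal(loc=-0.4 * y + base, scale=1.0, size=n)
                     for base in (0.0, 0.1, -0.1, 0.2, 0.0, -0.2)])   # 6 synthetic features

feature_sets = {"H": [0, 1], "A": [2, 3], "E": [4, 5], "HA": [0, 1, 2, 3], "ALL": list(range(6))}
for name, cols in feature_sets.items():
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    scores = cross_val_predict(clf, X[:, cols], y, cv=5, method="predict_proba")[:, 1]
    print(name, "AUC =", round(roc_auc_score(y, scores), 3))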
Osterberg, T; Norinder, U
2001-01-01
A method of modelling and predicting biopharmaceutical properties using simple theoretically computed molecular descriptors and multivariate statistics has been investigated for several data sets related to solubility, IAM chromatography, permeability across Caco-2 cell monolayers, human intestinal perfusion, brain-blood partitioning, and P-glycoprotein ATPase activity. The molecular descriptors (e.g. molar refractivity, molar volume, index of refraction, surface tension and density) and logP were computed with ACD/ChemSketch and ACD/logP, respectively. Good statistical models were derived that permit simple computational prediction of biopharmaceutical properties. All final models derived had R(2) values ranging from 0.73 to 0.95 and Q(2) values ranging from 0.69 to 0.86. The RMSEP values for the external test sets ranged from 0.24 to 0.85 (log scale).
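As a sketch of how such descriptor-based models can be built and validated, the following fits a partial least squares regression and reports a fitted R2 alongside a cross-validated Q2; the descriptors and response are synthetic, and PLS is one plausible choice of multivariate method rather than necessarily the one used here.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))          # synthetic descriptors (e.g. molar refractivity, molar volume, logP, ...)
y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.2, -0.1]) + rng.normal(scale=0.4, size=60)

pls = PLSRegression(n_components=3)
r2 = r2_score(y, pls.fit(X, y).predict(X).ravel())                   # fitted R^2
q2 = r2_score(y, cross_val_predict(pls, X, y, cv=7).ravel())         # cross-validated Q^2
print(f"R2 = {r2:.2f}, Q2 = {q2:.2f}")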
Hematological Alterations on Sub-acute Exposure to Flubendiamide in Sprague Dawley Rats.
Vemu, Bhaskar; Dumka, Vinod Kumar
2014-01-01
Pesticide poisoning is a common occurrence around the world. Pesticides can act on various body systems, resulting in toxicity. Flubendiamide is a new-generation pesticide reported to have better activity against Lepidopteran insects. The present study was carried out with the objective of analyzing the effects of sub-acute flubendiamide exposure on the hematology of rats. Male and female Sprague Dawley (SD) rats (9-11 weeks) were divided into five groups with six animals in each group. The first group served as control, while the rest were exposed to ascending oral doses of flubendiamide (125, 250, 500 and 1000 mg/kg) for 28 days. After the trial period, blood was collected in heparinized vials and analyzed using a Siemens ADVIA 2120 autoanalyzer. Various erythrocytic, platelet and leukocyte parameters were measured and analyzed by one-way analysis of variance (ANOVA) and t-tests using the Statistical Package for the Social Sciences (SPSS) 20 software. Statistical analysis showed that the effect of flubendiamide exposure on female rats was negligible. The only significant change observed in the female rats was in the total erythrocyte count, while the rest of the parameters showed non-significant bidirectional changes. In males, many parameters, viz., total leukocyte count (TLC), total erythrocyte count (TEC), packed cell volume (PCV), mean corpuscular volume (MCV), platelet count (PC), mean platelet volume (MPV), platelet distribution width (PDW), hemoglobin distribution width (HDW), large platelets (LPT) and plateletcrit (PCT), showed significant differences compared to the control. Many of the changes were dose independent but sex specific. This led to the hypothesis that saturation toxicokinetics might be one of the reasons for this varied response, which can only be evaluated after further testing.
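A minimal sketch of the dose-group comparison described (one-way ANOVA followed by a t-test) is given below; the packed cell volume values are synthetic and the group means are arbitrary.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic packed cell volume (%) for the control and four flubendiamide dose groups (n = 6 each)
groups = [rng.normal(loc=mu, scale=2.0, size=6) for mu in (45, 44, 43, 41, 40)]

f_stat, p_anova = stats.f_oneway(*groups)
t_stat, p_t = stats.ttest_ind(groups[0], groups[-1])          # control vs highest dose
print(f"ANOVA p = {p_anova:.3f}; control vs 1000 mg/kg t-test p = {p_t:.3f}")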
Evaluation of mean platelet volume in localized scleroderma.
Bahali, Anil Gulsel; Su, Ozlem; Emiroglu, Nazan; Cengiz, Fatma Pelin; Kaya, Mehmet Onur; Onsun, Nahide
2017-01-01
Localized scleroderma is a chronic inflammatory skin disease characterized by sclerosis of the dermis and subcutaneous tissue. Platelets play an important role in inflammation. Following activation, platelets rapidly release numerous mediators and cytokines, which contribute to inflammation. The aim was to evaluate whether there was any relation between localized scleroderma and platelet parameters. Forty-one patients with localized scleroderma were enrolled in the study. The control group consisted of 30 healthy subjects. The mean platelet volume was 9.9 ± 1.3 fl in the patient group and 7.6 ± 1.1 fl in the control group; this difference was statistically significant (p<0.001). The plateletcrit values were minimally higher in the patient group compared to the control group, which was also statistically significant (p<0.001). There was no significant difference in the platelet counts between the two groups (p=0.560). In the patient group, there was no significant relation between mean platelet volume levels and clinical signs of disease (p=0.09). However, plateletcrit values were higher in generalized than in localized forms of the disease (p=0.01). The limited number of patients and the retrospective nature of the study were our limitations. This study suggests that platelets might play a role in the pathogenesis of scleroderma. Platelet parameters may be used as markers for evaluating disease severity and inflammatory processes. Thus, there is a need for more detailed and prospective studies.
Lorenzi, M; Ayache, N; Pennec, X
2015-07-15
In this study we introduce the regional flux analysis, a novel approach to deformation-based morphometry based on the Helmholtz decomposition of deformations parameterized by stationary velocity fields. We use the scalar pressure map associated with the irrotational component of the deformation to discover the critical regions of volume change. These regions are used to consistently quantify the associated measure of volume change by the probabilistic integration of the flux of the longitudinal deformations across the boundaries. The presented framework unifies voxel-based and regional approaches, and robustly describes the volume changes at both group-wise and subject-specific levels as a spatial process governed by consistently defined regions. Our experiments on the large cohorts of the ADNI dataset show that the regional flux analysis is a powerful and flexible instrument for the study of Alzheimer's disease in a wide range of scenarios: cross-sectional deformation-based morphometry, longitudinal discovery and quantification of group-wise volume changes, and statistically powered and robust quantification of hippocampal and ventricular atrophy. Copyright © 2015 Elsevier Inc. All rights reserved.
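The core idea, extracting the irrotational (pressure) part of a stationary velocity field and integrating its divergence over a region to obtain a volume change, can be sketched on a 2-D toy field as follows; this is only an assumed illustration of the Helmholtz step, not the authors' implementation.

import numpy as np

# Toy 2-D stationary velocity field (vx, vy) on a regular grid, expanding around the centre
n = 128
y, x = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
bump = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / 200.0)
vx, vy = bump, bump

# Divergence of the field, then the scalar pressure p solving laplacian(p) = div(v) via FFT
div = np.gradient(vx, axis=1) + np.gradient(vy, axis=0)
kx = np.fft.fftfreq(n) * 2 * np.pi
ky = np.fft.fftfreq(n) * 2 * np.pi
k2 = kx[None, :] ** 2 + ky[:, None] ** 2
k2[0, 0] = 1.0                                   # avoid division by zero for the mean mode
p = np.real(np.fft.ifft2(-np.fft.fft2(div) / k2))
p -= p.mean()                                    # pressure is defined up to a constant

# Volume change associated with a region of interest ~ integral of the divergence over it
roi = ((x - 64) ** 2 + (y - 64) ** 2) < 20 ** 2
print("approximate volume change in ROI:", float(div[roi].sum()))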
Roller, Lauren A; Bruce, Beau B; Saindane, Amit M
2015-04-01
Measurement of posterior fossa volume has been proposed to have diagnostic utility and physiologic significance in the Chiari malformation type 1. This study evaluated the effects of demographics on posterior fossa volume and total intracranial volume in adult control subjects, adult patients with Chiari malformation type 1, and adult patients with idiopathic intracranial hypertension, who may share some imaging features of patients with Chiari malformation type 1. Twenty-eight patients with Chiari malformation type 1, 21 patients with idiopathic intracranial hypertension, and 113 control subjects underwent brain MRI including contrast-enhanced 3D gradient-recalled echo (GRE) T1-weighted imaging. Linear measurements of the posterior fossa and intracranial space were obtained. Manual segmentation of the posterior fossa and intracranial space was performed to yield posterior fossa volume and total intracranial volume. Age, sex, race, and body mass index (weight in kilograms divided by the square of height in meters; BMI) were controlled for when comparing cohorts. Three of the 12 linear measurements significantly predicted total intracranial volume (accounting for 74% of variance), and four predicted posterior fossa volume (54% of variance). Age, race, sex, and BMI each statistically significantly influenced posterior fossa volume and total intracranial volume. No statistically significant differences in posterior fossa volume, total intracranial volume, or ratio of posterior fossa volume to total intracranial volume were seen between the Chiari malformation type 1 group and control group after controlling for demographics. Patients with idiopathic intracranial hypertension were more likely than control subjects to have smaller posterior fossa volumes (odds ratio [OR]=1.81; p=0.01) and larger total intracranial volumes (OR=1.24; p=0.06). Linear measurements of the posterior fossa are not strong predictors of posterior fossa volume. Age, race, sex, and BMI have statistically significant effects on intracranial measurements that must be considered, particularly with respect to posterior fossa volume in Chiari malformation type 1. Even when these demographic variables are appropriately accounted for, other similarly presenting diseases may show small posterior fossa volumes.
Timber resource statistics for the Porcupine inventory unit ofAlaska, 1978.
Theodore S. Setzer
1987-01-01
A timber resource inventory of the Porcupine inventory unit, Alaska, was conducted in 1977 and 1978. Statistics on forest area, timber volumes, and annual growth and mortality from this inventory are presented. Timberland area is estimated at 1,453 thousand acres, and net growing stock volume, mostly softwood, is 530,505 thousand cubic feet. Net annual growth of...
Timber resource statistics for the Yakataga inventory unit, Alaska, 1976.
Willem W.S. van Hees
1985-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1976 timber inventory of the Yakataga unit, Alaska. Timberland area is estimated at 209.3 thousand acres (84.7 thousand ha), net growing stock volume at 917.1 million cubic feet (26.0 million m3), and annual net growth and...
Timber resource statistics for the Ketchikan inventory unit, Alaska, 1974.
Willem W.S. van Hees
1984-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1974 timber inventory of the Ketchikan. unit, Alaska. Timberland area is estimated at 1.16 million acres (470 040 ha), net growing stock volume at 6.39 billion cubic feet (181.04 million m3), and annual net growth and...
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle-concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.
Blumen, Helena M; Brown, Lucy L; Habeck, Christian; Allali, Gilles; Ayers, Emmeline; Beauchet, Olivier; Callisaya, Michele; Lipton, Richard B; Mathuranath, P S; Phan, Thanh G; Pradeep Kumar, V G; Srikanth, Velandai; Verghese, Joe
2018-04-09
Accelerated gait decline in aging is associated with many adverse outcomes, including an increased risk for falls, cognitive decline, and dementia. Yet, the brain structures associated with gait speed, and how they relate to specific cognitive domains, are not well-understood. We examined structural brain correlates of gait speed, and how they relate to processing speed, executive function, and episodic memory in three non-demented and community-dwelling older adult cohorts (Overall N = 352), using voxel-based morphometry and multivariate covariance-based statistics. In all three cohorts, we identified gray matter volume covariance patterns associated with gait speed that included brain stem, precuneus, fusiform, motor, supplementary motor, and prefrontal (particularly ventrolateral prefrontal) cortex regions. Greater expression of these gray matter volume covariance patterns linked to gait speed were associated with better processing speed in all three cohorts, and with better executive function in one cohort. These gray matter covariance patterns linked to gait speed were not associated with episodic memory in any of the cohorts. These findings suggest that gait speed, processing speed (and to some extent executive functions) rely on shared neural systems that are subject to age-related and dementia-related change. The implications of these findings are discussed within the context of the development of interventions to compensate for age-related gait and cognitive decline.
The Proell Effect: A Macroscopic Maxwell's Demon
NASA Astrophysics Data System (ADS)
Rauen, Kenneth M.
2011-12-01
Maxwell's Demon is a legitimate challenge to the Second Law of Thermodynamics when the "demon" is executed via the Proell effect. Thermal energy transfer according to the Kinetic Theory of Heat and Statistical Mechanics that takes place over distances greater than the mean free path of a gas circumvents the microscopic randomness that leads to macroscopic irreversibility. No information is required to sort the particles as no sorting occurs; the entire volume of gas undergoes the same transition. The Proell effect achieves quasi-spontaneous thermal separation without sorting by the perturbation of a heterogeneous constant volume system with displacement and regeneration. The classical analysis of the constant volume process, such as found in the Stirling Cycle, is incomplete and therefore incorrect. There are extra energy flows that classical thermodynamics does not recognize. When a working fluid is displaced across a regenerator with a temperature gradient in a constant volume system, complementary compression and expansion work takes place that transfers energy between the regenerator and the bulk gas volumes of the hot and cold sides of the constant volume system. Heat capacity at constant pressure applies instead of heat capacity at constant volume. The resultant increase in calculated, recyclable energy allows the Carnot Limit to be exceeded in certain cycles. Super-Carnot heat engines and heat pumps have been designed and a US patent has been awarded.
Bartholomay, R.C.
1993-01-01
Water from 11 wells completed in the Snake River Plain aquifer at the Idaho National Engineering Laboratory was sampled as part of the U.S. Geological Survey's quality assurance program to determine the effect of purging different borehole volumes on tritium and strontium-90 concentrations. Wells were selected for sampling on the basis of the length of time it took to purge a borehole volume of water. Samples were collected after purging one, two, and three borehole volumes. The U.S. Department of Energy's Radiological and Environmental Sciences Laboratory provided analytical services. Statistics were used to determine the reproducibility of analytical results. The comparison between tritium and strontium-90 concentrations after purging one and three borehole volumes and two and three borehole volumes showed that all but two sample pairs with defined numbers were in statistical agreement. Results indicate that concentrations of tritium and strontium-90 are not affected measurably by the number of borehole volumes purged.
A hybrid ARIMA and neural network model applied to forecast catch volumes of Selar crumenophthalmus
NASA Astrophysics Data System (ADS)
Aquino, Ronald L.; Alcantara, Nialle Loui Mar T.; Addawe, Rizavel C.
2017-11-01
The Selar crumenophthalmus, known in English as the big-eyed scad and locally as matang-baka, is one of the fishes commonly caught along the waters of La Union, Philippines. The study deals with forecasting catch volumes of big-eyed scad for commercial consumption. The data used are quarterly catch volumes of big-eyed scad from 2002 to the first quarter of 2017. These data are available from the OpenSTAT database published by the Philippine Statistics Authority (PSA), whose task is to collect, compile, analyze and publish information concerning different aspects of the Philippine setting. Autoregressive Integrated Moving Average (ARIMA) models, an Artificial Neural Network (ANN) model, and a hybrid model consisting of ARIMA and ANN were developed to forecast catch volumes of big-eyed scad. Statistical errors such as Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) were computed and compared to choose the most suitable model for forecasting the catch volume for the next few quarters. A comparison of the results of each model and the corresponding statistical errors reveals that the hybrid model, ARIMA-ANN (2,1,2)(6:3:1), is the most suitable model to forecast the catch volumes of the big-eyed scad for the next few quarters.
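A hybrid in the spirit described, an ARIMA model for the linear structure and a small neural network trained on the ARIMA residuals, can be sketched as follows; the series is synthetic and the order (2,1,2) with four residual lags is an illustrative choice, not the (2,1,2)(6:3:1) model selected in the study.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(60)
catch = 50 + np.cumsum(rng.normal(size=60)) + 5 * np.sin(t * np.pi / 2)   # synthetic quarterly catch volumes

# Stage 1: linear component with ARIMA
arima = ARIMA(catch, order=(2, 1, 2)).fit()
residuals = catch - arima.predict(start=0, end=len(catch) - 1)

# Stage 2: nonlinear component - a small neural network on lagged residuals
lags = 4
X = np.column_stack([residuals[i:len(residuals) - lags + i] for i in range(lags)])
y = residuals[lags:]
ann = MLPRegressor(hidden_layer_sizes=(3,), max_iter=5000, random_state=0).fit(X, y)

# One-step-ahead hybrid forecast = ARIMA forecast + predicted residual correction
next_linear = arima.forecast(1)[0]
next_resid = ann.predict(residuals[-lags:].reshape(1, -1))[0]
print("hybrid forecast for next quarter:", round(next_linear + next_resid, 2))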
Masked areas in shear peak statistics. A forward modeling approach
Bard, D.; Kratochvil, J. M.; Dawson, W.
2016-03-09
The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.
1995-09-01
path and aircraft attitude and other flight or aircraft parameters; calculations in the frequency domain (Fast Fourier Transform); data analysis ... signal filtering, image processing of video and radar data, parameter identification, statistical analysis, power spectral density, Fast Fourier Transform ... airspeeds both fast and slow, altitude, load factor both above and below 1 g, centers of gravity (fore and aft), and with system/subsystem failures. Whether...
Deborah S. Page-Dumroese; Ann M. Abbott; Thomas M. Rice
2009-01-01
Volume I and volume II of the Forest Soil Disturbance Monitoring Protocol (FSDMP) provide information for a wide range of users, including technicians, field crew leaders, private landowners, land managers, forest professionals, and researchers. Volume I: Rapid Assessment includes the basic methods for establishing forest soil monitoring transects and consistently...
Theodore S. Setzer; Bert R. Mead; Gary L. Carroll
1984-01-01
A multiresource inventory of the Willow block, Susitna River basin inventory unit, was conducted in 1978. Statistics on forest area, timber volumes, and growth and mortality from this inventory are presented. Timberland area is estimated at 230,200 acres and net growing stock volume, mostly birch, at 231.9 million cubic feet. Net annual growth of growing stock is...
Bert R. Mead; Theodore S. Setzer; Gary L. Carroll
1985-01-01
A multiresource inventory of the Upper Susitna block, Susitna River basin inventory unit, was conducted in 1980. Statistics on forest area, timber volumes, and annual growth from this inventory are presented. Timberland area is estimated at 112,130 acres, and net growing stock volume, mostly hardwood, is 84.6 million cubic feet. Net annual growth of growing stock is...
Gary L. Carroll; Theodore S. Setzer; Bert R. Mead
1985-01-01
A multiresource inventory of the Beluga block, Susitna River basin inventory unit, was conducted in 1980. Statistics on forest area, timber volumes, and growth and mortality from this inventory are presented. Timberland area is estimated at 131,740 acres and net growing stock volume, mostly hardwood, is 99.4 million cubic feet. Net annual growth of growing stock is...
Theodore S. Setzer; Gary L. Carroll; Bert R. Mead
1984-01-01
A multiresource inventory of the Talkeetna block, Susitna River basin inventory unit, was conducted in 1979. Statistics on forest area, timber volumes, and growth and mortality from this inventory are presented. Timberland area is estimated at 562,105 acres and net growing stock volume, mostly hardwood, at 574.7 million cubic feet. Net annual growth of growing stock is...
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…
Timber resource statistics for the Petersburg/Wrangell inventory unit, Alaska, 1972.
Willem W.S. Van Hees; Vernon J. LaBau
1983-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1972 timber inventory of the Petersburg/Wrangell unit, Alaska. Timberland area is estimated at 1.3 million acres (520 770 ha), net growing stock volume at 7.1 billion cubic feet (200.2 million m3), and annual net growth and...
Timber resource statistics for the Prince of Wales inventory unit, Alaska, 1973.
Willem W.S. Van Hees; Vernon J. LaBau
1983-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1973 timber inventory of the Prince of Wales unit, Alaska. Timberland area is estimated at 1.38 million acres (557 593 ha), net growing stock volume at 7.56 billion cubic feet (214 million m3), and annual net growth and...
Timber resource statistics for the Juneau inventory unit, Alaska, 1970.
Vernon J. LaBau; Willem W.S. Van Hees
1983-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented for the 1970 timber inventory of the Juneau unit, Alaska. Estimates for commercial forest land area total 1.3 million acres (535 000 ha) with a net growing stock volume of 8.3 billion cubic feet (234 million m3), and annual net growth...
Timber resource statistics for the Yakutat inventory unit, Alaska, 1975.
Willem W.S. Van Hees; Vernon J. LaBau
1984-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1975 timber inventory of the Yakutat unit, Alaska. Area of timberland is estimated at 236.3 thousand acres (95.6 thousand ha), net volume of growing stock at 1.1 billion cubic feet (29.9 million m3), and annual net growth and...
Brief communication: Skeletal biology past and present: Are we moving in the right direction?
Hens, Samantha M; Godde, Kanya
2008-10-01
In 1982, Spencer's edited volume A History of American Physical Anthropology: 1930-1980 allowed numerous authors to document the state of our science, including a critical examination of skeletal biology. Some authors argued that the first 50 years of skeletal biology were characterized by the descriptive-historical approach with little regard for processual problems and that technological and statistical analyses were not rooted in theory. In an effort to determine whether Spencer's landmark volume impacted the field of skeletal biology, a content analysis was carried out for the American Journal of Physical Anthropology from 1980 to 2004. The percentage of skeletal biology articles is similar to that of previous decades. Analytical articles averaged only 32% and are defined by three criteria: statistical analysis, hypothesis testing, and broader explanatory context. However, when these criteria were scored individually, nearly 80% of papers attempted a broader theoretical explanation, 44% tested hypotheses, and 67% used advanced statistics, suggesting that the skeletal biology papers in the journal have an analytical emphasis. Considerable fluctuation exists between subfields; trends toward a more analytical approach are witnessed in the subfields of age/sex/stature/demography, skeletal maturation, anatomy, and nonhuman primate studies, which also increased in frequency, while paleontology and pathology were largely descriptive. Comparisons to the International Journal of Osteoarchaeology indicate that there are statistically significant differences between the two journals in terms of analytical criteria. These data indicate a positive shift in theoretical thinking, i.e., an attempt by most to explain processes rather than present a simple description of events.
Statistical Abstract of the United States: 2012. 131st Edition
ERIC Educational Resources Information Center
US Census Bureau, 2011
2011-01-01
"The Statistical Abstract of the United States," published from 1878 to 2012, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States. It is designed to serve as a convenient volume for statistical reference, and as a guide to other statistical publications and…
Experimental investigation of the structural behavior of equine urethra.
Natali, Arturo Nicola; Carniel, Emanuele Luigi; Frigo, Alessandro; Fontanella, Chiara Giulia; Rubini, Alessandro; Avital, Yochai; De Benedictis, Giulia Maria
2017-04-01
An integrated experimental and computational investigation was developed, aiming to provide a methodology for characterizing the structural response of the urethral duct. The investigation provides information suitable for the comprehension of lower urinary tract mechanical functionality and for the optimal design of prosthetic devices. Experimental activity entailed the execution of inflation tests performed on segments of horse penile urethras from both proximal and distal regions. Inflation tests were performed at different imposed volumes. Each test followed a two-step procedure: the tubular segment was inflated almost instantaneously during the first step, while the volume was held constant for about 300 s to allow the development of relaxation processes during the second step. Tests performed on the same specimen were separated by 600 s of rest to allow recovery of the specimen's mechanical condition. Results from the experimental activities were statistically analyzed and processed by means of a specific mechanical model. This computational model was developed with the purpose of interpreting the general pressure-volume-time response of biological tubular structures. The model includes parameters that interpret the elastic and viscous behavior of hollow structures, directly correlated with the results from the experimental activities. Post-processing of experimental data provided information about the non-linear elastic and time-dependent behavior of the urethral duct. In detail, statistically representative pressure-volume and pressure-relaxation curves were identified and summarized by structural parameters. Considering elastic properties, the initial stiffness ranged between 0.677 ± 0.026 kPa and 0.262 ± 0.006 kPa moving from the proximal to the distal region of the penile urethra. Viscous parameters showed values typical of soft biological tissues: τ1 = 0.153 ± 0.018 s and τ2 = 17.458 ± 1.644 s for the proximal region, and τ1 = 0.201 ± 0.085 s and τ2 = 8.514 ± 1.379 s for the distal region. A general procedure for the mechanical characterization of the urethral duct has been provided. The proposed methodology allows identifying mechanical parameters that properly express the mechanical behavior of the biological tube. The approach is especially suitable for evaluating the influence of degenerative phenomena on lower urinary tract mechanical functionality. This information is mandatory for the optimal design of potential surgical procedures and devices. Copyright © 2017 Elsevier B.V. All rights reserved.
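The two relaxation constants reported above can be estimated by fitting a two-term exponential decay to the constant-volume pressure-relaxation phase, as in this sketch; the time base, noise level and starting values are assumptions and the data are synthetic.

import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, p_inf, a1, tau1, a2, tau2):
    # Two-term Prony-type pressure relaxation at constant imposed volume
    return p_inf + a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0, 300, 600)                                  # seconds after the near-instantaneous inflation
p_true = relaxation(t, 2.0, 0.8, 0.2, 0.5, 17.0)              # synthetic "measured" pressure (kPa)
p_meas = p_true + np.random.default_rng(0).normal(scale=0.01, size=t.size)

popt, _ = curve_fit(relaxation, t, p_meas, p0=[2.0, 1.0, 1.0, 1.0, 10.0])
print("p_inf, a1, tau1, a2, tau2 =", np.round(popt, 3))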
Jamming II: Edwards’ statistical mechanics of random packings of hard spheres
NASA Astrophysics Data System (ADS)
Wang, Ping; Song, Chaoming; Jin, Yuliang; Makse, Hernán A.
2011-02-01
The problem of finding the most efficient way to pack spheres has an illustrious history, dating back to the crystalline arrays conjectured by Kepler and the random geometries explored by Bernal in the 1960s. This problem finds applications spanning from the mathematician’s pencil, the processing of granular materials, the jamming and glass transitions, all the way to fruit packing in every grocery. There are presently numerous experiments showing that the loosest way to pack spheres gives a density of ∼55% (named random loose packing, RLP) while filling all the loose voids results in a maximum density of ∼63%-64% (named random close packing, RCP). While those values seem robustly true, to this date there is no well-accepted physical explanation or theoretical prediction for them. Here we develop a common framework for understanding the random packings of monodisperse hard spheres whose limits can be interpreted as the experimentally observed RLP and RCP. The reason for these limits arises from a statistical picture of jammed states in which the RCP can be interpreted as the ground state of the ensemble of jammed matter with zero compactivity, while the RLP arises in the infinite compactivity limit. We combine an extended statistical mechanics approach ‘a la Edwards’ (where the role traditionally played by the energy and temperature in thermal systems is substituted by the volume and compactivity) with a constraint on mechanical stability imposed by the isostatic condition. We show how such approaches can bring results that can be compared to experiments and allow for an exploitation of the statistical mechanics framework. The key result is the use of a relation between the local Voronoi volumes of the constituent grains (denoted the volume function) and the number of neighbors in contact that permits us to simply combine the two approaches to develop a theory of volume fluctuations in jammed matter. Ultimately, our results lead to a phase diagram that provides a unifying view of the disordered hard sphere packing problem and further sheds light on a diverse spectrum of data, including the RLP state. Theoretical results are well reproduced by numerical simulations that confirm the essential role played by friction in determining both the RLP and RCP limits. The RLP values depend on friction, explaining why varied experimental results can be obtained.
NASA Astrophysics Data System (ADS)
Kuroda, Koji; Maskawa, Jun-ichi; Murai, Joshin
2013-08-01
Empirical studies of high-frequency data in stock markets show that the time series of trade signs or signed volumes has a long-memory property. In this paper, we present a discrete-time stochastic process for a polymer model describing a trader's trading strategy, and show that a scaling limit of the process converges to a superposition of fractional Brownian motions (with appropriate Hurst exponents) and Brownian motion, provided that the index γ of the time scale of the trader's investment strategy coincides with the index δ of the interaction range in the discrete-time process. The main tool for the investigation is the method of cluster expansion developed in the mathematical study of statistical mechanics.
Statistical Representations of Track Geometry : Volume II, Appendices.
DOT National Transportation Integrated Search
1980-03-31
This volume contains some of the more detailed data and analyses to support the results and conclusions reached in Volume I of this report. It is divided into appendixes lettered A through J. Appendix A defines a procedure for evaluating the statisti...
[Research design and statistical methods in the Chinese Journal of Cardiology: a repeat review].
Kong, Qun-yu; Yu, Jin-ming; Jia, Gong-xian; Lin, Fan-li
2012-11-01
To re-evaluate the research design and the use of statistical methods in the Chinese Journal of Cardiology, the research designs and statistical methods of all original papers published in the journal during 2011 were summarized and compared with the 2008 evaluation. (1) There was no difference between the two volumes in the distribution of research designs. Compared with the earlier volume, the use of survival regression and non-parametric tests increased, while the proportion of articles with no statistical analysis decreased. (2) The proportions of problematic articles in the later volume were significantly lower than in the earlier one: 6 (4%) with flaws in design, 5 (3%) with flaws in presentation, and 9 (5%) with incomplete analysis. (3) The rates of correct use of analysis of variance, multi-group comparisons, and tests of normality all increased. The rate of errors arising from neglect of the test of homogeneity of variance (17% versus 25% in the two evaluations) did not differ significantly. In summary, the Chinese Journal of Cardiology showed many improvements in the standardization of design and statistics; the homogeneity of variance assumption deserves more attention in future work.
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
NASA Astrophysics Data System (ADS)
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show a central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound, like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
Diurnal fluctuations in brain volume: Statistical analyses of MRI from large populations.
Nakamura, Kunio; Brown, Robert A; Narayanan, Sridar; Collins, D Louis; Arnold, Douglas L
2015-09-01
We investigated fluctuations in brain volume throughout the day using statistical modeling of magnetic resonance imaging (MRI) from large populations. We applied fully automated image analysis software to measure the brain parenchymal fraction (BPF), defined as the ratio of the brain parenchymal volume and intracranial volume, thus accounting for variations in head size. The MRI data came from serial scans of multiple sclerosis (MS) patients in clinical trials (n=755, 3269 scans) and from subjects participating in the Alzheimer's Disease Neuroimaging Initiative (ADNI, n=834, 6114 scans). The percent change in BPF was modeled with a linear mixed effect (LME) model, and the model was applied separately to the MS and ADNI datasets. The LME model for the MS datasets included random subject effects (intercept and slope over time) and fixed effects for the time-of-day, time from the baseline scan, and trial, which accounted for trial-related effects (for example, different inclusion criteria and imaging protocol). The model for ADNI additionally included the demographics (baseline age, sex, subject type [normal, mild cognitive impairment, or Alzheimer's disease], and interaction between subject type and time from baseline). There was a statistically significant effect of time-of-day on the BPF change in MS clinical trial datasets (-0.180 per day, that is, 0.180% of intracranial volume, p=0.019) as well as the ADNI dataset (-0.438 per day, that is, 0.438% of intracranial volume, p<0.0001), showing that the brain volume is greater in the morning. Linearly correcting the BPF values with the time-of-day reduced the required sample size to detect a 25% treatment effect (80% power and 0.05 significance level) on change in brain volume from 2 time-points over a period of 1 year by 2.6%. Our results have significant implications for future brain volumetric studies, suggesting that there is a potential acquisition time bias that should be randomized or statistically controlled to account for the day-to-day brain volume fluctuations. Copyright © 2015 Elsevier Inc. All rights reserved.
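The time-of-day adjustment described above can be illustrated with a linear mixed-effects fit. The sketch below is not the authors' pipeline: the file name and column names (subject, pct_bpf_change, time_of_day, years_from_baseline, trial) are hypothetical placeholders for a long-format table of serial scans.

```python
# Minimal sketch (not the authors' pipeline): a linear mixed-effects model of the
# percent BPF change with a fixed effect for scan time-of-day and a random
# intercept/slope per subject. The file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bpf_long.csv")  # assumed columns: subject, pct_bpf_change,
                                  # time_of_day, years_from_baseline, trial

model = smf.mixedlm(
    "pct_bpf_change ~ time_of_day + years_from_baseline + C(trial)",
    data=df,
    groups=df["subject"],
    re_formula="~years_from_baseline",  # random intercept and slope over time
)
result = model.fit()
print(result.summary())  # the time_of_day coefficient estimates the diurnal effect
```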
Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E
2013-06-07
In radiation oncology recurrence analysis is an important part in the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/outfield. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm, however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce time and effort conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
NASA Astrophysics Data System (ADS)
Tien, Hai Minh; Le, Kien Anh; Le, Phung Thi Kim
2017-09-01
Biohydrogen is a sustainable energy resource owing to its potentially high efficiency of conversion to usable power and its non-polluting nature. In this work, experiments were carried out to demonstrate the possibility of generating biohydrogen from cassava starch and to identify the effective factors and the optimum conditions. An experimental design was used to investigate the effect of operating temperature (37-43 °C), pH (6-7), and inoculum ratio (6-10%) on the hydrogen production yield, the COD reduction, and the ratio of the volume of hydrogen produced to the COD reduction. The statistical analysis of the experiment indicated that the significant effects on the fermentation yield were the main effects of temperature, pH and inoculum ratio; the interaction effects between them did not appear significant. The central composite design showed that the polynomial regression models were in good agreement with the experimental results. These results will be applied to enhance the treatment of cassava starch processing wastewater.
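As a rough illustration of how a central composite design is commonly analyzed, the sketch below fits a second-order (response-surface) polynomial by ordinary least squares. It is not the authors' analysis; the file and column names (temp, pH, inoculum, h2_yield) are assumptions.

```python
# Illustrative sketch only: fitting a second-order response-surface model to
# central-composite-design runs with ordinary least squares. The file and column
# names (temp, pH, inoculum, h2_yield) are assumptions, not taken from the paper.
import pandas as pd
import statsmodels.formula.api as smf

runs = pd.read_csv("ccd_runs.csv")  # one row per fermentation run

quadratic = smf.ols(
    "h2_yield ~ temp + pH + inoculum"
    " + I(temp**2) + I(pH**2) + I(inoculum**2)"
    " + temp:pH + temp:inoculum + pH:inoculum",
    data=runs,
).fit()
print(quadratic.summary())  # main effects, interactions and fit quality
```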
Timber resource statistics for the Chatham area of the Tongass National Forest, Alaska, 1982.
George Rogers; Willem W.S. van Hees
1991-01-01
Statistics on forest area, total gross and net volumes, and annual net growth and mortality are presented from the 1980-82 timber inventory of the Chatham Area, Tongass National Forest, Alaska. Available timberland area is estimated at 1.4 million acres, net growing stock volume at 7.2 billion cubic feet, and annual net growth and mortality at 35.9 and 54.8 million...
Timber resource statistics for the Stikine area of the Tongass National Forest, Alaska, 1984.
George Rogers; Willem W.S. van Hees
1991-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1983-84 timber inventory of the Stikine Area, Tongass National Forest, Alaska. Available timberland area is estimated at 1.2 million acres, net growing stock volume at 7.2 billion cubic feet, and annual net growth and mortality at 18.8 and 57.0...
Timber resource statistics for the Ketchikan area of the Tongass National Forest, Alaska, 1985.
George Rogers; Willem W.S. van Hees
1991-01-01
Statistics on forest area, total gross and net volumes, and annual net growth and mortality are presented from the 1984-85 timber inventory of the Ketchikan Area, Tongass National Forest, Alaska. Available timberland area is estimated at 1.5 million acres, net growing stock volume at 8.2 billion cubic feet, and annual net growth and mortality at 24.8 and 65.6 million...
Michigan's fourth forest inventory: timber volumes and projections of timber supply.
John S. Jr. Spencer; Jerold T. Hahn
1984-01-01
The fourth inventory of the timber resource of Michigan shows growing-stock volume increased 27% between 1966 and 1980, from 15.1 to 19.1 billion cubic feet. Presented are highlights and statistics on volume, growth, mortality, removals, biomass, and projections.
Carpal tunnel volume changes of the wrist under distraction.
Cho, M S; Means, K R; Shrout, J A; Segalman, K A
2008-10-01
This study attempts to determine changes in carpal canal volume with distraction across the wrist. Uniform longitudinal distraction was maintained with two external fixators on the radial and ulnar aspects of the forearm axis of five cadaver specimens. After CT scanning, volume determinations were made at 5 mm increments beginning at the lunocapitate joint to a point 1.5 cm distal to the middle finger carpometacarpal joint. There was a statistically significant decrease of the mean total carpal canal volume from 0 to 4.54 kg of distraction, with no statistically significant decrease from 0 to 2.27 kg or 2.27 to 4.54 kg. The largest decrease occurred at 15 and 20 mm distal to the proximal edge of the transverse carpal ligament corresponding to the level of the hamate hook. Reduction in mean carpal canal volume was 10.2% and 7.5% at these distances, respectively, from 0 to 4.54 kg of distraction. Progressive distraction across the wrist causes a decrease in total carpal canal volume.
Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz
2015-03-01
FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
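The sample sizes quoted above depend on the assumed effect size and variance of each measure. A minimal sketch of the underlying power calculation, using a two-sample t-test approximation with made-up numbers (mean_volume, sd_volume), is given below; it is not the tool released by the authors.

```python
# A minimal sketch, not the authors' released tool: subjects per group needed to
# detect a 10% between-group difference in a regional volume via a two-sample
# t-test approximation. The mean and SD below are made-up illustrative values.
from statsmodels.stats.power import TTestIndPower

mean_volume = 3500.0   # hypothetical regional mean (mm^3)
sd_volume = 450.0      # hypothetical between-subject SD (mm^3)
effect_size = (0.10 * mean_volume) / sd_volume  # Cohen's d for a 10% difference

n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, power=0.8, alpha=0.05, alternative="two-sided"
)
print(f"subjects per group: {n_per_group:.1f}")
```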
Limit order book and its modeling in terms of Gibbs Grand-Canonical Ensemble
NASA Astrophysics Data System (ADS)
Bicci, Alberto
2016-12-01
In the domain of so-called econophysics, some attempts have already been made to apply the theory of thermodynamics and statistical mechanics to economics and financial markets. In this paper a similar approach is taken from a different perspective, modeling the limit order book and the price formation process of a given stock by the Grand-Canonical Gibbs Ensemble for the bid and ask orders. The application of Bose-Einstein statistics to this ensemble then allows the distribution of sell and buy orders to be derived as a function of price. As a consequence, we can define in a meaningful way expressions for the temperatures of the ensembles of bid orders and of ask orders, which are functions of the minimum bid, maximum ask and closing prices of the stock as well as of the exchanged volume of shares. It is demonstrated that the difference between the ask and bid order temperatures can be related to the VAO (Volume Accumulation Oscillator), an indicator empirically defined in technical analysis of stock markets. Furthermore, the derived distributions for aggregate bid and ask orders can be subjected to well-defined validations against real data, giving a falsifiable character to the model.
Sulter, A M; Wit, H P
1996-11-01
Glottal volume velocity waveform characteristics of 224 subjects, categorized in four groups according to gender and vocal training, were determined, and their relations to sound-pressure level, fundamental frequency, intra-oral pressure, and age were analyzed. Subjects phonated at three intensity conditions. The glottal volume velocity waveforms were obtained by inverse filtering the oral flow. Glottal volume velocity waveforms were parameterized with flow-based (minimum flow, ac flow, average flow, maximum flow declination rate) and time-based parameters (closed quotient, closing quotient, speed quotient), as well as with derived parameters (vocal efficiency and glottal resistance). Higher sound-pressure levels, intra-oral pressures, and flow-parameter values (ac flow, maximum flow declination rate) were observed, when compared with previous investigations. These higher values might be the result of the specific phonation tasks (stressed /ae/ vowel in a word and a sentence) or filtering processes. Few statistically significant (p < 0.01) differences in parameters were found between untrained and trained subjects [the maximum flow declination rate and the closing quotient were higher in trained women (p < 0.001), and the speed quotient was higher in trained men (p < 0.005)]. Several statistically significant parameter differences were found between men and women [minimum flow, ac flow, average flow, maximum flow declination rate, closing quotient, glottal resistance (p < 0.001), and closed quotient (p < 0.005)]. Significant effects of intensity condition were observed on ac flow, maximum flow declination rate, closing quotient, and vocal efficiency in women (p < 0.005), and on minimum flow, ac flow, average flow, maximum flow declination rate, closed quotient, and vocal efficiency in men (p < 0.01).
NASA Astrophysics Data System (ADS)
Gutiérrez, J. M.; Natxiondo, A.; Nieves, J.; Zabala, A.; Sertucha, J.
2017-04-01
The study of shrinkage incidence variations in nodular cast irons is an important aspect of manufacturing processes. These variations change the feeding requirements of castings, and the optimization of riser size is consequently affected when avoiding the formation of shrinkage defects. The effect of a number of processing variables on shrinkage size has been studied using a layout specifically designed for this purpose. The β parameter is defined as the relative volume reduction from the pouring temperature down to room temperature. It is observed that shrinkage size and β decrease as the effective carbon content increases and when inoculant is added in the pouring stream. A similar effect is found when the parameters selected from cooling curves indicate high graphite nucleation during solidification of the cast irons for a given inoculation level. Pearson statistical analysis has been used to analyze the correlations among all the variables involved, and a group of Bayesian networks has subsequently been built to obtain the most accurate model for predicting β as a function of the input processing variables. The developed models can be used in foundry plants to study shrinkage incidence variations in the manufacturing process and to optimize the related costs.
Rivkin, Michael J; Davis, Peter E; Lemaster, Jennifer L; Cabral, Howard J; Warfield, Simon K; Mulkern, Robert V; Robson, Caroline D; Rose-Jacobs, Ruth; Frank, Deborah A
2008-04-01
The objective of this study was to use volumetric MRI to study brain volumes in 10- to 14-year-old children with and without intrauterine exposure to cocaine, alcohol, cigarettes, or marijuana. Volumetric MRI was performed on 35 children (mean age: 12.3 years; 14 with intrauterine exposure to cocaine, 21 with no intrauterine exposure to cocaine) to determine the effect of prenatal drug exposure on volumes of cortical gray matter; white matter; subcortical gray matter; cerebrospinal fluid; and total parenchymal volume. Head circumference was also obtained. Analyses of each individual substance were adjusted for demographic characteristics and the remaining 3 prenatal substance exposures. Regression analyses adjusted for demographic characteristics showed that children with intrauterine exposure to cocaine had lower mean cortical gray matter and total parenchymal volumes and smaller mean head circumference than comparison children. After adjustment for other prenatal exposures, these volumes remained smaller but lost statistical significance. Similar analyses conducted for prenatal ethanol exposure adjusted for demographics showed significant reduction in mean cortical gray matter; total parenchymal volumes; and head circumference, which remained smaller but lost statistical significance after adjustment for the remaining 3 exposures. Notably, prenatal cigarette exposure was associated with significant reductions in cortical gray matter and total parenchymal volumes and head circumference after adjustment for demographics that retained marginal significance after adjustment for the other 3 exposures. Finally, as the number of exposures to prenatal substances grew, cortical gray matter and total parenchymal volumes and head circumference declined significantly with smallest measures found among children exposed to all 4. Conclusions: These data suggest that intrauterine exposures to cocaine, alcohol, and cigarettes are individually related to reduced head circumference; cortical gray matter; and total parenchymal volumes as measured by MRI at school age. Adjustment for other substance exposures precludes determination of statistically significant individual substance effect on brain volume in this small sample; however, these substances may act cumulatively during gestation to exert lasting effects on brain size and volume.
Efficient Encoding and Rendering of Time-Varying Volume Data
NASA Technical Reports Server (NTRS)
Ma, Kwan-Liu; Smith, Diann; Shih, Ming-Yun; Shen, Han-Wei
1998-01-01
Visualization of time-varying volumetric data sets, which may be obtained from numerical simulations or sensing instruments, provides scientists insights into the detailed dynamics of the phenomenon under study. This paper describes a coherent solution based on quantization, coupled with octree and difference encoding for visualizing time-varying volumetric data. Quantization is used to attain voxel-level compression and may have a significant influence on the performance of the subsequent encoding and visualization steps. Octree encoding is used for spatial domain compression, and difference encoding for temporal domain compression. In essence, neighboring voxels may be fused into macro voxels if they have similar values, and subtrees at consecutive time steps may be merged if they are identical. The software rendering process is tailored according to the tree structures and the volume visualization process. With the tree representation, selective rendering may be performed very efficiently. Additionally, the I/O costs are reduced. With these combined savings, a higher level of user interactivity is achieved. We have studied a variety of time-varying volume datasets, performed encoding based on data statistics, and optimized the rendering calculations wherever possible. Preliminary tests on workstations have shown in many cases tremendous reduction by as high as 90% in both storage space and inter-frame delay.
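To make the encoding pipeline above concrete, here is a toy sketch of the voxel-level quantization and temporal difference-encoding stages (the octree stage is omitted). It is illustrative only and not the authors' implementation; the synthetic data are placeholders.

```python
# Toy sketch of the voxel-level quantization and temporal difference encoding
# described above (the octree stage is omitted); not the authors' implementation.
import numpy as np

def quantize(volume, levels=256):
    """Map float voxel values to 8-bit codes (voxel-level compression)."""
    lo, hi = volume.min(), volume.max()
    scaled = (volume - lo) / (hi - lo + 1e-12)
    return (scaled * (levels - 1)).astype(np.uint8)

def difference_encode(frames):
    """Store the first quantized frame plus only the voxels that change."""
    encoded = [("key", frames[0])]
    for prev, curr in zip(frames, frames[1:]):
        changed = np.nonzero(curr != prev)
        encoded.append(("delta", changed, curr[changed]))
    return encoded

rng = np.random.default_rng(0)
base = rng.random((32, 32, 32))          # synthetic 32^3 volume
drift = rng.random((32, 32, 32)) * 0.05  # spatially varying temporal change
frames = [quantize(base + t * drift) for t in range(4)]
print(len(difference_encode(frames)))    # 1 key frame + 3 delta records
```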
Physics of self-aligned assembly at room temperature
NASA Astrophysics Data System (ADS)
Dubey, V.; Beyne, E.; Derakhshandeh, J.; De Wolf, I.
2018-01-01
Self-aligned assembly, making use of capillary forces, is considered as an alternative to active alignment during thermo-compression bonding of Si chips in the 3D heterogeneous integration process. Various process parameters affect the alignment accuracy of the chip over the patterned binding site on a substrate/carrier wafer. This paper discusses the chip motion due to wetting and capillary force using a transient coupled physics model for the two regimes (that is, wetting regime and damped oscillatory regime) in the temporal domain. Using the transient model, the effect of the volume of the liquid and the placement accuracy of the chip on the alignment force is studied. The capillary time (that is, the time it takes for the chip to reach its mean position) for the chip is directly proportional to the placement offset and inversely proportional to the viscosity. The time constant of the harmonic oscillations is directly proportional to the gap between the chips due to the volume of the fluid. The predicted behavior from transient simulations is next experimentally validated and it is confirmed that the liquid volume and the initial placement affect the final alignment accuracy of the top chip on the bottom substrate. With statistical experimental data, we demonstrate an alignment accuracy reaching <1 μm.
An index-flood model for deficit volumes assessment
NASA Astrophysics Data System (ADS)
Strnad, Filip; Moravec, Vojtěch; Hanel, Martin
2017-04-01
The estimation of return periods of hydrological extreme events and the evaluation of risks related to such events are objectives of many water resources studies. The aim of this study is to develop a statistical model for drought indices using extreme value theory and the index-flood method, and to use this model to estimate return levels of maximum deficit volumes of total runoff and baseflow. Deficit volumes for 133 catchments in the Czech Republic for the period 1901-2015, simulated by the hydrological model Bilan, are considered. The characteristics of the simulated deficit periods (severity, intensity and length) correspond well to those based on observed data. It is assumed that annual maximum deficit volumes in each catchment follow the generalized extreme value (GEV) distribution. The catchments are divided into three homogeneous regions considering long-term mean runoff, potential evapotranspiration and baseflow. In line with the index-flood method it is further assumed that the deficit volumes within each homogeneous region are identically distributed after scaling with a site-specific factor. The goodness-of-fit of the statistical model is assessed by the Anderson-Darling statistic. For the estimation of critical values of the test, several resampling strategies allowing appropriate handling of years without drought are presented. Finally, the significance of trends in the deficit volumes is assessed by a likelihood ratio test.
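The index-flood scaling and regional GEV fit described above can be outlined as follows. This is an illustrative sketch, not the study's code: the file names, the choice of the at-site mean as the scaling index, and the 100-year example are assumptions.

```python
# Hedged illustration of the index-flood step described above: annual maximum
# deficit volumes from catchments of one homogeneous region are scaled by a
# site-specific index (here the at-site mean) and pooled into a regional GEV fit.
# File names, catchment ids and the 100-year example are assumptions.
import numpy as np
from scipy.stats import genextreme

annual_maxima = {c: np.loadtxt(f"deficit_{c}.txt") for c in ("A", "B", "C")}

scaled = np.concatenate(
    [vols / vols.mean() for vols in annual_maxima.values()]  # site-specific scaling
)
shape, loc, scale = genextreme.fit(scaled)  # regional growth curve

# 100-year return level for one catchment: regional quantile times its index
regional_q100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(regional_q100 * annual_maxima["A"].mean())
```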
Estimation of the brain stem volume by stereological method on magnetic resonance imaging.
Erbagci, Hulya; Keser, Munevver; Kervancioglu, Selim; Kizilkan, Nese
2012-11-01
Neuron loss that occurs in some neurodegenerative diseases can lead to volume alterations by causing atrophy in the brain stem. The aim of this study was to determine the brain stem volume and the ratio of brain stem volume to total brain volume in relation to gender and age, using the new Stereo Investigator system in normal subjects. For this purpose, MR images of 72 individuals with no pathologic condition were evaluated. The total brain volumes of females and males were calculated as 966.81 ± 77.44 and 1,074.06 ± 111.75 cm3, respectively. Brain stem volumes of females and males were 18.99 ± 2.36 and 22.05 ± 4.01 cm3, respectively. The ratios of brain stem volume to total brain volume were 1.96 ± 0.17 in females and 2.05 ± 0.29 in males. Total brain and brain stem volumes were significantly smaller in females. Among individuals aged between 20 and 40 years, changes in total brain and brain stem volumes with aging were not statistically significant. We believe that measuring brain stem volume with an objective and efficient calculation method will contribute to the early diagnosis of neurodegenerative diseases, as well as to determining the rate of disease progression and the outcomes of treatment.
Ashkani-Esfahani, Soheil; Hosseinabadi, Omid Koohi; Moezzi, Parinaz; Moafpourian, Yalda; Kardeh, Sina; Rafiee, Shima; Fatheazam, Reza; Noorafshan, Ali; Nadimi, Elham; Mehrvarz, Shayan; Khoshneviszadeh, Mehdi; Khoshneviszadeh, Mahsima
2016-08-01
Calcium can play noticeable roles in the wound-healing process, such as its effects on organization of F-actinin collagen bundles by fibroblasts at the injury site. In addition, calcium-channel blockers such as verapamil have antioxidant activity by increasing nitric oxide production that promotes angiogenesis, proliferation of fibroblasts, and endothelial cells in the skin-regeneration process. Therefore, in this study, the authors' objective was to investigate the effects of verapamil on the process of wound healing in rat models according to stereological parameters. In this experimental study, 36 male Wistar rats were randomly divided into 3 groups (n = 12): the control group that received no treatment, gel-base-treated group, and the 5% verapamil gel-treated group. Treatments were done every 24 hours for 15 days. Wound closure rate, volume densities of the collagen bundles and the vessels, vessel's length density and mean diameter, and fibroblast populations were estimated using stereological methods and were analyzed by the Kruskal-Wallis and Mann-Whitney U tests; P < .05 was considered statistically significant. The verapamil-treated group showed a faster wound closure rate in comparison with control and gel-base groups (P = .007 and P = .011). The numerical density of fibroblasts, volume density of collagen bundles, mean diameter, and volume densities of the vessels in the verapamil group were significantly higher than those in the control and the base groups (P < .005). The authors showed that verapamil has the ability to improve wound healing by enhancing fibroblast proliferation, collagen bundle synthesis, and revascularization in skin injuries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartholomay, R.C.
1993-12-31
Water from 11 wells completed in the Snake River Plain aquifer at the Idaho National Engineering Laboratory was sampled as part of the U.S. Geological Survey's quality assurance program to determine the effect of purging different borehole volumes on tritium and strontium-90 concentrations. Wells were selected for sampling on the basis of the length of time it took to purge a borehole volume of water. Samples were collected after purging one, two, and three borehole volumes. The U.S. Department of Energy's Radiological and Environmental Sciences Laboratory provided analytical services. Statistics were used to determine the reproducibility of analytical results. The comparison between tritium and strontium-90 concentrations after purging one and three borehole volumes and two and three borehole volumes showed that all but two sample pairs with defined numbers were in statistical agreement. Results indicate that concentrations of tritium and strontium-90 are not affected measurably by the number of borehole volumes purged.
Zivković, Nikica; Zivković, Krešimir; Despot, Albert; Paić, Josip; Zelić, Ana
2012-12-01
The aim of this study was the clinical testing of the reliability and usability of three-dimensional (3D) and two-dimensional (2D) ultrasound (US) technology. The ultimate purpose was to establish ultrasound methods, standards and protocols for determining the volume of any gynecologic organ or tumor. The study included 31 women of reproductive age and in postmenopause. All patients were examined with a RIC 5-9 3D endovaginal probe (4.3-7.5 MHz) on a Voluson 730 Pro ultrasound device. The volume of myomas was measured using the existing 2D and 3D ultrasound methods on this device. All patients underwent myomectomy or hysterectomy because of clinically and ultrasonographically diagnosed uterine myomas indicating operative intervention. After the operation, the pathologist determined the volume of the removed myomas by measuring them in a gauge bowl containing water, i.e., using Archimedes' principle (buoyancy); these measurements, together with the histopathologic diagnosis, served as the control. A total of 155 myoma volumes were processed on 2D display, 31 myoma volumes were measured preoperatively on 3D display, and 31 myoma volumes were measured by the pathologist. The values of US measurements for each US method were expressed as the mean of all measurements of myoma volumes. Statistical processing of the results with Student's t-test for independent samples revealed that the 2nd examined US method (measuring the myoma using an ellipse and the longer tumor diameter) and the 4th examined US method (measuring the myoma using the longer and shorter tumor diameters and taking their mean) in the 2D US technique, as well as the 6th examined US method in the 3D US technique, showed no significant differences at the 0.05 level compared with the control measurement in a gauge bowl containing water, indicating the acceptability of these US methods for verifying tumor volumes. The standard error in determining the volume of myomas by these US methods varied between 15% and 25%, so it is concluded that these three methods can be used in clinical practice to determine tumor volumes, in this case uterine myomas. The 3D MultiPlane method proved to be the most reliable method for determining the volume of uterine myomas.
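The abstract does not state the exact formulas behind the numbered 2D methods. For orientation only, the sketch below shows a generic ellipsoid approximation often used for ultrasound volume estimates; the formula and the example diameters are assumptions, not the protocol validated in this study.

```python
# The abstract does not spell out the numbered 2D formulas; for orientation only,
# this shows a generic ellipsoid approximation often used for ultrasound volume
# estimates. The formula and example diameters are assumptions, not this study's
# validated protocol.
from math import pi

def ellipsoid_volume(d1_cm, d2_cm, d3_cm):
    """Volume (cm^3) of an ellipsoid from three orthogonal diameters."""
    return (pi / 6.0) * d1_cm * d2_cm * d3_cm

# Example: a myoma measuring 4.2 x 3.6 x 3.1 cm on two orthogonal 2D planes
print(f"{ellipsoid_volume(4.2, 3.6, 3.1):.1f} cm^3")
```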
Minnesota forest statistics, 1977.
Pamela J. Jakes
1980-01-01
Presents highlights and statistics from the Fourth Minnesota Forest Inventory. Includes detailed tables of forest area, timber volume, net annual growth, timber removals, mortality, and timber products output.
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
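A minimal sketch of the statistical chain named above (Welch power-spectral-density features, principal component analysis, Gaussian process classification) is given below using scikit-learn. The audio frames, labels and parameter choices are placeholders; the multi-agent negotiation and in-network deployment are not shown.

```python
# Minimal sketch of the statistical chain named above (Welch PSD features -> PCA
# -> Gaussian process classification) using scikit-learn. Audio frames, labels
# and parameters are placeholders; the multi-agent negotiation is not shown.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.pipeline import make_pipeline

def psd_features(audio_frames, fs=8000):
    """Welch power spectral density estimate for each 1-D audio frame."""
    return np.vstack([welch(frame, fs=fs, nperseg=256)[1] for frame in audio_frames])

rng = np.random.default_rng(1)
frames = rng.standard_normal((60, 2048))  # placeholder audio frames
labels = rng.integers(0, 2, size=60)      # placeholder target classes

clf = make_pipeline(PCA(n_components=8), GaussianProcessClassifier())
clf.fit(psd_features(frames), labels)
print(clf.predict_proba(psd_features(frames[:3])))  # class probabilities for fusion
```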
The fourth Minnesota forest inventory: timber volumes and projections of timber supply.
John S. Jr. Spencer
1982-01-01
The fourth inventory of Minnesota's forest resources shows a 21% increase in growing-stock volume between 1962 and 1977, from 9.4 to 11.5 billion cubic feet. Presented are text and statistics on timber volume, growth, mortality, removals, and future timber supply.
Lu, Lu; Zhang, Lianqing; Hu, Xinyu; Hu, Xiaoxiao; Li, Lingjiang; Gong, Qiyong; Huang, Xiaoqi
2018-04-01
In the current study, we aimed to investigate whether post-traumatic stress disorder (PTSD) is associated with structural alterations in specific subfields of the hippocampus compared with trauma-exposed controls (TC) in a relatively large sample. We included 67 PTSD patients diagnosed under Diagnostic and Statistical Manual of Mental Disorders (4th Edition) (DSM-IV) criteria and 78 age- and sex-matched non-PTSD adult survivors who had experienced similar stressors. High-resolution T1-weighted images were obtained on a GE 3.0 T scanner. The structural data were automatically segmented using FreeSurfer software, and the volumes of the whole hippocampus and of the subfields CA1, CA2-3, CA4-DG, fimbria, presubiculum, subiculum and fissure were extracted. Volume differences between the two groups were statistically compared with age, years of education, time since the events and intracranial volume (ICV) as covariates; hemisphere, sex and diagnosis were entered as fixed factors. Relationships between the morphometric measurements and the Clinician-Administered PTSD Scale (CAPS) score and illness duration were examined using Pearson's correlation in SPSS. Compared with TC, PTSD patients showed no statistically significant alteration in the volumes of the whole hippocampus or any of the subfields (P > 0.05). In male patients, there were significant correlations between the CAPS score and the volumes of the right CA2-3 (R² = 0.197, P = 0.034) and right subiculum (R² = 0.245, P = 0.016), and illness duration correlated with the right fissure (R² = 0.247, P = 0.016). In female patients, CAPS scores correlated significantly with the volumes of the left presubiculum (R² = 0.095, P = 0.042), left subiculum (R² = 0.090, P = 0.048), and left CA4-DG (R² = 0.099, P = 0.037). The main findings of the current study suggest that the stressful event causes non-selective damage to the hippocampus in both PTSD patients and TC, and that gender-specific lateralization may underlie PTSD pathology.
Darrow, Michele C.; Zhang, Yujin; Cinquin, Bertrand P.; ...
2016-08-09
Sickle cell disease is a destructive genetic disorder characterized by the formation of fibrils of deoxygenated hemoglobin, leading to the red blood cell (RBC) morphology changes that underlie the clinical manifestations of this disease. Here, using cryogenic soft X-ray tomography (SXT), we characterized the morphology of sickled RBCs in terms of volume and the number of protrusions per cell. We were able to identify statistically a relationship between the number of protrusions and the volume of the cell, which is known to correlate to the severity of sickling. This structural polymorphism allows for the classification of the stages of the sickling process. Recent studies have shown that elevated sphingosine kinase 1 (Sphk1)-mediated sphingosine 1-phosphate production contributes to sickling. Here, we further demonstrate that compound 5C, an inhibitor of Sphk1, has anti-sickling properties. Additionally, the variation in cellular morphology upon treatment suggests that this drug acts to delay the sickling process. SXT is an effective tool that can be used to identify the morphology of the sickling process and assess the effectiveness of potential therapeutics.
Stochastic calculus of protein filament formation under spatial confinement
NASA Astrophysics Data System (ADS)
Michaels, Thomas C. T.; Dear, Alexander J.; Knowles, Tuomas P. J.
2018-05-01
The growth of filamentous aggregates from precursor proteins is a process of central importance to both normal and aberrant biology, for instance as the driver of devastating human disorders such as Alzheimer's and Parkinson's diseases. The conventional theoretical framework for describing this class of phenomena in bulk is based upon the mean-field limit of the law of mass action, which implicitly assumes deterministic dynamics. However, protein filament formation processes under spatial confinement, such as in microdroplets or in the cellular environment, show intrinsic variability due to the molecular noise associated with small-volume effects. To account for this effect, in this paper we introduce a stochastic differential equation approach for investigating protein filament formation processes under spatial confinement. Using this framework, we study the statistical properties of stochastic aggregation curves, as well as the distribution of reaction lag-times. Moreover, we establish the gradual breakdown of the correlation between lag-time and normalized growth rate under spatial confinement. Our results establish the key role of spatial confinement in determining the onset of stochasticity in protein filament formation and offer a formalism for studying protein aggregation kinetics in small volumes in terms of the kinetic parameters describing the aggregation dynamics in bulk.
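As a toy illustration of the small-volume stochasticity discussed above, the sketch below runs a Gillespie-style simulation of a minimal nucleation-elongation scheme and reports the scatter of lag times across repeated "droplets". The reaction scheme, rate constants and the 10% lag-time threshold are illustrative assumptions, not the paper's model.

```python
# Toy Gillespie-style simulation of a minimal nucleation-elongation scheme in a
# small volume, meant only to illustrate the lag-time variability discussed
# above; the reaction scheme, rate constants and 10% threshold are assumptions.
import numpy as np

def simulate(n_monomers=2000, k_nuc=1e-7, k_elong=5e-4, t_max=200.0, seed=0):
    rng = np.random.default_rng(seed)
    m, fibrils, t, trace = n_monomers, 0, 0.0, []
    while t < t_max and m > 1:
        a_nuc = k_nuc * m * (m - 1)   # primary nucleation: 2 monomers -> 1 fibril
        a_el = k_elong * m * fibrils  # elongation: fibril end + monomer
        a_tot = a_nuc + a_el
        if a_tot == 0.0:
            break
        t += rng.exponential(1.0 / a_tot)
        if rng.random() < a_nuc / a_tot:
            m -= 2
            fibrils += 1
        else:
            m -= 1
        trace.append((t, 1.0 - m / n_monomers))  # aggregated mass fraction
    return trace

# Lag-time scatter across simulated "droplets": time to reach 10% aggregated mass
lags = [next((t for t, f in simulate(seed=s) if f >= 0.1), float("nan"))
        for s in range(20)]
print(np.nanmean(lags), np.nanstd(lags))
```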
Eyigör, Hülya; Çekiç, Bülent; Turgut Çoban, Deniz; Selçuk, Ömer Tarık; Renda, Levent; Şimşek, Emine Handan; Yılmaz, Mustafa Deniz
2016-07-01
Silent sinus syndrome (SSS) is a clinical syndrome that occurs as a result of chronic maxillary sinus atelectasis (CMA) and is seen with progressive enophthalmos and hypoglobus. The aim of this study was to investigate the correlation between radiological findings and clinical findings in patients with radiologically asymmetrical reduced maxillary sinus volume. A comparison was made of patients with CMA through evaluation of paranasal sinus computed tomography, magnetic resonance imaging examination of maxillary sinus volume of the CMA side and the contralateral side, thickness of the retroantral fat tissue, infraorbital bone curve, uncinate process lateralisation measurement, middle concha diameter, and calculation of the change in location of the inferior rectus muscle. The study included 16 patients. Although a statistically significant difference was determined between the healthy and the pathological sides in respect to maxillary sinus volume, thickness of the retroantral fat tissue, infraorbital bone curve, uncinate process lateralisation measurement, and middle concha diameter (p = 0.00, p = 0.002, p = 0.020, p = 0.020, p = 0.007), no significant difference was determined in respect to the change in location of the inferior rectus muscle (p = 0.154). A positive correlation was determined between the increase in sulcus depth and maxillary sinus volume and inferior orbital bone curve (p < 0.05). In CMA patients suspected of having SSS, radiological maxillary sinus volume analysis, determination of retroantral fat thickness, measurement of the infraorbital bone curve, and measurement of the uncinate process lateralisation can be used as objective tests. However, it should be kept in mind that radiological findings may not always be compatible with the ophthalmological examination findings. Copyright © 2016 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Dobbin, Kevin K; Cesano, Alessandra; Alvarez, John; Hawtin, Rachael; Janetzki, Sylvia; Kirsch, Ilan; Masucci, Giuseppe V; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Zhang, Jenny; Butterfield, Lisa H; Thurin, Magdalena
2016-01-01
There is growing recognition that immunotherapy is likely to significantly improve health outcomes for cancer patients in the coming years. Currently, while a subset of patients experience substantial clinical benefit in response to different immunotherapeutic approaches, the majority of patients do not but are still exposed to the significant drug toxicities. Therefore, a growing need for the development and clinical use of predictive biomarkers exists in the field of cancer immunotherapy. Predictive cancer biomarkers can be used to identify the patients who are or who are not likely to derive benefit from specific therapeutic approaches. In order to be applicable in a clinical setting, predictive biomarkers must be carefully shepherded through a step-wise, highly regulated developmental process. Volume I of this two-volume document focused on the pre-analytical and analytical phases of the biomarker development process, by providing background, examples and "good practice" recommendations. In the current Volume II, the focus is on the clinical validation, validation of clinical utility and regulatory considerations for biomarker development. Together, this two volume series is meant to provide guidance on the entire biomarker development process, with a particular focus on the unique aspects of developing immune-based biomarkers. Specifically, knowledge about the challenges to clinical validation of predictive biomarkers, which has been gained from numerous successes and failures in other contexts, will be reviewed together with statistical methodological issues related to bias and overfitting. The different trial designs used for the clinical validation of biomarkers will also be discussed, as the selection of clinical metrics and endpoints becomes critical to establish the clinical utility of the biomarker during the clinical validation phase of the biomarker development. Finally, the regulatory aspects of submission of biomarker assays to the U.S. Food and Drug Administration as well as regulatory considerations in the European Union will be covered.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
Enlarged right superior temporal gyrus in children and adolescents with autism.
Jou, Roger J; Minshew, Nancy J; Keshavan, Matcheri S; Vitale, Matthew P; Hardan, Antonio Y
2010-11-11
The superior temporal gyrus has been implicated in language processing and social perception. Therefore, anatomical abnormalities of this structure may underlie some of the deficits observed in autism, a severe neurodevelopmental disorder characterized by impairments in social interaction and communication. In this study, volumes of the left and right superior temporal gyri were measured using magnetic resonance imaging obtained from 18 boys with high-functioning autism (mean age = 13.5 ± 3.4 years; full-scale IQ = 103.6 ± 13.4) and 19 healthy controls (mean age = 13.7 ± 3.0 years; full-scale IQ = 103.9 ± 10.5), group-matched on age, gender, and handedness. When compared to the control group, right superior temporal gyral volumes were significantly increased in the autism group after controlling for age and total brain volume. There was no significant difference in the volume of the left superior temporal gyrus. Post-hoc analysis revealed a significant increase of the right posterior superior temporal gyral volume in the autism group, before and after controlling for age and total brain volume. Examination of the symmetry index for the superior temporal gyral volumes did not yield statistically significant between-group differences. Findings from this preliminary investigation suggest the existence of volumetric alterations in the right superior temporal gyrus in children and adolescents with autism, providing support for a neuroanatomical basis of the social perceptual deficits characterizing this severe neurodevelopmental disorder. Copyright © 2010 Elsevier B.V. All rights reserved.
The validity of ultrasound estimation of muscle volumes.
Infantolino, Benjamin W; Gales, Daniel J; Winter, Samantha L; Challis, John H
2007-08-01
The purpose of this study was to validate ultrasound muscle volume estimation in vivo. To examine validity, vastus lateralis ultrasound images were collected from cadavers before muscle dissection; after dissection, the volumes were determined by hydrostatic weighing. Seven thighs from cadaver specimens were scanned using a 7.5-MHz ultrasound probe (SSD-1000, Aloka, Japan). The perimeter of the vastus lateralis was identified in the ultrasound images and manually digitized. Volumes were then estimated using the Cavalieri principle, by measuring the image areas of sets of parallel two-dimensional slices through the muscles. The muscles were then dissected from the cadavers, and muscle volume was determined via hydrostatic weighing. There was no statistically significant difference between the ultrasound estimation of muscle volume and that estimated using hydrostatic weighing (p > 0.05). The mean percentage error between the two volume estimates was 0.4% ± 6.9%. Three operators all performed four digitizations of all images from one randomly selected muscle; there was no statistical difference between operators or trials and the intraclass correlation was high (>0.8). The results of this study indicate that ultrasound is an accurate method for estimating muscle volumes in vivo.
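The Cavalieri estimate referred to above reduces to summing digitized cross-sectional areas and multiplying by the slice spacing. The sketch below is illustrative only; the slice areas and spacing are invented numbers, not data from this study.

```python
# A minimal sketch of the Cavalieri estimate referred to above: muscle volume as
# the sum of digitized cross-sectional areas times the slice spacing. The slice
# areas and spacing are invented numbers, not data from this study.
import numpy as np

def cavalieri_volume(areas_cm2, spacing_cm):
    """Volume (cm^3) from parallel cross-sections a fixed distance apart."""
    return float(np.sum(areas_cm2) * spacing_cm)

areas = np.array([8.1, 12.4, 15.0, 14.2, 10.3, 5.9])  # digitized slice areas (cm^2)
print(f"{cavalieri_volume(areas, spacing_cm=1.0):.1f} cm^3")
```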
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
This document presents statistical data from the countries of France, Finland, the Netherlands, Japan, Italy, and Norway regarding the flows of graduates from higher education and their entry into the workforce. Among the statistical data presented are the trends and current situation in each country for such areas as college enrollments and…
Methods for determining and processing 3D errors and uncertainties for AFM data analysis
NASA Astrophysics Data System (ADS)
Klapetek, P.; Nečas, D.; Campbellová, A.; Yacoot, A.; Koenders, L.
2011-02-01
This paper describes the processing of three-dimensional (3D) scanning probe microscopy (SPM) data. It is shown that 3D volumetric calibration error and uncertainty data can be acquired for both metrological atomic force microscope systems and commercial SPMs. These data can be used within nearly all the standard SPM data processing algorithms to determine local values of uncertainty of the scanning system. If the error function of the scanning system is determined for the whole measurement volume of an SPM, it can be converted to yield local dimensional uncertainty values that can in turn be used for evaluation of uncertainties related to the acquired data and for further data processing applications (e.g. area, ACF, roughness) within direct or statistical measurements. These have been implemented in the software package Gwyddion.
Industrial Schools for Delinquents, 1917-18. Bulletin, 1919, No. 52
ERIC Educational Resources Information Center
Bureau of Education, Department of the Interior, 1920
1920-01-01
After the statistical report found in Volume II, 1917, Report of the Commissioner of Education, containing statistics for the year 1915-16, had been prepared, the Bureau of Education adopted the plan of collecting statistics biennially instead of annually, as had been done in preceding years. This bulletin contains statistics of Industrial Schools…
Fokker-Planck description for the queue dynamics of large tick stocks.
Garèche, A; Disdier, G; Kockelkoren, J; Bouchaud, J-P
2013-09-01
Motivated by empirical data, we develop a statistical description of the queue dynamics for large tick assets based on a two-dimensional Fokker-Planck (diffusion) equation. Our description explicitly includes state dependence, i.e., the fact that the drift and diffusion depend on the volume present on both sides of the spread. "Jump" events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on the best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we found that the drift has a complex two-dimensional structure, which is a sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.
Fokker-Planck description for the queue dynamics of large tick stocks
NASA Astrophysics Data System (ADS)
Garèche, A.; Disdier, G.; Kockelkoren, J.; Bouchaud, J.-P.
2013-09-01
Motivated by empirical data, we develop a statistical description of the queue dynamics for large tick assets based on a two-dimensional Fokker-Planck (diffusion) equation. Our description explicitly includes state dependence, i.e., the fact that the drift and diffusion depend on the volume present on both sides of the spread. “Jump” events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on the best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we found that the drift has a complex two-dimensional structure, which is a sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.
The space technology demand on materials and processes
NASA Astrophysics Data System (ADS)
Dauphin, J.
1982-01-01
Space technologies which entail materials or process problems, such as clean satellites, thermal control materials with electrical conductivity, space stations and reusable hardware are reviewed. The statistical approaches to selection used are jeopardized by small production volumes, while the analogy methods are limited by experience. Commercially available materials are extensively used in order to cut development costs, e.g., solar panel adhesives are obtained by cleaning commercial silicones by molecular distillation. The long-life and reusable spacecraft requirements, e.g., for very thin laminates, which cannot be met by commercial products are discussed. Space agencies either meet needs themselves (NASA makes white conductive paint) or they develop solutions in partnership with manufacturers.
John S. Jr. Spencer; Burton L. Essex
1976-01-01
The third inventory of Missouri's timber resource shows a small gain in growing-stock volume and a somewhat larger gain in sawtimber volume since 1959. Area of commercial forest declined sharply between surveys. Presented are text and statistics on forest area and timber volume, growth, mortality, ownership, stocking, future timber supply, and forest management...
DOT National Transportation Integrated Search
2001-01-01
The Bureau of Transportation Statistics (BTS) Airport Activity Statistics of Certificated Air Carriers: Summary Tables presents summary data for all scheduled and nonscheduled service by large certificated U.S. air carriers including the volume of pa...
Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry.
Manios, G E; Mazonakis, M; Voulgaris, C; Karantanas, A; Damilakis, J
2016-03-01
To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For the optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimum stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with the sampling intensity 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95% limits of agreement were also acceptable (VAF: -16.5%, 16.1%; SAF: -10.8%, 10.7%) and the repeatability of stereology was good (VAF: CV = 4.5%, SAF: CV = 3.2%). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. Abdominal obesity is associated with increased risk of disease and mortality. Stereology may quantify visceral and subcutaneous abdominal fat accurately and consistently. The application of stereology to estimating abdominal volume fat reduces processing time. Stereology is an efficient alternative method for estimating abdominal fat volume.
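As a rough illustration of the point-counting estimator behind this kind of stereological measurement, the sketch below multiplies the total point count by the area associated with each grid point, the slice thickness, and the 1-in-8 sampling interval. All numbers (grid spacing, slice thickness, counts) are assumptions for illustration, not the paper's values:

    # Hedged sketch of a stereological (point-counting) volume estimate on CT.
    points_per_slice = [42, 55, 61, 58, 40]   # points falling on fat in each sampled slice
    grid_spacing_mm = 10.0                    # test-point grid spacing (one point per square)
    area_per_point_mm2 = grid_spacing_mm ** 2
    slice_thickness_mm = 5.0
    sampling_interval = 8                     # every 8th slice analysed (intensity 1/8)

    volume_mm3 = (sum(points_per_slice) * area_per_point_mm2
                  * slice_thickness_mm * sampling_interval)
    print(f"Estimated fat volume: {volume_mm3 / 1000:.1f} cm^3")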
Statistical mechanics of the international trade network.
Fronczak, Agata; Fronczak, Piotr
2012-05-01
Analyzing real data on international trade covering the time interval 1950-2000, we show that in each year over the analyzed period the network is a typical representative of the ensemble of maximally random weighted networks, whose directed connections (bilateral trade volumes) are only characterized by the product of the trading countries' GDPs. It means that time evolution of this network may be considered as a continuous sequence of equilibrium states, i.e., a quasistatic process. This, in turn, allows one to apply the linear response theory to make (and also verify) simple predictions about the network. In particular, we show that bilateral trade fulfills a fluctuation-response theorem, which states that the average relative change in imports (exports) between two countries is a sum of the relative changes in their GDPs. Yearly changes in trade volumes prove that the theorem is valid.
Statistical mechanics of the international trade network
NASA Astrophysics Data System (ADS)
Fronczak, Agata; Fronczak, Piotr
2012-05-01
Analyzing real data on international trade covering the time interval 1950-2000, we show that in each year over the analyzed period the network is a typical representative of the ensemble of maximally random weighted networks, whose directed connections (bilateral trade volumes) are only characterized by the product of the trading countries' GDPs. It means that time evolution of this network may be considered as a continuous sequence of equilibrium states, i.e., a quasistatic process. This, in turn, allows one to apply the linear response theory to make (and also verify) simple predictions about the network. In particular, we show that bilateral trade fulfills a fluctuation-response theorem, which states that the average relative change in imports (exports) between two countries is a sum of the relative changes in their GDPs. Yearly changes in trade volumes prove that the theorem is valid.
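The fluctuation-response relation quoted in both records above lends itself to a one-line check; the GDP growth figures below are invented purely to illustrate the arithmetic:

    # Illustrative arithmetic only; the GDP growth figures are made up.
    gdp_growth_a = 0.03   # country A GDP up 3%
    gdp_growth_b = 0.02   # country B GDP up 2%

    # Fluctuation-response relation quoted in the abstract:
    # average relative change in bilateral trade = sum of relative GDP changes.
    expected_trade_change = gdp_growth_a + gdp_growth_b
    print(f"Expected relative change in bilateral trade: {expected_trade_change:.0%}")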
NASA Astrophysics Data System (ADS)
Subara, Deni; Jaswir, Irwandi; Alkhatib, Maan Fahmi Rashid; Noorbatcha, Ibrahim Ali
2018-01-01
The aim of this experiment was to screen and understand the effects of process variables on the fabrication of fish gelatin nanoparticles using a quality-by-design approach. The most influential process variables were screened using a Plackett-Burman design. Mean particle size, size distribution, and zeta potential were approximately 240±9.76 nm, 0.3, and -9 mV, respectively. Statistical results showed that the concentration of acetone, the pH of the solution during the precipitation step, and the volume of cross-linker had the most significant effects on the particle size of fish gelatin nanoparticles. Time and chemical consumption were found to be lower than in previous research. This study revealed the potential of quality-by-design for understanding the effects of process variables on fish gelatin nanoparticle production.
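For readers unfamiliar with Plackett-Burman screening, the sketch below builds the standard 8-run design from its cyclic generator and estimates main effects as the difference of mean responses at the high and low factor levels. The response values are hypothetical, not the fish-gelatin data:

    import numpy as np

    # Hedged sketch: an 8-run Plackett-Burman screening design for up to 7 factors.
    generator = np.array([+1, +1, +1, -1, +1, -1, -1])
    rows = [np.roll(generator, i) for i in range(7)]       # cyclic shifts of the generator
    design = np.vstack(rows + [-np.ones(7, dtype=int)])    # final all-minus run -> 8 x 7

    # Hypothetical responses, e.g. particle size in nm, one per run.
    response = np.array([250, 230, 260, 300, 240, 310, 280, 265])

    # Main effect of each factor = mean response at +1 minus mean response at -1.
    effects = [(response[design[:, j] == 1].mean()
                - response[design[:, j] == -1].mean()) for j in range(design.shape[1])]
    for j, eff in enumerate(effects):
        print(f"factor {j + 1}: effect = {eff:+.1f}")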
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
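A minimal sketch of the kind of computation described, reduced to a single birth-death species rather than the authors' full signal transduction network: an exact Gillespie simulation run in several volumes, followed by a crude spectral summary of the fluctuations. Rate constants and volumes are assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    def gillespie_birth_death(V, k_birth=10.0, k_death=1.0, t_end=100.0):
        # Exact SSA for production at rate k_birth*V and decay at rate k_death*n.
        t, n = 0.0, int(k_birth / k_death * V)   # start near the steady state
        times, counts = [t], [n]
        while t < t_end:
            a_birth, a_death = k_birth * V, k_death * n
            a_total = a_birth + a_death
            t += rng.exponential(1.0 / a_total)              # time to next reaction
            n += 1 if rng.random() < a_birth / a_total else -1
            times.append(t); counts.append(n)
        return np.array(times), np.array(counts)

    for V in (1.0, 10.0, 100.0):                             # increasing system volume
        t, n = gillespie_birth_death(V)
        grid = np.arange(0.0, t[-1], 0.1)                    # resample to a uniform grid
        conc = n[np.searchsorted(t, grid, side="right") - 1] / V
        spectrum = np.abs(np.fft.rfft(conc - conc.mean())) ** 2
        print(f"V={V:6.1f}  concentration variance = {conc.var():.4f}  "
              f"low-frequency power = {spectrum[1:6].mean():.1f}")

As in the abstract, the fluctuation power drops as the volume grows.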
Jardim, Anaclara Prada; Corso, Jeana Torres; Garcia, Maria Teresa Fernandes Castilho; Gaça, Larissa Botelho; Comper, Sandra Mara; Lancellotti, Carmen Lúcia Penteado; Centeno, Ricardo Silva; Carrete, Henrique; Cavalheiro, Esper Abrão; Scorza, Carla Alessandra; Yacubian, Elza Márcia Targas
2016-12-01
To correlate hippocampal volumes obtained from brain structural imaging with histopathological patterns of hippocampal sclerosis (HS), in order to predict surgical outcome. Patients with mesial temporal lobe epilepsy (MTLE) with HS were selected. Clinical data were assessed pre-operatively, and surgical outcome was assessed in the first year after surgery. One block of the mid-hippocampal body was selected for HS classification according to ILAE criteria. NeuN-immunoreactive cell bodies were counted within hippocampal subfields, in four randomly selected visual fields, and cell densities were transformed into z-score values. FreeSurfer processing of 1.5T brain structural images was used for subcortical and cortical volumetric estimation of the ipsilateral hippocampus. Univariate analysis of variance and Pearson's correlation test were applied for statistical analyses. Sixty-two cases (31 female, 32 right HS) were included. ILAE type 1 HS was identified in 48 patients, type 2 in eight, type 3 in two, and four had no HS. Better results regarding seizure control, i.e. ILAE 1, were achieved by patients with type 1 HS (58.3%). Patients with types 1 and 2 had smaller hippocampal volumes compared to those with no HS (p<0.001 and p=0.004, respectively). A positive correlation was found between hippocampal volumes and CA1, CA3, CA4, and total estimated neuronal densities. CA2 was the only sector whose neuronal density did not correlate with hippocampal volume (p=0.390). This is the first study correlating hippocampal volume on MRI submitted to FreeSurfer processing with ILAE patterns of HS and neuronal loss within each hippocampal subfield, a fundamental finding for anticipating the surgical prognosis of patients with drug-resistant MTLE and HS. Copyright © 2016 Elsevier B.V. All rights reserved.
Graczyk, Michelle B; Duarte Queirós, Sílvio M
2016-01-01
We study the intraday behaviour of the statistical moments of the trading volume of the blue chip equities that composed the Dow Jones Industrial Average index between 2003 and 2014. By splitting that time interval into semesters, we provide a quantitative account of the nonstationary nature of the intraday statistical properties as well. Explicitly, we show that the well-known ∪-shape exhibited by the average trading volume, as well as by the volatility of the price fluctuations, experienced a significant change from 2008 (the year of the "subprime" financial crisis) onwards. That has resulted in a faster relaxation after the market opening and relates to a consistent decrease in the convexity of the average trading volume intraday profile. Simultaneously, the last part of the session has become steeper as well, a modification that is likely to have been triggered by the new short-selling rules that were introduced in 2007 by the Securities and Exchange Commission. The combination of both results reveals that the ∪ has been turning into a ⊔. Additionally, the analysis of higher-order cumulants, namely the skewness and the kurtosis, shows that the morning and the afternoon parts of the trading session are each clearly associated with different statistical features and hence dynamical rules. Concretely, we claim that the large initial trading volume is due to wayward stocks, whereas the large volume during the last part of the session hinges on a cohesive increase of the trading volume. That dissimilarity between the two parts of the trading session is accentuated in periods of greater turmoil in the market.
Kecik, Dariusz; Makowiec-Tabernacka, Marta; Golebiewska, Joanna; Moneta-Wielgos, Joanna; Kasprzak, Jan
2009-01-01
To evaluate changes in the macular thickness and volume using optical coherence tomography in patients after phacoemulsification and intracapsular implantation of a foldable intraocular lens. The study included 82 patients (37 males and 45 females) after phacoemulsification and intracapsular implantation of the same type of foldable intraocular lens, without any other eye disease. Phacoemulsification was performed with an INFINITI machine. In all patients, macular thickness and volume were measured with an optical coherence tomograph (Stratus OCT) using the Fast Macular Thickness Map. The OCT evaluation was performed on days 1, 7, 30 and 90 postoperatively. In 58 patients (71%), it was additionally performed at 12 months after surgery, and in 52 patients (63%) the macular parameters in the healthy and operated eyes were compared. A statistically significant increase in the minimal retinal thickness was observed on days 30 (p<0.0005) and 90 (p<0.005) postoperatively compared to post-operative day 1. A statistically significant increase in the foveal volume was seen on days 30 (p<0.00005) and 90 (p<0.0005). A statistically significant increase in the volume of the entire macula was found on days 7, 30 and 90 (p<0.00005). Uncomplicated cataract phacoemulsification is followed by increases in the central retinal thickness, foveal volume and volume of the entire macula on days 30 and 90 and at 12 months postoperatively. Further observation of patients is required to confirm whether the macular parameters will return to their values on day 1 postoperatively and, if so, when this will occur.
Caivano, R; Fiorentino, A; Pedicini, P; Califano, G; Fusco, V
2014-05-01
To evaluate radiotherapy treatment planning accuracy by varying computed tomography (CT) slice thickness and tumor size. CT datasets from patients with primary brain disease and metastatic brain disease were selected. Tumor volumes ranging from about 2.5 to 100 cc and CT scans at different slice thicknesses (1, 2, 4, 6 and 10 mm) were used to perform treatment planning (1-, 2-, 4-, 6- and 10-CT, respectively). For each slice thickness, a conformity index (CI) referring to the 100, 98, 95 and 90% isodoses and to tumor size was computed. All the CIs and volumes obtained were compared to evaluate the impact of CT slice thickness on treatment plans. The smallest volumes are significantly smaller when defined on 1-CT than on 4- and 6-CT, while CT slice thickness does not affect target definition for the largest volumes. The mean CI for all the considered isodoses and CT slice thicknesses shows no statistical differences when 1-CT is compared to 2-CT. Comparing the mean CI of 1- with 4-CT and 1- with 6-CT, statistical differences appear only for the smallest volumes with respect to the 100, 98 and 95% isodoses, while differences in the CI for the 90% isodose are not statistically significant for any of the considered PTVs. The accuracy of radiotherapy tumor volume definition depends on CT slice thickness. To achieve better tumor definition and dose coverage, 1- and 2-CT would be suitable for small targets, while 4- and 6-CT are suitable for the other volumes.
Probing the statistics of primordial fluctuations and their evolution
NASA Technical Reports Server (NTRS)
Gaztanaga, Enrique; Yokoyama, Jun'ichi
1993-01-01
The statistical distribution of fluctuations on various scales is analyzed in terms of the counts in cells of smoothed density fields, using volume-limited samples of galaxy redshift catalogs. It is shown that the distribution on large scales, with volume average of the two-point correlation function of the smoothed field less than about 0.05, is consistent with Gaussian. The statistics are shown to agree remarkably well with the negative binomial distribution, which has hierarchical correlations and a Gaussian behavior at large scales. If these observed properties correspond to the matter distribution, they suggest that our universe started with Gaussian fluctuations and evolved while maintaining a hierarchical form.
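A hedged sketch of how counts in cells can be compared with a negative binomial model by a method-of-moments fit; the counts here are synthetic rather than drawn from a redshift catalog:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    counts = rng.negative_binomial(n=5, p=0.4, size=2000)   # stand-in for counts in cells

    m, v = counts.mean(), counts.var()
    r = m * m / (v - m)            # NB size parameter (requires v > m, i.e. clustering)
    p = m / v                      # NB probability parameter

    nb = stats.nbinom(r, p)
    for k in range(0, 20, 4):
        observed = np.mean(counts == k)
        print(f"N={k:2d}  observed frequency {observed:.3f}   NB model {nb.pmf(k):.3f}")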
Obuchowski, Nancy A; Buckler, Andrew; Kinahan, Paul; Chen-Mayer, Heather; Petrick, Nicholas; Barboriak, Daniel P; Bullen, Jennifer; Barnhart, Huiman; Sullivan, Daniel C
2016-04-01
A major initiative of the Quantitative Imaging Biomarker Alliance is to develop standards-based documents called "Profiles," which describe one or more technical performance claims for a given imaging modality. The term "actor" denotes any entity (device, software, or person) whose performance must meet certain specifications for the claim to be met. The objective of this paper is to present the statistical issues in testing actors' conformance with the specifications. In particular, we present the general rationale and interpretation of the claims, the minimum requirements for testing whether an actor achieves the performance requirements, the study designs used for testing conformity, and the statistical analysis plan. We use three examples to illustrate the process: apparent diffusion coefficient in solid tumors measured by MRI, change in Perc 15 as a biomarker for the progression of emphysema, and percent change in solid tumor volume by computed tomography as a biomarker for lung cancer progression. Copyright © 2016 The Association of University Radiologists. All rights reserved.
Kalenderoglu, Aysun; Çelik, Mustafa; Sevgi-Karadag, Ayse; Egilmez, Oguzhan Bekir
2016-11-01
Previous research has consistently detected inflammation in the etiology of depression and neuroimaging studies have demonstrated gray matter abnormalities implying a neurodegenerative process in depression. The aim of this study was to compare ganglion cell layer (GCL), and inner plexiform layer (IPL) volumes and retinal nerve fiber layer (RNFL) thickness between first episode and recurrent major depressive disorder (MDD) patients and controls using optic coherence tomography (OCT) in order to detect findings supporting a degenerative process. Also choroid thicknesses of the same groups were compared to examine effects of inflammation on MDD. This study included 50 recurrent MDD patients, 50 first episode MDD patients and 50 controls. OCT measurements were performed by a spectral OCT device. GCL and IPL volumes and RNFL and choroid thicknesses were measured automatically by the device. GCL and IPL volumes were significantly smaller in recurrent depression patients than first episode patients and in all MDD patients than controls. Also there were significant negative correlations between their volumes and disease severity parameters such as Ham-D and CGI scores, and disease duration. RNFL thicknesses were also lower in recurrent MDD patients than first episode patients and all MDD patients than controls but statistical significance was achieved only for global RNFL and temporal superior RNFL. Mean choroid thickness was higher in MDD patients than controls and in first episode MDD patients than recurrent MDD patients. Cross-sectional design of our study limits conclusions about progressive degeneration during the course of MDD. Lack of a control neuroimaging method like magnetic resonance imaging makes it hard to draw firm conclusions from our results. OCT finding of decreased GCL and IPL volumes supports previous research suggesting degeneration in MDD. OCT may be an important tool to track neurodegeneration in patients with major depression. Considering RNFL to be the latest layer that will be affected during course of degeneration, GCL and IPL volumes appear to be better parameters to follow. In addition, choroid may be an important structure to detect acute attack period and to follow inflammatory process in MDD like in systemic inflammatory diseases. Copyright © 2016 Elsevier B.V. All rights reserved.
2007-02-28
[Garbled slide extract; only fragments are recoverable.] Topics include Defense HUMINT activities, the DIA-run attaché system, and the ORNL "Chameleon" cognitive radio program, which adapts its internal states in real time to meet user requirements and goals and learns using statistical signal processing and machine learning.
Studies of Big Data metadata segmentation between relational and non-relational databases
NASA Astrophysics Data System (ADS)
Golosova, M. V.; Grigorieva, M. A.; Klimentov, A. A.; Ryabinkin, E. A.; Dimitrov, G.; Potekhin, M.
2015-12-01
In recent years the concepts of Big Data have become well established in IT. Systems managing large data volumes produce metadata that describe the data and workflows. These metadata are used to obtain information about the current system state and for statistical and trend analysis of the processes these systems drive. Over time, the amount of stored metadata can grow dramatically. In this article we present our studies demonstrating how metadata storage scalability and performance can be improved by using a hybrid RDBMS/NoSQL architecture.
LACIE performance predictor final operational capability program description, volume 3
NASA Technical Reports Server (NTRS)
1976-01-01
The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
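A minimal sketch, under assumed error distributions and stratum values, of the kind of repetitive Monte Carlo aggregation LEM performs (it is not the LEM code itself):

    import numpy as np

    rng = np.random.default_rng(0)

    area_est = np.array([1.2e6, 0.8e6, 2.1e6])     # hectares per stratum (illustrative)
    yield_est = np.array([2.4, 1.9, 2.8])          # tonnes per hectare (illustrative)
    area_cv, yield_cv = 0.05, 0.10                 # assumed relative errors

    trials = 10_000
    production = np.empty(trials)
    for i in range(trials):
        area = rng.normal(area_est, area_cv * area_est)    # sample area errors
        yld = rng.normal(yield_est, yield_cv * yield_est)  # sample yield errors
        production[i] = np.sum(area * yld)                 # aggregate over strata

    lo, hi = np.percentile(production, [2.5, 97.5])
    print(f"production estimate: {production.mean():.3e} t  "
          f"(95% interval {lo:.3e} - {hi:.3e})")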
Ahmetoglu, Fuat; Keles, Ali; Simsek, Neslihan; Ocak, M Sinan; Yologlu, Saim
2015-01-01
This study aimed to use micro-computed tomography (μ-CT) to evaluate the canal shaping properties of three nickel-titanium instruments, the Self-Adjusting File (SAF), Reciproc, and the Revo-S rotary file, in maxillary first molars. Thirty maxillary molars were scanned preoperatively with μ-CT at 13.68 μm resolution. The teeth were randomly assigned to three groups (n = 10), and the root canals were shaped with SAF, Reciproc, and Revo-S, respectively. The shaped root canals were rescanned, and changes in canal volumes and surface areas were compared with preoperative values. The data were analyzed using Kruskal-Wallis and Conover's post hoc tests, with p < .05 denoting a statistically significant difference. Preoperative canal volumes and surface areas were statistically similar among the three groups (p > .05). There were statistically significant differences in all measures when preoperative and postoperative canal models were compared (p = 0.0001). The changes produced by instrumentation showed no statistically significant difference in volume among the three experimental groups (p > .05). Surface area changes were similar in the buccal canals for all three techniques, whereas in the palatal (P) canal no statistically significant difference in surface area was detected between SAF and Revo-S. All three shaping systems produced similar volume changes in all canals, but SAF and Revo-S planed the canal walls more effectively than Reciproc in the P canal. © Wiley Periodicals, Inc.
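The between-group comparison reported above can be reproduced in outline with a Kruskal-Wallis test on the post-instrumentation volume changes; the numbers below are made up for illustration:

    import numpy as np
    from scipy import stats

    # Hypothetical volume changes (mm^3) per canal for each instrument group.
    saf      = [1.8, 2.1, 1.6, 2.4, 1.9, 2.2, 2.0, 1.7, 2.3, 1.8]
    reciproc = [2.0, 2.5, 1.9, 2.2, 2.6, 2.1, 2.4, 2.0, 2.3, 2.5]
    revo_s   = [1.9, 2.2, 2.0, 1.8, 2.4, 2.1, 2.3, 1.9, 2.2, 2.0]

    h, p = stats.kruskal(saf, reciproc, revo_s)
    print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
    # p > .05 would indicate no significant difference among the three systems,
    # as reported; a significant result would call for post hoc tests.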
ERIC Educational Resources Information Center
Genovesi, Giovanni, Ed.
This collection, the last of four volumes on the history of compulsory education among the nations of Europe and the western hemisphere, analyzes statistics, methodology, reforms, and new tendencies. Twelve of the document's 18 articles are written in English, 3 are written in French and 3 are in Italian. Summaries accompany most articles; three…
Testing 3D landform quantification methods with synthetic drumlins in a real digital elevation model
NASA Astrophysics Data System (ADS)
Hillier, John K.; Smith, Mike J.
2012-06-01
Metrics such as height and volume quantifying the 3D morphology of landforms are important observations that reflect and constrain Earth surface processes. Errors in such measurements are, however, poorly understood. A novel approach, using statistically valid ‘synthetic' landscapes to quantify the errors is presented. The utility of the approach is illustrated using a case study of 184 drumlins observed in Scotland as quantified from a Digital Elevation Model (DEM) by the ‘cookie cutter' extraction method. To create the synthetic DEMs, observed drumlins were removed from the measured DEM and replaced by elongate 3D Gaussian ones of equivalent dimensions positioned randomly with respect to the ‘noise' (e.g. trees) and regional trends (e.g. hills) that cause the errors. Then, errors in the cookie cutter extraction method were investigated by using it to quantify these ‘synthetic' drumlins, whose location and size is known. Thus, the approach determines which key metrics are recovered accurately. For example, mean height of 6.8 m is recovered poorly at 12.5 ± 0.6 (2σ) m, but mean volume is recovered correctly. Additionally, quantification methods can be compared: A variant on the cookie cutter using an un-tensioned spline induced about twice (× 1.79) as much error. Finally, a previously reportedly statistically significant (p = 0.007) difference in mean volume between sub-populations of different ages, which may reflect formational processes, is demonstrated to be only 30-50% likely to exist in reality. Critically, the synthetic DEMs are demonstrated to realistically model parameter recovery, primarily because they are still almost entirely the original landscape. Results are insensitive to the exact method used to create the synthetic DEMs, and the approach could be readily adapted to assess a variety of landforms (e.g. craters, dunes and volcanoes).
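A sketch of the core step of the synthetic-DEM approach: an elongate 3D Gaussian 'drumlin' of known height and volume is added to a DEM at a random position, so an extraction method can later be tested against a known answer. Grid size, cell size, and drumlin dimensions are assumptions, and the demo DEM is random noise rather than the Scottish data:

    import numpy as np

    rng = np.random.default_rng(3)
    dem = rng.normal(0.0, 0.5, size=(500, 500))      # stand-in for a measured DEM
    cell = 5.0                                        # metres per grid cell

    height, half_len, half_wid = 7.0, 300.0, 120.0    # drumlin height and semi-axes (m)
    y0, x0 = rng.integers(100, 400, size=2)           # random position (cells)

    y, x = np.mgrid[0:dem.shape[0], 0:dem.shape[1]]
    dy, dx = (y - y0) * cell, (x - x0) * cell
    drumlin = height * np.exp(-0.5 * ((dx / half_len) ** 2 + (dy / half_wid) ** 2))
    synthetic_dem = dem + drumlin

    true_volume = drumlin.sum() * cell ** 2           # known volume added to the grid
    print(f"synthetic drumlin: height {height} m, volume {true_volume / 1e6:.2f} x 10^6 m^3")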
[Statistics for statistics?--Thoughts about psychological tools].
Berger, Uwe; Stöbel-Richter, Yve
2007-12-01
Statistical methods occupy a prominent place in psychologists' educational programs. Because these contents are known as difficult to understand and hard to learn, students fear them, and those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education seems to make sense only insofar as it earns respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics deals with numbers, while psychotherapy deals with subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented in psychotherapeutic and psychological research. To this end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research and, not least, to expose one's own professional work to arbitrariness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parthipun, A. A., E-mail: aneeta@hotmail.co.uk; Taylor, J.; Manyonda, I.
The purpose of this study was to determine whether there is a correlation between large uterine fibroid diameter, uterine volume, number of vials of embolic agent used and risk of complications from uterine artery embolisation (UAE). This was a prospective study involving 121 patients undergoing UAE for symptomatic uterine fibroids at a single institution. Patients were grouped according to diameter of the largest fibroid and uterine volume. Results were also stratified according to the number of vials of embolic agent used and rate of complications. No statistical difference in complication rate was demonstrated between the two groups according to diameter of the largest fibroid (large fibroids were classified as ≥10 cm; Fisher's exact test P = 1.00), and no statistical difference in complication rate was demonstrated according to uterine volume (large uterine volume was defined as ≥750 cm³; Fisher's exact test P = 0.70). 84 of the 121 patients had documentation of the number of vials used during the procedure. Patients were divided into two groups, with ≥4 vials defined as a large amount of embolic agent. There was no statistical difference between these two groups and no associated increased risk of developing complications. This study showed no increased incidence of complications in women with large-diameter fibroids or uterine volumes as defined. In addition, there was no evidence of increased complications according to the quantity of embolic material used. Therefore, UAE should be offered to women with large fibroids and uterine volumes.
Controlled Gelation of Particle Suspensions Using Controlled Solvent Removal in Picoliter Droplets
NASA Astrophysics Data System (ADS)
Vuong, Sharon; Walker, Lynn; Anna, Shelley
2013-11-01
Droplets in microfluidic devices have proven useful as uniform picoliter reactors for nanoparticle synthesis and as components in tunable emulsions. However, there can be significant transport between the component phases depending on solubility and other factors. In the present talk, we show that water droplets trapped within a microfluidic device for tens of hours slowly dehydrate, concentrating the contents encapsulated within. We use this slow dehydration along with control of the initial droplet composition to monitor gelation of aqueous suspensions of spherical silica particles (Ludox) and disk-shaped clay particles (Laponite). Droplets are generated in a microfluidic device containing small wells that trap the droplets. We monitor the concentration process through size and shape changes of these droplets as a function of time in tens of droplets and use the large number of individual reactors to generate statistics regarding the gelation process. We also examine changes in suspension viscosity through fluorescent particle tracking as a function of dehydration rate, initial suspension concentration and initial droplet volume, and added salt, and compare the results with the Krieger-Dougherty model in which viscosity increases dramatically with particle volume fraction.
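The Krieger-Dougherty comparison mentioned above takes the form eta/eta_solvent = (1 - phi/phi_max)^(-[eta]*phi_max); a short sketch with textbook hard-sphere parameters (intrinsic viscosity 2.5, phi_max = 0.64), which are assumptions rather than fitted values from the talk:

    # Hedged sketch of the Krieger-Dougherty relation: relative viscosity as a
    # function of particle volume fraction phi.
    def krieger_dougherty(phi, phi_max=0.64, intrinsic_viscosity=2.5):
        return (1.0 - phi / phi_max) ** (-intrinsic_viscosity * phi_max)

    for phi in (0.05, 0.20, 0.40, 0.55, 0.62):
        print(f"phi = {phi:.2f}  ->  eta/eta_solvent = {krieger_dougherty(phi):8.1f}")
    # As slow solvent removal concentrates the droplet, phi approaches phi_max and
    # the viscosity diverges, the signature of gelation/jamming tracked here.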
CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions
Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
CADDIS Volume 4. Data Analysis: Biological and Environmental Data Requirements
Overview of PECBO Module, using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, methods for inferring environmental conditions, statistical scripts in module.
Dynamics of non-Markovian exclusion processes
NASA Astrophysics Data System (ADS)
Khoromskaia, Diana; Harris, Rosemary J.; Grosskinsky, Stefan
2014-12-01
Driven diffusive systems are often used as simple discrete models of collective transport phenomena in physics, biology or social sciences. Restricting attention to one-dimensional geometries, the asymmetric simple exclusion process (ASEP) plays a paradigmatic role to describe noise-activated driven motion of entities subject to an excluded volume interaction and many variants have been studied in recent years. While in the standard ASEP the noise is Poissonian and the process is therefore Markovian, in many applications the statistics of the activating noise has a non-standard distribution with possible memory effects resulting from internal degrees of freedom or external sources. This leads to temporal correlations and can significantly affect the shape of the current-density relation as has been studied recently for a number of scenarios. In this paper we report a general framework to derive the fundamental diagram of ASEPs driven by non-Poissonian noise by using effectively only two simple quantities, viz., the mean residual lifetime of the jump distribution and a suitably defined temporal correlation length. We corroborate our results by detailed numerical studies for various noise statistics under periodic boundary conditions and discuss how our approach can be applied to more general driven diffusive systems.
Geographic Origins of Students, Fall 1991. Volume II.
ERIC Educational Resources Information Center
State Univ. of New York, Albany. Central Staff Office of Institutional Research.
This volume (second of three) contains statistical tables displaying origin or origin grouping of credit course students attending the State University of New York System. The volume contains six parts: Part 1 contains separate tables for each New York State county; Part 2 displays the permanent residence of students from outside of New York State…
de la Fuente-Valero, Jesús; Zapardiel-Gutiérrez, Ignacio; Orensanz-Fernández, Inmaculada; Alvarez-Alvarez, Pilar; Engels-Calvo, Virginia; Bajo-Arenas, José Manuel
2010-01-01
To measure vascularization and ovarian volume with three-dimensional sonography in patients diagnosed with polycystic ovary syndrome undergoing ovulation stimulation treatment, and to analyse the differences between patients treated with clomiphene citrate versus clomiphene citrate plus metformin. Thirty patients were studied. Twenty ovulation cycles were obtained with clomiphene citrate and 17 with clomiphene citrate plus metformin (added in cases of obesity or hyperglycemia/hyperinsulinemia). Ovarian volumes and vascular indexes were studied with 3D sonography and the results were analysed by treatment. There were no statistical differences in ovarian volume by treatment across the cycles, although larger volumes were found in ovulatory cycles compared to non-ovulatory ones (20.36 versus 13.89 ml, p = 0.026). Nor were statistical differences found concerning vascular indexes, either by treatment or by whether ovulation was achieved in the cycle. Ovarian volume and vascular indexes measured with three-dimensional sonography in patients diagnosed with polycystic ovary syndrome do not show different values in patients treated with clomiphene citrate alone versus clomiphene citrate plus metformin.
Study of Research and Development Processes through Fuzzy Super FRM Model and Optimization Solutions
Sârbu, Flavius Aurelian; Moga, Monika; Calefariu, Gavrilă; Boșcoianu, Mircea
2015-01-01
The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important for making the right decisions to increase competitiveness and development based on a knowledge economy. Our motivation is twofold: with the Super Fuzzy FRM model we want to determine the state of R&D processes at the regional level using a means other than the statistical survey, while with the two optimization methods we aim to provide optimization solutions for the R&D actions of enterprises. To fulfill this aim in this application-oriented paper, we use a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model, which represents the main novelty of the paper: this theory provides a formalism based on matrix calculus that allows the processing of large volumes of information and delivers results that are difficult or impossible to obtain through statistical processing. A further novelty of the paper is the optimization solutions presented here, given for the situation in which the sales price is variable and the quantity sold is constant in time, and for the reverse situation. PMID:25821846
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Dosage variability of topical ocular hypotensive products: a densitometric assessment.
Gaynes, Bruce I; Singa, Ramesh M; Cao, Ying
2009-02-01
To ascertain the consequences of variability in drop volume obtained from multiuse topical ocular hypotensive products for uniformity of product dosage. Densitometric assessment of drop volume dispensed from 2 alternative bottle positions. All but one product demonstrated a statistically significant difference in drop volume when administered at either a 45-degree or 90-degree bottle angle (Student t test, P<0.001). Product-specific drop volume ranged from a nadir of 22.36 microL to a high of 53.54 microL depending on the bottle angle of administration. Deviation in drop dose was directly proportional to variability in drop volume. Variability in per-drop dosage was conspicuous among products, with a coefficient of variation from 1.49% to 15.91%. In accordance with drop volume, all products demonstrated a statistically significant difference in drop dose at 45-degree versus 90-degree administration angles. Drop volume was found to be unrelated to drop uniformity (Spearman r=0.01987, P=0.9463). Variability and lack of uniformity in drop dosage are clearly evident among select ocular hypotensive products and are related to the angle of drop administration. Erratic dosing of topical ocular hypotensive therapy may contribute in part to therapeutic failure and/or toxicity.
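The two summary statistics used in this abstract, the per-product coefficient of variation and the Spearman correlation between drop volume and uniformity, can be computed as in the sketch below; all numbers are illustrative, not the study's measurements:

    import numpy as np
    from scipy import stats

    drops_45deg = np.array([31.2, 29.8, 33.5, 30.1, 32.4, 28.9, 31.7])  # microlitres, one product

    cv = drops_45deg.std(ddof=1) / drops_45deg.mean() * 100
    print(f"coefficient of variation: {cv:.2f}%")

    mean_volume = [22.4, 28.1, 35.6, 41.0, 53.5]       # per-product mean drop volume
    uniformity_cv = [3.1, 15.9, 7.5, 1.5, 9.8]         # per-product CV (lack of uniformity)
    rho, p = stats.spearmanr(mean_volume, uniformity_cv)
    print(f"Spearman r = {rho:.3f}, p = {p:.3f}")       # near zero: volume unrelated to uniformity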
Mulder, Emma R; de Jong, Remko A; Knol, Dirk L; van Schijndel, Ronald A; Cover, Keith S; Visser, Pieter J; Barkhof, Frederik; Vrenken, Hugo
2014-05-15
To measure hippocampal volume change in Alzheimer's disease (AD) or mild cognitive impairment (MCI), expert manual delineation is often used because of its supposed accuracy. It has been suggested that expert outlining yields poorer reproducibility as compared to automated methods, but this has not been investigated. To determine the reproducibilities of expert manual outlining and two common automated methods for measuring hippocampal atrophy rates in healthy aging, MCI and AD. From the Alzheimer's Disease Neuroimaging Initiative (ADNI), 80 subjects were selected: 20 patients with AD, 40 patients with mild cognitive impairment (MCI) and 20 healthy controls (HCs). Left and right hippocampal volume change between baseline and month-12 visit was assessed by using expert manual delineation, and by the automated software packages FreeSurfer (longitudinal processing stream) and FIRST. To assess reproducibility of the measured hippocampal volume change, both back-to-back (BTB) MPRAGE scans available for each visit were analyzed. Hippocampal volume change was expressed in μL, and as a percentage of baseline volume. Reproducibility of the 1-year hippocampal volume change was estimated from the BTB measurements by using linear mixed model to calculate the limits of agreement (LoA) of each method, reflecting its measurement uncertainty. Using the delta method, approximate p-values were calculated for the pairwise comparisons between methods. Statistical analyses were performed both with inclusion and exclusion of visibly incorrect segmentations. Visibly incorrect automated segmentation in either one or both scans of a longitudinal scan pair occurred in 7.5% of the hippocampi for FreeSurfer and in 6.9% of the hippocampi for FIRST. After excluding these failed cases, reproducibility analysis for 1-year percentage volume change yielded LoA of ±7.2% for FreeSurfer, ±9.7% for expert manual delineation, and ±10.0% for FIRST. Methods ranked the same for reproducibility of 1-year μL volume change, with LoA of ±218 μL for FreeSurfer, ±319 μL for expert manual delineation, and ±333 μL for FIRST. Approximate p-values indicated that reproducibility was better for FreeSurfer than for manual or FIRST, and that manual and FIRST did not differ. Inclusion of failed automated segmentations led to worsening of reproducibility of both automated methods for 1-year raw and percentage volume change. Quantitative reproducibility values of 1-year microliter and percentage hippocampal volume change were roughly similar between expert manual outlining, FIRST and FreeSurfer, but FreeSurfer reproducibility was statistically significantly superior to both manual outlining and FIRST after exclusion of failed segmentations. Copyright © 2014 Elsevier Inc. All rights reserved.
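The paper derives limits of agreement from a linear mixed model on back-to-back scan pairs; the sketch below shows a simpler Bland-Altman-style approximation to the same idea, using invented percentage volume changes:

    import numpy as np

    # Hypothetical 1-year percentage volume changes measured twice (BTB scan pairs).
    change_scan1 = np.array([-3.1, -4.8, -1.2, -6.5, -2.0, -5.4, -0.8, -3.9])
    change_scan2 = np.array([-2.4, -5.6, -0.5, -5.9, -3.1, -4.7, -1.6, -4.5])

    diff = change_scan1 - change_scan2
    loa = 1.96 * diff.std(ddof=1)           # half-width of the 95% limits of agreement
    print(f"limits of agreement: +/- {loa:.1f} percentage points")
    # A smaller half-width (e.g. FreeSurfer's +/-7.2% vs manual +/-9.7% in the paper)
    # means the method reproduces the same 1-year change more consistently.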
Granados Sánchez, A M; Orejuela Zapata, J F
2018-05-25
The pathological classification of hippocampal sclerosis is based on the loss of neurons in the substructures of the hippocampus. This study aimed to evaluate these substructures in patients with hippocampal sclerosis by magnetic resonance imaging and to compare the usefulness of this morphological analysis with that of volumetric analysis of the entire hippocampus. We included 25 controls and 25 patients with hippocampal sclerosis whose diagnosis was extracted from the institutional epilepsy board. We used FreeSurfer to process the studies and obtain the volumetric data. We evaluated overall volume and volume by substructure: fimbria, subiculum, presubiculum, hippocampal sulcus, CA1, CA2-CA3, CA4, and dentate gyrus (DG). We considered p < 0.05 statistically significant. We observed statistically significant decreases in the volume of the hippocampus ipsilateral to the epileptogenic focus in 19 (76.0%) of the 25 cases. With the exception of the hippocampal sulcus, we observed a decrease in all ipsilateral hippocampal substructures in patients with right hippocampal sclerosis (CA1, p=0.0223; CA2-CA3, p=0.0066; CA4-DG, p=0.0066; fimbria, p=0.0046; presubiculum, p=0.0087; subiculum, p=0.0017) and in those with left hippocampal sclerosis (CA1, p<0.0001; CA2-CA3, p<0.0001; CA4-DG, p<0.0001; fimbria, p=0.0183; presubiculum, p<0.0001; subiculum, p<0.0001). In four patients with left hippocampal sclerosis, none of the substructures had statistically significant alterations, although a trend toward atrophy was observed, mainly in CA2-CA3 and CA4-DG. The findings suggest that it can be useful to assess the substructures of the hippocampus to improve the performance of diagnostic imaging in patients with hippocampal sclerosis. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Clarence D. Chase; John K. Strickler
1968-01-01
The report presents statistics on area, volume, growth, mortality, and timber use. Projections of expected timber volumes 30 years in the future are also presented. These data are discussed with regard to possible future development and use of the state's woodlands.
The Regionalization of Lumbar Spine Procedures in New York State: A 10-Year Analysis.
Jancuska, Jeffrey; Adrados, Murillo; Hutzler, Lorraine; Bosco, Joseph
2016-01-01
A retrospective review of an administrative database. The purpose of this study is to determine the current extent of regionalization by mapping lumbar spine procedures according to hospital and patient zip code, as well as to examine the rate of growth of lumbar spine procedures performed at high-, medium-, and low-volume institutions in New York State. The association between hospital and spine surgeon volume and improved patient outcomes is well established, but no study has investigated the actual process of patient migration to high-volume hospitals. New York Statewide Planning and Research Cooperative System (SPARCS) administrative data were used to identify 228,695 lumbar spine surgery patients from 2005 to 2014. The data included the patients' zip code, hospital of operation, and year of discharge. The volume of lumbar spine surgery in New York State was mapped according to patient and hospital 3-digit zip code. New York State hospitals were categorized as low, medium, and high volume, and descriptive statistics were used to determine trends in changes in hospital volume. Lumbar spine surgery recipients are widely distributed throughout the state, while procedures are regionalized to a select few metropolitan centers. The total number of procedures grew 2.5% over the entire 10-year period. High-volume hospital caseload increased 50%, from 7253 procedures in 2005 to 10,915 procedures in 2014. The number of procedures at medium- and low-volume hospitals decreased 30% and 13%, respectively. Even without any concerted effort aimed at moving orthopedic patients to high-volume hospitals, migration to high-volume centers occurred. Public interest in quality outcomes and cost, as well as financial incentives among medical centers to increase market share, potentially influence the migration of patients to high-volume centers. Further regionalization has the potential to exacerbate the current level of disparities among patient populations at low- and high-volume hospitals. Level of evidence: 3.
Wagner, Maximilian E H; Gellrich, Nils-Claudius; Friese, Karl-Ingo; Becker, Matthias; Wolter, Franz-Erich; Lichtenstein, Juergen T; Stoetzer, Marcus; Rana, Majeed; Essig, Harald
2016-01-01
Objective determination of the orbital volume is important in the diagnostic process and in evaluating the efficacy of medical and/or surgical treatment of orbital diseases. Tools designed to measure orbital volume with computed tomography (CT) often cannot be used with cone beam CT (CBCT) because of inferior tissue representation, although CBCT has the benefit of greater availability and lower patient radiation exposure. Therefore, a model-based segmentation technique is presented as a new method for measuring orbital volume and compared to alternative techniques. Both eyes from thirty subjects with no known orbital pathology who had undergone CBCT as a part of routine care were evaluated (n = 60 eyes). Orbital volume was measured with manual, atlas-based, and model-based segmentation methods. Volume measurements, volume determination time, and usability were compared between the three methods. Differences in means were tested for statistical significance using two-tailed Student's t tests. Neither atlas-based (26.63 ± 3.15 mm³) nor model-based (26.87 ± 2.99 mm³) measurements were significantly different from manual volume measurements (26.65 ± 4.0 mm³). However, the time required to determine orbital volume was significantly longer for manual measurements (10.24 ± 1.21 min) than for atlas-based (6.96 ± 2.62 min, p < 0.001) or model-based (5.73 ± 1.12 min, p < 0.001) measurements. All three orbital volume measurement methods examined can accurately measure orbital volume, although atlas-based and model-based methods seem to be more user-friendly and less time-consuming. The new model-based technique achieves fully automated segmentation results, whereas all atlas-based segmentations at least required manipulations to the anterior closing. Additionally, model-based segmentation can provide reliable orbital volume measurements when CT image quality is poor.
Tang, An; Chen, Joshua; Le, Thuy-Anh; Changchien, Christopher; Hamilton, Gavin; Middleton, Michael S.; Loomba, Rohit; Sirlin, Claude B.
2014-01-01
Purpose: To explore the cross-sectional and longitudinal relationships between fractional liver fat content, liver volume, and total liver fat burden. Methods: In 43 adults with non-alcoholic steatohepatitis participating in a clinical trial, liver volume was estimated by segmentation of magnitude-based low-flip-angle multiecho GRE images. The liver mean proton density fat fraction (PDFF) was calculated. The total liver fat index (TLFI) was estimated as the product of liver mean PDFF and liver volume. Linear regression analyses were performed. Results: Cross-sectional analyses revealed statistically significant relationships between TLFI and liver mean PDFF (R2 = 0.740 baseline/0.791 follow-up, P < 0.001 baseline/P < 0.001 follow-up), and between TLFI and liver volume (R2 = 0.352/0.452, P < 0.001/< 0.001). Longitudinal analyses revealed statistically significant relationships between liver volume change and liver mean PDFF change (R2 = 0.556, P < 0.001), between TLFI change and liver mean PDFF change (R2 = 0.920, P < 0.001), and between TLFI change and liver volume change (R2 = 0.735, P < 0.001). Conclusion: Liver segmentation in combination with MRI-based PDFF estimation may be used to monitor liver volume, liver mean PDFF, and TLFI in a clinical trial. PMID:25015398
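The index defined in this abstract is simply TLFI = mean PDFF x liver volume; a short sketch with synthetic PDFF and volume values (not trial data) and a linear regression in the spirit of the reported analyses:

    import numpy as np
    from scipy import stats

    pdff = np.array([8.5, 12.1, 18.3, 22.0, 9.7, 15.4, 27.2, 11.0])      # mean PDFF, %
    volume = np.array([1650, 1820, 2100, 2350, 1580, 1900, 2600, 1700])  # liver volume, mL

    tlfi = pdff / 100.0 * volume        # total liver fat index (roughly, mL of fat)

    fit = stats.linregress(pdff, tlfi)
    print(f"TLFI vs PDFF: R^2 = {fit.rvalue ** 2:.3f}, slope = {fit.slope:.1f} mL per %PDFF")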
Adapting an Agent-Based Model of Socio-Technical Systems to Analyze System and Security Failures
2016-05-09
statistically significant amount, which it did with a p-value < 0.0003 on a simulation of 3125 iterations; the data is shown in the Delegation 1 column of...Blackout metric to a statistically significant amount, with a p-value < 0.0003 on a simulation of 3125 iterations; the data is shown in the Delegation 2...Proceedings of the 9th International Conference on Autonomous Agents and Multiagent Systems: volume 1-Volume 1, pp. 1007-1014. International Foundation
Statistical analysis of experimental multifragmentation events in 64Zn+112Sn at 40 MeV/nucleon
NASA Astrophysics Data System (ADS)
Lin, W.; Zheng, H.; Ren, P.; Liu, X.; Huang, M.; Wada, R.; Chen, Z.; Wang, J.; Xiao, G. Q.; Qu, G.
2018-04-01
A statistical multifragmentation model (SMM) is applied to the experimentally observed multifragmentation events in an intermediate heavy-ion reaction. Using the temperature and symmetry energy extracted from the isobaric yield ratio (IYR) method based on the modified Fisher model (MFM), SMM is applied to the reaction 64Zn+112Sn at 40 MeV/nucleon. The experimental isotope distribution and mass distribution of the primary reconstructed fragments are compared without afterburner and they are well reproduced. The extracted temperature T and symmetry energy coefficient a_sym from SMM simulated events, using the IYR method, are also consistent with those from the experiment. These results strongly suggest that in the multifragmentation process there is a freezeout volume, in which the thermal and chemical equilibrium is established before or at the time of the intermediate-mass fragments emission.
Scalable Performance Measurement and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd
2009-01-01
Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
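As an illustration of the first data-reduction idea (wavelet compression of load-balance data), the sketch below uses the PyWavelets package to keep only the largest coefficients of a synthetic per-task load profile; it is not Libra itself, and the profile, wavelet choice, and threshold are assumptions:

    import numpy as np
    import pywt  # PyWavelets

    rng = np.random.default_rng(0)
    tasks = 4096
    load = np.sin(np.linspace(0, 8 * np.pi, tasks)) + 0.1 * rng.normal(size=tasks)

    coeffs = pywt.wavedec(load, "db4", level=6)              # multi-scale decomposition
    flat = np.concatenate(coeffs)
    cutoff = np.quantile(np.abs(flat), 0.95)                  # keep the top 5% of coefficients
    thresholded = [pywt.threshold(c, cutoff, mode="hard") for c in coeffs]
    reconstructed = pywt.waverec(thresholded, "db4")[:tasks]

    kept = sum(int(np.count_nonzero(c)) for c in thresholded)
    rms_error = np.sqrt(np.mean((load - reconstructed) ** 2))
    print(f"kept {kept}/{flat.size} coefficients, RMS reconstruction error {rms_error:.3f}")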
NASA Astrophysics Data System (ADS)
Mahmood, H.; Siddique, M. R. H.; Akhter, M.
2016-08-01
Estimations of biomass, volume and carbon stock are important in the decision-making process for the sustainable management of a forest. These estimations can be conducted by using available allometric equations of biomass and volume. The present study aims to: i. develop a compilation of verified allometric equations of biomass, volume, and carbon for trees and shrubs of Bangladesh; ii. identify the gaps and scope for further development of allometric equations for different trees and shrubs of Bangladesh. Key stakeholders (government departments, research organizations, academic institutions, and potential individual researchers) were identified considering their involvement in the use and development of allometric equations. A list of documents containing allometric equations was prepared from secondary sources. The documents were collected, examined, and sorted to avoid repetition, yielding 50 documents. These equations were tested through a quality control scheme involving operational verification, conceptual verification, applicability, and statistical credibility. A total of 517 allometric equations for 80 species of trees, shrubs, palm, and bamboo were recorded. In addition, 222 allometric equations for 39 species were validated through the quality control scheme. Among the verified equations, 20%, 12% and 62% were for green biomass, oven-dried biomass, and volume respectively, and 4 tree species contributed 37% of the total verified equations. Five gaps have been pinpointed in the existing allometric equations of Bangladesh: a. little work on allometric equations for common tree and shrub species; b. most of the work concentrated on certain species; c. a very small proportion of allometric equations for biomass estimation; d. no allometric equations for belowground biomass and carbon estimation; and e. a low proportion of valid allometric equations. It is recommended that site- and species-specific allometric equations be developed and that consistency in field sampling, sample processing, data recording and selection of allometric equations be maintained to ensure accuracy in estimation of biomass, volume, and carbon stock in different forest types of Bangladesh.
Automated SEM Modal Analysis Applied to the Diogenites
NASA Technical Reports Server (NTRS)
Bowman, L. E.; Spilde, M. N.; Papike, James J.
1996-01-01
Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
Møller, Jens K S; Jakobsen, Marianne; Weber, Claus J; Martinussen, Torben; Skibsted, Leif H; Bertelsen, Grete
2003-02-01
A multifactorial design, including (1) percent residual oxygen, (2) oxygen transmission rate of the packaging film (OTR), (3) product to headspace volume ratio, (4) illuminance level and (5) nitrite level during curing, was established to investigate factors affecting light-induced oxidative discoloration of cured ham (packaged in a modified atmosphere of 20% carbon dioxide balanced with nitrogen) during 14 days of chill storage. Univariate statistical analysis found significant effects of all main factors on the redness (tristimulus a-value) of the ham. Subsequently, Response Surface Modelling of the data further proved that the interactions between packaging and storage conditions are important when optimising colour stability. The measured content of oxygen in the headspace was incorporated in the model, and the interaction between the measured oxygen content in the headspace and the product to headspace volume ratio was found to be crucial. Thus, it is not enough to keep the headspace oxygen level low; if the headspace volume is large at the same time, there will still be sufficient oxygen for colour-deteriorating processes to take place.
Handbook of Labor Statistics. Bulletin 2175.
ERIC Educational Resources Information Center
Springsteen, Rosalind, Comp.; Epstein, Rosalie, Comp.
This publication makes available in one volume the major series produced by the Bureau of Labor Statistics. Technical notes preceding each major section contain information on data changes and explain the services. Forty-four tables derived from the Current Population Survey (CPS) provide statistics on labor force and employment status,…
Education Statistics Quarterly. Volume 6, Issue 3, 2004. NCES 2005-612
ERIC Educational Resources Information Center
National Center for Education Statistics, 2005
2005-01-01
The National Center for Education Statistics (NCES) fulfills a congressional mandate to collect and report "statistics and information showing the condition and progress of education in the United States and other nations in order to promote and accelerate the improvement of American education." The "Quarterly" offers a…
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
NASA Astrophysics Data System (ADS)
Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas
2015-05-01
There are only a few quantitative studies on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) that is characterized by an ultra-large scan range of 800 × 800 μm2. For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large scan area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, which is characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at best writing homogeneity. The ultra-large scan range of the utilized AFM allowed for a simultaneous study of the entire preparation area of almost 1 mm2, yielding good statistics.
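A minimal sketch of the steady-state analysis described above: fit a straight line to dot volume versus dwell time and read the ink flow rate from the slope. The data values below are made up for illustration, not the measurements from the study.

```python
# Hedged sketch of the dwell-time/dot-volume fit; synthetic data, illustrative only.
import numpy as np

dwell_time_s = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # tip dwell times (s)
dot_volume_fl = np.array([9.0, 17.0, 33.0, 65.0, 128.0])       # dot volumes (fL)

slope, intercept = np.polyfit(dwell_time_s, dot_volume_fl, 1)  # linear least-squares fit
print(f"estimated flow rate: {slope:.1f} fL/s")                # paper reports ~16 fL/s
```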
Haines, David E; Wong, Wilson; Canby, Robert; Jewell, Coty; Houmsse, Mahmoud; Pederson, David; Sugeng, Lissa; Porterfield, John; Kottam, Anil; Pearce, John; Valvano, Jon; Michalek, Joel; Trevino, Aron; Sagar, Sandeep; Feldman, Marc D
2017-10-01
There is increasing evidence that using frequent invasive measures of pressure in patients with heart failure results in improved outcomes compared to traditional measures. Admittance, a measure of volume derived from preexisting defibrillation leads, is proposed as a new technique to monitor cardiac hemodynamics in patients with an implantable defibrillator. The purpose of this study was to evaluate the accuracy of a new ventricular volume sensor (VVS, CardioVol) compared with 3-dimensional echocardiography (echo) in patients with an implantable defibrillator. Twenty-two patients referred for generator replacement had their defibrillation lead attached to VVS to determine the level of agreement with a volume measurement standard (echo). Two opposite hemodynamic challenges were sequentially applied to the heart (overdrive pacing and dobutamine administration) to determine whether real changes in hemodynamics could be reliably and repeatedly assessed with VVS. Equivalence of end-diastolic volume (EDV) and stroke volume (SV) determined by both methods was also assessed. EDV and SV were compared using VVS and echo. VVS tracked expected physiologic trends. EDV was modulated -10% by overdrive pacing (14 mL). SV was modulated -13.7% during overdrive pacing (-6 mL) and increased over baseline +14.6% (+8 mL) with dobutamine. VVS and echo mean EDVs were found statistically equivalent, with margin of equivalence 13.8 mL (P <.05). Likewise, mean SVs were found statistically equivalent with margin of equivalence 15.8 mL (P <.05). VVS provides an accurate method for ventricular volume assessment using chronically implanted defibrillator leads and is statistically equivalent to echo determination of mean EDV and SV. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Bergamo, Ana Zn; Nelson-Filho, Paulo; Romano, Fábio L; da Silva, Raquel Ab; Saraiva, Maria Cp; da Silva, Lea Ab; Matsumoto, Mirian An
2016-12-01
The aim of this study was to evaluate the alterations in plaque index (PI), gingival index (GI), gingival bleeding index (GBI), and gingival crevicular fluid (GCF) volume after use of three different bracket types for 60 days. Setting and participants: The sample comprised 20 patients of both sexes aged 11-15 years (mean age: 13.3 years), with permanent dentition, adequate oral hygiene, and mild tooth crowding, overjet, and overbite. A conventional metallic bracket, Gemini™, and two different brands of self-ligating brackets - In-Ovation ® R and SmartClip™ - were bonded to the maxillary incisors and canines. PI, GI, GBI scores, and GCF volume were measured before and 30 and 60 days after bonding of the brackets. Data were analysed statistically using non-parametric tests and correlation coefficients at a 5% significance level. There was no statistically significant correlation (P > 0.05) between tooth crowding, overjet, and overbite and the PI, GI, GBI scores, and GCF volume before bonding, indicating no influence of malocclusion on the clinical parameters. Regardless of the bracket design, no statistically significant difference (P > 0.05) was found for GI and GBI scores. PI and GCF volume showed a significant difference among the brackets in different periods. In pairwise comparisons, a significant difference was observed between baseline and 60 days after bonding for the teeth bonded with the SmartClip™ self-ligating bracket (PI P = 0.009; GCF volume P = 0.001). There was an increase in PI score and GCF volume 60 days after bonding of SmartClip™ self-ligating brackets, indicating the influence of bracket design on these clinical parameters.
Relationship between volume and in-hospital mortality in digestive oncological surgery.
Pérez-López, Paloma; Baré, Marisa; Touma-Fernández, Ángel; Sarría-Santamera, Antonio
2016-03-01
The results previously obtained in Spain in the study of the relationship between surgical caseload and in-hospital mortality are inconclusive. The aim of this study is to evaluate the volume-outcome association in Spain in the setting of digestive oncological surgery. An analytical, cross-sectional study was conducted on patients who underwent surgical procedures with curative intent for esophageal, gastric, colorectal and pancreatic neoplasms between 2006-2009, using data from the Spanish MBDS. In-hospital mortality was used as the outcome variable. Control variables were patient, health care and hospital characteristics. The exposure variable was the number of interventions for each disease, dividing the hospitals into 3 categories: high volume (HV), mid volume (MV) and low volume (LV) according to the number of procedures. An inverse, statistically significant relationship between procedure volume and in-hospital mortality was observed for both volume categories in both gastric (LV: OR=1.50 [95% CI: 1.28-1.76]; MV: OR=1.49 [95% CI: 1.28-1.74]) and colorectal (LV: OR=1.44 [95% CI: 1.33-1.55]; MV: OR=1.24 [95% CI: 1.15-1.33]) cancer surgery. In pancreatic procedures, this difference was only statistically significant between the LV and HV categories (LV: OR=1.89 [95% CI: 1.29-2.75]; MV: OR=1.21 [95% CI: 0.82-1.79]). Esophageal surgery also showed an inverse relationship, which was not statistically significant (LV: OR=1.89 [95% CI: 0.98-3.64]; MV: OR=1.05 [95% CI: 0.50-2.21]). The results of this study suggest the existence in Spain of an inverse relationship between caseload and in-hospital mortality in digestive oncological surgery for the procedures analyzed. Copyright © 2015 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
Single photon laser altimeter simulator and statistical signal processing
NASA Astrophysics Data System (ADS)
Vacek, Michael; Prochazka, Ivan
2013-05-01
Spaceborne altimeters are common instruments onboard deep space rendezvous spacecraft. They provide range and topographic measurements critical in spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single photon counting approach is the ability to process signals with a very low signal-to-noise ratio, eliminating the need for large telescopes and a high power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission-specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single photon altimeter applications are low-orbit (˜10 km), low radial velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (˜10 km), where range evaluation repetition rates of ˜100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.
Overview of PECBO Module, using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, methods for inferring environmental conditions, statistical scripts in module.
Biomass statistics for Maryland--1986
Thomas S. Frieswyk; Dawn M. DiGiovanni
1990-01-01
A statistical report on the fourth forest survey of Maryland (1986). Findings are displayed in 97 tables containing estimates of forest area, tree biomass, and timber volume. Data are presented by state and county level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum products sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of imported crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. Monthly statistics on purchases of crude oil and sales of petroleum products are presented in the Petroleum Marketing Monthly in five sections: summary statistics; crude oil prices; prices of petroleum products; volumes of petroleum products; and prime supplier sales volumes of petroleum products for local consumption. 7 figs., 50 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter, Justin R; May, Peter T; Potts, Rodney J
Statistics of radar retrievals of precipitation are presented. A K-means clustering algorithm is applied to an historical record of radiosonde measurements, which identified three major synoptic regimes: a dry, stable regime with mainly westerly winds prevalent during winter, a moist south-easterly trade wind regime and a moist northerly regime, both prevalent during summer. These are referred to as the westerly, trade wind and northerly regimes, respectively. Cell statistics are calculated using an objective cell identification and tracking methodology on data obtained from a nearby S-band radar. Cell statistics are investigated for the entire radar observational period and also during sub-periods corresponding to the three major synoptic regimes. The statistics investigated are cell initiation location, area, rainrate, volume, height, height of the maximum reflectivity, volume greater than 40 dBZ, and storm speed and direction. Cells are found predominantly along the elevated topography. The cell statistics reveal that storms which form in the dry, stable westerly regime are of comparable size to the deep cells which form in the northerly regime, larger than those in the trade regime and, furthermore, have the largest rainrate. However, they occur less frequently and have shorter lifetimes than cells in the other regimes. Diurnal statistics of precipitation area and rainrate exhibit early morning and mid-afternoon peaks, although the areal coverage lags the rainrate by several hours, indicative of a transition from convective to stratiform precipitation. The probability distributions of cell area, rainrate, volume, height and height of the maximum reflectivity are found to follow lognormal distributions.
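The regime classification step can be illustrated with a short sketch: K-means applied to standardized radiosonde-derived features, with each sounding assigned to its nearest of three clusters. The feature matrix below is synthetic and the variable names are assumptions, not the study's data.

```python
# Illustrative regime classification with K-means (synthetic features, assumed names).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# columns might be, e.g., low-level wind components and humidity; values are synthetic
soundings = rng.normal(size=(500, 3))

features = StandardScaler().fit_transform(soundings)    # scale before clustering
regimes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(regimes))                              # number of soundings per regime
```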
Neural changes in periapical lesions after systemic steroids in the ferret.
Holland, G R
1993-06-01
This study was intended to clarify the relationship between the neural changes which occur around the apex of the ferret canine after pulpectomy and the inflammatory process induced by the procedure. In 12 young adult ferrets, under general anesthesia, the pulps in the mandibular canine teeth were removed and replaced with gutta percha and Grossman's sealer. Six of the animals were treated with dexamethasone to reduce the inflammatory response. Three months later, the animals, again under general anesthesia, were perfused with a fixative mixture. Three unoperated animals that had not been treated with dexamethasone were also perfused. The mandibular canine teeth and their supporting tissues were removed, processed, and serially sectioned. Three-dimensional reconstructions of the periapical lesions in each animal were assembled and their volumes measured. The density of innervation in the periapical region was estimated. The mean lesion volume in the pulpectomized animals not treated with dexamethasone was 3.54 (+/- 2.27) mm3 and in the dexamethasone-treated animals 1.33 (+/- 1.31) mm3. The differences were statistically significant when tested by the Mann-Whitney U test (p < 0.01). Bacteria were not seen within any of the lesions. The innervation density beneath the canines in the pulpectomized animals not treated with dexamethasone was 164 units per mm2 (+/- 80) and in the steroid-treated animals 151 +/- 68 units per mm2. In the control, untreated animals, the innervation density was 22 +/- 10 units per mm2. The difference between the steroid-treated pulpectomized animals and the untreated pulpectomized animals was not statistically significant (p > 0.5).(ABSTRACT TRUNCATED AT 250 WORDS)
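The group comparison reported above (a Mann-Whitney U test on lesion volumes) can be sketched as follows; the numbers are placeholders rather than the study's measurements.

```python
# Minimal Mann-Whitney U comparison of lesion volumes (hypothetical values).
from scipy.stats import mannwhitneyu

untreated_mm3 = [1.2, 2.8, 5.1, 3.9, 6.3, 2.0]   # pulpectomy only (hypothetical)
steroid_mm3   = [0.4, 1.1, 2.3, 0.9, 1.8, 1.5]   # pulpectomy + dexamethasone (hypothetical)

stat, p = mannwhitneyu(untreated_mm3, steroid_mm3, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```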
NASA Astrophysics Data System (ADS)
Ma, Kevin; Liu, Joseph; Zhang, Xuejun; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent
2016-03-01
We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatments and data analysis. The system needs to quantify lesion volumes, identify and register lesion locations to track shifts in volume and quantity of lesions in a longitudinal study. In order to perform lesion registration, we have developed a brain warping and normalizing methodology using Statistical Parametric Mapping (SPM) MATLAB toolkit for brain MRI. Patients' brain MR images are processed via SPM's normalization processes, and the brain images are analyzed and warped according to the tissue probability map. Lesion identification and contouring are completed by neuroradiologists, and lesion volume quantification is completed by the eFolder's CAD program. Lesion comparison results in longitudinal studies show key growth and active regions. The results display successful lesion registration and tracking over a longitudinal study. Lesion change results are graphically represented in the web-based user interface, and users are able to correlate patient progress and changes in the MRI images. The completed lesion and disease tracking tool would enable the eFolder to provide complete patient profiles, improve the efficiency of patient care, and perform comprehensive data analysis through an integrated imaging informatics system.
State Programs Supporting Health Manpower Training: An Inventory. Volume 1. Report and Tables.
ERIC Educational Resources Information Center
Public Health Service (DHEW), Washington, DC. Bureau of Health Manpower.
A detailed statistical review of state support for health manpower training during 1973, 1974, and 1975 with an inventory of state expenditures for specific health occupations in 32 states are presented in Volume One of this two-volume study conducted for the Health Resources Administration. Objectives of the study included: investigation of the…
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Office of Scientific and Engineering Personnel.
Volume Two of a three volume set of the Biomedical and Behavioral Research Scientists study presents tables of data which were required for the study's development by the National Research Council. Data from these tables were obtained from the Association of American Medical Colleges, the American Dental Association, the American Medical…
Effective Thermal Conductivity of an Aluminum Foam + Water Two Phase System
NASA Technical Reports Server (NTRS)
Moskito, John
1996-01-01
This study examined the effect of volume fraction and pore size on the effective thermal conductivity of an aluminum foam and water system. Nine specimens of aluminum foam representing a matrix of three volume fractions (4-8% by vol.) and three pore sizes (2-4 mm) were tested with water to determine relationships to the effective thermal conductivity. It was determined that increases in volume fraction of the aluminum phase were correlated to increases in the effective thermal conductivity. It was not statistically possible to prove that changes in pore size of the aluminum foam correlated to changes in the effective thermal conductivity. However, interaction effects between the volume fraction and pore size of the foam were statistically significant. Ten theoretical models were selected from the published literature to compare against the experimental data. Models by Asaad, Hadley, and de Vries provided effective thermal conductivity predictions within a 95% confidence interval.
2014-01-01
Quantitative imaging biomarkers (QIBs) are being used increasingly in medicine to diagnose and monitor patients’ disease. The computer algorithms that measure QIBs have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms’ bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms’ performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for QIB studies. PMID:24919828
Datamining approaches for modeling tumor control probability.
Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D
2010-11-01
Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed that include dose-volume metrics, equivalent uniform dose, mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
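As a rough illustration of the model comparison described above, the sketch below produces leave-one-out predictions of tumor control from two dose-volume variables and scores them with a Spearman rank correlation. The data are synthetic and the code is not the authors' implementation.

```python
# Hedged sketch: leave-one-out TCP-style prediction scored with Spearman rs (synthetic data).
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(56, 2))                     # columns stand in for GTV volume and V75
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=56) > 0).astype(int)

for name, model in [("logistic", LogisticRegression()),
                    ("SVM (RBF)", SVC(kernel="rbf", probability=True))]:
    clf = make_pipeline(StandardScaler(), model)
    prob = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
    rs, _ = spearmanr(prob, y)
    print(f"{name}: rs = {rs:.2f}")
```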
Steiger, V R; Brühl, A B; Weidt, S; Delsignore, A; Rufer, M; Jäncke, L; Herwig, U; Hänggi, J
2017-08-01
Social anxiety disorder (SAD) is characterized by fears of social and performance situations. Cognitive behavioral group therapy (CBGT) has, in general, positive effects on symptoms, distress and avoidance in SAD. Prior studies found increased cortical volumes and decreased fractional anisotropy (FA) in SAD compared with healthy controls (HCs). Thirty-three participants diagnosed with SAD attended a 10-week CBGT and were scanned before and after therapy. We applied three neuroimaging methods - surface-based morphometry, diffusion tensor imaging and network-based statistics - each with specific longitudinal processing protocols, to investigate CBGT-induced structural brain alterations of the gray and white matter (WM). Surface-based morphometry revealed a significant cortical volume reduction (pre- to post-treatment) in the left inferior parietal cortex, as well as a positive partial correlation between treatment success (indexed by reductions in the Liebowitz Social Anxiety Scale) and reductions in cortical volume in the bilateral dorsomedial prefrontal cortex. Diffusion tensor imaging analysis revealed a significant increase in FA in the bilateral uncinate fasciculus and right inferior longitudinal fasciculus. Network-based statistics revealed a significant increase of structural connectivity in a frontolimbic network. No partial correlations with treatment success were found in the WM analyses. For what we believe is the first time, we present a distinctive pattern of longitudinal structural brain changes after CBGT measured with three established magnetic resonance imaging analysis techniques. Our findings are in line with previous cross-sectional, unimodal SAD studies and extend them by highlighting anatomical brain alterations that point toward the level of HCs in parallel with a reduction in SAD symptomatology.
Effects of vegetation canopy on the radar backscattering coefficient
NASA Technical Reports Server (NTRS)
Mo, T.; Blanchard, B. J.; Schmugge, T. J.
1983-01-01
Airborne L- and C-band scatterometer data, taken over both vegetation-covered and bare fields, were systematically analyzed and theoretically reproduced, using a recently developed model for calculating radar backscattering coefficients of rough soil surfaces. The results show that the model can reproduce the observed angular variations of radar backscattering coefficient quite well via a least-squares fit method. Best fits to the data provide estimates of the statistical properties of the surface roughness, which is characterized by two parameters: the standard deviation of surface height, and the surface correlation length. In addition, the processes of vegetation attenuation and volume scattering require two canopy parameters, the canopy optical thickness and a volume scattering factor. Canopy parameter values for individual vegetation types, including alfalfa, milo and corn, were also determined from the best-fit results. The uncertainties in the scatterometer data were also explored.
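The fitting step can be illustrated with a least-squares fit of a simplified, water-cloud-style form in which two-way canopy attenuation depends on an optical thickness and the canopy contributes a volume-scattering term. This schematic model, its parameter names, and the data values below are assumptions for illustration, not the rough-surface model used in the paper.

```python
# Illustrative least-squares fit of a simplified canopy/soil backscatter model.
import numpy as np
from scipy.optimize import curve_fit

def sigma0_model(theta_deg, sigma_soil, tau, a):
    mu = np.cos(np.radians(theta_deg))
    atten = np.exp(-2.0 * tau / mu)                     # two-way canopy attenuation
    return sigma_soil * atten + a * mu * (1.0 - atten)  # soil term + canopy volume scattering

theta = np.array([10, 20, 30, 40, 50], dtype=float)     # incidence angles (deg)
sigma_obs = np.array([0.30, 0.22, 0.17, 0.13, 0.10])    # synthetic sigma0 (linear units)

popt, _ = curve_fit(sigma0_model, theta, sigma_obs, p0=[0.3, 0.2, 0.1],
                    bounds=(0.0, np.inf))
print(dict(zip(["sigma_soil", "tau", "a"], popt)))
```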
Infective endocarditis detection through SPECT/CT images digital processing
NASA Astrophysics Data System (ADS)
Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel
2014-03-01
Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung ratio was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung ratio values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
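A rough sketch of the kind of pipeline named above (threshold, morphological operators, watershed) is given below; it is not the authors' implementation, and the Hounsfield-unit cut-off and synthetic slice are illustrative assumptions.

```python
# Schematic air-cavity segmentation: threshold + morphology + watershed (illustrative only).
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

def sinus_air_mask(ct_slice):
    air = ct_slice < -400                                 # air-like pixels (illustrative HU cut)
    air = ndi.binary_opening(air, iterations=2)           # remove small speckle
    distance = ndi.distance_transform_edt(air)
    peaks = peak_local_max(distance, labels=air.astype(int), min_distance=10)
    markers = np.zeros_like(ct_slice, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=air)        # split touching air cavities

ct_slice = np.full((128, 128), 40.0)                      # synthetic soft tissue (HU)
ct_slice[30:70, 30:60] = -800                             # synthetic air cavity
labels = sinus_air_mask(ct_slice)
print("segmented air pixels:", int((labels > 0).sum()))
```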
Voronoi Tessellation for reducing the processing time of correlation functions
NASA Astrophysics Data System (ADS)
Cárdenas-Montes, Miguel; Sevilla-Noarbe, Ignacio
2018-01-01
The increase of data volume in Cosmology is motivating the search for new solutions to the difficulties associated with long processing times and the precision of calculations. This is especially true in the case of several relevant statistics of the galaxy distribution of the Large Scale Structure of the Universe, namely the two- and three-point angular correlation functions. For these, the processing time has grown critically with the increase of the size of the data sample. Beyond parallel implementations to overcome the barrier of processing time, space partitioning algorithms are necessary to reduce the computational load. These can delimit the elements involved in the correlation function estimation to those that can potentially contribute to the final result. In this work, Voronoi Tessellation is used to reduce the processing time of the two-point and three-point angular correlation functions. The results of this proof-of-concept show a significant reduction of the processing time when preprocessing the galaxy positions with Voronoi Tessellation.
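The benefit of spatial partitioning for pair counting can be sketched as follows. The example uses a k-d tree rather than the authors' Voronoi tessellation, and flat 2-D points rather than angular positions, but it illustrates the same idea of restricting the pairs that must be examined to those within the maximum separation of interest.

```python
# Spatial-partition pair counting (k-d tree stand-in for the Voronoi idea; synthetic points).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
points = rng.uniform(0.0, 10.0, size=(5_000, 2))    # synthetic "galaxy" positions

tree = cKDTree(points)
bins = np.linspace(0.0, 0.5, 11)                    # separation bins up to r_max = 0.5
pairs = tree.query_pairs(r=bins[-1], output_type="ndarray")
sep = np.linalg.norm(points[pairs[:, 0]] - points[pairs[:, 1]], axis=1)
dd, _ = np.histogram(sep, bins=bins)                # DD pair counts per separation bin
print(dd)
```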
Panayi, Efstathios; Peters, Gareth W; Kyriakides, George
2017-01-01
Quantifying the effects of environmental factors over the duration of the growing process on Agaricus Bisporus (button mushroom) yields has been difficult, as common functional data analysis approaches require fixed length functional data. The data available from commercial growers, however, is of variable duration, due to commercial considerations. We employ a recently proposed regression technique termed Variable-Domain Functional Regression in order to be able to accommodate these irregular-length datasets. In this way, we are able to quantify the contribution of covariates such as temperature, humidity and water spraying volumes across the growing process, and for different lengths of growing processes. Our results indicate that optimal oxygen and temperature levels vary across the growing cycle and we propose environmental schedules for these covariates to optimise overall yields.
ERIC Educational Resources Information Center
Gilpatrick, Eleanor
This document is volume 3 of a four-volume report which describes the components of the Health Services Mobility Study (HSMS) method of task analysis, job ladder design, and curriculum development. Divided into four chapters, volume 3 is a manual for using HSMS computer based statistical procedures to design job structures and job ladders. Chapter…
Unal, Ozkan; Kartum, Alp; Avcu, Serhat; Etlik, Omer; Arslan, Halil; Bora, Aydin
2009-12-01
The aim of this study was cerebrospinal fluid flow quantification in the cerebral aqueduct using the cine phase-contrast magnetic resonance imaging (MRI) technique in both sexes and five different age groups to provide normative data. Sixty subjects with no cerebral pathology were included in this study. Subjects were divided into five age groups: < or =14 years, 15-24 years, 25-34 years, 35-44 years, and > or =45 years. Phase, rephase, and magnitude images were acquired on a 1.5 T MR unit at the level of the cerebral aqueduct with a spoiled gradient echo through-plane, which is a cine phase-contrast sequence. At this level, peak flow velocity (cm/s), average flow rate (cm/s), average flow (L/min), volumes in cranial and caudal directions (mL), and net volumes (mL) were studied. There was a statistically significant difference in peak flow between the age group of < or =14 years and the older age groups. There were no statistically significant differences in average velocity, cranial and caudal volume, net volume, and average flow parameters among different age groups. Statistically significant differences were not detected in flow parameters between sexes. When using cine phase-contrast MRI in the cerebral aqueduct, only the peak velocity showed a statistically significant difference between age groups; it was higher in subjects aged < or =14 years than in those in older age groups.
Roldan-Valadez, Ernesto; Garcia-Ulloa, Ana Cristina; Gonzalez-Gutierrez, Omar; Martinez-Lopez, Manuel
2011-01-01
Computer-assisted three-dimensional (3D) data allow for an accurate evaluation of volumes compared with traditional measurements. An in vitro method comparison between geometric volume and 3D volumetry was performed to obtain reference data for pituitary volumes in normal pituitary glands (PGs) and PGs containing adenomas. Prospective, transverse, analytical study. Forty-eight subjects underwent brain magnetic resonance imaging (MRI) with 3D sequencing for computer-aided volumetry. PG phantom volumes by both methods were compared. Using the better volumetric method, volumes of normal PGs and PGs with adenoma were compared. Statistical analysis used the Bland-Altman method, t-statistics, effect size and linear regression analysis. Method comparison between 3D volumetry and geometric volume revealed a lower bias and precision for 3D volumetry. A total of 27 patients exhibited normal PGs (mean age, 42.07 ± 16.17 years), and length, height, width, geometric volume and 3D volumetry were greater in women than in men. A total of 21 patients exhibited adenomas (mean age 39.62 ± 10.79 years), and length, height, width, geometric volume and 3D volumetry were greater in men than in women, with significant volumetric differences. Age did not influence pituitary volumes on linear regression analysis. Results from the present study showed that 3D volumetry was more accurate than the geometric method. In addition, the upper normal limits of PGs overlapped with the lower volume limits of early stage microadenomas.
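The Bland-Altman comparison used above reduces to computing the bias and limits of agreement of the paired differences, as in the short sketch below (values are placeholders, not the study's data).

```python
# Minimal Bland-Altman summary for two volume measurements of the same glands.
import numpy as np

geometric_cm3 = np.array([0.45, 0.62, 0.51, 0.70, 0.39, 0.58])   # hypothetical PG volumes
volumetry_cm3 = np.array([0.41, 0.60, 0.48, 0.66, 0.37, 0.55])

diff = geometric_cm3 - volumetry_cm3
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                                     # 95% limits of agreement
print(f"bias = {bias:.3f} cm3, 95% LoA = [{bias - loa:.3f}, {bias + loa:.3f}] cm3")
```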
Bach, P M; McCarthy, D T; Deletic, A
2010-01-01
The management of stormwater pollution has placed particular emphasis on the first flush phenomenon. However, definition and current methods of analyses of the phenomena contain serious limitations, the most important being their inability to capture a possible impact of the event size (total event volume) on the first flush. This paper presents the development of a novel approach in defining and assessing the first flush that should overcome these problems. The phenomenon is present in a catchment if the decrease in pollution concentration with the absolute cumulative volume of runoff from the catchment is statistically significant. Using data from seven diverse catchments around Melbourne, Australia, changes in pollutant concentrations for Total Suspended Solids (TSS) and Total Nitrogen (TN) were calculated over the absolute cumulative runoff and aggregated from a collection of different storm events. Due to the discrete nature of the water quality data, each concentration was calculated as a flow-weighted average at 2 mm runoff volume increments. The aggregated concentrations recorded in each increment (termed as a 'slice' of runoff) were statistically compared to each other across the absolute cumulative runoff volume. A first flush is then defined as the volume at which concentrations reach the 'background concentration' (i.e. the statistically significant minimum). Initial results clearly highlight first flush and background concentrations in all but one catchment supporting the validity of this new approach. Future work will need to address factors, which will help assess the first flush's magnitude and volume. Sensitivity testing and correlation with catchment characteristics should also be undertaken.
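The aggregation step described above, flow-weighted mean concentrations over fixed 2 mm increments of cumulative runoff, can be sketched as follows; the arrays are synthetic and the column meanings are assumptions, not the Melbourne data.

```python
# Flow-weighted mean concentration per 2 mm "slice" of cumulative runoff (synthetic data).
import numpy as np

runoff_mm = np.array([0.4, 0.9, 1.6, 2.3, 3.1, 4.4, 5.2, 6.8])   # cumulative runoff per sample
flow_l    = np.array([10., 15., 20., 18., 22., 30., 16., 12.])   # volume represented by each sample
tss_mgl   = np.array([210., 180., 150., 120., 90., 70., 65., 60.])

edges = np.arange(0.0, runoff_mm.max() + 2.0, 2.0)                # 2 mm slices of runoff
idx = np.digitize(runoff_mm, edges)
for k in np.unique(idx):
    sel = idx == k
    fwmc = np.average(tss_mgl[sel], weights=flow_l[sel])          # flow-weighted mean conc.
    print(f"slice {edges[k-1]:.0f}-{edges[k-1]+2:.0f} mm: {fwmc:.0f} mg/L")
```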
Nasal mask ventilation is better than face mask ventilation in edentulous patients.
Kapoor, Mukul Chandra; Rana, Sandeep; Singh, Arvind Kumar; Vishal, Vindhya; Sikdar, Indranil
2016-01-01
Face mask ventilation of the edentulous patient is often difficult as ineffective seating of the standard mask to the face prevents attainment of an adequate air seal. The efficacy of nasal ventilation in edentulous patients has been cited in case reports but has never been investigated. Consecutive edentulous adult patients scheduled for surgery under general anesthesia with endotracheal intubation, during a 17-month period, were prospectively evaluated. After induction of anesthesia and administration of neuromuscular blocker, lungs were ventilated with a standard anatomical face mask of appropriate size, using a volume controlled anesthesia ventilator with tidal volume set at 10 ml/kg. In case of inadequate ventilation, the mask position was adjusted to achieve best-fit. Inspired and expired tidal volumes were measured. Thereafter, the face mask was replaced by a nasal mask and after achieving best-fit, the inspired and expired tidal volumes were recorded. The difference in expired tidal volumes and airway pressures at best-fit with the use of the two masks and number of patients with inadequate ventilation with use of the masks were statistically analyzed. A total of 79 edentulous patients were recruited for the study. The difference in expiratory tidal volumes with the use of the two masks at best-fit was statistically significant (P = 0.0017). Despite the best-fit mask placement, adequacy of ventilation could not be achieved in 24.1% patients during face mask ventilation, and 12.7% patients during nasal mask ventilation and the difference was statistically significant. Nasal mask ventilation is more efficient than standard face mask ventilation in edentulous patients.
Vedantham, Srinivasan; Shi, Linxi; Michaelsen, Kelly E.; Krishnaswamy, Venkataramanan; Pogue, Brian W.; Poplack, Steven P.; Karellas, Andrew; Paulsen, Keith D.
2016-01-01
A multimodality system combining a clinical prototype digital breast tomosynthesis with its imaging geometry modified to facilitate near-infrared spectroscopic imaging has been developed. The accuracy of parameters recovered from near-infrared spectroscopy is dependent on fibroglandular tissue content. Hence, in this study, volumetric estimates of fibroglandular tissue from tomosynthesis reconstructions were determined. A kernel-based fuzzy c-means algorithm was implemented to segment tomosynthesis reconstructed slices in order to estimate fibroglandular content and to provide anatomic priors for near-infrared spectroscopy. This algorithm was used to determine volumetric breast density (VBD), defined as the ratio of fibroglandular tissue volume to the total breast volume, expressed as percentage, from 62 tomosynthesis reconstructions of 34 study participants. For a subset of study participants who subsequently underwent mammography, VBD from mammography matched for subject, breast laterality and mammographic view was quantified using commercial software and statistically analyzed to determine if it differed from tomosynthesis. Summary statistics of the VBD from all study participants were compared with prior independent studies. The fibroglandular volume from tomosynthesis and mammography were not statistically different (p=0.211, paired t-test). After accounting for the compressed breast thickness, which were different between tomosynthesis and mammography, the VBD from tomosynthesis was correlated with (r =0.809, p<0.001), did not statistically differ from (p>0.99, paired t-test), and was linearly related to, the VBD from mammography. Summary statistics of the VBD from tomosynthesis were not statistically different from prior studies using high-resolution dedicated breast computed tomography. The observation of correlation and linear association in VBD between mammography and tomosynthesis suggests that breast density associated risk measures determined for mammography are translatable to tomosynthesis. Accounting for compressed breast thickness is important when it differs between the two modalities. The fibroglandular volume from tomosynthesis reconstructions is similar to mammography indicating suitability for use during near-infrared spectroscopy. PMID:26941961
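The segmentation idea can be illustrated with a compact, standard fuzzy c-means on voxel intensities (not the kernelized variant used in the study and not the authors' code), from which a volumetric breast density estimate follows directly.

```python
# Standard (non-kernelized) fuzzy c-means on synthetic voxel intensities; illustrative only.
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, n_iter=100):
    centers = np.percentile(x, np.linspace(10.0, 90.0, c))          # spread initial centers
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12            # (c, N) distances
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
        centers = (u**m @ x) / (u**m).sum(axis=1)                    # update cluster centers
    return centers, u

rng = np.random.default_rng(1)
voxels = np.concatenate([rng.normal(0.2, 0.05, 8000),    # adipose-like intensities
                         rng.normal(0.7, 0.05, 2000)])   # fibroglandular-like intensities
centers, u = fuzzy_cmeans_1d(voxels)
fib = np.argmax(centers)                                  # brighter cluster = fibroglandular
vbd = 100.0 * (np.argmax(u, axis=0) == fib).mean()        # volumetric breast density (%)
print(f"VBD ~ {vbd:.1f}%")
```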
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, R; Bai, W
Purpose: Because of statistical noise in Monte Carlo dose calculations, effective point doses may not be accurate. Volume spheres are useful for evaluating dose in Monte Carlo plans, which have an inherent statistical uncertainty. We use a user-defined sphere volume instead of a point and sample spheres around the effective point, so that averaging the dose statistics reduces the stochastic errors. Methods: Direct dose measurements were made using a 0.125 cc Semiflex ion chamber (IC) 31010 isocentrically placed in the center of a homogeneous cylindrical sliced RW3 phantom (PTW, Germany). In the scanned CT phantom series, the sensitive volume length of the IC (6.5 mm) was delineated and the isocenter was defined as the simulation effective point. All beams were simulated in Monaco in accordance with the measured model. In our simulation we used a 2 mm voxel calculation grid spacing, chose dose-to-medium calculation and requested a relative standard deviation ≤0.5%. Three different assigned IC override densities (air electron density (ED) of 0.01 g/cm3, the default CT-scanned ED, and an esophageal-lumen ED of 0.21 g/cm3) were tested at different sampling sphere radii (2.5, 2, 1.5 and 1 mm), and the statistical doses were compared with the measured dose. Results: The results show that in the Monaco TPS, for the IC using the esophageal-lumen ED of 0.21 g/cm3 and a sampling sphere radius of 1.5 mm, the statistical value is in best accordance with the measured value; the absolute average percentage deviation is 0.49%. When the IC uses the air ED of 0.01 g/cm3 or the default CT-scanned ED, the recommended statistical sampling sphere radius is 2.5 mm, with percentage deviations of 0.61% and 0.70%, respectively. Conclusion: In the Monaco treatment planning system, for ionization chamber 31010 we recommend overriding the air cavity with an ED of 0.21 g/cm3 and sampling a 1.5 mm sphere volume instead of a point dose to decrease the stochastic errors. Funding Support No. C201505006.
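The sphere-sampling idea amounts to averaging the Monte Carlo dose over all voxels whose centres fall within a chosen radius of the effective point, as in the short, Monaco-independent sketch below (synthetic dose grid, illustrative radius).

```python
# Average dose over a sphere around a point instead of reading one noisy voxel.
import numpy as np

def sphere_mean_dose(dose, point_mm, radius_mm, voxel_mm=2.0):
    """Mean dose over voxels whose centres lie within radius_mm of point_mm."""
    coords = np.indices(dose.shape).reshape(3, -1).T * voxel_mm      # voxel-centre coords (mm)
    inside = np.linalg.norm(coords - np.asarray(point_mm), axis=1) <= radius_mm
    return dose.reshape(-1)[inside].mean()

rng = np.random.default_rng(0)
dose = 2.0 + rng.normal(scale=0.01, size=(40, 40, 40))   # synthetic noisy dose grid (Gy)
print(sphere_mean_dose(dose, point_mm=(40.0, 40.0, 40.0), radius_mm=2.5))
```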
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives of treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
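The relationship between the evaluated DVH statistic (dose-at-volume) and the mean-tail-dose used as its convex surrogate can be illustrated on a synthetic dose distribution; this is a schematic illustration of the quantities, not the planning-system implementation.

```python
# Dose-at-volume D_v and upper mean-tail-dose on a synthetic per-voxel dose array.
import numpy as np

rng = np.random.default_rng(0)
dose = rng.normal(60.0, 4.0, size=50_000)                  # per-voxel dose in a structure (Gy)

def dose_at_volume(dose, v_percent):
    # D_v: the dose received by at least v% of the volume
    return np.percentile(dose, 100.0 - v_percent)

def upper_mean_tail_dose(dose, v_percent):
    # mean dose of the hottest v% of the volume (convex in the dose vector)
    thresh = dose_at_volume(dose, v_percent)
    return dose[dose >= thresh].mean()

print(f"D10            = {dose_at_volume(dose, 10):.2f} Gy")
print(f"mean-tail-dose = {upper_mean_tail_dose(dose, 10):.2f} Gy")
```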
NASA Astrophysics Data System (ADS)
Kim, Dokyun; Bravo, Luis; Matusik, Katarzyna; Duke, Daniel; Kastengren, Alan; Swantek, Andy; Powell, Christopher; Ham, Frank
2016-11-01
One of the major concerns in modern direct injection engines is the sensitivity of engine performance to fuel characteristics. Recent works have shown that even slight differences in fuel properties can cause significant changes in the efficiency and emissions of an engine. Since the combustion process is very sensitive to the fuel/air mixture formation resulting from disintegration of the liquid jet, a precise assessment of fuel sensitivity in the liquid jet atomization process is required first to study the impact of different fuels on the combustion. In the present study, the breakup process of a liquid jet from a diesel injector injecting into a quiescent gas chamber is investigated numerically and experimentally for different liquid fuels (n-dodecane, iso-octane, CAT A2 and C3). The unsplit geometric Volume-of-Fluid method is employed to capture the phase interface in Large-eddy simulations, and results are compared against radiography measurements from Argonne National Lab, including jet penetration, liquid mass distribution and volume fraction. The breakup characteristics will be shown for different fuels, as well as droplet PDF statistics, to demonstrate the influence of the physical properties on the primary atomization of the liquid jet. Supported by HPCMP FRONTIER award, US DOD, Office of the Army.
Manavella, Valeria; Romano, Federica; Garrone, Federica; Terzini, Mara; Bignardi, Cristina; Aimetti, Mario
2017-06-01
The aim of this study was to present and validate a novel procedure for the quantitative volumetric assessment of extraction sockets that combines cone-beam computed tomography (CBCT) and image processing techniques. The CBCT dataset of 9 severely resorbed extraction sockets was analyzed by means of two image processing software packages, ImageJ and Mimics, using manual and automated segmentation techniques. They were also applied to 5-mm spherical aluminum markers of known volume and to a polyvinyl chloride model of one alveolar socket scanned with Micro-CT to test the accuracy. Statistical differences in alveolar socket volume were found between the different methods of volumetric analysis (P<0.0001). The automated segmentation using Mimics was the most reliable and accurate method, with a relative error of 1.5%, considerably smaller than the errors of 7% and 10% introduced by the manual method using Mimics and by the automated method using ImageJ. The currently proposed automated segmentation protocol for the three-dimensional rendering of alveolar sockets showed more accurate results, excellent inter-observer similarity and increased user friendliness. The clinical application of this method enables a three-dimensional evaluation of extraction socket healing after reconstructive procedures and during follow-up visits.
Jiang, Y L; Yu, J P; Sun, H T; Guo, F X; Ji, Z; Fan, J H; Zhang, L J; Li, X; Wang, J J
2017-08-01
Objective: To compare the post-implant target volumes and dosimetric evaluation with the pre-plan for the gross tumor volume (GTV), using CT image fusion-based delineation versus manual delineation of the target volume in CT-guided radioactive seed implantation. Methods: A total of 10 patients treated with CT-guided (125)I seed implantation during March 2016 to April 2016 at Peking University Third Hospital were analyzed. All patients underwent pre-operative CT simulation, pre-operative planning, seed implantation, CT scanning after seed implantation and dosimetric evaluation of the GTV. For every patient, post-implant target volumes were delineated by both methods and divided into two groups. Group 1: the pre-implantation simulation and post-operative CT images were fused, and the contours of the GTV were generated automatically by the brachytherapy treatment planning system. Group 2: the contours of the GTV on the post-operative CT images were drawn manually by three senior radiation oncologists independently, and the average of the three data sets was used. Statistical analyses were performed using SPSS software, version 3.2.0. The paired t-test was used to compare the target volumes and D(90) parameters of the two modalities. Results: In Group 1, the average post-operative GTV was 12-167 (73±56) cm(3) and D(90) was 101-153 (142±19) Gy. In Group 2, they were 14-186 (80±58) cm(3) and 96-146 (122±16) Gy, respectively. For both target volumes and D(90), there was no statistical difference between pre-operation and post-operation in Group 1; the D(90) was slightly lower than that of the pre-plan, but without statistical difference (P=0.142). In Group 2, there was a significant statistical difference in the GTV between pre-operation and post-operation (P=0.002), and the difference in D(90) was similarly significant (P<0.01). Conclusion: Delineating the post-implant GTV by fusing the pre-implantation simulation and post-operative CT images, with the contours generated automatically by the brachytherapy treatment planning system, appears to offer better accuracy, reproducibility and convenience than manual delineation of the target volume, by minimizing interference from operator factors and metal artifacts. Further work and more cases are required in the future.
Maxwell, M; Howie, J G; Pryde, C J
1998-01-01
BACKGROUND: Prescribing matters (particularly budget setting and research into prescribing variation between doctors) have been handicapped by the absence of credible measures of the volume of drugs prescribed. AIM: To use the defined daily dose (DDD) method to study variation in the volume and cost of drugs prescribed across the seven main British National Formulary (BNF) chapters with a view to comparing different methods of setting prescribing budgets. METHOD: Study of one year of prescribing statistics from all 129 general practices in Lothian, covering 808,059 patients: analyses of prescribing statistics for 1995 to define volume and cost/volume of prescribing for one year for 10 groups of practices defined by the age and deprivation status of their patients, for seven BNF chapters; creation of prescribing budgets for 1996 for each individual practice based on the use of target volume and cost statistics; comparison of 1996 DDD-based budgets with those set using the conventional historical approach; and comparison of DDD-based budgets with budgets set using a capitation-based formula derived from local cost/patient information. RESULTS: The volume of drugs prescribed was affected by the age structure of the practices in BNF Chapters 1 (gastrointestinal), 2 (cardiovascular), and 6 (endocrine), and by deprivation structure for BNF Chapters 3 (respiratory) and 4 (central nervous system). Costs per DDD in the major BNF chapters were largely independent of age, deprivation structure, or fundholding status. Capitation and DDD-based budgets were similar to each other, but both differed substantially from historic budgets. One practice in seven gained or lost more than 100,000 Pounds per annum using DDD or capitation budgets compared with historic budgets. The DDD-based budget, but not the capitation-based budget, can be used to set volume-specific prescribing targets. CONCLUSIONS: DDD-based and capitation-based prescribing budgets can be set using a simple explanatory model and generalizable methods. In this study, both differed substantially from historic budgets. DDD budgets could be created to accommodate new prescribing strategies and raised or lowered to reflect local intentions to alter overall prescribing volume or cost targets. We recommend that future work on setting budgets and researching prescribing variations should be based on DDD statistics. PMID:10024703
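The DDD arithmetic behind such budgets can be summarized in a short sketch (Python); all drug quantities, DDD values, list sizes, and costs below are hypothetical illustrations, not figures from the Lothian data.

def prescribed_ddds(quantity_mg, ddd_mg):
    # Convert a prescribed quantity of a drug into defined daily doses (DDDs).
    return quantity_mg / ddd_mg

def ddd_budget(target_ddds_per_patient, list_size, cost_per_ddd):
    # Budget = target volume (DDDs per patient x list size) x cost per DDD.
    return target_ddds_per_patient * list_size * cost_per_ddd

# Hypothetical example: one 28-day prescription at the DDD equals 28 DDDs,
# and a practice target of 1.2 DDDs/patient/year in one BNF chapter at 0.15 per DDD.
print(prescribed_ddds(quantity_mg=28 * 20, ddd_mg=20))
print(ddd_budget(target_ddds_per_patient=1.2, list_size=6500, cost_per_ddd=0.15))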
A Commercial IOTV Cleaning Study
2010-04-12
…manufacturer’s list price without taking possible volume discounts into consideration. Equipment depreciation cost was calculated based on… Table-of-contents fragments list shrinkage statistical data tables for traditional wet laundering (with and without prewash spot cleaning), computer-controlled wet cleaning (without prewash spot cleaning), and liquid CO2 cleaning.
The Lure of Statistics in Data Mining
ERIC Educational Resources Information Center
Grover, Lovleen Kumar; Mehra, Rajni
2008-01-01
The field of Data Mining, like Statistics, concerns itself with "learning from data" or "turning data into information". For statisticians the term "data mining" has a pejorative meaning. Instead of finding useful patterns in large volumes of data, as in the case of Statistics, data mining has the connotation of searching for data to fit preconceived…
Education Statistics Quarterly. Volume 5, Issue 3, 2003. NCES 2005-609
ERIC Educational Resources Information Center
National Center for Education Statistics, 2004
2004-01-01
The National Center for Education Statistics (NCES) fulfills a congressional mandate to collect and report "statistics and information showing the condition and progress of education in the United States and other nations in order to promote and accelerate the improvement of American education." The "Quarterly" offers an accessible, convenient…
Shock and Vibration Symposium (59th) Held in Albuquerque, New Mexico on 18-20 October 1988. Volume 1
1988-10-01
Partial contents: The Quest for Omega = sqrt(K/M) -- Notes on the development of vibration analysis; An overview of statistical energy analysis; … and in-plane vibration transmission in statistical energy analysis; Vibroacoustic response using the finite element method and statistical energy analysis; Helium…
Forest statistics for New Hampshire
Thomas S. Frieswyk; Anne M. Malley
1985-01-01
This is a statistical report on the fourth forest survey of New Hampshire conducted in 1982-83 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that...
Forest Statistics for Pennsylvania - 1978
Thomas J. Considine; Douglas S. Powell
1980-01-01
A statistical report on the third forest survey of Pennsylvania conducted in 1977 and 1978. Statistical findings are based on data from remeasured 115-acre plots and both remeasured and new 10-point variable-radius plots. The current status of forestland area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based...
Golbaz, Isabelle; Ahlers, Christian; Goesseringer, Nina; Stock, Geraldine; Geitzenauer, Wolfgang; Prünte, Christian; Schmidt-Erfurth, Ursula Margarethe
2011-03-01
This study compared automatic and manual segmentation modalities in the retina of healthy eyes using high-definition optical coherence tomography (HD-OCT). Twenty retinas in 20 healthy individuals were examined using an HD-OCT system (Carl Zeiss Meditec, Inc.). Three-dimensional imaging was performed with an axial resolution of 6 μm at a maximum scanning speed of 25,000 A-scans/second. Volumes of 6 × 6 × 2 mm were scanned. Scans were analysed using a MATLAB-based algorithm and a manual segmentation software system (3D-Doctor). The volume values calculated by the two methods were compared. Statistical analysis revealed a high correlation between automatic and manual modes of segmentation. The automatic mode of measuring retinal volume and the corresponding three-dimensional images provided similar results to the manual segmentation procedure. Both methods were able to visualize retinal and subretinal features accurately. This study compared two methods of assessing retinal volume using HD-OCT scans in healthy retinas. Both methods were able to provide realistic volumetric data when applied to raster scan sets. Manual segmentation methods represent an adequate tool with which to control automated processes and to identify clinically relevant structures, whereas automatic procedures will be needed to obtain data in larger patient populations. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.
Turbulent entrainment across turbulent-nonturbulent interfaces in stably stratified mixing layers
NASA Astrophysics Data System (ADS)
Watanabe, T.; Riley, J. J.; Nagata, K.
2017-10-01
The entrainment process in stably stratified mixing layers is studied in relation to the turbulent-nonturbulent interface (TNTI) using direct numerical simulations. The statistics are calculated with the interface coordinate in an Eulerian frame as well as with the Lagrangian fluid particles entrained from the nonturbulent to the turbulent regions. The characteristics of entrainment change as the buoyancy Reynolds number Reb decreases and the flow begins to layer. The baroclinic torque delays the enstrophy growth of the entrained fluids at small Reb, while this effect is less efficient for large Reb. The entrained particle movement within the TNTI layer is dominated by the small dissipative scales, and the rapid decay of the kinetic energy dissipation rate due to buoyancy causes the entrained particle movement relative to the interface location to become slower. Although the Eulerian statistics confirm that there exists turbulent fluid with strong vorticity or with large buoyancy frequency near the TNTI, the entrained fluid particles circumvent these regions by passing through the TNTI in strain-dominant regions or in regions with small buoyancy frequency. The multiparticle statistics show that once the nonturbulent fluid volumes are entrained, they are deformed into flattened shapes in the vertical direction and diffuse in the horizontal direction. When Reb is large enough for small-scale turbulence to exist, the entrained fluid is able to penetrate into the turbulent core region. Once the flow begins to layer with decreasing Reb, however, the entrained fluid volume remains near the outer edge of the turbulent region and forms a stably stratified layer without vertical overturning.
Forest statistics for Southwest Georgia, 1971
Herbert A. Knight
1971-01-01
Acreage of commercial forest land in this 22-county area has declined by 180,000 acres, or almost 6 percent, since 1960. Over this same period, volume of growing-stock timber increased by 581 million cubic feet, or almost 27 percent, reversing a downward trend in volume between 1951 and 1960. Softwoods have accounted for 85 percent of this net gain in volume. In 1970...
NASA historical data book. Volume 2: Programs and projects 1958-1968
NASA Technical Reports Server (NTRS)
Ezell, Linda Neuman
1988-01-01
This is Volume 2, Programs and Projects 1958-1968, of a multi-volume series providing a 20-year compilation of summary statistical and other data descriptive of NASA's programs in aeronautics and manned and unmanned spaceflight. This series is an important component of NASA published historical reference works, used by NASA personnel, managers, external researchers, and other government agencies.
NASA historical data book. Volume 3: Programs and projects 1969-1978
NASA Technical Reports Server (NTRS)
Ezell, Linda Neuman
1988-01-01
This is Volume 3, Programs and Projects 1969-1978, of a multi-volume series providing a 20-year compilation of summary statistical and other data descriptive of NASA's programs in aeronautics and manned and unmanned spaceflight. This series is an important component of NASA published historical reference works, used by NASA personnel, managers, external researchers, and other government agencies.
NASA historical data book. Volume 1: NASA resources 1958-1968
NASA Technical Reports Server (NTRS)
Vannimmen, Jane; Bruno, Leonard C.; Rosholt, Robert L.
1988-01-01
This is Volume 1, NASA Resources 1958-1968, of a multi-volume series providing a 20-year compilation of summary statistical and other data descriptive of NASA's programs in aeronautics and manned and unmanned spaceflight. This series is an important component of NASA published historical reference works, used by NASA personnel, managers, external researchers, and other government agencies.
Sugar maple sap volume increases as vacuum level is increased
Russell S. Walters; H. Clay Smith
1975-01-01
Maple sap yields collected by using plastic tubing with a vacuum pump increased as the vacuum level was increased. Sap volumes collected at the 10- and 15-inch mercury vacuum levels were statistically significantly higher than volumes collected at the 5-inch level. Although the 15-inch vacuum yielded more sap than the 10-inch vacuum, the difference was not...
Panasiti, V; Curzio, M; Roberti, V; Lieto, P; Devirgiliis, V; Gobbi, S; Naspi, A; Coppola, R; Lopez, T; di Meo, N; Gatti, A; Trevisan, G; Londei, P; Calvieri, S
2013-01-01
The last melanoma staging system of the 2009 American Joint Committee on Cancer takes into account, for stage IV disease, the serum levels of lactate dehydrogenase (LDH) and the site of distant metastases. Our aim was to compare the significance of metastatic volume, as evaluated at the time of stage IV melanoma diagnosis, with other clinical predictors of prognosis. We conducted a retrospective multicentric study. To establish which variables were statistically correlated both with death and survival time, contingency tables were evaluated. The overall survival curves were compared using the Kaplan-Meier method. Metastatic volume and number of affected organs were statistically related to death. In detail, patients with a metastatic volume >15 cm(3) had a worse prognosis than those with a volume lower than this value (survival probability at 60 months: 6.8 vs. 40.9%, respectively). The Kaplan-Meier method confirmed that survival time was significantly related to the site(s) of metastases, to elevated LDH serum levels and to melanoma stage according to the latest system. Our results suggest that metastatic volume may be considered as a useful prognostic factor for survival among melanoma patients.
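For readers who want to reproduce this style of analysis on their own data, the sketch below shows a Kaplan-Meier comparison of two volume-defined groups with a log-rank test using the lifelines library; the follow-up times are synthetic, and only the 15 cm(3) cut-off is taken from the abstract, not from the authors' dataset.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
t_high = rng.exponential(14, 60)   # months of follow-up, volume > 15 cm^3 (hypothetical)
t_low = rng.exponential(40, 60)    # months of follow-up, volume <= 15 cm^3 (hypothetical)
e_high = np.ones_like(t_high)      # 1 = death observed (no censoring in this toy example)
e_low = np.ones_like(t_low)

kmf = KaplanMeierFitter()
kmf.fit(t_high, e_high, label="metastatic volume > 15 cm^3")
print(kmf.survival_function_.tail())

result = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print(f"log-rank p = {result.p_value:.4f}")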
Trends in Percutaneous Coronary Intervention and Coronary Artery Bypass Surgery in Korea.
Lee, Heeyoung; Lee, Kun Sei; Sim, Sung Bo; Jeong, Hyo Seon; Ahn, Hye Mi; Chee, Hyun Keun
2016-12-01
Coronary angioplasty has been replacing coronary artery bypass grafting (CABG) because of the relative advantage in terms of recovery time and noninvasiveness of the procedure. Compared to other Organization for Economic Cooperation and Development (OECD) countries, Korea has experienced a rapid increase in coronary angioplasty volumes. We analyzed changes in procedure volumes of CABG and of percutaneous coronary intervention (PCI) from three sources: the OECD Health Data, the National Health Insurance Service (NHIS) surgery statistics, and the National Health Insurance claims data. We found that the ratio of the procedure volume of PCI to that of CABG per 100,000 population was 19.12 in 2014, more than triple the OECD average of 5.92 for the same year. According to the NHIS statistics, this ratio increased from 11.4 to 19.3 between 2006 and 2013. We found that Korea has a higher ratio of total PCI to CABG procedure volumes and a more rapid increase in PCI volumes than other countries. Prospective studies are required to determine whether this increase in absolute volumes of PCI is a natural response to a real medical need or representative of medical overuse.
Drivers of annual to decadal streamflow variability in the lower Colorado River Basin
NASA Astrophysics Data System (ADS)
Lambeth-Beagles, R. S.; Troch, P. A.
2010-12-01
The Colorado River is the main water supply to the southwest region. As demand reaches the limit of supply in the southwest it becomes increasingly important to understand the dynamics of streamflow in the Colorado River and in particular the tributaries to the lower Colorado River. Climate change may pose an additional threat to the already-scarce water supply in the southwest. Due to the narrowing margin for error, water managers are keen on extending their ability to predict streamflow volumes on a mid-range to decadal scale. Before a predictive streamflow model can be developed, an understanding of the physical drivers of annual to decadal streamflow variability in the lower Colorado River Basin is needed. This research addresses this need by applying multiple statistical methods to identify trends, patterns and relationships present in streamflow, precipitation and temperature over the past century in four contributing watersheds to the lower Colorado River. The four watersheds selected were the Paria, Little Colorado, Virgin/Muddy, and Bill Williams. Time series data over a common period from 1906-2007 for streamflow, precipitation and temperature were used for the initial analysis. Through statistical analysis the following questions were addressed: 1) are there observable trends and patterns in these variables during the past century and 2) if there are trends or patterns, how are they related to each other? The Mann-Kendall test was used to identify trends in the three variables. Assumptions regarding autocorrelation and persistence in the data were taken into consideration. Kendall’s tau-b test was used to establish association between any trends found in the data. Initial results suggest there are two primary processes occurring. First, statistical analysis reveals significant upward trends in temperatures and downward trends in streamflow. However, there appears to be no trend in precipitation data. These trends in streamflow and temperature speak to increasing evaporation and transpiration processes. Second, annual variability in streamflow is not statistically correlated with annual temperature variability but appears to be highly correlated with annual precipitation variability. This implies that on a year-to-year basis, changes in streamflow volumes are directly affected by precipitation and not temperature. Future development of a predictive streamflow model will need to take into consideration these two processes to obtain accurate results. In order to extend predictive skill to the multi-year scale, relationships between precipitation, temperature and persistent climate indices such as the Pacific Decadal Oscillation, Atlantic Multidecadal Oscillation and El Nino/Southern Oscillation will need to be examined.
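A minimal sketch of the Mann-Kendall test named above is given below (Python); it omits the tie and serial-correlation corrections that the study notes must be considered, and the streamflow series is synthetic.

import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    # Two-sided Mann-Kendall trend test (normal approximation, no tie correction).
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, p

# Hypothetical 100-year annual streamflow series with a weak downward trend.
years = np.arange(1906, 2008)
flow = 100 - 0.1 * (years - 1906) + np.random.default_rng(1).normal(0, 8, len(years))
print(mann_kendall(flow))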
Petroleum marketing monthly, June 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-06-01
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum products sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of imported crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. Monthly statistics on purchases of crude oil and sales of petroleum products are presented in five sections: Summary Statistics; Crude Oil Prices; Prices of Petroleum Products; Volumes of Petroleum Products; and Prime Supplier Sales Volumes of Petroleum Products for Local Consumption. The feature article is entitled "The Second Oxygenated Gasoline Season." 7 figs., 50 tabs.
Petroleum marketing monthly, February 1999 with data for November 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-02-01
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum products sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of imported crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. Monthly statistics on purchases of crude oil and sales of petroleum products are presented in the Petroleum Marketing Monthly in six sections: Initial Estimates; Summary Statistics; Crude Oil Prices; Prices of Petroleum Products; Volumes of Petroleum Products; and Prime Supplier Sales Volumes of Petroleum Products for Local Consumption. 7 figs., 50 tabs.
Petroleum marketing monthly, March 1999 with data for December 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-01
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum products sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of imported crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. Monthly statistics on purchases of crude oil and sales of petroleum products are presented in the Petroleum Marketing Monthly in five sections: summary statistics; crude oil prices; prices of petroleum products; volumes of petroleum products; and prime supplier sales volumes of petroleum products for local consumption. 7 figs., 50 tabs.
Effect of topical ophthalmic epinastine and olopatadine on tear volume in mice.
Villareal, Arturo L; Farley, William; Pflugfelder, Stephen C
2006-12-01
To investigate the effects of topical epinastine and olopatadine on tear volume by using a mouse model. Eighty-five C57BL6 mice (170 eyes) were treated twice daily with topical ophthalmic epinastine 0.05%, olopatadine 0.1%, or atropine 1% or served as untreated controls. A thread-wetting assay was used to measure tear volume at baseline and 15, 45, 90, 120, and 240 minutes after the last instillation of the drug on days 2 and 4 of treatment. After 2 days of treatment, epinastine-treated mice showed greater mean tear volumes than olopatadine-treated mice did at 15, 45, 90, and 240 minutes, with statistical significance at 15 and 45 minutes (P<0.001). Olopatadine significantly reduced tear volume versus untreated controls at 15 and 45 minutes (P<0.001). After 4 days, tear volumes with epinastine treatment exceeded those with olopatadine treatment at all time points, with statistical significance at 45 minutes (P<0.05). Atropine rendered tears undetectable at 15, 45, and 90 minutes; tear volume returned to baseline levels at 240 minutes. Topical epinastine did not inhibit tear secretion, whereas olopatadine caused a significant decrease in tear volume. Because of its neutral impact on the lacrimal functional unit, epinastine may be an especially good choice for the treatment of allergic conjunctivitis in patients with dry eye disease or in those who are at risk for developing dry eye.
Szpinda, Michał; Paruszewska-Achtel, Monika; Woźniak, Alina; Mila-Kierzenkowska, Celestyna; Elminowska-Wenda, Gabriela; Dombek, Małgorzata; Szpinda, Anna; Badura, Mateusz
2015-01-01
Using anatomical, hydrostatic, and statistical methods, liver volumes were assessed in 69 human fetuses of both sexes aged 18-30 weeks. No sex differences were found. The median of liver volume achieved by hydrostatic measurements increased from 6.57 cm(3) at 18-21 weeks through 14.36 cm(3) at 22-25 weeks to 20.77 cm(3) at 26-30 weeks, according to the following regression: y = -26.95 + 1.74 × age ± Z × (-3.15 + 0.27 × age). The median of liver volume calculated indirectly according to the formula liver volume = 0.55 × liver length × liver transverse diameter × liver sagittal diameter increased from 12.41 cm(3) at 18-21 weeks through 28.21 cm(3) at 22-25 weeks to 49.69 cm(3) at 26-30 weeks. There was a strong relationship (r = 0.91, p < 0.001) between the liver volumes achieved by hydrostatic (x) and indirect (y) methods, expressed by y = -0.05 + 2.16x ± 7.26. The liver volume should be calculated as follows: liver volume = 0.26 × liver length × liver transverse diameter × liver sagittal diameter. The age-specific liver volumes are of great relevance in the evaluation of the normal hepatic growth and the early diagnosis of fetal micro- and macrosomias.
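The arithmetic of the two indirect estimates can be sketched as follows (Python); the liver dimensions are hypothetical values in the reported fetal range, while the coefficients 0.55 and 0.26 are the ones quoted in the abstract.

def liver_volume(length_cm, transverse_cm, sagittal_cm, coefficient):
    # Indirect liver volume estimate from three linear dimensions.
    return coefficient * length_cm * transverse_cm * sagittal_cm

dims = (6.0, 5.0, 3.5)                         # illustrative fetal liver dimensions in cm
print(liver_volume(*dims, coefficient=0.55))   # conventional formula, overestimates per the study
print(liver_volume(*dims, coefficient=0.26))   # coefficient proposed by the authors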
Alternative Derivations of the Statistical Mechanical Distribution Laws
Wall, Frederick T.
1971-01-01
A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems. PMID:16578712
Alternative derivations of the statistical mechanical distribution laws.
Wall, F T
1971-08-01
A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
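For context, a conventional textbook route to the same Boltzmann result via the Helmholtz free energy runs as follows; note that this sketch uses Stirling's approximation and is therefore not the derivation advocated in the paper, which deliberately avoids it.

$$F = \sum_i n_i \varepsilon_i - kT \ln W, \qquad W = \frac{N!}{\prod_i n_i!}, \qquad N = \sum_i n_i .$$

With $\ln n! \approx n\ln n - n$, requiring the same chemical potential $\mu = \partial F/\partial n_i$ for every level gives

$$\varepsilon_i + kT\,\ln\frac{n_i}{N} = \mu
\quad\Longrightarrow\quad
n_i = N\, e^{(\mu-\varepsilon_i)/kT} \;\propto\; e^{-\varepsilon_i/kT}.$$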
2011 statistical abstract of the United States
Krisanda, Joseph M.
2011-01-01
The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States. Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web. Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.
Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency
2013-03-01
…assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998). Controllers…
Study of Automobile Market Dynamics : Volume 2. Analysis.
DOT National Transportation Integrated Search
1977-08-01
Volume II describes the work in providing statistical inputs to a computer model by examining the effects of various options on the number of automobiles sold; the distribution of sales among small, medium and large cars; the distribution between aut...
Gerhard K. Raile; Earl C. Leatherberry
1988-01-01
The third inventory of forest resources in Illinois shows a 1.2% increase in timberland and a 40.5% gain in growing stock volume between 1962 and 1985. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.
NURE aerial gamma ray and magnetic detail survey of portions of northeast Washington. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-11-01
The Northeast Washington Survey was performed under the United States Department of Energy's National Uranium Resource Evaluation (NURE) Program, which is designed to provide radioelement distribution information to assist in assessing the uraniferous material potential of the United States. The radiometric and ancillary data were digitally recorded and processed. The results are presented in the form of stacked profiles, contour maps, flight path maps, statistical tables and frequency distribution histograms. These graphical outputs are presented at a scale of 1:62,500 and are contained in the individual Volume 2 reports.
Quantitative impact of pediatric sinus surgery on facial growth.
Senior, B; Wirtschafter, A; Mai, C; Becker, C; Belenky, W
2000-11-01
To quantitatively evaluate the long-term impact of sinus surgery on paranasal sinus development in the pediatric patient. Longitudinal review of eight pediatric patients treated with unilateral sinus surgery for periorbital or orbital cellulitis with an average follow-up of 6.9 years. Control subjects consisted of two groups, 9 normal adult patients with no computed tomographic evidence of sinusitis and 10 adult patients with scans consistent with sinusitis and a history of sinus-related symptoms extending to childhood. Application of computed tomography (CT) volumetrics, a technique allowing for precise calculation of volumes using thinly cut CT images, to the study and control groups. Paired Student t test analyses of side-to-side volume comparisons in the normal patients, patients with sinusitis, and patients who had surgery revealed no statistically significant differences. Comparisons between the orbital volumes of patients who did and did not have surgery revealed a statistically significant increase in orbital volume in patients who had surgery. Only minimal changes in facial volume measurements have been found, confirming clinical impressions that sinus surgery in children is safe and without significant cosmetic sequelae.
Jack, C R; Twomey, C K; Zinsmeister, A R; Sharbrough, F W; Petersen, R C; Cascino, G D
1989-08-01
Volumes of the right and left anterior temporal lobes and hippocampal formations were measured from magnetic resonance images in 52 healthy volunteers, aged 20-40 years. Subjects were selected by age, sex, and handedness to evaluate possible effect of these variables. Data were normalized for variation in total intracranial volume between individuals. Right-left asymmetry in the volumes of the anterior temporal lobes and hippocampal formations was a normal finding. The anterior temporal lobe of the non-dominant (right) hemisphere was larger than the left by a small (mean right-left difference, 2.3 cm3) but statistically significant amount (P less than .005) in right-handed subjects. No significant effect of age or sex was seen in normalized right or left anterior temporal lobe volume. The right hippocampal formation was larger than the left for all subjects by a small (mean right-left difference, 0.3 cm3) but statistically significant amount (P less than .001). No effect of age, sex, or handedness was seen in normalized hippocampal formation volumes.
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
NASA Astrophysics Data System (ADS)
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
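One of the two fitting techniques named above, ordinary least squares on the linearized Weibull plot, can be sketched as follows (Python); the probability estimator, sample size, and strength values are illustrative and not the study's data or exact estimator.

import numpy as np

def weibull_ols(strengths):
    # Estimate the Weibull modulus m and characteristic strength sigma_0 from
    # failure stresses by OLS on the linearized Weibull plot, using (i - 0.5)/n.
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    prob = (np.arange(1, n + 1) - 0.5) / n
    x = np.log(s)
    y = np.log(-np.log(1.0 - prob))
    m, intercept = np.polyfit(x, y, 1)         # slope of the fit = Weibull modulus
    sigma_0 = np.exp(-intercept / m)           # characteristic strength
    return m, sigma_0

# Hypothetical crush strengths (MPa) for one ice-templated YSZ batch.
rng = np.random.default_rng(2)
strengths = 200 * rng.weibull(10.0, size=30)   # true m = 10, sigma_0 = 200 MPa
print(weibull_ols(strengths))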
Three-Dimensional Eyeball and Orbit Volume Modification After LeFort III Midface Distraction.
Smektala, Tomasz; Nysjö, Johan; Thor, Andreas; Homik, Aleksandra; Sporniak-Tutak, Katarzyna; Safranow, Krzysztof; Dowgierd, Krzysztof; Olszewski, Raphael
2015-07-01
The aim of our study was to evaluate orbital volume modification with LeFort III midface distraction in patients with craniosynostosis and its influence on eyeball volume and axial diameter modification. Orbital volume was assessed by the semiautomatic segmentation method based on deformable surface models and on 3-dimensional (3D) interaction with haptics. The eyeball volumes and diameters were automatically calculated after manual segmentation of computed tomographic scans with 3D Slicer software. The mean, minimal, and maximal differences as well as the standard deviation and intraclass correlation coefficient (ICC) for intraobserver and interobserver measurement reliability were calculated. The Wilcoxon signed rank test was used to compare measured values before and after surgery. P < 0.05 was considered statistically significant. Intraobserver and interobserver ICC for haptic-aided semiautomatic orbital volume measurements were 0.98 and 0.99, respectively. The intraobserver and interobserver ICC values for manual segmentation of the eyeball volume were 0.87 and 0.86, respectively. The orbital volume increased significantly after surgery: 30.32% (mean, 5.96 mL) for the left orbit and 31.04% (mean, 6.31 mL) for the right orbit. The mean increase in eyeball volume was 12.3%. The mean increases in the eyeball axial dimensions were 7.3%, 9.3%, and 4.4% for the X-, Y-, and Z-axes, respectively. The Wilcoxon signed rank test showed that the differences between preoperative and postoperative eyeball volumes, as well as in the diameters along the X- and Y-axes, were statistically significant. Midface distraction in patients with syndromic craniostenosis results in a significant increase (P < 0.05) in the orbit and eyeball volumes. The two methods (haptic-aided semiautomatic segmentation and manual 3D Slicer segmentation) are reproducible techniques for orbit and eyeball volume measurements.
Parallel processing of genomics data
NASA Astrophysics Data System (ADS)
Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario
2016-10-01
The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, and the analysis of this flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face those issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze the data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data that handles high-dimensional data with good response time. The proposed system is able to find statistically significant biological markers that discriminate classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
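As an illustration of the kind of embarrassingly parallel per-marker analysis such a pipeline performs, the sketch below distributes independent marker tests over a process pool (Python); the synthetic data, two-class design, and choice of a t-test are hypothetical and are not the authors' algorithm.

import numpy as np
from multiprocessing import Pool
from scipy.stats import ttest_ind

def test_marker(args):
    # Test one marker column against the two patient classes; return the p-value.
    values, labels = args
    stat, p = ttest_ind(values[labels == 0], values[labels == 1])
    return p

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    data = rng.normal(size=(200, 10_000))      # samples x markers (synthetic)
    labels = rng.integers(0, 2, size=200)      # responder / non-responder classes
    with Pool() as pool:
        pvals = pool.map(test_marker, ((data[:, j], labels) for j in range(data.shape[1])))
    print(f"{np.sum(np.array(pvals) < 1e-4)} markers below p = 1e-4")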
NASA Astrophysics Data System (ADS)
Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Shen, Keming
2018-02-01
The latest high-accuracy identified hadron spectra measurements in high-energy nuclear collisions led us to the investigation of strongly interacting particles and collective effects in small systems. Since microscopical processes result in a statistical Tsallis-Pareto distribution, the fit parameters q and T are well suited for identifying system size scalings and initial conditions. Moreover, the parameter values provide information on the deviation from the extensive Boltzmann-Gibbs statistics in finite volumes. We apply here the fit procedure developed in our earlier study for proton-proton collisions [1, 2]. The observed mass and center-of-mass energy trends in the hadron production are compared to RHIC dAu and LHC pPb data in different centrality/multiplicity classes. Here we present new results on mass hierarchy in pp and pA collisions from light to heavy hadrons.
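For orientation, one commonly used form of the Tsallis-Pareto spectrum is reproduced below; the exact parametrization fitted in the paper (for instance in transverse mass rather than energy) may differ.

$$f(E) \;\propto\; \left[\,1 + (q-1)\,\frac{E}{T}\,\right]^{-\frac{1}{q-1}}
\;\xrightarrow{\;q\to 1\;}\; e^{-E/T},$$

so $q$ quantifies the deviation from the extensive Boltzmann-Gibbs statistics and $T$ acts as an effective temperature.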
Statistical intensity variation analysis for rapid volumetric imaging of capillary network flux
Lee, Jonghwan; Jiang, James Y.; Wu, Weicheng; Lesage, Frederic; Boas, David A.
2014-01-01
We present a novel optical coherence tomography (OCT)-based technique for rapid volumetric imaging of red blood cell (RBC) flux in capillary networks. Previously we reported that OCT can capture individual RBC passage within a capillary, where the OCT intensity signal at a voxel fluctuates when an RBC passes the voxel. Based on this finding, we defined a metric of statistical intensity variation (SIV) and validated that the mean SIV is proportional to the RBC flux [RBC/s] through simulations and measurements. From rapidly scanned volume data, we used Hessian matrix analysis to vectorize a segment path of each capillary and estimate its flux from the mean of the SIVs gathered along the path. Repeating this process led to a 3D flux map of the capillary network. The present technique enabled us to trace the RBC flux changes over hundreds of capillaries with a temporal resolution of ~1 s during functional activation. PMID:24761298
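A minimal sketch of the idea behind the SIV metric is given below (Python): the temporal variation of the OCT intensity at each voxel is computed and averaged along a vectorized capillary path; the normalization and the synthetic data are illustrative, not the published metric's exact definition.

import numpy as np

def siv_map(intensity_tzyx):
    # Per-voxel statistical intensity variation: temporal standard deviation of
    # repeated OCT intensity measurements; input shape is (repeats, z, y, x).
    return intensity_tzyx.std(axis=0)

def path_mean_siv(siv, path_voxels):
    # Mean SIV along a capillary path given as (z, y, x) indices; per the paper,
    # this mean is taken to be proportional to RBC flux through the capillary.
    z, y, x = np.array(path_voxels).T
    return siv[z, y, x].mean()

# Synthetic example: 20 repeated volumes with one 'capillary' voxel fluctuating.
rng = np.random.default_rng(3)
vol = rng.normal(1.0, 0.01, size=(20, 8, 16, 16))
vol[:, 4, 8, 8] += rng.normal(0.0, 0.5, size=20)   # RBC passages add intensity fluctuation
siv = siv_map(vol)
print(path_mean_siv(siv, [(4, 8, 8), (4, 8, 9)]))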
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, T; Lin, X; Yin, Y
Purpose: To compare the dosimetric differences between fixed-field intensity-modulated radiotherapy (IMRT) and double-arc volumetric-modulated arc therapy (VMAT) plans with a simultaneous integrated boost in rectal cancer. Methods: Ten patients with rectal cancer previously treated with IMRT were included in this analysis. For each patient, two treatment techniques were designed: fixed 7-field IMRT and double-arc VMAT with the RapidArc technique. The treatment plan was designed to be delivered in one course with a simultaneous integrated boost (SIB). The prescribed doses to the planning target volume of the subclinical disease (PTV1) and the gross disease (PTV2) were 45 Gy and 55 Gy in 25 fractions, respectively. The dose distribution in the target, the dose to the organs at risk, total MU, and the delivery time of the two techniques were compared to explore the dosimetric differences. Results: For target dose and homogeneity in PTV1 and PTV2, no statistically significant differences were observed between the two plans. VMAT plans showed better conformity in PTV1. VMAT plans reduced the mean dose to the bladder, small bowel, femoral heads, and iliac wings. For the iliac wings, VMAT plans resulted in a statistically significant reduction in the irradiated volumes at 15 Gy, 20 Gy, and 30 Gy but increased the 10 Gy irradiated volume. VMAT plans reduced the small bowel irradiated volumes at 20 Gy and 30 Gy. Compared with IMRT plans, VMAT plans showed a significant reduction of monitor units by nearly 30% and reduced treatment time by an average of 70%. Conclusion: Compared to IMRT plans, VMAT plans showed similar target dose and reduced dose to the organs at risk, especially for the small bowel and iliac wings. For rectal cancer, VMAT with simultaneous integrated boost can be carried out with high quality and efficiency.
Minnesota forest statistics, 1990.
Patrick D. Miles; Chung M. Chen
1992-01-01
The fifth inventory of Minnesota's forests reports 51.0 million acres of land, of which 16.7 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
ERIC Educational Resources Information Center
Ministerio de Educacion, Guatemala City (Guatemala). Oficina de Planeamiento Integral de la Educacion.
This booklet presents statistics concerning primary education in Guatemala. The first section covers enrollment, considering such factors as type of school and location. Other sections provide statistics on teachers, their locations, the number of schools, enrollment in terms of students repeating grades or leaving school, students advancing out…
Forest Statistics for Ohio--1979
Donald F. Dennis; Thomas W. Birch
1981-01-01
A statistical report on the third forest survey of Ohio conducted in 1978 and 1979. Statistical findings are based on data from remeasured and new 10-point variable radius plots. The current status of forest-land area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based on a 1978 updated canvass of...
Forest statistics for Vermont: 1973 and 1983
Thomas S. Frieswyk; Anne M. Malley
1985-01-01
A statistical report on the fourth forest survey of Vermont conducted in 1982-1983 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that the state has...
Forest statistics for New York--1980
Thomas J. Considine, Jr.; Thomas S. Frieswyk
1982-01-01
A statistical report on the third forest survey of New York conducted in 1978 and 1979. Statistical findings are based on data from remeasured and new 10-point variable-radius plots. The current status of forest-land area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based on a 1979 updated canvass of...
Timber resource statistics for timberland outside National Forests in eastern Oregon.
Neil McKay; Gary J. Lettman; Mary A. Mei
1994-01-01
This report summarizes a 1992 timber resource inventory Of timberland outside National Forests in eastern Oregon. The report presents statistical tables of timberland area, timber volume, growth, mortality, and harvest. It also displays tables of revised 1986-87 timber resource statistics for timberland outside National Forests; the 1992 and 1986-87 tables may be...
Forest statistics for Delaware: 1986 and 1999
Douglas M. Griffith; Richard H. Widmann
2001-01-01
A statistical report on the fourth forest inventory of Delaware conducted in 1999 by the Forest Inventory and Analysis Unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there are...
Forest statistics for West Virginia: 1989 and 2000
Douglas M. Griffith; Richard H. Widmann
2003-01-01
A statistical report on the fifth forest inventory of West Virginia conducted in 2000 by the Forest Inventory and Analysis unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there...
Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.
2002-01-01
We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.
Port Needs Study (Vessel Traffic Services Benefits) : Volume 2. Appendices, Part 1.
DOT National Transportation Integrated Search
1991-08-01
Volume II focuses on organization and presentation of information for each individual study zone. It contains the appendix tables of input data, output statistics and the documentation of the candidate Vessel Traffic Service (VTS) Design by NavCom Sy...
Port Needs Study (Vessel Traffic Services Benefits) : Volume 2. Appendices, Part 2.
DOT National Transportation Integrated Search
1991-01-01
Volume II focuses on organization and presentation of information for each individual study zone. It contains the appendix tables of input data, output statistics and the documentation of the candidate Vessel Traffic Services (VTS) Design by NavCom S...
USING TRACERS TO DESCRIBE NAPL HETEROGENEITY
Tracers are frequently used to estimate both the average travel time for water flow through the tracer-swept volume and NAPL saturation. The same data can be used to develop a statistical distribution describing the hydraulic conductivity in the swept volume and a possible distri...
Kansas forest inventory, 1981.
John S. Jr. Spencer; John K. Strickler; William J. Moyer
1984-01-01
The third inventory of the timber resource of Kansas shows a 1.4% increase in commercial forest area and a 42% gain in growing-stock volume between 1965 and 1980. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.
Nebraska's second forest inventory.
Gerhard K. Raile
1986-01-01
The second inventory of the timber resource of Nebraska shows a 25% decline in commercial forest area and a 23% gain in growing-stock volume between 1955 and 1983. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.
Timber resource statistics for the Upper Yukon inventory unit, Alaska, 1980.
Willem W.S. van Hees
1987-01-01
The 1980 inventory of the forest resources of the Upper Yukon unit was designed to produce inventory estimates of timberland area, volume of timber, and volumes of timber growth and mortality. Timberland area is estimated at 742,000 acres. Cubic-foot volume on all timberland is estimated at 475 million cubic feet. Timber growth and mortality are estimated at -615,000...
Kaplanoglu, Mustafa; Yuce, Tuncay; Bulbul, Mehmet
2015-01-01
The aim was to evaluate the place of mean platelet volume (MPV) in predicting spontaneous miscarriage and to identify any differences in its values following miscarriage after biochemical and clinical pregnancy. We retrospectively evaluated the data of 305 spontaneous miscarriages and 168 control subjects. The miscarriage subjects were evaluated in two groups: miscarriage after biochemical pregnancy (n=79) (BA group) and miscarriage after clinical pregnancy (n=226) (CA group). Demographic and laboratory data of all subjects were statistically compared. No statistically significant difference was found between the miscarriage and control subjects in terms of demographic data and Hb, Htc, WBC, and Plt values. The MPV value in the miscarriage group (8.99±1.47 fl) was statistically significantly lower than in the control group (9.66±1.64 fl) (P<0.001). A statistically significant difference was present between the BA, CA, and control groups, with the lowest MPV value in the BA group (8.64±1.34 fl, 9.11±1.49 fl, and 9.66±1.64 fl, respectively) (P<0.001). MPV was significantly lower in patients with miscarriage than in the control group, and this was correlated with the gestational stage at which the miscarriage occurred.
Lemola, Sakari; Oser, Nadine; Urfer-Maurer, Natalie; Brand, Serge; Holsboer-Trachsler, Edith; Bechtel, Nina; Grob, Alexander; Weber, Peter; Datta, Alexandre N
2017-01-01
To determine whether the relationship of gestational age (GA) with brain volumes and cognitive functions is linear or whether it follows a threshold model in preterm and term born children during school-age. We studied 106 children (M = 10 years 1 month, SD = 16 months; 40 females) enrolled in primary school: 57 were healthy very preterm children (10 children born 24-27 completed weeks' gestation (extremely preterm), 14 children born 28-29 completed weeks' gestation, 19 children born 30-31 completed weeks' gestation (very preterm), and 14 born 32 completed weeks' gestation (moderately preterm)), all born appropriate for GA (AGA), and 49 term-born children. Neuroimaging involved voxel-based morphometry with the statistical parametric mapping software. Cognitive functions were assessed with the WISC-IV. General Linear Models and multiple regressions were conducted controlling for age, sex, and maternal education. Compared to groups of children born 30 completed weeks' gestation and later, children born <28 completed weeks' gestation had less gray matter volume (GMV) and white matter volume (WMV) and poorer cognitive functions, including decreased full scale IQ and processing speed. Differences in GMV partially mediated the relationship between GA and full scale IQ in preterm born children. In preterm children who are born AGA and without major complications, GA is associated with brain volume and cognitive functions. In particular, decreased brain volume becomes evident in the extremely preterm group (born <28 completed weeks' gestation). In preterm children born 30 completed weeks' gestation and later, the relationship of GA with brain volume and cognitive functions may be less strong than previously thought.
Precipitation, landsliding, and erosion across the Olympic Mountains, Washington State, USA
NASA Astrophysics Data System (ADS)
Smith, Stephen G.; Wegmann, Karl W.
2018-01-01
In the Olympic Mountains of Washington State, landsliding is the primary surface process by which bedrock and hillslope regolith are delivered to river networks. However, the relative importance of large earthquakes versus high magnitude precipitation events to the total volume of landslide material transported to valley bottoms remains unknown in part due to the absence of large historical earthquakes. To test the hypothesis that erosion is linked to precipitation, approximately 1000 landslides were mapped from Google Earth imagery between 1990 and 2015 along a 15 km-wide × 85 km-long (1250 km2) swath across the range. The volume of hillslope material moved by each slide was calculated using previously published area-volume scaling relationships, and the spatial distribution of landslide volume was compared to mean annual precipitation data acquired from the PRISM climate group for the period 1981-2010. Statistical analysis reveals a significant correlation (r = 0.55; p < 0.001) between total landslide volume and mean annual precipitation, with 98% of landslide volume occurring along the windward, high-precipitation side of the range during the 25-year interval. Normalized to area, this volume yields a basin-wide erosion rate of 0.28 ± 0.11 mm yr- 1, which is similar to previous time-variable estimates of erosion throughout the Olympic Mountains, including those from river sediment yield, cosmogenic 10Be, fluvial terrace incision, and thermochronometry. The lack of large historic earthquakes makes it difficult to assess the relative contributions of precipitation and seismic shaking to total erosion, but our results suggest that climate, and more specifically a sharp precipitation gradient, plays an important role in controlling erosion and landscape evolution over both short and long timescales across the Olympic Mountains.
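The area-to-volume step described above can be sketched in a few lines (Python): each mapped landslide area A is converted to a volume with a power-law scaling V = alpha * A**gamma; the coefficients and areas below are placeholders, not the published scaling relationship or the study's data.

import numpy as np

ALPHA, GAMMA = 0.1, 1.3   # hypothetical scaling coefficients (not the study's values)

def landslide_volume(area_m2):
    # Power-law area-volume scaling for a single mapped landslide.
    return ALPHA * area_m2 ** GAMMA

areas = np.array([450.0, 1200.0, 80.0, 9800.0])   # illustrative mapped areas in m^2
volumes = landslide_volume(areas)
print(volumes.sum(), "m^3 total mobilized volume (illustrative)")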
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Benjamin M., E-mail: bmwhite@mednet.ucla.edu; Lamb, James M.; Low, Daniel A.
Purpose: To characterize radiation therapy patient breathing patterns based on measured external surrogate information. Methods: Breathing surrogate data were collected during 4DCT from a cohort of 50 patients, including 28 patients with lung cancer and 22 patients without lung cancer. A spirometer and an abdominal pneumatic bellows were used as the surrogates. The relationship between these measurements was assumed to be linear within a small phase difference. The signals were correlated and drift corrected using a previously published method to convert the signal into tidal volume. The airflow was calculated as a first-order time derivative of the tidal volume using a window centered on the point of interest, with a window length equal to the CT gantry rotation period. The airflow was compared against the tidal volume to create ellipsoidal patterns that were binned into 25 ml × 25 ml/s bins to determine the relative amount of time spent in each bin. To calculate the variability of the maximum inhalation tidal volume within a free-breathing scan timeframe, a metric based on percentile volume ratios was defined. The free-breathing variability metric (κ) was defined as the ratio between extreme inhalation tidal volumes (above the 93rd percentile of the measured tidal volume) and normal inhalation tidal volumes (above the 80th percentile). Results: There were three observed types of volume-flow curves, labeled Types 1, 2, and 3. Type 1 patients spent a greater duration of time during exhalation, with κ = 1.37 ± 0.11. Type 2 patients spent equal durations during inhalation and exhalation, with κ = 1.28 ± 0.09. The differences in the mean peak-exhalation to peak-inhalation tidal volume, breathing period, and 85th tidal volume percentile between Type 1 and Type 2 patients were statistically significant at the 2% significance level. The differences in κ and in the 98th tidal volume percentile between Type 1 and Type 2 patients were statistically significant at the 1% significance level. Three patients did not display a breathing stability curve that could be classified as Type 1 or Type 2 because of chaotic breathing patterns; these patients were classified as Type 3. Conclusions: Based on the observed volume-flow curve patterns, the cohort of 50 patients was divided into three categories, Types 1, 2, and 3. There were statistically significant differences in breathing characteristics between Type 1 and Type 2 patients. The use of volume-flow curves to classify patients has been demonstrated as a physiological characterization metric with the potential to optimize gating windows in radiation therapy.
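The two derived quantities defined above can be sketched as follows (Python), under the stated definitions: airflow as a windowed first derivative of tidal volume, and κ as a ratio of extreme (>93rd percentile) to normal (>80th percentile) inhalation tidal volumes; the exact windowing, percentile handling, and the idealized breathing trace below may differ from the paper.

import numpy as np

def airflow(tidal_volume_ml, dt_s, window_s):
    # Windowed first-order time derivative of the tidal volume signal.
    half = max(1, int(round(window_s / (2 * dt_s))))
    v = np.asarray(tidal_volume_ml, dtype=float)
    flow = np.zeros_like(v)
    for i in range(len(v)):
        lo, hi = max(0, i - half), min(len(v) - 1, i + half)
        flow[i] = (v[hi] - v[lo]) / ((hi - lo) * dt_s)
    return flow

def kappa(tidal_volume_ml):
    # Free-breathing variability: mean of extreme inhalation volumes (>93rd
    # percentile) divided by mean of normal inhalation volumes (>80th percentile).
    v = np.asarray(tidal_volume_ml, dtype=float)
    extreme = v[v > np.percentile(v, 93)].mean()
    normal = v[v > np.percentile(v, 80)].mean()
    return extreme / normal

t = np.arange(0, 300, 0.1)                      # 5 minutes sampled at 10 Hz
v = 250 * (1 - np.cos(2 * np.pi * t / 4)) / 2   # idealized 4 s breathing cycle, 0-250 ml
print(kappa(v), airflow(v, dt_s=0.1, window_s=0.5).max())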
Carmichael, Owen; Xie, Jing; Fletcher, Evan; Singh, Baljeet; DeCarli, Charles
2012-06-01
Hippocampal injury in the Alzheimer's disease (AD) pathological process is region-specific and magnetic resonance imaging (MRI)-based measures of localized hippocampus (HP) atrophy are known to detect region-specific changes associated with clinical AD, but it is unclear whether these measures provide information that is independent of that already provided by measures of total HP volume. Therefore, this study assessed the strength of association between localized HP atrophy measures and AD-related measures including cerebrospinal fluid (CSF) amyloid beta and tau concentrations, and cognitive performance, in statistical models that also included total HP volume as a covariate. A computational technique termed localized components analysis (LoCA) was used to identify 7 independent patterns of HP atrophy among 390 semiautomatically delineated HP from baseline magnetic resonance imaging of participants in the Alzheimer's Disease Neuroimaging Initiative (ADNI). Among cognitively normal participants, multiple measures of localized HP atrophy were significantly associated with CSF amyloid concentration, while total HP volume was not. In addition, among all participants, localized HP atrophy measures and total HP volume were both independently and additively associated with CSF tau concentration, performance on numerous neuropsychological tests, and discrimination between normal, mild cognitive impairment (MCI), and AD clinical diagnostic groups. Together, these results suggest that regional measures of hippocampal atrophy provided by localized components analysis may be more sensitive than total HP volume to the effects of AD pathology burden among cognitively normal individuals and may provide information about HP regions whose deficits may have especially profound cognitive consequences throughout the AD clinical course. Copyright © 2012 Elsevier Inc. All rights reserved.
2011 statistical abstract of the United States
Krisanda, Joseph M.
2011-01-01
The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States.
Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web.
Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.
Automated oil spill detection with multispectral imagery
NASA Astrophysics Data System (ADS)
Bradford, Brian N.; Sanchez-Reyes, Pedro J.
2011-06-01
In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
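A toy version of a band-combination oil mask in the spirit of the pipeline described above might look like the following sketch; the red/blue ratio heuristic, the threshold, and the minimum-region size are illustrative assumptions, not the authors' actual functions.

```python
import numpy as np
from scipy import ndimage

def oil_mask(rgb, ratio_thresh=1.15, min_pixels=50):
    """Crude color-ratio detector standing in for the authors' band
    combinations: flag pixels whose red/blue ratio exceeds a threshold, then
    keep only contiguous regions above a minimum size (segmentation step)."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float) + 1e-6
    candidate = (r / b) > ratio_thresh
    labels, n = ndimage.label(candidate)
    sizes = ndimage.sum(candidate, labels, index=np.arange(1, n + 1))
    return np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))

def coverage_sq_ft(mask, gsd_ft):
    """Rough square-footage estimate given the ground sample distance in feet."""
    return mask.sum() * gsd_ft ** 2
```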
Nasal mask ventilation is better than face mask ventilation in edentulous patients
Kapoor, Mukul Chandra; Rana, Sandeep; Singh, Arvind Kumar; Vishal, Vindhya; Sikdar, Indranil
2016-01-01
Background and Aims: Face mask ventilation of the edentulous patient is often difficult as ineffective seating of the standard mask to the face prevents attainment of an adequate air seal. The efficacy of nasal ventilation in edentulous patients has been cited in case reports but has never been investigated. Material and Methods: Consecutive edentulous adult patients scheduled for surgery under general anesthesia with endotracheal intubation, during a 17-month period, were prospectively evaluated. After induction of anesthesia and administration of a neuromuscular blocker, the lungs were ventilated with a standard anatomical face mask of appropriate size, using a volume controlled anesthesia ventilator with tidal volume set at 10 ml/kg. In case of inadequate ventilation, the mask position was adjusted to achieve best-fit. Inspired and expired tidal volumes were measured. Thereafter, the face mask was replaced by a nasal mask and after achieving best-fit, the inspired and expired tidal volumes were recorded. The difference in expired tidal volumes and airway pressures at best-fit with the use of the two masks and the number of patients with inadequate ventilation with use of the masks were statistically analyzed. Results: A total of 79 edentulous patients were recruited for the study. The difference in expiratory tidal volumes with the use of the two masks at best-fit was statistically significant (P = 0.0017). Despite the best-fit mask placement, adequacy of ventilation could not be achieved in 24.1% of patients during face mask ventilation and in 12.7% of patients during nasal mask ventilation; this difference was statistically significant. Conclusion: Nasal mask ventilation is more efficient than standard face mask ventilation in edentulous patients. PMID:27625477
Longitudinal Analysis of Superficial Midfacial Fat Volumes Over a 10-Year Period.
Tower, Jacob; Seifert, Kimberly; Paskhover, Boris
2018-04-11
Volumetric changes to facial fat that occur with aging remain poorly understood. The aim of this study was to evaluate for longitudinal changes to midfacial fat volumes in a group of individuals. We conducted a retrospective longitudinal study of adult subjects who underwent multiple facial computed tomographic (CT) scans timed at least 8 years apart. Subjects who underwent facial surgery or suffered facial trauma were excluded. Facial CT scans were analyzed, and superficial cheek fat volumes were measured and compared to track changes that occurred with aging. Fourteen subjects were included in our analysis of facial aging (5 male, 9 female; mean initial age 50.9 years; mean final age 60.4 years). In the right superficial cheek there was an increase in mean (SD) superficial fat volume from 10.33 (2.01) to 10.50 (1.80) cc, which was not statistically significant (P = 0.75). Similar results were observed in the left cheek. There were no statistically significant longitudinal changes to caudal, middle, or cephalad subdivisions of bilateral superficial cheek fat. A simple linear regression was performed to predict superficial cheek fat pad volume based on age, which did not reach statistical significance (P = 0.31), with an R² of 0.039. This study is the first to quantitatively assess for longitudinal changes to midfacial fat in a group of individuals. Superficial cheek fat remained stable as subjects aged from approximately 50 to 60 years old, with no change in total volume or redistribution within a radiographically defined compartment. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
Graczyk, Michelle B.; Duarte Queirós, Sílvio M.
2016-01-01
We study the intraday behaviour of the statistical moments of the trading volume of the blue chip equities that composed the Dow Jones Industrial Average index between 2003 and 2014. By splitting that time interval into semesters, we provide a quantitative account of the nonstationary nature of the intraday statistical properties as well. Explicitly, we show that the well-known ∪-shape exhibited by the average trading volume—as well as by the volatility of the price fluctuations—experienced a significant change from 2008 (the year of the “subprime” financial crisis) onwards. That has resulted in a faster relaxation after the market opening and relates to a consistent decrease in the convexity of the average trading volume intraday profile. Simultaneously, the last part of the session has become steeper as well, a modification that is likely to have been triggered by the new short-selling rules that were introduced in 2007 by the Securities and Exchange Commission. The combination of both results reveals that the ∪ has been turning into a ⊔. Additionally, the analysis of higher-order cumulants—namely the skewness and the kurtosis—shows that the morning and the afternoon parts of the trading session are each clearly associated with different statistical features and hence dynamical rules. Concretely, we claim that the large initial trading volume is due to wayward stocks whereas the large volume during the last part of the session hinges on a cohesive increase of the trading volume. That dissimilarity between the two parts of the trading session is stressed in periods of higher turmoil in the market. PMID:27812141
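A minimal sketch of the intraday moment profile described above, assuming the trade data are available as a pandas DataFrame with a DatetimeIndex and a 'volume' column; the 30-minute binning is an illustrative choice rather than the authors' exact procedure.

```python
import pandas as pd

def intraday_volume_moments(trades, freq="30min"):
    """Intraday profile of trading-volume moments: group observations by
    time-of-day bin and compute mean, dispersion, skewness and kurtosis
    across days for each slot."""
    bins = trades.index.floor(freq).time
    g = trades.groupby(bins)["volume"]
    return pd.DataFrame({
        "mean": g.mean(),                     # the U- (or ⊔-) shaped profile
        "std": g.std(),
        "skew": g.skew(),
        "kurtosis": g.apply(pd.Series.kurt),
    })
```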
Zador, Zsolt; Coope, David J; Gnanalingham, Kanna; Lawton, Michael T
2014-04-01
Eyebrow craniotomy is a recently described minimally invasive approach used primarily for pathology of the anterior skull base. The removal of the orbital bar may further expand the surgical corridor of this exposure, but the extent of benefit is poorly quantified. We assessed the effect of orbital bar removal with regard to surgical access in the eyebrow craniotomy using classic morphometric measurements in cadaver heads. Using surgical phantoms and neuronavigation, we also measured the 'working volume', a new parameter for characterising the volume of surgical access in these approaches. Silicon injected cadaver heads (n = 5) were used for morphometric analysis of the eyebrow craniotomy with and without orbital bar removal. Working depths and 'working areas' of surgical access were measured as defined by key anatomical landmarks. The eyebrow craniotomy with or without orbital bar removal was also simulated using surgical phantoms (n = 3, 90-120 points per trial), calibrated against a frameless neuronavigation system. Working volume was derived from reference coordinates recorded along the anatomical borders of the eyebrow craniotomy using the "α-shape algorithm" in R statistics. In cadaver heads, eyebrow craniotomy with removal of the orbital bar reduced the working depth to the ipsilateral anterior clinoid process (42 ± 2 versus 33 ± 3 mm; p < 0.05), but the working areas as defined by deep neurovascular and bony landmarks were statistically unchanged (total working areas of 418 ± 80 cm(2) versus 334 ± 48 cm(2); p = 0.4). In surgical phantom studies, however, the working volume for the simulated eyebrow craniotomies was increased with orbital bar removal (16 ± 1 cm(3) versus 21 ± 1 cm(3); p < 0.01). In laboratory studies, orbital bar removal in eyebrow craniotomy provides a modest reduction in working depth and an increase in the working volume. However, this must be weighed against the added morbidity of the procedure. Working volume, a newly developed parameter, may provide a more meaningful endpoint for characterising surgical access for different surgical approaches, and it could be applied to other operative cases undertaken with frameless neuronavigation.
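The working-volume computation can be approximated as in the following sketch; a convex hull is used as a simplified stand-in for the α-shape algorithm the authors ran in R, so it will overestimate the volume of concave access corridors.

```python
import numpy as np
from scipy.spatial import ConvexHull

def working_volume_cm3(points_mm):
    """Approximate 'working volume' from neuronavigation reference coordinates
    (N x 3 array in millimetres) using the convex hull of the recorded points."""
    hull = ConvexHull(np.asarray(points_mm, dtype=float))
    return hull.volume / 1000.0  # mm^3 -> cm^3
```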
Vasilak, Lindsay; Tanu Halim, Silvie M; Das Gupta, Hrishikesh; Yang, Juan; Kamperman, Marleen; Turak, Ayse
2017-04-19
In this study, we assess the utility of a normal force (pull-test) approach to measuring adhesion in organic solar cells and organic light-emitting diodes. This approach is a simple and practical method of monitoring the impact of systematic changes in materials, processing conditions, or environmental exposure on interfacial strength and electrode delamination. The ease of measurement enables a statistical description with numerous samples, variant geometry, and minimal preparation. After examining over 70 samples, using the Weibull modulus and the characteristic breaking strength as metrics, we were able to successfully differentiate the adhesion values between 8-tris(hydroxyquinoline aluminum) (Alq 3 ) and poly(3-hexyl-thiophene) and [6,6]-phenyl C61-butyric acid methyl ester (P3HT:PCBM) interfaces with Al and between two annealing times for the bulk heterojunction polymer blends. Additionally, the Weibull modulus, a relative measure of the range of flaw sizes at the fracture plane, can be correlated with the roughness of the organic surface. Finite element modeling of the delamination process suggests that the out-of-plane elastic modulus for Alq 3 is lower than the reported in-plane elastic values. We suggest a statistical treatment of a large volume of tests be part of the standard protocol for investigating adhesion to accommodate the unavoidable variability in morphology and interfacial structure found in most organic devices.
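A minimal sketch of the Weibull analysis named above, assuming the pull-test breaking strengths are available as an array; fixing the Weibull location parameter at zero is a common convention assumed here, and the sample values are made up for illustration.

```python
import numpy as np
from scipy import stats

def weibull_adhesion_stats(breaking_strengths):
    """Fit a two-parameter Weibull distribution to pull-test breaking strengths.
    Returns the Weibull modulus (shape) and the characteristic strength (scale),
    the two metrics used as adhesion descriptors."""
    shape, _, scale = stats.weibull_min.fit(breaking_strengths, floc=0)
    return shape, scale

# Hypothetical strengths (MPa); real values would come from the pull tests.
strengths = np.array([0.8, 1.1, 0.9, 1.3, 1.0, 1.2, 0.7, 1.4, 1.05, 0.95])
m, sigma0 = weibull_adhesion_stats(strengths)
print(f"Weibull modulus = {m:.2f}, characteristic strength = {sigma0:.2f} MPa")
```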
Hsin, Yue-Loong; Harnod, Tomor; Chang, Cheng-Siu; Peng, Syu-Jyun
2017-11-01
Convulsive motor activity is a clinical manifestation of secondarily generalized seizures evolving from different focal regions. The way in which the motor seizures present themselves is not very different from most of the generalized seizures in and between epilepsy patients. This might point towards the involvement of motor-related cortices and the corticospinal pathway for widespread propagation of epileptic activity. Our aim was to identify changes in the cerebral structures and to correlate clinical variables with structural changes particularly in the motor-related cortices and pathway of patients with generalized convulsions from different seizure foci. Sixteen patients with focal onset and secondarily generalized seizures were included, along with sixteen healthy volunteers. Structural differences were analysed by measuring grey matter (GM) volume and thickness via T1-weighted MRI, and white matter (WM) fractional anisotropy (FA) via diffusion tensor imaging. GM and WM microstructural properties were compared between patients and controls by voxel- and surface-based analyses. Next, morphometric findings were correlated with seizure severity and disease duration to identify the pathologic process. In addition to widely reduced GM and WM properties, increased GM volume in the bilateral precentral gyri and paracentral lobules, and elevated regional FA in the bilateral corticospinal tracts adjacent to these motor-related GM, were observed in patients, with a higher statistical difference in the sub-patient group with drug resistance. The increment of GM volume and WM FA in the motor pathway positively correlated with severity and duration of epilepsy. The demonstrated microstructural changes of motor pathways imply a plastic process of motor networks in the patients with frequent generalization of focal seizures. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Note: Nonpolar solute partial molar volume response to attractive interactions with water.
Williams, Steven M; Ashbaugh, Henry S
2014-01-07
The impact of attractive interactions on the partial molar volumes of methane-like solutes in water is characterized using molecular simulations. Attractions account for a significant 20% volume drop between a repulsive Weeks-Chandler-Andersen and full Lennard-Jones description of methane interactions. The response of the volume to interaction perturbations is characterized by linear fits to our simulations and a rigorous statistical thermodynamic expression for the derivative of the volume to increasing attractions. While a weak non-linear response is observed, an average effective slope accurately captures the volume decrease. This response, however, is anticipated to become more non-linear with increasing solute size.
Note: Nonpolar solute partial molar volume response to attractive interactions with water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Steven M.; Ashbaugh, Henry S., E-mail: hanka@tulane.edu
2014-01-07
The impact of attractive interactions on the partial molar volumes of methane-like solutes in water is characterized using molecular simulations. Attractions account for a significant 20% volume drop between a repulsive Weeks-Chandler-Andersen and full Lennard-Jones description of methane interactions. The response of the volume to interaction perturbations is characterized by linear fits to our simulations and a rigorous statistical thermodynamic expression for the derivative of the volume to increasing attractions. While a weak non-linear response is observed, an average effective slope accurately captures the volume decrease. This response, however, is anticipated to become more non-linear with increasing solute size.
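The linear-response fit mentioned in the abstract can be illustrated with a short sketch; the λ-scaling of the attractive interaction and all numerical values below are hypothetical placeholders, not simulation results.

```python
import numpy as np

# Hypothetical data: partial molar volume (cm^3/mol) of a methane-like solute
# versus a coupling parameter lambda that turns on the attractive part of the
# potential (lambda = 0: repulsive WCA core, lambda = 1: full Lennard-Jones).
lam = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
v_partial = np.array([60.0, 57.1, 54.5, 51.8, 48.0])   # illustrative values only

# The abstract reports that a single effective slope captures most of the
# ~20% volume drop; residuals from the linear fit expose the weak non-linearity.
slope, intercept = np.polyfit(lam, v_partial, 1)
print("effective slope dV/dlambda =", slope)
print("fractional drop WCA -> LJ  =", (v_partial[0] - v_partial[-1]) / v_partial[0])
```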
Impact of Major Pulmonary Resections on Right Ventricular Function: Early Postoperative Changes.
Elrakhawy, Hany M; Alassal, Mohamed A; Shaalan, Ayman M; Awad, Ahmed A; Sayed, Sameh; Saffan, Mohammad M
2018-01-15
Right ventricular (RV) dysfunction after pulmonary resection in the early postoperative period is documented by reduced RV ejection fraction and increased RV end-diastolic volume index. Supraventricular arrhythmia, particularly atrial fibrillation, is common after pulmonary resection. RV assessment can be done by non-invasive methods and/or invasive approaches such as right cardiac catheterization. Incorporation of a rapid response thermistor to pulmonary artery catheter permits continuous measurements of cardiac output, right ventricular ejection fraction, and right ventricular end-diastolic volume. It can also be used for right atrial and right ventricular pacing, and for measuring right-sided pressures, including pulmonary capillary wedge pressure. This study included 178 patients who underwent major pulmonary resections, 36 who underwent pneumonectomy assigned as group (I) and 142 who underwent lobectomy assigned as group (II). The study was conducted at the cardiothoracic surgery department of Benha University hospital in Egypt; patients enrolled were operated on from February 2012 to February 2016. A rapid response thermistor pulmonary artery catheter was inserted via the right internal jugular vein. Preoperatively the following was recorded: central venous pressure, mean pulmonary artery pressure, pulmonary capillary wedge pressure, cardiac output, right ventricular ejection fraction and volumes. The same parameters were collected in fixed time intervals after 3 hours, 6 hours, 12 hours, 24 hours, and 48 hours postoperatively. For group (I): There were no statistically significant changes between the preoperative and postoperative records in the central venous pressure and mean arterial pressure; there were no statistically significant changes in the preoperative and 12, 24, and 48 hour postoperative records for cardiac index; 3 and 6 hours postoperative showed significant changes. There were statistically significant changes between the preoperative and postoperative records for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction and right ventricular end diastolic volume index, in all postoperative records. For group (II): There were no statistically significant changes between the preoperative and all postoperative records for the central venous pressure, mean arterial pressure and cardiac index. There were statistically significant changes between the preoperative and postoperative records for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction and right ventricular end diastolic volume index in all postoperative records. There were statistically significant changes between the two groups in all postoperative records for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction and right ventricular end diastolic volume index. There is right ventricular dysfunction early after major pulmonary resection caused by increased right ventricular afterload. This dysfunction is more present in pneumonectomy than in lobectomy. Heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction, and right ventricular end diastolic volume index are significantly affected by pulmonary resection.
Noise in Nonlinear Dynamical Systems 3 Volume Paperback Set
NASA Astrophysics Data System (ADS)
Moss, Frank; McClintock, P. V. E.
2011-11-01
Volume 1: List of contributors; Preface; Introduction to volume one; 1. Noise-activated escape from metastable states: an historical view Rolf Landauer; 2. Some Markov methods in the theory of stochastic processes in non-linear dynamical systems R. L. Stratonovich; 3. Langevin equations with coloured noise J. M. Sancho and M. San Miguel; 4. First passage time problems for non-Markovian processes Katja Lindenberg, Bruce J. West and Jaume Masoliver; 5. The projection approach to the Fokker-Planck equation: applications to phenomenological stochastic equations with coloured noises Paolo Grigolini; 6. Methods for solving Fokker-Planck equations with applications to bistable and periodic potentials H. Risken and H. D. Vollmer; 7. Macroscopic potentials, bifurcations and noise in dissipative systems Robert Graham; 8. Transition phenomena in multidimensional systems - models of evolution W. Ebeling and L. Schimansky-Geier; 9. Coloured noise in continuous dynamical systems: a functional calculus approach Peter Hanggi; Appendix. On the statistical treatment of dynamical systems L. Pontryagin, A. Andronov and A. Vitt; Index. Volume 2: List of contributors; Preface; Introduction to volume two; 1. Stochastic processes in quantum mechanical settings Ronald F. Fox; 2. Self-diffusion in non-Markovian condensed-matter systems Toyonori Munakata; 3. Escape from the underdamped potential well M. Buttiker; 4. Effect of noise on discrete dynamical systems with multiple attractors Edgar Knobloch and Jeffrey B. Weiss; 5. Discrete dynamics perturbed by weak noise Peter Talkner and Peter Hanggi; 6. Bifurcation behaviour under modulated control parameters M. Lucke; 7. Period doubling bifurcations: what good are they? Kurt Wiesenfeld; 8. Noise-induced transitions Werner Horsthemke and Rene Lefever; 9. Mechanisms for noise-induced transitions in chemical systems Raymond Kapral and Edward Celarier; 10. State selection dynamics in symmetry-breaking transitions Dilip K. Kondepudi; 11. Noise in a ring-laser gyroscope K. Vogel, H. Risken and W. Schleich; 12. Control of noise and applications to optical systems L. A. Lugiato, G. Broggi, M. Merri and M. A. Pernigo; 13. Transition probabilities and spectral density of fluctuations of noise driven bistable systems M. I. Dykman, M. A. Krivoglaz and S. M. Soskin; Index. Volume 3: List of contributors; Preface; Introduction to volume three; 1. The effects of coloured quadratic noise on a turbulent transition in liquid He II J. T. Tough; 2. Electrohydrodynamic instability of nematic liquid crystals: growth process and influence of noise S. Kai; 3. Suppression of electrohydrodynamic instabilities by external noise Helmut R. Brand; 4. Coloured noise in dye laser fluctuations R. Roy, A. W. Yu and S. Zhu; 5. Noisy dynamics in optically bistable systems E. Arimondo, D. Hennequin and P. Glorieux; 6. Use of an electronic model as a guideline in experiments on transient optical bistability W. Lange; 7. Computer experiments in nonlinear stochastic physics Riccardo Mannella; 8. Analogue simulations of stochastic processes by means of minimum component electronic devices Leone Fronzoni; 9. Analogue techniques for the study of problems in stochastic nonlinear dynamics P. V. E. McClintock and Frank Moss; Index.
Forest statistics for New Jersey--1987
Dawn M. DiGiovanni; Charles T. Scott; Charles T. Scott
1990-01-01
A statistical report on the third forest survey of New Jersey (1987). Findings are displayed in 66 tables containing estimates of forest area, numbers of trees, timber volume, tree biomass, and timber products output. Data are presented at two levels: state and county.
DOT National Transportation Integrated Search
1976-03-01
This introductory portion of a system science for transportation planning is based on the statistical physics of ensembles; a foundation is laid on how statistical mechanics, equilibrium thermodynamics, and near-equilibrium thermodynamics can be u...
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
Forest statistics for Vermont: 1983 and 1997
Thomas S. Frieswyk; Richard H. Widmann; Richard H. Widmann
2000-01-01
A statistical report on the fifth forest inventory of Vermont, 1996-1998. Findings are displayed in 86 tables containing estimates of forest area, numbers of trees, timber volume, growth, change, and biomass. Data are presented at three levels: state, county, and region.
DOT National Transportation Integrated Search
2001-01-01
Airport Activity Statistics of Certificated Air Carriers: Summary Tables presents summary data for all scheduled and nonscheduled service by large certificated U.S. air carriers, including the volume of passenger, freight, and mail enplanements,...
Forest Statistics for Maine, 1995
Douglas M. Griffith; Carol L. Alerich; Carol L. Alerich
1996-01-01
A statistical report on the fourth forest inventory of Maine conducted in 1994-96. Findings are displayed in 117 tables containing estimates of forest area, numbers of trees, timber volume, and growth. Data are presented at three levels: state, geographic unit, and county.
Forest Statistics for New Jersey: 1987 and 1999
Douglas M. Griffith; Richard H. Widmann; Richard H. Widmann
2001-01-01
A statistical report on the fourth forest inventory of New Jersey, 1999. Findings are displayed in 49 tables containing estimates of forest area, numbers of trees, timber volume, growth, change, and biomass. Data are presented at two levels: state and county.
Forest statistics for Minnesota's Prairie Unit.
Sue M. Roussopoulos
1992-01-01
The fifth inventory of Minnesota's Prairie Unit reports 19.2 million acres of land, of which 660 thousand acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Forest Statistics for Connecticut--1972 and 1985
David R. Dickson; Carol L. McAfee; Carol L. McAfee
1988-01-01
A statistical report on the third forest survey of Connecticut (1984). Findings are displayed in 77 tables containing estimates of forest area, numbers of trees, timber volume, tree biomass, and timber products output. Data are presented at two levels: state and county.
Forest statistics for Delaware-1972 and 1986
Thomas S. Frieswyk; Dawn M. DiGiovanni; Dawn M. DiGiovanni
1989-01-01
A statistical report on the third forest survey of Delaware (1986). Findings are displayed in 65 tables containing estimates of forest area, number of trees, timber volume, tree biomass, and timber products output. Data are presented at two levels: state and county.
Michigan forest statistics, 1993.
Earl C. Leatherberry; John S. Jr. Spencer
1996-01-01
The fifth forest inventory of Michigan's forest reports 36.4 million acres of land, of which 19.3 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and biomass.
Forest Statistics for Massachusetts--1972 and 1985
David R. Dickson; Carol L. McAfee; Carol L. McAfee
1988-01-01
A statistical report on the third forest survey of Massachusetts (1984). Findings are displayed in 76 tables containing estimates of forest area, numbers of trees, timber volume, tree biomass, and timber products output. Data are presented at two levels: state and county.
Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P
2015-02-01
Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
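Two of the standard performance metrics discussed in this literature, repeatability from test-retest data and bias with limits of agreement against a reference, can be sketched as follows; these are generic formulations, not the specific analyses of the three studies.

```python
import numpy as np

def repeatability_coefficient(test, retest):
    """Repeatability coefficient from paired test-retest measurements of the
    same nodules: 1.96 * sqrt(2) * within-subject SD, with the within-subject
    SD estimated from the paired differences."""
    d = np.asarray(test, float) - np.asarray(retest, float)
    wsd = np.sqrt(np.mean(d ** 2) / 2.0)
    return 1.96 * np.sqrt(2.0) * wsd

def bias_and_loa(measured, truth):
    """Bias and 95% limits of agreement of an algorithm against reference
    (e.g., phantom) volumes, in the spirit of a Bland-Altman analysis."""
    d = np.asarray(measured, float) - np.asarray(truth, float)
    half_width = 1.96 * d.std(ddof=1)
    return d.mean(), (d.mean() - half_width, d.mean() + half_width)
```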
NASA Astrophysics Data System (ADS)
Otake, Y.; Murphy, R. J.; Grupp, R. B.; Sato, Y.; Taylor, R. H.; Armand, M.
2015-03-01
A robust atlas-to-subject registration using a statistical deformation model (SDM) is presented. The SDM uses statistics of voxel-wise displacement learned from pre-computed deformation vectors of a training dataset. This allows an atlas instance to be directly translated into an intensity volume and compared with a patient's intensity volume. Rigid and nonrigid transformation parameters were simultaneously optimized via the Covariance Matrix Adaptation - Evolutionary Strategy (CMA-ES), with image similarity used as the objective function. The algorithm was tested on CT volumes of the pelvis from 55 female subjects. A performance comparison of the CMA-ES and Nelder-Mead downhill simplex optimization algorithms with the mutual information and normalized cross correlation similarity metrics was conducted. Simulation studies using synthetic subjects were performed, as well as leave-one-out cross validation studies. Both studies suggested that mutual information and CMA-ES achieved the best performance. The leave-one-out test demonstrated 4.13 mm error with respect to the true displacement field, and 26,102 function evaluations in 180 seconds, on average.
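A minimal sketch of the histogram-based mutual information similarity metric that performed best in this comparison; the bin count and the absence of smoothing are illustrative choices, not details from the paper.

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=32):
    """Histogram-based mutual information between two intensity volumes,
    usable as the objective function driving an intensity-based registration."""
    hist, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of volume A
    py = pxy.sum(axis=0, keepdims=True)      # marginal of volume B
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```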
Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi
2016-01-01
Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters with ASIR as compared to filtered back projection (FBP) may influence quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone, FBP combined with ASIR 30%, 50%, 70%, and ASIR 100% based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP (each P < 0.001). Use of ASIR reduced Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, adaptive statistical iterative reconstruction (ASIR) may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
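For orientation, a heavily simplified per-slice Agatston score can be sketched as follows; clinical scoring operates per lesion with connectivity and minimum-area rules, so this is only a rough illustration of the 130 HU threshold and the density weighting, with the 3 mm normalisation handled crudely.

```python
import numpy as np

def agatston_slice_score(hu_slice, pixel_area_mm2, slice_thickness_mm=3.0):
    """Simplified per-slice Agatston score: voxels >= 130 HU form the lesion
    area, weighted by a density factor from the peak HU (1: 130-199, 2: 200-299,
    3: 300-399, 4: >= 400), then scaled toward the 3 mm slice convention."""
    mask = hu_slice >= 130
    if not mask.any():
        return 0.0
    peak = hu_slice[mask].max()
    weight = 4 if peak >= 400 else 3 if peak >= 300 else 2 if peak >= 200 else 1
    area_mm2 = mask.sum() * pixel_area_mm2
    return area_mm2 * weight * (slice_thickness_mm / 3.0)
```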
Hand surgery volume and the US economy: is there a statistical correlation?
Gordon, Chad R; Pryor, Landon; Afifi, Ahmed M; Gatherwright, James R; Evans, Peter J; Hendrickson, Mark; Bernard, Steven; Zins, James E
2010-11-01
To the best of our knowledge, there have been no previous studies evaluating the correlation of the US economy and hand surgery volume. Therefore, in light of the current recession, our objective was to study our institution's hand surgery volume over the last 17 years in relation to the nation's economy. A retrospective analysis of our institution's hand surgery volume, as represented by our most common procedure (ie, carpal tunnel release), was performed between January 1992 and October 2008. Liposuction and breast augmentation volumes were chosen to serve as cosmetic plastic surgery comparison groups. Pearson correlation statistics were used to estimate the relationship between the surgical volume and the US economy, as represented by the 3 market indices (Dow Jones, NASDAQ, and S&P500). A combined total of 7884 hand surgery carpal tunnel release (open or endoscopic) patients were identified. There were 1927 (24%) and 5957 (76%) patients within the departments of plastic and orthopedic surgery, respectively. In the plastic surgery department, there was a strong negative (ie, inverse relationship) correlation between hand surgery volume and the economy (P < 0.001). Conversely, the orthopedic department's hand surgery volume demonstrated a positive (ie, parallel) correlation (P < 0.001). The volumes of liposuction and breast augmentation also showed a positive correlation (P < 0.001). To our knowledge, we have demonstrated for the first time an inverse (ie, negative) correlation between hand surgery volumes performed by plastic surgeons in relation to the US economy, as represented by the 3 major market indices. In contrast, orthopedic hand surgery volume and cosmetic surgery show a parallel (ie, positive) correlation. These data suggest that plastic surgeons are increasing their cosmetic surgery-to-reconstructive/hand surgery ratio during strong economic times and vice versa during times of economic slowdown.
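The correlation analysis described above reduces to a standard Pearson test; the quarterly figures in the sketch below are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical quarterly series: carpal tunnel release case counts and a
# market index level over the same quarters (illustrative numbers only).
case_volume = np.array([118, 125, 131, 120, 112, 108, 115, 122])
index_level = np.array([1450, 1520, 1480, 1390, 1280, 1210, 1330, 1410])

r, p = stats.pearsonr(case_volume, index_level)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # r < 0 would mirror the inverse
                                            # relation reported for plastic surgery
```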
Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.
2012-01-01
The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.
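A rough pandas analogue of two TSPROC-style operations, flow-volume accumulation and annual statistics, is sketched below; it illustrates the idea only and does not reproduce TSPROC's scripting language or its PEST interface.

```python
import pandas as pd

def flow_volume_and_stats(discharge_cms):
    """Given a daily streamflow series in m^3/s with a DatetimeIndex, compute
    total flow volume per year plus a few annual statistics of the kind that
    might feed a model-calibration objective function."""
    seconds_per_day = 86_400
    daily_volume = discharge_cms * seconds_per_day          # m^3 per day
    annual_volume = daily_volume.resample("YS").sum()        # yearly volumes
    annual_stats = pd.DataFrame({
        "annual_mean": discharge_cms.resample("YS").mean(),
        "annual_min": discharge_cms.resample("YS").min(),
        "annual_max": discharge_cms.resample("YS").max(),
    })
    return annual_volume, annual_stats
```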
Post-natal growth in the rat pineal gland: a stereological study.
Erbagci, H; Kizilkan, N; Ozbag, D; Erkilic, S; Kervancioglu, P; Canan, S; Gumusburun, E
2012-10-01
The purpose was to observe the changes in the rat pineal gland using stereological techniques during the lactation and post-weaning periods. Thirty Wistar albino rats were studied during different post-natal periods using light microscopy. Pineal gland volume was estimated using the Cavalieri Method. Additionally, the total number of pinealocytes was estimated using the optical fractionator technique. Pineal gland volume displayed statistically significant changes between the lactation and post-weaning periods. A significant increase in pineal gland volume was observed from post-natal day 10 to post-natal day 90. The numerical density of pinealocytes became stabilized during lactation and decreased rapidly after weaning. The total number of pinealocytes, however, continuously increased during the post-natal life of all rats in the study, although this increment was not statistically significant when comparing the lactation and post-weaning periods. The increase in post-natal pineal gland volume may depend on an increase in immunoreactive fibres, capsule thickness, or new synaptic bodies. © 2012 Blackwell Verlag GmbH.
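The Cavalieri estimator used for the gland volume has a simple closed form, sketched below with illustrative numbers that are not taken from the study.

```python
import numpy as np

def cavalieri_volume(section_point_counts, area_per_point_um2, section_spacing_um):
    """Cavalieri estimator: V = t * a(p) * sum(P_i), where P_i is the number of
    grid points hitting the gland on section i, a(p) the area associated with
    one grid point, and t the distance between sampled sections."""
    return section_spacing_um * area_per_point_um2 * np.sum(section_point_counts)

# Illustrative numbers only (not from the study).
v = cavalieri_volume([12, 18, 22, 19, 14], area_per_point_um2=2500.0,
                     section_spacing_um=50.0)
print(f"estimated pineal volume ≈ {v:.0f} µm³")
```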
Toussaint, Renaud; Pride, Steven R
2002-09-01
This is the first of a series of three articles that treats fracture localization as a critical phenomenon. This first article establishes a statistical mechanics based on ensemble averages when fluctuations through time play no role in defining the ensemble. Ensembles are obtained by dividing a huge rock sample into many mesoscopic volumes. Because rocks are a disordered collection of grains in cohesive contact, we expect that once shear strain is applied and cracks begin to arrive in the system, the mesoscopic volumes will have a wide distribution of different crack states. These mesoscopic volumes are the members of our ensembles. We determine the probability of observing a mesoscopic volume to be in a given crack state by maximizing Shannon's measure of the emergent-crack disorder subject to constraints coming from the energy balance of brittle fracture. The laws of thermodynamics, the partition function, and the quantification of temperature are obtained for such cracking systems.
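The maximum-entropy step described above follows the standard constrained-optimization argument; in the sketch below, E_j denotes the energy of crack state j, a notation chosen here rather than taken from the article.

```latex
% Maximize Shannon's measure subject to normalization and a mean-energy
% constraint arising from the energy balance of brittle fracture.
\begin{aligned}
  &\max_{\{p_j\}} \; S = -\sum_j p_j \ln p_j
  \quad \text{s.t.} \quad \sum_j p_j = 1, \qquad \sum_j p_j E_j = \langle E \rangle,\\
  &\Rightarrow \quad p_j = \frac{e^{-\beta E_j}}{Z}, \qquad
  Z = \sum_j e^{-\beta E_j}.
\end{aligned}
```

Here β is the Lagrange multiplier enforcing the energy constraint, playing the role of an inverse temperature for the crack ensemble, and Z is the corresponding partition function.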
ERIC Educational Resources Information Center
Herlihy, Lester B.; Deffenbaugh, Walter S.
1938-01-01
This report presents statistics of city school systems for the school year 1935-36. prior to 1933-34 school statistics for cities included in county unit systems were estimated. Most of these cities are in Florida, Louisiana, Maryland, and West Virginia. Since the method of estimating school statistics for the cities included with the counties in…
Hassanain, Mazen; Zamakhshary, Mohammed; Farhat, Ghada; Al-Badr, Ahmed
2017-04-01
The objective of this study was to assess whether an intervention on process efficiency using the Lean methodology leads to improved utilization of the operating room (OR), as measured by key performance metrics of OR efficiency. A quasi-experimental design was used to test the impact of the intervention by comparing pre-intervention and post-intervention data on five key performance indicators. The ORs of 12 hospitals were selected across regions of the Kingdom of Saudi Arabia (KSA). The participants were patients treated at these hospitals during the study period. The intervention comprised the following: (i) creation of visual dashboards that enable starting the first case on time; (ii) use of computerized surgical list management; (iii) optimization of time allocation; (iv) development of an operating model with policies and procedures for the pre-anesthesia clinic; and (v) creation of a governance structure with policies and procedures for day surgeries. The following were the main outcome measures: on-time start for the first case, room turnover times, percent of overrun cases, average weekly procedure volume and OR utilization. The hospitals exhibited statistically significant improvements in the following performance metrics: on-time start for the first case, room turnover times and percent of overrun cases. A statistically significant difference in OR utilization or average weekly procedure volumes was not detected. The implementation of a Lean-based intervention targeting process efficiency applied in ORs across various KSA hospitals produced encouraging results on some metrics at some sites, suggesting that the approach has the potential to produce significant benefit in the future. Copyright © 2016 John Wiley & Sons, Ltd.
Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L
2016-01-15
Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install and requires minimal user interaction. This method is equally applicable to different types of MR images. Results were evaluated with Dice and Jacquard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes are preserved. A downloadable software package not otherwise available for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them useable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
High call volume at poison control centers: identification and implications for communication
CARAVATI, E. M.; LATIMER, S.; REBLIN, M.; BENNETT, H. K. W.; CUMMINS, M. R.; CROUCH, B. I.; ELLINGTON, L.
2016-01-01
Context High volume surges in health care are uncommon and unpredictable events. Their impact on health system performance and capacity is difficult to study. Objectives To identify time periods that exhibited very busy conditions at a poison control center and to determine whether cases and communication during high volume call periods are different from cases during low volume periods. Methods Call data from a US poison control center over twelve consecutive months was collected via a call logger and an electronic case database (Toxicall®). Variables evaluated for high call volume conditions were: (1) call duration; (2) number of cases; and (3) number of calls per staff member per 30 minute period. Statistical analyses identified peak periods as busier than 99% of all other 30 minute time periods and low volume periods as slower than 70% of all other 30 minute periods. Case and communication characteristics of high volume and low volume calls were compared using logistic regression. Results A total of 65,364 incoming calls occurred over 12 months. One hundred high call volume and 4885 low call volume 30 minute periods were identified. High volume periods were more common between 1500 and 2300 hours and during the winter months. Coded verbal communication data were evaluated for 42 high volume and 296 low volume calls. The mean (standard deviation) call length of these calls during high volume and low volume periods was 3 minutes 27 seconds (1 minute 46 seconds) and 3 minutes 57 seconds (2 minutes 11 seconds), respectively. Regression analyses revealed a trend for fewer overall verbal statements and fewer staff questions during peak periods, but no other significant differences for staff-caller communication behaviors were found. Conclusion Peak activity for poison center call volume can be identified by statistical modeling. Calls during high volume periods were similar to low volume calls. Communication was more concise yet staff was able to maintain a good rapport with callers during busy call periods. This approach allows evaluation of poison exposure call characteristics and communication during high volume periods. PMID:22889059
High call volume at poison control centers: identification and implications for communication.
Caravati, E M; Latimer, S; Reblin, M; Bennett, H K W; Cummins, M R; Crouch, B I; Ellington, L
2012-09-01
High volume surges in health care are uncommon and unpredictable events. Their impact on health system performance and capacity is difficult to study. To identify time periods that exhibited very busy conditions at a poison control center and to determine whether cases and communication during high volume call periods are different from cases during low volume periods. Call data from a US poison control center over twelve consecutive months was collected via a call logger and an electronic case database (Toxicall®). Variables evaluated for high call volume conditions were: (1) call duration; (2) number of cases; and (3) number of calls per staff member per 30 minute period. Statistical analyses identified peak periods as busier than 99% of all other 30 minute time periods and low volume periods as slower than 70% of all other 30 minute periods. Case and communication characteristics of high volume and low volume calls were compared using logistic regression. A total of 65,364 incoming calls occurred over 12 months. One hundred high call volume and 4885 low call volume 30 minute periods were identified. High volume periods were more common between 1500 and 2300 hours and during the winter months. Coded verbal communication data were evaluated for 42 high volume and 296 low volume calls. The mean (standard deviation) call length of these calls during high volume and low volume periods was 3 minutes 27 seconds (1 minute 46 seconds) and 3 minutes 57 seconds (2 minutes 11 seconds), respectively. Regression analyses revealed a trend for fewer overall verbal statements and fewer staff questions during peak periods, but no other significant differences for staff-caller communication behaviors were found. Peak activity for poison center call volume can be identified by statistical modeling. Calls during high volume periods were similar to low volume calls. Communication was more concise yet staff was able to maintain a good rapport with callers during busy call periods. This approach allows evaluation of poison exposure call characteristics and communication during high volume periods.
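The percentile-based definition of high and low volume periods can be sketched as follows, assuming only the call timestamps are available; the study additionally used call duration and calls per staff member, which are omitted here.

```python
import pandas as pd

def classify_call_periods(call_times, high_pct=0.99, low_pct=0.70):
    """Bin incoming-call timestamps into 30-minute periods and flag periods as
    'high volume' (busier than 99% of all periods) or 'low volume' (slower than
    70% of all periods), mirroring the thresholds described in the abstract."""
    s = pd.Series(1, index=pd.DatetimeIndex(call_times))
    per_period = s.resample("30min").sum()
    hi_cut = per_period.quantile(high_pct)
    lo_cut = per_period.quantile(low_pct)
    return pd.DataFrame({
        "calls": per_period,
        "high_volume": per_period > hi_cut,
        "low_volume": per_period < lo_cut,
    })
```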
NASA Astrophysics Data System (ADS)
Cordero-Llana, L.; Selmes, N.; Murray, T.; Scharrer, K.; Booth, A. D.
2012-12-01
Large volumes of water are necessary to propagate cracks to the glacial bed via hydrofractures. Hydrological models have shown that lakes above a critical volume can supply the necessary water for this process, so the ability to measure water depth in lakes remotely is important to study these processes. Previously, water depth has been derived from the optical properties of water using data from high resolution optical satellite images, such as ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), IKONOS and LANDSAT. These studies used water-reflectance models based on the Bouguer-Lambert-Beer law and lacked any estimation of model uncertainties. We propose an optimized model based on Sneed and Hamilton's (2007) approach to estimate water depths in supraglacial lakes and undertake a robust analysis of the errors for the first time. We used atmospherically corrected ASTER and MODIS data as input to the water-reflectance model. Three physical parameters are needed, namely bed albedo, the water attenuation coefficient and the reflectance of optically deep water. These parameters were derived for each wavelength using standard calibrations. As a reference dataset, we obtained lake geometries using ICESat measurements over empty lakes. Differences between modeled and reference depths are used in a minimization model to obtain parameters for the water-reflectance model, yielding optimized lake depth estimates. Our key contribution is the development of a Monte Carlo simulation to run the water-reflectance model, which allows us to quantify the uncertainties in water depth and hence water volume. This robust statistical analysis provides better understanding of the sensitivity of the water-reflectance model to the choice of input parameters, which should contribute to the understanding of the influence of surface-derived melt-water on ice sheet dynamics. Sneed, W.A. and Hamilton, G.S., 2007: Evolution of melt pond volume on the surface of the Greenland Ice Sheet. Geophysical Research Letters, 34, 1-4.
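A minimal sketch of a Sneed and Hamilton style depth retrieval with Monte Carlo propagation of parameter uncertainty; the parameter means and spreads below are illustrative assumptions, not the calibrated values used in the study.

```python
import numpy as np

def lake_depth(reflectance, bed_albedo, r_deep, g):
    """Bouguer-Lambert-Beer depth retrieval in the style of Sneed & Hamilton
    (2007): z = [ln(Ad - R_inf) - ln(R - R_inf)] / g, with Ad the lake-bottom
    albedo, R_inf the reflectance of optically deep water, and g an effective
    attenuation coefficient."""
    return (np.log(bed_albedo - r_deep) - np.log(reflectance - r_deep)) / g

def depth_with_uncertainty(reflectance, n=10_000, seed=0):
    """Monte Carlo propagation of (illustrative) parameter uncertainties into a
    depth estimate; returns the mean depth and its standard deviation."""
    rng = np.random.default_rng(seed)
    ad = rng.normal(0.45, 0.03, n)      # bed albedo
    rinf = rng.normal(0.04, 0.01, n)    # optically deep water reflectance
    g = rng.normal(0.80, 0.10, n)       # attenuation coefficient (1/m)
    z = lake_depth(reflectance, ad, rinf, g)
    z = z[np.isfinite(z)]               # discard draws where R <= R_inf
    return z.mean(), z.std()
```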
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Brett W., E-mail: coxb@mskcc.org; Spratt, Daniel E.; Lovelock, Michael
2012-08-01
Purpose: Spinal stereotactic radiosurgery (SRS) is increasingly used to manage spinal metastases. However, target volume definition varies considerably and no consensus target volume guidelines exist. This study proposes consensus target volume definitions using common scenarios in metastatic spine radiosurgery. Methods and Materials: Seven radiation oncologists and 3 neurological surgeons with spinal radiosurgery expertise independently contoured target and critical normal structures for 10 cases representing common scenarios in metastatic spine radiosurgery. Each set of volumes was imported into the Computational Environment for Radiotherapy Research. Quantitative analysis was performed using an expectation maximization algorithm for Simultaneous Truth and Performance Level Estimation (STAPLE) with kappa statistics calculating agreement between physicians. Optimized confidence level consensus contours were identified using histogram agreement analysis and characterized to create target volume definition guidelines. Results: Mean STAPLE agreement sensitivity and specificity were 0.76 (range, 0.67-0.84) and 0.97 (range, 0.94-0.99), respectively, for gross tumor volume (GTV) and 0.79 (range, 0.66-0.91) and 0.96 (range, 0.92-0.98), respectively, for clinical target volume (CTV). Mean kappa agreement was 0.65 (range, 0.54-0.79) for GTV and 0.64 (range, 0.54-0.82) for CTV (P<.01 for GTV and CTV in all cases). STAPLE histogram agreement analysis identified optimal consensus contours (80% confidence limit). Consensus recommendations include that the CTV should include abnormal marrow signal suspicious for microscopic invasion and an adjacent normal bony expansion to account for subclinical tumor spread in the marrow space. No epidural CTV expansion is recommended without epidural disease, and circumferential CTVs encircling the cord should be used only when the vertebral body, bilateral pedicles/lamina, and spinous process are all involved or there is extensive metastatic disease along the circumference of the epidural space. Conclusions: This report provides consensus guidelines for target volume definition for spinal metastases receiving upfront SRS in common clinical situations.
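The pairwise agreement statistics reported alongside STAPLE can be sketched for two binary contour masks as follows; STAPLE itself, the multi-rater expectation-maximization consensus, is not reproduced here.

```python
import numpy as np

def dice_and_kappa(mask_a, mask_b):
    """Pairwise agreement between two binary contour masks: the Dice overlap
    coefficient and Cohen's kappa (chance-corrected voxel-wise agreement)."""
    a = np.asarray(mask_a, bool).ravel()
    b = np.asarray(mask_b, bool).ravel()
    dice = 2.0 * np.sum(a & b) / (a.sum() + b.sum())
    po = np.mean(a == b)                        # observed agreement
    pa, pb = a.mean(), b.mean()
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    kappa = (po - pe) / (1 - pe)
    return dice, kappa
```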
Martinez-Enriquez, Eduardo; Sun, Mengchan; Velasco-Ocana, Miriam; Birkenfeld, Judith; Pérez-Merino, Pablo; Marcos, Susana
2016-07-01
Measurement of crystalline lens geometry in vivo is critical to optimize performance of state-of-the-art cataract surgery. We used custom-developed quantitative anterior segment optical coherence tomography (OCT) and developed dedicated algorithms to estimate lens volume (VOL), equatorial diameter (DIA), and equatorial plane position (EPP). The method was validated ex vivo in 27 human donor (19-71 years of age) lenses, which were imaged in three-dimensions by OCT. In vivo conditions were simulated assuming that only the information within a given pupil size (PS) was available. A parametric model was used to estimate the whole lens shape from PS-limited data. The accuracy of the estimated lens VOL, DIA, and EPP was evaluated by comparing estimates from the whole lens data and PS-limited data ex vivo. The method was demonstrated in vivo using 2 young eyes during accommodation and 2 cataract eyes. Crystalline lens VOL was estimated within 96% accuracy (average estimation error across lenses ± standard deviation: 9.30 ± 7.49 mm3). Average estimation errors in EPP were below 40 ± 32 μm, and below 0.26 ± 0.22 mm in DIA. Changes in lens VOL with accommodation were not statistically significant (2-way ANOVA, P = 0.35). In young eyes, DIA decreased and EPP increased statistically significantly with accommodation (P < 0.001) by 0.14 mm and 0.13 mm, respectively, on average across subjects. In cataract eyes, VOL = 205.5 mm3, DIA = 9.57 mm, and EPP = 2.15 mm on average. Quantitative OCT with dedicated image processing algorithms allows estimation of human crystalline lens volume, diameter, and equatorial lens position, as validated from ex vivo measurements, where entire lens images are available.
Polygenic risk of Alzheimer disease is associated with early- and late-life processes.
Mormino, Elizabeth C; Sperling, Reisa A; Holmes, Avram J; Buckner, Randy L; De Jager, Philip L; Smoller, Jordan W; Sabuncu, Mert R
2016-08-02
To examine associations between aggregate genetic risk and Alzheimer disease (AD) markers in stages preceding the clinical symptoms of dementia using data from 2 large observational cohort studies. We computed polygenic risk scores (PGRS) using summary statistics from the International Genomics of Alzheimer's Project genome-wide association study of AD. Associations between PGRS and AD markers (cognitive decline, clinical progression, hippocampus volume, and β-amyloid) were assessed within older participants with dementia. Associations between PGRS and hippocampus volume were additionally examined within healthy younger participants (age 18-35 years). Within participants without dementia, elevated PGRS was associated with worse memory (p = 0.002) and smaller hippocampus (p = 0.002) at baseline, as well as greater longitudinal cognitive decline (memory: p = 0.0005, executive function: p = 0.01) and clinical progression (p < 0.00001). High PGRS was associated with AD-like levels of β-amyloid burden as measured with florbetapir PET (p = 0.03) but did not reach statistical significance for CSF β-amyloid (p = 0.11). Within the younger group, higher PGRS was associated with smaller hippocampus volume (p = 0.05). This pattern was evident when examining a PGRS that included many loci below the genome-wide association study (GWAS)-level significance threshold (16,123 single nucleotide polymorphisms), but not when PGRS was restricted to GWAS-level significant loci (18 single nucleotide polymorphisms). Effects related to common genetic risk loci distributed throughout the genome are detectable among individuals without dementia. The influence of this genetic risk may begin in early life and make an individual more susceptible to cognitive impairment in late life. Future refinement of polygenic risk scores may help identify individuals at risk for AD dementia. © 2016 American Academy of Neurology.
Polygenic risk of Alzheimer disease is associated with early- and late-life processes
Sperling, Reisa A.; Holmes, Avram J.; Buckner, Randy L.; De Jager, Philip L.; Smoller, Jordan W.; Sabuncu, Mert R.
2016-01-01
Objective: To examine associations between aggregate genetic risk and Alzheimer disease (AD) markers in stages preceding the clinical symptoms of dementia using data from 2 large observational cohort studies. Methods: We computed polygenic risk scores (PGRS) using summary statistics from the International Genomics of Alzheimer's Project genome-wide association study of AD. Associations between PGRS and AD markers (cognitive decline, clinical progression, hippocampus volume, and β-amyloid) were assessed within older participants with dementia. Associations between PGRS and hippocampus volume were additionally examined within healthy younger participants (age 18–35 years). Results: Within participants without dementia, elevated PGRS was associated with worse memory (p = 0.002) and smaller hippocampus (p = 0.002) at baseline, as well as greater longitudinal cognitive decline (memory: p = 0.0005, executive function: p = 0.01) and clinical progression (p < 0.00001). High PGRS was associated with AD-like levels of β-amyloid burden as measured with florbetapir PET (p = 0.03) but did not reach statistical significance for CSF β-amyloid (p = 0.11). Within the younger group, higher PGRS was associated with smaller hippocampus volume (p = 0.05). This pattern was evident when examining a PGRS that included many loci below the genome-wide association study (GWAS)–level significance threshold (16,123 single nucleotide polymorphisms), but not when PGRS was restricted to GWAS-level significant loci (18 single nucleotide polymorphisms). Conclusions: Effects related to common genetic risk loci distributed throughout the genome are detectable among individuals without dementia. The influence of this genetic risk may begin in early life and make an individual more susceptible to cognitive impairment in late life. Future refinement of polygenic risk scores may help identify individuals at risk for AD dementia. PMID:27385740
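The polygenic risk score construction described in both records is an additive weighted sum, sketched below; whether the 18 genome-wide-significant SNPs or the 16,123 sub-threshold SNPs enter the score is controlled entirely by which genotype columns and weights are supplied.

```python
import numpy as np

def polygenic_risk_score(genotypes, effect_sizes):
    """Standard additive polygenic risk score: weighted sum of risk-allele
    counts (0/1/2 per SNP) with GWAS effect sizes (e.g., log-odds) as weights."""
    g = np.asarray(genotypes, dtype=float)       # subjects x SNPs
    beta = np.asarray(effect_sizes, dtype=float)  # one weight per SNP
    return g @ beta                               # one score per subject
```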
Assessment of Process Capability: the case of Soft Drinks Processing Unit
NASA Astrophysics Data System (ADS)
Sri Yogi, Kottala
2018-03-01
Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help to improve process performance. The main objective of this paper is to assess whether the process of a soft drinks processing unit, a premier brand marketed in India, is capable of producing within specification. A few selected critical parameters in soft drinks processing were considered for this study: concentration of gas volume, concentration of Brix, and crock torque. Relevant statistical parameters were assessed, namely short-term and long-term capability, from a process capability indices perspective. For the assessment we used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis suggested reasons for variation in the process, which were validated using ANOVA; a Taguchi cost function was also fitted and the associated waste was estimated in monetary terms, which the organization can use to improve its process parameters. This research work has substantially benefited the organization in understanding the variation of selected critical parameters for achieving zero rejection.
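The short-term capability indices referred to above can be computed as in the following sketch; using the overall sample standard deviation is a simplification, and the specification limits are whatever the bottling line defines for gas volume, Brix, or crock torque.

```python
import numpy as np

def process_capability(x, lsl, usl):
    """Capability indices for a measured characteristic against its
    specification limits: Cp = (USL - LSL) / (6 * sigma) and
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative carbonation measurements (volumes of CO2) against made-up limits.
cp, cpk = process_capability([3.9, 4.0, 4.1, 3.95, 4.05, 4.02, 3.98], 3.8, 4.2)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```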
Center for Space Power, Texas A and M University
NASA Technical Reports Server (NTRS)
Jones, Ken
1991-01-01
Johnson Controls is a 106-year-old company employing 42,000 people worldwide with $4.7 billion annual sales. Though we are new to the aerospace industry we are a world leader in automobile battery manufacturing, automotive seating, plastic bottling, and facilities environment controls. The battery division produces over 24,000,000 batteries annually under private label for the new car manufacturers and the replacement market. We are entering the aerospace market with the nickel hydrogen battery with the help of NASA's Center for Space Power at Texas A&M. Unlike traditional nickel hydrogen battery manufacturers, we are reaching beyond the space applications to the higher volume markets of aircraft starting and utility load leveling. Though space applications alone will not provide sufficient volume to support the economies of scale and opportunities for statistical process control, these additional terrestrial applications will. For example, nickel hydrogen batteries do not have the environmental problems of nickel cadmium or lead acid and may someday start your car or power your electric vehicle. However you envision the future, keep in mind that no manufacturer moves into a large volume market without fine-tuning their process. The Center for Space Power at Texas A&M is providing in-depth technical analysis of all of the materials and fabricated parts of our battery as well as thermal and mechanical design computer modeling. Several examples of what we are doing with nickel hydrogen chemistry to lead to these production efficiencies are presented.
NASA Astrophysics Data System (ADS)
Zhang, Mi; Guan, Zhidong; Wang, Xiaodong; Du, Shanyi
2017-10-01
Kink banding is a typical failure phenomenon in composites under longitudinal compression. In this paper, theoretical analysis and finite element simulation were conducted to analyze the kink angle as well as the compressive strength of composites. The kink angle was considered to be an important characteristic throughout the longitudinal compression process. Three factors, a plastic matrix, initial fiber misalignment, and rotation due to loading, were considered in the theoretical analysis. Furthermore, the relationship between kink angle and fiber volume fraction was improved and optimized by theoretical derivation. In addition, finite element models that account for stochastic fiber strength and a Drucker-Prager constitutive model for the matrix were built in ABAQUS to analyze the kink band formation process; their predictions corresponded with the experimental results. The simulations show that the loading and failure procedure can be divided into three stages: an elastic stage, a softening stage, and a fiber breakage stage. They also show that the kink band results from fiber misalignment combined with matrix plasticity. Different values of initial fiber misalignment angle, misalignment wavelength, and fiber volume fraction were considered to explore their effects on compressive strength and kink angle. Results show that compressive strength increases as the initial fiber misalignment angle decreases, the misalignment wavelength decreases, and the fiber volume fraction increases, while the kink angle decreases under the same conditions. A statistical orthogonal array was also constructed to rank the influence of these factors; it indicates that the initial fiber misalignment angle has the largest impact on both compressive strength and kink angle.
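As a point of reference for the theoretical part of such analyses, a classical expression for the longitudinal compressive (kinking) strength of a composite with an elastic-perfectly-plastic matrix is the Argon-Budiansky relation, which reproduces the trend reported here (strength falls as initial misalignment grows); it is quoted only as a representative model and is not necessarily the exact formula derived in the paper:

\[ \sigma_c = \frac{\tau_y}{\bar{\varphi}_0 + \gamma_y}, \]

where \(\tau_y\) is the matrix shear yield stress, \(\bar{\varphi}_0\) the initial fiber misalignment angle, and \(\gamma_y\) the matrix shear yield strain.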
Gursoy, Olcay; Memiş, Dilek; Sut, Necdet
2008-01-01
This study aimed to determine the effect of administration of a single-dose proton pump inhibitor (PPI) on gastric intramucosal pH (pHi), gastric juice volume and gastric pH in critically ill patients. This prospective, randomized, double-blind, placebo-controlled study included 75 patients who were divided into five groups that received the following treatment: group C (n = 15), saline 100 mL; group O (n = 15), omeprazole 20 mg; group P (n = 15), pantoprazole 40 mg; group E (n = 15), esomeprazole 20 mg; and group R (n = 15), rabeprazole 20 mg. All treatments were administered nasogastrically in 100 mL of physiological saline. Measurements of gastric pHi, gastric juice volume and gastric pH were obtained immediately before and 2, 4 and 6 hours after administration of treatments. In addition, gastric content was aspirated and its volume was recorded. Initial gastric pHi, gastric juice volume and gastric pH values were not statistically significantly different among the groups (p > 0.05). No statistically significant difference in gastric pHi was seen among the groups before or 2, 4 or 6 hours after saline or PPI administration. At hours 2, 4 and 6, gastric pH in the pantoprazole, esomeprazole and rabeprazole groups increased significantly, whereas gastric juice volume decreased significantly, compared with the omeprazole and placebo groups (p < 0.001). No statistically significant differences were seen between the pantoprazole, esomeprazole and rabeprazole groups. This is the first study to show that single-dose pantoprazole, esomeprazole and rabeprazole are associated with greater gastric pH increase and greater gastric juice volume decrease than omeprazole in critically ill patients. Our study also suggests that PPIs do not affect gastric pHi measurements in critically ill patients and can be administered during pH monitoring.
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
Fragmentary abstract/index text from this issue includes: an investigation of the measurement of frequency-band-average loss factors of structural components for use in the statistical energy analysis method; entries on stiffness and matrix methods (key words: finite element technique, statistical energy analysis, experimental techniques, framed structures, computer programs); and a study in which, in order to further understand the practical application of statistical energy analysis, a two-section plate-like frame structure is analyzed.
Timber resource statistics of the northern interior resource area of California.
Perry Colclasure; Joel Moen; Charles L. Bolsinger
1986-01-01
This report is one of five that provide timber resource statistics for 57 of the 58 counties in California (San Francisco is excluded). This report presents statistics from a 1981-84 inventory of the timber resources of Lassen, Modoc, Shasta, Siskiyou, and Trinity Counties. Tables presented are of forest area and of timber volume, growth, and mortality. Timberland area...
Timber resource statistics of the north coast resource area of California.
J.D. Lloyd; Joel Moen; Charles L. Bolsinger
1986-01-01
This report is one of five that provide timber resource statistics for 57 of the 58 counties in California (San Francisco is excluded). This report presents statistics from a 1981-84 inventory of the timber resources of Del Norte, Humboldt, Mendocino, and Sonoma Counties. Tables presented are of forest area and of timber volume, growth, and mortality. The north coast...
Timber resource statistics for the Tanana inventory unit, Alaska, 1971-75.
Willem W.S. Van Hees
1984-01-01
Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented for the 1971-75 timber inventory of the Tanana unit, Alaska. This report summarizes statistics previously published for the four inventory blocks of the unit: Fairbanks, Kantishna, Upper Tanana, and Wood-Salcha. Timberland area is estimated at 2.19 million...
Giacomini, Guilherme; Miranda, José R.A.; Pavan, Ana Luiza M.; Duarte, Sérgio B.; Ribeiro, Sérgio M.; Pereira, Paulo C.M.; Alves, Allan F.F.; de Oliveira, Marcela; Pina, Diana R.
2015-01-01
Abstract The purpose of this work was to develop a quantitative method for evaluating the pulmonary inflammatory process (PIP) through the computational analysis of chest radiography exams in posteroanterior (PA) and lateral views. The quantification procedure was applied to patients with tuberculosis (TB) as the motivating application. A study of high-resolution computed tomography (HRCT) examinations of patients with TB was developed to establish a relation between the inflammatory process and the signal difference-to-noise ratio (SDNR) measured in the PA projection. A phantom study was used to validate this relation, which was implemented using an algorithm that is able to estimate the volume of the inflammatory region based solely on SDNR values in the chest radiographs of patients. The PIP volumes that were quantified for 30 patients with TB were used for comparisons with direct HRCT analysis for the same patients. The Bland–Altman statistical analyses showed no significant differences between the 2 quantification methods. The linear regression line had a correlation coefficient of R2 = 0.97 and P < 0.001, showing a strong association between the volume that was determined by our evaluation method and the results obtained by direct HRCT scan analysis. Since the diagnosis and follow-up of patients with TB are commonly performed using X-ray exams, the method developed herein can be considered an adequate tool for quantifying the PIP with a lower patient radiation dose and lower institutional cost. Although we used patients with TB for the application of the method, this method may be used for other pulmonary diseases characterized by a PIP. PMID:26131814
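The Bland-Altman comparison used here reduces to a few lines of arithmetic: the bias is the mean of the paired differences between the two volume estimates, and the 95% limits of agreement are bias ± 1.96 SD of those differences. The Python sketch below illustrates the computation on made-up volumes; it does not reproduce the study's data.

import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical PIP volumes (cm3): radiography-based estimate vs. direct HRCT
radiograph = [110, 85, 142, 60, 98]
hrct = [105, 90, 138, 64, 95]
bias, loa = bland_altman(radiograph, hrct)
print(f"bias = {bias:.1f} cm3, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} cm3")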
Chen, Hua-Hsuan; Rosenberg, David R; MacMaster, Frank P; Easter, Philip C; Caetano, Sheila C; Nicoletti, Mark; Hatch, John P; Nery, Fabiano G; Soares, Jair C
2008-12-01
Adults with major depressive disorder (MDD) are reported to have reduced orbitofrontal cortex (OFC) volumes, which could be related to decreased neuronal density. We conducted a study on medication-naïve children with MDD to determine whether abnormalities of OFC are present early in the illness course. Twenty-seven medication-naïve pediatric Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) MDD patients (mean age +/- SD = 14.4 +/- 2.2 years; 10 males) and 26 healthy controls (mean age +/- SD = 14.4 +/- 2.4 years; 12 males) underwent 1.5T magnetic resonance imaging (MRI) with 3D spoiled gradient recalled acquisition. The OFC volumes were compared using analysis of covariance with age, gender, and total brain volume as covariates. There was no significant difference in either total OFC volume or total gray matter OFC volume between MDD patients and healthy controls. Exploratory analysis revealed that patients had unexpectedly larger total right lateral (F = 4.2, df = 1, 48, p = 0.05) and right lateral gray matter (F = 4.6, df = 1, 48, p = 0.04) OFC volumes compared to healthy controls, but this finding was not significant following statistical correction for multiple comparisons. No other OFC subregions showed a significant difference. The lack of OFC volume abnormalities in pediatric MDD patients suggests the abnormalities previously reported for adults may develop later in life as a result of neural cell loss.
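An analysis of covariance of this kind, comparing a regional volume between groups while adjusting for age, gender, and total brain volume, can be written as an ordinary linear model. The sketch below uses statsmodels on a small hypothetical data frame; the column names and values are illustrative assumptions, not the study's data.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: OFC volume (cm3), diagnostic group, and covariates
df = pd.DataFrame({
    "ofc_volume": [22.1, 21.8, 23.0, 22.5, 21.2, 22.9, 23.4, 21.6],
    "group": ["MDD", "MDD", "MDD", "MDD", "HC", "HC", "HC", "HC"],
    "age": [14.2, 15.1, 13.8, 14.9, 14.5, 13.9, 15.0, 14.1],
    "sex": ["M", "F", "M", "F", "M", "F", "M", "F"],
    "tbv": [1180, 1120, 1210, 1150, 1195, 1130, 1220, 1160],
})

# ANCOVA: group effect on OFC volume adjusted for age, sex, and total brain volume
model = smf.ols("ofc_volume ~ C(group) + age + C(sex) + tbv", data=df).fit()
print(model.summary())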
Volumetric Growth of the Liver in the Human Fetus: An Anatomical, Hydrostatic, and Statistical Study
Szpinda, Michał; Paruszewska-Achtel, Monika; Mila-Kierzenkowska, Celestyna; Elminowska-Wenda, Gabriela; Dombek, Małgorzata; Szpinda, Anna; Badura, Mateusz
2015-01-01
Using anatomical, hydrostatic, and statistical methods, liver volumes were assessed in 69 human fetuses of both sexes aged 18–30 weeks. No sex differences were found. The median of liver volume achieved by hydrostatic measurements increased from 6.57 cm3 at 18–21 weeks through 14.36 cm3 at 22–25 weeks to 20.77 cm3 at 26–30 weeks, according to the following regression: y = −26.95 + 1.74 × age ± Z × (−3.15 + 0.27 × age). The median of liver volume calculated indirectly according to the formula liver volume = 0.55 × liver length × liver transverse diameter × liver sagittal diameter increased from 12.41 cm3 at 18–21 weeks through 28.21 cm3 at 22–25 weeks to 49.69 cm3 at 26–30 weeks. There was a strong relationship (r = 0.91, p < 0.001) between the liver volumes achieved by hydrostatic (x) and indirect (y) methods, expressed by y = −0.05 + 2.16x ± 7.26. The liver volume should be calculated as follows liver volume = 0.26 × liver length × liver transverse diameter × liver sagittal diameter. The age-specific liver volumes are of great relevance in the evaluation of the normal hepatic growth and the early diagnosis of fetal micro- and macrosomias. PMID:26413551
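The two formulas quoted in this abstract translate directly into code. The Python sketch below simply restates the published regressions (dimensions in cm, volumes in cm3, age in gestational weeks) and carries no data of its own.

def liver_volume_indirect(length_cm, transverse_cm, sagittal_cm, factor=0.26):
    """Indirect liver volume estimate; the study recommends the 0.26 factor because
    the earlier 0.55 factor systematically overestimates the hydrostatic volume."""
    return factor * length_cm * transverse_cm * sagittal_cm

def liver_volume_median_by_age(age_weeks, z=0.0):
    """Median (z = 0) hydrostatic liver volume by age, per the reported regression
    y = -26.95 + 1.74 * age +/- Z * (-3.15 + 0.27 * age)."""
    return -26.95 + 1.74 * age_weeks + z * (-3.15 + 0.27 * age_weeks)

print(liver_volume_indirect(6.0, 4.5, 3.0))   # hypothetical fetal dimensions, cm3
print(liver_volume_median_by_age(24))         # expected median volume at 24 weeks, cm3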
Rios Piedra, Edgar A; Taira, Ricky K; El-Saden, Suzie; Ellingson, Benjamin M; Bui, Alex A T; Hsu, William
2016-02-01
Brain tumor analysis is moving towards volumetric assessment of magnetic resonance imaging (MRI), providing a more precise description of disease progression to better inform clinical decision-making and treatment planning. While a multitude of segmentation approaches exist, inherent variability in the results of these algorithms may incorrectly indicate changes in tumor volume. In this work, we present a systematic approach to characterize variability in tumor boundaries that utilizes equivalence tests as a means to determine whether a tumor volume has significantly changed over time. To demonstrate these concepts, 32 MRI studies from 8 patients were segmented using four different approaches (statistical classifier, region-based, edge-based, knowledge-based) to generate different regions of interest representing tumor extent. We showed that across all studies, the average Dice coefficient for the superset of the different methods was 0.754 (95% confidence interval 0.701-0.808) when compared to a reference standard. We illustrate how variability obtained by different segmentations can be used to identify significant changes in tumor volume between sequential time points. Our study demonstrates that variability is an inherent part of interpreting tumor segmentation results and should be considered as part of the interpretation process.
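The Dice coefficient reported here measures the overlap between a candidate segmentation and a reference mask as 2|A∩B| / (|A| + |B|). A minimal NumPy sketch on a toy mask (not the study's data):

import numpy as np

def dice(seg, ref):
    """Dice similarity coefficient between two binary masks."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

seg = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]])
ref = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
print(dice(seg, ref))  # 0.857...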
Singh, S; Modi, S; Bagga, D; Kaur, P; Shankar, L R; Khushu, S
2013-03-01
The present study aimed to investigate whether brain morphological differences exist between adult hypothyroid subjects and age-matched controls using voxel-based morphometry (VBM) with diffeomorphic anatomic registration via an exponentiated lie algebra algorithm (DARTEL) approach. High-resolution structural magnetic resonance images were taken in ten healthy controls and ten hypothyroid subjects. The analysis was conducted using statistical parametric mapping. The VBM study revealed a reduction in grey matter volume in the left postcentral gyrus and cerebellum of hypothyroid subjects compared to controls. A significant reduction in white matter volume was also found in the cerebellum, right inferior and middle frontal gyrus, right precentral gyrus, right inferior occipital gyrus and right temporal gyrus of hypothyroid patients compared to healthy controls. Moreover, no meaningful cluster for greater grey or white matter volume was obtained in hypothyroid subjects compared to controls. Our study is the first VBM study of hypothyroidism in an adult population and suggests that, compared to controls, this disorder is associated with differences in brain morphology in areas corresponding to known functional deficits in attention, language, motor speed, visuospatial processing and memory in hypothyroidism. © 2012 British Society for Neuroendocrinology.
Loopless nontrapping invasion-percolation model for fracking.
Norris, J Quinn; Turcotte, Donald L; Rundle, John B
2014-02-01
Recent developments in hydraulic fracturing (fracking) have enabled the recovery of large quantities of natural gas and oil from old, low-permeability shales. These developments include a change from low-volume, high-viscosity fluid injection to high-volume, low-viscosity injection. The injected fluid introduces distributed damage that provides fracture permeability for the extraction of the gas and oil. In order to model this process, we utilize a loopless nontrapping invasion percolation previously introduced to model optimal polymers in a strongly disordered medium and for determining minimum energy spanning trees on a lattice. We performed numerical simulations on a two-dimensional square lattice and find significant differences from other percolation models. Additionally, we find that the growing fracture network satisfies both Horton-Strahler and Tokunaga network statistics. As with other invasion percolation models, our model displays burst dynamics, in which the cluster extends rapidly into a connected region. We introduce an alternative definition of bursts to be a consecutive series of opened bonds whose strengths are all below a specified value. Using this definition of bursts, we find good agreement with a power-law frequency-area distribution. These results are generally consistent with the observed distribution of microseismicity observed during a high-volume frack.
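The growth rule described here, repeatedly invading across the weakest available barrier on the boundary of the cluster, can be sketched in a few dozen lines. The toy Python version below assigns random strengths to sites rather than bonds for brevity and omits the loopless/nontrapping bookkeeping and the burst statistics of the full model; it only illustrates the basic invasion step.

import heapq
import random

def invasion_percolation(n=50, steps=500, seed=0):
    """Toy invasion percolation on an n x n site lattice: grow from the center by
    always invading the frontier site with the smallest random strength."""
    rng = random.Random(seed)
    strength = {(i, j): rng.random() for i in range(n) for j in range(n)}
    start = (n // 2, n // 2)
    invaded = {start}
    frontier = []  # heap of (strength, site)

    def push_neighbors(site):
        i, j = site
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < n and (ni, nj) not in invaded:
                heapq.heappush(frontier, (strength[(ni, nj)], (ni, nj)))

    push_neighbors(start)
    for _ in range(steps):
        while frontier:
            s, site = heapq.heappop(frontier)
            if site not in invaded:
                invaded.add(site)
                push_neighbors(site)
                break
    return invaded

cluster = invasion_percolation()
print(len(cluster), "sites invaded")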
ERIC Educational Resources Information Center
Stanford Research Inst., Menlo Park, CA.
Public Laws 874 and 815 are intended to help relieve the financial burdens imposed on public school districts as a result of the tax-exempt nature of federal property. Where volume 1 presented a broad statistical analysis of the impact of P.L. 874 and 815 in 54 school districts, this volume contains in-depth case studies of the laws' effects in…
ERIC Educational Resources Information Center
Grasso, Janet; Fosburg, Steven
Fifth in a series of seven volumes reporting the design, methodology, and findings of the 4-year National Day Care Home Study (NDCHS), this volume presents a descriptive and statistical analysis of the day care institutions that administer day care systems. These systems, such as Learning Unlimited in Los Angeles and the family day care program of…
Cerebellar Volume in Children With Attention-Deficit Hyperactivity Disorder (ADHD).
Wyciszkiewicz, Aleksandra; Pawlak, Mikolaj A; Krawiec, Krzysztof
2017-02-01
Attention Deficit Hyperactivity Disorder (ADHD) is associated with altered cerebellar volume, and the cerebellum is associated with cognitive performance. However, there are mixed results regarding cerebellar volume in young patients with ADHD. To clarify the size and direction of this effect, we conducted an analysis of a large public database of brain images. The aim of this study was to confirm that cerebellar volume in ADHD is smaller than in control subjects in what is currently the largest publicly available cohort of ADHD subjects. We applied a cross-sectional case-control study design by comparing 286 ADHD patients (61 female) with age- and gender-matched control subjects. Volumetric measurements of the cerebellum were obtained using automated segmentation with FreeSurfer 5.1. Statistical analysis was performed in the R (CRAN) statistical environment. Patients with ADHD had significantly smaller total cerebellar volumes (134.5 ± 17.11 cm³ vs. 138.90 ± 15.32 cm³). The effect was present in both females and males (males 136.9 ± 14.37 cm³ vs. 141.20 ± 14.75 cm³; females 125.7 ± 12.34 cm³ vs. 131.20 ± 15.03 cm³). Age was positively and significantly associated with the cerebellar volumes. These results indicate either delayed or disrupted cerebellar development, possibly contributing to ADHD pathophysiology.
Impact of Infarct Size on Blood Pressure in Young Patients with Acute Stroke.
Bonardo, Pablo; Pantiú, Fátima; Ferraro, Martín; Chertcoff, Anibal; Bandeo, Lucrecia; Cejas, Luciana León; Pacha, Sol; Roca, Claudia Uribe; Rugilo, Carlos; Pardal, Manuel Maria Fernández; Reisin, Ricardo
2018-06-01
Hypertension can be found in up to 80% of patients with acute stroke. Many factors have been related to this phenomenon such as age, history of hypertension, and stroke severity. The aim of our study was to determine the relationship between infarct volume and blood pressure, at admission, in young patients with acute ischemic stroke. Patients younger than 55 years old admitted within 24 hours of ischemic stroke were included. Socio-demographic variables, systolic blood pressure, diastolic blood pressure, and infarct volume at admission were assessed. Statistical analysis: mean and SEM for quantitative variables, percentages for qualitative, and Spearman correlations ( p value < 0.05 was considered statistically significant). Twenty-two patients (12 men), mean age: 44.64 ± 1.62 years. The most frequent vascular risk factors were: hypertension, smoking, and overweight (40.9%). Mean systolic and diastolic blood pressure on admission were: 143.27 ± 6.57 mmHg and 85.14 ± 3.62 mmHg, respectively. Infarct volume: 11.55 ± 4.74 ml. Spearman correlations: systolic blood pressure and infarct volume: p = 0.15 r : -0.317; diastolic blood pressure and infarct volume: p = 0.738 r: -0.76. In our series of young patients with acute ischemic stroke, large infarct volume was not associated with high blood pressure at admission.
Forest statistics for Rhode Island--1972 and 1985
David R. Dickson; Carol L. McAfee; Carol L. McAfee
1988-01-01
A statistical report on the third forest survey of Rhode Island (1984). Findings are displayed in 77 tables containing estimates of forest area, numbers of trees, timber volume, tree biomass, and timber products output. Data are presented at two levels: state and county.
Forest statistics for New Hampshire; 1983 and 1997
Thomas S. Frieswyk; Richard H. Widmann; Richard H. Widmann
2000-01-01
A statistical report on the fifth forest inventory of New Hampshire (1996-1998). Findings are displayed in 86 tables containing estimates of forest area, numbers of trees, timber volume, growth, change, and biomass. Data are presented at three levels: state, county, and region.
Forest statistics for Massachusetts: 1985 and 1998
Carol L. Alerich; Carol L. Alerich
2000-01-01
A statistical report on the fourth forest inventory of Massachusetts (1997-1998). Findings are displayed in 67 tables containing estimates of forest area, numbers of trees, wildlife habitat, timber volume, growth, change, and biomass. Data are presented at two levels: state and county.
Forest Statistics for Kentucky - 1975 and 1988
Carol L. Alerich
1990-01-01
A statistical report on the fourth forest survey of Kentucky (1988). Findings are displayed in 204 tables containing estimates of forest area, number of trees, timber volume, tree biomass, and timber products output. Data are presented at three levels: state, geographic unit, and county.
Forest statistics for Connecticut: 1985 and 1998
Carol L. Alerich; Carol L. Alerich
2000-01-01
A statistical report on the fourth forest inventory of Connecticut (1997-1998). Findings are displayed in 67 tables containing estimates of forest area, numbers of trees, wildlife habitat, timber volume, growth, change, and biomass. Data are presented at two levels: state and county.
Forest statistics for Maryland--1976 and 1986
Thomas S. Frieswyk; Dawn M. DiGiovanni; Dawn M. DiGiovanni
1988-01-01
A statistical report on the fourth forest survey of Maryland (1986). Findings are displayed in 115 tables containing estimates of forest area, numbers of trees, timber volume, tree biomass, and timber products output. Data are presented at three levels: state, geographic unit, and county.
Timber resource of Missouri's Prairie, 1972.
Jerold T. Hahn; Alexander Vasilevsky
1975-01-01
The third timber inventory of Missouri's Prairie Forest Survey Unit shows substantial declines in both growing-stock and sawtimber volumes between 1959 and 1972. Commercial forest area declined by one-fifth. Presents highlights and statistics on forest area and timber volume, growth, mortality, ownership, and use in 1972.
Timber resource of Minnesota's Prairie unit, 1977.
Jerold T. Hahn; W. Brad Smith
1980-01-01
The fourth inventory of Minnesota's Prairie Unit shows that although commercial forest area decreased 31.7% between 1962 and 1977, growing-stock volume increased 22%. This report gives statistical highlights and contains detailed tables of forest area as well as timber volume, growth, mortality, ownership, and use.
Iowa's forest resources, 1974.
John S. Jr. Spencer; Pamela J. Jakes
1980-01-01
The second inventory of Iowa's forest resources shows big declines in commercial forest area and in growing-stock and sawtimber volumes between 1954 and 1974. Presented are text and statistics on forest area and timber volume, growth, mortality, ownership, stocking, future timber supply, timber use, forest management opportunities, and nontimber resources.
Veterinary Specialist, 1-2. Military Curriculum Materials for Vocational and Technical Education.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. National Center for Research in Vocational Education.
These military-developed curriculum materials consist of five volumes of text information, student workbooks, and supplements for use in training veterinary specialists. Covered in the individual volumes are the following topics: the veterinary airman, administration, and statistical procedures; veterinary microbiology, consumer-level quality…
Timber Resource of Wisconsin's Southwest Survey Unit, 1983.
Gerhard K. Raile
1985-01-01
The timber resource of the Southwest Wisconsin Survey Unit increased 29% in commercial forest area and increased 52% in growing-stock volume between 1968 and 1983. Highlights and statistics from the fourth inventory of this unit are presented for area, volume, growth, mortality, removals, utilization, and biomass.
A systematic review of the impact of center volume in dialysis.
Pieper, Dawid; Mathes, Tim; Marshall, Mark Roger
2015-12-22
A significant relationship exists between the volume of surgical procedures that a given center performs and subsequent outcomes. It seems plausible that such a volume-outcome relationship is also present in dialysis. MEDLINE and EMBASE were searched in November 2014 for non-experimental studies evaluating the association between center volume and patient outcomes [mortality, morbidity, peritonitis, switch to hemodialysis (HD) or any other treatment], without language restrictions or other limits. Selection of relevant studies, data extraction and critical appraisal were performed by two independent reviewers. We did not perform meta-analysis due to clinical and methodological heterogeneity (e.g. different volume categories). 16 studies met our inclusion criteria. Most studies were performed in the US. The study quality ranged from fair to good. Only a few items were judged to have a high risk of bias, while many items were judged to have an unclear risk of bias due to insufficient reporting. All 10 studies that analyzed peritoneal dialysis (PD) technique survival by modeling switch to HD or any other treatment as an outcome showed a statistically significant effect. The relative effect measures ranged from 0.25 to 0.94 (median 0.73) in favor of high volume centers. All nine studies indicated a lower mortality for PD in high volume centers, but only one study was statistically significant. This systematic review supports a volume-outcome relationship in peritoneal dialysis with respect to switch to HD or any other treatment. An effect on mortality is probably present in HD. Further research is needed to identify and understand the associations of center volume that are causally related to patient benefit.
Edema is not a reliable diagnostic sign to exclude small brain metastases.
Schneider, Tanja; Kuhne, Jan Felix; Bittrich, Paul; Schroeder, Julian; Magnus, Tim; Mohme, Malte; Grosser, Malte; Schoen, Gerhard; Fiehler, Jens; Siemonsen, Susanne
2017-01-01
No prior systematic study on the extent of vasogenic edema (VE) in patients with brain metastases (BM) exists. Here, we aim to determine 1) the general volumetric relationship between BM and VE, 2) a threshold diameter above which a BM shows VE, and 3) the influence of the primary tumor and location of the BM in order to improve diagnostic processes and understanding of edema formation. This single center, retrospective study includes 173 untreated patients with histologically proven BM. Semi-manual segmentation of 1416 BM on contrast-enhanced T1-weighted images and of 865 VE on fluid-attenuated inversion recovery/T2-weighted images was conducted. Statistical analyses were performed using a paired-samples t-test, linear regression/generalized mixed-effects model, and receiver-operating characteristic (ROC) curve controlling for the possible effect of non-uniformly distributed metastases among patients. For BM with non-confluent edema (n = 545), there was a statistically significant positive correlation between the volumes of the BM and the VE (P < 0.001). The optimal threshold for edema formation was a diameter of 9.4 mm for all BM. The primary tumor, as an interaction term in multivariate analysis, had a significant influence on VE formation, whereas location did not. Hence, VE development depends on the volume of the underlying BM and the site of the primary neoplasm, but not on the location of the BM.
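A threshold of the kind reported here (9.4 mm) is typically read off the ROC curve by maximizing Youden's J (sensitivity + specificity - 1). The scikit-learn sketch below shows that calculation on made-up diameters and edema labels; it does not reproduce the study's data or its mixed-effects adjustment.

import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical BM diameters (mm) and whether each metastasis shows vasogenic edema
diameter = np.array([3.1, 4.8, 6.0, 7.5, 8.9, 9.6, 10.2, 12.4, 15.0, 18.3])
has_edema = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

fpr, tpr, thresholds = roc_curve(has_edema, diameter)
best_cutoff = thresholds[np.argmax(tpr - fpr)]   # maximize Youden's J
print(f"optimal diameter cut-off ~ {best_cutoff:.1f} mm")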
Chen, Philip Kuo-Ting; Por, Yong-Chen; Liou, Eric Jein-Wein; Chang, Frank Chun-Shin
2011-07-01
To assess the results of maxillary distraction osteogenesis with the Rigid External Distraction System using three-dimensional computed tomography scan volume-rendered images with respect to stability and facial growth at three time frames: preoperative (T0), 1-year postoperative (T1), and 5-years postoperative (T2). Retrospective analysis. Tertiary. A total of 12 patients with severe cleft maxillary hypoplasia were treated between June 30, 1997, and July 15, 1998. The mean age at surgery was 11 years 1 month. Le Fort I maxillary distraction osteogenesis. Distraction was started 2 to 5 days postsurgery at a rate of 1 mm per day. The consolidation period was 3 months. No face mask was used. A paired t test was used for statistical analysis. Overjet, ANB, and SNA and maxillary, pterygoid, and mandibular volumes. From T0 to T1, there were statistically significant increments of overjet, ANB, and SNA and maxillary, pterygoid, and mandibular volumes. The T1 to T2 period demonstrated a reduction of overjet (30.07%) and ANB (54.42%). The maxilla showed a stable SNA and a small but statistically significant advancement of the ANS point. There was a significant increase in the mandibular volume. However, there was no significant change in the maxillary and pterygoid volumes. Maxillary distraction osteogenesis demonstrated linear and volumetric maxillary growth during the distraction phase without clinically significant continued growth thereafter. Overcorrection is required to take into account recurrence of midface retrusion over the long term.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H
2014-06-15
Purpose: To develop a 3D dictionary learning based statistical reconstruction algorithm on graphic processing units (GPU), to improve the quality of low-dose cone beam CT (CBCT) imaging with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3x3x3 voxels was trained from a high quality volume image. During reconstruction, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find a sparse representation on this dictionary basis of each patch in the reconstructed image, in order to regularize the image quality. To accelerate the time-consuming sparse coding in the 3D case, we implemented our algorithm in a parallel fashion by taking advantage of the tremendous computational power of GPU. Evaluations are performed based on a head-neck patient case. FDK reconstruction with the full dataset of 364 projections is used as the reference. We compared the proposed 3D dictionary learning based method with a tight frame (TF) based one using a subset of 121 projections. The image qualities under different resolutions in z-direction, with or without statistical weighting, are also studied. Results: Compared to the TF-based CBCT reconstruction, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures, to remove more streaking artifacts, and is less susceptible to blocky artifacts. It is also observed that the statistical reconstruction approach is sensitive to inconsistency between the forward and backward projection operations in parallel computing. Using a high spatial resolution along the z direction helps improve the algorithm's robustness. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to sense the structural information while suppressing noise, and hence to achieve high quality reconstruction. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application. A high z-resolution is preferred to stabilize statistical iterative reconstruction. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), the Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011), and the China Scholarship Council.
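The sparse-coding step described here, representing each 3x3x3 patch as a sparse combination of dictionary atoms via orthogonal matching pursuit, can be illustrated with scikit-learn. The sketch below uses a random dictionary and a random patch purely to show the mechanics; the actual work uses a learned 256-atom dictionary, a Cholesky-based OMP, and a GPU implementation embedded in an iterative CBCT reconstruction loop.

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

# Toy dictionary: 256 atoms, each a flattened 3x3x3 patch (27 voxels), unit-normalized columns
D = rng.normal(size=(27, 256))
D /= np.linalg.norm(D, axis=0)

patch = rng.normal(size=27)  # a toy "patch" to encode

# Sparse representation with at most 5 nonzero coefficients
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5)
omp.fit(D, patch)
code = omp.coef_
approx = D @ code + omp.intercept_

print("nonzero atoms:", np.flatnonzero(code))
print("relative error:", np.linalg.norm(patch - approx) / np.linalg.norm(patch))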
Matsushima, Takashi; Blumenfeld, Raphael
2017-03-01
The microstructural organization of a granular system is the most important determinant of its macroscopic behavior. Here we identify the fundamental factors that determine the statistics of such microstructures, using numerical experiments to gain a general understanding. The experiments consist of preparing and compacting isotropically two-dimensional granular assemblies of polydisperse frictional disks and analyzing the emergent statistical properties of quadrons-the basic structural elements of granular solids. The focus on quadrons is because the statistics of their volumes have been found to display intriguing universal-like features [T. Matsushima and R. Blumenfeld, Phys. Rev. Lett. 112, 098003 (2014)PRLTAO0031-900710.1103/PhysRevLett.112.098003]. The dependence of the structures and of the packing fraction on the intergranular friction and the initial state is analyzed, and a number of significant results are found. (i) An analytical formula is derived for the mean quadron volume in terms of three macroscopic quantities: the mean coordination number, the packing fraction, and the rattlers fraction. (ii) We derive a unique, initial-state-independent relation between the mean coordination number and the rattler-free packing fraction. The relation is supported numerically for a range of different systems. (iii) We collapse the quadron volume distributions from all systems onto one curve, and we verify that they all have an exponential tail. (iv) The nature of the quadron volume distribution is investigated by decomposition into conditional distributions of volumes given the cell order, and we find that each of these also collapses onto a single curve. (v) We find that the mean quadron volume decreases with increasing intergranular friction coefficients, an effect that is prominent in high-order cells. We argue that this phenomenon is due to an increased probability of stable irregularly shaped cells, and we test this using a herewith developed free cell analytical model. We conclude that, in principle, the microstructural characteristics are governed mainly by the packing procedure, while the effects of intergranular friction and initial states are details that can be scaled away. However, mechanical stability constraints suppress slightly the occurrence of small quadron volumes in cells of order ≥6, and the magnitude of this effect does depend on friction. We quantify in detail this dependence and the deviation it causes from an exact collapse for these cells. (vi) We argue that our results support strongly the view that ensemble granular statistical mechanics does not satisfy the uniform measure assumption of conventional statistical mechanics. Results (i)-(iv) have been reported in the aforementioned reference, and they are reviewed and elaborated on here.
Enhancement of MS Signal Processing For Improved Cancer Biomarker Discovery
NASA Astrophysics Data System (ADS)
Si, Qian
Technological advances in proteomics have shown great potential in detecting cancer at the earliest stages. One approach is to use time-of-flight mass spectrometry to identify biomarkers, or early disease indicators related to the cancer. Pattern analysis of time-of-flight mass spectra from blood and tissue samples gives great hope for the identification of potential biomarkers among the complex mixture of biological and chemical samples for early cancer detection. One of the key issues is the pre-processing of raw mass spectra. A number of challenges need to be addressed: unknown noise characteristics associated with the large volume of data, high variability in the mass spectrometry measurements, a poorly understood signal background, and so on. This dissertation focuses on developing statistical algorithms and creating data mining tools for computationally improved signal processing of mass spectrometry data. I introduce an improved, accurate estimate of the noise model and a semi-supervised method of mass spectrum data processing that requires little prior knowledge about the data.
NASA Astrophysics Data System (ADS)
Drossel, Welf-Guntram; Schubert, Andreas; Putz, Matthias; Koriath, Hans-Joachim; Wittstock, Volker; Hensel, Sebastian; Pierer, Alexander; Müller, Benedikt; Schmidt, Marek
2018-01-01
The joining-by-forming technique allows the structural integration of piezoceramic fibers into locally microstructured metal sheets without any elastic interlayers. High-volume production of the joining partners results in statistical deviations from the nominal dimensions. A numerical simulation of geometric process sensitivity shows that these deviations have a highly significant influence on the resulting fiber stresses after the joining-by-forming operation and demonstrates the necessity of a monitoring concept. On this basis, the electromechanical behavior of piezoceramic array transducers is investigated experimentally before, during and after the joining process. The piezoceramic array transducer consists of an arrangement of five electrically interconnected piezoceramic fibers. The findings show that the impedance spectrum depends on the fiber stresses and can be used for in-process monitoring during the joining process. Based on the impedance values, the preload state of the interconnected piezoceramic fibers can be specifically controlled and fiber overload can be avoided.
Twitter-Based Analysis of the Dynamics of Collective Attention to Political Parties
Eom, Young-Ho; Puliga, Michelangelo; Smailović, Jasmina; Mozetič, Igor; Caldarelli, Guido
2015-01-01
Large-scale data from social media have a significant potential to describe complex phenomena in the real world and to anticipate collective behaviors such as information spreading and social trends. One specific case of study is represented by the collective attention to the action of political parties. Not surprisingly, researchers and stakeholders tried to correlate parties' presence on social media with their performances in elections. Despite the many efforts, results are still inconclusive since this kind of data is often very noisy and significant signals could be covered by (largely unknown) statistical fluctuations. In this paper we consider the number of tweets (tweet volume) of a party as a proxy of collective attention to the party, identify the dynamics of the volume, and show that this quantity has some information on the election outcome. We find that the distribution of the tweet volume for each party follows a log-normal distribution with a positive autocorrelation of the volume over short terms, which indicates the volume has large fluctuations of the log-normal distribution yet with a short-term tendency. Furthermore, by measuring the ratio of two consecutive daily tweet volumes, we find that the evolution of the daily volume of a party can be described by means of a geometric Brownian motion (i.e., the logarithm of the volume moves randomly with a trend). Finally, we determine the optimal period of averaging tweet volume for reducing fluctuations and extracting short-term tendencies. We conclude that the tweet volume is a good indicator of parties' success in the elections when considered over an optimal time window. Our study identifies the statistical nature of collective attention to political issues and sheds light on how to model the dynamics of collective attention in social media. PMID:26161795
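The geometric Brownian motion description given here, in which the logarithm of the daily volume performs a random walk with drift, is straightforward to simulate. The Python sketch below generates a synthetic daily tweet-volume series under that model; the parameters are arbitrary and purely illustrative.

import numpy as np

def simulate_tweet_volume(v0=1000.0, mu=0.01, sigma=0.25, days=60, seed=1):
    """Geometric Brownian motion: log-volume takes Gaussian steps with drift mu and volatility sigma."""
    rng = np.random.default_rng(seed)
    steps = rng.normal(loc=mu - 0.5 * sigma**2, scale=sigma, size=days)
    return v0 * np.exp(np.cumsum(steps))

volume = simulate_tweet_volume()
ratios = volume[1:] / volume[:-1]   # consecutive daily ratios, log-normally distributed under the model
print(volume[:5].round(1))
print("mean log-ratio:", np.log(ratios).mean().round(4))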
Sone, Daichi; Sato, Noriko; Kimura, Yukio; Watanabe, Yutaka; Okazaki, Mitsutoshi; Matsuda, Hiroshi
2018-06-01
Although epilepsy in the elderly has attracted attention recently, there are few systematic studies of neuroimaging in such patients. In this study, we used structural MRI and diffusion tensor imaging (DTI) to investigate the morphological and microstructural features of the brain in late-onset temporal lobe epilepsy (TLE). We recruited patients with TLE and an age of onset > 50 years (late-TLE group) and age- and sex-matched healthy volunteers (control group). 3-Tesla MRI scans, including 3D T1-weighted images and 15-direction DTI, showed normal findings on visual assessment in both groups. We used Statistical Parametric Mapping 12 (SPM12) for gray and white matter structural normalization and comparison and used Tract-Based Spatial Statistics (TBSS) for fractional anisotropy and mean diffusivity comparisons of DTI. In both methods, p < 0.05 (family-wise error) was considered statistically significant. In total, 30 patients with late-onset TLE (mean ± SD age, 66.8 ± 8.4; mean ± SD age of onset, 63.0 ± 7.6 years) and 40 healthy controls (mean ± SD age, 66.6 ± 8.5 years) were enrolled. The late-onset TLE group showed significant gray matter volume increases in the bilateral amygdala and anterior hippocampus and significantly reduced mean diffusivity in the left temporofrontal lobe, internal capsule, and brainstem. No significant changes were evident in white matter volume or fractional anisotropy. Our findings may reflect some characteristics or mechanisms of cryptogenic TLE in the elderly, such as inflammatory processes.
A computerized MRI biomarker quantification scheme for a canine model of Duchenne muscular dystrophy
Wang, Jiahui; Fan, Zheng; Vandenborne, Krista; Walter, Glenn; Shiloh-Malawsky, Yael; An, Hongyu; Kornegay, Joe N.; Styner, Martin A.
2015-01-01
Purpose Golden retriever muscular dystrophy (GRMD) is a widely used canine model of Duchenne muscular dystrophy (DMD). Recent studies have shown that magnetic resonance imaging (MRI) can be used to non-invasively detect consistent changes in both DMD and GRMD. In this paper, we propose a semi-automated system to quantify MRI biomarkers of GRMD. Methods Our system was applied to a database of 45 MRI scans from 8 normal and 10 GRMD dogs in a longitudinal natural history study. We first segmented six proximal pelvic limb muscles using two competing schemes: 1) standard, limited muscle range segmentation and 2) semi-automatic full muscle segmentation. We then performed pre-processing, including: intensity inhomogeneity correction, spatial registration of different image sequences, intensity calibration of T2-weighted (T2w) and T2-weighted fat suppressed (T2fs) images, and calculation of MRI biomarker maps. Finally, for each of the segmented muscles, we automatically measured MRI biomarkers of muscle volume and intensity statistics over MRI biomarker maps, and statistical image texture features. Results The muscle volume and the mean intensities in T2 value, fat, and water maps showed group differences between normal and GRMD dogs. For the statistical texture biomarkers, both the histogram and run-length matrix features showed obvious group differences between normal and GRMD dogs. The full muscle segmentation shows significantly less error and variability in the proposed biomarkers when compared to the standard, limited muscle range segmentation. Conclusion The experimental results demonstrated that this quantification tool can reliably quantify MRI biomarkers in GRMD dogs, suggesting that it would also be useful for quantifying disease progression and measuring therapeutic effect in DMD patients. PMID:23299128
Simplified Calculation of the Electrical Conductivity of Composites with Carbon Nanotubes
NASA Astrophysics Data System (ADS)
Ivanov, S. G.; Aniskevich, A.; Kulakov, V.
2018-03-01
The electrical conductivity of two groups of polymer nanocomposites filled with the same NC7000 carbon nanotubes (CNTs) beyond the percolation threshold is described with the help of simple formulas. Different manufacturing processes of the nanocomposites led to different CNT network structures and, as a consequence, their electrical conductivities, at the same CNT volume content, differed by two orders of magnitude. The relation between the electrical conductivity and the volume content of CNTs in the first group of composites (with the higher electrical conductivity) is described assuming that the CNT network structure is close to a statistically homogeneous one. The formula for this case, derived on the basis of a self-consistent model, includes only two parameters: the effective longitudinal electrical conductivity of the CNTs and the percolation threshold (the critical value of CNT volume content). These parameters were determined from two experimental points of electrical conductivity as a function of the volume fraction of CNTs. The second group of nanocomposites had a pronounced agglomerative structure, which was confirmed by microscopy data. To describe the low electrical conductivity of this group of nanocomposites, a formula based on known models of micromechanics is proposed. Two parameters of this formula were determined from experimental data of the first group, and the other two from data of the second group of nanocomposites. A comparison of calculated and experimental relations confirmed the practical usefulness of the approach described.
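For orientation, two-parameter descriptions of composite conductivity above the percolation threshold are commonly written as a scaling law in the excess filler content. A generic example of that form is shown below; it is a standard relation of the two-parameter type described here, not necessarily the exact formula derived in the paper:

\[ \sigma(\varphi) = \sigma_{\mathrm{CNT}} \left( \frac{\varphi - \varphi_c}{1 - \varphi_c} \right)^{t}, \qquad \varphi > \varphi_c, \]

where \(\sigma_{\mathrm{CNT}}\) is the effective longitudinal conductivity of the nanotubes, \(\varphi_c\) the percolation threshold (critical volume content), and \(t\) an exponent that is often fixed at a universal value, leaving the two free parameters named in the abstract.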
Forest Statistics for Minnesota's Northern Pine Unit.
Pat Murray
1991-01-01
The fifth inventory of Minnesota's Northern Pine Unit reports 11.1 million acres of land, of which 6.3 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Forest statistics for Minnesota's Aspen-Birch Unit.
Neal P. Kingsley
1991-01-01
The fifth inventory of Minnesota's Aspen-Birch Unit reports 8.7 million acres of land, of which 7.4 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Forest statistics for Rhode Island: 1985 and 1998
Carol L. Alerich; Carol L. Alerich
2000-01-01
A statistical report on the fourth forest inventory of Rhode Island (1997-1998). Findings are displayed in 67 tables containing estimates of forest area, numbers of trees, wildlife habitat, timber volume, growth, change, and biomass. Data are presented at two levels: state and county.
Forest statistics for Maryland: 1986 and 1999
Thomas S. Frieswyk
2001-01-01
A statistical report on the fifth forest inventory of Maryland (1998-1999). Findings are displayed in 109 tables containing estimates of forest area, numbers of trees, wildlife habitat, timber volume, growth, change, and biomass. Data are presented at three levels: state, geographic unit and county.
The Shock and Vibration Digest. Volume 13. Number 7
1981-07-01
Fragmentary abstract/index text from this issue includes: a Presidential Address, "A Structural Dynamicist Looks at Statistical Energy Analysis," by Richards (ISVR, University of Southampton); an entry attributed to Professor B.L. (truncated); and material on excitation and on random and sine-sweep mechanical excitation, in which test data were used to assess prediction methods, in particular a statistical energy analysis method.
Forest statistics for West Virginia--1975 and 1989
Dawn M. Di Giovanni; Dawn M. Di Giovanni
1990-01-01
A statistical report on the fourth forest survey of West Virginia (1989). Findings are displayed in 119 tables containing estimates of forest area, number of trees, timber volume, tree biomass, and timber products output. Data are presented at three levels: state, geographic unit, and county.
Liu, Zeyu; Su, Zhetong; Yang, Ming; Zou, Wenquan
2010-10-01
The aim was to screen the factors that significantly affect indirubin generation in the process of preparing indigo naturalis, to optimize their level combination, and to determine the optimum process for indirubin generation. Using the concentration of indirubin (mg x g(-1)) generated from fresh leaf as the index, and Plackett-Burman design together with Box-Behnken response surface analysis as the statistical methods, we screened the significantly influencing factors and identified the optimal level combination. The optimized soaking and indirubin-forming process for preparing indigo naturalis was determined as follows: wax not removed before immersion, immersion pH 7, solvent volume-to-leaf weight ratio (mL:g) of 15, soaking not protected from light, soaking time 48 h, temperature 60 degrees C, ventilation time 180 min, and addition of ammonia water to adjust the pH to 10.5. The soaking and indirubin-forming step in preparing indigo naturalis was thus optimized systematically. The work clarifies how the various factors affect the active ingredient indirubin, makes controlled industrial production of indigo naturalis more realistic, and lays a foundation for understanding the processing principle of indigo naturalis.
New Trends in Mathematics Teaching, Volume III.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Paris (France).
Each of the ten chapters in this volume is intended to present an objective analysis of the trends of some important subtopic in mathematics education and each includes a bibliography for fuller study. The chapters cover primary school mathematics, algebra, geometry, probability and statistics, analysis, logic, applications of mathematics, methods…
Timber resource of Minnesota's Central Hardwood Unit, 1977.
Alexander Vasilevsky; Ronald L. Hackett
1980-01-01
The fourth inventory of Minnesota's Central Hardwood Unit shows large gains in growing-stock and sawtimber volumes but a 17% decline in commercial forest area between 1962 and 1977. This report gives statistical highlights and contains detailed tables of forest area as well as timber volume, growth, mortality, ownership, and use.
Timber resource of Michigan's Southern Lower Peninsula Unit, 1980.
Jerold T. Hahn
1982-01-01
The fourth inventory of the timber resource of Michigan's Southern Lower Peninsula Survey Unit shows a 12% decline in commercial forest area and a 26% gain in growing-stock volume between 1966 and 1980. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Timber resource of Wisconsin's Southeast Survey Unit, 1983.
John S. Jr. Spencer
1985-01-01
The fourth inventory of Wisconsin's timber resource shows that commercial forest area in the Southeast Survey Unit increased from 904 to 909 thousand acres between 1968 and 1983. During the same period growing-stock volume increased 37%. Highlights and statistics are presented on area, volume, growth, mortality, and removals.
Timber resource of Michigan's Western Upper Peninsula Unit, 1980.
John S. Jr. Spencer
1982-01-01
The fourth inventory of the timber resource of Michigan's Western Upper Peninsula Survey Unit shows an 8% decline in commercial forest area and a 22% gain in growing-stock volume between 1966 and 1980. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Timber resource of Minnesota's Aspen-Birch Unit, 1977.
John S. Jr. Spencer; Arnold J. Ostrom
1979-01-01
The fourth inventory of Minnesota's Aspen-Birch Unit shows solid gains in growing-stock and sawtimber volumes between 1962 and 1977, but a 13% decline in commercial forest area. This report gives statistical highlights and contains detailed tables of forest area a well as timber volume, growth, mortality, ownership, and use.
METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 4: STATISTICAL METHODOLOGY
The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...
The Condition of Education 1991. Volume 2: Postsecondary Education.
ERIC Educational Resources Information Center
Alsalam, Nabeel; Rogers, Gayle Thompson
This volume contains 30 indicators that collectively describe the condition of postsecondary education from a variety of perspectives. The indicators have been derived from studies conducted by the Center for Education Statistics and from other surveys conducted both within and outside the Federal Government. Indicators have been grouped under the…
Timber resources of Michigan's Eastern Upper Peninsula, 1980.
W. Brad Smith
1982-01-01
The fourth inventory of the timber resource of Michigan's Eastern Upper Peninsula Survey Unit shows a 9% decline in commercial forest area and a 19% gain in growing-stock volume between 1966 and 1980. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Timber Resource of Wisconsin's Northeast Survey Unit, 1983.
Mark H. Hansen
1984-01-01
The timber resource of the Northeast Wisconsin Survey Unit declined 5.7% in commercial forest area and increased 23% in growing-stock volume between 1968 and 1983. Highlights and statistics from the fourth inventory of this unit are presented for area, volume, growth, mortality, removals, utilization, and biomass.
Timber resource of Wisconsin's Central Survey Unit, 1983.
Jerold T. Hahn
1985-01-01
The timber resource of the Central Wisconsin Survey Unit increased 4.2% in commercial forest area and increased 75% in growing-stock volume between 1968 and 1983. Highlights and statistics from the fourth inventory of this unit are presented for area, volume, growth, mortality, removals, utilization, and biomass.
Timber resource of Wisconsin's Northwest Survey Unit, 1983.
W. Brad Smith
1984-01-01
The fourth inventory of the timber resource of the Northwest Wisconsin Survey Unit shows a 1.8% decline in commercial forest area and a 36% gain in growing-stock volume between 1968 and 1983. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Timber resource of Missouri's Riverborder, 1972.
John S. Spencer, Jr.; Arnold J. Ostrom
1975-01-01
The third timber inventory of Missouri's Riverborder Forest Survey Unit shows that neither the total volume of growing stock nor of sawtimber changed significantly between 1959 and 1972. Area of commercial forest land declined slightly. Presents statistics on forest area and timber volume, growth, mortality, ownership and use in 1972.
Timber resource of Michigan's Northern Lower Peninsula, 1980.
Pamela J. Jakes
1982-01-01
The fourth inventory of the timber resource of Michigan's Northern Lower Peninsula Survey Unit shows a 4% decline in commercial forest area and a 38% gain in growing-stock volume between 1966 and 1980. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.
Two Simple Models for Fracking
NASA Astrophysics Data System (ADS)
Norris, Jaren Quinn
Recent developments in fracking have enabled the recovery of oil and gas from tight shale reservoirs. These developments have also made fracking one of the most controversial environmental issues in the United States. Despite the growing controversy surrounding fracking, there is relatively little publicly available research. This dissertation introduces two simple models for fracking that were developed using techniques from non-linear and statistical physics. The first model assumes that the volume of induced fractures must be equal to the volume of injected fluid. For simplicity, these fractures are assumed to form a spherically symmetric damage region around the borehole. The predicted volumes of water necessary to create a damage region with a given radius are in good agreement with reported values. The second model is a modification of invasion percolation which was previously introduced to model water flooding. The reservoir rock is represented by a regular lattice of local traps that contain oil and/or gas separated by rock barriers. The barriers are assumed to be highly heterogeneous and are assigned random strengths. Fluid is injected from a central site and the weakest rock barrier breaks, allowing fluid to flow into the adjacent site. The process repeats with the weakest barrier breaking and fluid flowing to an adjacent site each time step. Extensive numerical simulations were carried out to obtain statistical properties of the growing fracture network. The network was found to be fractal with fractal dimensions differing slightly from the accepted values for traditional percolation. Additionally, the network follows Horton-Strahler and Tokunaga branching statistics which have been used to characterize river networks. As with other percolation models, the growth of the network occurs in bursts. These bursts follow a power-law size distribution similar to observed microseismic events. Reservoir stress anisotropy is incorporated into the model by assigning horizontal bonds weaker strengths on average than vertical bonds. Numerical simulations show that increasing bond strength anisotropy tends to reduce the fractal dimension of the growing fracture network, and decrease the power-law slope of the burst size distribution. Although simple, these two models are useful for making informed decisions about fracking.
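As a concrete illustration of the second (invasion-percolation-style) model, the sketch below grows a cluster on a 2-D square lattice in which each bond receives an independent, uniformly distributed barrier strength and the weakest barrier on the cluster boundary breaks at every step. The lattice size, step count, and uniform strength distribution are illustrative assumptions, not the dissertation's parameters; stress anisotropy could be mimicked by drawing horizontal-bond strengths from a weaker distribution than vertical-bond strengths.

```python
import heapq
import random

def invade(n=101, steps=2000, seed=0):
    """Invasion-percolation-style fracture growth on an n x n lattice.

    Each bond to a neighbouring site gets an independent random 'barrier
    strength' when it is first reached; at every step the weakest barrier
    on the cluster boundary breaks and the adjacent site is invaded.
    Parameters are illustrative, not the dissertation's values."""
    rng = random.Random(seed)
    start = (n // 2, n // 2)
    invaded = {start}
    frontier = []  # heap of (barrier strength, site to invade)

    def push_neighbours(site):
        x, y = site
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (x + dx, y + dy)
            if 0 <= nb[0] < n and 0 <= nb[1] < n and nb not in invaded:
                # anisotropy could be added here by scaling horizontal bonds
                heapq.heappush(frontier, (rng.random(), nb))

    push_neighbours(start)
    for _ in range(steps):
        while frontier:
            strength, site = heapq.heappop(frontier)
            if site not in invaded:
                break
        else:
            break  # no barriers left to break
        invaded.add(site)
        push_neighbours(site)
    return invaded

cluster = invade()
print(len(cluster), "sites invaded")
```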
Cell response to quasi-monochromatic light with different coherence
NASA Astrophysics Data System (ADS)
Budagovsky, A. V.; Solovykh, N. V.; Budagovskaya, O. N.; Budagovsky, I. A.
2015-04-01
The problem of the light coherence effect on the magnitude of the photoinduced cell response is discussed. The origins of ambiguous interpretation of the known experimental results are considered. Using biological models essentially differing in anatomy, morphology and biological functions (acrospires of radish, blackberry microsprouts cultivated in vitro, plum pollen), the effect of statistical properties of quasi-monochromatic light (λmax = 633 nm) on the magnitude of the photoinduced cell response is shown. It is found that for relatively low spatial coherence, the cell functional activity changes insignificantly. The maximal enhancement of growth processes (stimulating effect) is observed when the coherence length Lcoh and the correlation radius rcor are greater than the cell size, i.e., the entire cell fits into the field coherence volume. In this case, the representative indicators (germination of seeds and pollen, the spear length) exceed those of non-irradiated objects by 1.7 - 3.9 times. For a more correct assessment of the effect of light statistical properties on photocontrol processes, it is proposed to replace the qualitative description (coherent - incoherent) with a quantitative one, using the determination of spatial and temporal correlation functions and comparing them with the characteristic dimensions of the biological structures, e.g., the cell size.
Probabilistic modelling of flood events using the entropy copula
NASA Astrophysics Data System (ADS)
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method of analyzing flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariable dependence structures; however, the copula family must be chosen before application, and the choice is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective step by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events of two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the difficulty of extending the calculation to three dimensions directly. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
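The entropy copula itself is not reproduced here, but the sketch below shows the general copula workflow the abstract describes for trivariate (peak, volume, duration) simulation: dependent uniforms are generated from an assumed dependence structure and then transformed through illustrative marginal distributions. The Gaussian copula, the correlation matrix, and the Gumbel/gamma/lognormal marginals are all assumptions standing in for the fitted entropy copula and gauge-specific marginals.

```python
import numpy as np
from scipy import stats

# Assumed rank-dependence structure among peak, volume and duration.
rho = np.array([[1.0, 0.7, 0.5],
                [0.7, 1.0, 0.6],
                [0.5, 0.6, 1.0]])

def simulate_floods(n=10000, seed=1):
    """Draw dependent (peak, volume, duration) triples via a Gaussian copula
    with illustrative marginals (Gumbel, gamma, lognormal)."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(3), rho, size=n)
    u = stats.norm.cdf(z)                                        # dependent uniforms
    peak = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=80.0)    # m3/s (assumed)
    volume = stats.gamma.ppf(u[:, 1], a=2.0, scale=50.0)         # 10^6 m3 (assumed)
    duration = stats.lognorm.ppf(u[:, 2], s=0.4, scale=7.0)      # days (assumed)
    return peak, volume, duration

p, v, d = simulate_floods()
print(np.corrcoef([p, v, d]))  # dependence carried through the marginals
```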
Qu, Zhijun; Wang, Geng; Xu, Chengshi; Zhang, Dazhi; Qu, Xiangdong; Zhou, Haibin; Ma, Jun
2016-10-01
Preoperative platelet rich plasma (PRP) harvest has been used in cardiopulmonary surgery for more than 10 years. There is no previous study dealing with PRP in bilateral total hip replacement. This study aimed to investigate the effects of PRP on blood saving and blood coagulation function in patients with bilateral total hip replacement. A prospective, randomized, clinical trial was conducted. Sixty patients were enrolled, including 30 patients undergoing PRP in the PRP group and 30 controls. The surgery time, total transfusion volume, blood loss, allogenic blood transfusion, autologous blood transfusion, urine volume, drainage volume, some blood parameters (including fibrinogen, D-dimer, prothrombin time, international normalized ratio, activated partial thromboplastin time, platelet count, and haemoglobin), thrombelastogram (TEG) and blood-gas parameters were studied in the perioperative stage. The measurement data were analyzed statistically. There was no statistical difference between the two groups in baseline characteristics, surgery time, total transfusion volume, blood loss, autologous blood transfusion, etc. Allogenic blood transfusion in the PRP group was significantly less than in the control group (p = 0.024). Fibrinogen in the PRP group was higher than in the control group (p = 0.008). Among the TEG indicators, activated clotting time and coagulation time K in the PRP group were lower than in the control group. Clotting rate and maximum amplitude in the PRP group were higher. The blood-gas parameters presented no statistical difference. The results suggested that PRP probably played a positive role in blood coagulation function as well as blood saving in patients with bilateral total hip replacement. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
National Center for Education Statistics (ED), Washington, DC.
The four papers contained in this volume were presented at the August 1994 meetings of the American Statistical Association as a session titled, "Public Policy and Data Comparability: New Interest in Public Library Data." The first paper, "Public Library Statistics: Two Systems Compared" (Mary Jo Lynch), describes two systems…
The Shock and Vibration Digest. Volume 16, Number 3
1984-03-01
Fragmented abstract text from this scanned digest issue refers to fluid-induced excitation and wind tunnel testing (V.R. Miller and L.L. Faulkner, Flight Dynamics Lab., Air Force); prediction of noise transmission through a fuselage wall by the statistical energy analysis (SEA) method, with the fuselage structure represented as a series of curved sections; probabilistic fracture; and a statistical energy analysis (SEA) model presented for structural systems only.
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and body surface at reduced time and cost.
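A minimal sketch of one way a statistical body-shape tool of this kind can be built, assuming the scans have already been brought into vertex correspondence and aligned: principal component analysis of the scan matrix yields shape modes from which new shapes can be synthesized. The function names and the PCA formulation are illustrative assumptions; the abstract does not specify the NASA tool's actual algorithm.

```python
import numpy as np

def fit_shape_model(scans):
    """PCA shape model from aligned scans of shape (n_subjects, n_vertices*3)."""
    mean = scans.mean(axis=0)
    centered = scans - mean
    _, s, modes = np.linalg.svd(centered, full_matrices=False)
    variances = s**2 / (len(scans) - 1)     # variance explained by each mode
    return mean, modes, variances

def synthesize(mean, modes, variances, coeffs):
    """New body shape from standardized coefficients on the leading modes."""
    k = len(coeffs)
    return mean + (np.asarray(coeffs) * np.sqrt(variances[:k])) @ modes[:k]

# toy data: 20 'scans' of 500 vertices each
rng = np.random.default_rng(0)
scans = rng.normal(size=(20, 500 * 3))
mean, modes, var = fit_shape_model(scans)
new_shape = synthesize(mean, modes, var, coeffs=[1.5, -0.5, 0.0])
print(new_shape.shape)
```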
1990-11-30
signal flow, xi. The "learning" of such statistics could result from synaptic modification rules similar to those known to exist in the brain...in figure 1 had been established. If the series are approximated by Gaussian processes, the information flow from X to Y can be expressed by the...Based on this model, the information flow in different directions was calculated by using eq. (1). RESULTS: Figure 2 illustrates the information flow
Long-range correlations in an online betting exchange for a football tournament
NASA Astrophysics Data System (ADS)
Hardiman, Stephen J.; Richmond, Peter; Hutzler, Stefan
2010-10-01
We analyze the changes in the market odds of football matches in an online betting exchange, Betfair.com. We identify the statistical differences between the returns that occur when the game play is under way, which we argue are driven by match events, and the returns that occur during half-time, which we ascribe to a trader-driven noise. Furthermore, using detrended fluctuation analysis we identify anti-persistence (Hurst exponent H<0.5) in odds returns and long memory (H>0.5) in the volatilities, which we attribute to the trader-driven noise. The time series of trading volume are found to be short-memory processes.
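For reference, a compact detrended fluctuation analysis routine of the kind used to estimate the exponents reported above; the scale range and linear (first-order) detrending are common defaults and are assumptions here, not necessarily the authors' settings.

```python
import numpy as np

def dfa(x, scales=None):
    """Detrended fluctuation analysis; returns the scaling exponent.

    For a stationary series the exponent approximates the Hurst exponent H
    (< 0.5 anti-persistent, > 0.5 long memory)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 15).astype(int))
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2) for seg in segs]
        fluct.append(np.sqrt(np.mean(ms)))            # RMS fluctuation at scale s
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

returns = np.random.default_rng(0).normal(size=5000)  # white noise: expect ~0.5
print(dfa(returns))
```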
FAA Rotorcraft Research, Engineering, and Development Bibliography 1962-1989
1990-05-01
(Albert G. Delucien) (NTIS: ADA 102 521); FAA/CT-88/10, Digital Systems Validation Handbook - Volume II (R.L. McDowall, Hardy P. Curd, Lloyd N. Popish...); Digital Systems in Avionics and Flight Control Applications, Handbook - Volume I (Ellis F. Hilt, Donald Eldredge, Jeff Webb, Charles Lucius, Michael S...); Structure Statistics of Helicopter GPS Navigation with the Magnavox Z-Set (Robert D. Till); FAA/CT-82/115, Handbook - Volume I, Validation of Digital
2016-07-01
Predicted variation in (a) hot-spot number density, (b) hot-spot volume fraction, and (c) hot-spot specific surface area for each ensemble with piston speed...packing density, characterized by its effective solid volume fraction φs,0, affects hot-spot statistics for pressure-dominated waves corresponding to...the distribution in solid volume fraction within each ensemble was nearly Gaussian, and its standard deviation decreased with increasing density. Analysis of
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
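A simplified sketch of the volume-flaw reliability calculation the abstract describes, using the two-parameter Weibull model with the principle of independent action over element-wise principal stresses. This is not the SCARE code: the element data, units, and parameter values below are illustrative, and the surface-flaw and Batdorf shear-sensitive theories are omitted.

```python
import numpy as np

def weibull_volume_reliability(element_volumes, principal_stresses, m, sigma_0):
    """Failure probability from volume flaws: two-parameter Weibull with the
    principle of independent action over element principal stresses.

    element_volumes    : (n,) element volumes (normalized units here)
    principal_stresses : (n, 3) principal stresses per element
    m, sigma_0         : Weibull modulus and scale parameter (illustrative)"""
    tensile = np.clip(principal_stresses, 0.0, None)   # only tensile stresses contribute
    risk = np.sum(element_volumes[:, None] * (tensile / sigma_0) ** m)
    return 1.0 - np.exp(-risk)

rng = np.random.default_rng(0)
vols = np.ones(100)                                  # unit element volumes (toy)
stresses = rng.uniform(0.0, 200.0e6, size=(100, 3))  # Pa (toy stress states)
print(weibull_volume_reliability(vols, stresses, m=10.0, sigma_0=300.0e6))
```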
Engelstad, Mark E; Morse, Timothy
2010-12-01
The anterior iliac crest, posterior iliac crest, and proximal tibia are common cancellous donor sites used for autogenous bone grafting. Donor site selection is partly dependent on the expected volume of available bone, but reports of cancellous bone volumes at each of these sites are variable. The goal of this study was to compare the volumes of cancellous bone harvested from donor sites within the same cadaver. Within each of 10 fresh frozen cadavers, cancellous bone was harvested from 3 donor sites-anterior iliac crest, posterior iliac crest, and proximal tibia-using established surgical techniques. Bone volumes were measured by fluid displacement. Mean compressed cancellous bone volumes from the 3 donor sites were compared among cadavers. Within each cadaver, the 3 donor sites were given a volume rank score from 1 (least volume) to 3 (most volume). Among cadavers, mean compressed cancellous bone volumes from the proximal tibia (11.3 mL) and posterior iliac crest (10.1 mL) were significantly greater than the anterior iliac crest (7.0 mL). Within cadavers, the mean volume rank score of the proximal tibia (mean rank, 2.7) was statistically greater than that for the posterior iliac crest (mean rank, 2.0), which was statistically greater than that for the anterior iliac crest (mean rank, 1.2). Strong correlations in bone volume existed between the proximal tibia and iliac crests (r = 0.67) and between the anterior iliac crest and posterior iliac crest (r = 0.93). The proximal tibia and posterior iliac crest yielded a significantly greater mean volume of compressed cancellous bone than the anterior iliac crest. Within individual cadaver skeletons, the proximal tibia was most likely to yield the largest cancellous volume, whereas the anterior iliac crest was most likely to yield the smallest cancellous volume. Although the proximal tibia contains relatively large volumes of cancellous bone, further investigation is required to determine how much cancellous bone can safely be harvested. Copyright © 2010 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Dynamic Statistical Characterization of Variation in Source Processes of Microseismic Events
NASA Astrophysics Data System (ADS)
Smith-Boughner, L.; Viegas, G. F.; Urbancic, T.; Baig, A. M.
2015-12-01
During a hydraulic fracture, water is pumped at high pressure into a formation. A proppant, typically sand, is later injected in the hope that it will make its way into a fracture, keep it open, and provide a path for the hydrocarbon to enter the well. This injection can create micro-earthquakes, generated by deformation within the reservoir during treatment. When these injections are monitored, thousands of microseismic events are recorded within several hundred cubic meters. For each well-located event, many source parameters are estimated, e.g. stress drop, Savage-Wood efficiency and apparent stress. However, because we are evaluating outputs from a power-law process, the extent to which the failure is impacted by fluid injection or stress triggering is not immediately clear. To better detect differences in source processes, we use a set of dynamic statistical parameters which characterize various force balance assumptions using the average distance to the nearest event, event rate, volume enclosed by the events, cumulative moment and energy from a group of events. One parameter, the Fracability index, approximates the ratio of viscous to elastic forcing and highlights differences in the response time of a rock to changes in stress. These dynamic parameters are applied to a database of more than 90 000 events in a shale-gas play in the Horn River Basin to characterize spatial-temporal variations in the source processes. In order to resolve these differences, a moving-window, nearest-neighbour approach was used. First, the center of mass of the local distribution was estimated for several source parameters. Then, a set of dynamic parameters, which characterize the response of the rock, were estimated. These techniques reveal changes in seismic efficiency and apparent stress and often coincide with marked changes in the Fracability index and other dynamic statistical parameters. Utilizing these approaches allowed for the characterization of fluid-injection-related processes.
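A generic sketch of the moving-window, nearest-neighbour step described above: for each event, the local "centre of mass" of a source parameter is estimated from its k nearest neighbours, along with the average spacing to those neighbours. The Fracability index and the other dynamic parameters are domain-specific combinations of moment, energy, event rate, and enclosed volume that are not reproduced here; k and the toy event catalogue are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_parameter_field(xyz, values, k=50):
    """Moving-window, nearest-neighbour estimate of a per-event source
    parameter: the local mean over the k nearest events, plus the average
    spacing to those neighbours (both used descriptively here)."""
    tree = cKDTree(xyz)
    dist, idx = tree.query(xyz, k=k)
    local_mean = values[idx].mean(axis=1)      # centre of the local distribution
    mean_spacing = dist[:, 1:].mean(axis=1)    # skip the zero self-distance
    return local_mean, mean_spacing

rng = np.random.default_rng(0)
xyz = rng.normal(scale=100.0, size=(5000, 3))            # toy event locations (m)
apparent_stress = rng.lognormal(mean=0.0, sigma=1.0, size=5000)
smoothed, spacing = local_parameter_field(xyz, apparent_stress)
print(smoothed[:3], spacing[:3])
```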
Brain Structure Changes Visualized in Early- and Late-Onset Blind Subjects
Leporé, Natasha; Voss, Patrice; Lepore, Franco; Chou, Yi-Yu; Fortin, Madeleine; Gougoux, Frédéric; Lee, Agatha D.; Brun, Caroline; Lassonde, Maryse; Madsen, Sarah K.; Toga, Arthur W.; Thompson, Paul M.
2009-01-01
We examine 3D patterns of volume differences in the brain associated with blindness, in subjects grouped according to early and late onset. Using tensor-based morphometry, we map volume reductions and gains in 16 early-onset (EB) and 16 late-onset (LB) blind adults (onset <5 and >14 years old, respectively) relative to 16 matched sighted controls. Each subject’s structural MRI was fluidly registered to a common template. Anatomical differences between groups were mapped based on statistical analysis of the resulting deformation fields revealing profound deficits in primary and secondary visual cortices for both blind groups. Regions outside the occipital lobe showed significant hypertrophy, suggesting widespread compensatory adaptations. EBs but not LBs showed deficits in the splenium and hypertrophy in the isthmus. Gains in the isthmus and non-occipital white matter were more widespread in the EBs. These differences may reflect regional alterations in late neurodevelopmental processes, such as myelination, that continue into adulthood. PMID:19643183
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. 
MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
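A sketch of the two building blocks named above, the trapezoidal distribution and rank correlation with an associated runoff variable, implemented as an inverse-CDF sampler plus a Gaussian-rank coupling that approximately reproduces a target Spearman's rho. The trapezoid parameters and rho below are illustrative, not SELDM's published BMP-category statistics.

```python
import numpy as np
from scipy import stats

def trapezoidal_ppf(u, a, b, c, d):
    """Inverse CDF of a trapezoidal distribution on [a, d] with flat top [b, c]."""
    u = np.asarray(u, dtype=float)
    h = 2.0 / (d + c - b - a)            # height of the flat part of the pdf
    F_b = h * (b - a) / 2.0
    F_c = F_b + h * (c - b)
    x = np.empty_like(u)
    lo, mid, hi = u <= F_b, (u > F_b) & (u <= F_c), u > F_c
    x[lo] = a + np.sqrt(2.0 * u[lo] * (b - a) / h)
    x[mid] = b + (u[mid] - F_b) / h
    x[hi] = d - np.sqrt(2.0 * (1.0 - u[hi]) * (d - c) / h)
    return x

def correlated_treatment(inflow_u, rho, a, b, c, d, seed=0):
    """Treatment variable (e.g. fractional volume reduction) drawn from a
    trapezoidal distribution with approximate Spearman correlation rho to the
    runoff variable whose uniform ranks are inflow_u (illustrative values)."""
    rng = np.random.default_rng(seed)
    z_in = stats.norm.ppf(inflow_u)
    z = rho * z_in + np.sqrt(1.0 - rho**2) * rng.standard_normal(len(inflow_u))
    return trapezoidal_ppf(stats.norm.cdf(z), a, b, c, d)

inflow_u = np.random.default_rng(1).uniform(size=5000)
vol_reduction = correlated_treatment(inflow_u, rho=0.5, a=0.0, b=0.2, c=0.6, d=1.0)
print(stats.spearmanr(inflow_u, vol_reduction).correlation)
```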
Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E
2018-05-01
The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three dimensional (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively. Patient-specific control charts using NCC evaluated daily variation and identified statistically significant deviations. This study also showed that subjective evaluations of the images were not always consistent. Population control charts identified a patient whose tracking metrics were significantly lower than those of other patients. The patient-specific action limits identified registrations that warranted immediate evaluation by an expert. When effective displacements in the anterior-posterior direction were compared to 3DoF couch displacements, the agreement was ±1 mm for seven of 10 patients for both C-spine and mandible RTVs. Qualitative review alone of IGRT images can result in inconsistent feedback to the IGRT process. Registration tracking using NCC objectively identifies statistically significant deviations. When used in conjunction with the current image review process, this tool can assist in improving the safety and consistency of the IGRT process. © 2018 American Association of Physicists in Medicine.
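A minimal sketch of the tracking idea: compute normalized cross-correlation over a registration tracking volume for each fraction and flag values below a patient-specific control limit built from the first five fractions. The 3-sigma lower limit and the toy volumes are assumptions used for illustration; they are not necessarily the paper's exact SPC rule.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped image volumes."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def lower_control_limit(baseline_values, n_sigma=3.0):
    """Patient-specific lower limit from the first few fractions' NCC values."""
    m, s = np.mean(baseline_values), np.std(baseline_values, ddof=1)
    return m - n_sigma * s

rng = np.random.default_rng(0)
ref = rng.normal(size=(32, 32, 32))                        # toy reference RTV
baseline = [ncc(ref, ref + 0.1 * rng.normal(size=ref.shape)) for _ in range(5)]
lcl = lower_control_limit(baseline)
todays = ncc(ref, ref + 0.1 * rng.normal(size=ref.shape))  # today's fraction
print(todays, "review registration" if todays < lcl else "within limits")
```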
Forest Statistics for Minnesota's Central Hardwood Unit.
Earl C. Leatherberry
1991-01-01
In 1990, the fifth inventory of Minnesota's Central Hardwood Unit found 11.9 million acres of land, of which 2.4 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removal, mortality, and ownership.
Forest Statistics for New York 1980 and 1993
Carol L. Alerich; David A. Drake; David A. Drake
1995-01-01
A statistical report on the fourth forest inventory of New York, 1991-1994. Findings are displayed in 155 tables containing estimates of forest area, numbers of trees, wildlife habitat, timber volume, growth change, and biomass. Data are presented at three levels: state, geographic unit, and county.
Hill, Shirley Y; Wang, Shuhui; Carter, Howard; McDermott, Michael D; Zezza, Nicholas; Stiffler, Scott
2013-12-12
The increased susceptibility for developing alcohol dependence seen in offspring from families with alcohol dependence may be related to structural and functional differences in brain circuits that influence emotional processing. Early childhood environment, genetic variation in the serotonin transporter-linked polymorphic region (5-HTTLPR) of the SLC6A4 gene and allelic variation in the Brain Derived Neurotrophic Factor (BDNF) gene have each been reported to be related to volumetric differences in the temporal lobe, especially the amygdala. Magnetic resonance imaging was used to obtain amygdala volumes for 129 adolescent/young adult individuals who were either High-Risk (HR) offspring from families with multiple cases of alcohol dependence (N=71) or Low-Risk (LR) controls (N=58). Childhood family environment was measured prospectively using age-appropriate versions of the Family Environment Scale during a longitudinal follow-up study. The subjects were genotyped for BDNF Val66Met and the serotonin transporter polymorphism (5-HTTLPR). Two family environment scale scores (Cohesion and Conflict), genotypic variation, and their interaction were tested for their association with amygdala volumes. Personal and prenatal exposure to alcohol and drugs were considered in statistical analyses in order to more accurately determine the effects of familial risk group differences. Amygdala volume was reduced in offspring from families with multiple alcohol-dependent members in comparison to offspring from control families. High-Risk offspring who were carriers of the S variant of the 5-HTTLPR polymorphism had reduced amygdala volume in comparison to those with an LL genotype. Larger amygdala volume was associated with greater family cohesion but only in Low-Risk control offspring. Familial risk for alcohol dependence is an important predictor of amygdala volume even when removing cases with significant personal exposure and covarying for prenatal exposure effects. The present study provides new evidence that amygdala volume is modified by 5-HTTLPR variation in High-Risk families.
Combining quantitative and qualitative breast density measures to assess breast cancer risk.
Kerlikowske, Karla; Ma, Lin; Scott, Christopher G; Mahmoudzadeh, Amir P; Jensen, Matthew R; Sprague, Brian L; Henderson, Louise M; Pankratz, V Shane; Cummings, Steven R; Miglioretti, Diana L; Vachon, Celine M; Shepherd, John A
2017-08-22
Accurately identifying women with dense breasts (Breast Imaging Reporting and Data System [BI-RADS] heterogeneously or extremely dense) who are at high breast cancer risk will facilitate discussions of supplemental imaging and primary prevention. We examined the independent contribution of dense breast volume and BI-RADS breast density to predict invasive breast cancer and whether dense breast volume combined with Breast Cancer Surveillance Consortium (BCSC) risk model factors (age, race/ethnicity, family history of breast cancer, history of breast biopsy, and BI-RADS breast density) improves identifying women with dense breasts at high breast cancer risk. We conducted a case-control study of 1720 women with invasive cancer and 3686 control subjects. We calculated ORs and 95% CIs for the effect of BI-RADS breast density and Volpara™ automated dense breast volume on invasive cancer risk, adjusting for other BCSC risk model factors plus body mass index (BMI), and we compared C-statistics between models. We calculated BCSC 5-year breast cancer risk, incorporating the adjusted ORs associated with dense breast volume. Compared with women with BI-RADS scattered fibroglandular densities and second-quartile dense breast volume, women with BI-RADS extremely dense breasts and third- or fourth-quartile dense breast volume (75% of women with extremely dense breasts) had high breast cancer risk (OR 2.87, 95% CI 1.84-4.47, and OR 2.56, 95% CI 1.87-3.52, respectively), whereas women with extremely dense breasts and first- or second-quartile dense breast volume were not at significantly increased breast cancer risk (OR 1.53, 95% CI 0.75-3.09, and OR 1.50, 95% CI 0.82-2.73, respectively). Adding continuous dense breast volume to a model with BCSC risk model factors and BMI increased discriminatory accuracy compared with a model with only BCSC risk model factors (C-statistic 0.639, 95% CI 0.623-0.654, vs. C-statistic 0.614, 95% CI 0.598-0.630, respectively; P < 0.001). Women with dense breasts and fourth-quartile dense breast volume had a BCSC 5-year risk of 2.5%, whereas women with dense breasts and first-quartile dense breast volume had a 5-year risk ≤ 1.8%. Risk models with automated dense breast volume combined with BI-RADS breast density may better identify women with dense breasts at high breast cancer risk than risk models with either measure alone.
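To make the C-statistic comparison concrete, the sketch below fits two logistic models, one without and one with a dense-breast-volume term, and compares their areas under the ROC curve. The synthetic predictors and coefficients are stand-ins for the BCSC risk factors, BMI, and Volpara volume; nothing here reproduces the study's data or fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins: x_base plays the role of BCSC risk factors plus BMI,
# dense_vol the automated dense breast volume; coefficients are arbitrary.
rng = np.random.default_rng(0)
n = 5000
x_base = rng.normal(size=(n, 5))
dense_vol = rng.lognormal(mean=3.0, sigma=0.5, size=n)
logit = -3.0 + x_base @ np.array([0.3, 0.2, 0.1, 0.2, 0.1]) + 0.02 * dense_vol
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

x_full = np.column_stack([x_base, dense_vol])
base = LogisticRegression(max_iter=1000).fit(x_base, y)
full = LogisticRegression(max_iter=1000).fit(x_full, y)
print("C-statistic, risk factors only:", roc_auc_score(y, base.predict_proba(x_base)[:, 1]))
print("C-statistic, plus dense volume:", roc_auc_score(y, full.predict_proba(x_full)[:, 1]))
```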
Do altitude and climate affect paranasal sinus volume?
Selcuk, Omer Tarık; Erol, Bekir; Renda, Levent; Osma, Ustun; Eyigor, Hulya; Gunsoy, Behcet; Yagci, Buket; Yılmaz, Deniz
2015-09-01
The aim of this study was to evaluate the effect of climate and altitude differences on the volume of paranasal sinuses and on the frequency of anatomic variations by comparing the paranasal sinus tomograms (PNSCT) of patients who were born and living in a cold, dry climate at high altitude with those of patients who were born and living on the coast at sea level in a temperate climate. We also aimed to determine differences relating to gender. A total of 55 PNSCTs of 55 patients from the city center of Antalya and 60 PNSCTs of 60 patients from the city center of Agrı were evaluated and compared prospectively. The study included a total of 115 patients with a mean age of 44.75 ± 9.64 years (range, 27-63 years). Group 1 (Antalya) comprised 26 females (47.3%) and 29 males (52.7%) with a mean age of 36.7 ± 12.4 years. Group 2 (Agrı) comprised 25 females (41.7%) and 35 males (58.3%) with a mean age of 35.1 ± 13.4 years. Maxillary sinus volumes were 18.27 cm³ (range, 5.04-37.62) and 15.06 cm³ (4.11-41.40); sphenoid sinus volumes were 7.81 cm³ (1.80-20.63) and 6.35 cm³ (0.54-16.50); frontal sinus volumes were 5.51 cm³ (0.50-29.25) and 3.76 cm³ (0.68-22.81), respectively. There was no statistically significant difference between the groups in terms of volumes (p > 0.025). Both maxillary and frontal sinus volumes were greater in males compared to females (p < 0.025). The mean value of the maxillary sinus volume was 15.7 ± 5.3 cm³ and was significantly larger in males than in females (p = 0.004). There was no statistically significant correlation between the volume of the maxillary sinuses and age or side. There was no statistically significant difference between the groups in terms of septum deviation and concha bullosa rates (p = 0.469 and p = 0.388). There have been many studies of nasal cavity changes due to climatic conditions, but this is the first study to measure differences in paranasal sinus volumes. No difference was determined in the anatomic variations and volumes of the maxillary, frontal, and sphenoid sinuses on PNSCT of patients from different climates and altitudes. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Overcoming bias in estimating the volume-outcome relationship.
Tsai, Alexander C; Votruba, Mark; Bridges, John F P; Cebul, Randall D
2006-02-01
To examine the effect of hospital volume on 30-day mortality for patients with congestive heart failure (CHF) using administrative and clinical data in conventional regression and instrumental variables (IV) estimation models. The primary data consisted of longitudinal information on comorbid conditions, vital signs, clinical status, and laboratory test results for 21,555 Medicare-insured patients aged 65 years and older hospitalized for CHF in northeast Ohio in 1991-1997. The patient was the primary unit of analysis. We fit a linear probability model to the data to assess the effects of hospital volume on patient mortality within 30 days of admission. Both administrative and clinical data elements were included for risk adjustment. Linear distances between patients and hospitals were used to construct the instrument, which was then used to assess the endogeneity of hospital volume. When only administrative data elements were included in the risk adjustment model, the estimated volume-outcome effect was statistically significant (p=.029) but small in magnitude. The estimate was markedly attenuated in magnitude and statistical significance when clinical data were added to the model as risk adjusters (p=.39). IV estimation shifted the estimate in a direction consistent with selective referral, but we were unable to reject the consistency of the linear probability estimates. Use of only administrative data for volume-outcomes research may generate spurious findings. The IV analysis further suggests that conventional estimates of the volume-outcome relationship may be contaminated by selective referral effects. Taken together, our results suggest that efforts to concentrate hospital-based CHF care in high-volume hospitals may not reduce mortality among elderly patients.
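A bare-bones illustration of the instrumental-variables logic described above: distance to hospital serves as the instrument for hospital volume in a two-stage least-squares, linear probability setup. The synthetic data include an unobserved severity variable that drives both referral and mortality, so naive regression of mortality on volume would be biased while the IV estimate recovers the (zero) volume effect built into the simulation. All variable names and coefficients are assumptions, not the study's specification.

```python
import numpy as np

def two_stage_least_squares(y, x_endog, z_instr, x_exog):
    """2SLS for a linear probability model: regress the endogenous regressor
    (hospital volume) on the instrument and exogenous risk adjusters, then use
    the fitted values in the outcome regression; returns the volume coefficient."""
    n = len(y)
    ones = np.ones(n)
    Z = np.column_stack([ones, z_instr, x_exog])           # first-stage design
    beta1, *_ = np.linalg.lstsq(Z, x_endog, rcond=None)
    x_hat = Z @ beta1
    X2 = np.column_stack([ones, x_hat, x_exog])            # second-stage design
    beta2, *_ = np.linalg.lstsq(X2, y, rcond=None)
    return beta2[1]

rng = np.random.default_rng(0)
n = 20000
severity = rng.normal(size=n)                    # unobserved; drives referral and death
distance = rng.exponential(scale=10.0, size=n)   # km to nearest high-volume hospital
volume = 200.0 - 5.0 * distance + 30.0 * severity + rng.normal(scale=20.0, size=n)
risk_adj = rng.normal(size=(n, 3))               # observed clinical risk adjusters
mortality = 0.05 + 0.02 * severity + rng.normal(scale=0.05, size=n)  # true volume effect = 0
print(two_stage_least_squares(mortality, volume, distance, risk_adj))  # close to 0
```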
Optimizing image registration and infarct definition in stroke research.
Harston, George W J; Minks, David; Sheerin, Fintan; Payne, Stephen J; Chappell, Michael; Jezzard, Peter; Jenkinson, Mark; Kennedy, James
2017-03-01
Accurate representation of final infarct volume is essential for assessing the efficacy of stroke interventions in imaging-based studies. This study defines the impact of image registration methods used at different timepoints following stroke, and the implications for infarct definition in stroke research. Patients presenting with acute ischemic stroke were imaged serially using magnetic resonance imaging. Infarct volume was defined manually using four metrics: 24-h b1000 imaging; 1-week and 1-month T2-weighted FLAIR; and automatically using predefined thresholds of ADC at 24 h. Infarct overlap statistics and volumes were compared across timepoints following both rigid body and nonlinear image registration to the presenting MRI. The effect of nonlinear registration on a hypothetical trial sample size was calculated. Thirty-seven patients were included. Nonlinear registration improved infarct overlap statistics and consistency of total infarct volumes across timepoints, and reduced infarct volumes by 4.0 mL (13.1%) and 7.1 mL (18.2%) at 24 h and 1 week, respectively, compared to rigid body registration. Infarct volume at 24 h, defined using a predetermined ADC threshold, was less sensitive to infarction than b1000 imaging. 1-week T2-weighted FLAIR imaging was the most accurate representation of final infarct volume. Nonlinear registration reduced hypothetical trial sample size, independent of infarct volume, by an average of 13%. Nonlinear image registration may offer the opportunity of improving the accuracy of infarct definition in serial imaging studies compared to rigid body registration, helping to overcome the challenges of anatomical distortions at subacute timepoints, and reducing sample size for imaging-based clinical trials.
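The abstract reports infarct overlap statistics without naming the metric; the Dice coefficient shown below is used purely as an illustration of how such an overlap statistic can be computed between two binary infarct masks once they sit on the same registered grid.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap of two binary infarct masks on the same registered grid."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

rng = np.random.default_rng(0)
infarct_24h = rng.uniform(size=(64, 64, 32)) < 0.05    # toy 24-h mask
infarct_1wk = np.roll(infarct_24h, shift=1, axis=0)    # toy 1-week mask after registration
print(dice(infarct_24h, infarct_1wk))
```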
NASA Astrophysics Data System (ADS)
Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.
2017-09-01
Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
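One plausible reading of the greyscale-to-porosity step, sketched under stated assumptions: a two-component Gaussian mixture locates the pore and solid grey-level peaks, and voxel grey values are rescaled linearly between the two peak means to give a per-voxel porosity estimate. This is not a reproduction of the authors' method, and it omits their two-point correlation and random-walk analyses.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def voxel_porosity(volume, seed=0):
    """Grey levels to per-voxel porosity via a two-component Gaussian mixture.

    The darker component is assumed to be pore, the brighter one solid; grey
    values are rescaled linearly so the pore peak maps to porosity 1 and the
    solid peak to 0 (an illustrative reading, not the authors' exact method)."""
    g = volume.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=seed).fit(g)
    mu_pore, mu_solid = np.sort(gmm.means_.ravel())
    phi = (mu_solid - volume) / (mu_solid - mu_pore)
    return np.clip(phi, 0.0, 1.0)

rng = np.random.default_rng(0)
toy = np.concatenate([rng.normal(50.0, 8.0, 6000), rng.normal(180.0, 12.0, 4000)])
toy = toy.reshape(10, 10, 100)
print("mean porosity estimate:", voxel_porosity(toy).mean())
```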
Möhlhenrich, Stephan Christian; Heussen, Nicole; Peters, Florian; Steiner, Timm; Hölzle, Frank; Modabber, Ali
2015-11-01
The morphometric analysis of the maxillary sinus was recently presented as a helpful instrument for sex determination. The aim of the present study was to examine the volume and surface of the fully dentate, partially edentulous, and completely edentulous maxillary sinus in relation to sex. Computed tomography data from 276 patients were imported in DICOM format via special virtual planning software, and surfaces (mm²) and volumes (mm³) of maxillary sinuses were measured. In sex-specific comparisons (women vs men), statistically significant differences for the mean maxillary sinus volume and surface were found for fully dentate (volume, 13,267.77 mm³ vs 16,623.17 mm³, P < 0.0001; surface, 3480.05 mm² vs 4100.83 mm², P < 0.0001), partially edentulous (volume, 10,577.35 mm³ vs 14,608.10 mm³, P = 0.0002; surface, 2980.11 mm² vs 3797.42 mm², P < 0.0001), and completely edentulous sinuses (volume, 11,200.99 mm³ vs 15,382.29 mm³, P < 0.0001; surface, 3118.32 mm² vs 3877.25 mm², P < 0.0001). For males, statistically different mean values were found between fully dentate and partially edentulous maxillary sinuses (volume, P = 0.0022; surface, P = 0.0048). Between the sexes, no differences were found only for partially edentulous sinuses (2 teeth missing) and between partially edentulous sinuses in women and men (1 tooth vs 2 teeth missing). With a corresponding software program, it is possible to analyze the maxillary sinus precisely. The dentition influences the volume and surface of the pneumatic maxillary sinus. Therefore, sex determination is possible by analysis of the maxillary sinus even with the increase in pneumatization.