Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to obtain the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Computing tools for implementing standards for single-case designs.
Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E
2015-11-01
In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
A dc model for power switching transistors suitable for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Wilson, P. M.; George, R. T., Jr.; Owen, H. A.; Wilson, T. G.
1979-01-01
A model for bipolar junction power switching transistors whose parameters can be readily obtained by the circuit design engineer, and which can be conveniently incorporated into standard computer-based circuit analysis programs is presented. This formulation results from measurements which may be made with standard laboratory equipment. Measurement procedures, as well as a comparison between actual and computed results, are presented.
Mount, D W; Conrad, B
1986-01-01
We have previously described programs for a variety of types of sequence analysis (1-4). These programs have now been integrated into a single package. They are written in the standard C programming language and run on virtually any computer system with a C compiler, such as the IBM/PC and other computers running under the MS/DOS and UNIX operating systems. The programs are widely distributed and may be obtained from the authors as described below. PMID:3753780
GNAP (Graphic Normative Analysis Program)
Bowen, Roger W.; Odell, John
1979-01-01
A user-oriented command language is developed to provide direct control over the computation and output of the standard CIPW norm. A user-supplied input format for the oxide values may be given or a standard CIPW Rock Analysis format may be used. Once the oxide values have been read by the computer, these values may be manipulated by the user and the 'norm' recalculated on the basis of the manipulated or 'adjusted' values. Additional output capabilities include tabular listing of computed values, summary listings suitable for publication, x-y plots, and ternary diagrams. As many as 20 rock analysis cards may be processed as a group. Any number of such groups may be processed in any one computer run.
Computer Ethics: A Slow Fade from Black and White to Shades of Gray
ERIC Educational Resources Information Center
Kraft, Theresa A.; Carlisle, Judith
2011-01-01
The expanded use of teaching case based analysis based on current events and news stories relating to computer ethics improves student engagement, encourages creativity and fosters an active learning environment. Professional ethics standards, accreditation standards for computer curriculum, ethics theories, resources for ethics on the internet,…
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars
2015-10-01
A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion (Qmean, QSD) and the variability of the spatial center of motion of the infant (CSD). In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Human factors in the Naval Air Systems Command: Computer based training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seamster, T.L.; Snyder, C.E.; Terranova, M.
1988-01-01
Military standards applied to private-sector contracts have a substantial effect on the quality of Computer Based Training (CBT) systems procured for the Naval Air Systems Command. This study evaluated standards regulating the following areas in CBT development and procurement: interactive training systems, cognitive task analysis, and CBT hardware. The objective was to develop high-level recommendations for evolving standards that will govern the next generation of CBT systems. One of the key recommendations is that the instructional systems development, human factors engineering, and software development standards be integrated. Recommendations were also made for task analysis and CBT hardware standards. (9 refs., 3 figs.)
Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K
2017-10-17
Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the image reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
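The phantom-based density calibration mentioned above is commonly implemented as a linear mapping from CT numbers (HU) to equivalent volumetric bone mineral density. The sketch below illustrates that step in Python with made-up phantom rod densities and measured HU values; it is a generic illustration under those assumptions, not the study's pipeline.

```python
import numpy as np

# Hypothetical calibration phantom: known rod densities (mg/cm^3, K2HPO4 equivalent)
# and the mean Hounsfield units measured in each rod on the reconstructed image.
rod_density_mg_cm3 = np.array([0.0, 50.0, 100.0, 200.0, 400.0])   # assumed values
rod_mean_hu        = np.array([2.0, 55.0, 110.0, 215.0, 430.0])   # assumed values

# Fit a linear calibration: density = slope * HU + intercept.
slope, intercept = np.polyfit(rod_mean_hu, rod_density_mg_cm3, deg=1)

def hu_to_bmd(hu):
    """Convert a Hounsfield-unit value (or array) to volumetric BMD (mg/cm^3)."""
    return slope * np.asarray(hu) + intercept

# Example: voxel HU values from images reconstructed with different kernels map to
# slightly different densities, which propagates into bone quantity outcomes and
# finite-element estimates such as stiffness and failure load.
print(hu_to_bmd([100.0, 300.0]))
```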
Non-standard analysis and embedded software
NASA Technical Reports Server (NTRS)
Platek, Richard
1995-01-01
One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems that have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be worked out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack that we believe will be fruitful.
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvements for the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. The ZOM plane prediction model was improved for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements for possible future incorporation into the CICM computer program.
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimation obtained from the standard MC. Compared to the standard MC, however, the MLMC greatly reduces the computational costs in the uncertainty reduction estimation, with up to 600 days of cost savings when one processor is used.
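As a rough illustration of the multilevel idea discussed above, the following sketch estimates an expectation with the MLMC telescoping sum, using a toy model hierarchy in place of the subsurface simulator; the level functions, sample counts, and model are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, level):
    """Toy 'simulator': accuracy (and cost) increase with level.
    Stands in for a low/high-fidelity subsurface model hierarchy."""
    n_steps = 2 ** (level + 2)                    # finer discretization at higher levels
    t = np.linspace(0.0, 1.0, n_steps)
    return np.trapz(np.exp(-theta * t), t)        # some scalar prediction

def mlmc_estimate(n_samples_per_level):
    """Multilevel Monte Carlo estimate of E[model(theta, finest level)] via the
    telescoping sum E[P_0] + sum_l E[P_l - P_{l-1}]."""
    estimate = 0.0
    for level, n in enumerate(n_samples_per_level):
        thetas = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # parameter uncertainty
        fine = np.array([model(th, level) for th in thetas])
        if level == 0:
            corrections = fine                                 # E[P_0]
        else:
            coarse = np.array([model(th, level - 1) for th in thetas])
            corrections = fine - coarse                        # E[P_l - P_{l-1}]
        estimate += corrections.mean()
    return estimate

# Many cheap coarse samples, few expensive fine samples.
print(mlmc_estimate([4000, 400, 40]))
```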
1993-03-01
values themselves. The tools perform risk-adjusted present-value comparisons and compute the ROI using discount factors. The assessment of risk in a...developed X Window system, the de facto industry-standard window system in the UNIX environment. An X-terminal's use is limited to display. It has no...2.1 IT HARDWARE: The DOS-based PC used in this analysis costs $2,060. It includes an ASL 486DX-33 Industry Standard Architecture (ISA) computer with 8
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
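A minimal sketch of the kind of simulation described above: generate realizations of a stationary Gaussian process with an exponential autocorrelation and estimate the probability of staying below a crossing level for a given length of time. The autocorrelation form, parameter values, and crossing level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar1(n, rho, sigma=1.0):
    """Stationary Gaussian process with autocorrelation rho**k (discrete AR(1))."""
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma)
    innovation_sd = sigma * np.sqrt(1.0 - rho ** 2)
    for k in range(1, n):
        x[k] = rho * x[k - 1] + rng.normal(0.0, innovation_sd)
    return x

def prob_no_crossing(rho, level, length, n_real=2000):
    """Estimate the probability that the process stays below 'level'
    for 'length' consecutive samples."""
    stays_below = 0
    for _ in range(n_real):
        x = simulate_ar1(length, rho)
        stays_below += np.all(x < level)
    return stays_below / n_real

print(prob_no_crossing(rho=0.9, level=2.0, length=200))
```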
Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.
ERIC Educational Resources Information Center
Raymond, Margaret; And Others
1983-01-01
Describes an experiment on the simultaneous determination of chromium and magnesium by spectophotometry modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…
Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.
2013-01-01
Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689
Standard surface-reflectance model and illuminant estimation
NASA Technical Reports Server (NTRS)
Tominaga, Shoji; Wandell, Brian A.
1989-01-01
A vector analysis technique was adopted to test the standard reflectance model. A computational model was developed to determine the components of the observed spectra and an estimate of the illuminant was obtained without using a reference white standard. The accuracy of the standard model is evaluated.
Programmable calculator software for computation of the plasma binding of ligands.
Conner, D P; Rocci, M L; Larijani, G E
1986-01-01
The computation of the extent of plasma binding of a ligand to plasma constituents using radiolabeled ligand and equilibrium dialysis is complex and tedious. A computer program for the HP-41C Handheld Computer Series (Hewlett-Packard) was developed to perform these calculations. The first segment of the program constructs a standard curve for quench correction of post-dialysis plasma and buffer samples, using either external standard ratio (ESR) or sample channels ratio (SCR) techniques. The remainder of the program uses the counts per minute, SCR or ESR, and post-dialysis volume of paired plasma and buffer samples generated from the dialysis procedure to compute the extent of binding after correction for background radiation, counting efficiency, and intradialytic shifts of fluid between plasma and buffer compartments during dialysis. This program greatly simplifies the analysis of equilibrium dialysis data and has been employed in the analysis of dexamethasone binding in normal and uremic sera.
Yan, W Y; Li, L; Yang, Y G; Lin, X L; Wu, J Z
2016-08-01
We designed a computer-based respiratory sound analysis system to identify normal pediatric lung sounds, and sought to verify the validity of this system. First, we downloaded standard lung sounds from a network database (website: http://www.easyauscultation.com/lung-sounds-reference-guide) and recorded 3 samples of abnormal lung sounds (rhonchi, wheeze, and crackles) from three patients of the Department of Pediatrics, the First Affiliated Hospital of Xiamen University. We regarded these lung sounds as "reference lung sounds". The "test lung sounds" were recorded from 29 children from the Kindergarten of Xiamen University. We recorded lung sounds with a portable electronic stethoscope, and valid lung sounds were selected by manual identification. We introduced Mel-frequency cepstral coefficients (MFCC) to extract lung sound features and dynamic time warping (DTW) for signal classification. We had 39 standard lung sounds and recorded 58 test lung sounds. The system was tested on 58 lung sound recognitions, with 52 correct identifications and 6 errors; accuracy was 89.7%. Based on MFCC and DTW, our computer-based respiratory sound analysis system can effectively identify healthy lung sounds of children (accuracy of 89.7%), demonstrating the reliability of the lung sound analysis system.
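The pipeline named above (MFCC features plus DTW matching) is standard in audio classification. Below is a minimal sketch of that approach in Python; the librosa feature call is real, but the file names, parameters, and nearest-template decision rule are illustrative assumptions rather than the authors' system.

```python
import numpy as np
import librosa

def mfcc_features(path, sr=8000, n_mfcc=13):
    """Load a lung-sound recording and return its MFCC matrix (n_mfcc x frames)."""
    y, sr = librosa.load(path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two MFCC sequences
    (frames compared by Euclidean distance)."""
    n, m = a.shape[1], b.shape[1]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[:, i - 1] - b[:, j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Hypothetical reference templates (normal, rhonchi, wheeze, crackles) and one test sound.
references = {name: mfcc_features(f"{name}.wav") for name in
              ["normal", "rhonchi", "wheeze", "crackles"]}
test = mfcc_features("test_child_01.wav")
label = min(references, key=lambda name: dtw_distance(test, references[name]))
print("closest reference class:", label)
```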
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
ERIC Educational Resources Information Center
Nelson, Gena; Powell, Sarah R.
2018-01-01
Though proficiency with computation is highly emphasized in national mathematics standards, students with mathematics difficulty (MD) continue to struggle with computation. To learn more about the differences in computation error patterns between typically achieving students and students with MD, we assessed 478 third-grade students on a measure…
Computer Programs for the Semantic Differential: Further Modifications.
ERIC Educational Resources Information Center
Lawson, Edwin D.; And Others
The original nine programs for semantic differential analysis have been condensed into three programs which have been further refined and augmented. They yield: (1) means, standard deviations, and standard errors for each subscale on each concept; (2) Evaluation, Potency, and Activity (EPA) means, standard deviations, and standard errors; (3)…
Computer analysis of railcar vibrations
NASA Technical Reports Server (NTRS)
Vlaminck, R. R.
1975-01-01
Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
Some Computer-Based Developments in Sociology.
ERIC Educational Resources Information Center
Heise, David R.; Simmons, Roberta G.
1985-01-01
Discusses several ways in which computers are being used in sociology and how they continue to change this discipline. Areas considered include data collection, data analysis, simulations of social processes based on mathematical models, and problem areas (including standardization concerns, training, and the financing of computing facilities).…
Explorations in Statistics: The Analysis of Ratios and Normalized Data
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2013-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…
Effect size calculation in meta-analyses of psychotherapy outcome research.
Hoyt, William T; Del Re, A C
2018-05-01
Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
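For readers implementing the effect-size step described above, the sketch below computes the standardized mean difference from group means and SDs with the usual small-sample correction (Hedges' g). It is a generic textbook computation, not code from the article; the example numbers are made up.

```python
import numpy as np

def smd_hedges_g(m_treat, sd_treat, n_treat, m_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference with pooled SD and Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups.
    sd_pooled = np.sqrt(((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
                        / (n_treat + n_ctrl - 2))
    d = (m_treat - m_ctrl) / sd_pooled                       # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n_treat + n_ctrl - 2) - 1.0)     # small-sample correction
    g = j * d
    # Approximate sampling variance of g (used for inverse-variance weighting).
    var_g = j**2 * ((n_treat + n_ctrl) / (n_treat * n_ctrl)
                    + d**2 / (2.0 * (n_treat + n_ctrl)))
    return g, var_g

print(smd_hedges_g(m_treat=12.0, sd_treat=5.0, n_treat=40,
                   m_ctrl=9.5, sd_ctrl=5.5, n_ctrl=42))
```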
Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2017-08-01
Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings and the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
A Survey of CAD/CAM Technology Applications in the U.S. Shipbuilding Industry
1984-01-01
...operation for drafting. Computer Aided Engineering (CAE) analysis is used primarily to determine the validity of design characteristics and production... include time standard generation, sea trial analysis, and group... Systems integration is the largest problem involving software packages. While no systems surveyed are truly integrated, many are interfaced. Computer Aided Design (CAD) is the technology... the most interfaced category, with links...
Computer-Aided Recognition of Facial Attributes for Fetal Alcohol Spectrum Disorders.
Valentine, Matthew; Bihm, Dustin C J; Wolf, Lior; Hoyme, H Eugene; May, Philip A; Buckley, David; Kalberg, Wendy; Abdul-Rahman, Omar A
2017-12-01
To compare the detection of facial attributes by computer-based facial recognition software of 2-D images against standard, manual examination in fetal alcohol spectrum disorders (FASD). Participants were gathered from the Fetal Alcohol Syndrome Epidemiology Research database. Standard frontal and oblique photographs of children were obtained during a manual, in-person dysmorphology assessment. Images were submitted for facial analysis conducted by the facial dysmorphology novel analysis technology (an automated system), which assesses ratios of measurements between various facial landmarks to determine the presence of dysmorphic features. Manual blinded dysmorphology assessments were compared with those obtained via the computer-aided system. Areas under the curve values for individual receiver-operating characteristic curves revealed the computer-aided system (0.88 ± 0.02) to be comparable to the manual method (0.86 ± 0.03) in detecting patients with FASD. Interestingly, cases of alcohol-related neurodevelopmental disorder (ARND) were identified more efficiently by the computer-aided system (0.84 ± 0.07) in comparison to the manual method (0.74 ± 0.04). A facial gestalt analysis of patients with ARND also identified more generalized facial findings compared to the cardinal facial features seen in more severe forms of FASD. We found there was an increased diagnostic accuracy for ARND via our computer-aided method. As this category has been historically difficult to diagnose, we believe our experiment demonstrates that facial dysmorphology novel analysis technology can potentially improve ARND diagnosis by introducing a standardized metric for recognizing FASD-associated facial anomalies. Earlier recognition of these patients will lead to earlier intervention with improved patient outcomes. Copyright © 2017 by the American Academy of Pediatrics.
An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Evans, Katherine
2018-03-01
Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost savings of the MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the standard MC estimation. Compared to the standard MC, however, the MLMC greatly reduces the computational costs.
A general concept for consistent documentation of computational analyses
Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.
2015-01-01
The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
Blanks: a computer program for analyzing furniture rough-part needs in standard-size blanks
Philip A. Araman
1983-01-01
A computer program is described that allows a company to determine the number of edge-glued, standard-size blanks required to satisfy its rough-part needs for a given production period. Yield and cost information also is determined by the program. A list of the program inputs, outputs, and uses of outputs is described, and an example analysis with sample output is...
Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data
Yang, Yan; Simpson, Douglas
2010-01-01
Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
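As a concrete instance of the inflated-mixture framework described above, the sketch below fits a zero-inflated Poisson model by maximizing the log-likelihood with a quasi-Newton optimizer. It is a minimal illustration under simplifying assumptions (no covariates, no GEE extension), not the authors' unified implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson model.
    params = (logit of inflation probability pi, log of Poisson mean lam)."""
    pi, lam = expit(params[0]), np.exp(params[1])
    loglik_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))            # P(Y = 0)
    loglik_pos = (np.log(1.0 - pi) - lam + y * np.log(lam)
                  - gammaln(y + 1.0))                               # P(Y = k), k > 0
    return -np.sum(np.where(y == 0, loglik_zero, loglik_pos))

# Simulated boundary-inflated counts: excess zeros mixed with Poisson(3) counts.
rng = np.random.default_rng(2)
y = np.where(rng.random(500) < 0.3, 0, rng.poisson(3.0, 500))

fit = minimize(zip_negloglik, x0=np.array([0.0, 0.0]), args=(y,), method="BFGS")
pi_hat, lam_hat = expit(fit.x[0]), np.exp(fit.x[1])
# Approximate standard errors (transformed scale) from the optimizer's inverse-Hessian estimate.
se = np.sqrt(np.diag(fit.hess_inv))
print(pi_hat, lam_hat, se)
```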
Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A. Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R.; Nestle, Frank O.
2013-01-01
Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases. PMID:23843942
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U. S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stagedischarge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
ERIC Educational Resources Information Center
Tuncer, Murat
2013-01-01
The present research investigates reciprocal relations among computer self-efficacy, scientific research, and information literacy self-efficacy. Research findings demonstrated that, according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…
Distributed-Memory Computing With the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Riley, Christopher J.; Cheatwood, F. McNeil
1997-01-01
The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), a Navier-Stokes solver, has been modified for use in a parallel, distributed-memory environment using the Message-Passing Interface (MPI) standard. A standard domain decomposition strategy is used in which the computational domain is divided into subdomains with each subdomain assigned to a processor. Performance is examined on dedicated parallel machines and a network of desktop workstations. The effect of domain decomposition and frequency of boundary updates on performance and convergence is also examined for several realistic configurations and conditions typical of large-scale computational fluid dynamic analysis.
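The domain-decomposition strategy described above, in which each processor owns a subdomain and periodically exchanges boundary data with its neighbors, can be illustrated in a few lines of mpi4py. This is a generic 1-D halo-exchange sketch under MPI, not the LAURA code; the block size and update scheme are assumptions.

```python
# Run with, e.g.: mpiexec -n 4 python halo_demo.py   (file name is hypothetical)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns an interior block of the 1-D computational domain
# plus one ghost (halo) cell on each side for neighbor data.
n_local = 100
u = np.full(n_local + 2, float(rank))

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for sweep in range(10):          # relaxation sweeps between boundary updates
    # Exchange boundary values with neighbors (the "boundary update").
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Simple Jacobi-like relaxation on interior cells using the halo data.
    u[1:-1] = 0.5 * (u[:-2] + u[2:])
```

Reducing the frequency of the two Sendrecv calls (e.g., every few sweeps) trades communication cost against convergence rate, which is the effect examined in the abstract.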
Kernel analysis in TeV gamma-ray selection
NASA Astrophysics Data System (ADS)
Moriarty, P.; Samuelson, F. W.
2000-06-01
We discuss the use of kernel analysis as a technique for selecting gamma-ray candidates in Atmospheric Cherenkov astronomy. The method is applied to observations of the Crab Nebula and Markarian 501 recorded with the Whipple 10 m Atmospheric Cherenkov imaging system, and the results are compared with the standard Supercuts analysis. Since kernel analysis is computationally intensive, we examine approaches to reducing the computational load. Extension of the technique to estimate the energy of the gamma-ray primary is considered.
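A minimal sketch of kernel-density-based gamma/hadron selection of the kind described above: kernel density estimates are built from labeled training events (e.g., image parameters), and events are kept when the gamma-ray density exceeds the background density. The feature distributions and threshold are assumptions for illustration, not the Whipple analysis itself.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Hypothetical 2-D image parameters (e.g., width and length) for training samples.
gamma_train = rng.normal(loc=[0.10, 0.25], scale=[0.02, 0.05], size=(2000, 2)).T
hadron_train = rng.normal(loc=[0.16, 0.40], scale=[0.05, 0.12], size=(2000, 2)).T

kde_gamma = gaussian_kde(gamma_train)     # density of gamma-like events
kde_hadron = gaussian_kde(hadron_train)   # density of background events

def select(events, threshold=1.0):
    """Keep events whose gamma-to-background kernel density ratio exceeds the threshold."""
    ratio = kde_gamma(events) / kde_hadron(events)
    return ratio > threshold

candidates = rng.normal(loc=[0.12, 0.30], scale=[0.04, 0.10], size=(10, 2)).T
print(select(candidates))
```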
Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F
2015-11-01
We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease suggests that it may be feasible to develop a fully-automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.
Effects of computer-based training on procedural modifications to standard functional analyses.
Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.
Linear stability analysis of detonations via numerical computation and dynamic mode decomposition
NASA Astrophysics Data System (ADS)
Kabanov, Dmitry I.; Kasimov, Aslan R.
2018-03-01
We introduce a new method to investigate linear stability of gaseous detonations that is based on an accurate shock-fitting numerical integration of the linearized reactive Euler equations with a subsequent analysis of the computed solution via the dynamic mode decomposition. The method is applied to the detonation models based on both the standard one-step Arrhenius kinetics and two-step exothermic-endothermic reaction kinetics. Stability spectra for all cases are computed and analyzed. The new approach is shown to be a viable alternative to the traditional normal-mode analysis used in detonation theory.
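The dynamic mode decomposition step referred to above has a standard SVD-based formulation. The sketch below extracts DMD eigenvalues from a sequence of solution snapshots; growth rates and frequencies then follow from the logarithm of the eigenvalues. It is a generic DMD implementation run on synthetic data, not the authors' shock-fitting solver.

```python
import numpy as np

def dmd_eigenvalues(snapshots, dt, rank=None):
    """Exact DMD: snapshots is an (n_state x n_time) array of linearized solution
    states sampled every dt; returns continuous-time eigenvalues sigma + i*omega."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:                      # optional truncation
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank, :]
    # Low-rank approximation of the linear operator mapping X to Y.
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    mu = np.linalg.eigvals(A_tilde)           # discrete-time eigenvalues
    return np.log(mu) / dt                    # growth rate (Re) and frequency (Im)

# Synthetic test: one growing and one decaying oscillatory mode, as in an unstable detonation.
dt, t = 0.01, np.arange(0, 10, 0.01)
x = np.linspace(0, 1, 64)[:, None]
data = np.real(np.exp((0.2 + 2.0j) * t) * np.sin(np.pi * x)) \
     + 0.3 * np.real(np.exp((-0.1 + 5.0j) * t) * np.sin(2 * np.pi * x))
print(dmd_eigenvalues(data, dt, rank=4))
```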
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
An approach to quantum-computational hydrologic inverse analysis.
O'Malley, Daniel
2018-05-02
Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
Standards guide for space and earth sciences computer software
NASA Technical Reports Server (NTRS)
Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.
1972-01-01
Guidelines for the preparation of systems analysis and programming work statements are presented. The material is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.
Some computational techniques for estimating human operator describing functions
NASA Technical Reports Server (NTRS)
Levison, W. H.
1986-01-01
Computational procedures for improving the reliability of human operator describing functions are described. Special attention is given to the estimation of standard errors associated with mean operator gain and phase shift as computed from an ensemble of experimental trials. This analysis pertains to experiments using sum-of-sines forcing functions. Both open-loop and closed-loop measurement environments are considered.
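A minimal sketch of the measurement idea above: with a sum-of-sines forcing function, the operator's gain and phase are read off the Fourier transforms of input and output at the forcing frequencies, and standard errors of the mean gain are computed across an ensemble of trials. The signal names, frequencies, and toy operator are illustrative assumptions.

```python
import numpy as np

def describing_function(inp, out, fs, forcing_freqs_hz):
    """Gain (dB) and phase (deg) of the operator at the sum-of-sines frequencies."""
    n = len(inp)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    I, O = np.fft.rfft(inp), np.fft.rfft(out)
    idx = [np.argmin(np.abs(freqs - f)) for f in forcing_freqs_hz]
    h = O[idx] / I[idx]                              # complex frequency response
    return 20.0 * np.log10(np.abs(h)), np.degrees(np.angle(h))

def ensemble_stats(per_trial_values):
    """Mean and standard error of the mean at each frequency across trials."""
    values = np.asarray(per_trial_values)            # shape: (n_trials, n_freqs)
    return values.mean(axis=0), values.std(axis=0, ddof=1) / np.sqrt(values.shape[0])

# Hypothetical usage with simulated trials:
fs, freqs_hz = 100.0, [0.2, 0.5, 1.1, 2.3]
t = np.arange(0, 60, 1.0 / fs)
gains = []
for trial in range(8):
    forcing = sum(np.sin(2 * np.pi * f * t + trial) for f in freqs_hz)
    response = 0.8 * np.roll(forcing, 20) + 0.05 * np.random.randn(len(t))  # toy operator
    gains.append(describing_function(forcing, response, fs, freqs_hz)[0])
print(ensemble_stats(gains))
```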
An XML-Based Protocol for Distributed Event Services
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
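As a rough illustration of the kind of XML representation discussed above, the snippet below serializes a single performance event with Python's standard library; the element and attribute names are hypothetical, since the paper's actual schema is not reproduced here.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def make_event(source, name, value, units):
    """Build one hypothetical performance event as an XML element."""
    event = ET.Element("event", attrib={
        "source": source,                                   # producing grid resource
        "timestamp": datetime.now(timezone.utc).isoformat()
    })
    metric = ET.SubElement(event, "metric", attrib={"name": name, "units": units})
    metric.text = str(value)
    return event

event = make_event("compute-node-17", "cpu_load_1min", 0.82, "fraction")
print(ET.tostring(event, encoding="unicode"))
# e.g. <event source="compute-node-17" timestamp="...">
#        <metric name="cpu_load_1min" units="fraction">0.82</metric></event>
```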
2015-06-01
...events was ad hoc and problematic due to time constraints and changing requirements. Determining errors in context and heuristics required expertise... Data reduction for analysis of Command, Control, Communications, and Computer (C4) network tests...
Real time computer data system for the 40 x 80 ft wind tunnel facility at Ames Research Center
NASA Technical Reports Server (NTRS)
Cambra, J. M.; Tolari, G. P.
1974-01-01
The wind tunnel realtime computer system is a distributed data gathering system that features a master computer subsystem, a high speed data gathering subsystem, a quick look dynamic analysis and vibration control subsystem, an analog recording back-up subsystem, a pulse code modulation (PCM) on-board subsystem, a communications subsystem, and a transducer excitation and calibration subsystem. The subsystems are married to the master computer through an executive software system and standard hardware and FORTRAN software interfaces. The executive software system has four basic software routines. These are the playback, setup, record, and monitor routines. The standard hardware interfaces along with the software interfaces provide the system with the capability of adapting to new environments.
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
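A minimal sketch of the analysis steps named above (standardization, correlation-distance hierarchical clustering with UPGMA linkage, and principal components) using standard scientific Python on a random expression matrix; it mimics the described workflow rather than reproducing the CLUSFAVOR program.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(4)
expression = rng.normal(size=(200, 12))          # 200 genes x 12 arrays (toy data)

# Standardize each gene profile (zero mean, unit variance across arrays).
z = (expression - expression.mean(axis=1, keepdims=True)) \
    / expression.std(axis=1, ddof=1, keepdims=True)

# Hierarchical clustering with correlation distance and UPGMA (average) linkage.
dist = pdist(z, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=10, criterion="maxclust")

# Principal-component analysis of the standardized profiles via SVD.
u, s, vt = np.linalg.svd(z - z.mean(axis=0), full_matrices=False)
explained = s**2 / np.sum(s**2)

print(np.bincount(clusters)[1:], explained[:3])
```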
Xu, Rende; Li, Chenguang; Qian, Juying; Ge, Junbo
2015-11-01
Invasive fractional flow reserve (FFR) is the gold standard for the determination of physiologic stenosis severity and the need for revascularization. FFR computed from standard acquired coronary computed tomographic angiography datasets (FFRCT) is an emerging technology which allows calculation of FFR using resting image data from coronary computed tomographic angiography (CCTA). However, the diagnostic accuracy of FFRCT in the evaluation of lesion-specific myocardial ischemia remains to be confirmed, especially in patients with intermediate coronary stenosis. We performed an integrated analysis of data from 3 prospective, international, and multicenter trials, which assessed the diagnostic performance of FFRCT using invasive FFR as a reference standard. Three studies evaluating 609 patients and 1050 vessels were included. The total calculated sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of FFRCT were 82.8%, 77.7%, 60.8%, 91.6%, and 79.2%, respectively, for the per-vessel analysis, and 89.4%, 70.5%, 69.7%, 89.7%, and 78.7%, respectively, for the per-patient analysis. Compared with CCTA alone, FFRCT demonstrated significantly improved accuracy (P < 0.001) in detecting lesion-specific ischemia. In patients with intermediate coronary stenosis, FFRCT remained both highly sensitive and specific with respect to the diagnosis of ischemia. In conclusion, FFRCT appears to be a reliable noninvasive alternative to invasive FFR, as it demonstrates high accuracy in the determination of anatomy and lesion-specific ischemia, which justifies the performance of additional randomized controlled trials to evaluate both the clinical benefits and the cost-effectiveness of FFRCT-guided coronary revascularization.
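For reference, the pooled sensitivity, specificity, and predictive values quoted above follow from the standard confusion-matrix definitions. The short helper below computes them from counts; the counts shown are made up for illustration, not the study data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and accuracy from confusion-matrix counts,
    with invasive FFR taken as the reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical per-vessel counts (ischemic by invasive FFR vs. classified by FFRCT):
print(diagnostic_metrics(tp=80, fp=20, fn=15, tn=185))
```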
Numerical Simulation Of Cutting Of Gear Teeth
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Huston, Ronald L.; Mavriplis, Dimitrios
1994-01-01
Shapes of gear teeth produced by gear cutters of specified shape simulated computationally, according to approach based on principles of differential geometry. Results of computer simulation displayed as computer graphics and/or used in analyses of design, manufacturing, and performance of gears. Applicable to both standard and non-standard gear-tooth forms. Accelerates and facilitates analysis of alternative designs of gears and cutters. Simulation extended to study generation of surfaces other than gears. Applied to cams, bearings, and surfaces of arbitrary rolling elements as well as to gears. Possible to develop analogous procedures for simulating manufacture of skin surfaces like automobile fenders, airfoils, and ship hulls.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.
Development of computer software for pavement life cycle cost analysis.
DOT National Transportation Integrated Search
1988-01-01
The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...
High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung
A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.
NASA Technical Reports Server (NTRS)
Voigt, S.
1975-01-01
The use of software engineering aids in the design of a structural finite-element analysis computer program for the STAR-100 computer is described. Nested functional diagrams to aid in communication among design team members were used, and a standardized specification format to describe modules designed by various members was adopted. This is a report of current work in which use of the functional diagrams provided continuity and helped resolve some of the problems arising in this long-running part-time project.
Equilibrium paths analysis of materials with rheological properties by using the chaos theory
NASA Astrophysics Data System (ADS)
Bednarek, Paweł; Rządkowski, Jan
2018-01-01
Numerical equilibrium-path analysis of a material with random rheological properties using standard procedures and specialist computer programs was not successful. A proper solution for the analysed heuristic model of the material was obtained using elements of chaos theory and neural networks. The paper discusses the mathematical basis of the computer programs used and elaborates the properties of the attractor employed in the analysis. Results of the numerical analysis are presented in both numerical and graphical form for the procedures used.
Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.
Kemp, Pavlina S; VanderVeen, Deborah K
2016-01-01
The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
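The abstract refers to quantifying vascular dilation and tortuosity from extracted vessels; one tortuosity parameter commonly used in such work is the ratio of a vessel segment's arc length to its chord length. The sketch below assumes the vessel centerline has already been segmented into (x, y) points, which is the hard part the reviewed systems automate.

    import numpy as np

    def tortuosity_index(centerline_xy):
        """Arc length / chord length for a vessel centerline given as an (N, 2) array."""
        pts = np.asarray(centerline_xy, dtype=float)
        seglens = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        arc = seglens.sum()
        chord = np.linalg.norm(pts[-1] - pts[0])
        return arc / chord   # 1.0 for a straight vessel; larger values indicate more tortuosity

    # Illustrative wavy centerline.
    t = np.linspace(0.0, 1.0, 100)
    curve = np.column_stack([t, 0.05 * np.sin(8 * np.pi * t)])
    print(tortuosity_index(curve))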
NASA Technical Reports Server (NTRS)
Gupta, Kajal K.
1991-01-01
The details of an integrated general-purpose finite element structural analysis computer program which is also capable of solving complex multidisciplinary problems is presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines are also available pertaining to a number of terminals.
A Computer Program for Preliminary Data Analysis
Dennis L. Schweitzer
1967-01-01
ABSTRACT. -- A computer program written in FORTRAN has been designed to summarize data. Class frequencies, means, and standard deviations are printed for as many as 100 independent variables. Cross-classifications of an observed dependent variable and of a dependent variable predicted by a multiple regression equation can also be generated.
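The same summary the 1967 FORTRAN program printed (class frequencies, means, and standard deviations for many independent variables) takes only a few lines today; the data and class limits below are made up for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=50.0, scale=10.0, size=(500, 3))   # three independent variables

    means = x.mean(axis=0)
    stds = x.std(axis=0, ddof=1)                          # sample standard deviations
    bins = np.linspace(20.0, 80.0, 7)                     # illustrative class limits
    freqs = [np.histogram(x[:, j], bins=bins)[0] for j in range(x.shape[1])]

    for j, (m, s, f) in enumerate(zip(means, stds, freqs)):
        print(f"variable {j}: mean={m:.2f}  sd={s:.2f}  class frequencies={f}")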
Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo
2016-02-01
To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the other short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with that of standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesign, microdesign, and surface features.
Ion diffusion may introduce spurious current sources in current-source density (CSD) analysis.
Halnes, Geir; Mäki-Marttunen, Tuomo; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T
2017-07-01
Current-source density (CSD) analysis is a well-established method for analyzing recorded local field potentials (LFPs), that is, the low-frequency part of extracellular potentials. Standard CSD theory is based on the assumption that all extracellular currents are purely ohmic, and thus neglects the possible impact from ionic diffusion on recorded potentials. However, it has previously been shown that in physiological conditions with large ion-concentration gradients, diffusive currents can evoke slow shifts in extracellular potentials. Using computer simulations, we here show that diffusion-evoked potential shifts can introduce errors in standard CSD analysis, and can lead to prediction of spurious current sources. Further, we here show that the diffusion-evoked prediction errors can be removed by using an improved CSD estimator which accounts for concentration-dependent effects. NEW & NOTEWORTHY Standard CSD analysis does not account for ionic diffusion. Using biophysically realistic computer simulations, we show that unaccounted-for diffusive currents can lead to the prediction of spurious current sources. This finding may be of strong interest for in vivo electrophysiologists doing extracellular recordings in general, and CSD analysis in particular. Copyright © 2017 the American Physiological Society.
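For context, the standard (purely ohmic) CSD estimator that the paper shows can be fooled by diffusion is essentially the negative second spatial derivative of the LFP along the probe, scaled by the extracellular conductivity. A minimal sketch with synthetic potentials is given below; the conductivity and spacing values are nominal, and this is not the improved concentration-dependent estimator proposed by the authors.

    import numpy as np

    def standard_csd(lfp, h, sigma=0.3):
        """Traditional CSD estimate: -sigma times the second spatial difference of the LFP.

        lfp   : (channels, samples) potentials along a laminar probe
        h     : inter-electrode spacing
        sigma : extracellular conductivity (assumed constant and purely ohmic)
        """
        d2 = lfp[:-2, :] - 2.0 * lfp[1:-1, :] + lfp[2:, :]
        return -sigma * d2 / h**2

    # Synthetic example: 16 channels, 1000 samples of a smooth potential profile.
    z = np.linspace(0.0, 1.5, 16)[:, None]
    t = np.linspace(0.0, 1.0, 1000)[None, :]
    phi = np.exp(-((z - 0.75) ** 2) / 0.1) * np.sin(2 * np.pi * 5 * t)
    csd = standard_csd(phi, h=0.1)
    print(csd.shape)   # (14, 1000): estimate is available only at interior channels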
Vascular surgical data registries for small computers.
Kaufman, J L; Rosenberg, N
1984-08-01
Recent designs for computer-based vascular surgical registries and clinical data bases have employed large centralized systems with formal programming and mass storage. Small computers, of the types created for office use or for word processing, now contain sufficient speed and memory storage capacity to allow construction of decentralized office-based registries. Using a standardized dictionary of terms and a method of data organization adapted to word processing, we have created a new vascular surgery data registry, "VASREG." Data files are organized without programming, and a limited number of powerful logical statements in English are used for sorting. The capacity is 25,000 records with current inexpensive memory technology. VASREG is adaptable to computers made by a variety of manufacturers, and interface programs are available for conversion of the word processor formatted registry data into forms suitable for analysis by programs written in a standard programming language. This is a low-cost clinical data registry available to any physician. With a standardized dictionary, preparation of regional and national statistical summaries may be facilitated.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.
2014-01-01
Intermagnet is an international oversight group which exists to establish a global network for geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard, for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate further such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC Equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
NASA Technical Reports Server (NTRS)
Nagy, S.
1988-01-01
Due to extraordinary distances scanned by modern telescopes, optical surfaces in such telescopes must be manufactured to unimaginable standards of perfection of a few thousandths of a centimeter. The detection of imperfections of less than 1/20 of a wavelength of light, for application in the building of the mirror for the Space Infrared Telescope Facility, was undertaken. Because the mirror must be kept very cold while in space, another factor comes into effect: cryogenics. The process to test a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished through computer work. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed through the process from the laser interferometry in the lab, to computer analysis via a computer program called FRINGE. This analysis via FRINGE is detailed.
NASA Technical Reports Server (NTRS)
Montegani, F. J.
1974-01-01
Methods of handling one-third-octave band noise data originating from the outdoor full-scale fan noise facility and the engine acoustic facility at the Lewis Research Center are presented. Procedures for standardizing, retrieving, extrapolating, and reporting these data are explained. Computer programs are given which are used to accomplish these and other noise data analysis tasks. This information is useful as background for interpretation of data from these facilities appearing in NASA reports and can aid data exchange by promoting standardization.
Recent Updates to the CFD General Notation System (CGNS)
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Wedan, Bruce; Hauser, Thomas; Poinot, Marc
2012-01-01
The CFD General Notation System (CGNS) - a general, portable, and extensible standard for the storage and retrieval of computational fluid dynamics (CFD) analysis data has been in existence for more than a decade (Version 1.0 was released in May 1998). Both structured and unstructured CFD data are covered by the standard, and CGNS can be easily extended to cover any sort of data imaginable, while retaining backward compatibility with existing CGNS data files and software. Although originally designed for CFD, it is readily extendable to any field of computational analysis. In early 2011, CGNS Version 3.1 was released, which added significant capabilities. This paper describes these recent enhancements and highlights the continued usefulness of the CGNS methodology.
Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W
1987-01-01
The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390
Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr
2017-01-01
The aim of the study was to check the quality of a computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. A p-value of p < 0.05 was considered statistically significant. Statistically significant differences were found between all of the investigated sperm parameters, except for non-progressive motility, measured with CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample we found no significant differences between the two assessments of the same sample, neither in the samples analyzed manually nor with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.
A dc model for power switching transistors suitable for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Wilson, P. M.; George, R. T., Jr.; Owen, H. A., Jr.; Wilson, T. G.
1979-01-01
The proposed dc model for bipolar junction power switching transistors is based on measurements which may be made with standard laboratory equipment. Those nonlinearities which are of importance to power electronics design are emphasized. Measurement procedures are discussed in detail. A model formulation adapted for use with a computer program is presented, and a comparison between actual and computer-generated results is made.
A multi-standard approach for GIAO (13)C NMR calculations.
Sarotti, Ariel M; Pellegrinet, Silvina C
2009-10-02
The influence of the reference standard employed in the calculation of (13)C NMR chemical shifts was investigated over a large variety of known organic compounds, using different quantum chemistry methods and basis sets. After detailed analysis of the collected data, we found that methanol and benzene are excellent reference standards for computing NMR shifts of sp(3)- and sp-sp(2)-hybridized carbon atoms, respectively. This multi-standard approach (MSTD) performs better than TMS in terms of accuracy and precision and also displays much lower dependence on the level of theory employed. The use of mPW1PW91/6-31G(d)//mPW1PW91/6-31G(d) level is recommended for accurate (13)C NMR chemical shift prediction at low computational cost.
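The bookkeeping behind such calculations is simple: a carbon's predicted shift is the computed isotropic shielding of the reference minus that of the carbon, plus the experimental shift of the reference when it is not TMS. The sketch below uses placeholder shielding values (not actual mPW1PW91/6-31G(d) outputs) and approximate experimental shifts for the two secondary references.

    # Approximate experimental 13C shifts of the secondary references (ppm vs TMS):
    # methanol for sp3 carbons, benzene for sp/sp2 carbons.
    DELTA_EXP = {"methanol": 50.4, "benzene": 128.4}

    def shift_multistandard(sigma_carbon, sigma_ref, ref_name):
        """delta = sigma_ref - sigma_carbon + experimental shift of the reference."""
        return sigma_ref - sigma_carbon + DELTA_EXP[ref_name]

    # Placeholder isotropic shieldings (ppm) from a hypothetical GIAO calculation.
    sigma_benzene = 57.0       # computed shielding of the benzene carbons (assumed)
    sigma_methanol = 137.0     # computed shielding of the methanol carbon (assumed)
    sigma_sp2_carbon = 55.0    # an sp2 carbon in the molecule of interest
    sigma_sp3_carbon = 160.0   # an sp3 carbon in the molecule of interest

    print(shift_multistandard(sigma_sp2_carbon, sigma_benzene, "benzene"))
    print(shift_multistandard(sigma_sp3_carbon, sigma_methanol, "methanol"))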
NASA Astrophysics Data System (ADS)
Grzeszczuk, A.; Kowalski, S.
2015-04-01
Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase graphics performance by computing processes in parallel. The success of this solution has opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not connected with graphics. GPGPU systems can be applied as an effective tool for reducing the huge amount of data in pulse-shape-analysis measurements, either by on-line recalculation or by a very fast compression scheme. The simplified structure of the CUDA system and the programming model, based on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.
Using Computer-Based "Experiments" in the Analysis of Chemical Reaction Equilibria
ERIC Educational Resources Information Center
Li, Zhao; Corti, David S.
2018-01-01
The application of the Reaction Monte Carlo (RxMC) algorithm to standard textbook problems in chemical reaction equilibria is discussed. The RxMC method is a molecular simulation algorithm for studying the equilibrium properties of reactive systems, and therefore provides the opportunity to develop computer-based "experiments" for the…
USSAERO version D computer program development using ANSI standard FORTRAN 77 and DI-3000 graphics
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1986-01-01
The D version of the Unified Subsonic Supersonic Aerodynamic Analysis (USSAERO) program is the result of numerous modifications and enhancements to the B01 version. These changes include conversion to ANSI standard FORTRAN 77; use of the DI-3000 graphics package; removal of the overlay structure; a revised input format; the addition of an input data analysis routine; and increasing the number of aeronautical components allowed.
NASA Technical Reports Server (NTRS)
Thorp, Scott A.
1992-01-01
This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
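A compact illustration of the rejection-ABC idea described above: draw candidate (mean, SD) pairs from broad priors, simulate a sample of the reported size, and keep the candidates whose simulated median, minimum, and maximum fall close to the reported ones. This is a generic sketch, not the authors' implementation; the reported summary statistics, priors, and tolerances are made up.

    import numpy as np

    rng = np.random.default_rng(42)

    # Reported summary statistics from a hypothetical study (median, min, max, n).
    reported = {"median": 12.0, "min": 4.0, "max": 20.0, "n": 40}

    def simulate_summaries(mu, sigma, n):
        x = rng.normal(mu, sigma, size=n)
        return np.median(x), x.min(), x.max()

    accepted = []
    for _ in range(200_000):
        mu = rng.uniform(0.0, 30.0)        # broad prior on the mean
        sigma = rng.uniform(0.1, 15.0)     # broad prior on the SD
        med, lo, hi = simulate_summaries(mu, sigma, reported["n"])
        # Rejection step: keep draws whose summaries are close to the reported ones.
        if (abs(med - reported["median"]) < 0.5
                and abs(lo - reported["min"]) < 1.0
                and abs(hi - reported["max"]) < 1.0):
            accepted.append((mu, sigma))

    if accepted:
        accepted = np.array(accepted)
        print("ABC estimates: mean ~ %.2f, SD ~ %.2f (from %d accepted draws)"
              % (accepted[:, 0].mean(), accepted[:, 1].mean(), len(accepted)))
    else:
        print("No draws accepted; widen the tolerances or increase the number of draws.")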
Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A
2017-11-01
Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
Financial Analysis for R&D Decisions.
ERIC Educational Resources Information Center
Carter, Robert
1997-01-01
Using personal computer spreadsheet software, standard corporate financial analysis can help university research administrators communicate the value of research and development to sponsors and other stakeholders; balance projects, technologies, or categories of research; and continually assess the value of investing in ongoing projects. It also…
A Priori Subgrid Analysis of Temporal Mixing Layers with Evaporating Droplets
NASA Technical Reports Server (NTRS)
Okongo, Nora; Bellan, Josette
1999-01-01
Subgrid analysis of a transitional temporal mixing layer with evaporating droplets has been performed using three sets of results from a Direct Numerical Simulation (DNS) database, with Reynolds numbers (based on initial vorticity thickness) as large as 600 and with droplet mass loadings as large as 0.5. In the DNS, the gas phase is computed using a Eulerian formulation, with Lagrangian droplet tracking. The Large Eddy Simulation (LES) equations corresponding to the DNS are first derived, and key assumptions in deriving them are then confirmed by computing the terms using the DNS database. Since LES of this flow requires the computation of unfiltered gas-phase variables at droplet locations from filtered gas-phase variables at the grid points, it is proposed to model these by assuming the gas-phase variables to be the sum of the filtered variables and a correction based on the filtered standard deviation; this correction is then computed from the Subgrid Scale (SGS) standard deviation. This model predicts the unfiltered variables at droplet locations considerably better than simply interpolating the filtered variables. Three methods are investigated for modeling the SGS standard deviation: the Smagorinsky approach, the Gradient model and the Scale-Similarity formulation. When the proportionality constant inherent in the SGS models is properly calculated, the Gradient and Scale-Similarity methods give results in excellent agreement with the DNS.
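The SGS standard deviation needed for that correction can be formed, in an a priori test, directly from the DNS fields as the square root of filter(u*u) minus the square of filter(u). A one-dimensional sketch with a simple top-hat filter is shown below; the field, filter width, and proportionality constant are all illustrative rather than the models compared in the study.

    import numpy as np

    def box_filter(f, width):
        """Top-hat filter of odd width with periodic boundaries."""
        kernel = np.ones(width) / width
        return np.convolve(np.tile(f, 3), kernel, mode="same")[len(f):2 * len(f)]

    # Synthetic 1-D gas-phase field (stand-in for, e.g., temperature or vapor mass fraction).
    x = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
    u = np.sin(3.0 * x) + 0.3 * np.random.default_rng(0).standard_normal(512)

    width = 9
    u_bar = box_filter(u, width)                     # filtered (resolved) field
    sgs_var = box_filter(u * u, width) - u_bar**2    # SGS variance from the unfiltered field
    sgs_std = np.sqrt(np.clip(sgs_var, 0.0, None))   # SGS standard deviation

    # Modeled unfiltered value at a droplet location: filtered value plus a correction
    # proportional to the SGS standard deviation (the constant here is a placeholder).
    C = 1.0
    u_at_droplet = u_bar + C * sgs_std
    print(u_at_droplet[:5])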
Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.
Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E
2018-04-15
Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
In this report, the scope of the tests, the method of analysis, the results, and the conclusions are discussed. The first test indicated that the requirements generated by the Standard procedures and formulae appear to yield reasonable results, although some of the cost data provided as defaults in the Standard should be reevaluated. The second test provided experience that was useful in modifying the points compliance format, but did not uncover any procedural issues that would lead to unreasonable results. These conclusions are based on analysis using the Automated Residential Energy Standard (ARES) computer program, developed to simplify the process of standards generation.
CAD system for automatic analysis of CT perfusion maps
NASA Astrophysics Data System (ADS)
Hachaj, T.; Ogiela, M. R.
2011-03-01
In this article, the authors present novel algorithms developed for the computer-assisted diagnosis (CAD) system for analysis of dynamic brain perfusion, computed tomography (CT) maps, cerebral blood flow (CBF), and cerebral blood volume (CBV). Those methods perform both quantitative analysis [detection, measurement, and description with a brain anatomy atlas (AA) of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (decision about the type of lesion: ischemic/hemorrhagic, and whether the brain tissue is at risk of infarction or not) of visualized symptoms is done by so-called cognitive inference processes allowing for reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC computer with the .NET framework installed.
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1997-01-01
A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.
NASA Technical Reports Server (NTRS)
Roberts, Floyd E., III
1994-01-01
Software provides for control and acquisition of data from optical pyrometer. There are six individual programs in PYROLASER package. Provides quick and easy way to set up, control, and program standard Pyrolaser. Temperature and emissivity measurements either collected as if Pyrolaser in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. Shell supplied to allow macros, which are test-specific, added to system easily. Written using Labview software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running Open-Windows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.
ACSYNT - A standards-based system for parametric, computer aided conceptual design of aircraft
NASA Technical Reports Server (NTRS)
Jayaram, S.; Myklebust, A.; Gelhausen, P.
1992-01-01
A group of eight US aerospace companies together with several NASA and NAVY centers, led by NASA Ames Systems Analysis Branch, and Virginia Tech's CAD Laboratory agreed, through the assistance of the American Technology Initiative, in 1990 to form the ACSYNT (Aircraft Synthesis) Institute. The Institute is supported by a Joint Sponsored Research Agreement to continue the research and development in computer aided conceptual design of aircraft initiated by NASA Ames Research Center and Virginia Tech's CAD Laboratory. The result of this collaboration, a feature-based, parametric computer aided aircraft conceptual design code called ACSYNT, is described. The code is based on analysis routines begun at NASA Ames in the early 1970's. ACSYNT's CAD system is based entirely on the ISO standard Programmer's Hierarchical Interactive Graphics System and is graphics-device independent. The code includes a highly interactive graphical user interface, automatically generated Hermite and B-Spline surface models, and shaded image displays. Numerous features to enhance aircraft conceptual design are described.
Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner
ERIC Educational Resources Information Center
Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.
2004-01-01
The procedure to produce standard curve for starch concentration measurement by image analysis using a color scanner and computer for data acquisition and color analysis is described. Color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
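The standard curve amounts to regressing a color-channel intensity against known starch concentrations and inverting the fit for unknowns. The original work uses a Visual Basic program and scanner images; the sketch below substitutes made-up mean blue-channel readings to show the arithmetic only.

    import numpy as np

    # Known starch standards (mg/mL) and corresponding mean blue-channel intensities
    # read from scanned wells -- values are illustrative only.
    conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    blue = np.array([250.0, 214.0, 180.0, 118.0, 60.0])

    # Linear standard curve: intensity = slope * concentration + intercept.
    slope, intercept = np.polyfit(conc, blue, deg=1)

    def concentration_from_intensity(i):
        """Invert the standard curve to estimate the concentration of an unknown."""
        return (i - intercept) / slope

    print(round(concentration_from_intensity(150.0), 2))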
Description of MSFC engineering photographic analysis
NASA Technical Reports Server (NTRS)
Earle, Jim; Williams, Frank
1988-01-01
Utilizing a background that includes development of basic launch and test photographic coverage and analysis procedures, the MSFC Photographic Evaluation Group has built a body of experience that enables it to effectively satisfy MSFC's engineering photographic analysis needs. Combining the basic soundness of reliable, proven techniques of the past with newer technical advances in computers and computer-related devices, the MSFC Photo Evaluation Group is in a position to continue to provide photo and video analysis services center-wide and NASA-wide, to supply an improving photo-analysis product that meets the photo-evaluation needs of the future, and to provide new standards in the state of the art of photo analysis of dynamic events.
NASA Astrophysics Data System (ADS)
Priest, Richard Harding
A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what the effects of standardized testing are on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. It was found that the contributing factors to why science teachers do not use computers were a lack of enough up-to-date computers in their classrooms and other educational commitments and duties that do not leave them enough time to prepare lessons that include technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed more in-service technology training and follow-up activities to support that training are needed and more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.
Automatic Error Analysis Using Intervals
ERIC Educational Resources Information Center
Rothwell, E. J.; Cloud, M. J.
2012-01-01
A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
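A toy version of the interval idea: carry a [lower, upper] pair through a formula instead of propagating standard errors, so the result encloses every value the inputs could produce. The class below is a minimal stand-in, not INTLAB (which is a MATLAB toolbox), and implements only the operations the example needs.

    class Interval:
        """Minimal closed-interval arithmetic for error analysis."""
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            products = [self.lo * other.lo, self.lo * other.hi,
                        self.hi * other.lo, self.hi * other.hi]
            return Interval(min(products), max(products))

        def __truediv__(self, other):
            assert other.lo > 0 or other.hi < 0, "divisor interval must exclude zero"
            return self * Interval(1.0 / other.hi, 1.0 / other.lo)

        def __repr__(self):
            return f"[{self.lo:.6g}, {self.hi:.6g}]"

    # Example: R = V / I with V = 10.0 +/- 0.1 V and I = 2.00 +/- 0.05 A.
    V = Interval(9.9, 10.1)
    I = Interval(1.95, 2.05)
    print(V / I)   # an enclosure of all possible resistance values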
14 CFR 1215.115 - Payment and billing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... of all standard pre-launch services such as mission planning, documentation, link analysis, testing, computer, human resources, etc., with the exception of TDRSS operational services, will be paid to the...
Standardization of computer-assisted semen analysis using an e-learning application.
Ehlers, J; Behr, M; Bollwein, H; Beyerbach, M; Waberski, D
2011-08-01
Computer-assisted semen analysis (CASA) is primarily used to obtain accurate and objective kinetic sperm measurements. Additionally, AI centers use computer-assessed sperm concentration in the sample as a basis for calculating the number of insemination doses available from a given ejaculate. The reliability of data is often limited and results can vary even when the same CASA systems with identical settings are used. The objective of the present study was to develop a computer-based training module for standardized measurements with a CASA system and to evaluate its training effect on the quality of the assessment of sperm motility and concentration. A digital versatile disc (DVD) has been produced showing the standardization of sample preparation and analysis with the CASA system SpermVision™ version 3.0 (Minitube, Verona, WI, USA) in words, pictures, and videos, as well as the most probable sources of error. Eight test persons educated in spermatology, but with different levels of experience with the CASA system, prepared and assessed 10 aliquots from one prediluted bull ejaculate using the same CASA system and laboratory equipment before and after electronic learning (e-learning). After using the e-learning application, the coefficient of variation was reduced on average for the sperm concentration from 26.1% to 11.3% (P ≤ 0.01), and for motility from 5.8% to 3.1% (P ≤ 0.05). For five test persons, the difference in the coefficient of variation before and after use of the e-learning application was significant (P ≤ 0.05). Individual deviations of means from the group mean before e-learning were reduced compared with individual deviations from the group mean after e-learning. According to a survey, the e-learning application was highly accepted by users. In conclusion, e-learning presents an effective, efficient, and accepted tool for improvement of the precision of CASA measurements. This study provides a model for the standardization of other laboratory procedures using e-learning. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Casartelli, E.; Mangani, L.; Ryan, O.; Schmid, A.
2016-11-01
CFD has been part of the product development process for hydraulic machines for more than three decades. Besides the actual design process, in which the most appropriate geometry for a given task is iteratively sought, several steady-state simulations and related analyses are performed with the help of CFD. Basic transient CFD analysis is becoming more and more routine for rotor-stator interaction assessment, but in general unsteady CFD is still not standard due to the large computational effort. Especially for FSI simulations, where mesh motion is involved, a considerable amount of computational time is needed for mesh handling and deformation as well as for resolving the related unsteady flow field. Therefore this kind of CFD computation is still unusual and mostly performed during trouble-shooting analyses rather than in the standard development process, i.e. in order to understand what went wrong instead of preventing failure or, even better, increasing the available knowledge. In this paper the application of an efficient and particularly robust algorithm for fast computations with moving meshes is presented for the analysis of transient effects encountered during highly dynamic procedures in the operation of a pump-turbine, such as runaway at fixed GV position and load rejection with GV motion imposed as one-way FSI. In both cases the computations extend through the S-shape of the machine into the turbine-brake and reverse-pump domains, showing that such exotic computations can be performed on a more regular basis, even if they are quite time consuming. Besides the presentation of the procedure and global results, some highlights of the encountered flow physics are also given.
MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data
2014-01-01
Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
Computation of backwater and discharge at width constrictions of heavily vegetated flood plains
Schneider, V.R.; Board, J.W.; Colson, B.E.; Lee, F.N.; Druffel, Leroy
1977-01-01
The U.S. Geological Survey cooperated with the Federal Highway Administration and the State Highway Departments of Mississippi, Alabama, and Louisiana to develop a proposed method for computing backwater and discharge at width constrictions of heavily vegetated flood plains. Data were collected at 20 single-opening sites for 31 floods. Flood-plain width varied from 4 to 14 times the bridge opening width. The recurrence intervals of peak discharge ranged from a 2-year flood to greater than a 100-year flood, with a median interval of 6 years. Measured backwater ranged from 0.39 to 3.16 feet. Backwater computed by the present standard Geological Survey method averaged 29 percent less than the measured, and that computed by the currently used Federal Highway Administration method averaged 47 percent less than the measured. Discharge computed by the Survey method averaged 21 percent more than the measured. Analysis of data showed that the flood-plain widths and the Manning's roughness coefficients are larger than those used to develop the standard methods. A method to more accurately compute backwater and discharge was developed. The difference between the contracted and natural water-surface profiles computed using standard step-backwater procedures is defined as backwater. The energy-loss terms in the step-backwater procedure are computed as the product of the geometric mean of the energy slopes and the flow distance in the reach, a formulation derived from potential flow theory. The mean error was 1 percent when using the proposed method for computing backwater and 3 percent for computing discharge. (Woodard-USGS)
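The friction-loss term described above is just the geometric mean of the energy slopes at the two ends of a reach multiplied by the flow distance. A fragment of a step-backwater computation showing only that piece is sketched below, using Manning's equation in English units; the discharge, roughness, and cross-section numbers are illustrative, not data from the 20 study sites.

    import math

    def energy_slope(q, n, area, hyd_radius):
        """Friction slope from Manning's equation (English units, 1.486 constant)."""
        conveyance = (1.486 / n) * area * hyd_radius ** (2.0 / 3.0)
        return (q / conveyance) ** 2

    def reach_friction_loss(q, n, sec1, sec2, reach_length):
        """Energy loss over a reach: geometric mean of the end slopes times flow distance."""
        s1 = energy_slope(q, n, *sec1)
        s2 = energy_slope(q, n, *sec2)
        return reach_length * math.sqrt(s1 * s2)

    # Illustrative numbers: 12,000 ft3/s on a heavily vegetated flood plain (high n),
    # two cross sections given as (area in ft2, hydraulic radius in ft).
    print(reach_friction_loss(q=12000.0, n=0.12,
                              sec1=(6000.0, 4.0), sec2=(5200.0, 3.5),
                              reach_length=800.0))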
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract: An automated computer time distribution system broadcasts standard time to users using computers and modems via ... contributed to delays - software platform (50% of the delay), transmission speed of time codes (25%), telephone network (15%), modem and others (10%). The ... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
Application of microarray analysis on computer cluster and cloud platforms.
Bernau, C; Boulesteix, A-L; Knaus, J
2013-01-01
Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
Topics in the optimization of millimeter-wave mixers
NASA Technical Reports Server (NTRS)
Siegel, P. H.; Kerr, A. R.; Hwang, W.
1984-01-01
A user oriented computer program for the analysis of single-ended Schottky diode mixers is described. The program is used to compute the performance of a 140 to 220 GHz mixer and excellent agreement with measurements at 150 and 180 GHz is obtained. A sensitivity analysis indicates the importance of various diode and mount characteristics on the mixer performance. A computer program for the analysis of varactor diode multipliers is described. The diode operates in either the reverse biased varactor mode or with substantial forward current flow where the conversion mechanism is predominantly resistive. A description and analysis of a new H-plane rectangular waveguide transformer is reported. The transformer is made quickly and easily in split-block waveguide using a standard slitting saw. It is particularly suited for use in the millimeter-wave band, replacing conventional electroformed stepped transformers. A theoretical analysis of the transformer is given and good agreement is obtained with measurements made at X-band.
Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.
Kiran Kumar, G R; Reddy, M Ramasubba
2018-06-08
Traditional Spatial filters used for steady-state visual evoked potential (SSVEP) extraction such as minimum energy combination (MEC) require the estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal to noise ratio (SNR) conditions, it makes such algorithms slow compared to the standard detection methods like canonical correlation analysis (CCA) due to the additional computational cost. In this paper, Periodic component analysis (πCA) is presented as an alternative spatial filtering approach to extract the SSVEP component effectively without involving extensive modelling of the noise. The πCA can separate out components corresponding to a given frequency of interest from the background electroencephalogram (EEG) by capturing the temporal information and does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method and the results demonstrate that the periodic component analysis acts as a reliable spatial filter for SSVEP extraction. Statistical tests were performed to validate the results. The experimental results show that πCA provides significant improvement in accuracy compared to standard CCA and MEC in low SNR conditions. The results demonstrate that πCA provides better detection accuracy compared to CCA and on par with that of MEC at a lower computational cost. Hence πCA is a reliable and efficient alternative detection algorithm for SSVEP based brain-computer interface (BCI). Copyright © 2018. Published by Elsevier B.V.
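A hedged sketch of one common formulation of periodic component analysis: the spatial filter is the generalized eigenvector that maximizes the ratio of the lag-tau covariance to the ordinary covariance, where tau is the stimulus period in samples. This is an illustrative formulation, not necessarily the authors' exact implementation; the sampling rate, stimulus frequency, and synthetic EEG are placeholders.

    import numpy as np
    from scipy.linalg import eigh


    def pica_filter(X, tau):
        """X: channels x samples EEG, tau: target period in samples."""
        X = X - X.mean(axis=1, keepdims=True)
        C = X @ X.T / X.shape[1]                            # ordinary covariance
        A = X[:, :-tau] @ X[:, tau:].T / (X.shape[1] - tau) # lag-tau covariance
        A = 0.5 * (A + A.T)                                 # symmetrize
        evals, evecs = eigh(A, C)                           # generalized eigenproblem
        return evecs[:, -1]                                 # most periodic component


    if __name__ == "__main__":
        fs, f_stim = 250, 10.0                              # assumed rates (Hz)
        t = np.arange(5 * fs) / fs
        rng = np.random.default_rng(1)
        source = np.sin(2 * np.pi * f_stim * t)
        X = np.outer(rng.normal(size=8), source) + 0.5 * rng.normal(size=(8, t.size))
        w = pica_filter(X, tau=int(round(fs / f_stim)))
        print("variance of filtered signal:", np.var(w @ X))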
Computer Science (CS) Education in Indian Schools: Situation Analysis Using Darmstadt Model
ERIC Educational Resources Information Center
Raman, Raghu; Venkatasubramanian, Smrithi; Achuthan, Krishnashree; Nedungadi, Prema
2015-01-01
Computer science (CS) and its enabling technologies are at the heart of this information age, yet its adoption as a core subject by senior secondary students in Indian schools is low and has not reached critical mass. Though there have been efforts to create core curriculum standards for subjects like Physics, Chemistry, Biology, and Math, CS…
CUDAICA: GPU Optimization of Infomax-ICA EEG Analysis
Raimondo, Federico; Kamienkowski, Juan E.; Sigman, Mariano; Fernandez Slezak, Diego
2012-01-01
In recent years, Independent Component Analysis (ICA) has become a standard method for identifying relevant dimensions of data in neuroscience. ICA is a very reliable method for analyzing data, but it is computationally very costly, which makes its use for online analysis of data in brain-computer interfaces almost prohibitive. We show an approximately 25-fold speed-up of ICA at almost no cost (a fast video card). EEG data, which consist of repetitions of many independent signals in multiple channels, are very suitable for processing with the vector processors included in graphical units. We profiled the implementation of this algorithm and detected two main types of operations responsible for the processing bottleneck, taking almost 80% of computing time: vector-matrix and matrix-matrix multiplications. Simply replacing calls to basic linear algebra functions with the standard CUBLAS routines provided by the GPU manufacturer does not increase performance, because of CUDA kernel launch overhead. Instead, we developed a GPU-based solution that, compared with the original BLAS and CUBLAS versions, obtains a 25x increase in performance for the ICA calculation. PMID:22811699
Rotordynamics on the PC: Transient Analysis With ARDS
NASA Technical Reports Server (NTRS)
Fleming, David P.
1997-01-01
Personal computers can now do many jobs that formerly required a large mainframe computer. An example is NASA Lewis Research Center's program Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze the dynamic motion of up to five rotating shafts. As originally written in the early 1980's, this program was considered large for the mainframe computers of the time. ARDS, which was written in Fortran 77, has been successfully ported to a 486 personal computer. Plots appear on the computer monitor via calls programmed for the original CALCOMP plotter; plots can also be output on a standard laser printer. The executable code, which uses the full array sizes of the mainframe version, easily fits on a high-density floppy disk. The program runs under DOS with an extended memory manager. In addition to transient analysis of blade loss, step turns, and base acceleration, with simulation of squeeze-film dampers and rubs, ARDS calculates natural frequencies and unbalance response.
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.
1993-01-01
A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.
A Priori Subgrid Scale Modeling for a Droplet Laden Temporal Mixing Layer
NASA Technical Reports Server (NTRS)
Okongo, Nora; Bellan, Josette
2000-01-01
Subgrid analysis of a transitional temporal mixing layer with evaporating droplets has been performed using a direct numerical simulation (DNS) database. The DNS is for a Reynolds number (based on initial vorticity thickness) of 600, with droplet mass loading of 0.2. The gas phase is computed using a Eulerian formulation, with Lagrangian droplet tracking. Since Large Eddy Simulation (LES) of this flow requires the computation of unfiltered gas-phase variables at droplet locations from filtered gas-phase variables at the grid points, it is proposed to model these by assuming the gas-phase variables to be given by the filtered variables plus a correction based on the filtered standard deviation, which can be computed from the sub-grid scale (SGS) standard deviation. This model predicts unfiltered variables at droplet locations better than simply interpolating the filtered variables. Three methods are investigated for modeling the SGS standard deviation: Smagorinsky, gradient and scale-similarity. When properly calibrated, the gradient and scale-similarity methods give results in excellent agreement with the DNS.
Segmentation of Unstructured Datasets
NASA Technical Reports Server (NTRS)
Bhat, Smitha
1996-01-01
Datasets generated by computer simulations and experiments in Computational Fluid Dynamics tend to be extremely large and complex. It is difficult to visualize these datasets using standard techniques like Volume Rendering and Ray Casting. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This thesis explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and from Finite Element Analysis.
NASA Astrophysics Data System (ADS)
Wiemker, Rafael; Rogalla, Patrik; Opfer, Roland; Ekin, Ahmet; Romano, Valentina; Bülow, Thomas
2006-03-01
The performance of computer aided lung nodule detection (CAD) and computer aided nodule volumetry is compared between standard-dose (70-100 mAs) and ultra-low-dose CT images (5-10 mAs). A direct quantitative performance comparison was possible, since for each patient both an ultra-low-dose and a standard-dose CT scan were acquired within the same examination session. The data sets were recorded with a multi-slice CT scanner at the Charité university hospital Berlin with 1 mm slice thickness. Our computer aided nodule detection and segmentation algorithms were deployed on both ultra-low-dose and standard-dose CT data without any dose-specific fine-tuning or preprocessing. As a reference standard, 292 nodules from 20 patients were visually identified, each nodule both in ultra-low-dose and standard-dose data sets. The CAD performance was analyzed by means of multiple FROC curves for different lower thresholds of the nodule diameter. For nodules with a volume-equivalent diameter equal to or larger than 4 mm (149 nodule pairs), we observed a detection rate of 88% at a median false positive rate of 2 per patient in standard-dose images, and an 86% detection rate in ultra-low-dose images, also at 2 FPs per patient. Including even smaller nodules equal to or larger than 2 mm (272 nodule pairs), we observed a detection rate of 86% in standard-dose images, and an 84% detection rate in ultra-low-dose images, both at a rate of 5 FPs per patient. Moreover, we observed a correlation of 94% between the volume-equivalent nodule diameter as automatically measured on ultra-low-dose versus on standard-dose images, indicating that ultra-low-dose CT is also feasible for growth-rate assessment in follow-up examinations. The comparable performance of lung nodule CAD in ultra-low-dose and standard-dose images is of particular interest with respect to lung cancer screening of asymptomatic patients.
Nakajima, Kenichi; Matsumoto, Naoya; Kasai, Tokuo; Matsuo, Shinro; Kiso, Keisuke; Okuda, Koichi
2016-04-01
As a 2-year project of the Japanese Society of Nuclear Medicine working group activity, normal myocardial imaging databases were accumulated and summarized. Stress and rest image sets, both gated and non-gated, were accumulated for myocardial perfusion imaging and could be used for perfusion defect scoring and normal left ventricular (LV) function analysis. For single-photon emission computed tomography (SPECT) with a multi-focal collimator design, databases of supine and prone positions and computed tomography (CT)-based attenuation correction were created. The CT-based correction provided similar perfusion patterns between genders. In phase analysis of gated myocardial perfusion SPECT, a new approach for analyzing dyssynchrony, normal ranges of parameters for phase bandwidth, standard deviation and entropy were determined in four software programs. Although the results were not interchangeable, dependencies on gender, ejection fraction and volumes were common characteristics of these parameters. Standardization of (123)I-MIBG sympathetic imaging was performed regarding the heart-to-mediastinum ratio (HMR) using a calibration phantom method. The HMRs from any collimator type could be converted to the value obtained with medium-energy-comparable collimators. Appropriate quantification based on common normal databases and standard technology could play a pivotal role in clinical practice and research.
CAD/CAE Integration Enhanced by New CAD Services Standard
NASA Technical Reports Server (NTRS)
Claus, Russell W.
2002-01-01
A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing (CAM) software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.
CFD Analysis and Design Optimization Using Parallel Computers
NASA Technical Reports Server (NTRS)
Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James
1997-01-01
A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.
Analysis of backward error recovery for concurrent processes with recovery blocks
NASA Technical Reports Server (NTRS)
Shin, K. G.; Lee, Y. H.
1982-01-01
Three different methods of implementing recovery blocks (RBs) are considered: the asynchronous, synchronous, and pseudo-recovery-point implementations. Pseudo recovery points (PRPs) are proposed so that unbounded rollback may be avoided while maintaining process autonomy. Probabilistic models for analyzing these three methods were developed under standard assumptions in computer performance analysis, i.e., exponential distributions for the related random variables. The models are used to estimate the interval between two successive recovery lines for asynchronous RBs, the mean loss in computation power for the synchronized method, and the additional overhead and rollback distance when PRPs are used.
New space sensor and mesoscale data analysis
NASA Technical Reports Server (NTRS)
Hickey, John S.
1987-01-01
The developed Earth Science and Application Division (ESAD) system/software provides the research scientist with the following capabilities: an extensive database management capability to convert various experiment data types into a standard format; an interactive analysis and display package (AVE80); an interactive imaging/color graphics capability utilizing the Apple III and IBM PC workstations integrated into the ESAD computer system; and a local and remote smart-terminal capability which provides color video, graphics, and LaserJet output. Recommendations for updating and enhancing the performance of the ESAD computer system are listed.
Sum and mean. Standard programs for activation analysis.
Lindstrom, R M
1994-01-01
Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
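An illustrative sketch (not the NIST code) of the kind of combination a program like Mean performs: an inverse-variance weighted mean, an internal and an external uncertainty, and a chi-square consistency test. The measurement values are invented.

    import numpy as np
    from scipy import stats

    x = np.array([10.2, 9.8, 10.5, 10.1])       # unequal measurements of one quantity
    u = np.array([0.2, 0.3, 0.4, 0.2])          # their standard uncertainties

    w = 1.0 / u**2
    mean = np.sum(w * x) / np.sum(w)
    internal_unc = np.sqrt(1.0 / np.sum(w))                      # from quoted uncertainties
    chi2 = np.sum(w * (x - mean) ** 2)                           # consistency statistic
    external_unc = internal_unc * np.sqrt(chi2 / (x.size - 1))   # from observed scatter
    p_value = stats.chi2.sf(chi2, df=x.size - 1)

    print(f"weighted mean = {mean:.3f}, internal u = {internal_unc:.3f}, "
          f"external u = {external_unc:.3f}, consistency p = {p_value:.2f}")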
MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing
2017-01-18
analysis, designing sensors, and developing algorithms. In 2008, the Lincoln Laboratory demonstrated the largest single problem ever run on a computer using ... computation. As we design and prototype these devices, the use of leading-edge engineering practices has become the de facto standard. This includes... MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing, by Dr. Jeremy Kepner. The introduction of multicore and manycore processors...
Javidi, Bahram; Markman, Adam; Rawat, Siddharth; O'Connor, Timothy; Anand, Arun; Andemariam, Biree
2018-05-14
We present a spatio-temporal analysis of cell membrane fluctuations to distinguish healthy patients from patients with sickle cell disease. A video hologram containing either healthy red blood cells (h-RBCs) or sickle cell disease red blood cells (SCD-RBCs) was recorded using a low-cost, compact, 3D printed shearing interferometer. Reconstructions were created for each hologram frame (time steps), forming a spatio-temporal data cube. Features were extracted by computing the standard deviations and the mean of the height fluctuations over time and for every location on the cell membrane, resulting in two-dimensional standard deviation and mean maps, followed by taking the standard deviations of these maps. The optical flow algorithm was used to estimate the apparent motion fields between subsequent frames (reconstructions). The standard deviation of the magnitude of the optical flow vectors across all frames was then computed. In addition, seven morphological cell (spatial) features based on optical path length were extracted from the cells to further improve the classification accuracy. A random forest classifier was trained to perform cell identification to distinguish between SCD-RBCs and h-RBCs. To the best of our knowledge, this is the first report of machine learning assisted cell identification and diagnosis of sickle cell disease based on cell membrane fluctuations and morphology using both spatio-temporal and spatial analysis.
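A hedged sketch of the spatio-temporal features described above: per-pixel standard-deviation and mean maps of membrane height over time, reduced to scalar features by taking the standard deviation of each map. The optical flow step, the morphological features, and the random forest classifier are omitted, and the height data are simulated.

    import numpy as np

    # height_cube: time x rows x cols optical height reconstructions of one cell
    rng = np.random.default_rng(0)
    height_cube = rng.normal(loc=1.0, scale=0.05, size=(60, 32, 32))

    std_map = height_cube.std(axis=0)      # temporal fluctuation at each membrane location
    mean_map = height_cube.mean(axis=0)    # average height at each location

    features = {
        "std_of_std_map": float(std_map.std()),
        "std_of_mean_map": float(mean_map.std()),
    }
    print(features)   # scalar features of this kind would feed a classifier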
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis FuncNet.
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented, based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces computational cost. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under the intrinsic creep effect of concrete with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three progressively related numerical examples, in terms of both structure type and uncertainty variables, are presented to demonstrate the computational applicability, accuracy and efficiency of the proposed method.
ERIC Educational Resources Information Center
Abrams, Neal M.
2012-01-01
A cloud network system is combined with standard computing applications and a course management system to provide a robust method for sharing data among students. This system provides a unique method to improve data analysis by easily increasing the amount of sampled data available for analysis. The data can be shared within one course as well as…
Setting the standards for signal transduction research.
Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo
2011-02-15
Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated into mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.
Dependency graph for code analysis on emerging architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shashkov, Mikhail Jurievich; Lipnikov, Konstantin
A directed acyclic dependency graph (DAG) is becoming the standard for modern multi-physics codes. The ideal DAG is the true block scheme of a multi-physics code. It is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and the dynamical machine state.
An independent software system for the analysis of dynamic MR images.
Torheim, G; Lombardi, M; Rinck, P A
1997-01-01
A computer system for the manual, semi-automatic, and automatic analysis of dynamic MR images was to be developed on UNIX and personal computer platforms. The system was to offer an integrated and standardized way of performing both image processing and analysis that was independent of the MR unit used. The system consists of modules that are easily adaptable to special needs. Data from MR units or other diagnostic imaging equipment in techniques such as CT, ultrasonography, or nuclear medicine can be processed through the ACR-NEMA/DICOM standard file formats. A full set of functions is available, among them cine-loop visual analysis, and generation of time-intensity curves. Parameters such as cross-correlation coefficients, area under the curve, peak/maximum intensity, wash-in and wash-out slopes, time to peak, and relative signal intensity/contrast enhancement can be calculated. Other parameters can be extracted by fitting functions like the gamma-variate function. Region-of-interest data and parametric values can easily be exported. The system has been successfully tested in animal and patient examinations.
Computation of Standard Errors
Dowd, Bryan E; Greene, William H; Norton, Edward C
2014-01-01
Objectives We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable for an individual subject and average effect for a sample of subjects. Empirical Application Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky–Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions In most applications, choice of the computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
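An illustrative sketch (in Python rather than the paper's Stata or LIMDEP code) of two of the three approaches for the standard error of a function of estimated parameters, applied to the predicted value for one subject in a simple OLS model: Krinsky-Robb (simulate the parameters from their estimated sampling distribution) and the bootstrap (re-estimate on resampled data).

    import numpy as np

    rng = np.random.default_rng(42)
    n = 500
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
    x0 = np.array([1.0, 0.5])                      # covariates of the subject of interest

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])
    cov_beta = sigma2 * np.linalg.inv(X.T @ X)

    # Krinsky-Robb: simulate the parameter vector, evaluate the function each time
    draws = rng.multivariate_normal(beta, cov_beta, size=5000)
    se_kr = np.std(draws @ x0)

    # Bootstrap: resample observations, re-estimate, evaluate the function
    boot = []
    for _ in range(1000):
        idx = rng.integers(0, n, size=n)
        b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        boot.append(b @ x0)
    se_boot = np.std(boot)

    print(f"SE of prediction: Krinsky-Robb {se_kr:.4f}, bootstrap {se_boot:.4f}")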
Lévy-like diffusion in eye movements during spoken-language comprehension.
Stephen, Damian G; Mirman, Daniel; Magnuson, James S; Dixon, James A
2009-05-01
This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.
Lévy-like diffusion in eye movements during spoken-language comprehension
NASA Astrophysics Data System (ADS)
Stephen, Damian G.; Mirman, Daniel; Magnuson, James S.; Dixon, James A.
2009-05-01
This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.
ERIC Educational Resources Information Center
Burns, Matthew K.; Taylor, Crystal N.; Warmbold-Brann, Kristy L.; Preast, June L.; Hosp, John L.; Ford, Jeremy W.
2017-01-01
Intervention researchers often use curriculum-based measurement of reading fluency (CBM-R) with a brief experimental analysis (BEA) to identify an effective intervention for individual students. The current study synthesized data from 22 studies that used CBM-R data within a BEA by computing the standard error of measure (SEM) for the median data…
Composite-Material Point-Stress Analysis
NASA Technical Reports Server (NTRS)
Spears, F., S.
1982-01-01
PSANAL computes composite-laminate elastic and thermal properties and allowable load levels for any combination of applied membrane and bending loads occurring at a point. Basic linear orthotropic stress/strain relationships and standard composite-laminate theory formulas are utilized.
Hydrological analysis in R: Topmodel and beyond
NASA Astrophysics Data System (ADS)
Buytaert, W.; Reusser, D.
2011-12-01
R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion in arenas such as high-performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological, and by extension other environmental, models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others still need to be developed.
Determination of fiber volume in graphite/epoxy materials using computer image analysis
NASA Technical Reports Server (NTRS)
Viens, Michael J.
1990-01-01
The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
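A minimal sketch of the image-analysis idea: in a polished cross-section image the fibers appear brighter than the matrix, so the fiber volume fraction can be estimated as the fraction of pixels above an intensity threshold. The synthetic image and the threshold value are placeholders, not the actual procedure or software used in the study.

    import numpy as np

    rng = np.random.default_rng(3)
    image = rng.normal(loc=100, scale=10, size=(512, 512))   # matrix background intensity
    fiber_mask_true = rng.random((512, 512)) < 0.60          # ~60% of the area is fiber
    image[fiber_mask_true] += 80                             # fibers image brighter

    threshold = 140.0                                        # assumed intensity cut-off
    fiber_volume_fraction = np.mean(image > threshold)
    print(f"estimated fiber volume: {100 * fiber_volume_fraction:.1f}%")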
NASA Technical Reports Server (NTRS)
Huang, L. C. P.; Cook, R. A.
1973-01-01
Models utilizing various sub-sets of the six degrees of freedom are used in trajectory simulation. A 3-D model with only linear degrees of freedom is especially attractive, since the coefficients for the angular degrees of freedom are the most difficult to determine and the angular equations are the most time consuming for the computer to evaluate. A computer program is developed that uses three separate subsections to predict trajectories. A launch rail subsection is used until the rocket has left its launcher. The program then switches to a special 3-D section which computes motions in two linear and one angular degrees of freedom. When the rocket trims out, the program switches to the standard, three linear degrees of freedom model.
The computational structural mechanics testbed procedures manual
NASA Technical Reports Server (NTRS)
Stewart, Caroline B. (Compiler)
1991-01-01
The purpose of this manual is to document the standard high level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure including its function, commands, data interface, and use is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis, and is intended for use in conjunction with the CSM Testbed User's Manual and the CSM Testbed Data Library Description.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... Regulation; Clarification of Standards for Computer Generation of Forms AGENCY: Department of Defense (DoD... American National Standards Institute X12, as the valid standard to use for computer-generated forms. FAR... optional forms on their computers. In addition to clarifying that FIPS 161 is no longer in use, public...
NASA Astrophysics Data System (ADS)
Eilers, J.
2013-09-01
The interface analysis of an observer of space objects makes a standard necessary. This standardized dataset serves as input for a cloud-based service aimed at a near real-time Space Situational Awareness (SSA) system. The system offers all the advantages of a cloud-based solution, such as redundancy, scalability and an easy way to distribute information. For the standard based on the interface analysis of the observer, the information can be separated into three parts: the information about the observer, e.g. a ground station; the information about the sensors used by the observer; and the data on the detected object. The backbone of the SSA system is the cloud-based service, which includes the consistency check for the observed objects, a database for the objects, the algorithms and analysis, as well as the visualization of the results. This paper also provides an approximation of the required computational power and data storage and a financial approach to delivering this service to a broad community. In this context, cloud means that neither the user nor the observer has to think about the infrastructure of the calculation environment. The decision whether the IT infrastructure will be built by a conglomerate of different nations or rented on the market should be based on an efficiency analysis. Combinations are also possible, such as starting on a rented cloud and later moving to a private cloud owned by the government. One of the advantages of a cloud solution is scalability. There are about 3000 satellites in space, 900 of them active, and in total about 17,000 detected space objects orbiting Earth. For the computation, however, the problem is not N(active) against N(all); it is closer to N(active) against the N(apo/peri) subset of N(all). Instead of roughly 15.3 million possible collision pairs, only approximately 2.3 million possible collision pairs must be computed. In general, this Space Situational Awareness system can be used as a tool by satellite system owners for collision avoidance.
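A back-of-the-envelope sketch of the pair-count argument above, assuming the reduction comes from screening only those pairs whose perigee-apogee altitude bands overlap; the orbit distributions below are invented for illustration, so only the naive count matches the figures quoted in the abstract.

    import numpy as np

    rng = np.random.default_rng(7)
    n_active, n_all = 900, 17000
    perigee = rng.uniform(300, 1200, size=n_all)            # km, placeholder catalogue
    apogee = perigee + rng.uniform(0, 400, size=n_all)
    active_idx = rng.choice(n_all, size=n_active, replace=False)

    naive_pairs = n_active * n_all                          # every active vs every object
    screened_pairs = 0
    for i in active_idx:
        overlap = (perigee <= apogee[i]) & (apogee >= perigee[i])
        screened_pairs += int(overlap.sum()) - 1            # exclude the object itself

    print(f"naive pairs: {naive_pairs:.2e}, after apogee/perigee screening: {screened_pairs:.2e}")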
How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography
Jørgensen, J. S.; Sidky, E. Y.
2015-01-01
We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620
How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.
Jørgensen, J S; Sidky, E Y
2015-06-13
We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.
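A minimal sketch of how an empirical phase diagram is assembled: for each point (delta = m/n undersampling, rho = k/m sparsity) count how often a sparse signal is recovered from Gaussian measurements. Orthogonal matching pursuit is used here only as a simple stand-in solver; the paper itself works with l1- and total-variation-regularized reconstruction and with X-ray CT sampling.

    import numpy as np


    def omp(A, y, k):
        """Greedy recovery of a k-sparse x from y = A x."""
        support, r = [], y.copy()
        for _ in range(k):
            support.append(int(np.argmax(np.abs(A.T @ r))))
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            r = y - A[:, support] @ coef
        x = np.zeros(A.shape[1])
        x[support] = coef
        return x


    rng = np.random.default_rng(0)
    n, trials = 100, 20
    for delta in (0.2, 0.4, 0.6):                    # undersampling m/n
        for rho in (0.1, 0.3, 0.5):                  # sparsity k/m
            m, k = int(delta * n), max(1, int(rho * delta * n))
            successes = 0
            for _ in range(trials):
                A = rng.normal(size=(m, n)) / np.sqrt(m)
                x = np.zeros(n)
                x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
                ok = np.linalg.norm(omp(A, A @ x, k) - x) < 1e-4 * np.linalg.norm(x)
                successes += int(ok)
            print(f"delta={delta:.1f} rho={rho:.1f} success={successes / trials:.2f}")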
Jeffries, B F; Tarlton, M; De Smet, A A; Dwyer, S J; Brower, A C
1980-02-01
A computer program was created to identify and accept spatial data regarding the location of the thoracic and lumbar vertebral bodies on scoliosis films. With this information, the spine can be mathematically reconstructed and a scoliotic angle calculated. There was a 0.968 positive correlation between the computer and manual methods of measuring scoliosis. The computer method was more reproducible with a standard deviation of only 1.3 degrees. Computerized measurement of scoliosis also provides better evaluation of the true shape of the curve.
A Standardized Interface for Obtaining Digital Planetary and Heliophysics Time Series Data
NASA Astrophysics Data System (ADS)
Vandegriff, Jon; Weigel, Robert; Faden, Jeremy; King, Todd; Candey, Robert
2016-10-01
We describe a low level interface for accessing digital Planetary and Heliophysics data, focusing primarily on time-series data from in-situ instruments. As the volume and variety of planetary data has increased, it has become harder to merge diverse datasets into a common analysis environment. Thus we are building low-level computer-to-computer infrastructure to enable data from different missions or archives to be able to interoperate. The key to enabling interoperability is a simple access interface that standardizes the common capabilities available from any data server: 1. identify the data resources that can be accessed; 2. describe each resource; and 3. get the data from a resource. We have created a standardized way for data servers to perform each of these three activities. We are also developing a standard streaming data format for the actual data content to be returned (i.e., the result of item 3). Our proposed standard access interface is simple enough that it could be implemented on top of or beside existing data services, or it could even be fully implemented by a small data provider as a way to ensure that the provider's holdings can participate in larger data systems or joint analysis with other datasets. We present details of the interface and of the streaming format, including a sample server designed to illustrate the data request and streaming capabilities.
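A hypothetical sketch of the three standardized operations described above (identify resources, describe a resource, get data) written as plain HTTP requests; the server URL, endpoint names, and query parameters are assumptions for illustration and are not taken from the actual specification.

    import json
    import urllib.request

    SERVER = "https://example-data-server.org/api"          # placeholder server


    def get_json(url):
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read().decode("utf-8"))


    def list_resources():
        return get_json(f"{SERVER}/catalog")                 # 1. identify available datasets


    def describe(resource_id):
        return get_json(f"{SERVER}/info?id={resource_id}")   # 2. describe one dataset


    def get_data(resource_id, start, stop):
        url = f"{SERVER}/data?id={resource_id}&time.min={start}&time.max={stop}"
        with urllib.request.urlopen(url) as resp:            # 3. stream the time-series records
            for line in resp:
                yield line.decode("utf-8").rstrip()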
ParamAP: Standardized Parameterization of Sinoatrial Node Myocyte Action Potentials.
Rickert, Christian; Proenza, Catherine
2017-08-22
Sinoatrial node myocytes act as cardiac pacemaker cells by generating spontaneous action potentials (APs). Much information is encoded in sinoatrial AP waveforms, but both the analysis and the comparison of AP parameters between studies is hindered by the lack of standardized parameter definitions and the absence of automated analysis tools. Here we introduce ParamAP, a standalone cross-platform computational tool that uses a template-free detection algorithm to automatically identify and parameterize APs from text input files. ParamAP employs a graphic user interface with automatic and user-customizable input modes, and it outputs data files in text and PDF formats. ParamAP returns a total of 16 AP waveform parameters including time intervals such as the AP duration, membrane potentials such as the maximum diastolic potential, and rates of change of the membrane potential such as the diastolic depolarization rate. ParamAP provides a robust AP detection algorithm in combination with a standardized AP parameter analysis over a wide range of AP waveforms and firing rates, owing in part to the use of an iterative algorithm for the determination of the threshold potential and the diastolic depolarization rate that is independent of the maximum upstroke velocity, a parameter that can vary significantly among sinoatrial APs. Because ParamAP is implemented in Python 3, it is also highly customizable and extensible. In conclusion, ParamAP is a powerful computational tool that facilitates quantitative analysis and enables comparison of sinoatrial APs by standardizing parameter definitions and providing an automated work flow. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
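A hedged sketch of the kind of waveform parameterization such a tool automates: detect spontaneous APs in a voltage trace and report a few standard parameters (maximum diastolic potential, peak, a crude AP duration, firing rate). The toy trace, threshold, and detection rule are simplifications and are not ParamAP's iterative algorithm.

    import numpy as np

    fs = 10000.0                                   # Hz, assumed sampling rate
    t = np.arange(int(2 * fs)) / fs
    # toy spontaneously firing trace oscillating at about 3 Hz
    v = -20 + 40 * np.sin(2 * np.pi * 3 * t) ** 15 - 40 * (np.sin(2 * np.pi * 3 * t) < 0)

    upstrokes = np.where((v[:-1] < -10) & (v[1:] >= -10))[0]    # threshold crossings
    rate = (len(upstrokes) - 1) / (t[upstrokes[-1]] - t[upstrokes[0]])

    params = []
    for a, b in zip(upstrokes[:-1], upstrokes[1:]):
        cycle = v[a:b]
        params.append({
            "peak_mV": float(cycle.max()),
            "MDP_mV": float(cycle.min()),                       # maximum diastolic potential
            "APD_ms": float(np.sum(cycle > -40) / fs * 1e3),    # crude duration above -40 mV
        })

    print(f"firing rate ~ {rate:.1f} Hz")
    print(params[0])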
NASA Astrophysics Data System (ADS)
Bouchpan-Lerust-Juéry, L.
2007-08-01
Current and next-generation on-board computer systems tend to implement real-time embedded control applications (e.g. the Attitude and Orbit Control Subsystem (AOCS), the Packet Utilization Standard (PUS), spacecraft autonomy, . . . ) which must meet high standards of reliability and predictability as well as safety. Meeting these requirements demands a considerable amount of effort and cost from the space software industry. The first part of this paper presents a free Open Source integrated solution for developing RTAI applications covering analysis, design, simulation and direct implementation using Open-Source-based code generation; the second part summarises the suggested approach, its results and the conclusions for further work.
Cazzaniga, Paolo; Nobile, Marco S.; Besozzi, Daniela; Bellini, Matteo; Mauri, Giancarlo
2014-01-01
The introduction of general-purpose Graphics Processing Units (GPUs) is boosting scientific applications in Bioinformatics, Systems Biology, and Computational Biology. In these fields, the use of high-performance computing solutions is motivated by the need of performing large numbers of in silico analysis to study the behavior of biological systems in different conditions, which necessitate a computing power that usually overtakes the capability of standard desktop computers. In this work we present coagSODA, a CUDA-powered computational tool that was purposely developed for the analysis of a large mechanistic model of the blood coagulation cascade (BCC), defined according to both mass-action kinetics and Hill functions. coagSODA allows the execution of parallel simulations of the dynamics of the BCC by automatically deriving the system of ordinary differential equations and then exploiting the numerical integration algorithm LSODA. We present the biological results achieved with a massive exploration of perturbed conditions of the BCC, carried out with one-dimensional and bi-dimensional parameter sweep analysis, and show that GPU-accelerated parallel simulations of this model can increase the computational performances up to a 181× speedup compared to the corresponding sequential simulations. PMID:25025072
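An illustrative sketch of a one-dimensional parameter sweep analysis with the LSODA integrator, in the spirit of, but much smaller than, the coagulation-cascade analysis above; SciPy's LSODA is used instead of a GPU code, and the two-species mass-action model is a placeholder.

    import numpy as np
    from scipy.integrate import solve_ivp


    def mass_action(t, y, k1, k2):
        # simple reversible reaction A <-> B with mass-action kinetics
        a, b = y
        return [-k1 * a + k2 * b, k1 * a - k2 * b]


    k1_values = np.linspace(0.1, 2.0, 20)          # swept parameter
    steady_a = []
    for k1 in k1_values:
        sol = solve_ivp(mass_action, (0.0, 50.0), [1.0, 0.0],
                        args=(k1, 0.5), method="LSODA", rtol=1e-8)
        steady_a.append(sol.y[0, -1])              # concentration of A at t = 50

    for k1, a in zip(k1_values, steady_a):
        print(f"k1={k1:.2f} -> A(t=50)={a:.4f}")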
Using multi-criteria analysis of simulation models to understand complex biological systems
Maureen C. Kennedy; E. David Ford
2011-01-01
Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
A Test Reliability Analysis of an Abbreviated Version of the Pupil Control Ideology Form.
ERIC Educational Resources Information Center
Gaffney, Patrick V.
A reliability analysis was conducted of an abbreviated, 10-item version of the Pupil Control Ideology Form (PCI), using the Cronbach's alpha technique (L. J. Cronbach, 1951) and the computation of the standard error of measurement. The PCI measures a teacher's orientation toward pupil control. Subjects were 168 preservice teachers from one private…
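A small sketch of the two quantities mentioned above, computed from an items-by-respondents score matrix: Cronbach's alpha and the standard error of measurement SEM = SD * sqrt(1 - alpha). The score matrix is simulated, not the study's data.

    import numpy as np

    rng = np.random.default_rng(5)
    n_respondents, n_items = 168, 10
    ability = rng.normal(size=(n_respondents, 1))
    scores = ability + rng.normal(scale=0.8, size=(n_respondents, n_items))

    k = n_items
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)   # Cronbach's alpha
    sem = np.sqrt(total_var) * np.sqrt(1 - alpha)             # standard error of measurement

    print(f"Cronbach's alpha = {alpha:.3f}, SEM = {sem:.3f} (total-score units)")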
ERIC Educational Resources Information Center
SMITH, STUART E.; AND OTHERS
Factor analysis was carried out to ascertain the best occupational group location for each of four Strong Vocational Interest Blank (SVIB) scales--veterinarian, senior CPA, pharmacist, and mortician. The SVIB was administered to 125 male liberal arts freshmen. Means, standard deviations, and intercorrelations were computed. This factor analysis…
Program For Joule-Thomson Analysis Of Mixed Cryogens
NASA Technical Reports Server (NTRS)
Jones, Jack A.; Lund, Alan
1994-01-01
JTMIX computer program predicts ideal and realistic properties of mixed gases at temperatures between 65 and 80 K. Performs Joule-Thomson analysis of any gaseous mixture of neon, nitrogen, various hydrocarbons, argon, oxygen, carbon monoxide, carbon dioxide, and hydrogen sulfide. When used in conjunction with DDMIX computer program of National Institute of Standards and Technology (NIST), JTMIX accurately predicts order-of-magnitude increases in Joule-Thomson cooling capacities occurring when various hydrocarbons are added to nitrogen. Also predicts boiling temperature of nitrogen depressed from normal value to as low as 60 K upon addition of neon. Written in Turbo C.
Fast Eigensolver for Computing 3D Earth's Normal Modes
NASA Astrophysics Data System (ADS)
Shi, J.; De Hoop, M. V.; Li, R.; Xi, Y.; Saad, Y.
2017-12-01
We present a novel parallel computational approach to compute Earth's normal modes. We discretize Earth via an unstructured tetrahedral mesh and apply the continuous Galerkin finite element method to the elasto-gravitational system. To resolve the eigenvalue pollution issue, following the analysis separating the seismic point spectrum, we explicitly utilize a representation of the displacement for describing the oscillations of the non-seismic modes in the fluid outer core. Effectively, we separate out the essential spectrum, which is naturally related to the Brunt-Väisälä frequency. We introduce two Lanczos approaches with polynomial and rational filtering for solving this generalized eigenvalue problem in prescribed intervals. The polynomial filtering technique only accesses the matrix pair through matrix-vector products and is an ideal candidate for solving three-dimensional large-scale eigenvalue problems. The matrix-free scheme allows us to deal with fluid separation and self-gravitation in an efficient way, while the standard shift-and-invert method typically needs an explicit shifted matrix and its factorization. The rational filtering method converges much faster than the standard shift-and-invert procedure when computing all the eigenvalues inside an interval. Both Lanczos approaches solve for the internal eigenvalues extremely accurately compared with the standard eigensolver. In our computational experiments, we compare our results with the radial earth model benchmark and visualize the normal modes using vector plots to illustrate the properties of the displacements in different modes.
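A minimal sketch of the standard shift-and-invert approach that the filtered Lanczos methods are compared against: computing the eigenvalues of a generalized symmetric problem K x = lambda M x nearest a target shift sigma. The small random matrices below stand in for the actual finite element matrices.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    rng = np.random.default_rng(11)
    n = 400
    D = sp.diags(rng.uniform(1.0, 10.0, n))
    R = sp.random(n, n, density=0.01, random_state=1)
    K = (D + R @ R.T).tocsc()                      # symmetric positive definite "stiffness"
    M = sp.identity(n, format="csc")               # trivial "mass" matrix

    sigma = 5.0                                    # target region of the spectrum
    vals, vecs = eigsh(K, k=6, M=M, sigma=sigma, which="LM")   # shift-and-invert mode
    print("eigenvalues nearest sigma:", np.sort(vals))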
Contracting for Computer Software in Standardized Computer Languages
Brannigan, Vincent M.; Dayhoff, Ruth E.
1982-01-01
The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.
Automation of Precise Time Reference Stations (PTRS)
NASA Astrophysics Data System (ADS)
Wheeler, P. J.
1985-04-01
The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile mini-computer controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.
Classification Models for Pulmonary Function using Motion Analysis from Phone Sensors.
Cheng, Qian; Juen, Joshua; Bellam, Shashi; Fulara, Nicholas; Close, Deanna; Silverstein, Jonathan C; Schatz, Bruce
2016-01-01
Smartphones are ubiquitous, but it is unknown what physiological functions can be monitored at clinical quality. Pulmonary function is a standard measure of health status for cardiopulmonary patients. We have shown phone sensors can accurately measure walking patterns. Here we show that improved classification models can accurately measure pulmonary function, with the sole inputs being sensor data from carried phones. Twenty-four cardiopulmonary patients performed six-minute walk tests in pulmonary rehabilitation at a regional hospital. They carried smartphones running custom software recording phone motion. For every patient, every ten-second interval was correctly computed. The trained model perfectly computed the GOLD level 1/2/3, which is a standard categorization of pulmonary function as measured by spirometry. These results are encouraging towards field trials with passive monitors always running in the background. We expect patients can simply carry their phones during daily living, while supporting automatic computation of pulmonary function for health monitoring.
Effects of experimental design on calibration curve precision in routine analysis
Pimentel, Maria Fernanda; Neto, Benício de Barros; Saldanha, Teresa Cristina B.
1998-01-01
A computational program which compares the efficiencies of different experimental designs with those of maximum precision (D-optimized designs) is described. The program produces confidence interval plots for a calibration curve and provides information about the number of standard solutions, concentration levels and suitable concentration ranges to achieve an optimum calibration. Some examples of the application of this novel computational program are given, using both simulated and real data. PMID:18924816
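An illustrative sketch (not the described program) of the confidence-interval half-width for a straight-line calibration curve, the quantity such a design comparison has to evaluate at each concentration level; the calibration data are invented.

    import numpy as np
    from scipy import stats

    conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])         # standard concentrations
    signal = np.array([0.02, 0.41, 0.79, 1.22, 1.58, 2.01])  # measured responses

    n = conc.size
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (intercept + slope * conc)
    s = np.sqrt(resid @ resid / (n - 2))                     # residual standard deviation
    sxx = np.sum((conc - conc.mean()) ** 2)
    tcrit = stats.t.ppf(0.975, df=n - 2)

    for x0 in (1.0, 5.0, 9.0):
        half_width = tcrit * s * np.sqrt(1.0 / n + (x0 - conc.mean()) ** 2 / sxx)
        print(f"x0={x0:4.1f}: fitted y = {intercept + slope * x0:.3f} +/- {half_width:.3f}")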
Differences in head impulse test results due to analysis techniques.
Cleworth, Taylor W; Carpenter, Mark G; Honegger, Flurin; Allum, John H J
2017-01-01
Different analysis techniques are used to define vestibulo-ocular reflex (VOR) gain between eye and head angular velocity during the video head impulse test (vHIT). Comparisons would aid selection of gain techniques best related to head impulse characteristics and promote standardisation. Compare and contrast known methods of calculating vHIT VOR gain. We examined lateral canal vHIT responses recorded from 20 patients twice within 13 weeks of acute unilateral peripheral vestibular deficit onset. Ten patients were tested with an ICS Impulse system (GN Otometrics) and 10 with an EyeSeeCam (ESC) system (Interacoustics). Mean gain and variance were computed with area, average sample gain, and regression techniques over specific head angular velocity (HV) and acceleration (HA) intervals. Results for the same gain technique were not different between measurement systems. Area and average sample gain yielded equally lower variances than regression techniques. Gains computed over the whole impulse duration were larger than those computed for increasing HV. Gain over decreasing HV was associated with larger variances. Gains computed around peak HV were smaller than those computed around peak HA. The median gain over 50-70 ms was not different from gain around peak HV. However, depending on technique used, the gain over increasing HV was different from gain around peak HA. Conversion equations between gains obtained with standard ICS and ESC methods were computed. For low gains, the conversion was dominated by a constant that needed to be added to ESC gains to equal ICS gains. We recommend manufacturers standardize vHIT gain calculations using 2 techniques: area gain around peak HA and peak HV.
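A hedged sketch of one of the gain definitions compared above: an area gain computed as the ratio of integrated eye velocity to integrated head velocity over a chosen head-velocity interval. The synthetic impulse and the interval choices are placeholders, not any particular manufacturer's algorithm.

    import numpy as np

    fs = 250.0                                            # Hz, assumed sampling rate
    t = np.arange(int(0.3 * fs)) / fs
    head_vel = 200 * np.exp(-((t - 0.1) / 0.04) ** 2)     # bell-shaped head impulse, deg/s
    eye_vel = -0.7 * head_vel                             # compensatory eye velocity, gain ~0.7


    def area_gain(eye, head, mask):
        # with a constant sample interval the ratio of sums equals the ratio of areas
        return np.sum(-eye[mask]) / np.sum(head[mask])


    increasing_hv = (head_vel > 20) & (t <= t[np.argmax(head_vel)])  # rising head velocity only
    whole_impulse = head_vel > 20
    print(f"area gain over increasing HV: {area_gain(eye_vel, head_vel, increasing_hv):.2f}")
    print(f"area gain over whole impulse: {area_gain(eye_vel, head_vel, whole_impulse):.2f}")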
Standardizing clinical trials workflow representation in UML for international site comparison.
de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo
2010-11-09
With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.
Incorporating principal component analysis into air quality model evaluation
The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...
SELECTION AND CALIBRATION OF SUBSURFACE REACTIVE TRANSPORT MODELS USING A SURROGATE-MODEL APPROACH
While standard techniques for uncertainty analysis have been successfully applied to groundwater flow models, extension to reactive transport is frustrated by numerous difficulties, including excessive computational burden and parameter non-uniqueness. This research introduces a...
Randomization Procedures Applied to Analysis of Ballistic Data
1991-06-01
Report BRL-TR-3245, by Malcolm S. Taylor and Barry A. Bodt, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Recoverable abstract fragment: "...Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data..."
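The keywords above name randomization (permutation) tests; the report itself is not reproduced here, so the following is only a generic sketch of a two-sample permutation test for a difference in means, with invented data.

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(x, y, n_perm=10000):
    """Two-sided permutation test for a difference in means between
    samples x and y, by randomly reassigning group labels."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(pooled[: len(x)].mean() - pooled[len(x):].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)   # add-one rule avoids p = 0

# Illustrative data only (not taken from the report).
standard = [0.42, 0.51, 0.39, 0.47, 0.44]
dynamic  = [0.48, 0.55, 0.52, 0.50, 0.57]
print(permutation_test(standard, dynamic))
```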
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, allowing it to be used anywhere internet access is available. By switching the underlying nuclide library and the related formulas, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
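The abstract does not give the program's formulas. As a rough illustration of the comparator principle it describes (concentrations derived from photopeak information relative to a standard), the sketch below computes a concentration from net peak areas with a simple decay correction. All names and values are assumptions for illustration, not the program's actual method.

```python
import math

def decay_corrected_rate(net_counts, live_time, cooling_time, half_life):
    """Net photopeak count rate corrected back to the end of irradiation,
    assuming simple exponential decay (all times in the same unit)."""
    lam = math.log(2.0) / half_life
    return (net_counts / live_time) * math.exp(lam * cooling_time)

def concentration_by_comparator(sample_rate, sample_mass,
                                std_rate, std_mass, std_concentration):
    """Relative (comparator) estimate: concentration scales with the
    decay-corrected count rate per unit mass, referenced to a standard
    irradiated and counted under comparable conditions."""
    return std_concentration * (sample_rate / sample_mass) / (std_rate / std_mass)

# Illustrative numbers only (seconds, grams, ppm).
r_sam = decay_corrected_rate(net_counts=12500, live_time=3600,
                             cooling_time=7200, half_life=86400)
r_std = decay_corrected_rate(net_counts=30200, live_time=3600,
                             cooling_time=7200, half_life=86400)
print(concentration_by_comparator(r_sam, 0.5, r_std, 0.2, 120.0))
```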
Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Huang, H. K.
1989-05-01
A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using the standard TCP/IP protocol and stored locally on magnetic disk. The use of high-resolution screens (1024x768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images, and scintigraphic images.
A study of sound generation in subsonic rotors, volume 2
NASA Technical Reports Server (NTRS)
Chalupnik, J. D.; Clark, L. T.
1975-01-01
Computer programs were developed for use in the analysis of sound generation by subsonic rotors. Program AIRFOIL computes the spectrum of radiated sound from a single airfoil immersed in a laminar flow field. Program ROTOR extends this to a rotating frame and provides a model for sound generation in subsonic rotors. The program also computes tone sound generation due to steady-state forces on the blades. Program TONE uses a moving-source analysis to generate a time series for an array of forces moving in a circular path. The resultant time series are then Fourier transformed to render the results in spectral form. Program SDATA is a standard time series analysis package. It reads in two discrete time series, forms auto- and cross-covariances, and normalizes these to form correlations. The program then transforms the covariances to yield auto and cross power spectra by means of a Fourier transformation.
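The processing chain described for Program SDATA (auto- and cross-covariances, normalization to correlations, Fourier transformation to spectra) maps onto a short sketch. The code below is a generic NumPy illustration of that chain, not the original program; the series lengths and lag count are assumptions.

```python
import numpy as np

def cross_covariance(x, y, max_lag):
    """Biased cross-covariance estimate of two equal-length series
    for lags 0..max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    n = len(x)
    return np.array([np.sum(x[: n - k] * y[k:]) / n for k in range(max_lag + 1)])

def spectrum_from_covariance(cov):
    """One-sided spectrum obtained by Fourier transforming the covariance sequence."""
    return np.abs(np.fft.rfft(cov))

rng = np.random.default_rng(1)
a = rng.standard_normal(1024)
b = np.roll(a, 3) + 0.1 * rng.standard_normal(1024)   # b lags a by 3 samples

c_ab = cross_covariance(a, b, max_lag=64)
corr_ab = c_ab / np.sqrt(np.var(a) * np.var(b))        # normalize to correlation
print(np.argmax(corr_ab))                              # ~3, the imposed lag
print(spectrum_from_covariance(c_ab)[:5])
```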
Markeeva, M V; Mareev, O V; Nikolenko, V N; Mareev, G O; Danilova, T V; Fadeeva, E A; Fedorov, R V
The objective of the present work was to study the relationship between the dimensions of the ethmoidal labyrinth and the skull in subjects differing in nose shape by means of factorial and correlation analysis with the application of modern computer-assisted methods for three-dimensional reconstruction of the skull. We developed an original method for computed craniometry, with the use of an original program that made it possible to determine the standard intravital craniometric characteristics of the human skull with a high degree of accuracy, based on the results of analysis of 200 computed tomograms of the head. It was shown that the length of the inferior turbinated bones and the posterior edge of the orbital plate is of special relevance for practically all parameters of the ethmoidal labyrinth. Also, the width of the choanae positively relates to the height of the ethmoidal labyrinth.
Shoba, D; Periandy, S; Karabacak, M; Ramalingam, S
2011-12-01
The FT-IR and FT-Raman vibrational spectra of 2,3-naphthalenediol (C(10)H(8)O(2)) have been recorded in the solid phase using a Bruker IFS 66V spectrometer in the range 4000-100 cm(-1). A detailed vibrational spectral analysis has been carried out, and assignments of the observed fundamental bands have been proposed on the basis of peak positions and relative intensities. The optimized molecular geometry and ground-state vibrational frequencies are calculated using the ab initio Hartree-Fock (HF) and DFT (LSDA and B3LYP) methods with the 6-31+G(d,p) and 6-311+G(d,p) basis sets. The molecule has three conformers, C1, C2 and C3; the computational results identify the C1 form as the most stable. The isotropic computational analysis showed good agreement with the experimental observations, and the fundamental vibrational frequencies were compared with the results calculated by the HF and DFT methods. Comparison of the simulated spectra provides important information about the capability of each computational method to describe the vibrational modes. A study of the electronic properties, such as absorption wavelengths, excitation energies, dipole moment and frontier molecular orbital energies, is performed by a time-dependent DFT approach. The electronic structure and the assignment of the absorption bands in the electronic spectra are discussed. The calculated HOMO and LUMO energies show that charge transfer occurs within the molecule. The thermodynamic properties of the title compound have been calculated at different temperatures. The statistical thermodynamic properties (standard heat capacities, standard entropies, and standard enthalpy changes) and their correlations with temperature have been obtained from the theoretical vibrations. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
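As a worked illustration of how standard thermodynamic functions follow from harmonic vibrational frequencies (the final step described above), the sketch below evaluates the vibrational heat-capacity contribution from a list of wavenumbers; the frequencies shown are placeholders, not the computed values for 2,3-naphthalenediol.

```python
import math

H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e10       # speed of light, cm/s
KB = 1.380649e-23       # Boltzmann constant, J/K
R = 8.314462618         # gas constant, J/(mol K)

def vibrational_cv(wavenumbers_cm, temperature):
    """Harmonic-oscillator vibrational heat capacity (J mol^-1 K^-1):
    Cv = R * sum_i x_i^2 e^{x_i} / (e^{x_i} - 1)^2, with x_i = h c nu_i / (kB T)."""
    cv = 0.0
    for nu in wavenumbers_cm:
        x = H * C * nu / (KB * temperature)
        cv += R * x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2
    return cv

# Placeholder frequencies (cm^-1); a real calculation would use all 3N-6 modes.
modes = [450.0, 750.0, 1100.0, 1600.0, 3050.0]
for T in (200.0, 298.15, 400.0):
    print(T, round(vibrational_cv(modes, T), 2))
```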
Woo, E H C; White, P; Lai, C W K
2016-03-01
This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.
Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.
Park, Eun-Jun; Park, Mihyun
2015-11-01
The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
Notice of a request for comments: written comments are due before May 12, 2011, addressed to the Chief, Computer Security Division, National Institute of Standards and Technology. For further information contact Elaine Barker or Quynh Dang, Computer Security Division, National Institute of Standards and Technology, Gaithersburg, MD.
Spatial Statistics for Tumor Cell Counting and Classification
NASA Astrophysics Data System (ADS)
Wirjadi, Oliver; Kim, Yoo-Jin; Breuel, Thomas
To count and classify cells in histological sections is a standard task in histology. One example is the grading of meningiomas, benign tumors of the meninges, which requires assessing the fraction of proliferating cells in an image. As this process is very time consuming when performed manually, automation is required. To address such problems, we propose a novel application of Markov point process methods in computer vision, leading to algorithms for computing the locations of circular objects in images. In contrast to previous algorithms using such spatial statistics methods in image analysis, the present one is fully trainable. This is achieved by combining point process methods with statistical classifiers. Using simulated data, the method proposed in this paper is shown to be more accurate and more robust to noise than standard image processing methods. On the publicly available SIMCEP benchmark for cell image analysis algorithms, the cell count performance of the present method is significantly more accurate than results published elsewhere, especially when cells form dense clusters. Furthermore, the proposed system performs as well as a state-of-the-art algorithm for the computer-aided histological grading of meningiomas when combined with a simple k-nearest neighbor classifier for identifying proliferating cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, R.N.
1990-02-28
The Inspection Shop at Lawrence Livermore Lab recently purchased a Sheffield Apollo RS50 Direct Computer Control Coordinate Measuring Machine. The performance of the machine was specified to conform to the B89 standard which relies heavily upon using the measuring machine in its intended manner to verify its accuracy (rather than parametric tests). Although it would be possible to use the interactive measurement system to perform these tasks, a more thorough and efficient job can be done by creating Function Library programs for certain tasks which integrate Hewlett-Packard Basic 5.0 language and calls to proprietary analysis and machine control routines. This combination provides efficient use of the measuring machine with a minimum of keyboard input plus an analysis of the data with respect to the B89 Standard rather than a CMM analysis which would require subsequent interpretation. This paper discusses some characteristics of the Sheffield machine control and analysis software and my use of H-P Basic language to create automated measurement programs to support the B89 performance evaluation of the CMM.
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
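A minimal illustration of the bootstrap idea described above, written in Python rather than SAS for brevity: the standard error of a sample statistic is estimated from the spread of the statistic over resampled data. The data values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_se(data, statistic, n_boot=2000):
    """Bootstrap standard error: resample with replacement, recompute the
    statistic, and take the standard deviation of the replicates."""
    data = np.asarray(data, float)
    reps = [statistic(rng.choice(data, size=len(data), replace=True))
            for _ in range(n_boot)]
    return float(np.std(reps, ddof=1))

scores = [12.1, 14.3, 9.8, 15.0, 11.2, 13.7, 10.4, 12.9]
print(bootstrap_se(scores, np.mean))     # SE of the mean
print(bootstrap_se(scores, np.median))   # SE of the median
```

The same resampling loop extends to regression or discriminant coefficients by replacing the statistic function.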
Effects of MicroCAD on Learning Fundamental Engineering Graphical Concepts: A Qualitative Study.
ERIC Educational Resources Information Center
Leach, James A.; Gull, Randall L.
1990-01-01
Students' reactions and performances were examined when taught engineering geometry concepts using a standard microcomputer-aided drafting software package. Two sample groups were compared based on their computer experience. Included are the methodology, data analysis, and conclusions. (KR)
Camera Network Topology Discovery Literature Review
2011-11-01
...essential for 21st century military, environmental and surveillance applications [Devarajan, Cheng & Radke 2008]. Computer networks pose several research... starting and ending points of object trajectories into entry/exit regions [Makris & Ellis 2003]... LDA is a new standard for document analysis. The model...
Computation of elementary modes: a unifying framework and the new binary approach
Gagneur, Julien; Klamt, Steffen
2004-01-01
Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. In recent years, a multitude of algorithms dedicated to it have appeared, calling for a unifying point of view and for continued improvement of the current methods. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, a software that is free for academics. The binary approach decreases the memory demand by up to 96% without loss of speed, giving the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
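The post-processing step mentioned above (recovering stoichiometric coefficients from a binary participation pattern) can be sketched directly: for a genuine elementary mode, the stoichiometric matrix restricted to the participating reactions has a one-dimensional null space. The toy network below is an assumption for illustration, not FluxAnalyzer code.

```python
import numpy as np
from scipy.linalg import null_space

def coefficients_from_pattern(S, pattern):
    """Recover the flux coefficients of an elementary mode from its binary
    participation pattern: the stoichiometric matrix restricted to the
    participating reactions has a 1-D null space for an elementary mode."""
    active = np.flatnonzero(pattern)
    ns = null_space(S[:, active])
    if ns.shape[1] != 1:
        raise ValueError("pattern does not define a unique elementary mode")
    v = ns[:, 0]
    v = v / v[np.argmax(np.abs(v))]        # scale so the largest entry is 1
    coeffs = np.zeros(S.shape[1])
    coeffs[active] = v
    return coeffs

# Toy network (rows = internal metabolites A, B; columns = reactions):
# R1: -> A,  R2: A -> B,  R3: B ->,  R4: A ->
S = np.array([[1, -1,  0, -1],
              [0,  1, -1,  0]], dtype=float)

print(coefficients_from_pattern(S, [1, 1, 1, 0]))  # route through B
print(coefficients_from_pattern(S, [1, 0, 0, 1]))  # direct export of A
```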
Norms and Standards for Computer Education (MCA, BCA) through Distance Mode.
ERIC Educational Resources Information Center
Rausaria, R.R., Ed.; Lele, Nalini A., Ed.; Bhushan, Bharat, Ed.
This document presents the norms and standards for computer education in India through distance mode, including the Masters in Computer Applications (MCA) and Bachelor in Computer Applications (BCA) programs. These norms and standards were considered and approved by the Distance Education Council, Indira Gandhi National Open University (India), at…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neymark, J.; Kennedy, M.; Judkoff, R.
This report documents a set of diagnostic analytical verification cases for testing the ability of whole building simulation software to model the air distribution side of typical heating, ventilating and air conditioning (HVAC) equipment. These cases complement the unitary equipment cases included in American National Standards Institute (ANSI)/American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, which test the ability to model the heat-transfer fluid side of HVAC equipment.
Operating a Geiger Müller tube using a PC sound card
NASA Astrophysics Data System (ADS)
Azooz, A. A.
2009-01-01
In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, including pulse-shape and statistical-analysis experiments, can be carried out using this system. A new visual demonstration of dead time effects is also presented.
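The pulse-counting and dead-time steps described above translate into a short sketch. Rather than reproduce the MATLAB sound-card acquisition, the Python code below works on an already-digitized signal array: it counts threshold crossings and applies the standard non-paralyzable dead-time correction n = m / (1 - m*tau). The threshold, sampling rate, and synthetic signal are assumptions.

```python
import numpy as np

def count_pulses(signal, threshold):
    """Count rising edges that cross the threshold (one count per pulse)."""
    above = signal > threshold
    return int(np.count_nonzero(above[1:] & ~above[:-1]))

def dead_time_corrected_rate(counts, live_time, tau):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau),
    with m the measured rate and tau the dead time in seconds."""
    m = counts / live_time
    return m / (1.0 - m * tau)

# Synthetic 1 s record at 44.1 kHz with ~300 randomly placed pulses (illustrative).
fs = 44100
rng = np.random.default_rng(7)
sig = 0.02 * rng.standard_normal(fs)
for i in rng.choice(fs - 50, size=300, replace=False):
    sig[i:i + 20] += 1.0                     # simple rectangular pulse

counts = count_pulses(sig, threshold=0.5)
print(counts, dead_time_corrected_rate(counts, live_time=1.0, tau=1e-4))
```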
Next-generation genotype imputation service and methods.
Das, Sayantan; Forer, Lukas; Schönherr, Sebastian; Sidore, Carlo; Locke, Adam E; Kwong, Alan; Vrieze, Scott I; Chew, Emily Y; Levy, Shawn; McGue, Matt; Schlessinger, David; Stambolian, Dwight; Loh, Po-Ru; Iacono, William G; Swaroop, Anand; Scott, Laura J; Cucca, Francesco; Kronenberg, Florian; Boehnke, Michael; Abecasis, Gonçalo R; Fuchsberger, Christian
2016-10-01
Genotype imputation is a key component of genetic association studies, where it increases power, facilitates meta-analysis, and aids interpretation of signals. Genotype imputation is computationally demanding and, with current tools, typically requires access to a high-performance computing cluster and to a reference panel of sequenced genomes. Here we describe improvements to imputation machinery that reduce computational requirements by more than an order of magnitude with no loss of accuracy in comparison to standard imputation tools. We also describe a new web-based service for imputation that facilitates access to new reference panels and greatly improves user experience and productivity.
Implementation of the force decomposition machine for molecular dynamics simulations.
Borštnik, Urban; Miller, Benjamin T; Brooks, Bernard R; Janežič, Dušanka
2012-09-01
We present the design and implementation of the force decomposition machine (FDM), a cluster of personal computers (PCs) that is tailored to running molecular dynamics (MD) simulations using the distributed diagonal force decomposition (DDFD) parallelization method. The cluster interconnect architecture is optimized for the communication pattern of the DDFD method. Our implementation of the FDM relies on standard commodity components even for networking. Although the cluster is meant for DDFD MD simulations, it remains general enough for other parallel computations. An analysis of several MD simulation runs on both the FDM and a standard PC cluster demonstrates that the FDM's interconnect architecture provides a greater performance compared to a more general cluster interconnect. Copyright © 2012 Elsevier Inc. All rights reserved.
Fox, Anne-Marie V; Kedgley, Angela E; Lalone, Emily A; Johnson, James A; Athwal, George S; Jenkyn, Thomas R
2011-11-10
Standard, beaded radiostereometric analysis (RSA) and markerless RSA often use computed tomography (CT) scans to create three-dimensional (3D) bone models. However, ethical concerns exist due to risks associated with CT radiation exposure. Therefore, the aim of this study was to investigate the effect of decreasing CT dosage on RSA accuracy. Four cadaveric shoulder specimens were scanned using a normal-dose CT protocol and two low-dose protocols, where the dosage was decreased by 89% and 98%. 3D computer models of the humerus and scapula were created using each CT protocol. Bi-planar fluoroscopy was used to image five different static glenohumeral positions and two dynamic glenohumeral movements, of which a total of five static and four dynamic poses were selected for analysis. For standard RSA, negligible differences were found in bead (0.21±0.31mm) and bony landmark (2.31±1.90mm) locations when the CT dosage was decreased by 98% (p-values>0.167). For markerless RSA kinematic results, excellent agreement was found between the normal-dose and lowest-dose protocol, with all Spearman rank correlation coefficients greater than 0.95. Average root mean squared errors of 1.04±0.68mm and 2.42±0.81° were also found at this reduced dosage for static positions. In summary, CT dosage can be markedly reduced when performing shoulder RSA to minimize the risks of radiation exposure. Standard RSA accuracy was negligibly affected by the 98% CT dose reduction and for markerless RSA, the benefits of decreasing CT dosage to the subject outweigh the introduced errors. Copyright © 2011 Elsevier Ltd. All rights reserved.
Analysis of Flow Behavior Within An Integrated Computer-Communication Network,
1979-05-01
2015-06-30
...calculated with a high degree of accuracy, leading to intensive computational calculations and long computational times when dealing with range-depth fields... can be obtained using similitude analysis; it allows the comparison of differing explosive weights and provides the means to scale the pressure and energy...
NASA Technical Reports Server (NTRS)
Smith, Leigh M.; Parker, Nelson C. (Technical Monitor)
2002-01-01
This paper analyzes the use of Computer Aided Design (CAD) packages at NASA's Marshall Space Flight Center (MSFC). It examines the effectiveness of recent efforts to standardize CAD practices across MSFC engineering activities. An assessment of the roles played by management, designers, analysts, and manufacturers in this initiative will be explored. Finally, solutions are presented for better integration of CAD across MSFC in the future.
LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience
NASA Astrophysics Data System (ADS)
Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.
2016-12-01
CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data-provenance, preprocessing steps, etc.-as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible. It also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/ lizb/cscience.html[2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
Computable visually observed phenotype ontological framework for plants
2011-01-01
Background The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966
Effect of Contact Damage on the Strength of Ceramic Materials.
1982-10-01
...variables that are important to erosion, and a multivariate linear regression analysis is used to fit the data to the dimensional analysis... of Equations 7 and 8 by a multivariable regression analysis (room-temperature data)...
Trimarchi, Matteo; Lund, Valerie J; Nicolai, Piero; Pini, Massimiliano; Senna, Massimo; Howard, David J
2004-04-01
The Neoplasms of the Sinonasal Tract software package (NSNT v 1.0) implements a complete visual database for patients with sinonasal neoplasia, facilitating standardization of data and statistical analysis. The software, which is compatible with the Macintosh and Windows platforms, provides multiuser application with a dedicated server (on Windows NT or 2000 or Macintosh OS 9 or X and a network of clients) together with web access, if required. The system hardware consists of an Apple Power Macintosh G4 500 MHz computer with PCI bus, 256 MB of RAM plus a 60 GB hard disk, or any IBM-compatible computer with a Pentium 2 processor. Image acquisition may be performed with different frame-grabber cards for analog or digital video input of different standards (PAL, SECAM, or NTSC) and levels of quality (VHS, S-VHS, Betacam, Mini DV, DV). The visual database is based on 4th Dimension by 4D Inc, and video compression is made in real-time MPEG format. Six sections have been developed: demographics, symptoms, extent of disease, radiology, treatment, and follow-up. Acquisition of data includes computed tomography and magnetic resonance imaging, histology, and endoscopy images, allowing sequential comparison. Statistical analysis integral to the program provides Kaplan-Meier survival curves. The development of a dedicated, user-friendly database for sinonasal neoplasia facilitates a multicenter network and has obvious clinical and research benefits.
NASA Technical Reports Server (NTRS)
Hockney, George; Lee, Seungwon
2008-01-01
A computer program known as PyPele, originally written as a Python-language extension module of a C++ language program, has been rewritten in pure Python language. The original version of PyPele dispatches and coordinates parallel-processing tasks on cluster computers and provides a conceptual framework for spacecraft-mission-design and -analysis software tools to run in an embarrassingly parallel mode. The original version of PyPele uses SSH (Secure Shell, a set of standards and an associated network protocol for establishing a secure channel between a local and a remote computer) to coordinate parallel processing. Instead of SSH, the present Python version of PyPele uses Message Passing Interface (MPI) [an unofficial de facto standard language-independent application programming interface for message passing on a parallel computer] while keeping the same user interface. The use of MPI instead of SSH and the preservation of the original PyPele user interface make it possible for parallel application programs written previously for the original version of PyPele to run on MPI-based cluster computers. As a result, engineers using the previously written application programs can take advantage of embarrassing parallelism without need to rewrite those programs.
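PyPele itself is not shown here; the fragment below is only a generic mpi4py sketch of the embarrassingly parallel pattern described (a coordinator scatters independent cases to workers and gathers the results), to make the MPI-based dispatch concrete. The function name and toy workload are assumptions.

```python
# Run with, e.g.:  mpiexec -n 4 python scatter_cases.py
from mpi4py import MPI

def evaluate_case(case):
    """Stand-in for an independent mission-analysis job (illustrative only)."""
    return case["id"], sum(x * x for x in case["params"])

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    cases = [{"id": i, "params": [i, i + 1, i + 2]} for i in range(20)]
    chunks = [cases[i::size] for i in range(size)]   # one chunk per process
else:
    chunks = None

my_cases = comm.scatter(chunks, root=0)          # each rank receives its chunk
my_results = [evaluate_case(c) for c in my_cases]
all_results = comm.gather(my_results, root=0)    # coordinator collects results

if rank == 0:
    flat = [r for chunk in all_results for r in chunk]
    print(sorted(flat)[:5])
```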
The microcomputer in the dental office: a new diagnostic aid.
van der Stelt, P F
1985-06-01
The first computer applications in the dental office were based upon standard accountancy procedures. Recently, more and more computer applications have become available to meet the specific requirements of dental practice. This implies not only business procedures, but also facilities to store patient records in the system and retrieve them easily. Another development concerns the automatic calculation of diagnostic data such as those provided in cephalometric analysis. Furthermore, growth and surgical results in the craniofacial area can be predicted by computerized extrapolation. Computers have been useful in obtaining the patient's anamnestic data objectively and for the making of decisions based on such data. Computer-aided instruction systems have been developed for undergraduate students to bridge the gap between textbook and patient interaction without the risks inherent in the latter. Radiology will undergo substantial changes as a result of the application of electronic imaging devices instead of the conventional radiographic films. Computer-assisted electronic imaging will enable image processing, image enhancement, pattern recognition and data transmission for consultation and storage purposes. Image processing techniques will increase image quality whilst still allowing low-dose systems. Standardization of software and system configuration and the development of 'user friendly' programs is the major concern for the near future.
Pinkerton, Steven D.; Pearson, Cynthia R.; Eachus, Susan R.; Berg, Karina M.; Grimes, Richard M.
2008-01-01
Maximizing our economic investment in HIV prevention requires balancing the costs of candidate interventions against their effects and selecting the most cost-effective interventions for implementation. However, many HIV prevention intervention trials do not collect cost information, and those that do use a variety of cost data collection methods and analysis techniques. Standardized cost data collection procedures, instrumentation, and analysis techniques are needed to facilitate the task of assessing intervention costs and to ensure comparability across intervention trials. This article describes the basic elements of a standardized cost data collection and analysis protocol and outlines a computer-based approach to implementing this protocol. Ultimately, the development of such a protocol would require contributions and “buy-in” from a diverse range of stakeholders, including HIV prevention researchers, cost-effectiveness analysts, community collaborators, public health decision makers, and funding agencies. PMID:18301128
Djuričić, Goran J; Radulovic, Marko; Sopta, Jelena P; Nikitović, Marina; Milošević, Nebojša T
2017-01-01
The prediction of induction chemotherapy response at the time of diagnosis may improve outcomes in osteosarcoma by allowing for personalized tailoring of therapy. The aim of this study was thus to investigate the predictive potential of the so far unexploited computational analysis of osteosarcoma magnetic resonance (MR) images. Fractal and gray-level co-occurrence matrix (GLCM) algorithms were employed in a retrospective analysis of MR images of primary osteosarcoma localized in the distal femur prior to the OsteoSa induction chemotherapy. The predicted and actual chemotherapy response outcomes were then compared by means of receiver operating characteristic (ROC) analysis and accuracy calculation. Dbin, Λ, and SCN were the standard fractal and GLCM features that were significantly associated with the chemotherapy outcome, but only in one of the analyzed planes. Our newly developed normalized fractal dimension, called the space-filling ratio (SFR), exerted an independent and much better predictive value, with prediction significance accomplished in two of the three imaging planes, an accuracy of 82%, and an area under the ROC curve of 0.20 (95% confidence interval 0-0.41). In conclusion, SFR as the newly designed fractal coefficient provided superior predictive performance in comparison to standard image analysis features, presumably by compensating for the tumor size variation in MR images.
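As a rough illustration of the GLCM texture step mentioned above (not the authors' pipeline), the sketch below computes a gray-level co-occurrence matrix and a few standard properties with scikit-image; in older scikit-image releases the functions are spelled greycomatrix/greycoprops. The random placeholder image stands in for an MR slice.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
# Placeholder 8-bit "image"; a real analysis would load an MR slice instead.
img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

# Co-occurrence matrix for a one-pixel offset and four directions.
glcm = graycomatrix(img, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())   # averaged over directions
```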
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
a Cognitive Approach to Teaching a Graduate-Level Geobia Course
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel A.
2016-06-01
Remote sensing image analysis training occurs both in the classroom and in the research lab. Classroom education for traditional pixel-based image analysis has been standardized across college curricula. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. The current paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for the development of an open-source repository for course materials.
Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah
2015-08-01
Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.
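A minimal sketch of the inter-rater agreement calculation reported above (mean percentage agreement across reviewer pairs), with invented ratings; the SCOR application itself is not reproduced, and the item codes are assumptions.

```python
from itertools import combinations

def mean_pairwise_agreement(ratings):
    """ratings: list of per-reviewer lists of item codes for one case.
    Returns the mean, over reviewer pairs, of the fraction of items coded
    identically."""
    pairs = list(combinations(ratings, 2))
    agree = [sum(a == b for a, b in zip(r1, r2)) / len(r1) for r1, r2 in pairs]
    return sum(agree) / len(agree)

# Four reviewers coding the same five items of one case (illustrative codes).
case_ratings = [
    ["systems", "communication", "none", "staffing",  "documentation"],
    ["systems", "communication", "none", "staffing",  "documentation"],
    ["systems", "training",      "none", "staffing",  "documentation"],
    ["systems", "communication", "none", "equipment", "documentation"],
]
print(round(mean_pairwise_agreement(case_ratings), 2))   # 0.8 for these codes
```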
But Is It Nutritious? Computer Analysis Creates Healthier Meals.
ERIC Educational Resources Information Center
Corrigan, Kathleen A.; Aumann, Margaret B.
1993-01-01
A computerized menu-planning method, "Nutrient Standard Menu Planning" (NSMP), uses today's technology to create healthier menus. Field tested in 20 California school districts, the advantages of NSMP are cost effectiveness, increased flexibility, greater productivity, improved public relations, improved finances, and improved student…
Computation of ancestry scores with mixed families and unrelated individuals.
Zhou, Yi-Hui; Marron, James S; Wright, Fred A
2018-03-01
The issue of robustness to family relationships in computing genotype ancestry scores such as eigenvector projections has received increased attention in genetic association, and is particularly challenging when sets of both unrelated individuals and closely related family members are included. The current standard is to compute loadings (left singular vectors) using unrelated individuals and to compute projected scores for remaining family members. However, projected ancestry scores from this approach suffer from shrinkage toward zero. We consider two main novel strategies: (i) matrix substitution based on decomposition of a target family-orthogonalized covariance matrix, and (ii) using family-averaged data to obtain loadings. We illustrate the performance via simulations, including resampling from 1000 Genomes Project data, and analysis of a cystic fibrosis dataset. The matrix substitution approach has similar performance to the current standard, but is simple and uses only a genotype covariance matrix, while the family-average method shows superior performance. Our approaches are accompanied by novel ancillary approaches that provide considerable insight, including individual-specific eigenvalue scree plots. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
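The "current standard" described above (loadings computed from unrelated individuals, with relatives projected onto them) corresponds to a few lines of linear algebra. The sketch below is a generic NumPy illustration with simulated genotypes, not the authors' implementation, and it omits the shrinkage adjustment that motivates their alternatives.

```python
import numpy as np

rng = np.random.default_rng(11)

n_unrel, n_fam, n_snps, k = 200, 30, 500, 2
freqs = rng.uniform(0.05, 0.5, n_snps)
unrelated = rng.binomial(2, freqs, size=(n_unrel, n_snps)).astype(float)
family = rng.binomial(2, freqs, size=(n_fam, n_snps)).astype(float)

# Standardize using allele frequencies estimated from the unrelated set.
p = unrelated.mean(axis=0) / 2.0
scale = np.sqrt(2.0 * p * (1.0 - p))
Z_unrel = (unrelated - 2.0 * p) / scale
Z_fam = (family - 2.0 * p) / scale

# SNP loadings from the unrelated individuals only (singular vectors of Z_unrel).
U, s, Vt = np.linalg.svd(Z_unrel, full_matrices=False)
loadings = Vt[:k].T                     # SNPs x k

scores_unrel = Z_unrel @ loadings       # ancestry scores for the unrelated set
scores_fam = Z_fam @ loadings           # projected scores for family members
print(scores_unrel[:2])
print(scores_fam[:2])                   # projections of this kind shrink toward zero
```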
ERIC Educational Resources Information Center
Lord, Frederic M.; Stocking, Martha
A general computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…
F-16 Task Analysis Criterion-Referenced Objective and Objectives Hierarchy Report. Volume 4
1981-03-01
Initiation cues: engine flameout. Systems presenting cues: aircraft fuel, engine. STANDARD - Authority: TACR 60-2; Performance precision: TD in first 1/3 of... Initiation cues: on short final. STANDARD - Authority: 60-2; Performance precision: +/- .5 AOA; TD zone 150-1000... +/- .05 AOA; TD zone 150-1000; Computational accuracy: N/A... TASK NO.: 1.9.4; BEHAVIOR: Perform short field landing.
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
Lefor, Alan T
2011-08-01
Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
Design and calibration of a vacuum compatible scanning tunneling microscope
NASA Technical Reports Server (NTRS)
Abel, Phillip B.
1990-01-01
A vacuum compatible scanning tunneling microscope was designed and built, capable of imaging solid surfaces with atomic resolution. The single piezoelectric tube design is compact, and makes use of sample mounting stubs standard to a commercially available surface analysis system. Image collection and display is computer controlled, allowing storage of images for further analysis. Calibration results from atomic scale images are presented.
DataForge: Modular platform for data storage and analysis
NASA Astrophysics Data System (ADS)
Nozik, Alexander
2018-04-01
DataForge is a framework for automated data acquisition, storage, and analysis that builds on modern applied-programming practice. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting, and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a certain degree of meta-programming and improves the reproducibility of results.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
The contaminant analysis automation robot implementation for the automated laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younkin, J.R.; Igou, R.E.; Urenda, T.D.
1995-12-31
The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete. A set of commands and events has been established to ready materials for transport operations. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot, and the Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.
NASA Astrophysics Data System (ADS)
de Turck, Koen; de Vuyst, Stijn; Fiems, Dieter; Wittevrongel, Sabine; Bruneel, Herwig
There is considerable interest nowadays in making wireless telecommunication more energy-efficient. The sleep-mode mechanism in WiMAX (IEEE 802.16e) is one such energy-saving measure. Recently, Samsung proposed some modifications to the sleep-mode mechanism, scheduled to appear in the forthcoming IEEE 802.16m standard, aimed at minimizing the signaling overhead. In this work, we present a performance analysis of this proposal and clarify the differences with the standard mechanism included in IEEE 802.16e. We also propose some special algorithms aimed at reducing the computational complexity of the analysis.
ERIC Educational Resources Information Center
Woodruff, David; Traynor, Anne; Cui, Zhongmin; Fang, Yu
2013-01-01
Professional standards for educational testing recommend that both the overall standard error of measurement and the conditional standard error of measurement (CSEM) be computed on the score scale used to report scores to examinees. Several methods have been developed to compute scale score CSEMs. This paper compares three methods, based on…
Standardization versus customization of glucose reporting.
Rodbard, David
2013-05-01
Bergenstal et al. (Diabetes Technol Ther 2013;15:198-211) described an important approach toward standardization of reporting and analysis of continuous glucose monitoring and self-monitoring of blood glucose (SMBG) data. The ambulatory glucose profile (AGP), a composite display of glucose by time of day that superimposes data from multiple days, is perhaps the most informative and useful of the many graphical approaches to display glucose data. However, the AGP has limitations; some variations are desirable and useful. Synchronization with respect to meals, traditionally used in glucose profiles for SMBG data, can improve characterization of postprandial glucose excursions. Several other types of graphical display are available, and recently developed ones can augment the information provided by the AGP. There is a need to standardize the parameters describing glycemic variability and cross-validate the available computer programs that calculate glycemic variability. Clinical decision support software can identify and prioritize clinical problems, make recommendations for modifications of therapy, and explain its justification for those recommendations. The goal of standardization is challenging in view of the diversity of clinical situations and of computing and display platforms and software. Standardization is desirable but must be done in a manner that permits flexibility and fosters innovation.
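As a minimal sketch of the ambulatory glucose profile idea described above (pooling readings from many days and summarizing glucose by time of day with percentile bands), the following pandas fragment uses synthetic data; the column names, bin width, and percentile choices are assumptions, not the AGP specification.

```python
import numpy as np
import pandas as pd

# Pool 14 days of synthetic CGM readings and summarize glucose by time
# of day with median and percentile bands (the core of an AGP display).
rng = np.random.default_rng(0)
timestamps = pd.date_range("2013-01-01", periods=14 * 288, freq="5min")
glucose = (120 + 35 * np.sin(2 * np.pi * timestamps.hour / 24)
           + rng.normal(0, 20, len(timestamps)))
cgm = pd.DataFrame({"timestamp": timestamps, "glucose": glucose})

# Bin by hour of day (a real AGP typically uses finer bins)
cgm["time_of_day"] = cgm["timestamp"].dt.hour
profile = (cgm.groupby("time_of_day")["glucose"]
              .quantile([0.10, 0.25, 0.50, 0.75, 0.90])
              .unstack())
print(profile.round(1))
```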
2001-10-25
a CT image, each voxel contains an integer number which is the CT value, in Hounsfield units (HU), of the voxel. Therefore, the standard method of... Department of Electrical and Computer Engineering, University of... Journal of Pediatric Surgery, vol. 24(7), pp. 708-711, 1989. [4] I. N. Bankman, editor, Handbook of Medical Image Analysis, Academic Press, London, UK
Is the USAF Officer Corps a Fighting Force?
1988-05-01
...analysis, and operational audit. Following work measurement and computations, the standards were staffed and approved by HQ USAF...occupation group in the Air Force. Most recently, Senator John Glenn has asked the GAO to conduct an audit of pilot requirements. This information...Allocation System," Air Force Times, 14 September 1987, p. 6. 9. Pat Dalton, "Audit Pilot Requirements, Glenn Asks GAO," Air Force Times, 18 January 1988
The use of gas chromatographic-mass spectrometric-computer systems in pharmacokinetic studies.
Horning, M G; Nowlin, J; Stafford, M; Lertratanangkoon, K; Sommer, K R; Hill, R M; Stillwell, R N
1975-10-29
Pharmacokinetic studies involving plasma, urine, breast milk, saliva and liver homogenates have been carried out by selective ion detection with a gas chromatographic-mass spectrometric-computer system operated in the chemical ionization mode. Stable isotope labeled drugs were used as internal standards for quantification. The half-lives, the concentration at zero time, the slope (regression coefficient), the maximum velocity of the reaction and the apparent Michaelis constant of the reaction were determined by regression analysis, and also by graphic means.
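A minimal sketch of the regression step described above, assuming first-order (log-linear) elimination; the time points and concentrations are illustrative values, not data from the study.

```python
import numpy as np

# Log-linear regression of concentration on time: ln C = ln C0 + slope*t,
# giving the elimination rate constant, C0, and the half-life.
t = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 12.0])   # hours (illustrative)
c = np.array([9.1, 7.4, 5.0, 3.3, 2.2, 1.0])    # plasma concentration

slope, intercept = np.polyfit(t, np.log(c), 1)
k = -slope                    # elimination rate constant
c0 = np.exp(intercept)        # concentration at time zero
t_half = np.log(2) / k        # half-life

print(f"k = {k:.3f} 1/h, C0 = {c0:.2f}, t1/2 = {t_half:.2f} h")
```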
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
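The following is a minimal numpy sketch of the kind of bootstrap-based probabilistic sensitivity analysis the authors describe: patient-level cost and effect pairs are resampled with replacement and the incremental results summarized. All numbers, including the willingness-to-pay threshold, are synthetic assumptions; the published method's details are not reproduced here.

```python
import numpy as np

# Bootstrap PSA sketch: resample patient-level (cost, effect) pairs for
# two strategies and summarize incremental cost and effect. Synthetic data.
rng = np.random.default_rng(1)
n = 200
cost_new, eff_new = rng.normal(1200, 300, n), rng.normal(0.80, 0.15, n)
cost_std, eff_std = rng.normal(900, 250, n),  rng.normal(0.72, 0.15, n)

n_boot = 5000
d_cost = np.empty(n_boot)
d_eff = np.empty(n_boot)
for b in range(n_boot):
    i = rng.integers(0, n, n)   # resample patients with replacement
    j = rng.integers(0, n, n)
    d_cost[b] = cost_new[i].mean() - cost_std[j].mean()
    d_eff[b] = eff_new[i].mean() - eff_std[j].mean()

threshold = 5000  # willingness to pay per unit of effect (assumption)
prob_ce = np.mean(threshold * d_eff - d_cost > 0)
print(f"mean ICER ~ {d_cost.mean() / d_eff.mean():.0f}, "
      f"P(cost-effective) = {prob_ce:.2f}")
```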
PACM: A Two-Stage Procedure for Analyzing Structural Models.
ERIC Educational Resources Information Center
Lehmann, Donald R.; Gupta, Sunil
1989-01-01
Path Analysis of Covariance Matrix (PACM) is described as a way to separately estimate measurement and structural models using standard least squares procedures. PACM was empirically compared to simultaneous maximum likelihood estimation and use of the LISREL computer program, and its advantages are identified. (SLD)
Wu, Jia; Heike, Carrie; Birgfeld, Craig; Evans, Kelly; Maga, Murat; Morrison, Clinton; Saltzman, Babette; Shapiro, Linda; Tse, Raymond
2016-11-01
Quantitative measures of facial form to evaluate treatment outcomes for cleft lip (CL) are currently limited. Computer-based analysis of three-dimensional (3D) images provides an opportunity for efficient and objective analysis. The purpose of this study was to define a computer-based standard of identifying the 3D midfacial reference plane of the face in children with unrepaired cleft lip for measurement of facial symmetry. The 3D images of 50 subjects (35 with unilateral CL, 10 with bilateral CL, five controls) were included in this study. Five methods of defining a midfacial plane were applied to each image, including two human-based (Direct Placement, Manual Landmark) and three computer-based (Mirror, Deformation, Learning) methods. Six blinded raters (three cleft surgeons, two craniofacial pediatricians, and one craniofacial researcher) independently ranked and rated the accuracy of the defined planes. Among computer-based methods, the Deformation method performed significantly better than the others. Although human-based methods performed best, there was no significant difference compared with the Deformation method. The average correlation coefficient among raters was .4; however, it was .7 and .9 when the angular difference between planes was greater than 6° and 8°, respectively. Raters can agree on the 3D midfacial reference plane in children with unrepaired CL using digital surface mesh. The Deformation method performed best among computer-based methods evaluated and can be considered a useful tool to carry out automated measurements of facial symmetry in children with unrepaired cleft lip.
Mesoscale and severe storms (Mass) data management and analysis system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.; Dickerson, M.
1984-01-01
Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric data base management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random access formats is implemented and integrated with the MASS AVE80 Series general purpose plotting and graphics display data analysis software package. An interactive analysis and display graphics software package (AVE80) to analyze large volumes of conventional and satellite derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing APPLE III computer systems within individual scientist offices and integrated with the MASS system, thus providing color video display, graphics, and characters display of the four data types.
The vehicle design evaluation program - A computer-aided design procedure for transport aircraft
NASA Technical Reports Server (NTRS)
Oman, B. H.; Kruse, G. S.; Schrader, O. E.
1977-01-01
The vehicle design evaluation program is described. This program is a computer-aided design procedure that provides a vehicle synthesis capability for vehicle sizing, external load analysis, structural analysis, and cost evaluation. The vehicle sizing subprogram provides geometry, weight, and balance data for aircraft using JP, hydrogen, or methane fuels. The structural synthesis subprogram uses a multistation analysis for aerodynamic surfaces and fuselages to develop theoretical weights and geometric dimensions. The parts definition subprogram uses the geometric data from the structural analysis and develops the predicted fabrication dimensions, parts material raw stock buy requirements, and predicted actual weights. The cost analysis subprogram uses detail part data in conjunction with standard hours, realization factors, labor rates, and material data to develop the manufacturing costs. The program is used to evaluate overall design effects on subsonic commercial type aircraft due to parameter variations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, R.; Neymark, J.
2013-07-01
ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
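PHOENICS itself is not shown here; as a generic illustration of the message-passing pattern such a solver relies on (domain decomposition with ghost-cell exchange plus a global reduction), the sketch below uses mpi4py on a toy 1-D Jacobi smoothing step. The domain size and iteration count are arbitrary assumptions.

```python
from mpi4py import MPI
import numpy as np

# Generic MPI pattern: 1-D domain decomposition with halo (ghost-cell)
# exchange for a Jacobi smoothing step, plus a global reduction.
# Run with e.g.: mpiexec -n 4 python halo_demo.py
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                  # interior cells per rank (assumption)
u = np.zeros(n_local + 2)      # +2 ghost cells
u[1:-1] = rank                 # some initial data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(10):
    # Exchange boundary values with neighbours (ghost-cell update)
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Local Jacobi update on interior cells
    u[1:-1] = 0.5 * (u[:-2] + u[2:])

total = comm.allreduce(np.sum(u[1:-1] ** 2), op=MPI.SUM)
if rank == 0:
    print("global norm:", np.sqrt(total))
```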
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, S.; Zacharia, T.; Baltas, N.
1995-04-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
2HOT: An Improved Parallel Hashed Oct-Tree N-Body Algorithm for Cosmological Simulation
Warren, Michael S.
2014-01-01
We report on improvements made over the past two decades to our adaptive treecode N-body method (HOT). A mathematical and computational approach to the cosmological N-body problem is described, with performance and scalability measured up to 256k (2^18) processors. We present error analysis and scientific application results from a series of more than ten 69 billion (4096^3) particle cosmological simulations, accounting for 4×10^20 floating point operations. These results include the first simulations using the new constraints on the standard model of cosmology from the Planck satellite. Our simulations set a new standard for accuracy and scientific throughput, while meeting or exceeding the computational efficiency of the latest generation of hybrid TreePM N-body methods.
A Format for Phylogenetic Placements
Matsen, Frederick A.; Hoffman, Noah G.; Gallagher, Aaron; Stamatakis, Alexandros
2012-01-01
We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement. PMID:22383988
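To illustrate the kind of JSON document described above, the sketch below builds and parses a simplified jplace-style placement file with a standard JSON library; the field names follow the published format description, but the tree, values, and read names are invented for illustration.

```python
import json

# Simplified jplace-style document (field names follow the published
# format description; the contents are illustrative, not normative).
doc = """
{
  "version": 3,
  "tree": "((A:0.2{0},B:0.3{1}):0.1{2},C:0.4{3}){4};",
  "fields": ["edge_num", "likelihood", "like_weight_ratio",
             "distal_length", "pendant_length"],
  "placements": [
    {"p": [[0, -1234.5, 0.8, 0.05, 0.1],
           [2, -1236.1, 0.2, 0.02, 0.2]],
     "n": ["read_001"]}
  ],
  "metadata": {"invocation": "example only"}
}
"""

data = json.loads(doc)
fields = data["fields"]
for placement in data["placements"]:
    for row in placement["p"]:
        record = dict(zip(fields, row))
        print(placement["n"][0], "-> edge", record["edge_num"],
              "LWR", record["like_weight_ratio"])
```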
A format for phylogenetic placements.
Matsen, Frederick A; Hoffman, Noah G; Gallagher, Aaron; Stamatakis, Alexandros
2012-01-01
We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement.
Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.
2017-01-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625
Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M
2017-04-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.
NASA Technical Reports Server (NTRS)
Hofmann, R.
1980-01-01
The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.
Trends in computer applications in science assessment
NASA Astrophysics Data System (ADS)
Kumar, David D.; Helgeson, Stanley L.
1995-03-01
Seven computer applications to science assessment are reviewed. Conventional test administration includes record keeping, grading, and managing test banks. Multiple-choice testing involves forced selection of an answer from a menu, whereas constructed-response testing involves options for students to present their answers within a set standard deviation. Adaptive testing attempts to individualize the test to minimize the number of items and time needed to assess a student's knowledge. Figural response testing assesses science proficiency in pictorial or graphic mode and requires the student to construct a mental image rather than selecting a response from a multiple choice menu. Simulations have been found useful for performance assessment on a large-scale basis in part because they make it possible to independently specify different aspects of a real experiment. An emerging approach to performance assessment is solution pathway analysis, which permits the analysis of the steps a student takes in solving a problem. Virtually all computer-based testing systems improve the quality and efficiency of record keeping and data analysis.
A Rocket Engine Design Expert System
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1989-01-01
The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state of the art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the H2-O2 coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One dimensional equilibrium chemistry was used in the energy release analysis of the combustion chamber. A 3-D conduction and/or 1-D advection analysis is used to predict heat transfer and coolant channel wall temperature distributions, in addition to coolant temperature and pressure drop. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.
A rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1989-01-01
The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state-of-the-art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the hydrogen-oxygen coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One-dimensional equilibrium chemistry was employed in the energy release analysis of the combustion chamber and three-dimensional finite-difference analysis of the regenerative cooling channels was used to calculate the pressure drop along the channels and the coolant temperature as it exits the coolant circuit. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.
"Type Ia Supernovae: Tools for Studying Dark Energy" Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woosley, Stan; Kasen, Dan
2017-05-10
Final technical report for project "Type Ia Supernovae: Tools for the Study of Dark Energy" awarded jointly to scientists at the University of California, Santa Cruz and Berkeley, for computer modeling, theory and data analysis relevant to the use of Type Ia supernovae as standard candles for cosmology.
Publication Of Oceanographic Data on CD-ROM
NASA Technical Reports Server (NTRS)
Hilland, Jeffrey E.; Smith, Elizabeth A.; Martin, Michael D.
1992-01-01
Large collections of oceanographic data and other large collections of data are published on CD-ROMs in formats facilitating access and analysis. The process involves four major steps: preprocessing, premastering, mastering, and verification. The large capacity, small size, commercial availability, long life, and standard format of CD-ROMs offer advantages over computer-compatible magnetic tape.
ERIC Educational Resources Information Center
Remley, Dirk
2009-01-01
Carter (2007) identifies four meta-genres associated with writing activities that can help students learn discipline-specific writing skills relative to standards within a given field: these include problem solving, empirical approaches to analysis, selection of sources to use within research, and production of materials that meet accepted…
An analysis of ratings: A guide to RMRATE
Thomas C. Brown; Terry C. Daniel; Herbert W. Schroeder; Glen E. Brink
1990-01-01
This report describes RMRATE, a computer program for analyzing rating judgments. RMRATE scales ratings using several scaling procedures, and compares the resulting scale values. The scaling procedures include the median and simple mean, standardized values, scale values based on Thurstone's Law of Categorical Judgment, and regression-based values. RMRATE also...
The role of continuity in residual-based variational multiscale modeling of turbulence
NASA Astrophysics Data System (ADS)
Akkerman, I.; Bazilevs, Y.; Calo, V. M.; Hughes, T. J. R.; Hulshoff, S.
2008-02-01
This paper examines the role of continuity of the basis in the computation of turbulent flows. We compare standard finite elements and non-uniform rational B-splines (NURBS) discretizations that are employed in Isogeometric Analysis (Hughes et al. in Comput Methods Appl Mech Eng, 194:4135-4195, 2005). We make use of quadratic discretizations that are C^0-continuous across element boundaries in standard finite elements, and C^1-continuous in the case of NURBS. The variational multiscale residual-based method (Bazilevs in Isogeometric analysis of turbulence and fluid-structure interaction, PhD thesis, ICES, UT Austin, 2006; Bazilevs et al. in Comput Methods Appl Mech Eng, submitted, 2007; Calo in Residual-based multiscale turbulence modeling: finite volume simulation of bypass transition, PhD thesis, Department of Civil and Environmental Engineering, Stanford University, 2004; Hughes et al. in proceedings of the XXI international congress of theoretical and applied mechanics (IUTAM), Kluwer, 2004; Scovazzi in Multiscale methods in science and engineering, PhD thesis, Department of Mechanical Engineering, Stanford University, 2004) is employed as a turbulence modeling technique. We find that C^1-continuous discretizations outperform their C^0-continuous counterparts on a per-degree-of-freedom basis. We also find that the effect of continuity is greater for higher Reynolds number flows.
Evaluation of Shipbuilding CAD/CAM/CIM Systems - Phase II (Requirements for Future Systems)
1997-02-01
INNOVATION MARINE INDUSTRY STANDARDS WELDING INDUSTRIAL ENGINEERING EDUCATION AND TRAINING THE NATIONAL SHIPBUILDING RESEARCH PROGRAM February 1997 NSRP 0479...an analysis of CAD/CAM/CIM in shipyards, ship-design software firms, and allied industries in Europe, Japan and the U.S. The purpose of the analysis...possible: Black and Veatch, Hitachi Ariake Works, Industrial Technology Institute, Intergraph Corporation, Kockums Computer Systems, Mitsubishi Heavy Industries
Artificial neural network-aided image analysis system for cell counting.
Sjöström, P J; Frydel, B R; Wahlberg, L U
1999-05-01
In histological preparations containing debris and synthetic materials, it is difficult to automate cell counting using standard image analysis tools, i.e., systems that rely on boundary contours, histogram thresholding, etc. In an attempt to mimic manual cell recognition, an automated cell counter was constructed using a combination of artificial intelligence and standard image analysis methods. Artificial neural network (ANN) methods were applied on digitized microscopy fields without pre-ANN feature extraction. A three-layer feed-forward network with extensive weight sharing in the first hidden layer was employed and trained on 1,830 examples using the error back-propagation algorithm on a Power Macintosh 7300/180 desktop computer. The optimal number of hidden neurons was determined and the trained system was validated by comparison with blinded human counts. System performance at 50x and 100x magnification was evaluated. The correlation index at 100x magnification neared person-to-person variability, while 50x magnification was not useful. The system was approximately six times faster than an experienced human. ANN-based automated cell counting in noisy histological preparations is feasible. Consistent histology and computer power are crucial for system performance. The system provides several benefits, such as speed of analysis and consistency, and frees up personnel for other tasks.
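The original system ran on a Power Macintosh with custom weight sharing; as a toy illustration of the underlying idea only (a small feed-forward network trained by backpropagation to label image patches as cell versus background), the sketch below uses scikit-learn on synthetic patches.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy sketch: train a small feed-forward network to label fixed-size
# image patches as "cell" vs "background". Patches are synthetic noise
# with a brighter band for the "cell" class, standing in for digitized
# microscopy fields.
rng = np.random.default_rng(0)
patch = 8 * 8
n = 2000
X_bg = rng.normal(0.2, 0.1, (n // 2, patch))
X_cell = rng.normal(0.2, 0.1, (n // 2, patch))
X_cell[:, patch // 2 - 4:patch // 2 + 4] += 0.6   # crude bright "cell" core
X = np.vstack([X_bg, X_cell])
y = np.array([0] * (n // 2) + [1] * (n // 2))

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", round(clf.score(X, y), 3))
```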
[A study of biochemical method for urine test based on color difference estimation].
Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Zhou, Fengkun
2008-02-01
The biochemical analysis of urine is an important inspection and diagnosis method in hospitals. Conventional urine analysis relies mainly on colorimetric visual appraisement and automated detection; the visual appraisement technique has largely been superseded, and automated detection is the method adopted in hospitals. Moreover, the price of a urine biochemical analyzer on the market is around twenty thousand RMB yuan, which puts it out of reach of ordinary families. A computer vision system is not subject to the physiological and psychological influences of a human observer, and its appraisement standard is objective and stable. Therefore, based on color theory, we established a computer vision system that performs the collection, management, display, and appraisement of the color difference between the standard threshold color and the color of the urine test paper after reaction with the urine sample, so that the severity of an illness can be judged accurately. In this paper, we introduce this new urine test biochemical analysis method, which can be popularized for home use. Experimental results show that the test method is easy to use and cost-effective. It can support monitoring over a whole course of illness and can find extensive applications.
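The abstract does not give the exact color-difference formula used; as one common choice, the sketch below computes the CIE76 Delta E in CIELAB space between a hypothetical reacted test-pad color and a set of hypothetical reference levels.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two CIELAB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical CIELAB values for a reacted test pad and reference levels
test_pad = (62.0, 18.5, 30.2)
reference_levels = {"negative": (75.0, 5.0, 20.0),
                    "trace":    (68.0, 12.0, 26.0),
                    "positive": (60.0, 20.0, 31.0)}

closest = min(reference_levels,
              key=lambda k: delta_e_cie76(test_pad, reference_levels[k]))
print("closest reference level:", closest)
```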
Wood, Scott T; Dean, Brian C; Dean, Delphine
2013-04-01
This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery. Copyright © 2012 Elsevier B.V. All rights reserved.
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
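A simplified sketch of the core computation described above: pool the path-a and path-b coefficients across trials and form a Monte Carlo confidence interval for the combined mediated effect. For brevity this uses inverse-variance (fixed-effect) pooling rather than the paper's random-effects bivariate model, and all inputs are synthetic.

```python
import numpy as np

# Combine path-a and path-b estimates across trials, then Monte Carlo
# confidence interval for the mediated effect a*b. Synthetic inputs.
a = np.array([0.42, 0.35, 0.50]); se_a = np.array([0.10, 0.12, 0.09])
b = np.array([0.30, 0.22, 0.28]); se_b = np.array([0.08, 0.09, 0.07])

def pooled(est, se):
    """Inverse-variance weighted mean and its standard error."""
    w = 1.0 / se**2
    return np.sum(w * est) / np.sum(w), np.sqrt(1.0 / np.sum(w))

a_bar, se_a_bar = pooled(a, se_a)
b_bar, se_b_bar = pooled(b, se_b)

rng = np.random.default_rng(0)
draws = (rng.normal(a_bar, se_a_bar, 100_000)
         * rng.normal(b_bar, se_b_bar, 100_000))
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"combined a*b = {a_bar * b_bar:.3f}, 95% MC CI = ({lo:.3f}, {hi:.3f})")
```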
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; Ilie, marcel; Shallhorn, Paul A.
2014-01-01
Computational Fluid Dynamics (CFD) is the standard numerical tool used by Fluid Dynamists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
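A minimal illustration, not the paper's exact procedure: given a small sample of CFD outputs obtained by perturbing input parameters, a Student-t interval about the mean can be formed with scipy; the run values below are synthetic.

```python
import numpy as np
from scipy import stats

# Student-t uncertainty interval from a small sample of CFD outputs
# (e.g. predicted pressure drop from runs with perturbed inputs).
pressure_drop = np.array([101.2, 99.8, 100.5, 102.1, 100.9, 99.5])  # synthetic

n = len(pressure_drop)
mean = pressure_drop.mean()
sem = pressure_drop.std(ddof=1) / np.sqrt(n)
lo, hi = stats.t.interval(0.95, n - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, 95% t-interval = ({lo:.2f}, {hi:.2f})")
```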
NASA Technical Reports Server (NTRS)
Chadwick, C.
1984-01-01
This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta v vector. The TCM Delta v vector is modeled as a three component Cartesian vector each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta v, (2) points of the probability density function of the magnitude of Delta v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta v. The approximations are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
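The paper derives approximations; the sketch below instead brute-forces the same quantities by Monte Carlo for zero-mean Gaussian components with unequal standard deviations. The sigma values are illustrative only.

```python
import numpy as np

# Monte Carlo statistics of |dv| when the three Cartesian components are
# zero-mean Gaussians with (possibly unequal) standard deviations.
rng = np.random.default_rng(0)
sigmas = np.array([1.0, 0.5, 0.2])          # m/s per axis (illustrative)
samples = rng.normal(0.0, sigmas, size=(1_000_000, 3))
dv_mag = np.linalg.norm(samples, axis=1)

print("mean |dv|      :", round(dv_mag.mean(), 3))
print("std  |dv|      :", round(dv_mag.std(), 3))
print("95th percentile:", round(np.percentile(dv_mag, 95), 3))
```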
Performance management of multiple access communication networks
NASA Astrophysics Data System (ADS)
Lee, Suk; Ray, Asok
1993-12-01
This paper focuses on conceptual design, development, and implementation of a performance management tool for computer communication networks to serve large-scale integrated systems. The objective is to improve the network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the algorithm of performance management. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step forward to bridging the gap between management standards and users' demands for efficient network operations since most standards such as ISO (International Standards Organization) and IEEE address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist design, operation, and management of various DEDS such as computer integrated manufacturing and battlefield C3 (Command, Control, and Communications).
Toward a new culture in verified quantum operations
NASA Astrophysics Data System (ADS)
Flammia, Steve
Measuring error rates of quantum operations has become an indispensable component in any aspiring platform for quantum computation. As the quality of controlled quantum operations increases, the demands on the accuracy and precision with which we measure these error rates also grow. However, well-meaning scientists who report these error measures are faced with a sea of non-standardized methodologies and are often asked during publication for only coarse information about how their estimates were obtained. Moreover, there are serious incentives to use methodologies and measures that will continually produce numbers that improve with time to show progress. These problems will only get exacerbated as our typical error rates go from 1 in 100 to 1 in 1000 or less. This talk will survey existing challenges presented by the current paradigm and offer some suggestions for solutions that can help us move toward fair and standardized methods for error metrology in quantum computing experiments, and towards a culture that values full disclosure of methodologies and higher standards for data analysis.
Thermal radiation view factor: Methods, accuracy and computer-aided procedures
NASA Technical Reports Server (NTRS)
Kadaba, P. V.
1982-01-01
Computer-aided thermal analysis programs, which predict whether orbiting equipment will remain within a predetermined acceptable temperature range prior to stationing in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of these view factors. Basic definitions and standard methods which form the basis for various digital computer methods and various numerical methods are presented. The physical model and the mathematical methods on which a number of available programs are built are summarized. The strengths and the weaknesses of the methods employed, the accuracy of the calculations and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and the future choices for efficient use of digital computers are included in the recommendations.
2015-10-01
capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted. Also, that funding for the...bin to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap’s...for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method -- a
Influence of counting chamber type on CASA outcomes of equine semen analysis.
Hoogewijs, M K; de Vliegher, S P; Govaere, J L; de Schauwer, C; de Kruif, A; van Soom, A
2012-09-01
Sperm motility is considered to be one of the key features of semen analysis. Assessment of motility is frequently performed using computer-assisted sperm analysis (CASA). Nevertheless, no uniform standards are present to analyse a semen sample using CASA. We hypothesised that the type of counting chamber used might influence the results of analysis and aimed to study the effect of chamber type on estimated concentration and motility of an equine semen sample assessed using CASA. Commonly used disposable Leja chambers of different depths were compared with disposable and reusable ISAS chambers, a Makler chamber and a World Health Organization (WHO) motility slide. Motility parameters and concentrations obtained with CASA using these different chambers were analysed. The NucleoCounter was used as gold standard for determining concentration. Concentration and motility parameters were significantly influenced by the chamber type used. Using the NucleoCounter as the gold standard for determining concentration, the correlation coefficients were low for all of the various chambers evaluated, with the exception of the 12 µm deep Leja chamber. Filling a chamber by capillary forces resulted in a lower observed concentration and reduced motility parameters. All chambers evaluated in this study resulted in significant lower progressive motility than the WHO prepared slide, with the exception of the Makler chamber, which resulted in a slight, but statistically significant, increase in progressive motility estimates. Computer-assisted sperm analysis can only provide a rough estimate of sperm concentration and overestimation is likely when drop-filled slides with a coverslip are used. Motility estimates using CASA are highly influenced by the counting chamber; therefore, a complete description of the chamber type used should be provided in semen reports and in scientific articles. © 2011 EVJ Ltd.
Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions
ERIC Educational Resources Information Center
Torbeyns, Joke; Verschaffel, Lieven
2016-01-01
This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
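A compact sketch of the key idea stated above, namely that every bootstrap sample of subjects lives in the n-dimensional column space of the original data, so bootstrap PCA only requires SVDs of small n-by-n coordinate matrices. This is simplified relative to the published method and uses synthetic data.

```python
import numpy as np

# Fast bootstrap PCA sketch: SVD once on the p x n data, then bootstrap
# only the n x n coordinate matrix and map components back through U.
rng = np.random.default_rng(0)
p, n, n_boot = 5000, 50, 200
Y = rng.normal(size=(p, n))
Y = Y - Y.mean(axis=1, keepdims=True)              # center across subjects

U, D, Vt = np.linalg.svd(Y, full_matrices=False)   # U: p x n
A = np.diag(D) @ Vt                                # n x n coordinates

first_pc_draws = []
for _ in range(n_boot):
    idx = rng.integers(0, n, n)                    # resample subjects
    Ab = A[:, idx]
    Ab = Ab - Ab.mean(axis=1, keepdims=True)
    Ub, _, _ = np.linalg.svd(Ab, full_matrices=False)
    pc1 = U @ Ub[:, 0]                             # p-dimensional PC1
    if pc1 @ U[:, 0] < 0:                          # align signs across draws
        pc1 = -pc1
    first_pc_draws.append(pc1)

se_pc1 = np.std(np.array(first_pc_draws), axis=0)  # bootstrap SEs of loadings
print("median bootstrap SE of PC1 loadings:", round(float(np.median(se_pc1)), 4))
```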
Kai, Chiharu; Uchiyama, Yoshikazu; Shiraishi, Junji; Fujita, Hiroshi; Doi, Kunio
2018-05-10
In the post-genome era, a novel research field, 'radiomics', has been developed to offer a new viewpoint for the use of genotypes in radiology and medicine research, which have traditionally focused on the analysis of imaging phenotypes. The present study analyzed brain morphological changes related to the individual's genotype. Our data consisted of magnetic resonance (MR) images of patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD), as well as their apolipoprotein E (APOE) genotypes. First, statistical parametric mapping (SPM) 12 was used for three-dimensional anatomical standardization of the brain MR images. A total of 30 normal images were used to create a standard normal brain image. Z-score maps were generated to identify the differences between an abnormal image and the standard normal brain. Our experimental results revealed that cerebral atrophies, depending on genotypes, can occur in different locations and that morphological changes may differ between MCI and AD. Using a classifier to characterize cerebral atrophies related to an individual's genotype, we developed a computer-aided diagnosis (CAD) scheme to identify the disease. For the early detection of cerebral diseases, a screening system using MR images, called Brain Check-up, is widely performed in Japan. Therefore, our proposed CAD scheme could be used in Brain Check-up.
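A minimal numpy sketch of the Z-score mapping step described above: after anatomical standardization, each voxel of a patient image is compared against the voxelwise mean and standard deviation of the normal database. The arrays below are small synthetic volumes, not MR data.

```python
import numpy as np

# Voxelwise Z-score map: z = (x - mean_normal) / sd_normal, computed
# over a database of spatially standardized normal images.
rng = np.random.default_rng(0)
normals = rng.normal(100, 10, size=(30, 16, 16, 16))   # 30 normal scans
patient = rng.normal(100, 10, size=(16, 16, 16))
patient[4:8, 4:8, 4:8] -= 25                            # simulated atrophy

mean_n = normals.mean(axis=0)
sd_n = normals.std(axis=0, ddof=1)
z_map = (patient - mean_n) / sd_n

print("voxels with z < -2:", int((z_map < -2).sum()))
```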
Provenance-aware optimization of workload for distributed data production
NASA Astrophysics Data System (ADS)
Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal
2017-10-01
Distributed data processing in High Energy and Nuclear Physics (HENP) is a prominent example of big data analysis. Having petabytes of data being processed at tens of computational sites with thousands of CPUs, standard job scheduling approaches either do not address well the problem complexity or are dedicated to one specific aspect of the problem only (CPU, network or storage). Previously we have developed a new job scheduling approach dedicated to distributed data production - an essential part of data processing in HENP (preprocessing in big data terminology). In this contribution, we discuss the load balancing with multiple data sources and data replication, present recent improvements made to our planner and provide results of simulations which demonstrate the advantage against standard scheduling policies for the new use case. Multi-source input, or provenance, is common in the computing models of many applications, where the data may be copied to several destinations. The initial input data set would hence already be partially replicated to multiple locations, and the task of the scheduler is to maximize overall computational throughput considering possible data movements and CPU allocation. The studies have shown that our approach can provide a significant gain in overall computational performance in a wide scope of simulations considering realistic size of computational Grid and various input data distribution.
NASA Astrophysics Data System (ADS)
Hofierka, Jaroslav; Lacko, Michal; Zubal, Stanislav
2017-10-01
In this paper, we describe the parallelization of three complex and computationally intensive modules of GRASS GIS using the OpenMP application programming interface for multi-core computers. These include the v.surf.rst module for spatial interpolation, the r.sun module for solar radiation modeling and the r.sim.water module for water flow simulation. We briefly describe the functionality of the modules and parallelization approaches used in the modules. Our approach includes the analysis of the module's functionality, identification of source code segments suitable for parallelization and proper application of OpenMP parallelization code to create efficient threads processing the subtasks. We document the efficiency of the solutions using the airborne laser scanning data representing land surface in the test area and derived high-resolution digital terrain model grids. We discuss the performance speed-up and parallelization efficiency depending on the number of processor threads. The study showed a substantial increase in computation speeds on a standard multi-core computer while maintaining the accuracy of results in comparison to the output from original modules. The presented parallelization approach showed the simplicity and efficiency of the parallelization of open-source GRASS GIS modules using OpenMP, leading to an increased performance of this geospatial software on standard multi-core computers.
Gold-standard for computer-assisted morphological sperm analysis.
Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen
2017-04-01
Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments for comparing common sperm head description and classification techniques. This classification baseline is intended to be used as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads. By using the Fourier descriptor and SVM, we achieved the best mean correct classification: only 49%. We conclude that the SCIAN-MorphoSpermGS will provide a standard tool for evaluation of characterization and classification approaches for human sperm heads. Indeed, there is a clear need for a specific shape-based descriptor for human sperm heads and a specific classification approach to tackle the problem of high variability within subcategories of abnormal sperm cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
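As an illustration of one descriptor/classifier pairing from this kind of baseline (shape moments plus an SVM), the sketch below uses scikit-image and scikit-learn on synthetic elliptical masks; it does not use the SCIAN-MorphoSpermGS data and is not the paper's experimental pipeline.

```python
import numpy as np
from skimage.draw import ellipse
from skimage.measure import moments_central, moments_normalized, moments_hu
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hu-moment features + SVM on synthetic binary masks standing in for
# "normal" vs "tapered" (elongated) head shapes.
rng = np.random.default_rng(0)

def synthetic_head(elongation):
    img = np.zeros((64, 64))
    rr, cc = ellipse(32, 32, 12, 12 / elongation, shape=img.shape)
    img[rr, cc] = 1.0
    return img

def hu_features(img):
    nu = moments_normalized(moments_central(img))
    return moments_hu(nu)

X, y = [], []
for _ in range(100):
    X.append(hu_features(synthetic_head(rng.uniform(1.0, 1.2)))); y.append(0)  # "normal"
    X.append(hu_features(synthetic_head(rng.uniform(1.8, 2.5)))); y.append(1)  # "tapered"

scores = cross_val_score(SVC(kernel="linear"), np.array(X), np.array(y), cv=5)
print("cross-validated accuracy:", round(scores.mean(), 2))
```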
Atlas2 Cloud: a framework for personal genome analysis in the cloud
2012-01-01
Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663
Atlas2 Cloud: a framework for personal genome analysis in the cloud.
Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli
2012-01-01
Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.
Hokamp, Karsten
2015-01-01
Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.
NASA Astrophysics Data System (ADS)
Gil, Y.; Duffy, C.
2015-12-01
This paper proposes the concept of a "Computable Catchment", which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements so that historical hydroclimatic reconstructions and/or future land-use or climate change scenarios can be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data are derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF, or Computable Document Format, with an interactive open-source reader accessible on any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document while maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future: an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
Saito, Taiichi; Sugiyama, Kazuhiko; Ikawa, Fusao; Yamasaki, Fumiyuki; Ishifuro, Minoru; Takayasu, Takeshi; Nosaka, Ryo; Nishibuchi, Ikuno; Muragaki, Yoshihiro; Kawamata, Takakazu; Kurisu, Kaoru
2017-01-01
The current standard treatment protocol for patients with newly diagnosed glioblastoma (GBM) includes surgery, radiotherapy, and concomitant and adjuvant temozolomide (TMZ). We hypothesized that the permeability surface area product (PS) from a perfusion computed tomography (PCT) study is associated with sensitivity to TMZ. The aim of this study was to determine whether PS values were correlated with prognosis of GBM patients who received the standard treatment protocol. This study included 36 patients with GBM that were newly diagnosed between October 2005 and September 2014 and who underwent preoperative PCT study and the standard treatment protocol. We measured the maximum value of relative cerebral blood volume (rCBVmax) and the maximum PS value (PSmax). We statistically examined the relationship between PSmax and prognosis using survival analysis, including other clinicopathologic factors (age, Karnofsky performance status [KPS], extent of resection, O6-methylguanine-DNA methyltransferase [MGMT] status, second-line use of bevacizumab, and rCBVmax). Log-rank tests revealed that age, KPS, MGMT status, and PSmax were significantly correlated with overall survival. Multivariate analysis using the Cox regression model showed that PSmax was the most significant prognostic factor. Receiver operating characteristic curve analysis showed that PSmax had the highest accuracy in differentiating longtime survivors (LTSs) (surviving more than 2 years) from non-LTSs. At a cutoff point of 8.26 mL/100 g/min, sensitivity and specificity were 90% and 70%, respectively. PSmax from PCT study can help predict survival time in patients with GBM receiving the standard treatment protocol. Survival may be related to sensitivity to TMZ. Copyright © 2016 Elsevier Inc. All rights reserved.
A data acquisition and storage system for the ion auxiliary propulsion system cyclic thruster test
NASA Technical Reports Server (NTRS)
Hamley, John A.
1989-01-01
A nine-track tape drive interfaced to a standard personal computer was used to transport data from a remote test site to the NASA Lewis mainframe computer for analysis. The Cyclic Ground Test of the Ion Auxiliary Propulsion System (IAPS), which successfully achieved its goal of 2557 cycles and 7057 hr of thrusting beam on time generated several megabytes of test data over many months of continuous testing. A flight-like controller and power supply were used to control the thruster and acquire data. Thruster data was converted to RS232 format and transmitted to a personal computer, which stored the raw digital data on the nine-track tape. The tape format was such that with minor modifications, mainframe flight data analysis software could be used to analyze the Cyclic Ground Test data. The personal computer also converted the digital data to engineering units and displayed real time thruster parameters. Hardcopy data was printed at a rate dependent on thruster operating conditions. The tape drive provided a convenient means to transport the data to the mainframe for analysis, and avoided a development effort for new data analysis software for the Cyclic test. This paper describes the data system, interfacing and software requirements.
Building a Prototype of LHC Analysis Oriented Computing Centers
NASA Astrophysics Data System (ADS)
Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.
2012-12-01
A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, supported by a grant from the Italian Ministry of Research. The Consortium aims to build an ad hoc infrastructure to ease analysis of the huge data sets collected at the LHC. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept with years of running experience, sites specialized for end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not an expert in computing. On the storage side, we are experimenting with storage techniques that allow remote data access and with storage optimization for typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, including virtual partitioning of the same physical network for different use patterns. Finally, on the user side, we are developing tools that allow users to monitor their processes at the site exhaustively and that provide an efficient support system in case of problems. We report the results of tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the consortium.
Expert diagnosis of plus disease in retinopathy of prematurity from computer-based image analysis
Campbell, J. Peter; Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir N.; Reynolds, James D.; Horowitz, Jason; Hutcheson, Kelly; Shapiro, Michael; Repka, Michael X.; Ferrone, Phillip; Drenser, Kimberly; Martinez-Castellanos, Maria Ana; Ostmo, Susan; Jonas, Karyn; Chan, R.V. Paul; Chiang, Michael F.
2016-01-01
Importance Published definitions of “plus disease” in retinopathy of prematurity (ROP) reference arterial tortuosity and venous dilation within the posterior pole based on a standard published photograph. One possible explanation for limited inter-expert reliability for plus disease diagnosis is that experts deviate from the published definitions. Objective To identify vascular features used by experts for diagnosis of plus disease through quantitative image analysis. Design We developed a computer-based image analysis system (Imaging and Informatics in ROP, i-ROP), and trained the system to classify images compared to a reference standard diagnosis (RSD). System performance was analyzed as a function of the field of view (circular crops 1–6 disc diameters [DD] radius) and vessel subtype (arteries only, veins only, or all vessels). The RSD was compared to the majority diagnosis of experts. Setting Routine ROP screening in neonatal intensive care units at 8 academic institutions. Participants A set of 77 digital fundus images was used to develop the i-ROP system. A subset of 73 images was independently classified by 11 ROP experts for validation. Main Outcome Measures The primary outcome measure was the percentage accuracy of i-ROP system classification of plus disease with the RSD as a function of field-of-view and vessel type. Secondary outcome measures included the accuracy of the 11 experts compared to the RSD. Results Accuracy of plus disease diagnosis by the i-ROP computer based system was highest (95%, confidence interval [CI] 94 – 95%) when it incorporated vascular tortuosity from both arteries and veins, and with the widest field of view (6 disc diameter radius). Accuracy was ≤90% when using only arterial tortuosity (P<0.001), and ≤85% using a 2–3 disc diameter view similar to the standard published photograph (p<0.001). Diagnostic accuracy of the i-ROP system (95%) was comparable to that of 11 expert clinicians (79–99%). Conclusions and Relevance ROP experts appear to consider findings from beyond the posterior retina when diagnosing plus disease, and consider tortuosity of both arteries and veins, in contrast to published definitions. It is feasible for a computer-based image analysis system to perform comparably to ROP experts, using manually segmented images. PMID:27077667
Expert Diagnosis of Plus Disease in Retinopathy of Prematurity From Computer-Based Image Analysis.
Campbell, J Peter; Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir N; Reynolds, James D; Horowitz, Jason; Hutcheson, Kelly; Shapiro, Michael; Repka, Michael X; Ferrone, Phillip; Drenser, Kimberly; Martinez-Castellanos, Maria Ana; Ostmo, Susan; Jonas, Karyn; Chan, R V Paul; Chiang, Michael F
2016-06-01
Published definitions of plus disease in retinopathy of prematurity (ROP) reference arterial tortuosity and venous dilation within the posterior pole based on a standard published photograph. One possible explanation for limited interexpert reliability for a diagnosis of plus disease is that experts deviate from the published definitions. To identify vascular features used by experts for diagnosis of plus disease through quantitative image analysis. A computer-based image analysis system (Imaging and Informatics in ROP [i-ROP]) was developed using a set of 77 digital fundus images, and the system was designed to classify images compared with a reference standard diagnosis (RSD). System performance was analyzed as a function of the field of view (circular crops with a radius of 1-6 disc diameters) and vessel subtype (arteries only, veins only, or all vessels). Routine ROP screening was conducted from June 29, 2011, to October 14, 2014, in neonatal intensive care units at 8 academic institutions, with a subset of 73 images independently classified by 11 ROP experts for validation. The RSD was compared with the majority diagnosis of experts. The primary outcome measure was the percentage of accuracy of the i-ROP system classification of plus disease, with the RSD as a function of the field of view and vessel type. Secondary outcome measures included the accuracy of the 11 experts compared with the RSD. Accuracy of plus disease diagnosis by the i-ROP computer-based system was highest (95%; 95% CI, 94%-95%) when it incorporated vascular tortuosity from both arteries and veins and with the widest field of view (6-disc diameter radius). Accuracy was 90% or less when using only arterial tortuosity and 85% or less using a 2- to 3-disc diameter view similar to the standard published photograph. Diagnostic accuracy of the i-ROP system (95%) was comparable to that of 11 expert physicians (mean 87%, range 79%-99%). Experts in ROP appear to consider findings from beyond the posterior retina when diagnosing plus disease and consider tortuosity of both arteries and veins, in contrast with published definitions. It is feasible for a computer-based image analysis system to perform comparably with ROP experts, using manually segmented images.
MOVES-Matrix and distributed computing for microscale line source dispersion analysis.
Liu, Haobing; Xu, Xiaodan; Rodgers, Michael O; Xu, Yanzhi Ann; Guensler, Randall L
2017-07-01
MOVES and AERMOD are the U.S. Environmental Protection Agency's recommended models for use in project-level transportation conformity and hot-spot analysis. However, the structure and algorithms involved in running MOVES make analyses cumbersome and time-consuming. Likewise, the modeling setup process in AERMOD, including extensive data requirements and required input formats, leads to a high potential for analysis error in dispersion modeling. This study presents a distributed computing method for line source dispersion modeling that integrates MOVES-Matrix, a high-performance emission modeling tool, with the microscale dispersion models CALINE4 and AERMOD. MOVES-Matrix was prepared by iteratively running MOVES across all possible combinations of vehicle source type, fuel, operating conditions, and environmental parameters to create a very large multi-dimensional emission rate lookup matrix. AERMOD and CALINE4 are connected with MOVES-Matrix in a distributed computing cluster using a series of Python scripts. This streamlined system built on MOVES-Matrix generates exactly the same emission rates and concentration results as using MOVES with AERMOD and CALINE4, both regulatory models approved by the U.S. EPA for conformity analysis, but the approach is more than 200 times faster than using the MOVES graphical user interface. Because AERMOD requires detailed meteorological input, which is difficult to obtain, this study also recommends using CALINE4 as a screening tool for identifying potential areas that may exceed air quality standards before using AERMOD (and for identifying areas that are exceedingly unlikely to exceed air quality standards). The CALINE4 worst-case method yields consistently higher concentration results than AERMOD for all comparisons in this paper, as expected given the nature of the meteorological data employed. The paper thus demonstrates a distributed computing method for line source dispersion modeling and highlights the potentially significant benefit of using CALINE4 as a screening tool before applying AERMOD, which requires far more meteorological input than CALINE4.
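A minimal sketch of the lookup-matrix idea follows; the keys, bins, and rate values are hypothetical and do not reflect the actual MOVES-Matrix schema. Rates are precomputed once over a grid of discretized conditions and afterwards retrieved by key instead of re-running the emissions model.

```python
# Hypothetical illustration of an emission-rate lookup matrix: precompute rates
# for every combination of discretized conditions, then answer queries by
# snapping to the nearest bins instead of rerunning the expensive model.
from itertools import product

source_types = ["passenger_car", "transit_bus"]
speed_bins_mph = [5, 15, 25, 35, 45, 55, 65]
temperature_bins_f = [20, 40, 60, 80, 100]

def run_emissions_model(source, speed, temp):
    """Stand-in for an expensive model run; returns grams/mile (made-up numbers)."""
    return 0.5 + 0.01 * speed + 0.002 * abs(temp - 60) + (1.0 if source == "transit_bus" else 0.0)

# Build the lookup matrix once (in practice, the slow offline step).
rate_matrix = {
    (s, v, t): run_emissions_model(s, v, t)
    for s, v, t in product(source_types, speed_bins_mph, temperature_bins_f)
}

def lookup_rate(source, speed_mph, temp_f):
    """Snap the query conditions to the nearest precomputed bins and return that rate."""
    v = min(speed_bins_mph, key=lambda b: abs(b - speed_mph))
    t = min(temperature_bins_f, key=lambda b: abs(b - temp_f))
    return rate_matrix[(source, v, t)]

print(lookup_rate("passenger_car", 32.0, 75.0))
```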
Marqués-Jiménez, Diego; Calleja-González, Julio; Arratibel, Iñaki; Delextrat, Anne; Terrados, Nicolás
2016-01-01
The aim was to identify the benefits of compression garments used for recovery from exercise-induced muscle damage. Computer-based literature research was performed in September 2015 using four online databases: Medline (PubMed), Cochrane, WOS (Web of Science) and Scopus. The analysis of risk of bias was completed in accordance with the Cochrane Collaboration Guidelines. Mean differences and 95% confidence intervals were calculated with Hedges' g for continuous outcomes. A random-effects meta-analysis model was used. Systematic differences (heterogeneity) were assessed with the I² statistic. Most results obtained had high heterogeneity, so they should be interpreted with caution. Our findings showed that creatine kinase (standard mean difference = -0.02, 9 studies) was unaffected when using compression garments for recovery purposes. In contrast, blood lactate concentration was increased (standard mean difference = 0.98, 5 studies). Applying compression reduced lactate dehydrogenase (standard mean difference = -0.52, 2 studies), muscle swelling (standard mean difference = -0.73, 5 studies) and perceptual measurements (standard mean difference = -0.43, 15 studies). Analyses of power (standard mean difference = 1.63, 5 studies) and strength (standard mean difference = 1.18, 8 studies) indicate faster recovery of muscle function after exercise. These results suggest that the application of compression clothing may aid in the recovery of exercise-induced muscle damage, although the findings need corroboration. Copyright © 2015 Elsevier Inc. All rights reserved.
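As a brief illustration of the effect-size statistic used above, the sketch below computes Hedges' g, the standardized mean difference with a small-sample correction; the group means, standard deviations, and sample sizes are made-up illustration values, not data from the review.

```python
# Minimal sketch of the Hedges' g calculation used in such meta-analyses.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled                 # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # small-sample correction factor
    return j * d

# e.g. recovery of muscle power: hypothetical compression group vs. control group
print(round(hedges_g(m1=650.0, sd1=80.0, n1=12, m2=600.0, sd2=85.0, n2=12), 2))
```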
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
Li, Zijian
2018-01-01
Regulations for pesticides in soil are important for controlling human health risk; humans can be exposed to pesticides by ingesting soil, inhaling soil dust, and through dermal contact. Previous studies focused on analyses of numerical standard values for pesticides and evaluated the same pesticide using different standards among different jurisdictions. To understand the health consequences associated with pesticide soil standard values, lifetime theoretical maximum contribution and risk characterization factors were used in this study to quantify the severity of damage using disability-adjusted life years (DALYs) under the maximum "legal" exposure to persistent organic pollutant (POP) pesticides that are commonly regulated by the Stockholm Convention. Results show that computed soil characterization factors for some pesticides present lognormal distributions, and some of them have DALY values higher than 1000.0 per million population (e.g., the DALY for dichlorodiphenyltrichloroethane [DDT] is 14,065 in the Netherlands, which exceeds the tolerable risk of uncertainty upper bound of 1380.0 DALYs). Health risk characterization factors computed from national jurisdictions illustrate that values can vary over eight orders of magnitude. Further, the computed characterization factors can vary over four orders of magnitude within the same national jurisdiction. These data indicate that there is little agreement regarding pesticide soil regulatory guidance values (RGVs) among worldwide national jurisdictions or even RGV standard values within the same jurisdiction. Among these POP pesticides, lindane has the lowest median (0.16 DALYs) and geometric mean (0.28 DALYs) risk characterization factors, indicating that worldwide national jurisdictions provide relatively conservative soil RGVs for lindane. In addition, we found that some European nations and members of the former Union of Soviet Socialist Republics share the same pesticide RGVs and data clusters for the computed characterization factors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.
2009-01-01
In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
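A simplified sketch of the two regression steps described above is given below on synthetic data; the actual USGS procedure additionally involves transformations, bias corrections, and the MSPE acceptance criteria, which are not reproduced here.

```python
# Synthetic-data sketch: fit the simple turbidity model first, then the
# turbidity-streamflow multiple regression that would be compared against it.
import numpy as np

rng = np.random.default_rng(0)
turbidity = rng.uniform(5, 500, 200)              # e.g. FNU
streamflow = rng.uniform(1, 100, 200)             # e.g. m^3/s
ssc = 2.0 * turbidity + 0.5 * streamflow + rng.normal(0, 20, 200)  # mg/L

# Step 1: simple linear regression, SSC ~ turbidity
X1 = np.column_stack([np.ones_like(turbidity), turbidity])
coef1, *_ = np.linalg.lstsq(X1, ssc, rcond=None)

# Step 2: multiple linear regression, SSC ~ turbidity + streamflow
X2 = np.column_stack([np.ones_like(turbidity), turbidity, streamflow])
coef2, *_ = np.linalg.lstsq(X2, ssc, rcond=None)

print("simple model coefficients:  ", np.round(coef1, 2))
print("multiple model coefficients:", np.round(coef2, 2))
```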
A meta-analysis of instructional systems applied in science teaching
NASA Astrophysics Data System (ADS)
Willett, John B.; Yamashita, June J. M.; Anderson, Ronald D.
This article is a report of a meta-analysis on the question: What are the effects of different instructional systems used in science teaching? The studies utilized in this meta-analysis were identified by a process that included a systematic screening of all dissertations completed in the field of science education since 1950, an ERIC search of the literature, a systematic screening of selected research journals, and the standard procedure of identifying potentially relevant studies through examination of the bibliographies of the studies reviewed. In all, the 130 studies coded gave rise to 341 effect sizes. The mean effect size produced over all systems was 0.10 with a standard deviation of 0.41, indicating that, on the average, an innovative teaching system in this sample produced one-tenth of a standard deviation better performance than traditional science teaching. Particular kinds of teaching systems, however, produced results that varied from this overall result. Mean effect sizes were also computed by year of publication, form of publication, grade level, and subject matter.
76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...
Computer Science and Technology Publications. NBS Publications List 84.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
Web-based analysis and publication of flow cytometry experiments.
Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M
2010-07-01
Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.
Web-Based Analysis and Publication of Flow Cytometry Experiments
Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.
2014-01-01
Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106
Zhang, Xue; Xiao, Yang; Zeng, Jie; Qiu, Weibao; Qian, Ming; Wang, Congzhi; Zheng, Rongqin; Zheng, Hairong
2014-01-01
The aim was to develop and evaluate a computer-assisted method of quantifying the five-point elasticity scoring system based on ultrasound real-time elastography (RTE) for classifying benign and malignant breast lesions, with pathologic results as the reference standard. Conventional ultrasonography (US) and RTE imaging of 145 breast lesions (67 malignant, 78 benign) were performed in this study. Each lesion was automatically contoured on the B-mode image by the level set method and mapped on the RTE image. The relative elasticity value of each pixel was reconstructed and classified into hard or soft by the fuzzy c-means clustering method. According to the degree of hardness inside the lesion and its surrounding tissue, the elasticity score of the RTE image was computed in an automatic way. Visual assessments by the radiologists were used for comparing the diagnostic performance. Histopathologic examination was used as the reference standard. Student's t test and receiver operating characteristic (ROC) curve analysis were performed for statistical analysis. Considering score 4 or higher as test positive for malignancy, the diagnostic accuracy, sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were 93.8% (136/145), 92.5% (62/67), 94.9% (74/78), 93.9% (62/66), and 93.7% (74/79) for the computer-assisted scheme, and 89.7% (130/145), 85.1% (57/67), 93.6% (73/78), 92.0% (57/62), and 88.0% (73/83) for manual assessment. The area under the ROC curve (Az value) for the proposed method was higher than the Az value for visual assessment (0.96 vs. 0.93). Computer-assisted quantification of the classical five-point scoring system can significantly eliminate interobserver variability and thereby improve the diagnostic confidence of classifying breast lesions to avoid unnecessary biopsy. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
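For reference, the computer-assisted results above correspond to 62 true positives, 5 false negatives, 74 true negatives, and 4 false positives out of 145 lesions; the short sketch below simply recomputes the reported summary metrics from those counts.

```python
# Recompute the reported diagnostic metrics from the confusion-matrix counts.
tp, fn, tn, fp = 62, 5, 74, 4

accuracy    = (tp + tn) / (tp + fn + tn + fp)   # 136/145
sensitivity = tp / (tp + fn)                    # 62/67
specificity = tn / (tn + fp)                    # 74/78
ppv         = tp / (tp + fp)                    # 62/66
npv         = tn / (tn + fn)                    # 74/79

for name, value in [("accuracy", accuracy), ("sensitivity", sensitivity),
                    ("specificity", specificity), ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.1%}")
```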
Camargo, Anyela; Papadopoulou, Dimitra; Spyropoulou, Zoi; Vlachonasios, Konstantinos; Doonan, John H; Gay, Alan P
2014-01-01
Computer-vision based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex, and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open-source software. The source code for the data analysis is written in R, and the equations used to calculate the image descriptors are also provided.
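The paper's own analysis is implemented in R; purely as an analogous sketch, the Python snippet below standardizes a feature matrix of image descriptors (synthetic values here) and reports how much variation the leading principal components explain, the kind of analysis behind the five-component result above.

```python
# Analogous PCA sketch on a synthetic feature matrix of image descriptors.
import numpy as np

rng = np.random.default_rng(1)
features = rng.normal(size=(120, 20))     # 120 rosettes x 20 descriptors (synthetic)

# Standardize each descriptor, then PCA via SVD of the centered matrix.
z = (features - features.mean(axis=0)) / features.std(axis=0)
_, s, _ = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained by first 5 PCs:", np.round(explained[:5], 3))
```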
Spacecraft Charging Standard Report.
1980-09-30
SSPM sample potentials (with respect to S/C ground) include: Aluminized Kapton, -2.0 kV; Silvered Teflon, -4.0 kV; Astroquartz, -3.7 kV. ... Analysis. As ... and potential gradients on the space vehicle (candidate spacecraft locations for ESD tests). (The NASCAP computer code, when validated, will be useful...) The coupling analysis should then determine as a minimum: 1. electromagnetic fields generated interior to the space vehicle due to ESD; 2. induced...
Application of linear regression analysis in accuracy assessment of rolling force calculations
NASA Astrophysics Data System (ADS)
Poliak, E. I.; Shim, M. K.; Kim, G. S.; Choo, W. Y.
1998-10-01
Efficient operation of the computational models employed in process control systems requires periodic assessment of the accuracy of their predictions. Linear regression is proposed as a tool that allows systematic and random prediction errors to be separated from those related to measurements. A quantitative characteristic of the model's predictive ability is introduced in addition to standard statistical tests for model adequacy. Rolling force calculations are considered as an example application. However, the outlined approach can be used to assess the performance of any computational model.
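A generic way to carry out this kind of accuracy assessment (not necessarily the authors' exact formulation) is to regress measured values on model predictions and inspect the slope, intercept, and residual scatter, as in the synthetic-data sketch below.

```python
# Regress measured rolling forces on predicted forces: a slope near 1 and an
# intercept near 0 indicate no systematic bias; the residual SD reflects
# random error. All values below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
predicted = rng.uniform(800, 2000, 150)                    # model output, kN
measured = 1.03 * predicted - 15 + rng.normal(0, 40, 150)  # mill measurements, kN

X = np.column_stack([np.ones_like(predicted), predicted])
(intercept, slope), *_ = np.linalg.lstsq(X, measured, rcond=None)
residual_sd = np.std(measured - X @ [intercept, slope], ddof=2)

print(f"slope = {slope:.3f}, intercept = {intercept:.1f} kN, residual SD = {residual_sd:.1f} kN")
```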
TRAC posttest calculations of Semiscale Test S-06-3. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ireland, J.R.; Bleiweis, P.B.
A comparison of Transient Reactor Analysis Code (TRAC) steady-state and transient results with Semiscale Test S-06-3 (US Standard Problem 8) experimental data is discussed. The TRAC model used employs fewer mesh cells than normal data comparison models so that TRAC's ability to obtain reasonable results with less computer time can be assessed. In general, the TRAC results are in good agreement with the data and the major phenomena found in the experiment are reproduced by the code with a substantial reduction in computing times.
Microcomputer package for statistical analysis of microbial populations.
Lacroix, J M; Lavoie, M C
1987-11-01
We have developed a Pascal system to compare microbial populations from different ecological sites using microcomputers. The values calculated are: the coverage value and its standard error, the minimum similarity and the geometric similarity between two biological samples, and the Lambda test, which consists of calculating the ratio of the mean similarity between two subsets to the mean similarity within subsets. This system is written for Apple II, IBM or compatible computers, but it can work on any computer that can use CP/M, if the programs are recompiled for such a system.
Langenbucher, Frieder
2005-01-01
A linear system comprising n compartments is completely defined by the rate constants between any of the compartments and by the initial condition, i.e., the compartment(s) in which the drug is present at the beginning. The generalized solution is the set of time profiles of drug amount in each compartment, described by polyexponential equations. Based on standard matrix operations, an Excel worksheet computes the rate constants and the coefficients, and finally the full time profiles for a specified range of time values.
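As a sketch of the same computation outside Excel (the two-compartment rate constants below are hypothetical), the amounts in a linear compartment system dA/dt = K·A can be evaluated directly with a matrix exponential, A(t) = exp(K·t)·A(0):

```python
# Two-compartment example: drug starts in compartment 1, transfers to
# compartment 2 and back, and is eliminated from compartment 1.
import numpy as np
from scipy.linalg import expm

k12, k21, k10 = 0.3, 0.1, 0.2          # 1/h transfer and elimination constants (hypothetical)
K = np.array([[-(k12 + k10), k21],
              [k12,          -k21]])
A0 = np.array([100.0, 0.0])            # all drug initially in compartment 1

times = np.arange(0.0, 12.5, 0.5)      # hours
profiles = np.array([expm(K * t) @ A0 for t in times])

for t, (a1, a2) in zip(times[::4], profiles[::4]):
    print(f"t = {t:4.1f} h   A1 = {a1:6.2f}   A2 = {a2:6.2f}")
```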
2009-09-01
CTM) personnel install, configure, diagnose, and repair state-of-the-art electronic, computer, and network hardware/software systems ashore and...participants consisted of a Culinary Specialist and a Fire Control Technician (not actively involved with Combat System operations).
ERIC Educational Resources Information Center
Lewis, Virginia Vimpeny
2011-01-01
Number Concepts; Measurement; Geometry; Probability; Statistics; and Patterns, Functions and Algebra. Procedural Errors were further categorized into the following content categories: Computation; Measurement; Statistics; and Patterns, Functions, and Algebra. The results of the analysis showed the main sources of error for 6th, 7th, and 8th…
Mathematical form models of tree trunks
Rudolfs Ozolins
2000-01-01
Assortment structure analysis of tree trunks is a characteristic and proper problem that can be solved by using mathematical modeling and standard computer programs. Mathematical form model of tree trunks consists of tapering curve equations and their parameters. Parameters for nine species were obtained by processing measurements of 2,794 model trees and studying the...
NASA Astrophysics Data System (ADS)
Trappe, Neil; Murphy, J. Anthony; Withington, Stafford
2003-07-01
Gaussian beam mode analysis (GBMA) offers a more intuitive physical insight into how light beams evolve as they propagate than the conventional Fresnel diffraction integral approach. In this paper we illustrate that GBMA is a computationally efficient, alternative technique for tracing the evolution of a diffracting coherent beam. In previous papers we demonstrated the straightforward application of GBMA to the computation of the classical diffraction patterns associated with a range of standard apertures. In this paper we show how the GBMA technique can be expanded to investigate the effects of aberrations in the presence of diffraction by introducing the appropriate phase error term into the propagating quasi-optical beam. We compare our technique to the standard diffraction integral calculation for coma, astigmatism and spherical aberration, taking—for comparison—examples from the classic text 'Principles of Optics' by Born and Wolf. We show the advantages of GBMA for allowing the defocusing of an aberrated image to be evaluated quickly, which is particularly important and useful for probing the consequences of astigmatism and spherical aberration.
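As a minimal numerical illustration of the decomposition step in GBMA (assuming a one-dimensional field and a hand-picked fundamental-mode waist; propagation and aberration phase terms are omitted), one can expand an aperture field in orthonormal Hermite-Gaussian modes and examine how much power the first few modes capture:

```python
# Decompose a uniform slit field into 1-D Hermite-Gaussian beam modes and
# report the cumulative power carried by the lowest-order modes.
import math
import numpy as np
from scipy.special import eval_hermite

a, w0 = 1.0, 1.0                         # aperture half-width and beam waist (arbitrary units)
x = np.linspace(-8 * a, 8 * a, 8001)
dx = x[1] - x[0]

field = np.where(np.abs(x) <= a, 1.0, 0.0)
field /= math.sqrt(np.sum(field**2) * dx)        # normalize to unit power

def hg_mode(n, x, w):
    """Orthonormal 1-D Hermite-Gaussian beam mode of order n."""
    norm = (2.0 / (math.pi * w**2))**0.25 / math.sqrt(2.0**n * math.factorial(n))
    return norm * eval_hermite(n, math.sqrt(2.0) * x / w) * np.exp(-x**2 / w**2)

coeffs = [np.sum(field * hg_mode(n, x, w0)) * dx for n in range(12)]
captured = np.cumsum([c**2 for c in coeffs])
print("cumulative power in modes 0..11:", np.round(captured, 3))
```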
Cosmochemistry: Understanding the Solar System through analysis of extraterrestrial materials.
MacPherson, Glenn J; Thiemens, Mark H
2011-11-29
Cosmochemistry is the chemical analysis of extraterrestrial materials. This term generally is taken to mean laboratory analysis, which is the cosmochemistry gold standard because of the ability for repeated analysis under highly controlled conditions using the most advanced instrumentation unhindered by limitations in power, space, or environment. Over the past 40 y, advances in technology have enabled telescopic and spacecraft instruments to provide important data that significantly complement the laboratory data. In this special edition, recent advances in the state of the art of cosmochemistry are presented, which range from instrumental analysis of meteorites to theoretical-computational and astronomical observations.
A Simple XML Producer-Consumer Protocol
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section will provide some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
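To make the idea concrete, here is a rough sketch of what a producer in such an event protocol might do; the element names, attributes, and framing are hypothetical and are not the schema proposed in the paper. An event is serialized as XML and sent over TCP with a length prefix so the consumer can frame messages on the byte stream.

```python
# Hypothetical producer sketch: build an XML event and push it over TCP.
import socket
import struct
import xml.etree.ElementTree as ET

def build_event(source, name, value, timestamp):
    event = ET.Element("event", attrib={"source": source, "timestamp": timestamp})
    metric = ET.SubElement(event, "metric", attrib={"name": name})
    metric.text = str(value)
    return ET.tostring(event, encoding="utf-8")

def send_event(host, port, payload):
    # Length-prefix the payload so the consumer can split messages on the stream.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(payload)) + payload)

payload = build_event("node42.example.org", "cpu_load", 0.73, "2001-06-01T12:00:00Z")
# send_event("consumer.example.org", 9999, payload)  # uncomment with a real consumer
print(payload.decode("utf-8"))
```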
A cloud physics investigation utilizing Skylab data
NASA Technical Reports Server (NTRS)
Alishouse, J.; Jacobowitz, H.; Wark, D. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The Lowtran 2 program, S191 spectral response, and solar spectrum were used to compute the expected absorption by 2.0 micron band for a variety of cloud pressure levels and solar zenith angles. Analysis of the three long wavelength data channels continued in which it was found necessary to impose a minimum radiance criterion. It was also found necessary to modify the computer program to permit the computation of mean values and standard deviations for selected subsets of data on a given tape. A technique for computing the integrated absorption in the A band was devised. The technique normalizes the relative maximum at approximately .78 micron to the solar irradiance curve and then adjusts the relative maximum at approximately .74 micron to fit the solar curve.
Integration of drug dosing data with physiological data streams using a cloud computing paradigm.
Bressan, Nadja; James, Andrew; McGregor, Carolyn
2013-01-01
Many drugs are used during the provision of intensive care for the preterm newborn infant. Recommendations for drug dosing in newborns depend upon data from population based pharmacokinetic research. There is a need to be able to modify drug dosing in response to the preterm infant's response to the standard dosing recommendations. The real-time integration of physiological data with drug dosing data would facilitate individualised drug dosing for these immature infants. This paper proposes the use of a novel computational framework that employs real-time, temporal data analysis for this task. Deployment of the framework within the cloud computing paradigm will enable widespread distribution of individualized drug dosing for newborn infants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakosi, Jozsef; Christon, Mark A.; Francois, Marianne M.
Progress is reported on computational capabilities for the grid-to-rod-fretting (GTRF) problem of pressurized water reactors. Numeca's Hexpress/Hybrid mesh generator is demonstrated as an excellent alternative to generating computational meshes for complex flow geometries, such as in GTRF. Mesh assessment is carried out using standard industrial computational fluid dynamics practices. Hydra-TH, a simulation code developed at LANL for reactor thermal-hydraulics, is demonstrated on hybrid meshes, containing different element types. A series of new Hydra-TH calculations has been carried out collecting turbulence statistics. Preliminary results on the newly generated meshes are discussed; full analysis will be documented in the L3 milestone, THM.CFD.P5.05, Sept. 2012.
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
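One common building block behind the kind of efficient spatially indexed retrieval discussed above is a spatial index; as a sketch only (not the authors' design), the snippet below implements a uniform grid index in which each feature is hashed to a cell so that rectangular range queries only inspect nearby cells.

```python
# Uniform grid spatial index: insert point features, query by bounding box.
from collections import defaultdict

class GridIndex:
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, feature_id, x, y):
        self.cells[self._key(x, y)].append((feature_id, x, y))

    def query(self, xmin, ymin, xmax, ymax):
        """Return ids of features whose points fall inside the query rectangle."""
        i0, j0 = self._key(xmin, ymin)
        i1, j1 = self._key(xmax, ymax)
        hits = []
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                for fid, x, y in self.cells.get((i, j), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(fid)
        return hits

index = GridIndex(cell_size=10.0)
index.insert("well_01", 12.5, 47.0)
index.insert("well_02", 55.0, 3.2)
print(index.query(10, 40, 20, 50))   # -> ['well_01']
```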
Falat, Lukas; Marcek, Dusan; Durisova, Maria
2016-01-01
This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making a bad decision in the decision-making process.
Status of emerging standards for removable computer storage media and related contributions of NIST
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1992-01-01
Standards for removable computer storage media are needed so that users may reliably interchange data both within and among various computer installations. Furthermore, media interchange standards support competition in industry and prevent sole-source lock-in. NIST participates in magnetic tape and optical disk standards development through Technical Committees X3B5, Digital Magnetic Tapes, X3B11, Optical Digital Data Disk, and the Joint Technical Commission on Data Permanence. NIST also participates in other relevant national and international standards committees for removable computer storage media. Industry standards for digital magnetic tapes require the use of Standard Reference Materials (SRM's) developed and maintained by NIST. In addition, NIST has been studying care and handling procedures required for digital magnetic tapes. NIST has developed a methodology for determining the life expectancy of optical disks. NIST is developing care and handling procedures for optical digital data disks and is involved in a program to investigate error reporting capabilities of optical disk drives. This presentation reflects the status of emerging magnetic tape and optical disk standards, as well as NIST's contributions in support of these standards.
Large data series: Modeling the usual to identify the unusual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Downing, D.J.; Fedorov, V.V.; Lawkins, W.F.
"Standard" approaches such as regression analysis, Fourier analysis, Box-Jenkins procedure, et al., which handle a data series as a whole, are not useful for very large data sets for at least two reasons. First, even with computer hardware available today, including parallel processors and storage devices, there are no effective means for manipulating and analyzing gigabyte, or larger, data files. Second, in general it can not be assumed that a very large data set is "stable" by the usual measures, like homogeneity, stationarity, and ergodicity, that standard analysis techniques require. Both reasons dictate the necessity to use "local" data analysis methods whereby the data is segmented and ordered, where order leads to a sense of "neighbor," and then analyzed segment by segment. The idea of local data analysis is central to the study reported here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, R.A.
1981-09-01
Eighteen heat-up tests were run on nine standard and experimental dual component monolithic refractory concrete linings. These tests were run with a five foot diameter by 14-ft high Pressure Vessel/Test Furnace designed to accommodate a 12-inch thick by 5-ft high refractory lining, heat the hot face to 2000°F and expose the lining to air or steam pressures up to 150 psig. Results obtained from standard type linings in the test facility indicated that lining degradation duplicated that observed in field installations. The lining performance was significantly improved due to information gained from a systematic study of the cracking that occurred in the linings; the analysis of the lining strains, shell stresses and acoustic emission results; and the stress analyses performed on the standard and experimental lining designs with the finite element analysis computer programs, REFSAM and RESGAP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Z. J.; Wells, D.; Green, J.
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the nuclide library and the related formulas behind it, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
Laboratory techniques and rhythmometry
NASA Technical Reports Server (NTRS)
Halberg, F.
1973-01-01
Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.
A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1979-01-01
The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.
Annotare--a tool for annotating high-throughput biomedical investigations and resulting data.
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J; Ball, Catherine A
2010-10-01
Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics or CFD has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process a series of automated tools have been developed. Through the use of these automated tools the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
Research on the application in disaster reduction for using cloud computing technology
NASA Astrophysics Data System (ADS)
Tao, Liang; Fan, Yida; Wang, Xingling
Cloud Computing technology has recently been applied rapidly in different domains, promoting the informatization of those domains. Based on an analysis of application requirements in disaster reduction and the characteristics of Cloud Computing technology, we present research on the application of Cloud Computing technology in disaster reduction. First of all, we give the architecture of the disaster reduction cloud, which consists of disaster reduction infrastructure as a service (IAAS), a disaster reduction cloud application platform as a service (PAAS) and disaster reduction software as a service (SAAS). Secondly, we discuss the standard system for disaster reduction in five aspects. Thirdly, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help solve the problems of disaster reduction and promote its development.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2012 CFR
2012-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
Finite difference computation of Casimir forces
NASA Astrophysics Data System (ADS)
Pinto, Fabrizio
2016-09-01
In this Invited paper, we begin with a historical introduction to provide a motivation for the classical problems of interatomic force computation and associated challenges. This analysis will lead us from early theoretical and experimental accomplishments to the integration of these fascinating interactions into the operation of realistic, next-generation micro- and nanodevices both for the advanced metrology of fundamental physical processes and in breakthrough industrial applications. Among several powerful strategies enabling vastly enhanced performance and entirely novel technological capabilities, we shall specifically consider Casimir force time-modulation and the adoption of non-trivial geometries. As to the former, the ability to alter the magnitude and sign of the Casimir force will be recognized as a crucial principle to implement thermodynamical nano-engines. As to the latter, we shall first briefly review various reported computational approaches. We shall then discuss the game-changing discovery, in the last decade, that standard methods of numerical classical electromagnetism can be retooled to formulate the problem of Casimir force computation in arbitrary geometries. This remarkable development will be practically illustrated by showing that such an apparently elementary method as standard finite-differencing can be successfully employed to numerically recover results known from the Lifshitz theory of dispersion forces in the case of interacting parallel-plane slabs. Other geometries will also be explored and consideration given to the potential of non-standard finite-difference methods. Finally, we shall introduce problems at the computational frontier, such as those including membranes deformed by Casimir forces and the effects of anisotropic materials. Conclusions will highlight the dramatic transition from the enduring perception of this field as an exotic application of quantum electrodynamics to the recent demonstration of a human climbing vertically on smooth glass.
Freud: a software suite for high-throughput simulation analysis
NASA Astrophysics Data System (ADS)
Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon
Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
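For readers unfamiliar with the radial distribution function mentioned above, the following minimal NumPy sketch shows how g(r) can be estimated for points in a periodic cubic box; it is an illustrative stand-alone calculation, not the Freud API, and the random point set is synthetic.

```python
# Minimal sketch of a radial distribution function g(r) for a periodic cubic
# box -- illustrative only, not the Freud API.
import numpy as np

def radial_distribution(points, box_length, r_max, n_bins=100):
    """Estimate g(r) for N points in a cubic periodic box of side box_length."""
    n = len(points)
    # Pairwise separation vectors with the minimum-image convention.
    diff = points[:, None, :] - points[None, :, :]
    diff -= box_length * np.round(diff / box_length)
    dist = np.linalg.norm(diff, axis=-1)
    dist = dist[np.triu_indices(n, k=1)]          # unique pairs only
    counts, edges = np.histogram(dist[dist < r_max], bins=n_bins, range=(0, r_max))
    r_lo, r_hi = edges[:-1], edges[1:]
    shell_vol = 4.0 / 3.0 * np.pi * (r_hi**3 - r_lo**3)
    density = n / box_length**3
    # Normalize pair counts by the ideal-gas expectation at the same density.
    ideal = 0.5 * n * density * shell_vol
    return 0.5 * (r_lo + r_hi), counts / ideal

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 10.0, size=(500, 3))
    r, g = radial_distribution(pts, box_length=10.0, r_max=4.0)
    print(g[:5])  # should hover near 1 for a random (ideal-gas-like) point set
```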
NASA Astrophysics Data System (ADS)
Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten
2014-03-01
Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With an increase in the advancement of digital imaging and computing power, computationally intelligent technologies are in high demand to be used in ophthalmology cure and treatment. In current research, Retina Image Analysis (RIA) is developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of the retinal images and specialists are served with various options like saving, processing and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection process of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying detected vessels on the retina. The Agile Unified Process is adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help the optometrist to get a better understanding when analyzing the patient's retina. Finally, the Retina Image Analysis procedure is developed using MATLAB (R2011b). Promising results are attained that are comparable to the state of the art. © 2017 Wiley Periodicals, Inc.
Illuminator, a desktop program for mutation detection using short-read clonal sequencing.
Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T
2011-10-01
Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
Estimation of standard liver volume in Chinese adult living donors.
Fu-Gui, L; Lu-Nan, Y; Bo, L; Yong, Z; Tian-Fu, W; Ming-Qing, X; Wen-Tao, W; Zhe-Yu, C
2009-12-01
To determine a formula predicting the standard liver volume based on body surface area (BSA) or body weight in Chinese adults. A total of 115 consecutive right-lobe living donors not including the middle hepatic vein underwent right hemi-hepatectomy. No organs were used from prisoners, and no subjects were prisoners. Donor anthropometric data including age, gender, body weight, and body height were recorded prospectively. The weights and volumes of the right lobe liver grafts were measured at the back table. Liver weights and volumes were calculated from the right lobe graft weight and volume obtained at the back table, divided by the proportion of the right lobe on computed tomography. By simple linear regression analysis and stepwise multiple linear regression analysis, we correlated calculated liver volume and body height, body weight, or body surface area. The subjects had a mean age of 35.97 +/- 9.6 years, and a female-to-male ratio of 60:55. The mean volume of the right lobe was 727.47 +/- 136.17 mL, occupying 55.59% +/- 6.70% of the whole liver by computed tomography. The volume of the right lobe was 581.73 +/- 96.137 mL, and the estimated liver volume was 1053.08 +/- 167.56 mL. Females of the same body weight showed a slightly lower liver weight. By simple linear regression analysis and stepwise multiple linear regression analysis, a formula was derived based on body weight. All formulae except the Hong Kong formula overestimated liver volume compared to this formula. The formula of standard liver volume, SLV (mL) = 11.508 x body weight (kg) + 334.024, may be applied to estimate liver volumes in Chinese adults.
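The body-weight formula reported above is simple enough to check directly; the sketch below evaluates it for a hypothetical 68 kg donor (the weight is an example, not a study subject).

```python
# Worked example of the reported standard liver volume (SLV) formula,
# SLV (mL) = 11.508 * body weight (kg) + 334.024. The 68 kg donor is a
# hypothetical example, not a subject from the study.
def standard_liver_volume_ml(body_weight_kg: float) -> float:
    return 11.508 * body_weight_kg + 334.024

print(round(standard_liver_volume_ml(68.0), 1))  # -> 1116.6 mL
```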
Wake coupling to full potential rotor analysis code
NASA Technical Reports Server (NTRS)
Torres, Francisco J.; Chang, I-Chung; Oh, Byung K.
1990-01-01
The wake information from a helicopter forward flight code is coupled with two transonic potential rotor codes. The induced velocities for the near-, mid-, and far-wake geometries are extracted from a nonlinear rigid wake of a standard performance and analysis code. These, together with the corresponding inflow angles, computation points, and azimuth angles, are then incorporated into the transonic potential codes. The coupled codes can then provide an improved prediction of rotor blade loading at transonic speeds.
Life-cycle costs of high-performance cells
NASA Technical Reports Server (NTRS)
Daniel, R.; Burger, D.; Reiter, L.
1985-01-01
A life cycle cost analysis of high efficiency cells was presented. Although high efficiency cells produce more power, they also cost more to make and are more susceptible to array hot-spot heating. Three different computer analysis programs were used: SAMICS (solar array manufacturing industry costing standards), PVARRAY (an array failure mode/degradation simulator), and LCP (lifetime cost and performance). The high efficiency cell modules were found to be more economical in this study, but parallel redundancy is recommended.
Schadow, Gunther; Dhaval, Rakesh; McDonald, Clement J; Ragg, Susanne
2006-01-01
We present the architecture and approach of an evolving campus-wide information service for tissues with clinical and data annotations to be used and contributed to by clinical researchers across the campus. The services provided include specimen tracking, long term data storage, and computational analysis services. The project is conceived and sustained by collaboration among researchers on the campus as well as participation in standards organizations and national collaboratives.
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The user's manual for the rocket combustor interactive design (ROCCID) computer program is presented. The program, written in Fortran 77, provides a standardized methodology using state-of-the-art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can easily be added. The analysis model in ROCCID can account for the influence of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design are evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
NASA Astrophysics Data System (ADS)
Peng, Yahui; Jiang, Yulei; Liarski, Vladimir M.; Kaverina, Natalya; Clark, Marcus R.; Giger, Maryellen L.
2012-03-01
Analysis of interactions between B and T cells in tubulointerstitial inflammation is important for understanding human lupus nephritis. We developed a computer technique to perform this analysis, and compared it with manual analysis. Multi-channel immunofluorescent-microscopy images were acquired from 207 regions of interest in 40 renal tissue sections of 19 patients diagnosed with lupus nephritis. Fresh-frozen renal tissue sections were stained with combinations of immunofluorescent antibodies to membrane proteins and counter-stained with a cell nuclear marker. Manual delineation of the antibodies was considered as the reference standard. We first segmented cell nuclei and cell membrane markers, and then determined corresponding cell types based on the distances between cell nuclei and specific cell-membrane marker combinations. Subsequently, the distribution of the shortest distance from T cell nuclei to B cell nuclei was obtained and used as a surrogate indicator of cell-cell interactions. The computer and manual analysis results were concordant. The average absolute difference was 1.1+/-1.2% between the computer and manual analysis results in the number of cell-cell distances of 3 μm or less as a percentage of the total number of cell-cell distances. Our computerized analysis of cell-cell distances could be used as a surrogate for quantifying cell-cell interactions, either as an automated and quantitative analysis or for independent confirmation of manual analysis.
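A minimal sketch of the distance-based surrogate described above, assuming nuclear centroid coordinates in micrometres: for each T-cell nucleus the shortest distance to any B-cell nucleus is found, and the fraction of those distances of 3 μm or less is reported. The coordinates are synthetic and the implementation is illustrative, not the authors' code.

```python
# Sketch of the shortest T-to-B nuclear distance surrogate; coordinates are
# assumed to be centroids in micrometres, and the arrays below are synthetic.
import numpy as np

def fraction_within(t_nuclei, b_nuclei, cutoff_um=3.0):
    # Pairwise distances between T-cell and B-cell nuclear centroids.
    d = np.linalg.norm(t_nuclei[:, None, :] - b_nuclei[None, :, :], axis=-1)
    nearest = d.min(axis=1)              # shortest T-to-B distance per T cell
    return np.mean(nearest <= cutoff_um)

rng = np.random.default_rng(1)
t_cells = rng.uniform(0, 100, size=(40, 2))
b_cells = rng.uniform(0, 100, size=(60, 2))
print(f"{100 * fraction_within(t_cells, b_cells):.1f}% of T cells within 3 um of a B cell")
```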
CAPRI: Using a Geometric Foundation for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
2002-01-01
CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: Surface Generation, usually by employing a Computer Aided Design (CAD) system; Grid Generation, preparing the volume for the simulation; Flow Solver, producing the results at the specified operational point; Post-processing Visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go to IGES files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the gridding data (e.g., PLOT3D, and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of the serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) A simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.
An alternative approach for computing seismic response with accidental eccentricity
NASA Astrophysics Data System (ADS)
Fan, Xuanhua; Yin, Jiacong; Sun, Shuli; Chen, Pu
2014-09-01
Accidental eccentricity is a non-standard assumption for seismic design of tall buildings. Taking it into consideration requires reanalysis of seismic resistance, which requires either time-consuming computation of the natural vibration of eccentric structures or finding a static displacement solution by applying an approximated equivalent torsional moment for each eccentric case. This study proposes an alternative modal response spectrum analysis (MRSA) approach to calculate seismic responses with accidental eccentricity. The proposed approach, called the Rayleigh Ritz Projection-MRSA (RRP-MRSA), is developed based on MRSA and two strategies: (a) an RRP method to obtain a fast calculation of approximate modes of eccentric structures; and (b) an approach to assemble mass matrices of eccentric structures. The efficiency of RRP-MRSA is tested via engineering examples and compared with the standard MRSA (ST-MRSA) and one approximate method, i.e., the equivalent torsional moment hybrid MRSA (ETM-MRSA). Numerical results show that RRP-MRSA not only achieves almost the same precision as ST-MRSA, and is much better than ETM-MRSA, but is also more economical. Thus, RRP-MRSA can be used in place of current accidental eccentricity computations in seismic design.
NASA Technical Reports Server (NTRS)
Papanyan, Valeri; Oshle, Edward; Adamo, Daniel
2008-01-01
Measurement of the jettisoned object departure trajectory and velocity vector in the International Space Station (ISS) reference frame is vitally important for prompt evaluation of the object's imminent orbit. We report on the first successful application of photogrammetric analysis of the ISS imagery for the prompt computation of the jettisoned object's position and velocity vectors. As post-EVA analysis examples, we present the Floating Potential Probe (FPP) and the Russian "Orlan" Space Suit jettisons, as well as the near-real-time (provided several hours after the separation) computations of the Video Stanchion Support Assembly Flight Support Assembly (VSSA-FSA) and Early Ammonia Servicer (EAS) jettisons during the US astronauts' space-walk. Standard close-range photogrammetry analysis was used during this EVA to analyze two on-board camera image sequences down-linked from the ISS. In this approach the ISS camera orientations were computed from known coordinates of several reference points on the ISS hardware. Then the position of the jettisoned object for each time-frame was computed from its image in each frame of the video-clips. In another, "quick-look" approach used in near-real time, orientation of the cameras was computed from their position (from the ISS CAD model) and operational data (pan and tilt), and then the location of the jettisoned object was calculated for only several frames of the two synchronized movies. Keywords: Photogrammetry, International Space Station, jettisons, image analysis.
A study of the feasibility of statistical analysis of airport performance simulation
NASA Technical Reports Server (NTRS)
Myers, R. H.
1982-01-01
The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis of variance techniques and power calculations. Next, power computations are made in order to determine how economic simulation experiments would be if they are designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte-Carlo techniques.
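As a rough illustration of the Monte-Carlo power computations mentioned above, the sketch below simulates capacity samples from a skewed (non-Gaussian) distribution under two conditions and estimates the power of a two-sample t-test to detect the shift; the distribution, sample sizes, and effect size are invented for illustration.

```python
# Hedged sketch of a Monte-Carlo power estimate for detecting a capacity
# change between two conditions; all distributional choices are illustrative.
import numpy as np
from scipy import stats

def simulated_power(shift, n_per_condition=30, n_trials=2000, alpha=0.05, seed=8):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        a = rng.gamma(shape=20.0, scale=3.0, size=n_per_condition)          # skewed baseline
        b = rng.gamma(shape=20.0, scale=3.0, size=n_per_condition) + shift  # shifted condition
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_trials

print(f"estimated power to detect a 5-unit capacity change: {simulated_power(5.0):.2f}")
```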
48 CFR 227.7203-5 - Government rights.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...
48 CFR 227.7203-5 - Government rights.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...
48 CFR 227.7203-5 - Government rights.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...
48 CFR 227.7203-5 - Government rights.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...
48 CFR 227.7203-5 - Government rights.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...
Product or waste? Importation and end-of-life processing of computers in Peru.
Kahhat, Ramzy; Williams, Eric
2009-08-01
This paper considers the importation of used personal computers (PCs) in Peru and domestic practices in their production, reuse, and end-of-life processing. The empirical pillars of this study are analysis of government data describing trade in used and new computers and surveys and interviews of computer sellers, refurbishers, and recyclers. The United States is the primary source of used PCs imported to Peru. Analysis of shipment value (as measured by trade statistics) shows that 87-88% of imported used computers had a price higher than the ideal recycle value of constituent materials. The official trade in end-of-life computers is thus driven by reuse as opposed to recycling. The domestic reverse supply chain of PCs is well developed with extensive collection, reuse, and recycling. Environmental problems identified include open burning of copper-bearing wires to remove insulation and landfilling of CRT glass. Distinct from informal recycling in China and India, printed circuit boards are usually not recycled domestically but exported to Europe for advanced recycling or to China for (presumably) informal recycling. It is notable that purely economic considerations lead to circuit boards being exported to Europe where environmental standards are stringent, presumably due to higher recovery of precious metals.
Pliasunova, S A; Balugian, R Sh; Khmel'nitskiĭ, K E; Medovyĭ, V S; Parpara, A A; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Dem'ianov, V L; Nikolaenko, D S
2006-10-01
The paper presents the results of medical tests of a group of computer-aided procedures for microscopic analysis by means of a MECOS-Ts2 complex (ZAO "MECOS", Russia), which have been conducted at the Republican Children's Clinical Hospital, the Research Institute of Emergency Pediatric Surgery and Traumatology, and Moscow City Clinical Hospital No. 23. Computer-aided procedures for calculating the differential count and for analyzing the morphology of red blood cells were tested on blood smears from a total of 443 patients and donors; computer-aided calculation of the reticulocyte count was tested on 318 smears. The tests were carried out under the US standard NCCLS-H20A. Manual microscopy (443 smears) and flow blood analysis on a Coulter GEN*S (125 smears) were used as reference methods. The quality of sample collection and the laboriousness were additionally assessed. The certified MECOS-Ts2 subsystems were additionally used as reference tools. The tests indicated the advantage of computer-aided MECOS-Ts2 complex microscopy over manual microscopy.
Interaction entropy for protein-protein binding
NASA Astrophysics Data System (ADS)
Sun, Zhaoxi; Yan, Yu N.; Yang, Maoyou; Zhang, John Z. H.
2017-03-01
Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machine in biology. The highly specific protein-protein binding is quantitatively characterized by the binding free energy whose accurate calculation from the first principle is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. Explicit theoretical derivation of the interaction entropy approach for protein-protein interaction system is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach and comparisons of the results for these protein-protein systems with those from the standard normal mode method are presented. Analysis of the present method for application in protein-protein binding as well as the limitation of the method in numerical computation is discussed. Our study and analysis of the results provided useful information for extracting correct entropic contribution in protein-protein binding from molecular dynamics simulations.
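A minimal sketch of the interaction entropy estimator this family of methods is built on, assuming the entropic term is approximated as kT ln⟨exp(βΔE_int)⟩ with ΔE_int the fluctuation of the interaction energy about its mean along an MD trajectory; the energy series below is synthetic and the units are kcal/mol.

```python
# Sketch of the interaction entropy term -T*dS ~ kT * ln< exp(beta * dE_int) >,
# where dE_int is the fluctuation of the interaction energy about its mean.
# The energy series is synthetic, not from an actual MD trajectory.
import numpy as np

KB = 0.0019872041  # Boltzmann constant, kcal/(mol*K)

def interaction_entropy_term(e_int_kcal, temperature_k=300.0):
    """Return -T*dS (kcal/mol) from a series of interaction energies."""
    beta = 1.0 / (KB * temperature_k)
    fluct = e_int_kcal - e_int_kcal.mean()
    return KB * temperature_k * np.log(np.mean(np.exp(beta * fluct)))

rng = np.random.default_rng(2)
energies = rng.normal(loc=-85.0, scale=2.5, size=5000)  # hypothetical kcal/mol
print(f"-T*dS ~ {interaction_entropy_term(energies):.2f} kcal/mol")
```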
Interaction entropy for protein-protein binding.
Sun, Zhaoxi; Yan, Yu N; Yang, Maoyou; Zhang, John Z H
2017-03-28
Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machine in biology. The highly specific protein-protein binding is quantitatively characterized by the binding free energy whose accurate calculation from the first principle is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. Explicit theoretical derivation of the interaction entropy approach for protein-protein interaction system is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach and comparisons of the results for these protein-protein systems with those from the standard normal mode method are presented. Analysis of the present method for application in protein-protein binding as well as the limitation of the method in numerical computation is discussed. Our study and analysis of the results provided useful information for extracting correct entropic contribution in protein-protein binding from molecular dynamics simulations.
Validation of Bayesian analysis of compartmental kinetic models in medical imaging.
Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M
2016-10-01
Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and the covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P>0.08). It is demonstrated that the results obtained by the standard non-linear least-squares methods fail to provide accurate estimation of uncertainty for the same data set (P<0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics. The results show that in situations where the classical approach fails to accurately estimate uncertainty, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended to different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
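To make the Bayesian machinery concrete, the following sketch runs a random-walk Metropolis sampler on a toy one-compartment decay model; the model, noise level, and flat prior are illustrative stand-ins, not the FDG kinetics analyzed in the paper.

```python
# Hedged sketch of Bayesian parameter estimation for a toy kinetic model with
# a random-walk Metropolis sampler; all model choices are illustrative.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.5, 60.0, 24)                   # minutes
true_k = 0.08
data = np.exp(-true_k * t) + rng.normal(0.0, 0.02, t.size)  # noisy decay curve

def log_posterior(k, sigma=0.02):
    if k <= 0:
        return -np.inf                           # flat prior restricted to k > 0
    resid = data - np.exp(-k * t)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, k = [], 0.05
for _ in range(20000):
    proposal = k + rng.normal(0.0, 0.005)        # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(k):
        k = proposal
    samples.append(k)

post = np.array(samples[5000:])                  # discard burn-in
print(f"posterior mean {post.mean():.3f}, sd {post.std():.3f}")
```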
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741
Computer aided manual validation of mass spectrometry-based proteomic data.
Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M
2013-06-15
Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug
2005-01-01
Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.
Santiago-Moreno, Julian; Esteso, Milagros Cristina; Villaverde-Morcillo, Silvia; Toledano-Díaz, Adolfo; Castaño, Cristina; Velázquez, Rosario; López-Sebastián, Antonio; Goya, Agustín López; Martínez, Javier Gimeno
2016-01-01
Postcopulatory sexual selection through sperm competition may be an important evolutionary force affecting many reproductive traits, including sperm morphometrics. Environmental factors such as pollutants, pesticides, and climate change may affect different sperm traits, and thus reproduction, in sensitive bird species. Many sperm-handling processes used in assisted reproductive techniques may also affect the size of sperm cells. The accurately measured dimensions of sperm cell structures (especially the head) can thus be used as indicators of environmental influences, in improving our understanding of reproductive and evolutionary strategies, and for optimizing assisted reproductive techniques (e.g., sperm cryopreservation) for use with birds. Computer-assisted sperm morphometry analysis (CASA-Morph) provides an accurate and reliable method for assessing sperm morphometry, reducing the problem of subjectivity associated with human visual assessment. Computerized systems have been standardized for use with semen from different mammalian species. Avian spermatozoa, however, are filiform, limiting their analysis with such systems, which were developed to examine the approximately spherical heads of mammalian sperm cells. To help overcome this, the standardization of staining techniques to be used in computer-assessed light microscopical methods is a priority. The present review discusses these points and describes the sperm morphometric characteristics of several wild and domestic bird species. PMID:27678467
Validation of Computerized Automatic Calculation of the Sequential Organ Failure Assessment Score
Harrison, Andrew M.; Pickering, Brian W.; Herasevich, Vitaly
2013-01-01
Purpose. To validate the use of a computer program for the automatic calculation of the sequential organ failure assessment (SOFA) score, as compared to the gold standard of manual chart review. Materials and Methods. Adult admissions (age > 18 years) to the medical ICU with a length of stay greater than 24 hours were studied in the setting of an academic tertiary referral center. A retrospective cross-sectional analysis was performed using a derivation cohort to compare automatic calculation of the SOFA score to the gold standard of manual chart review. After critical appraisal of sources of disagreement, another analysis was performed using an independent validation cohort. Then, a prospective observational analysis was performed using an implementation of this computer program in AWARE Dashboard, which is an existing real-time patient EMR system for use in the ICU. Results. Good agreement between the manual and automatic SOFA calculations was observed for both the derivation (N=94) and validation (N=268) cohorts: 0.02 ± 2.33 and 0.29 ± 1.75 points, respectively. These results were validated in AWARE (N=60). Conclusion. This EMR-based automatic tool accurately calculates SOFA scores and can facilitate ICU decisions without the need for manual data collection. This tool can also be employed in a real-time electronic environment. PMID:23936639
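The agreement figures quoted above (e.g., 0.02 ± 2.33 points) are the mean and standard deviation of per-admission differences between automatic and manual scores; the sketch below computes the same summary on synthetic scores.

```python
# Sketch of the agreement summary: difference between automatic and manual
# SOFA scores per admission, reported as mean +/- standard deviation.
# The scores below are synthetic, not patient data.
import numpy as np

rng = np.random.default_rng(7)
manual = rng.integers(0, 15, size=94)
automatic = manual + rng.integers(-2, 3, size=94)   # small random disagreements

diff = automatic - manual
print(f"agreement: {diff.mean():+.2f} +/- {diff.std(ddof=1):.2f} points")
```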
How to Make a Synthetic Multicellular Computer
Macia, Javier; Sole, Ricard
2014-01-01
Biological systems perform computations at multiple scales and they do so in a robust way. Engineering metaphors have often been used in order to provide a rationale for modeling cellular and molecular computing networks and as the basis for their synthetic design. However, a major constraint in this mapping between electronic and wet computational circuits is the wiring problem. Although wires are identical within electronic devices, they must be different when using synthetic biology designs. Moreover, in most cases the designed molecular systems cannot be reused for other functions. A new approximation allows us to simplify the problem by using synthetic cellular consortia where the output of the computation is distributed over multiple engineered cells. By evolving circuits in silico, we can obtain the minimal sets of Boolean units required to solve the given problem at the lowest cost using cellular consortia. Our analysis reveals that the basic set of logic units is typically non-standard. Among the most common units, the so-called inverted IMPLIES (N-Implies) appears to be one of the most important elements along with the NOT and AND functions. Although NOR and NAND gates are widely used in electronics, evolved circuits based on combinations of these gates are rare, thus suggesting that the strategy of combining the same basic logic gates might be inappropriate in order to easily implement synthetic computational constructs. The implications for future synthetic designs, the general view of synthetic biology as a standard engineering domain, as well as potential drawbacks are outlined. PMID:24586222
System statistical reliability model and analysis
NASA Technical Reports Server (NTRS)
Lekach, V. S.; Rood, H.
1973-01-01
A digital computer code was developed to simulate the time-dependent behavior of the 5-kwe reactor thermoelectric system. The code was used to determine lifetime sensitivity coefficients for a number of system design parameters, such as thermoelectric module efficiency and degradation rate, radiator absorptivity and emissivity, fuel element barrier defect constant, beginning-of-life reactivity, etc. A probability distribution (mean and standard deviation) was estimated for each of these design parameters. Then, error analysis was used to obtain a probability distribution for the system lifetime (mean = 7.7 years, standard deviation = 1.1 years). From this, the probability that the system will achieve the design goal of 5 years lifetime is 0.993. This value represents an estimate of the degradation reliability of the system.
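The quoted reliability can be checked directly: treating lifetime as normally distributed with mean 7.7 years and standard deviation 1.1 years, the probability of exceeding the 5-year goal is the upper-tail normal probability, as the short calculation below shows.

```python
# Worked check of the quoted reliability: P(lifetime > 5 yr) for a normal
# lifetime with mean 7.7 yr and standard deviation 1.1 yr.
from math import erf, sqrt

def prob_exceeds(threshold, mean, sd):
    z = (threshold - mean) / sd
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))      # 1 - Phi(z)

print(round(prob_exceeds(5.0, 7.7, 1.1), 3))     # -> 0.993
```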
Domain fusion analysis by applying relational algebra to protein sequence and domain databases
Truong, Kevin; Ikura, Mitsuhiko
2003-01-01
Background Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages on these efforts will become increasingly powerful. Results This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypothesis. Results can be viewed at . Conclusion As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time. PMID:12734020
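Since the method is described as expressible in standard SQL, the following hedged sketch shows a domain-fusion style query run through Python's sqlite3; the schema, table rows, and the "Rosetta stone" example are invented for illustration and are not the authors' database.

```python
# Hedged sketch of a domain-fusion query in standard SQL via sqlite3.
# domain_hits records which (hypothetical) domain appears in which protein of
# which organism; the rows are made up for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE domain_hits (organism TEXT, protein TEXT, domain TEXT)")
con.executemany(
    "INSERT INTO domain_hits VALUES (?, ?, ?)",
    [
        ("H.sapiens", "fusionA", "D1"), ("H.sapiens", "fusionA", "D2"),
        ("S.cerevisiae", "yp1", "D1"),  ("S.cerevisiae", "yp2", "D2"),
    ],
)

# Candidate functional links: two separate proteins in one organism whose
# domains co-occur in a single 'Rosetta stone' protein elsewhere.
query = """
SELECT DISTINCT s1.organism, s1.protein, s2.protein
FROM domain_hits s1
JOIN domain_hits s2
  ON s1.organism = s2.organism AND s1.protein < s2.protein
JOIN domain_hits f1 ON f1.domain = s1.domain
JOIN domain_hits f2 ON f2.domain = s2.domain AND f2.protein = f1.protein
WHERE f1.protein NOT IN (s1.protein, s2.protein)
"""
for row in con.execute(query):
    print(row)   # -> ('S.cerevisiae', 'yp1', 'yp2')
```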
Effectiveness of Intelligent Tutoring Systems: A Meta-Analytic Review
ERIC Educational Resources Information Center
Kulik, James A.; Fletcher, J. D.
2016-01-01
This review describes a meta-analysis of findings from 50 controlled evaluations of intelligent computer tutoring systems. The median effect of intelligent tutoring in the 50 evaluations was to raise test scores 0.66 standard deviations over conventional levels, or from the 50th to the 75th percentile. However, the amount of improvement found in…
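The percentile claim follows from the normal distribution: a 0.66 standard-deviation gain moves a median student to about the 75th percentile, as the short calculation below confirms.

```python
# Arithmetic check: a 0.66 SD gain from the 50th percentile of a normal
# distribution lands at roughly the 75th percentile.
from math import erf, sqrt

def normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

print(f"{100 * normal_cdf(0.66):.1f}th percentile")   # -> 74.5th, i.e. about the 75th
```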
Analysis of USAREUR Family Housing.
1985-04-01
Only fragments of this scanned report are recoverable: acronym-list entries (Standard Installation/Division Personnel System; SJA, Staff Judge Advocate; SPSS, Statistical Package for the Social Sciences), a section on projecting family housing requirements noting that attempts to define USAREUR's programmable family housing deficit based on the FHS have caused anguish, and a note (Annex E) that ESC housing questionnaire responses were analyzed using the Statistical Package for the Social Sciences (SPSS) computer program.
A note on contagion indices for landscape analysis
Kurt H. Riitters; Robert V. O' Neill; James D. Wickham; K. Bruce Jones
1996-01-01
The landscape contagion index measures the degree of clumping of attributes on raster maps. The index is computed from the frequencies by which different pairs of attributes occur as adjacent pixels on a map. Because there are subtle differences in the way the attribute adjacencies may be tabulated, the standard index formula may not always apply, and published index...
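One common form of the contagion index is C = 1 + Σ_ij P_ij ln(P_ij) / (2 ln m), where P_ij is the proportion of adjacent pixel pairs with attributes i and j and m is the number of classes; as the note observes, published formulas differ in how adjacencies are tabulated. The sketch below uses one such convention (horizontal and vertical neighbours, both orderings counted) and is illustrative only, not the published index.

```python
# Hedged sketch of one common contagion-index convention for a raster map.
import numpy as np

def contagion(raster):
    raster = np.asarray(raster)
    classes = np.unique(raster)
    m = len(classes)
    index = {c: k for k, c in enumerate(classes)}
    counts = np.zeros((m, m))
    # Tabulate horizontal and vertical adjacencies, counting both orderings.
    for a, b in [(raster[:, :-1], raster[:, 1:]), (raster[:-1, :], raster[1:, :])]:
        for x, y in zip(a.ravel(), b.ravel()):
            counts[index[x], index[y]] += 1
            counts[index[y], index[x]] += 1
    p = counts / counts.sum()
    nonzero = p[p > 0]
    return 1.0 + np.sum(nonzero * np.log(nonzero)) / (2.0 * np.log(m))

rng = np.random.default_rng(4)
random_map = rng.integers(0, 2, size=(64, 64))
blocky_map = np.zeros((64, 64), dtype=int)
blocky_map[:, 32:] = 1
print(f"random map: {contagion(random_map):.2f}")   # near 0: adjacency types evenly mixed
print(f"blocky map: {contagion(blocky_map):.2f}")   # larger: attributes clumped into blocks
```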
FRAN: financial ratio analysis and more (Version 2.0 for Windows)
Bruce G. Hansen; Arnold J. Palmer, Jr.
1999-01-01
FRAN is a computer-based, stand-alone program designed to generate important financial and operating ratios from tax and wage forms filed with the Internal Revenue Service. FRAN generates standard profitability, financial/leverage, liquidity/solvency, and activity ratios, as well as unique measures of workforce and capital cost and acquisition. Information produced by...
NASA Astrophysics Data System (ADS)
Larsen, J. D.; Schaap, M. G.
2013-12-01
Recent advances in computing technology and experimental techniques have made it possible to observe and characterize fluid dynamics at the micro-scale. Many computational methods exist that can adequately simulate fluid flow in porous media. Lattice Boltzmann methods provide the distinct advantage of tracking particles at the microscopic level and returning macroscopic observations. While experimental methods can accurately measure macroscopic fluid dynamics, computational efforts can be used to predict and gain insight into fluid dynamics by utilizing thin sections or computed micro-tomography (CMT) images of core sections. Although substantial efforts have been made to advance non-invasive imaging methods such as CMT, fluid dynamics simulations, and microscale analysis, a true three-dimensional image segmentation technique has only recently been developed. Many competing segmentation techniques are utilized in industry and research settings with varying results. In this study the lattice Boltzmann method is used to simulate Stokes flow in a macroporous soil column. Two-dimensional CMT images were used to reconstruct a three-dimensional representation of the original sample. Six competing segmentation standards were used to binarize the CMT volumes, providing the distinction between solid phase and pore space. The permeability of the reconstructed samples was calculated, with Darcy's Law, from lattice Boltzmann simulations of fluid flow in the samples. We compare the simulated permeability from differing segmentation algorithms to experimental findings.
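A hedged sketch of the Darcy's-law step described above, assuming a body-force-driven lattice Boltzmann run in lattice units: permeability follows from the Darcy flux, the kinematic viscosity, and the driving force. All numerical values are made up for illustration.

```python
# Sketch of extracting permeability from a force-driven flow simulation via
# Darcy's law; values are illustrative and given in lattice units.
def darcy_permeability(mean_pore_velocity, porosity, kinematic_viscosity, body_force):
    # Darcy flux q = porosity * <u>;  q = (k / nu) * g  =>  k = nu * q / g
    q = porosity * mean_pore_velocity
    return kinematic_viscosity * q / body_force

k_lu = darcy_permeability(mean_pore_velocity=2.4e-4, porosity=0.38,
                          kinematic_viscosity=1.0 / 6.0, body_force=1.0e-5)
print(f"permeability ~ {k_lu:.3g} lattice units^2")
```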
77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...
An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence
NASA Astrophysics Data System (ADS)
Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras
2014-05-01
We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) Analysis, the Wavelet and Intermittency Analysis, and the Probability Density Functions (PDF) Analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) and the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles, such as the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
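As a rough illustration of two of the quantities the library computes, the sketch below estimates a periodogram-style power spectral density and the flatness (normalized fourth moment) of signal increments across scales, a common intermittency diagnostic; the test signal is synthetic and the code is not part of INA.

```python
# Sketch of a periodogram-style PSD and the scale-dependent flatness of
# increments for a synthetic time series; not part of the INA library.
import numpy as np

def psd(signal, dt=1.0):
    spec = np.fft.rfft(signal - signal.mean())
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    return freqs, (np.abs(spec) ** 2) * dt / len(signal)

def flatness(signal, scales):
    out = []
    for tau in scales:
        incr = signal[tau:] - signal[:-tau]
        out.append(np.mean(incr ** 4) / np.mean(incr ** 2) ** 2)
    return np.array(out)   # ~3 for a Gaussian process, larger if intermittent

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(size=4096))             # Brownian-like test signal
f, p = psd(x)
print("flatness at scales 1, 4, 16:", np.round(flatness(x, [1, 4, 16]), 2))
```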
21 CFR 1311.08 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of Standards and Technology, Computer Security Division, Information Technology Laboratory, National... standards are available from the National Institute of Standards and Technology, Computer Security Division... 140-2, Security Requirements for Cryptographic Modules, May 25, 2001, as amended by Change Notices 2...
Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André
2010-01-01
Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collection systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
Visualizing and Validating Metadata Traceability within the CDISC Standards.
Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine
2017-01-01
The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Information Standards Consortium data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information.
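A minimal sketch of the gap-detection idea: model traceability as a directed graph from analysis results back toward source data and flag any result whose traversal dead-ends before reaching a source node. The item names and edges below are invented for illustration and are not CDISC metadata.

```python
# Sketch of traceability-gap detection by breadth-first traversal of a
# derivation graph; names and edges are hypothetical, not CDISC metadata.
from collections import deque

edges = {                       # item -> items it is derived from
    "ADaM.AVAL": ["SDTM.LBORRES"],
    "SDTM.LBORRES": ["CRF.LBTEST"],
    "ADaM.CHG": ["ADaM.AVAL", "ADaM.BASE"],
    "ADaM.BASE": [],            # predecessor metadata missing -> gap
}
sources = {"CRF.LBTEST"}

def has_traceability_gap(item):
    seen, queue = set(), deque([item])
    while queue:
        node = queue.popleft()
        if node in sources:
            continue                       # this branch reaches source data
        preds = edges.get(node, [])
        if not preds:
            return True                    # dead end before any source node
        queue.extend(p for p in preds if p not in seen)
        seen.update(preds)
    return False

for result in ("ADaM.AVAL", "ADaM.CHG"):
    print(result, "gap" if has_traceability_gap(result) else "fully traceable")
```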
Visualizing and Validating Metadata Traceability within the CDISC Standards
Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine
2017-01-01
The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Information Standards Consortium data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information. PMID:28815125
Critical Software for Human Spaceflight
NASA Technical Reports Server (NTRS)
Preden, Antonio; Kaschner, Jens; Rettig, Felix; Rodriggs, Michael
2017-01-01
The NASA Orion vehicle that will fly to the moon in the next years is propelled along its mission by the European Service Module (ESM), developed by ESA and its prime contractor Airbus Defense and Space. This paper describes the development of the Propulsion Drive Electronics (PDE) Software that provides the interface between the propulsion hardware of the European Service Module and the Orion flight computers, and highlights the challenges that have been faced during the development. Particularly, the specific aspects relevant to Human Spaceflight in an international cooperation are presented, such as compliance with both European and US standards and the software criticality classification to the highest category A. An innovative aspect of the PDE SW is its Time-Triggered Ethernet interface with the Orion Flight Computers, which has never been flown so far on any European spacecraft. Finally, the verification aspects are presented, applying the most exigent quality requirements defined in the European Cooperation for Space Standardization (ECSS) standards, such as the structural coverage analysis of the object code and recourse to an independent software verification and validation activity carried out in parallel by a different team.
On pp wave limit for η deformed superstrings
NASA Astrophysics Data System (ADS)
Roychowdhury, Dibakar
2018-05-01
In this paper, based on the notion of plane wave string/gauge theory duality, we explore the pp wave limit associated with the bosonic sector of η deformed superstrings propagating in (AdS_5 × S^5)_η. Our analysis reveals that in the presence of NS-NS and RR fluxes, the pp wave limit associated to the full ABF background satisfies the type IIB equations in their standard form. However, the beta functions as well as the string Hamiltonian start receiving non-trivial curvature corrections as one starts probing beyond the pp wave limit, which thereby takes solutions away from the standard type IIB form. Furthermore, using uniform gauge, we also explore the BMN dynamics associated with short strings and compute the corresponding Hamiltonian density. Finally, we explore the Penrose limit associated with the HT background and compute the corresponding stringy spectrum for the bosonic sector.
System Identification Applied to Dynamic CFD Simulation and Wind Tunnel Data
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.; Vicroy, Dan D.
2011-01-01
Demanding aerodynamic modeling requirements for military and civilian aircraft have provided impetus for researchers to improve computational and experimental techniques. Model validation is a key component for these research endeavors, so this study is an initial effort to extend conventional time history comparisons by comparing model parameter estimates and their standard errors using system identification methods. An aerodynamic model of an aircraft performing one-degree-of-freedom roll oscillatory motion about its body axes is developed. The model includes linear aerodynamics and deficiency function parameters characterizing an unsteady effect. For estimation of unknown parameters two techniques, harmonic analysis and two-step linear regression, were applied to roll-oscillatory wind tunnel data and to computational fluid dynamics (CFD) simulated data. The model used for this study is a highly swept wing unmanned aerial combat vehicle. Differences in response prediction, parameter estimates, and standard errors are compared and discussed.
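A minimal sketch of the least-squares step underlying such parameter estimation, assuming a simple linear roll-moment model fitted to synthetic oscillatory data; the coefficient names and values are illustrative, not the vehicle's actual derivatives, and the standard errors come from the usual ordinary-least-squares covariance estimate.

```python
# Sketch of fitting C_l = C_lp * p_hat + C_l_da * da to synthetic oscillatory
# data and reporting estimates with standard errors; values are illustrative.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0.0, 10.0, 500)
p_hat = 0.05 * np.cos(2.0 * np.pi * 0.5 * t)       # non-dimensional roll rate
da = 0.10 * np.sin(2.0 * np.pi * 0.5 * t)          # aileron deflection, rad
cl = -0.45 * p_hat + 0.12 * da + rng.normal(0.0, 0.002, t.size)

X = np.column_stack([p_hat, da])
theta, residuals, *_ = np.linalg.lstsq(X, cl, rcond=None)
sigma2 = residuals[0] / (len(cl) - X.shape[1])      # residual variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)               # OLS parameter covariance
for name, est, se in zip(["C_lp", "C_l_da"], theta, np.sqrt(np.diag(cov))):
    print(f"{name}: {est:+.4f} +/- {se:.4f}")
```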
Computational complexity of the landscape II-Cosmological considerations
NASA Astrophysics Data System (ADS)
Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire
2018-05-01
We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.
Developments in REDES: The rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) is being developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
Developments in REDES: The Rocket Engine Design Expert System
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) was developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP; a nozzle design program named RAO; a regenerative cooling channel performance evaluation code named RTE; and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES was built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis
Sloto, Ronald A.; Crouse, Michele Y.
1996-01-01
HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
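To make the flavor of these separations concrete, the sketch below implements a fixed-interval style separation in Python: the minimum daily discharge within each non-overlapping interval is assigned as base flow for every day of that interval. It is a simplified illustration only; HYSEP itself derives the interval width from drainage area and also offers the sliding-interval and local-minimum variants, none of which are reproduced here.

```python
import numpy as np

def fixed_interval_baseflow(daily_discharge, interval_days):
    """Fixed-interval hydrograph separation (simplified sketch).

    The minimum discharge in each non-overlapping block of `interval_days`
    is assigned as base flow for that block; the remainder is surface runoff.
    """
    q = np.asarray(daily_discharge, dtype=float)
    baseflow = np.empty_like(q)
    for start in range(0, len(q), interval_days):
        block = q[start:start + interval_days]
        baseflow[start:start + interval_days] = block.min()
    surface_runoff = q - baseflow
    return baseflow, surface_runoff
```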
No Evidence for Extensions to the Standard Cosmological Model.
Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna
2017-09-08
We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).
No Evidence for Extensions to the Standard Cosmological Model
NASA Astrophysics Data System (ADS)
Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna
2017-09-01
We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).
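The records above describe estimating Bayes factors directly from existing MCMC chains via nearest-neighbor distances. The sketch below illustrates the generic idea in Python under simplifying assumptions: a k-th nearest-neighbor density estimate of the posterior at each sample, combined with the unnormalized posterior value there, yields per-sample estimates of the evidence whose median is reported; lnB then follows as the difference of two such lnZ values. This is a schematic reconstruction, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def log_evidence_knn(samples, log_post_unnorm, k=5):
    """Estimate ln Z from posterior samples.

    samples          -- array (n, d) of MCMC samples
    log_post_unnorm  -- ln L(theta_i) + ln pi(theta_i) for each sample
    """
    n, d = samples.shape
    tree = cKDTree(samples)
    # distance to the k-th nearest neighbour (index 0 of the query is the point itself)
    r_k = tree.query(samples, k=k + 1)[0][:, -1]
    # log volume of a d-ball of radius r_k
    log_ball = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0) + d * np.log(r_k)
    # kNN estimate of the (normalized) posterior density at each sample
    log_p_hat = np.log(k) - np.log(n) - log_ball
    # each sample gives an estimate of ln Z; take the median for robustness
    return np.median(log_post_unnorm - log_p_hat)

# ln B between two models is then log_evidence_knn(...) for model 1 minus model 2.
```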
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets
Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298
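A minimal sketch of how a confidence interval for a percentage difference in wall-clock time could be obtained, assuming per-run timing samples for the two providers are available; the pairing and CI construction actually used by the authors may differ.

```python
import numpy as np

def bootstrap_pct_difference(times_a, times_b, n_boot=10000, seed=0):
    """Bootstrap a 95% CI for the mean percentage difference (A - B) / A * 100."""
    rng = np.random.default_rng(seed)
    a_all = np.asarray(times_a, dtype=float)
    b_all = np.asarray(times_b, dtype=float)
    stats = []
    for _ in range(n_boot):
        a = rng.choice(a_all, size=a_all.size, replace=True)   # resample runs with replacement
        b = rng.choice(b_all, size=b_all.size, replace=True)
        stats.append((a.mean() - b.mean()) / a.mean() * 100.0)
    return np.percentile(stats, [2.5, 50.0, 97.5])              # lower CI, median, upper CI
```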
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
Computational fluid dynamics of airfoils and wings
NASA Technical Reports Server (NTRS)
Garabedian, P.; Mcfadden, G.
1982-01-01
It is pointed out that transonic flow is one of the fields where computational fluid dynamics turns out to be most effective. Codes for the design and analysis of supercritical airfoils and wings have become standard tools of the aircraft industry. The present investigation is concerned with mathematical models and theorems which account for some of the progress that has been made. The most successful aerodynamics codes are those for the analysis of flow at off-design conditions where weak shock waves appear. A major breakthrough was achieved by Murman and Cole (1971), who conceived of a retarded difference scheme which incorporates artificial viscosity to capture shocks in the supersonic zone. This concept has been used to develop codes for the analysis of transonic flow past a swept wing. Attention is given to the trailing edge and the boundary layer, entropy inequalities and wave drag, shockless airfoils, and the inverse swept wing code.
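A minimal sketch of the type-dependent ("retarded") differencing idea credited to Murman and Cole, written in Python for a one-dimensional slice of the transonic small-disturbance potential: central differencing is used at subsonic points and upwind differencing at supersonic points, which supplies the artificial viscosity needed to capture shocks. The coefficient test and stencils are standard textbook forms, not code from the cited work.

```python
GAMMA = 1.4  # ratio of specific heats (air)

def phi_xx_type_dependent(phi, i, dx, m_inf):
    """Type-dependent second difference of the perturbation potential phi (1-D array).

    The local coefficient A = 1 - M_inf^2 - (gamma + 1) M_inf^2 phi_x decides the flow type:
    A > 0 (subsonic) -> central difference; A <= 0 (supersonic) -> upwind difference.
    Requires i >= 2.
    """
    phi_x = (phi[i + 1] - phi[i - 1]) / (2.0 * dx)
    a_coef = 1.0 - m_inf**2 - (GAMMA + 1.0) * m_inf**2 * phi_x
    if a_coef > 0.0:
        # subsonic point: standard central second difference
        return (phi[i + 1] - 2.0 * phi[i] + phi[i - 1]) / dx**2
    # supersonic point: retarded (upwind) second difference adds artificial viscosity
    return (phi[i] - 2.0 * phi[i - 1] + phi[i - 2]) / dx**2
```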
CASA: tracking the past and plotting the future.
Gallagher, M T; Smith, D J; Kirkman-Brown, J C
2018-05-29
The human semen sample carries a wealth of information of varying degrees of accessibility ranging from the traditional visual measures of count and motility to those that need a more computational approach, such as tracking the flagellar waveform. Although computer-aided sperm analysis (CASA) options are becoming more widespread, the gold standard for clinical semen analysis requires trained laboratory staff. In this review we characterise the key attitudes towards the use of CASA and set out areas in which CASA should, and should not, be used and improved. We provide an overview of the current CASA landscape, discussing clinical uses as well as potential areas for the clinical translation of existing research technologies. Finally, we discuss where we see potential for the future of CASA, and how the integration of mathematical modelling and new technologies, such as automated flagellar tracking, may open new doors in clinical semen analysis.
Sobel, E.; Lange, K.
1996-01-01
The introduction of stochastic methods in pedigree analysis has enabled geneticists to tackle computations intractable by standard deterministic methods. Until now these stochastic techniques have worked by running a Markov chain on the set of genetic descent states of a pedigree. Each descent state specifies the paths of gene flow in the pedigree and the founder alleles dropped down each path. The current paper follows up on a suggestion by Elizabeth Thompson that genetic descent graphs offer a more appropriate space for executing a Markov chain. A descent graph specifies the paths of gene flow but not the particular founder alleles traveling down the paths. This paper explores algorithms for implementing Thompson's suggestion for codominant markers in the context of automatic haplotyping, estimating location scores, and computing gene-clustering statistics for robust linkage analysis. Realistic numerical examples demonstrate the feasibility of the algorithms. PMID:8651310
Development of the NASA/FLAGRO computer program for analysis of airframe structures
NASA Technical Reports Server (NTRS)
Forman, R. G.; Shivakumar, V.; Newman, J. C., Jr.
1994-01-01
The NASA/FLAGRO (NASGRO) computer program was developed for fracture control analysis of space hardware and is currently the standard computer code in NASA, the U.S. Air Force, and the European Space Agency (ESA) for this purpose. The significant attributes of the NASGRO program are the numerous crack case solutions, the large materials file, the improved growth rate equation based on crack closure theory, and the user-friendly promptive input features. In support of the National Aging Aircraft Research Program (NAARP), NASGRO is being further developed to provide advanced state-of-the-art capability for damage tolerance and crack growth analysis of aircraft structural problems, including mechanical systems and engines. The project currently involves a cooperative development effort by NASA, FAA, and ESA. The primary tasks underway are the incorporation of advanced methodology for crack growth rate retardation resulting from spectrum loading and improved analysis for determining crack instability. Also, the current weight function solutions in NASGRO for nonlinear stress gradient problems are being extended to more crack cases, and the 2-D boundary integral routine for stress analysis and stress-intensity factor solutions is being extended to 3-D problems. Lastly, effort is underway to enhance the program to operate on personal computers and workstations in a Windows environment. Because of the increasing and already wide usage of NASGRO, the code offers an excellent mechanism for technology transfer of new fatigue and fracture mechanics capabilities developed within NAARP.
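For readers unfamiliar with crack growth life prediction, the sketch below integrates the simple Paris law da/dN = C (ΔK)^m in Python to estimate cycles to failure. It is only a conceptual stand-in: NASGRO's growth rate equation additionally accounts for crack closure, threshold, and instability effects, and the constants used here are illustrative, not taken from the NASGRO materials file.

```python
import numpy as np

def cycles_to_failure(a0, a_crit, delta_sigma, C, m, Y=1.0, n_steps=20000):
    """Integrate da/dN = C (dK)^m from initial crack size a0 to a_crit (Paris law sketch)."""
    a_grid = np.linspace(a0, a_crit, n_steps)
    delta_K = Y * delta_sigma * np.sqrt(np.pi * a_grid)   # stress-intensity factor range
    dN_da = 1.0 / (C * delta_K**m)                        # cycles per unit crack extension
    return np.trapz(dN_da, a_grid)                        # total cycles N

# Illustrative values only (lengths in m, stress in MPa, C consistent with those units):
N = cycles_to_failure(a0=1e-3, a_crit=0.02, delta_sigma=100.0, C=1e-11, m=3.0)
```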
Cosmochemistry: Understanding the Solar System through analysis of extraterrestrial materials
MacPherson, Glenn J.; Thiemens, Mark H.
2011-01-01
Cosmochemistry is the chemical analysis of extraterrestrial materials. This term generally is taken to mean laboratory analysis, which is the cosmochemistry gold standard because of the ability for repeated analysis under highly controlled conditions using the most advanced instrumentation unhindered by limitations in power, space, or environment. Over the past 40 y, advances in technology have enabled telescopic and spacecraft instruments to provide important data that significantly complement the laboratory data. In this special edition, recent advances in the state of the art of cosmochemistry are presented, which range from instrumental analysis of meteorites to theoretical–computational and astronomical observations. PMID:22128323
Roche, Benjamin; Guégan, Jean-François; Bousquet, François
2008-10-15
Computational biology is often associated with genetic or genomic studies only. However, thanks to the increase of computational resources, computational models are appreciated as useful tools in many other scientific fields. Such modeling systems are particularly relevant for the study of complex systems, like the epidemiology of emerging infectious diseases. So far, mathematical models remain the main tool for the epidemiological and ecological analysis of infectious diseases, with SIR models serving as an implicit standard in epidemiology. Unfortunately, these models are based on differential equations and can therefore very rapidly become unmanageable because of the many parameters that need to be taken into consideration. For instance, in the case of zoonotic and vector-borne diseases in wildlife, many different potential host species may be involved in the life cycle of disease transmission, and SIR models might not be the most suitable tool to truly capture the overall disease circulation within that environment. This limitation underlines the necessity to develop a standard spatial model that can cope with the transmission of disease in realistic ecosystems. Computational biology may prove to be flexible enough to take into account the natural complexity observed in both natural and man-made ecosystems. In this paper, we propose a new computational model to study the transmission of infectious diseases in a spatially explicit context. We developed a multi-agent system model for vector-borne disease transmission in a realistic spatial environment. Here we describe in detail the general behavior of this model, which we hope will become a standard reference for the study of vector-borne disease transmission in wildlife. To conclude, we show how this simple model could be easily adapted and modified to be used as a common framework for further research developments in this field.
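For context, the differential-equation baseline the authors contrast with their agent-based approach is the classic SIR model. The Python sketch below integrates it; the rate constants and initial conditions are illustrative placeholders, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    """Classic SIR compartment model: susceptible, infected, recovered fractions."""
    s, i, r = y
    return [-beta * s * i,            # dS/dt
            beta * s * i - gamma * i, # dI/dt
            gamma * i]                # dR/dt

# Illustrative parameters: transmission rate beta=0.3/day, recovery rate gamma=0.1/day
sol = solve_ivp(sir, (0.0, 160.0), [0.99, 0.01, 0.0], args=(0.3, 0.1), dense_output=True)
```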
Computation of forces from deformed visco-elastic biological tissues
NASA Astrophysics Data System (ADS)
Muñoz, José J.; Amat, David; Conte, Vito
2018-04-01
We present a least-squares based inverse analysis of visco-elastic biological tissues. The proposed method computes the set of contractile forces (dipoles) at the cell boundaries that induce the observed and quantified deformations. We show that the computation of these forces requires the regularisation of the problem functional for some load configurations that we study here. The functional measures the error of the dynamic problem being discretised in time with a second-order implicit time-stepping and in space with standard finite elements. We analyse the uniqueness of the inverse problem and estimate the regularisation parameter by means of an L-curve criterion. We apply the methodology to a simple toy problem and to an in vivo set of morphogenetic deformations of the Drosophila embryo.
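The regularisation and L-curve steps can be illustrated generically. The Python sketch below solves a Tikhonov-regularised least-squares problem over a grid of regularisation parameters and returns the residual and solution norms whose log-log plot forms the L-curve; the matrices are placeholders, and the paper's finite-element operators and second-order time stepping are not reproduced.

```python
import numpy as np

def tikhonov_l_curve(A, b, lambdas):
    """Solve min ||A f - b||^2 + lam^2 ||f||^2 over a grid of lam and return L-curve data."""
    n = A.shape[1]
    residual_norms, solution_norms, solutions = [], [], []
    for lam in lambdas:
        # normal equations of the regularised problem
        f = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
        solutions.append(f)
        residual_norms.append(np.linalg.norm(A @ f - b))   # data misfit
        solution_norms.append(np.linalg.norm(f))           # solution size
    return np.array(residual_norms), np.array(solution_norms), solutions
```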
Standardized input for Hanford environmental impact statements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.
1981-05-01
Models and computer programs for simulating the environmental behavior of radionuclides in the environment and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.
Comparison of FDNS liquid rocket engine plume computations with SPF/2
NASA Technical Reports Server (NTRS)
Kumar, G. N.; Griffith, D. O., II; Warsi, S. A.; Seaford, C. M.
1993-01-01
Prediction of a plume's shape and structure is essential to the evaluation of base region environments. The JANNAF standard plume flowfield analysis code SPF/2 predicts plumes well, but cannot analyze base regions. Full Navier-Stokes CFD codes can calculate both zones; however, before they can be used, they must be validated. The CFD code FDNS3D (Finite Difference Navier-Stokes Solver) was used to analyze the single plume of a Space Transportation Main Engine (STME) and comparisons were made with SPF/2 computations. Both frozen and finite rate chemistry models were employed as well as two turbulence models in SPF/2. The results indicate that FDNS3D plume computations agree well with SPF/2 predictions for liquid rocket engine plumes.
Adaptive relaxation for the steady-state analysis of Markov chains
NASA Technical Reports Server (NTRS)
Horton, Graham
1994-01-01
We consider a variant of the well-known Gauss-Seidel method for the solution of Markov chains in steady state. Whereas the standard algorithm visits each state exactly once per iteration in a predetermined order, the alternative approach uses a dynamic strategy. A set of states to be visited is maintained which can grow and shrink as the computation progresses. In this manner, we hope to concentrate the computational work in those areas of the chain in which maximum improvement in the solution can be achieved. We consider the adaptive approach both as a solver in its own right and as a relaxation method within the multi-level algorithm. Experimental results show significant computational savings in both cases.
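A minimal Python sketch of the dynamic work-set idea for Gauss-Seidel applied to the steady-state equations π = πP of a discrete-time Markov chain: a state is re-updated only while its value is still changing, and a change propagates only to the states whose updates depend on it. The chain is assumed irreducible with P[i, i] < 1; the paper's multi-level algorithm is not reproduced.

```python
import numpy as np

def steady_state_adaptive_gs(P, tol=1e-10, max_updates=1_000_000):
    """Adaptive Gauss-Seidel for pi = pi P, with P a row-stochastic matrix (sketch)."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    active = set(range(n))          # states still scheduled for an update
    updates = 0
    while active and updates < max_updates:
        i = active.pop()
        # Gauss-Seidel update: pi_i = sum_{j != i} pi_j P[j, i] / (1 - P[i, i])
        new_val = (pi @ P[:, i] - pi[i] * P[i, i]) / (1.0 - P[i, i])
        change = abs(new_val - pi[i])
        pi[i] = new_val
        updates += 1
        if change > tol:
            # the update of state j depends on pi_i only if P[i, j] > 0
            active.update(np.nonzero(P[i, :])[0].tolist())
    return pi / pi.sum()            # renormalize to a probability vector
```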
Operating manual for coaxial injection combustion model. [for the space shuttle main engine
NASA Technical Reports Server (NTRS)
Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.
1974-01-01
An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort designed to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in the delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.
Comprehensive analysis of a Radiology Operations Management computer system.
Arenson, R L; London, J W
1979-11-01
The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.
On a computational model of building thermal dynamic response
NASA Astrophysics Data System (ADS)
Jarošová, Petra; Vala, Jiří
2016-07-01
Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive computational tools, supporting their design and optimization of energy consumption. This paper demonstrates the possibility of consideration of such seemingly contradictory requirements, using the simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
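The electric-circuit analogy mentioned above amounts, in its simplest form, to a lumped resistance-capacitance model of a single thermal zone. The Python sketch below integrates such a one-node model; the resistance, capacitance, outdoor temperature profile, and heating power are illustrative placeholders, and the paper's multi-node model and semi-analytical method-of-lines solution are not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

def room_temperature(t, T, R, C, T_out, Q_heat):
    """One-node RC analogy: C dT/dt = (T_out(t) - T)/R + Q_heat(t)."""
    return [(T_out(t) - T[0]) / (R * C) + Q_heat(t) / C]

# Illustrative values: R = 0.005 K/W, C = 2e7 J/K, sinusoidal daily outdoor temperature,
# constant 1.5 kW heating; simulate 48 hours.
sol = solve_ivp(
    room_temperature, (0.0, 48 * 3600.0), [20.0],
    args=(5e-3, 2e7,
          lambda t: 5.0 + 5.0 * np.sin(2.0 * np.pi * t / 86400.0),
          lambda t: 1500.0),
    max_step=600.0,
)
```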
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides user with powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aerodynamics of aircraft. Intended for use in software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in linear model of system. Designed to provide easy selection of state, control, and observation variables used in particular model. Also provides flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
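The core operation such a tool performs is the extraction of a linear state-space model from nonlinear equations of motion about a chosen trim point. The Python sketch below does this generically with one-sided finite differences; it is a schematic stand-in for illustration, not LINEAR's FORTRAN implementation, and f, x0, and u0 are user-supplied placeholders.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference Jacobians A = df/dx and B = df/du of x_dot = f(x, u) at (x0, u0)."""
    f0 = np.asarray(f(x0, u0), dtype=float)
    n, m = x0.size, u0.size
    A = np.zeros((f0.size, n))
    B = np.zeros((f0.size, m))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (np.asarray(f(x0 + dx, u0)) - f0) / eps   # state Jacobian column
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        B[:, j] = (np.asarray(f(x0, u0 + du)) - f0) / eps   # control Jacobian column
    return A, B
```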
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.
2016-01-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692
Influence of computer work under time pressure on cardiac activity.
Shi, Ping; Hu, Sijung; Yu, Hongliu
2015-03-01
Computer users are often under stress when required to complete computer work within a required time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rate and decreased standard deviation of R-R interval were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the relaxation computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
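For reference, the two descriptors usually read off a Poincaré plot can be computed directly from the R-R interval series. The Python sketch below uses the standard SD1/SD2 formulas; it illustrates the kind of quantitative Poincaré analysis referred to above, not the authors' exact processing pipeline.

```python
import numpy as np

def poincare_sd1_sd2(rr_intervals_ms):
    """SD1/SD2 descriptors of a Poincare plot of successive R-R intervals (in ms)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)           # short-term (beat-to-beat) variability
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - sd1**2)    # long-term variability
    return sd1, sd2
```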
An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.
Szeto, Grace P; Lee, Raymond
2002-04-01
To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using a desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between ages 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task by using each of the 3 computers. Measurements during the typing task were taken at set intervals. Cervical and thoracic spines adopted a more flexed posture in using the smaller-sized computers. There were significantly greater neck movements in using desktop computers when compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and potential risk of developing musculoskeletal discomfort in choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation
Dorfman, David M; LaPlante, Charlotte D; Li, Betty
2016-09-01
We analyzed plasma cell populations in bone marrow samples from 353 patients with possible bone marrow involvement by a plasma cell neoplasm, using FLOCK (FLOw Clustering without K), an unbiased, automated, computational approach to identify cell subsets in multidimensional flow cytometry data. FLOCK identified discrete plasma cell populations in the majority of bone marrow specimens found by standard histologic and immunophenotypic criteria to be involved by a plasma cell neoplasm (202/208 cases; 97%), including 34 cases that were negative by standard flow cytometric analysis that included clonality assessment. FLOCK identified discrete plasma cell populations in only a minority of cases negative for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria (38/145 cases; 26%). Interestingly, 55% of the cases negative by standard analysis, but containing a FLOCK-identified discrete plasma cell population, were positive for monoclonal gammopathy by serum protein electrophoresis and immunofixation. FLOCK-identified and quantitated plasma cell populations accounted for 3.05% of total cells on average in cases positive for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria, and 0.27% of total cells on average in cases negative for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria (p<0.0001; area under the curve by ROC analysis=0.96). The presence of a FLOCK-identified discrete plasma cell population was predictive of the presence of plasma cell neoplasia with a sensitivity of 97%, compared with only 81% for standard flow cytometric analysis, and had specificity of 74%, PPV of 84% and NPV of 95%. FLOCK analysis, which has been shown to provide useful diagnostic information for evaluating patients with suspected systemic mastocytosis, is able to identify neoplastic plasma cell populations analyzed by flow cytometry, and may be helpful in the diagnostic evaluation of bone marrow samples for involvement by plasma cell neoplasia. Copyright © 2016 Elsevier Ltd. All rights reserved.
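The headline performance figures quoted above follow directly from the reported counts. The short Python sketch below computes sensitivity, specificity, PPV, and NPV from a 2x2 table, with the counts implied by the abstract (202 true positives, 6 false negatives, 38 false positives, 107 true negatives) as a usage example; the AUC requires the full score distribution and is not reproduced here.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)   # positive predictive value
    npv = tn / (tn + fn)   # negative predictive value
    return sensitivity, specificity, ppv, npv

# Counts implied by the abstract: 202/208 positives detected, 38/145 negatives flagged
sens, spec, ppv, npv = diagnostic_metrics(tp=202, fp=38, fn=6, tn=107)
# -> roughly 0.97, 0.74, 0.84, 0.95, matching the reported figures
```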
Engine dynamic analysis with general nonlinear finite element codes
NASA Technical Reports Server (NTRS)
Adams, M. L.; Padovan, J.; Fertis, D. G.
1991-01-01
A general engine dynamic analysis as a standard design study computational tool is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial and error process done with engine hardware development.
General aviation air traffic pattern safety analysis
NASA Technical Reports Server (NTRS)
Parker, L. C.
1973-01-01
A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage and retrieval, and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.
1988-11-01
Structured analysis and design techniques use graphic tools to give users, analysts, and designers a clear and common picture of a system and how its parts fit together, decomposing the system into hierarchies of processes, data flows, data stores, and external entities suitable for computer implementation.
Fully automated motion correction in first-pass myocardial perfusion MR image sequences.
Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2008-11-01
This paper presents a novel method for registration of cardiac perfusion magnetic resonance imaging (MRI). The presented method is capable of automatically registering perfusion data, using independent component analysis (ICA) to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of that ICA. This reference image is used in a two-pass registration framework. Qualitative and quantitative validation of the method is carried out using 46 clinical quality, short-axis, perfusion MR datasets comprising 100 images each. Despite varying image quality and motion patterns in the evaluation set, validation of the method showed a reduction of the average left ventricle (LV) motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration with an average error reduced from 2.65+/-7.89% to 0.87+/-3.88% between registered data and manual gold standard. Comparison of clinically relevant parameters computed using registered data and the manual gold standard show a good agreement. Additional tests with a simulated free-breathing protocol showed robustness against considerable deviations from a standard breathing protocol. We conclude that this fully automatic ICA-based method shows an accuracy, a robustness and a computation speed adequate for use in a clinical environment.
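A minimal sketch of the ICA step described above, assuming the perfusion series is available as a NumPy array of frames: FastICA from scikit-learn separates the series into spatial components and associated time-intensity courses. The component selection, reference-image construction, and two-pass registration of the actual method are not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_time_courses(frames, n_components=3):
    """Decompose an image time series (n_frames, ny, nx) into spatial maps and time courses."""
    n_frames = frames.shape[0]
    X = frames.reshape(n_frames, -1)                     # one row per time frame
    ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
    time_courses = ica.fit_transform(X)                  # (n_frames, n_components)
    spatial_maps = ica.components_.reshape(n_components, *frames.shape[1:])
    return time_courses, spatial_maps
```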
Computer-assisted image analysis to quantify daily growth rates of broiler chickens.
De Wet, L; Vranken, E; Chedad, A; Aerts, J M; Ceunen, J; Berckmans, D
2003-09-01
1. The objective was to investigate the possibility of detecting daily body weight changes of broiler chickens with computer-assisted image analysis. 2. The experiment included 50 broiler chickens reared under commercial conditions. Ten out of 50 chickens were randomly selected and video recorded (upper view) 18 times during the 42-d growing period. The number of surface and periphery pixels from the images was used to derive a relationship between body dimension and live weight. 3. The relative error in weight estimation, expressed in terms of the standard deviation of the residuals from image surface data was 10%, while it was found to be 15% for the image periphery data. 4. Image-processing systems could be developed to assist the farmer in making important management and marketing decisions.
NASA Technical Reports Server (NTRS)
Wardroper, A. M. K.; Brooks, P. W.; Humberston, M. J.; Maxwell, J. R.
1977-01-01
A computer method is described for the automatic classification of triterpanes and steranes into gross structural type from their mass spectral characteristics. The method has been applied to the spectra obtained by gas-chromatographic/mass-spectroscopic analysis of two mixtures of standards and of hydrocarbon fractions isolated from Green River and Messel oil shales. Almost all of the steranes and triterpanes identified previously in both shales were classified, in addition to a number of new components. The results indicate that classification of such alkanes is possible with a laboratory computer system. The method has application to diagenesis and maturation studies as well as to oil/oil and oil/source rock correlations in which rapid screening of large numbers of samples is required.
Energy Frontier Research With ATLAS: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, John; Black, Kevin; Ahlen, Steve
2016-06-14
The Boston University (BU) group is playing key roles across the ATLAS experiment: in detector operations, the online trigger, the upgrade, computing, and physics analysis. Our team has been critical to the maintenance and operations of the muon system since its installation. During Run 1 we led the muon trigger group and that responsibility continues into Run 2. BU maintains and operates the ATLAS Northeast Tier 2 computing center. We are actively engaged in the analysis of ATLAS data from Run 1 and Run 2. Physics analyses we have contributed to include Standard Model measurements (W and Z cross sections, ttbar differential cross sections, WWW* production), evidence for the Higgs decaying to τ+τ-, and searches for new phenomena (technicolor, Z' and W', vector-like quarks, dark matter).
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
Frequency modulation television analysis: Distortion analysis
NASA Technical Reports Server (NTRS)
Hodge, W. H.; Wong, W. H.
1973-01-01
Computer simulation is used to calculate the time-domain waveform of a standard T-pulse-and-bar test signal distorted in passing through an FM television system. The simulator includes flat or preemphasized systems and requires specification of the RF predetection filter characteristics. The predetection filters are modeled with frequency-symmetric Chebyshev (0.1-dB ripple) and Butterworth filters. The computer was used to calculate distorted output signals for sixty-four different specified systems, and the output waveforms are plotted for all sixty-four. Comparison of the plotted graphs indicates that a Chebyshev predetection filter of four poles causes slightly more signal distortion than a corresponding Butterworth filter and the signal distortion increases as the number of poles increases. An increase in the peak deviation also increases signal distortion. Distortion also increases with the addition of preemphasis.
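A minimal sketch of how such predetection filters can be compared numerically, assuming SciPy; the sampling rate, bandwidth, and the rectangular stand-in for the T-pulse-and-bar signal are illustrative choices, not the simulator's actual parameters.

```python
import numpy as np
from scipy import signal

fs = 80e6                         # assumed sampling rate, Hz
cutoff = 15e6                     # assumed predetection bandwidth edge, Hz
wn = cutoff / (fs / 2.0)          # normalized cutoff for the filter design routines

# Four-pole predetection filters of the two families compared in the paper
b_cheb, a_cheb = signal.cheby1(4, 0.1, wn)   # Chebyshev, 0.1-dB passband ripple
b_butt, a_butt = signal.butter(4, wn)        # maximally flat Butterworth

t = np.arange(0.0, 2e-5, 1.0 / fs)
test_signal = np.where((t > 5e-6) & (t < 6e-6), 1.0, 0.0)   # crude pulse stand-in

out_cheb = signal.lfilter(b_cheb, a_cheb, test_signal)
out_butt = signal.lfilter(b_butt, a_butt, test_signal)
```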
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.
He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei
2017-05-01
This study develops a meta-modeling based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, a high satisfaction level resulting from the implementation of a flexible standard indicates the degree to which the environmental standard is satisfied. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate, and a stringent risk standard implies a high total pumping rate. The wells located near or down-gradient of the contaminant sources have the most significant efficiency among all remediation schemes.
Milles, J; van der Geest, R J; Jerosch-Herold, M; Reiber, J H C; Lelieveldt, B P F
2007-01-01
This paper presents a novel method for registration of cardiac perfusion MRI. The presented method successfully corrects for breathing motion without any manual interaction using Independent Component Analysis to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of ICA, and used to compute the displacement caused by breathing for each frame. Qualitative and quantitative validation of the method is carried out using 46 clinical quality, short-axis, perfusion MR datasets comprising 100 images each. Validation experiments showed a reduction of the average LV motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration with an average error reduced from 2.65+/-7.89% to 0.87+/-3.88% between registered data and manual gold standard. We conclude that this fully automatic ICA-based method shows an excellent accuracy, robustness and computation speed, adequate for use in a clinical environment.
[Application of virtual instrumentation technique in toxicological studies].
Moczko, Jerzy A
2005-01-01
Research investigations frequently require a direct connection of measuring equipment to the computer. Virtual instrumentation considerably facilitates the programming of sophisticated acquisition-and-analysis procedures. In the standard approach, these two steps are performed sequentially with separate software tools: the acquired data are transferred, using the export/import procedures of one program, to another program that executes the next step of the analysis. This procedure is cumbersome, time-consuming, and a potential source of errors. In 1987, National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. In contrast to conventional textual languages, it allows the researcher to concentrate on the problem being solved rather than on syntactic rules. Programs developed in LabVIEW are called virtual instruments (VI) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA/RISC workstations. This flexibility ensures that programs prepared for one platform are also suitable for another. The presented paper describes the basic principles of connecting research equipment to computer systems.
NASA Astrophysics Data System (ADS)
Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian
2010-05-01
Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques, and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the computational modelling and data analysis centers existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will emerge as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research, and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user oriented, and 3) continuously developing and rapidly upgrading computational and data analysis service to support and intensify European planetary scientific research. Initially, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) development and implementation of tools for remote interactive communication between planetary scientists and computing experts (including related RIs); (b) development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) development of a prototype of coordinated, interconnected simulations of planetary phenomena and objects (global multi-model simulators); and (f) providing demonstrators of the coordinated use of high-performance computing facilities (supercomputer networks), in cooperation with the European HPC Grid DEISA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunn, B. D.; Diamond, S. C.; Bennett, G. A.
1977-10-01
A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand in detail the Cal-ERDA set of computer programs. The computer programs described include: an EXECUTIVE processor to create computer system control commands; a BDL processor to analyze input instructions, execute computer system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effect of the ambient weather conditions, the internal occupancy, lighting, and equipment within the building, as well as variations in the size, location, orientation, construction, walls, roofs, floors, fenestration, attachments (awnings, balconies), and shape of a building; a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components, including fans, coils, economizers, and humidifiers, in 16 standard configurations operated according to various temperature and humidity control schedules; a PLANT equipment program that models the operation of boilers, chillers, electrical generation equipment (diesel or turbines), heat storage apparatus (chilled or heated water), and solar heating and/or cooling systems; an ECONOMICS analysis program that calculates life-cycle costs; and a REPORT program that produces tables of user-selected variables arranged according to user-specified formats. A set of WEATHER ANALYSIS programs manipulates, summarizes, and plots weather data. Libraries of weather data, schedule data, and building data were prepared.
Computer-aided target tracking in motion analysis studies
NASA Astrophysics Data System (ADS)
Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.
1990-08-01
Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.
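A minimal sketch of the core operation in automated target tracking, assuming grayscale frames as NumPy arrays: an exhaustive normalized cross-correlation search locates the fiducial template in a frame. Real systems add search windows around the previous position, subpixel refinement, and track management, none of which are shown here.

```python
import numpy as np

def track_target(frame, template):
    """Locate `template` in `frame` by brute-force normalized cross-correlation (sketch)."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            w = frame[y:y + th, x:x + tw] - frame[y:y + th, x:x + tw].mean()
            denom = np.linalg.norm(w) * t_norm
            score = (w * t).sum() / denom if denom > 0 else -np.inf
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score   # top-left corner of the best match and its correlation
```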
Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge
2016-07-29
Science Foundation (NSF), Department of Defense (DOD), National Institute of Standards and Technology (NIST), Intelligence Community (IC) Introduction...multiple Federal agencies: • Intelligent big data sensors that act autonomously and are programmable via the network for increased flexibility, and... intelligence for scientific discovery enabled by rapid extreme-scale data analysis, capable of understanding and making sense of results and thereby
ERIC Educational Resources Information Center
Pham, Vinh Huy
2009-01-01
Stakeholders of the educational system assume that standardized tests are transparently about the subject content being tested and therefore can be used as a metric to measure achievement in outcome-based educational reform. Both analysis of longitudinal data for the Texas Assessment of Knowledge and Skills (TAKS) exam and agent based computer…
Data needs for X-ray astronomy satellites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, T.
I review the current status of atomic data for X-ray astronomy satellites. This includes some of the astrophysical issues which can be addressed, current modeling and analysis techniques, computational tools, the limitations imposed by currently available atomic data, and the validity of standard assumptions. I also discuss the future: challenges associated with future missions and goals for atomic data collection.
Evaluation of standard radiation atmosphere aerosol models for a coastal environment
NASA Technical Reports Server (NTRS)
Whitlock, C. H.; Suttles, J. T.; Sebacher, D. I.; Fuller, W. H.; Lecroy, S. R.
1986-01-01
Calculations are compared with data from an experiment to evaluate the utility of standard radiation atmosphere (SRA) models for defining aerosol properties in atmospheric radiation computations. Initial calculations with only SRA aerosols in a four-layer atmospheric column simulation allowed a sensitivity study and the detection of spectral trends in optical depth, which differed from measurements. Subsequently, a more detailed analysis provided a revision in the stratospheric layer, which brought calculations in line with both optical depth and skylight radiance data. The simulation procedure allows determination of which atmospheric layers influence both downwelling and upwelling radiation spectra.
Annotare—a tool for annotating high-throughput biomedical investigations and resulting data
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J.; Ball, Catherine A.
2010-01-01
Summary: Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Availability and Implementation: Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows. Contact: rshankar@stanford.edu PMID:20733062
Topologically preserving straightening of spinal cord MRI.
De Leener, Benjamin; Mangeat, Gabriel; Dupont, Sara; Martin, Allan R; Callot, Virginie; Stikov, Nikola; Fehlings, Michael G; Cohen-Adad, Julien
2017-10-01
To propose a robust and accurate method for straightening magnetic resonance (MR) images of the spinal cord, based on spinal cord segmentation, that preserves spinal cord topology and works for any MRI contrast, in the context of spinal cord template-based analysis. The spinal cord curvature was computed using an iterative Non-Uniform Rational B-Spline (NURBS) approximation. Forward and inverse deformation fields for straightening were computed by solving the straightening equations analytically for each image voxel. Computational speed-up was accomplished by solving all voxel equation systems as one single system. Straightening accuracy (mean and maximum distance from a straight line), computational time, and robustness to spinal cord length were evaluated using the proposed and the standard straightening method (label-based spline deformation) on 3T T2- and T1-weighted images from 57 healthy subjects and 33 patients with spinal cord compression due to degenerative cervical myelopathy (DCM). The proposed algorithm was more accurate, more robust, and faster than the standard method (mean distance = 0.80 vs. 0.83 mm, maximum distance = 1.49 vs. 1.78 mm, time = 71 vs. 174 sec for the healthy population and mean distance = 0.65 vs. 0.68 mm, maximum distance = 1.28 vs. 1.55 mm, time = 32 vs. 60 sec for the DCM population). A novel image straightening method that enables template-based analysis of quantitative spinal cord MRI data is introduced. This algorithm works for any MRI contrast and was validated on healthy and patient populations. The presented method is implemented in the Spinal Cord Toolbox, open-source software for processing spinal cord MRI data. Level of Evidence: 1. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1209-1219. © 2017 International Society for Magnetic Resonance in Medicine.
Comparison of custom to standard TKA instrumentation with computed tomography.
Ng, Vincent Y; Arnott, Lindsay; Li, Jia; Hopkins, Ronald; Lewis, Jamie; Sutphen, Sean; Nicholson, Lisa; Reader, Douglas; McShane, Michael A
2014-08-01
There is conflicting evidence whether custom instrumentation (CI) for total knee arthroplasty (TKA) improves component position compared to standard instrumentation. Studies have relied on long-limb radiographs, which are limited to two-dimensional (2D) analysis and subject to rotational inaccuracy. We used postoperative computed tomography (CT) to evaluate preoperative three-dimensional templating and CI to facilitate accurate and efficient implantation of TKA femoral and tibial components. We prospectively evaluated a single-surgeon cohort of 78 TKA patients (51 custom, 27 standard) with postoperative CT scans using 3D reconstruction and contour-matching technology to preoperative imaging. Component alignment was measured in the coronal, sagittal and axial planes. Preoperative templating for custom instrumentation was 87 and 79 % accurate for femoral and tibial component size. All custom components were within 1 size except for the tibial component in one patient (2 sizes). Tourniquet time was 5 min longer for custom (30 min) than standard (25 min). In no case was custom instrumentation aborted in favour of standard instrumentation, nor did the original alignment of custom instrumentation require intraoperative adjustment. There were more outliers greater than 2° from intended alignment with standard instrumentation than custom for both components in all three planes. Custom instrumentation was more accurate in component position for tibial coronal alignment (custom: 1.5° ± 1.2°; standard: 3° ± 1.9°; p = 0.0001) and both tibial (custom: 1.4° ± 1.1°; standard: 16.9° ± 6.8°; p < 0.0001) and femoral (custom: 1.2° ± 0.9°; standard: 3.1° ± 2.1°; p < 0.0001) rotational alignment, and was similar to standard instrumentation in other measurements. When evaluated with CT, custom instrumentation performs similarly to or better than standard instrumentation in component alignment and accurately templates component size. Tourniquet time was mildly increased for custom compared to standard.
Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L
2016-02-01
Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by helping standardize radiologists' performance at a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to those produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
Parallel ICA and its hardware implementation in hyperspectral image analysis
NASA Astrophysics Data System (ADS)
Du, Hongtao; Qi, Hairong; Peterson, Gregory D.
2004-04-01
Advances in hyperspectral imaging have dramatically boosted remote sensing applications by providing abundant information using hundreds of contiguous spectral bands. However, the high volume of information also results in an excessive computational burden. Since most materials have specific characteristics only at certain bands, much of this information is redundant. This property of hyperspectral images has motivated many researchers to study various dimensionality reduction algorithms, including Projection Pursuit (PP), Principal Component Analysis (PCA), wavelet transform, and Independent Component Analysis (ICA), where ICA is one of the most popular techniques. It searches for a linear or nonlinear transformation that minimizes the statistical dependence between spectral bands. Through this process, ICA can eliminate superfluous information while retaining useful information, given only the observations of hyperspectral images. One hurdle of applying ICA in hyperspectral image (HSI) analysis, however, is its long computation time, especially for high-volume hyperspectral data sets. Even the most efficient method, FastICA, is a very time-consuming process. In this paper, we present a parallel ICA (pICA) algorithm derived from FastICA. During the unmixing process, pICA divides the estimation of the weight matrix into sub-processes which can be conducted in parallel on multiple processors. The decorrelation process is decomposed into internal decorrelation and external decorrelation, which perform weight vector decorrelations within individual processors and between cooperative processors, respectively. In order to further improve the performance of pICA, we seek hardware solutions for the implementation of pICA. Until now, there have been very few hardware designs for ICA-related processes due to the complicated and iterative computation. This paper discusses capacity limitations of FPGA implementations for pICA in HSI analysis. An Application-Specific Integrated Circuit (ASIC) synthesis is designed for pICA-based dimensionality reduction in HSI analysis. The pICA design is implemented using standard-height cells and aimed at a TSMC 0.18 micron process. During the synthesis procedure, three ICA-related reconfigurable components are developed for reuse and retargeting purposes. Preliminary results show that the standard-height-cell-based ASIC synthesis provides an effective solution for pICA and ICA-related processes in HSI analysis.
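As a rough illustration of the baseline that pICA accelerates, the sketch below applies scikit-learn's FastICA to a hyperspectral cube reshaped to (pixels x bands); the parallel weight-matrix estimation and the internal/external decorrelation steps described in the abstract are not reproduced, and the cube here is synthetic.

```python
# Minimal sketch of FastICA-based band reduction for a hyperspectral cube.
import numpy as np
from sklearn.decomposition import FastICA

def reduce_hsi_bands(cube, n_components=10, seed=0):
    """cube: (rows, cols, bands) hyperspectral image -> (rows, cols, n_components)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)   # each pixel is one observation
    ica = FastICA(n_components=n_components, random_state=seed, max_iter=500)
    S = ica.fit_transform(X)                         # statistically independent components
    return S.reshape(rows, cols, n_components)

if __name__ == "__main__":
    cube = np.random.rand(32, 32, 100)               # synthetic stand-in for HSI data
    print(reduce_hsi_bands(cube, n_components=8).shape)   # (32, 32, 8)
```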
Eberl, D.D.
2003-01-01
RockJock is a computer program that determines quantitative mineralogy in powdered samples by comparing the integrated X-ray diffraction (XRD) intensities of individual minerals in complex mixtures to the intensities of an internal standard. Analysis without an internal standard (standardless analysis) also is an option. This manual discusses how to prepare and X-ray samples and mineral standards for these types of analyses and describes the operation of the program. Carefully weighed samples containing an internal standard (zincite) are ground in a McCrone mill. Randomly oriented preparations then are X-rayed, and the X-ray data are entered into the RockJock program. Minerals likely to be present in the sample are chosen from a list of standards, and the calculation is begun. The program then automatically fits the sum of stored XRD patterns of pure standard minerals (the calculated pattern) to the measured pattern by varying the fraction of each mineral standard pattern, using the Solver function in Microsoft Excel to minimize a degree of fit parameter between the calculated and measured pattern. The calculation analyzes the pattern (usually 20 to 65 degrees two-theta) to find integrated intensities for the minerals. Integrated intensities for each mineral then are determined from the proportion of each mineral standard pattern required to give the best fit. These integrated intensities then are compared to the integrated intensity of the internal standard, and the weight percentages of the minerals are calculated. The results are presented as a list of minerals with their corresponding weight percent. To some extent, the quality of the analysis can be checked because each mineral is analyzed independently, and, therefore, the sum of the analysis should approach 100 percent. Also, the method has been shown to give good results with artificial mixtures. The program is easy to use, but does require an understanding of mineralogy, of X-ray diffraction practice, and an elementary knowledge of the Excel program.
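A minimal sketch of the fitting step described above, assuming non-negative least squares in place of the Excel Solver: the measured pattern is fit as a combination of stored standard patterns, and the fitted coefficients stand in for the proportions from which integrated intensities and weight percentages are derived. The peak positions and mineral set are illustrative only.

```python
# Fit a measured XRD pattern as a non-negative combination of standard patterns.
import numpy as np
from scipy.optimize import nnls

def fit_mineral_fractions(measured, standards):
    """measured: (n_points,) intensities; standards: dict name -> (n_points,) pure pattern."""
    names = list(standards)
    A = np.column_stack([standards[n] for n in names])  # basis of standard-mineral patterns
    coeffs, residual = nnls(A, measured)                # non-negative least-squares fit
    return dict(zip(names, coeffs)), residual

# Toy two-theta grid and synthetic "standards"; zincite plays the internal standard.
two_theta = np.linspace(20, 65, 900)
standards = {
    "quartz":  np.exp(-((two_theta - 26.6) / 0.2) ** 2),
    "calcite": np.exp(-((two_theta - 29.4) / 0.2) ** 2),
    "zincite": np.exp(-((two_theta - 36.3) / 0.2) ** 2),
}
measured = 0.6 * standards["quartz"] + 0.3 * standards["calcite"] + 0.1 * standards["zincite"]
fractions, _ = fit_mineral_fractions(measured, standards)
print(fractions)   # recovers roughly quartz 0.6, calcite 0.3, zincite 0.1
```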
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, P.W.; Wooten, J.W.
1980-05-01
Four software tools, PFORT, DAVE, POLISH, and BRNANL, which may be used to ensure the standardization of FORTRAN software, are introduced. First, FORTRAN computer programs are loosely classified into three groups. Then reasons are given why the programs in two of these groups should adhere to a portable subset of the American National Standard (ANS) FORTRAN of 1966. Next, the software tools PFORT, DAVE, POLISH, and BRNANL are briefly described, and examples of the output from PFORT, DAVE, and POLISH are given. Finally, the dissemination of information pertaining to the tools, together with their availability, is outlined. 11 figures.
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of an MD simulation and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named the ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. This novel method for analyzing the ion conduction mechanisms of ion channels can be used easily by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
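A minimal sketch of post-processing an ion-binding state graph exported in GML, using networkx; the file name and the "weight" edge attribute are assumptions, since the exact attributes written by IBiSA_tools are not given in the abstract.

```python
# Load a GML ion-binding-state graph and rank states by transition weight.
import networkx as nx

G = nx.read_gml("ion_binding_states.gml")          # hypothetical output file name
print(G.number_of_nodes(), "binding states,", G.number_of_edges(), "transitions")

# Rank states by total incident transition weight (falls back to degree if unweighted).
ranking = sorted(G.nodes, key=lambda n: G.degree(n, weight="weight"), reverse=True)
print("most visited states:", ranking[:5])
```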
Earth Science Informatics Comes of Age
NASA Technical Reports Server (NTRS)
Jodha, Siri; Khalsa, S.; Ramachandran, Rahul
2014-01-01
The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decisionmaking and other applications for societal benefits.
Scalable Faceted Ranking in Tagging Systems
NASA Astrophysics Data System (ADS)
Orlicki, José I.; Alvarez-Hamelin, J. Ignacio; Fierens, Pablo I.
Nowadays, collaborative web tagging systems, which allow users to upload, comment on, and recommend content, are growing. Such systems can be represented as graphs where nodes correspond to users and tagged links to recommendations. In this paper we analyze the problem of computing a ranking of users with respect to a facet described as a set of tags. A straightforward solution is to compute a PageRank-like algorithm on a facet-related graph, but this is not feasible for online computation. We propose an alternative: (i) a ranking for each tag is computed offline on the basis of tag-related subgraphs; (ii) a faceted order is generated online by merging the rankings corresponding to all the tags in the facet. Based on graph analysis of YouTube and Flickr, we show that step (i) is scalable. We also present efficient algorithms for step (ii), which are evaluated by comparing their results with two gold standards.
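A sketch of the two-step scheme under stated assumptions: offline, a PageRank-style ranking is computed on each tag-related subgraph; online, the per-tag rankings for a facet are merged. Summing scores is an assumed merge rule; the paper's own merging algorithms may differ.

```python
# Offline per-tag PageRank plus a simple online merge for a facet of tags.
from collections import defaultdict
import networkx as nx

def offline_tag_rankings(G, edge_tag_attr="tags"):
    """G: directed user graph whose edges carry a collection of tags."""
    rankings = {}
    all_tags = {t for _, _, d in G.edges(data=True) for t in d.get(edge_tag_attr, ())}
    for tag in all_tags:
        sub = G.edge_subgraph(
            [(u, v) for u, v, d in G.edges(data=True) if tag in d.get(edge_tag_attr, ())]
        )
        rankings[tag] = nx.pagerank(sub)        # PageRank restricted to this tag's subgraph
    return rankings

def faceted_ranking(rankings, facet):
    """Merge the offline per-tag rankings for the tags in a facet (assumed: sum of scores)."""
    scores = defaultdict(float)
    for tag in facet:
        for user, score in rankings.get(tag, {}).items():
            scores[user] += score
    return sorted(scores, key=scores.get, reverse=True)
```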
Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Goldberg, J.; Kautz, W. H.; Melliar-Smith, P. M.; Green, M. W.; Levitt, K. N.; Schwartz, R. L.; Weinstock, C. B.
1984-01-01
SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct, processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
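A toy illustration of the error-masking idea, not the SIFT executive itself: replicate a computation across processors and take a majority vote over the returned results.

```python
# Mask a faulty replica by majority voting over replicated computation results.
from collections import Counter

def majority_vote(results):
    """results: list of values returned by identical computations on different processors."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority -- too many faulty replicas")
    return value

print(majority_vote([42, 42, 41]))   # the faulty third replica is out-voted -> 42
```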
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
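A minimal sketch of the quality measures described above, assuming the three homogeneous regions have already been segmented into boolean masks; the nominal reference HU values are common textbook figures, not the paper's calibration targets.

```python
# Noise and calibration measures from pre-segmented homogeneous CT regions.
import numpy as np

NOMINAL_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 40.0}

def region_quality(ct_volume, masks):
    """ct_volume: 3-D array of HU values; masks: dict name -> boolean array of same shape."""
    report = {}
    for name, mask in masks.items():
        values = ct_volume[mask]
        report[name] = {
            "mean_HU": float(values.mean()),                     # intensity calibration
            "noise_SD": float(values.std(ddof=1)),               # image noise
            "bias_HU": float(values.mean() - NOMINAL_HU[name]),  # deviation from nominal value
        }
    return report
```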
Geiss, Karla; Meyer, Martin
2013-09-01
Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population to that in the general population on an age- and calendar-time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on calculation of exact person-years at risk stratified by sex, age, and calendar time. The program offers flexible import of different file formats for input data and easy handling of general population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
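A minimal sketch of the standardized mortality ratio computation that such a program automates, with made-up strata and rates: expected deaths are accumulated over sex/age/period strata as person-years times the general-population rate, and the SMR is observed over expected.

```python
# Standardized mortality ratio from stratified person-years and reference rates.
def smr(observed_deaths, person_years, reference_rates):
    """person_years and reference_rates: dicts keyed by (sex, age_group, period) strata."""
    expected = sum(person_years[s] * reference_rates[s] for s in person_years)
    return observed_deaths / expected, expected

person_years = {("M", "60-64", "2005-2009"): 1200.0, ("M", "65-69", "2005-2009"): 800.0}
rates = {("M", "60-64", "2005-2009"): 0.012, ("M", "65-69", "2005-2009"): 0.020}
ratio, expected = smr(observed_deaths=35, person_years=person_years, reference_rates=rates)
print(f"expected = {expected:.1f}, SMR = {ratio:.2f}")   # expected = 30.4, SMR = 1.15
```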
Greenfeld, Max; van de Meent, Jan-Willem; Pavlichin, Dmitri S; Mabuchi, Hideo; Wiggins, Chris H; Gonzalez, Ruben L; Herschlag, Daniel
2015-01-16
Single-molecule techniques have emerged as incisive approaches for addressing a wide range of questions arising in contemporary biological research [Trends Biochem Sci 38:30-37, 2013; Nat Rev Genet 14:9-22, 2013; Curr Opin Struct Biol 2014, 28C:112-121; Annu Rev Biophys 43:19-39, 2014]. The analysis and interpretation of raw single-molecule data benefits greatly from the ongoing development of sophisticated statistical analysis tools that enable accurate inference at the low signal-to-noise ratios frequently associated with these measurements. While a number of groups have released analysis toolkits as open source software [J Phys Chem B 114:5386-5403, 2010; Biophys J 79:1915-1927, 2000; Biophys J 91:1941-1951, 2006; Biophys J 79:1928-1944, 2000; Biophys J 86:4015-4029, 2004; Biophys J 97:3196-3205, 2009; PLoS One 7:e30024, 2012; BMC Bioinformatics 288 11(8):S2, 2010; Biophys J 106:1327-1337, 2014; Proc Int Conf Mach Learn 28:361-369, 2013], it remains difficult to compare analysis for experiments performed in different labs due to a lack of standardization. Here we propose a standardized single-molecule dataset (SMD) file format. SMD is designed to accommodate a wide variety of computer programming languages, single-molecule techniques, and analysis strategies. To facilitate adoption of this format we have made two existing data analysis packages that are used for single-molecule analysis compatible with this format. Adoption of a common, standard data file format for sharing raw single-molecule data and analysis outcomes is a critical step for the emerging and powerful single-molecule field, which will benefit both sophisticated users and non-specialists by allowing standardized, transparent, and reproducible analysis practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-02-01
Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.
Space crew radiation exposure analysis system based on a commercial stand-alone CAD system
NASA Technical Reports Server (NTRS)
Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.
1992-01-01
Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.
Design and analysis of roll cage
NASA Astrophysics Data System (ADS)
Angadi, Gurusangappa; Chetan, S.
2018-04-01
Wildland fire fighting vehicles are used to extinguish fires in forests; in this process, vehicles face falling objects such as rocks, tree branches, and other debris. In addition, uneven terrain conditions such as cliff edges and uneven surfaces can cause the vehicle to roll over, which can injure both the driver and the operator. Vehicle rollover is a common incident that causes fatal injuries to the operator and is second only to crash accidents. In order to reduce the injury level and prevent continuous rollover of the vehicle, it is necessary to equip the vehicle with a suitable roll cage according to the applicable standards. In the present work, a roll cage for the pump operator of a wildland fire fighting vehicle, whose seating position is outside the cabin, is designed and the analysis is carried out in a computer-simulated environment. Design and test procedures follow the NFPA 1906 standard for wildland fire apparatus and are carried out in HyperWorks in accordance with SAE J1194 (1983). G-load, roof crush, and pendulum impact analyses are carried out on the roll cage to ensure the safety of the design; these load cases are considered to represent situations faced in forest terrain. In these test procedures the roll cage is analyzed for stresses and deformation in the various load cases, and the results are compared with the limits specified in SAE J1194 (1983).
Lanotte, M; Cavallo, M; Franzini, A; Grifi, M; Marchese, E; Pantaleoni, M; Piacentino, M; Servello, D
2010-09-01
Deep brain stimulation (DBS) alleviates symptoms of many neurological disorders by applying electrical impulses to the brain by means of implanted electrodes, generally put in place using a conventional stereotactic frame. A new image guided disposable mini-stereotactic system has been designed to help shorten and simplify DBS procedures when compared to standard stereotaxy. A small number of studies have been conducted which demonstrate localization accuracies of the system similar to those achievable by the conventional frame. However no data are available to date on the economic impact of this new frame. The aim of this paper was to develop a computational model to evaluate the investment required to introduce the image guided mini-stereotactic technology for stereotactic DBS neurosurgery. A standard DBS patient care pathway was developed and related costs were analyzed. A differential analysis was conducted to capture the impact of introducing the image guided system on the procedure workflow. The analysis was carried out in five Italian neurosurgical centers. A computational model was developed to estimate upfront investments and surgery costs leading to a definition of the best financial option to introduce the new frame. Investments may vary from Euro 1.900 (purchasing of Image Guided [IG] mini-stereotactic frame only) to Euro 158.000.000. Moreover the model demonstrates how the introduction of the IG mini-stereotactic frame doesn't substantially affect the DBS procedure costs.
Vugts, Miel A P; Joosen, Margot C W; van der Geer, Jessica E; Zedlitz, Aglaia M E E; Vrijhoef, Hubertus J M
2018-01-01
Computer-based interventions target improvement of physical and emotional functioning in patients with chronic pain and functional somatic syndromes. However, it is unclear to what extent these interventions work, which interventions work best, and for whom. This systematic review and meta-analysis (registered at PROSPERO, 2016: CRD42016050839) assesses efficacy relative to passive and active control conditions, and explores patient and intervention factors. Controlled studies were identified from MEDLINE, EMBASE, PsychInfo, Web of Science, and the Cochrane Library. Pooled standardized mean differences by comparison type were calculated for somatic symptom, health-related quality of life, functional interference, catastrophizing, and depression outcomes at post-treatment and at 6 or more months of follow-up. Risk of bias was assessed. Sub-group analyses were performed by patient and intervention characteristics when heterogeneous outcomes were observed. At most, 30 of the 46 eligible studies and 3,387 participants were included per meta-analysis. The identified interventions were mostly internet-based cognitive behavioral therapies. Significantly better patient-reported outcomes were found in comparisons with passive control groups (standardized mean differences ranged between -0.41 and -0.18), but not in comparisons with active control groups (SMD = -0.26 to -0.14). For some outcomes, significant heterogeneity was related to patient and intervention characteristics. To conclude, there is a minority of good-quality evidence for small positive average effects of computer-based (cognitive) behavior change interventions, similar to traditional modes, and these effects may be sustainable. Indications were found as to which interventions work better, or more consistently across outcomes, for which patients. Future process analyses are recommended with the aim of better understanding individual chances of clinically relevant outcomes.
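For readers unfamiliar with the pooling behind the quoted SMD ranges, the sketch below shows inverse-variance (fixed-effect) pooling of standardized mean differences on fabricated toy studies; the review itself presumably used random-effects models and bias corrections not reproduced here.

```python
# Fixed-effect inverse-variance pooling of standardized mean differences (Cohen's d).
import math

def smd_and_variance(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d for two groups and its approximate sampling variance."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

def pooled_smd(studies):
    """Pool (d, var) pairs with inverse-variance weights; returns pooled SMD and its SE."""
    weights = [1.0 / var for _, var in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

studies = [smd_and_variance(10.2, 4.1, 60, 12.0, 4.3, 58),
           smd_and_variance(8.9, 3.8, 110, 9.8, 4.0, 105)]
print(pooled_smd(studies))   # pooled SMD (negative favours the intervention here)
```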
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive use of computing resources, both for the reconstruction of measured events and for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the use of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.
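A minimal sketch of provisioning one virtual machine through the openstacksdk Python client, roughly the kind of operation such a private cloud automates; the cloud name, image, flavor, and network names are hypothetical.

```python
# Boot one worker VM on an OpenStack cloud using openstacksdk.
import openstack

# Connect using a cloud entry from clouds.yaml (the name "ekp-cloud" is hypothetical).
conn = openstack.connect(cloud="ekp-cloud")

image = conn.compute.find_image("hep-worker-node")   # hypothetical VM image name
flavor = conn.compute.find_flavor("m1.large")        # hypothetical flavor
network = conn.network.find_network("private")       # hypothetical tenant network

server = conn.compute.create_server(
    name="mc-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)        # block until the VM is ACTIVE
print(server.status)
```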
Analyzing huge pathology images with open source software.
Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc
2013-06-06
Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights into systems biology. However, such virtual slides pose a technical challenge since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. The tools are cross-platform, independent of proprietary libraries, and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre that does image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
Liu, Yan; Yang, Dong; Xiong, Fen; Yu, Lan; Ji, Fei; Wang, Qiu-Ju
2015-09-01
Hearing loss affects more than 27 million people in mainland China. It would be helpful to develop a portable, self-testing audiometer for the timely detection of hearing loss so that the optimal clinical therapeutic schedule can be determined. The objective of this study was to develop a software-based hearing self-testing system. The software-based self-testing system consisted of a notebook computer, an external sound card, and a pair of 10-Ω insert earphones. The system could be used by individuals to test their own hearing thresholds in an interactive manner using the software. The reliability and validity of the system at octave frequencies from 0.25 to 8.0 kHz were analyzed in three series of experiments. Thirty-seven normal-hearing participants (74 ears) were enrolled in experiment 1. Forty individuals (80 ears) with sensorineural hearing loss (SNHL) participated in experiment 2. Thirteen normal-hearing participants (26 ears) and 37 participants (74 ears) with SNHL were enrolled in experiment 3. Each participant was enrolled in only one of the three experiments. In all experiments, pure-tone audiometry in a sound insulation room (standard test) was regarded as the gold standard. SPSS for Windows, version 17.0, was used for statistical analysis. The paired t-test was used to compare the hearing thresholds between the standard test and software-based self-testing (self-test) in experiments 1 and 2. In experiment 3 (main study), one-way analysis of variance and post hoc comparisons were used to compare the hearing thresholds among the standard test and two rounds of the self-test. Linear correlation analysis was carried out for the self-tests performed twice. The concordance between the standard test and the self-test was analyzed using the kappa method. p < 0.05 was considered statistically significant. Experiments 1 and 2: The hearing thresholds determined by the two methods were not significantly different at frequencies of 250, 500, or 8000 Hz (p > 0.05) but were significantly different at frequencies of 1000, 2000, and 4000 Hz (p < 0.05), except for 1000 Hz in the right ear in experiment 2. Experiment 3: The hearing thresholds determined by the standard test and the self-tests repeated twice were not significantly different at any frequency (p > 0.05). The overall sensitivity of the self-test method was 97.6%, and the specificity was 98.3%. The sensitivity was 97.6% and the specificity was 97% for the patients with SNHL. The self-test had significant concordance with the standard test (kappa value = 0.848, p < 0.001). This portable hearing self-testing system based on a notebook personal computer is a reliable and sensitive method for hearing threshold assessment and monitoring. American Academy of Audiology.
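A sketch of the statistical comparisons named above (paired t-test on thresholds and kappa-based concordance), using SciPy and scikit-learn in place of the SPSS routines used in the study; the threshold arrays and the 25 dB HL pass/fail cutoff are placeholders.

```python
# Paired t-test between standard-test and self-test thresholds, plus Cohen's kappa.
import numpy as np
from scipy.stats import ttest_rel
from sklearn.metrics import cohen_kappa_score

standard_db = np.array([15, 20, 35, 50, 25, 40, 10, 30])   # thresholds (dB HL), standard test
selftest_db = np.array([15, 25, 35, 45, 25, 40, 15, 30])   # thresholds (dB HL), self-test

t_stat, p_value = ttest_rel(standard_db, selftest_db)       # paired comparison
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Concordance of a pass/fail screen (>25 dB HL treated as hearing loss here).
kappa = cohen_kappa_score(standard_db > 25, selftest_db > 25)
print(f"Cohen's kappa = {kappa:.2f}")
```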
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The appendices A-K to the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. This includes installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state-of-the-art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design are evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
..., Airborne Automatic Dead Reckoning Computer Equipment Utilizing Aircraft Heading and Doppler Ground Speed.... ACTION: Notice of intent to cancel Technical Standard Order (TSO)-C68a, Airborne automatic dead reckoning... dead reckoning computer equipment utilizing aircraft heading and Doppler ground speed and drift angle...
ERIC Educational Resources Information Center
Branstad, Dennis K., Ed.
The 15 papers and summaries of presentations in this collection provide technical information and guidance offered by representatives from federal agencies and private industry. Topics discussed include physical security, risk assessment, software security, computer network security, and applications and implementation of the Data Encryption…
LINCS: Livermore's network architecture. [Octopus computing network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.
1982-01-01
Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols and why we are dissatisfied by the directions that current protocol standards are taking.
Anisotropic resonator analysis using the Fourier-Bessel mode solver
NASA Astrophysics Data System (ADS)
Gauthier, Robert C.
2018-03-01
A numerical mode solver for optical structures that conform to cylindrical symmetry, using Faraday's and Ampere's laws as starting expressions, is developed for cases where electric or magnetic anisotropy is present. The technique builds on the existing Fourier-Bessel mode solver, which allows resonator states to be computed by exploiting the symmetry properties of the resonator and states to reduce the matrix system. The introduction of anisotropy into the theoretical framework facilitates the inclusion of PML borders, permitting the computation of open-ended structures and a better estimation of the resonator state quality factor. Matrix populating expressions are provided that can accommodate any material anisotropy with arbitrary orientation in the computation domain. Several examples of electrically anisotropic computations are provided for rotationally symmetric structures such as standard optical fibers, axial Bragg-ring fibers, and bottle resonators. The anisotropy present in the materials introduces off-diagonal matrix elements in the permittivity tensor when expressed in cylindrical coordinates. The effects of the anisotropy on the computed states are presented and discussed.
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
Large-scale inverse model analyses employing fast randomized data reduction
NASA Astrophysics Data System (ADS)
Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan
2017-08-01
When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
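A small NumPy illustration of the "sketching" data-reduction step under stated assumptions: a short Gaussian random matrix compresses a long observation vector so that the inverse analysis can work with the reduced data. The actual RGA is implemented in Julia within MADS; only the dimension reduction is shown here, not the geostatistical inversion.

```python
# Compress a long observation vector with a dense Gaussian sketching matrix.
import numpy as np

rng = np.random.default_rng(0)
n_obs, k = 50_000, 100                          # real problems reach ~1e7 observations; kept small here

d = rng.normal(size=n_obs)                      # stand-in for the long observation vector
S = rng.normal(size=(k, n_obs)) / np.sqrt(k)    # Gaussian sketching matrix

d_sketch = S @ d                                # reduced data actually used in the inversion
print(d.shape, "->", d_sketch.shape)            # (50000,) -> (100,)
```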
Fiz, Francesco; Marini, Cecilia; Piva, Roberta; Miglino, Maurizio; Massollo, Michela; Bongioanni, Francesca; Morbelli, Silvia; Bottoni, Gianluca; Campi, Cristina; Bacigalupo, Andrea; Bruzzi, Paolo; Frassoni, Francesco; Piana, Michele; Sambuceti, Gianmario
2014-06-01
To assess the presence of alteration of bone structure and bone marrow metabolism in adult patients who were suspected of having advanced chronic lymphocytic leukemia (ACLL) by using a computational prognostic model that was based on computational analysis of positron emission tomography (PET)/computed tomography (CT) images. In this retrospective study, all patients signed written informed consent as a requisite to undergo PET/CT examination. However, due to its observational nature, approval from the ethical committee was not deemed necessary. Twenty-two previously untreated chronic lymphocytic leukemia patients underwent PET/CT for disease progression. PET/CT images were analyzed by using dedicated software, capable of recognizing an external 2-pixel bone ring whose Hounsfield coefficient served as cutoff to recognize trabecular and compact bone. PET/CT data from 22 age- and sex-matched control subjects were used as comparison. All data are reported as means ± standard deviations. The Student t test, log-rank, or Cox proportional hazards model were used as appropriate, considering a difference with a P value of less than .05 as significant. Trabecular bone was expanded in ACLL patients and occupied a larger fraction of the skeleton with respect to control subjects (mean, 39% ± 5 [standard deviation] vs 31% ± 7; ie, 32 of 81 mL/kg of ideal body weight vs 27 of 86 mL/kg of ideal body weight, respectively; P < .001). After stratification according to median value, patients with a ratio of trabecular to skeletal bone volume of more than 37.3% showed an actuarial 2-year survival of 18%, compared with 82% for those with a ratio of less than 37.3% (P < .001), independent from age, sex, biological markers, and disease duration. These data suggest that computational assessment of skeletal alterations might represent a new window for prediction of the clinical course of the disease.
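A minimal sketch of the volumetric ratio used above, assuming a skeleton mask and a fixed Hounsfield cutoff are already available (the paper derives the cutoff from an external 2-pixel bone ring; the value used here is a placeholder).

```python
# Trabecular-to-skeletal volume fraction from a CT volume and a skeleton mask.
import numpy as np

def trabecular_fraction(ct_hu, skeleton_mask, cortical_cutoff_hu=600.0):
    bone = ct_hu[skeleton_mask]
    trabecular = bone < cortical_cutoff_hu       # below the cutoff -> trabecular bone
    return trabecular.sum() / bone.size          # fraction of skeletal volume

# Synthetic toy volume, every voxel treated as skeleton.
ct = np.random.normal(loc=700, scale=300, size=(40, 40, 40))
mask = np.ones_like(ct, dtype=bool)
print(f"trabecular/skeletal volume = {trabecular_fraction(ct, mask):.2%}")
```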
A cloud-based information repository for bridge monitoring applications
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.
2016-04-01
This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, with instrumentation of sensors in particular, collects a significant amount of data. In addition to sensor data, a wide variety of information such as bridge geometry, analysis models, and sensor descriptions needs to be stored. Data management plays an important role in facilitating data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed and provide a means to enable integration and facilitate interoperability, current BrIM standards support mostly information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSI Bridge, a commercial software package widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for the data repository. Cloud service infrastructure is deployed to enhance the scalability, flexibility, and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.
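A minimal sketch of storing sensor metadata and readings in a NoSQL document store, here MongoDB via pymongo; the database, collection, and field names are hypothetical and only loosely inspired by SensorML-style descriptions.

```python
# Store and query bridge-monitoring documents in MongoDB.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # local instance; a cloud deployment would differ
db = client["bridge_monitoring"]                    # hypothetical database name

# Sensor metadata (SensorML-like fields flattened into a document) and one reading.
db.sensors.insert_one({"sensor_id": "ACC-07", "type": "accelerometer",
                       "location": "midspan girder 3", "units": "g"})
db.readings.insert_one({"sensor_id": "ACC-07",
                        "timestamp": datetime.now(timezone.utc),
                        "value": 0.0123})

# Query the latest readings for one sensor.
for doc in db.readings.find({"sensor_id": "ACC-07"}).sort("timestamp", -1).limit(5):
    print(doc["timestamp"], doc["value"])
```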
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
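A worked example of the basic qHNMR quantification relation that underlies internal-calibrant methods like the one above: the analyte mass follows from the integral ratio to the calibrant, corrected for proton counts and molecular weights. The numerical values below are illustrative and do not come from the paper.

```python
# Basic qHNMR ratio quantification against an internal calibrant.
def qhnmr_mass(i_analyte, i_cal, n_analyte, n_cal, mw_analyte, mw_cal, m_cal, purity_cal=1.0):
    """Mass of analyte in the NMR tube from integral ratio, proton counts, and molecular weights."""
    return (i_analyte / i_cal) * (n_cal / n_analyte) * (mw_analyte / mw_cal) * m_cal * purity_cal

# e.g. a one-proton catechin resonance integrated against a six-proton calibrant signal
m = qhnmr_mass(i_analyte=0.12, i_cal=1.00, n_analyte=1, n_cal=6,
               mw_analyte=458.4, mw_cal=94.1, m_cal=2.0, purity_cal=0.999)
print(f"analyte in tube: {m:.2f} mg")
```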
A scaling procedure for the response of an isolated system with high modal overlap factor
NASA Astrophysics Data System (ADS)
De Rosa, S.; Franco, F.
2008-10-01
The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system: it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses to increase the frequency band of validity of the discrete or continuous coordinates model through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper discusses the ASMA vs. the statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally it is shown that the linear dynamic response, predicted with the scaling procedure, has the same quality and characteristics of the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by the standard Statistical Energy Analysis (SEA).
Domain fusion analysis by applying relational algebra to protein sequence and domain databases.
Truong, Kevin; Ikura, Mitsuhiko
2003-05-06
Domain fusion analysis is a useful method for predicting functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
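A sketch of the relational formulation described above, using SQLite from Python: given a protein_domain(protein, domain) table, a self-join finds fusion proteins carrying two domains and pairs them with proteins carrying those domains separately. Table, column, and protein names are hypothetical, and the query simplifies the paper's full analysis.

```python
# Find fusion-linked protein pairs with a self-join over a protein-domain table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE protein_domain (protein TEXT, domain TEXT);
INSERT INTO protein_domain VALUES
  ('P_fused', 'DomA'), ('P_fused', 'DomB'),   -- fusion protein carrying both domains
  ('P_1', 'DomA'), ('P_2', 'DomB');           -- separate proteins, predicted to be linked
""")

query = """
SELECT a.protein AS fusion, p1.protein AS partner1, p2.protein AS partner2
FROM protein_domain a
JOIN protein_domain b  ON a.protein = b.protein AND a.domain < b.domain
JOIN protein_domain p1 ON p1.domain = a.domain AND p1.protein <> a.protein
JOIN protein_domain p2 ON p2.domain = b.domain AND p2.protein <> a.protein
WHERE p1.protein <> p2.protein;
"""
for row in conn.execute(query):
    print(row)    # ('P_fused', 'P_1', 'P_2')
```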
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles
1993-01-01
The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
Medical Signal-Conditioning and Data-Interface System
NASA Technical Reports Server (NTRS)
Braun, Jeffrey; Jacobus, Charles; Booth, Scott; Suarez, Michael; Smith, Derek; Hartnagle, Jeffrey; LePrell, Glenn
2006-01-01
A general-purpose portable, wearable electronic signal-conditioning and data-interface system is being developed for medical applications. The system can acquire multiple physiological signals (e.g., electrocardiographic, electroencephalographic, and electromyographic signals) from sensors on the wearer's body, digitize those signals that are received in analog form, preprocess the resulting data, and transmit the data to one or more remote location(s) via a radiocommunication link and/or the Internet. The system includes a computer running data-object-oriented software that can be programmed to configure the system to accept almost any analog or digital input signals from medical devices. The computing hardware and software implement a general-purpose data-routing-and-encapsulation architecture that supports tagging of input data and routing the data in a standardized way through the Internet and other modern packet-switching networks to one or more computer(s) for review by physicians. The architecture supports multiple-site buffering of data for redundancy and reliability, and supports both real-time and slower-than-real-time collection, routing, and viewing of signal data. Routing and viewing stations support insertion of automated analysis routines to aid in encoding, analysis, viewing, and diagnosis.
Appleby, Ryan; Zur Linden, Alex; Sears, William
2017-05-01
Diagnostic imaging plays an important role in the operating room, providing surgeons with a reference and surgical plan. Surgeon autonomy in the operating room has been suggested to decrease errors that stem from communication mistakes. A standard computer mouse was compared to a wireless remote-control style controller for computer game consoles (Wiimote) for the navigation of diagnostic imaging studies by sterile personnel in this prospective survey study. Participants were recruited from a cohort of residents and faculty that use the surgical suites at our institution. Outcome assessments were based on survey data completed by study participants following each use of either the mouse or Wiimote, and compared using an analysis of variance. The mouse was significantly preferred by the study participants in the categories of handling, accuracy and efficiency, and overall satisfaction (P <0.05). The mouse was preferred to both the Wiimote and to no device, when participants were asked to rank options for image navigation. This indicates the need for the implementation of intraoperative image navigation devices, to increase surgeon autonomy in the operating room. © 2017 American College of Veterinary Radiology.
Obert, Martin; Kubelt, Carolin; Schaaf, Thomas; Dassinger, Benjamin; Grams, Astrid; Gizewski, Elke R; Krombach, Gabriele A; Verhoff, Marcel A
2013-05-10
The objective of this article was to explore age-at-death estimates in forensic medicine, which were methodically based on age-dependent, radiologically defined bone-density (HC) decay and which were investigated with a standard clinical computed tomography (CT) system. Such density decay was formerly discovered with a high-resolution flat-panel CT in the skulls of adult females. The development of a standard CT methodology for age estimations--with thousands of installations--would have the advantage of being applicable everywhere, whereas only few flat-panel prototype CT systems are in use worldwide. A Multi-Slice CT scanner (MSCT) was used to obtain 22,773 images from 173 European human skulls (89 male, 84 female), taken from a population of patients from the Department of Neuroradiology at the University Hospital Giessen and Marburg during 2010 and 2011. An automated image analysis was carried out to evaluate HC of all images. The age dependence of HC was studied by correlation analysis. The prediction accuracy of age-at-death estimates was calculated. Computer simulations were carried out to explore the influence of noise on the accuracy of age predictions. Human skull HC values strongly scatter as a function of age for both sexes. Adult male skull bone-density remains constant during lifetime. Adult female HC decays during lifetime, as indicated by a correlation coefficient (CC) of -0.53. Prediction errors for age-at-death estimates for both of the used scanners are in the range of ±18 years at a 75% confidence interval (CI). Computer simulations indicate that this is the best that can be expected for such noisy data. Our results indicate that HC-decay is indeed present in adult females and that it can be demonstrated both by standard and by high-resolution CT methods, applied to different subject groups of an identical population. The weak correlation between HC and age found by both CT methods only enables a method to estimate age-at-death with limited practical relevance since the errors of the estimates are large. Computer simulations clearly indicate that data with less noise and CCs in the order of -0.97 or less would be necessary to enable age-at-death estimates with an accuracy of ±5 years at a 75% CI. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon
2017-12-01
Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to the standard Bayesian inference that suffers serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
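The following minimal sketch illustrates only the basis-function representation step that the method builds on: a noisy curve observed on a dense grid is projected onto a small basis, and downstream (Bayesian) inference then works with the low-dimensional coefficients. The Fourier basis, grid, and noise level are illustrative assumptions; the Gaussian-Wishart prior and the MCMC sampler themselves are not reproduced here.

```python
# Dimension reduction of one noisy functional observation onto a small basis.
# The Fourier basis and the synthetic curve are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)                        # high-dimensional observation grid
y = np.sin(2*np.pi*t) + 0.5*np.cos(6*np.pi*t) + 0.3*rng.standard_normal(t.size)

# Small Fourier basis: intercept plus K sine/cosine pairs.
K = 6
B = np.column_stack([np.ones_like(t)] +
                    [f(2*np.pi*j*t) for j in range(1, K + 1) for f in (np.sin, np.cos)])

coef, *_ = np.linalg.lstsq(B, y, rcond=None)      # 500 grid values -> 2K+1 coefficients
smooth = B @ coef                                 # smoothed reconstruction of the curve
print(f"{t.size} grid points reduced to {B.shape[1]} basis coefficients")
```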
NASA Astrophysics Data System (ADS)
Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei
2018-03-01
The artificial ground freezing method (AGF) is widely used in civil and mining engineering, and the thermal regime of frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to the heterogeneity of the soil properties, which leads to randomness in the thermal regime of the frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Considering the uncertain thermal parameters of frozen soil as random variables, stochastic processes and random fields, the corresponding stochastic thermal regimes of frozen soil around a single freezing pipe are obtained and analyzed. Taking the variability of each stochastic parameter into account individually, the influences of each stochastic thermal parameter on the stochastic thermal regime are investigated. The results show that the mean temperatures of frozen soil around the single freezing pipe obtained with the three analogy methods are the same, while the standard deviations are different. The distributions of standard deviation differ considerably at different radial coordinate locations, and the larger standard deviations occur mainly in the phase change area. The data computed with the random variable and stochastic process methods differ considerably from the measured data, while the data computed with the random field method agree well with the measured data. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
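To make the random-variable versus random-field distinction concrete, the sketch below runs a small Monte Carlo study on a deliberately simplified problem: steady one-dimensional conduction through a slab with an uncertain conductivity, rather than the transient freezing-pipe problem with phase change. The geometry, statistics, and correlation length are illustrative assumptions.

```python
# Monte Carlo comparison of an uncertain conductivity treated as a single random
# variable vs. a spatially correlated random field (simplified 1D steady conduction).
import numpy as np

rng = np.random.default_rng(1)
n_cells, L = 50, 1.0                    # 1D slab discretized into 50 cells
x = (np.arange(n_cells) + 0.5) * L / n_cells
T_hot, T_cold = -30.0, 10.0             # boundary temperatures (illustrative)
mean_k, cv = 1.5, 0.2                   # mean conductivity and coefficient of variation

def solve_T(k):
    """Steady conduction: node temperatures from the series thermal resistances."""
    R = (L / n_cells) / k               # resistance of each cell
    q = (T_cold - T_hot) / R.sum()      # constant heat flux through the slab
    return T_hot + q * np.concatenate(([0.0], np.cumsum(R)[:-1]))

def sample_k(mode, n_mc=2000, corr_len=0.2):
    if mode == "random_variable":       # one value per realization, constant in space
        return mean_k * (1 + cv * rng.standard_normal((n_mc, 1))) * np.ones(n_cells)
    # Correlated Gaussian random field via a squared-exponential covariance matrix.
    C = cv**2 * mean_k**2 * np.exp(-((x[:, None] - x[None, :]) / corr_len) ** 2)
    C += 1e-9 * np.eye(n_cells)         # small jitter for numerical stability
    return rng.multivariate_normal(mean_k * np.ones(n_cells), C, size=n_mc)

for mode in ("random_variable", "random_field"):
    T = np.array([solve_T(np.clip(k, 0.1, None)) for k in sample_k(mode)])
    print(f"{mode:16s} mean T mid-slab = {T[:, n_cells//2].mean():6.2f}  "
          f"std = {T[:, n_cells//2].std():5.2f}")
```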
Alvelo, Jessica L.; Papademetris, Xenophon; Mena-Hurtado, Carlos; Jeon, Sangchoon; Sumpio, Bauer E.; Sinusas, Albert J.
2018-01-01
Background: Single photon emission computed tomography (SPECT)/computed tomography (CT) imaging allows for assessment of skeletal muscle microvascular perfusion but has not been quantitatively assessed in angiosomes, or 3-dimensional vascular territories, of the foot. This study assessed and compared resting angiosome foot perfusion between healthy subjects and diabetic patients with critical limb ischemia (CLI). Additionally, the relationship between SPECT/CT imaging and the ankle–brachial index—a standard tool for evaluating peripheral artery disease—was assessed. Methods and Results: Healthy subjects (n=9) and diabetic patients with CLI and nonhealing ulcers (n=42) underwent SPECT/CT perfusion imaging of the feet. CT images were segmented into angiosomes for quantification of relative radiotracer uptake, expressed as standardized uptake values. Standardized uptake values were assessed in ulcerated angiosomes of patients with CLI and compared with whole-foot standardized uptake values in healthy subjects. Serial SPECT/CT imaging was performed to assess uptake kinetics of technetium-99m-tetrofosmin. The relationship between angiosome perfusion and ankle–brachial index was assessed via correlational analysis. Resting perfusion was significantly lower in CLI versus healthy subjects (P=0.0007). Intraclass correlation coefficients of 0.95 (healthy) and 0.93 (CLI) demonstrated excellent agreement between serial perfusion measurements. Correlational analysis, including healthy and CLI subjects, demonstrated a significant relationship between ankle–brachial index and SPECT/CT (P=0.01); however, this relationship was not significant for diabetic CLI patients only (P=0.2). Conclusions: SPECT/CT imaging assesses regional foot perfusion and detects abnormalities in microvascular perfusion that may be undetectable by conventional ankle–brachial index in patients with diabetes mellitus. SPECT/CT may provide a novel approach for evaluating responses to targeted therapies. PMID:29748311
Regression-based adaptive sparse polynomial dimensional decomposition for sensitivity analysis
NASA Astrophysics Data System (ADS)
Tang, Kunkun; Congedo, Pietro; Abgrall, Remi
2014-11-01
Polynomial dimensional decomposition (PDD) is employed in this work for global sensitivity analysis and uncertainty quantification of stochastic systems subject to a large number of random input variables. Due to the intimate structure between PDD and Analysis-of-Variance, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices, when compared to polynomial chaos (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of the standard method unaffordable for real engineering applications. In order to address this curse-of-dimensionality problem, this work proposes a variance-based adaptive strategy aiming to build a cheap meta-model by sparse-PDD with PDD coefficients computed by regression. During this adaptive procedure, the model representation by PDD only contains a few terms, so that the cost of repeatedly solving the linear system of the least-squares regression problem is negligible. The size of the final sparse-PDD representation is much smaller than the full PDD, since only significant terms are eventually retained. Consequently, far fewer calls to the deterministic model are required to compute the final PDD coefficients.
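The sketch below shows the regression ingredient in miniature: a low-order orthonormal polynomial surrogate is fitted by least squares and Sobol' indices are read directly off the squared coefficients, which is the ANOVA link exploited above. It uses a full (non-adaptive) basis in two uniform inputs and a toy model; the adaptive sparse-PDD term selection itself is not reproduced.

```python
# Regression-based orthonormal polynomial surrogate with Sobol' indices from its
# coefficients. The toy model and the fixed degree are illustrative assumptions.
import numpy as np
from numpy.polynomial import legendre as L
from itertools import product

rng = np.random.default_rng(0)

def model(x1, x2):                       # deterministic model (illustrative)
    return x1 + 0.5 * x1 * x2 + x2**2

def phi(n, x):                           # Legendre polynomial orthonormal under U(-1, 1)
    c = np.zeros(n + 1); c[n] = 1.0
    return np.sqrt(2 * n + 1) * L.legval(x, c)

deg = 3
multi_idx = [(a, b) for a, b in product(range(deg + 1), repeat=2) if a + b <= deg]

# Least-squares regression of the model on the tensorized orthonormal basis.
X = rng.uniform(-1, 1, size=(500, 2))
A = np.column_stack([phi(a, X[:, 0]) * phi(b, X[:, 1]) for a, b in multi_idx])
coef, *_ = np.linalg.lstsq(A, model(X[:, 0], X[:, 1]), rcond=None)

# With an orthonormal basis the variance decomposes over squared coefficients,
# so Sobol' indices come directly from the regression output.
var = sum(c**2 for (a, b), c in zip(multi_idx, coef) if (a, b) != (0, 0))
S1 = sum(c**2 for (a, b), c in zip(multi_idx, coef) if a > 0 and b == 0) / var
S2 = sum(c**2 for (a, b), c in zip(multi_idx, coef) if b > 0 and a == 0) / var
S12 = sum(c**2 for (a, b), c in zip(multi_idx, coef) if a > 0 and b > 0) / var
print(f"S1 = {S1:.3f}, S2 = {S2:.3f}, S12 = {S12:.3f}")
```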
Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates
NASA Astrophysics Data System (ADS)
Moore, Christopher J.; Gair, Jonathan R.
2014-12-01
Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over using a prior distribution constructed by Gaussian process regression, which interpolates the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.
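A minimal sketch of the Gaussian-process-regression ingredient is given below: the difference between accurate and approximate templates is interpolated over parameter space from a small training set, yielding a mean and an uncertainty that can then be marginalized over. The one-dimensional parameter, the toy difference function, and the kernel choice are illustrative assumptions.

```python
# GP interpolation of a model (waveform) difference from a small training set.
# The 1D parameter, toy difference, and RBF kernel are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Training set: parameter values where expensive, accurate templates exist,
# together with the measured difference between accurate and approximate templates.
theta_train = np.linspace(0.0, 1.0, 8)[:, None]
diff_train = 0.05 * np.sin(6 * theta_train).ravel()      # toy waveform-difference summary

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8, normalize_y=True)
gp.fit(theta_train, diff_train)

# Interpolated difference (and its uncertainty) at arbitrary parameter points; in the
# marginalization these act as the mean and width of the prior on the model error.
theta_new = np.linspace(0.0, 1.0, 101)[:, None]
mean, std = gp.predict(theta_new, return_std=True)
print(f"max interpolated |difference| = {np.abs(mean).max():.3f}, "
      f"typical GP uncertainty = {std.mean():.3f}")
```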
INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)
NASA Astrophysics Data System (ADS)
Arezzini, S.; Carboni, A.; Caruso, G.; Ciampa, A.; Coscetti, S.; Mazzoni, E.; Piras, S.
2014-06-01
The INFN-Pisa Tier2 infrastructure is described, optimized not only for GRID CPU and storage access but also for a more interactive use of the resources in order to provide good solutions for the final data analysis step. The Data Center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (like RooFit and RooStat) implemented in multicore systems. In particular, a POSIX file storage access integrated with standard SRM access is provided. The unified storage infrastructure is therefore described, based on GPFS and Xrootd, used both for the SRM data repository and for interactive POSIX access. Such a common infrastructure gives users transparent access to the Tier2 data for their interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with the login mechanism integrated with the INFN-AAI (National INFN Infrastructure) to extend site access and use to a geographically distributed community. This infrastructure is also used for a national computing facility serving the INFN theoretical community, enabling a synergetic use of computing and storage resources. Our center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, parallel use via InfiniBand connection) has been installed and managed, and we are now updating this facility so that it will provide resources for all the intermediate-level HPC computing needs of the national INFN theoretical community.
Exploiting graphics processing units for computational biology and bioinformatics.
Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H
2010-09-01
Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
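As a CPU-side reference for the procedure discussed above, the sketch below computes the all-pairs Euclidean distance both with the naive double loop (the structure a CUDA kernel distributes over threads) and in vectorized form. It illustrates only what is being computed; the GPU implementation and its memory-coalescing considerations are not shown.

```python
# CPU reference for the all-pairs distance computation (naive loops vs. vectorized).
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 16))            # 200 instances, 16 features

# Naive O(n^2 * d) loops -- the structure a CUDA kernel maps onto threads/blocks.
n = data.shape[0]
dist_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        dist_loop[i, j] = np.sqrt(np.sum((data[i] - data[j]) ** 2))

# Vectorized equivalent using ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
sq = np.sum(data**2, axis=1)
dist_vec = np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2.0 * data @ data.T, 0.0))

assert np.allclose(dist_loop, dist_vec, atol=1e-8)
print("all-pairs distances agree; matrix shape:", dist_vec.shape)
```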
A new computer program for mass screening of visual defects in preschool children.
Briscoe, D; Lifshitz, T; Grotman, M; Kushelevsky, A; Vardi, H; Weizman, S; Biedner, B
1998-04-01
To test the effectiveness of a PC computer program for detecting vision disorders which could be used by non-trained personnel, and to determine the prevalence of visual impairment in a sample population of preschool children in the city of Beer-Sheba, Israel. 292 preschool children, aged 4-6 years, were examined in the kindergarten setting, using the computer system and "gold standard" tests. Visual acuity and stereopsis were tested and compared using Snellen type symbol charts and random dot stereograms respectively. The sensitivity, specificity, positive predictive value, negative predictive value, and kappa test were evaluated. A computer pseudo Worth four dot test was also performed but could not be compared with the standard Worth four dot test owing to the inability of many children to count. Agreement between computer and gold standard tests was 83% and 97.3% for visual acuity and stereopsis respectively. The sensitivity of the computer stereogram was only 50%, but it had a specificity of 98.9%, whereas the sensitivity and specificity of the visual acuity test were 81.5% and 83% respectively. The positive predictive value of both tests was about 63%. 27.7% of children tested had a visual acuity of 6/12 or less and stereopsis was absent in 28% using standard tests. Impairment of fusion was found in 5% of children using the computer pseudo Worth four dot test. The computer program was found to be stimulating, rapid, and easy to perform. The wide availability of computers in schools and at home allows it to be used as an additional screening tool by non-trained personnel, such as teachers and parents, but it is not a replacement for standard testing.
Robust tuning of robot control systems
NASA Technical Reports Server (NTRS)
Minis, I.; Uebel, M.
1992-01-01
The computed torque control problem is examined for a robot arm with flexible, geared, joint drive systems which are typical in many industrial robots. The standard computed torque algorithm is not directly applicable to this class of manipulators because of the dynamics introduced by the joint drive system. The proposed approach to computed torque control combines a computed torque algorithm with a torque controller at each joint. Three such control schemes are proposed. The first scheme uses the joint torque control system currently implemented on the robot arm and a novel form of the computed torque algorithm. The other two use the standard computed torque algorithm and a novel torque control system based on model-following techniques. Standard tasks and performance indices are used to evaluate the performance of the controllers. Both numerical simulations and experiments are used in the evaluation. The study shows that all three proposed systems lead to improved tracking performance over a conventional PD controller.
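For readers unfamiliar with the baseline, the sketch below implements the standard computed torque (inverse dynamics) law for a rigid single-link arm; as the abstract notes, this baseline is exactly what does not carry over directly to flexible, geared joint drives, which motivates the combined schemes. The link parameters, gains, and reference trajectory are illustrative assumptions.

```python
# Standard computed-torque law for a rigid 1-DOF arm (illustrative parameters).
import numpy as np

m, l, g, b = 2.0, 0.5, 9.81, 0.1           # mass, length, gravity, viscous friction
I = m * l**2                               # point-mass inertia about the joint
Kp, Kd = 100.0, 20.0                       # PD gains on the tracking error

def dynamics(q, qd, tau):                  # true plant: I*qdd + b*qd + m*g*l*sin(q) = tau
    return (tau - b * qd - m * g * l * np.sin(q)) / I

def computed_torque(q, qd, q_des, qd_des, qdd_des):
    e, ed = q_des - q, qd_des - qd
    v = qdd_des + Kd * ed + Kp * e         # outer-loop acceleration command
    return I * v + b * qd + m * g * l * np.sin(q)   # inverse dynamics (feedback linearization)

dt, T = 1e-3, 3.0
q, qd, worst_err = 0.0, 0.0, 0.0
for k in range(int(T / dt)):
    t = k * dt
    q_des, qd_des, qdd_des = np.sin(t), np.cos(t), -np.sin(t)   # reference trajectory
    tau = computed_torque(q, qd, q_des, qd_des, qdd_des)
    qdd = dynamics(q, qd, tau)
    q, qd = q + qd * dt, qd + qdd * dt     # explicit Euler integration
    worst_err = max(worst_err, abs(q_des - q))
print(f"max tracking error over {T:.0f} s: {worst_err:.4f} rad")
```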
Comparative analysis of economic models in selected solar energy computer programs
NASA Astrophysics Data System (ADS)
Powell, J. W.; Barnes, K. A.
1982-01-01
The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single family residence and a low rise office building.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amerio, S.; Behari, S.; Boyd, J.
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. Lastly, these efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Lee, W Anthony
2007-01-01
The gold standard for preoperative evaluation of an aortic aneurysm is a computed tomography angiogram (CTA). Three-dimensional reconstruction and analysis of the computed tomography data set is enormously helpful, and even sometimes essential, in proper sizing and planning for endovascular stent graft repair. To a large extent, it has obviated the need for conventional angiography for morphologic evaluation. The TeraRecon Aquarius workstation (San Mateo, Calif) represents a highly sophisticated but user-friendly platform utilizing a combination of task-specific hardware and software specifically designed to rapidly manipulate large Digital Imaging and Communications in Medicine (DICOM) data sets and provide surface-shaded and multiplanar renderings in real-time. This article discusses the basics of sizing and planning for endovascular abdominal aortic aneurysm repair and the role of 3-dimensional analysis using the TeraRecon workstation.
Ecoupling server: A tool to compute and analyze electronic couplings.
Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor
2016-07-05
Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional, user-friendly tools to compute and analyze electronic coupling from external wave functions are of high value. The first server to provide a friendly interface for evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.
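As an indication of what such a tool evaluates, the sketch below computes the two-state generalized Mulliken-Hush (GMH) coupling, one of the two approximations named above, from an adiabatic energy gap, a dipole-moment difference, and a transition dipole. The numerical values are illustrative and not taken from the paper, and the server's actual input handling is not reproduced.

```python
# Two-state GMH estimate of the electronic coupling; input values are illustrative.
import math

def gmh_coupling(delta_e, delta_mu, mu_tr):
    """|H_ab| = mu_tr * dE / sqrt(d_mu^2 + 4 * mu_tr^2) for a two-state model."""
    return abs(mu_tr) * delta_e / math.sqrt(delta_mu**2 + 4.0 * mu_tr**2)

# Example in atomic units (hypothetical values).
delta_e = 0.012      # adiabatic vertical energy gap (hartree)
delta_mu = 4.0       # difference of state dipole moments along the CT axis (a.u.)
mu_tr = 0.6          # transition dipole moment (a.u.)
print(f"|H_ab| = {gmh_coupling(delta_e, delta_mu, mu_tr):.5f} hartree")
```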
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.
2017-04-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
NASA Astrophysics Data System (ADS)
Guo, Xiao-Xi; Hu, Wei; Liu, Yuan; Gu, Dong-Chen; Sun, Su-Qin; Xu, Chang-Hua; Wang, Xi-Chang
2015-11-01
Fluorescent brightener, an industrial whitening agent, has been illegally used to whiten wheat flour. In this article, computer vision technology (E-eyes) and colorimetry were employed to investigate color differences among different concentrations of fluorescent brightener in wheat flour, using DMS as an example. Tri-step infrared spectroscopy (Fourier transform infrared spectroscopy coupled with second derivative infrared spectroscopy (SD-IR) and two-dimensional correlation infrared spectroscopy (2DCOS-IR)) was used to identify and quantitate DMS in wheat flour. According to the color analysis, the whitening effect was significant when less than 30 mg/g DMS was added, but when more than 100 mg/g was added the flour began to appear greenish. Thus it was speculated that the concentration of DMS should be below 100 mg/g in real flour adulterated with DMS. With increasing concentration, the spectral similarity of wheat flour with DMS to the DMS standard increased. SD-IR peaks at 1153 cm-1, 1141 cm-1, 1112 cm-1, 1085 cm-1 and 1025 cm-1 attributed to DMS were regularly enhanced. Furthermore, 2DCOS-IR could differentiate the DMS standard from wheat flour containing DMS at levels as low as 0.05 mg/g, and the bands in the range of 1000-1500 cm-1 could serve as an exclusive range to identify whether wheat flour contains DMS. Finally, a quantitative prediction model based on IR spectra was established by partial least squares (PLS) regression with a concentration range from 1 mg/g to 100 mg/g. The calibration set gave a determination coefficient of 0.9884 with a standard error (RMSEC) of 5.56, and the validation set gave a determination coefficient of 0.9881 with a standard error of 5.73. It was demonstrated that computer vision technology and colorimetry are effective for estimating the content of DMS in wheat flour and that Tri-step infrared macro-fingerprinting combined with PLS is applicable for rapid and nondestructive fluorescent brightener identification and quantitation.
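The sketch below illustrates the PLS calibration step on synthetic data: spectra with a band near one of the DMS-attributed wavenumbers are regressed against known concentrations and a calibration error analogous to RMSEC is reported. The synthetic spectra, band positions, and number of latent variables are illustrative assumptions rather than the paper's measured IR data.

```python
# PLS calibration on synthetic "spectra" vs. known concentrations (illustrative data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavenumbers = np.linspace(1000, 1500, 300)
conc = rng.uniform(1, 100, size=40)                         # known concentrations (mg/g)

def spectrum(c):                                            # analyte band + matrix band + noise
    band = np.exp(-((wavenumbers - 1153) / 15) ** 2)        # band near a DMS-attributed peak
    matrix = 5 * np.exp(-((wavenumbers - 1080) / 60) ** 2)
    return c * 0.01 * band + matrix + 0.02 * rng.standard_normal(wavenumbers.size)

X = np.array([spectrum(c) for c in conc])

pls = PLSRegression(n_components=3)
pls.fit(X, conc)
pred = pls.predict(X).ravel()
rmsec = np.sqrt(np.mean((pred - conc) ** 2))
print(f"calibration R^2 = {pls.score(X, conc):.4f}, RMSEC = {rmsec:.2f} mg/g")
```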
Requirements for a multifunctional code architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiihonen, O.; Juslin, K.
1997-07-01
The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experiences gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of the traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered to him/her ten years ago. At the same time, the features of everyday office software tend to set standards for the way the input data and calculational results are managed.
Computer-enhanced laparoscopic training system (CELTS): bridging the gap.
Stylopoulos, N; Cotin, S; Maithel, S K; Ottensmeye, M; Jackson, P G; Bardsley, R S; Neumann, P F; Rattner, D W; Dawson, S L
2004-05-01
There is a large and growing gap between the need for better surgical training methodologies and the systems currently available for such training. In an effort to bridge this gap and overcome the disadvantages of the training simulators now in use, we developed the Computer-Enhanced Laparoscopic Training System (CELTS). CELTS is a computer-based system capable of tracking the motion of laparoscopic instruments and providing feedback about performance in real time. CELTS consists of a mechanical interface, a customizable set of tasks, and an Internet-based software interface. The special cognitive and psychomotor skills a laparoscopic surgeon should master were explicitly defined and transformed into quantitative metrics based on kinematics analysis theory. A single global standardized and task-independent scoring system utilizing a z-score statistic was developed. Validation exercises were performed. The scoring system clearly revealed a gap between experts and trainees, irrespective of the task performed; none of the trainees obtained a score above the threshold that distinguishes the two groups. Moreover, CELTS provided educational feedback by identifying the key factors that contributed to the overall score. Among the defined metrics, depth perception, smoothness of motion, instrument orientation, and the outcome of the task are major indicators of performance and key parameters that distinguish experts from trainees. Time and path length alone, which are the most commonly used metrics in currently available systems, are not considered good indicators of performance. CELTS is a novel and standardized skills trainer that combines the advantages of computer simulation with the features of the traditional and popular training boxes. CELTS can easily be used with a wide array of tasks and ensures comparability across different training conditions. This report further shows that a set of appropriate and clinically relevant performance metrics can be defined and a standardized scoring system can be designed.
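A task-independent z-score composite in the spirit of the scoring idea described above is sketched below: each kinematic metric from a trial is standardized against an expert reference distribution and the z-scores are averaged. The metric names, expert statistics, and equal weighting are illustrative assumptions, not the CELTS metrics themselves.

```python
# Composite z-score of trial metrics relative to a hypothetical expert reference.
import numpy as np

# Hypothetical expert reference (mean, std) per metric, estimated from expert trials.
expert_ref = {
    "path_length_m":     (1.8, 0.3),
    "completion_time_s": (45.0, 8.0),
    "smoothness_jerk":   (12.0, 2.5),
    "depth_error_mm":    (3.0, 1.0),
}

def composite_score(trial_metrics):
    """Average z-deviation from expert performance (lower means closer to expert)."""
    z = [(trial_metrics[name] - mu) / sd for name, (mu, sd) in expert_ref.items()]
    return float(np.mean(z))

trainee = {"path_length_m": 3.1, "completion_time_s": 92.0,
           "smoothness_jerk": 21.0, "depth_error_mm": 7.5}
expert = {"path_length_m": 1.9, "completion_time_s": 48.0,
          "smoothness_jerk": 12.5, "depth_error_mm": 2.8}

print(f"trainee composite z = {composite_score(trainee):.2f}")
print(f"expert  composite z = {composite_score(expert):.2f}")
```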
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components as compared to the standard principal component analysis (PCA) with sparse loadings in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
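The sketch below gives a simplified version of the fault-detection step: sparse principal components are fitted to nominal image-quality measurements and a Hotelling T2 statistic flags measurements that fall outside the nominal distribution. It uses scikit-learn's SparsePCA rather than the modified SPCA described above, and the synthetic ten-metric data and control limit are illustrative assumptions.

```python
# Sparse PCA plus Hotelling T^2 for flagging out-of-family measurements (synthetic data).
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)

# Nominal data: 200 scans x 10 image-quality metrics with a sparse latent structure.
latent = rng.standard_normal((200, 3))
loadings = np.zeros((3, 10))
loadings[0, [0, 2, 5]] = [1.0, 0.8, 0.6]
loadings[1, [1, 7]] = [1.0, 0.9]
loadings[2, [3, 8, 9]] = [1.0, -0.7, 0.5]
nominal = latent @ loadings + 0.3 * rng.standard_normal((200, 10))

# "Faulty" scans: a systematic drift in two of the metrics.
faulty = nominal[:5] + np.array([0, 0, 4, 0, 0, 0, 0, 3, 0, 0], dtype=float)

spca = SparsePCA(n_components=3, alpha=1.0, random_state=0)
scores = spca.fit_transform(nominal)
mean = scores.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))

def t2(x):
    """Hotelling T^2 of each observation in the sparse-component score space."""
    s = spca.transform(x) - mean
    return np.einsum("ij,jk,ik->i", s, cov_inv, s)

threshold = np.percentile(t2(nominal), 99)      # illustrative control limit
print("nominal scans above limit:", int(np.sum(t2(nominal) > threshold)), "of", len(nominal))
print("faulty  scans above limit:", int(np.sum(t2(faulty) > threshold)), "of", len(faulty))
```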
Applied Computational Electromagnetics Society Journal and Newsletter, Volume 14 No. 1
1999-03-01
Topics include code validation, performance analysis, and input/output standardization; code or technique optimization and error minimization; and innovations in… [remainder of the issue front matter, consisting of editorial board and institutional member listings, omitted]
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
ERIC Educational Resources Information Center
Carroll, Margaret Aby; Chandler, Yvonne J.
This study examines whether an analysis of characteristics of libraries or information centers and librarians in highly productive companies yields operational models and standards that can improve their efficiency and effectiveness and their parent organization's productivity. Data was collected using an e-mail survey instrument sent to 500 large…
States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17
ERIC Educational Resources Information Center
Tilley-Coulson, Eve
2016-01-01
While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…
Assessment of Computer Literacy of Nurses in Lesotho.
Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso
2016-11-01
Health systems worldwide are moving toward use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire with 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with having inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), use of advanced search parameters (60.2%), and downloading new software (60.1%) proved to be challenging to the highest proportions of nurses. Age, sex, year of obtaining latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. However, in multivariate analyses, sex (P = .001), year of obtaining latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with having many years since obtaining their latest qualification, being female, and lack of exposure to computers. These factors should be considered during planning of training curriculum for nurses in Lesotho.
Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I
2015-11-03
We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression for each protein is based on the standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
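The sketch below illustrates the standardized Z-statistic idea in a deliberately simplified setting: each protein's log fold change receives a posterior under a conjugate normal prior, and Z is the posterior mean divided by the posterior standard deviation. This is not QPROT's hierarchical model or its missing-data treatment; the prior variance and synthetic intensities are illustrative assumptions.

```python
# Simplified posterior Z-statistic for log fold changes (not the QPROT model itself).
import numpy as np

rng = np.random.default_rng(0)
n_prot, n_rep = 1000, 3
base = rng.normal(20, 2, size=(n_prot, 1))                        # log2 intensities
logfc_true = np.where(np.arange(n_prot) < 50, 1.5, 0.0)[:, None]  # 50 truly changed proteins
ctrl = base + rng.normal(0, 0.4, size=(n_prot, n_rep))
case = base + logfc_true + rng.normal(0, 0.4, size=(n_prot, n_rep))

diff = case.mean(axis=1) - ctrl.mean(axis=1)                      # observed log fold change
se2 = ctrl.var(axis=1, ddof=1) / n_rep + case.var(axis=1, ddof=1) / n_rep

# Conjugate normal prior logFC ~ N(0, tau^2) shrinks noisy estimates toward zero.
tau2 = 1.0
post_var = 1.0 / (1.0 / tau2 + 1.0 / se2)
post_mean = post_var * diff / se2
z = post_mean / np.sqrt(post_var)                                 # standardized Z-statistic

top = np.argsort(-np.abs(z))[:50]
print("truly changed proteins recovered in top 50 |Z|:", int(np.sum(top < 50)))
```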
The Navy/NASA Engine Program (NNEP89): A user's manual
NASA Technical Reports Server (NTRS)
Plencner, Robert M.; Snyder, Christopher A.
1991-01-01
An engine simulation computer code called NNEP89 was written to perform 1-D steady state thermodynamic analysis of turbine engine cycles. By using a very flexible method of input, a set of standard components are connected at execution time to simulate almost any turbine engine configuration that the user could imagine. The code was used to simulate a wide range of engine cycles from turboshafts and turboprops to air turborockets and supersonic cruise variable cycle engines. Off design performance is calculated through the use of component performance maps. A chemical equilibrium model is incorporated to adequately predict chemical dissociation as well as model virtually any fuel. NNEP89 is written in standard FORTRAN77 with clear structured programming and extensive internal documentation. The standard FORTRAN77 programming allows it to be installed onto most mainframe computers and workstations without modification. The NNEP89 code was derived from the Navy/NASA Engine program (NNEP). NNEP89 provides many improvements and enhancements to the original NNEP code and incorporates features which make it easier to use for the novice user. This is a comprehensive user's guide for the NNEP89 code.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E
2014-04-01
This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
Eigensolver for a Sparse, Large Hermitian Matrix
NASA Technical Reports Server (NTRS)
Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris
2003-01-01
A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
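The same kind of computation can be reproduced at small scale with SciPy, whose eigsh routine wraps ARPACK, the serial counterpart of the PARPACK library named above. The sketch below finds a few low-end eigenvalues of a large sparse symmetric test matrix; the distributed MPI implementation itself is not shown.

```python
# A few eigenvalues of a large sparse symmetric matrix via ARPACK (scipy eigsh).
# The 1D-Laplacian test matrix is illustrative, not the physics matrix of the program.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 200_000                                   # sparse tridiagonal test matrix
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
H = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")

# Shift-invert around sigma=0 targets the few eigenvalues at the low end of the spectrum.
vals = eigsh(H, k=6, sigma=0.0, which="LM", return_eigenvectors=False)
print(np.sort(vals))
```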
Leveraging transcript quantification for fast computation of alternative splicing profiles.
Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo
2015-09-01
Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable to, and sometimes better than, that of standard methods on simulated as well as real RNA-sequencing data, when compared against experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
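The core inclusion calculation that such transcript-quantification-based tools perform is sketched below: the PSI of an event is the summed abundance of the transcripts carrying the inclusion form divided by the summed abundance of all transcripts relevant to the event. The transcript identifiers, event definition, and TPM values are illustrative assumptions.

```python
# Percent spliced-in (PSI) from transcript-level TPM estimates (illustrative data).
def psi(tpm, inclusion_transcripts, total_transcripts):
    """PSI = abundance of inclusion transcripts / abundance of all event transcripts."""
    inc = sum(tpm.get(t, 0.0) for t in inclusion_transcripts)
    tot = sum(tpm.get(t, 0.0) for t in total_transcripts)
    return float("nan") if tot == 0 else inc / tot

tpm = {"ENST_A_long": 12.0, "ENST_A_skip": 4.0, "ENST_A_other": 0.5}   # hypothetical IDs
event = {
    "inclusion": ["ENST_A_long"],                   # transcripts containing the exon
    "total": ["ENST_A_long", "ENST_A_skip"],        # all transcripts defining the event
}
print(f"PSI = {psi(tpm, event['inclusion'], event['total']):.3f}")      # 12 / 16 = 0.75
```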
Weiss, Kenneth L; Pan, Hai; Storrs, Judd; Strub, William; Weiss, Jane L; Jia, Li; Eldevik, O Petter
2003-05-01
Variability in patient head positioning may yield substantial interstudy image variance in the clinical setting. We describe and test three-step technologist and computer-automated algorithms designed to image the brain in a standard reference system and reduce variance. Triple oblique axial images obtained parallel to the Talairach anterior commissure (AC)-posterior commissure (PC) plane were reviewed in a prospective analysis of 126 consecutive patients. Requisite roll, yaw, and pitch correction, as three authors determined independently and subsequently by consensus, were compared with the technologists' actual graphical prescriptions and those generated by a novel computer automated three-step (CATS) program. Automated pitch determinations generated with Statistical Parametric Mapping '99 (SPM'99) were also compared. Requisite pitch correction (15.2° ± 10.2°) far exceeded that for roll (-0.6° ± 3.7°) and yaw (-0.9° ± 4.7°) in terms of magnitude and variance (P < .001). Technologist and computer-generated prescriptions substantially reduced interpatient image variance with regard to roll (3.4° and 3.9° vs 13.5°), yaw (0.6° and 2.5° vs 22.3°), and pitch (28.6°, 18.5° with CATS, and 59.3° with SPM'99 vs 104°). CATS performed worse than the technologists in yaw prescription, and it was equivalent in roll and pitch prescriptions. Talairach prescriptions better approximated standard CT canthomeatal angulations (9° vs 24°) and provided more efficient brain coverage than that of routine axial imaging. Brain MR prescriptions corrected for direct roll, yaw, and Talairach AC-PC pitch can be readily achieved by trained technologists or automated computer algorithms. This ability will substantially reduce interpatient variance, allow better approximation of standard CT angulation, and yield more efficient brain coverage than that of routine clinical axial imaging.
Semantic Pattern Analysis for Verbal Fluency Based Assessment of Neurological Disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Ainsworth, Keela C; Brown, Tyler C
In this paper, we present preliminary results of semantic pattern analysis of verbal fluency tests used for assessing cognitive psychological and neuropsychological disorders. We posit that recent advances in semantic reasoning and artificial intelligence can be combined to create a standardized computer-aided diagnosis tool to automatically evaluate and interpret verbal fluency tests. Towards that goal, we derive novel semantic similarity (phonetic, phonemic and conceptual) metrics and present the predictive capability of these metrics on a de-identified dataset of participants with and without neurological disorders.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis applying different turbulence models and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
Abe, Hiroyuki; Mori, Naoko; Tsuchiya, Keiko; Schacht, David V; Pineda, Federico D; Jiang, Yulei; Karczmar, Gregory S
2016-11-01
The purposes of this study were to evaluate diagnostic parameters measured with ultrafast MRI acquisition and with standard acquisition and to compare diagnostic utility for differentiating benign from malignant lesions. Ultrafast acquisition is a high-temporal-resolution (7 seconds) imaging technique for obtaining 3D whole-breast images. The dynamic contrast-enhanced 3-T MRI protocol consists of an unenhanced standard and an ultrafast acquisition that includes eight contrast-enhanced ultrafast images and four standard images. Retrospective assessment was performed for 60 patients with 33 malignant and 29 benign lesions. A computer-aided detection system was used to obtain initial enhancement rate and signal enhancement ratio (SER) by means of identification of a voxel showing the highest signal intensity in the first phase of standard imaging. From the same voxel, the enhancement rate at each time point of the ultrafast acquisition and the AUC of the kinetic curve from zero to each time point of ultrafast imaging were obtained. There was a statistically significant difference between benign and malignant lesions in enhancement rate and kinetic AUC for ultrafast imaging and also in initial enhancement rate and SER for standard imaging. ROC analysis showed no significant differences between enhancement rate in ultrafast imaging and SER or initial enhancement rate in standard imaging. Ultrafast imaging is useful for discriminating benign from malignant lesions. The differential utility of ultrafast imaging is comparable to that of standard kinetic assessment in a shorter study time.
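The kinetic quantities discussed above can be written down compactly; the sketch below uses commonly cited definitions of the initial enhancement rate, the signal enhancement ratio (SER), and the kinetic area under the curve. Whether these exactly match the formulas used by the CAD system in the study is an assumption, and the signal-time values are synthetic.

```python
# Kinetic-curve quantities for one voxel; definitions and data are illustrative.
import numpy as np

t = np.array([0, 7, 14, 21, 28, 60, 120, 180, 240], dtype=float)          # seconds
s = np.array([100, 135, 168, 190, 205, 230, 238, 232, 225], dtype=float)  # voxel signal

s0 = s[0]                                   # pre-contrast signal
initial_rate = (s[1] - s0) / s0             # enhancement at the first post-contrast point
ser = (s[1] - s0) / (s[-1] - s0)            # early vs. delayed enhancement (assumed SER form)
kinetic_auc = np.trapz(s[t <= 60] - s0, t[t <= 60])   # AUC of enhancement up to 60 s

print(f"initial enhancement rate = {initial_rate:.2f}")
print(f"SER = {ser:.2f}")
print(f"kinetic AUC (0-60 s) = {kinetic_auc:.0f} a.u.*s")
```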
Automated serum chloride analysis using the Apple computer
Taylor, Paul J.; Bouska, Rosalie A.
1988-01-01
Chloride analysis employing a coulometric technique is a well-established method. However, the equipment needed is specialized and somewhat expensive. The purpose of this paper is to report the development of the hardware and software to perform this analysis using an Apple computer to control the coulometric titration, as well as to automate it and to print out the results. The Apple computer is used to control the flow of current in a circuit, which includes silver and platinum electrodes where the following reactions take place: Ag → Ag+ + e− (at the silver anode) and 2H2O + 2e− → 2OH− + H2 (at the platinum cathode). The generated silver ions then react with the chloride ion in the sample to form AgCl: Ag+ + Cl− → AgCl(s). When all of the chloride ion has been titrated, the concentration of silver ions in solution increases rapidly, which causes an increase in the current between two silver microelectrodes. This current is converted to a voltage and amplified by a simple circuit. This voltage is read by the analogue-to-digital converter. The computer stops the titration and calculates the chloride ion content of the sample. Thus, the computer controls the apparatus, records the data, and reacts to the data to terminate the analyses and print out the results and messages to the analyst. Analysis of standards and reference sera indicates the method is rapid, accurate and precise. Application of this apparatus as a teaching aid for electronics to chemistry and medical students is also described. PMID:18925182
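The endpoint logic of the titration described above can be summarized in a few lines: silver ions are generated at constant current, the indicator current between the microelectrodes stays low until free Ag+ appears, and the chloride amount follows from Faraday's law using the charge delivered up to the endpoint. The current values, threshold, and sample volume in the sketch below are illustrative assumptions, not the original apparatus settings.

```python
# Coulometric chloride titration endpoint and Faraday's-law calculation (illustrative values).
F = 96485.0                    # C/mol, Faraday constant
GEN_CURRENT = 5.0e-3           # A, constant generating current (assumed)
DT = 0.1                       # s, sampling interval of the A/D converter (assumed)
THRESHOLD = 2.0e-6             # A, indicator-current rise marking the endpoint (assumed)
SAMPLE_VOLUME_L = 1.0e-4       # 100 microlitre serum sample (hypothetical)

def titrate(indicator_current_trace):
    """Return chloride concentration (mmol/L) from the indicator-current readings."""
    elapsed = 0.0
    for reading in indicator_current_trace:      # one reading every DT seconds
        if reading > THRESHOLD:                  # free Ag+ detected: stop titration
            break
        elapsed += DT
    moles_cl = GEN_CURRENT * elapsed / F         # mol Ag+ generated == mol Cl- titrated
    return 1000.0 * moles_cl / SAMPLE_VOLUME_L   # mmol/L

# Synthetic trace: indicator current jumps after ~193 s of titration.
trace = [0.1e-6] * 1930 + [5.0e-6] * 10
print(f"chloride = {titrate(trace):.1f} mmol/L")
```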
Portable Computer Keyboard For Use With One Hand
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1992-01-01
Data-entry device held in one hand and operated with five fingers. Contains seven keys. Letters, numbers, punctuation, and cursor commands keyed into computer by pressing keys in various combinations. Device called "data egg" used where standard typewriter keyboard unusable or unavailable. Contains micro-processor and 32-Kbyte memory. Captures text and transmits it to computer. Concept extended to computer mouse. Especially useful to handicapped or bedridden people who find it difficult or impossible to operate standard keyboards.
Papuga, Mark O; Mesfin, Addisu; Molinari, Robert; Rubery, Paul T
2016-07-15
A prospective and retrospective cross-sectional cohort analysis. The aim of this study was to show that Patient-Reported Outcomes Measurement Information System (PROMIS) computer adaptive testing (CAT) assessments for physical function and pain interference can be efficiently collected in a standard office visit and to evaluate these scores with scores from previously validated Oswestry Disability Index (ODI) and Neck Disability Index (NDI) providing evidence of convergent validity for use in patients with spine pathology. Spinal surgery outcomes are highly variable, and substantial debate continues regarding the role and value of spine surgery. The routine collection of patient-based outcomes instruments in spine surgery patients may inform this debate. Traditionally, the inefficiency associated with collecting standard validated instruments has been a barrier to routine use in outpatient clinics. We utilized several CAT instruments available through PROMIS and correlated these with the results obtained using "gold standard" legacy outcomes measurement instruments. All measurements were collected at a routine clinical visit. The ODI and the NDI assessments were used as "gold standard" comparisons for patient-reported outcomes. PROMIS CAT instruments required 4.5 ± 1.8 questions and took 35 ± 16 seconds to complete, compared with ODI/NDI requiring 10 questions and taking 188 ± 85 seconds when administered electronically. Linear regression analysis of retrospective scores involving a primary back complaint revealed moderate to strong correlations between ODI and PROMIS physical function with r values ranging from 0.5846 to 0.8907 depending on the specific assessment and patient subsets examined. Routine collection of physical function outcome measures in clinical practice offers the ability to inform and improve patient care. We have shown that several PROMIS CAT instruments can be efficiently administered during routine clinical visits. The moderate to strong correlations found validate the utility of computer adaptive testing when compared with the gold standard "static" legacy assessments. 4.
Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.
Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J
2015-06-01
An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
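A hedged sketch of mapping report text to lexicon concepts through the NCBO BioPortal Annotator web service mentioned above is shown below. The endpoint URL, query parameters, and response fields follow the publicly documented BioPortal REST API as best understood here and should be treated as assumptions; an API key is required, and the exact calls made by LexMap are not specified in the abstract.

```python
# Assumed BioPortal Annotator call for mapping report terms to RadLex/SNOMED-CT/LOINC.
import requests

BIOPORTAL_ANNOTATOR = "http://data.bioontology.org/annotator"   # assumed endpoint
API_KEY = "YOUR_BIOPORTAL_API_KEY"                               # placeholder

def annotate(text, ontologies=("RADLEX", "SNOMEDCT", "LOINC")):
    """Return (matched text span, concept IRI) pairs found by the annotator."""
    resp = requests.get(
        BIOPORTAL_ANNOTATOR,
        params={"text": text, "ontologies": ",".join(ontologies), "apikey": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    hits = []
    for ann in resp.json():                      # assumed response structure
        concept_iri = ann["annotatedClass"]["@id"]
        for match in ann["annotations"]:
            hits.append((match["text"], concept_iri))
    return hits

if __name__ == "__main__":
    for span, iri in annotate("Uncomplicated inferior vena cava filter placement."):
        print(f"{span:30s} -> {iri}")
```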
Operant learning of Drosophila at the torque meter.
Brembs, Bjoern
2008-06-16
For experiments at the torque meter, flies are kept on standard fly medium at 25 degrees C and 60% humidity with a 12hr light/12hr dark regime. A standardized breeding regime assures proper larval density and age-matched cohorts. Cold-anesthetized flies are glued with head and thorax to a triangle-shaped hook the day before the experiment. Attached to the torque meter via a clamp, the fly's intended flight maneuvers are measured as the angular momentum around its vertical body axis. The fly is placed in the center of a cylindrical panorama to accomplish stationary flight. An analog to digital converter card feeds the yaw torque signal into a computer which stores the trace for later analysis. The computer also controls a variety of stimuli which can be brought under the fly's control by closing the feedback loop between these stimuli and the yaw torque trace. Punishment is achieved by applying heat from an adjustable infrared laser.
BioPAX – A community standard for pathway data sharing
Demir, Emek; Cary, Michael P.; Paley, Suzanne; Fukuda, Ken; Lemer, Christian; Vastrik, Imre; Wu, Guanming; D’Eustachio, Peter; Schaefer, Carl; Luciano, Joanne; Schacherer, Frank; Martinez-Flores, Irma; Hu, Zhenjun; Jimenez-Jacinto, Veronica; Joshi-Tope, Geeta; Kandasamy, Kumaran; Lopez-Fuentes, Alejandra C.; Mi, Huaiyu; Pichler, Elgar; Rodchenkov, Igor; Splendiani, Andrea; Tkachev, Sasha; Zucker, Jeremy; Gopinath, Gopal; Rajasimha, Harsha; Ramakrishnan, Ranjani; Shah, Imran; Syed, Mustafa; Anwar, Nadia; Babur, Ozgun; Blinov, Michael; Brauner, Erik; Corwin, Dan; Donaldson, Sylva; Gibbons, Frank; Goldberg, Robert; Hornbeck, Peter; Luna, Augustin; Murray-Rust, Peter; Neumann, Eric; Reubenacker, Oliver; Samwald, Matthias; van Iersel, Martijn; Wimalaratne, Sarala; Allen, Keith; Braun, Burk; Whirl-Carrillo, Michelle; Dahlquist, Kam; Finney, Andrew; Gillespie, Marc; Glass, Elizabeth; Gong, Li; Haw, Robin; Honig, Michael; Hubaut, Olivier; Kane, David; Krupa, Shiva; Kutmon, Martina; Leonard, Julie; Marks, Debbie; Merberg, David; Petri, Victoria; Pico, Alex; Ravenscroft, Dean; Ren, Liya; Shah, Nigam; Sunshine, Margot; Tang, Rebecca; Whaley, Ryan; Letovksy, Stan; Buetow, Kenneth H.; Rzhetsky, Andrey; Schachter, Vincent; Sobral, Bruno S.; Dogrusoz, Ugur; McWeeney, Shannon; Aladjem, Mirit; Birney, Ewan; Collado-Vides, Julio; Goto, Susumu; Hucka, Michael; Le Novère, Nicolas; Maltsev, Natalia; Pandey, Akhilesh; Thomas, Paul; Wingender, Edgar; Karp, Peter D.; Sander, Chris; Bader, Gary D.
2010-01-01
BioPAX (Biological Pathway Exchange) is a standard language to represent biological pathways at the molecular and cellular level. Its major use is to facilitate the exchange of pathway data (http://www.biopax.org). Pathway data captures our understanding of biological processes, but its rapid growth necessitates development of databases and computational tools to aid interpretation. However, the current fragmentation of pathway information across many databases with incompatible formats presents barriers to its effective use. BioPAX solves this problem by making pathway data substantially easier to collect, index, interpret and share. BioPAX can represent metabolic and signaling pathways, molecular and genetic interactions and gene regulation networks. BioPAX was created through a community process. Through BioPAX, millions of interactions organized into thousands of pathways across many organisms, from a growing number of sources, are available. Thus, large amounts of pathway data are available in a computable form to support visualization, analysis and biological discovery. PMID:20829833
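Because BioPAX Level 3 documents are OWL/RDF, a generic RDF library can already enumerate their contents. The sketch below is a minimal illustration assuming the usual BioPAX Level 3 namespace and a placeholder file name; dedicated BioPAX toolkits exist, but this shows the computable form directly.

```python
# Minimal sketch: list the pathways in a BioPAX Level 3 OWL export using rdflib.
from rdflib import Graph, Namespace, RDF

BP = Namespace("http://www.biopax.org/release/biopax-level3.owl#")

g = Graph()
g.parse("some_pathway_export.owl")            # placeholder file name

for pathway in g.subjects(RDF.type, BP.Pathway):
    names = list(g.objects(pathway, BP.displayName))
    print(pathway, names[0] if names else "(unnamed)")
```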
Krause, C; Ens, K; Fechner, K; Voigt, J; Fraune, J; Rohwäder, E; Hahn, M; Danckwardt, M; Feirer, C; Barth, E; Martinetz, T; Stöcker, W
2015-04-01
Antinuclear autoantibodies (ANA) are highly informative biomarkers in autoimmune diagnostics. The increasing demand for effective test systems, however, has led to the development of a confusingly large variety of different platforms. One of them, the indirect immunofluorescence (IIF), is regarded as the common gold standard for ANA screening, as described in a position statement by the American College of Rheumatology in 2009. Technological solutions have been developed aimed at standardization and automation of IIF to overcome methodological limitations and subjective bias in IIF interpretation. In this review, we present the EUROPattern Suite, a system for computer-aided immunofluorescence microscopy (CAIFM) including automated acquisition of digital images and evaluation of IIF results. The system was originally designed for ANA diagnostics on human epithelial cells, but its applications have been extended with the latest system update version 1.5 to the analysis of antineutrophil cytoplasmic antibodies (ANCA) and anti-dsDNA antibodies.
NASA Astrophysics Data System (ADS)
Roquet, F.; Madec, G.; McDougall, Trevor J.; Barker, Paul M.
2015-06-01
A new set of approximations to the standard TEOS-10 equation of state is presented. These follow a polynomial form, making them computationally efficient for use in numerical ocean models. Two versions are provided, the first being a fit of density for Boussinesq ocean models, and the second fitting specific volume, which is more suitable for compressible models. Both versions are given as the sum of a vertical reference profile (6th-order polynomial) and an anomaly (52-term polynomial, cubic in pressure), with relative errors of ∼0.1% on the thermal expansion coefficients. A 75-term polynomial expression is also presented for computing specific volume, with better accuracy than the existing TEOS-10 48-term rational approximation, especially regarding the sound speed, and it is suggested that this expression represents a valuable approximation of the TEOS-10 equation of state for hydrographic data analysis. In the last section, practical aspects of the implementation of TEOS-10 in ocean models are discussed.
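For illustration, polynomial TEOS-10 approximations of this kind are exposed in the GSW-Python toolbox. The sketch below assumes the public gsw package and its rho/specvol/sound_speed functions; it is not the authors' own code, and the exact polynomial evaluated internally depends on the toolbox version.

```python
# Hedged example: evaluating density, specific volume, and sound speed with the
# GSW-Python toolbox, which implements polynomial TEOS-10 approximations.
import gsw

SA = 35.0    # Absolute Salinity [g/kg]
CT = 10.0    # Conservative Temperature [deg C]
p = 1000.0   # sea pressure [dbar]

print("in-situ density :", gsw.rho(SA, CT, p), "kg/m^3")
print("specific volume :", gsw.specvol(SA, CT, p), "m^3/kg")
print("sound speed     :", gsw.sound_speed(SA, CT, p), "m/s")
```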
With Great Measurements Come Great Results
NASA Astrophysics Data System (ADS)
Williams, Carl
Measurements are the foundation for science and modern life. Technologies we take for granted every day depend on them: cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can make reliably and what it cannot. At the National Institute of Standards and Technology (NIST) we specialize in making world-class measurements that an incredibly wide range of industries use to continually improve their products: computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. Think of any key technology developed over the last 100 years, and better measurements, standards, or analysis techniques played a role in making it possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices: it's all happening in our labs now. This talk will focus on how metrology creates the future.
Cybersecurity Workforce Development and the Protection of Critical Infrastructure
2017-03-31
communications products, and limited travel for site visits and conferencing. The CSCC contains a developed web-based coordination site, computer ... the CSCC. The Best Practices Analyst position maintains a list of best practices, computer-related patches, and standard operating procedures (SOP) ... involved in conducting vulnerability assessments of computer networks. To adequately exercise and experiment with industry standard software, it was
Voss, Andreas; Leonhart, Rainer; Stahl, Christoph
2007-11-01
Psychological research is based in large parts on response latencies, which are often registered by keypresses on a standard computer keyboard. Recording response latencies with a standard keyboard is problematic because keypresses are buffered within the keyboard hardware before they are signaled to the computer, adding error variance to the recorded latencies. This can be circumvented by using external response pads connected to the computer's parallel port. In this article, we describe how to build inexpensive, reliable, and easy-to-use response pads with six keys from two standard computer mice that can be connected to the PC's parallel port. We also address the problem of recording data from the parallel port with different software packages under Microsoft's Windows XP.
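On Windows, keypresses on such a pad can be detected by polling the parallel port's input lines. The sketch below is a hypothetical illustration, not the authors' software: it assumes the conventional LPT1 base address 0x378, the third-party inpout32.dll port-I/O driver, and an arbitrary key-to-bit wiring.

```python
# Hypothetical polling sketch: read the parallel-port status register via
# inpout32.dll (Windows) and timestamp any change on the input lines.
import ctypes
import time

inpout = ctypes.WinDLL("inpout32.dll")      # third-party port-I/O driver (assumed installed)
STATUS_REG = 0x378 + 1                      # status register of LPT1 (conventional address)

def read_status():
    # bits 3-7 of the status register are the externally accessible input lines
    return inpout.Inp32(STATUS_REG) & 0xF8

previous = read_status()
t0 = time.perf_counter()
while time.perf_counter() - t0 < 60.0:      # poll for one minute as a demonstration
    current = read_status()
    if current != previous:                 # a key line changed state
        latency_ms = (time.perf_counter() - t0) * 1000.0
        print(f"status change {previous:#04x} -> {current:#04x} at {latency_ms:.1f} ms")
        previous = current
```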
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions, while other standard methods (e.g., Kalman filters, MVAR, ARCH/GARCH, etc.) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches. The importance of accounting for the mathematical assumptions used in the analysis will be pointed out in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
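The core idea of trading off fit quality against regime persistence can be shown on a toy problem. The sketch below is an illustration only, not the FEM-based method of the abstract: each sample of a 1-D series is assigned to one of a fixed set of candidate regime models (here, constant means) by minimizing squared fit error plus a penalty on regime switches, solved exactly by dynamic programming.

```python
# Toy regime identification: minimize fit error plus a persistence penalty on
# regime switches, solved by dynamic programming over the candidate regimes.
import numpy as np

def segment(y, means, switch_penalty):
    """Assign each sample to one of the candidate regime means."""
    T, K = len(y), len(means)
    cost = (y[:, None] - np.asarray(means)[None, :]) ** 2   # local fit error
    dp = np.zeros((T, K))
    back = np.zeros((T, K), dtype=int)
    dp[0] = cost[0]
    for t in range(1, T):
        trans = dp[t - 1][:, None] + switch_penalty * (1 - np.eye(K))
        back[t] = np.argmin(trans, axis=0)
        dp[t] = cost[t] + trans[back[t], np.arange(K)]
    # backtrack the optimal regime path
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmin(dp[-1]))
    for t in range(T - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 200), rng.normal(3, 1, 200)])   # two hidden regimes
print(segment(y, means=[0.0, 3.0], switch_penalty=10.0))
```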
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy directly, a task that is difficult to perform manually, and in many cases it can provide a more accurate analysis than manual observation. Ganalyzer is simple to use and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
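One of the steps listed above, building the radial intensity plot, can be sketched with generic array operations. This is an illustration under stated assumptions (a 2-D grayscale image with the galaxy already separated from the background and its center and radius known), not Ganalyzer's own implementation.

```python
# Sketch: sample image intensity on circles of increasing radius around the
# galaxy center; peaks along each row of the result trace the arm positions.
import numpy as np

def radial_intensity_plot(image, center, radius, n_angles=360):
    cy, cx = center
    angles = np.deg2rad(np.arange(n_angles))
    radii = np.arange(1, int(radius))
    ys = (cy + radii[:, None] * np.sin(angles)[None, :]).astype(int)
    xs = (cx + radii[:, None] * np.cos(angles)[None, :]).astype(int)
    ys = np.clip(ys, 0, image.shape[0] - 1)
    xs = np.clip(xs, 0, image.shape[1] - 1)
    return image[ys, xs]              # shape (n_radii, n_angles)

img = np.random.rand(256, 256)        # placeholder image
plot = radial_intensity_plot(img, center=(128, 128), radius=100)
print(plot.shape)
```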
Analysis of positron lifetime spectra in polymers
NASA Technical Reports Server (NTRS)
Singh, Jag J.; Mall, Gerald H.; Sprinkle, Danny R.
1988-01-01
A new procedure for analyzing multicomponent positron lifetime spectra in polymers was developed. It requires initial estimates of the lifetimes and the intensities of the various components, which are readily obtainable by a standard spectrum-stripping process. These initial estimates, after convolution with the timing system resolution function, are then used as the inputs for a nonlinear least squares analysis to compute the estimates that conform to a global error minimization criterion. The convolution integral uses the full experimental resolution function, in contrast to previous studies, where analytical approximations of it were utilized. These concepts were incorporated into a generalized Computer Program for Analyzing Positron Lifetime Spectra (PAPLS) in polymers. Its validity was tested using several artificially generated data sets. These data sets were also analyzed using the widely used POSITRONFIT program. In almost all cases, the PAPLS program gives a closer fit to the input values. The new procedure was applied to the analysis of several lifetime spectra measured in metal-ion-containing Epon-828 samples. The results are described.
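The fitting idea can be sketched as follows. This is a hedged illustration, not the PAPLS code: a two-component exponential lifetime model is convolved with a resolution function and fitted by nonlinear least squares. A Gaussian resolution function is used here purely for brevity, whereas the paper stresses using the full measured resolution function.

```python
# Sketch: fit a two-component lifetime spectrum convolved with a (stand-in
# Gaussian) resolution function by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 10, 500)                      # ns, channel centers

def resolution(t, fwhm=0.3):                     # stand-in Gaussian IRF
    sigma = fwhm / 2.3548
    g = np.exp(-0.5 * ((t - t.mean()) / sigma) ** 2)
    return g / g.sum()

def model(t, i1, tau1, i2, tau2, bkg):
    decay = i1 / tau1 * np.exp(-t / tau1) + i2 / tau2 * np.exp(-t / tau2)
    return np.convolve(decay, resolution(t), mode="same") + bkg

# synthetic "measured" spectrum for demonstration
rng = np.random.default_rng(1)
truth = model(t, 0.7, 0.4, 0.3, 2.0, 1e-3)
data = rng.poisson(truth * 1e4) / 1e4

p0 = [0.5, 0.3, 0.5, 1.5, 1e-3]                  # spectrum-stripping style initial estimates
popt, _ = curve_fit(model, t, data, p0=p0, bounds=(1e-6, np.inf))
print("fitted intensities and lifetimes:", popt)
```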
NASA Astrophysics Data System (ADS)
Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.
2017-11-01
Digital imaging of a concrete sample using high-resolution tomographic imaging by means of X-Ray Micro Computed Tomography (μ-CT) has been conducted to assess the characteristics of the sample’s structure. The standard procedure of image acquisition, reconstruction, and image processing using a particular scanning device, i.e., the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. A qualitative and a quantitative analysis were briefly performed on the sample to give a basic idea of the capability of the system and the bundled software package. Calculations of total VOI volume, object volume, percent of object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, and total porosity were conducted and analysed. This paper should serve as a brief description of how the device can produce the preferred image quality, as well as of the ability of the bundled software packages to assist in qualitative and quantitative analysis.
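Two of the listed quantities, percent object volume and total porosity, reduce to simple voxel counting once the volume has been segmented. The sketch below is illustrative and uses generic array operations on a placeholder binary volume, not the vendor's bundled software; the voxel size is an assumption.

```python
# Sketch: object volume fraction and total porosity from a segmented CT volume.
import numpy as np

voxel_size_mm = 0.01                                  # assumed isotropic voxel pitch
volume = np.random.rand(200, 200, 200) > 0.55         # placeholder binary VOI (True = solid)

total_voi_voxels = volume.size
object_voxels = int(volume.sum())

object_volume_mm3 = object_voxels * voxel_size_mm ** 3
percent_object_volume = 100.0 * object_voxels / total_voi_voxels
total_porosity = 100.0 - percent_object_volume

print(f"object volume      : {object_volume_mm3:.2f} mm^3")
print(f"percent object vol : {percent_object_volume:.1f} %")
print(f"total porosity     : {total_porosity:.1f} %")
```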
Winter, Mark R.; Liu, Mo; Monteleone, David; Melunis, Justin; Hershberg, Uri; Goderie, Susan K.; Temple, Sally; Cohen, Andrew R.
2015-01-01
Time-lapse microscopy can capture patterns of development through multiple divisions for an entire clone of proliferating cells. Images are taken every few minutes over many days, generating data too vast to process completely by hand. Computational analysis of this data can benefit from occasional human guidance. Here we combine improved automated algorithms with minimized human validation to produce fully corrected segmentation, tracking, and lineaging results with dramatic reduction in effort. A web-based viewer provides access to data and results. The improved approach allows efficient analysis of large numbers of clones. Using this method, we studied populations of progenitor cells derived from the anterior and posterior embryonic mouse cerebral cortex, each growing in a standardized culture environment. Progenitors from the anterior cortex were smaller, less motile, and produced smaller clones compared to those from the posterior cortex, demonstrating cell-intrinsic differences that may contribute to the areal organization of the cerebral cortex. PMID:26344906
Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H
2015-02-06
Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determines the recorded responses of the heavy and endogenous peptides and performs the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining, in an iterative process, the concentration levels that meet the required accuracy and precision criteria. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When the standard curve is applied to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate downstream analysis.
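The core of the calculation can be sketched with a weighted linear fit. This is a hedged illustration, not the Qualis-SIS implementation: the 1/x weighting, the ±20% back-calculated accuracy criterion, and the example numbers are assumptions chosen only to show the level-removal idea.

```python
# Sketch: 1/x-weighted linear standard curve, removal of failing concentration
# levels, refit, and back-calculation of sample concentrations.
import numpy as np

conc  = np.array([1, 5, 10, 50, 100, 500], dtype=float)      # standard concentrations (assumed units)
ratio = np.array([0.011, 0.052, 0.098, 0.51, 1.02, 4.9])      # light/heavy response ratios

def fit_curve(conc, ratio):
    w = 1.0 / conc                                             # 1/x weighting
    slope, intercept = np.polyfit(conc, ratio, 1, w=np.sqrt(w))
    return slope, intercept

def back_calc(ratio, slope, intercept):
    return (ratio - intercept) / slope

slope, intercept = fit_curve(conc, ratio)
accuracy = 100.0 * back_calc(ratio, slope, intercept) / conc
keep = np.abs(accuracy - 100.0) <= 20.0                        # assumed acceptance criterion
slope, intercept = fit_curve(conc[keep], ratio[keep])          # refit on passing levels only

samples = np.array([0.07, 2.3])                                # measured ratios in unknowns
print("sample concentrations:", back_calc(samples, slope, intercept))
```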
Standard interface: Twin-coaxial converter
NASA Technical Reports Server (NTRS)
Lushbaugh, W. A.
1976-01-01
The network operations control center standard interface has been adopted as a standard computer interface for all future minicomputer based subsystem development for the Deep Space Network. Discussed is an intercomputer communications link using a pair of coaxial cables. This unit is capable of transmitting and receiving digital information at distances up to 600 m with complete ground isolation between the communicating devices. A converter is described that allows a computer equipped with the standard interface to use the twin coaxial link.
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
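A WPS endpoint of this kind can be exercised from Python with a generic WPS client. The sketch below uses OWSLib's WPS support; the service URL, process identifier, and input names are placeholders, not the actual CDAS API.

```python
# Hedged sketch: discover and invoke a WPS analytics process with OWSLib.
from owslib.wps import WebProcessingService

wps = WebProcessingService("https://example.nasa.gov/wps")    # placeholder endpoint
wps.getcapabilities()
for proc in wps.processes:
    print(proc.identifier, "-", proc.title)

# Execute a hypothetical averaging process over a time/space domain
execution = wps.execute(
    "cdas.average",                                           # assumed process identifier
    inputs=[("variable", "tas"),
            ("domain", '{"lat": [-10, 10], "time": ["1980", "2010"]}')],
)
print(execution.status)
```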
Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)
NASA Technical Reports Server (NTRS)
Jefferson, David K.
1990-01-01
Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards for the industrial environment are discussed. The information resource dictionary system (IRDS) is described.
Mödden, Claudia; Behrens, Marion; Damke, Iris; Eilers, Norbert; Kastrup, Andreas; Hildebrandt, Helmut
2012-06-01
Compensatory and restorative treatments have been developed to improve visual field defects after stroke. However, no controlled trials have compared these interventions with standard occupational therapy (OT). A total of 45 stroke participants with visual field defect admitted for inpatient rehabilitation were randomized to restorative computerized training (RT) using computer-based stimulation of border areas of their visual field defects or to a computer-based compensatory therapy (CT) teaching a visual search strategy. OT, in which different compensation strategies were used to train for activities of daily living, served as standard treatment for the active control group. Each treatment group received 15 single sessions of 30 minutes distributed over 3 weeks. The primary outcome measures were visual field expansion for RT, visual search performance for CT, and reading performance for both treatments. Visual conjunction search, alertness, and the Barthel Index were secondary outcomes. Compared with OT, CT resulted in a better visual search performance, and RT did not result in a larger expansion of the visual field. Intragroup pre-post comparisons demonstrated that CT improved all defined outcome parameters and RT several, whereas OT only improved one. CT improved functional deficits after visual field loss compared with standard OT and may be the intervention of choice during inpatient rehabilitation. A larger trial that includes lesion location in the analysis is recommended.
Monte Carlo Methodology Serves Up a Software Success
NASA Technical Reports Server (NTRS)
2003-01-01
Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1996-01-01
Solving for dynamic responses of free-free launch vehicle/spacecraft systems acted upon by buffeting winds is commonly performed throughout the aerospace industry. Due to the unpredictable nature of this wind loading event, these problems are typically solved using frequency response random analysis techniques. To generate dynamic responses for spacecraft with statically indeterminate interfaces, spacecraft contractors prefer to develop models which have response transformation matrices developed for mode acceleration data recovery. This method transforms spacecraft boundary accelerations and displacements into internal responses. Unfortunately, standard MSC/NASTRAN modal frequency response solution sequences cannot be used to combine the acceleration- and displacement-dependent responses required for spacecraft mode acceleration data recovery. External user-written computer codes can be used with MSC/NASTRAN output to perform such combinations, but these methods can be labor- and computer-resource-intensive. Taking advantage of the analytical and computer resource efficiencies inherent within MSC/NASTRAN, a DMAP Alter has been developed to combine acceleration- and displacement-dependent modal frequency responses for performing spacecraft mode acceleration data recovery. The Alter has been used successfully to efficiently solve a common aerospace buffeting wind analysis.
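The recovery step being automated can be written as a simple matrix combination: internal responses are the sum of an acceleration transformation matrix applied to the boundary accelerations and a displacement transformation matrix applied to the boundary displacements. The sketch below is conceptual only and uses random placeholder matrices; it is not the DMAP Alter itself, and all names and sizes are illustrative.

```python
# Conceptual sketch of mode acceleration data recovery at each frequency point:
# internal response R(w) = ATM @ a(w) + DTM @ d(w).
import numpy as np

n_int, n_bnd, n_freq = 50, 6, 200          # internal responses, boundary DOF, frequency points
rng = np.random.default_rng(2)

ATM = rng.standard_normal((n_int, n_bnd))  # acceleration transformation matrix (placeholder)
DTM = rng.standard_normal((n_int, n_bnd))  # displacement transformation matrix (placeholder)
accel = rng.standard_normal((n_bnd, n_freq)) + 1j * rng.standard_normal((n_bnd, n_freq))
disp  = rng.standard_normal((n_bnd, n_freq)) + 1j * rng.standard_normal((n_bnd, n_freq))

internal = ATM @ accel + DTM @ disp        # combined complex frequency responses
print(internal.shape)                      # (n_int, n_freq)
```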