A Quantitative Review of Functional Analysis Procedures in Public School Settings
ERIC Educational Resources Information Center
Solnick, Mark D.; Ardoin, Scott P.
2010-01-01
Functional behavioral assessments can consist of indirect, descriptive and experimental procedures, such as a functional analysis. Although the research contains numerous examples demonstrating the effectiveness of functional analysis procedures, experimental conditions are often difficult to implement in classroom settings and analog conditions…
Turkish Special Education Teachers' Implementation of Functional Analysis in Classroom Settings
ERIC Educational Resources Information Center
Erbas, Dilek; Yucesoy, Serife; Turan, Yasemin; Ostrosky, Michaelene M.
2006-01-01
Three Turkish special education teachers conducted a functional analysis to identify variables that might initiate or maintain the problem behaviors of three children with developmental disabilities. The analysis procedures were conducted in natural classroom settings. In Phase 1, following initial training in functional analysis procedures, the…
Geometric Analysis of Wing Sections
DOT National Transportation Integrated Search
1995-04-01
This paper describes a new geometric analysis procedure for wing sections. This procedure is based on the normal mode analysis for continuous functions. A set of special shape functions is introduced to represent the geometry of the wing section. The...
ERIC Educational Resources Information Center
Fleming, Courtney V.
2011-01-01
Minimal research has investigated training packages used to teach professional staff how to implement functional analysis procedures and to interpret data gathered during functional analysis. The current investigation used video-based training with role-play and feedback to teach six professionals in a clinical setting to implement procedures of a…
Behavior analytic approaches to problem behavior in intellectual disabilities.
Hagopian, Louis P; Gregory, Meagan K
2016-03-01
The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.
SASS wind ambiguity removal by direct minimization. II - Use of smoothness and dynamical constraints
NASA Technical Reports Server (NTRS)
Hoffman, R. N.
1984-01-01
A variational analysis method (VAM) is used to remove the ambiguity of the Seasat-A Satellite Scatterometer (SASS) winds. The VAM yields the best fit to the data by minimizing an objective function S which is a measure of the lack of fit. The SASS data are described and the function S and the analysis procedure are defined. Analyses of a single ship report which are analogous to Green's functions are presented. The analysis procedure is tuned and its sensitivity is described using the QE II storm. The procedure is then applied to a case study of September 6, 1978, south of Japan.
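To make the variational idea concrete, the following minimal Python sketch minimizes an objective of the same general shape: a lack-of-fit term measuring distance to the nearest ambiguous wind solution at each cell, plus a smoothness penalty. The grid size, penalty form, and weights are illustrative assumptions, not the actual SASS configuration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy grid of ambiguous scatterometer winds: at each cell, two candidate
# (u, v) solutions (the ambiguity). Shape: (ny, nx, n_amb, 2).
rng = np.random.default_rng(0)
ny = nx = 6
truth = np.stack(np.meshgrid(np.linspace(5, 8, nx), np.linspace(-2, 2, ny)), axis=-1)
candidates = np.stack([truth, -truth], axis=2) + 0.3 * rng.standard_normal((ny, nx, 2, 2))

def S(x, lam=1.0):
    """Lack-of-fit functional: squared distance to the nearest ambiguity in
    each cell, plus a first-difference smoothness penalty on the analysis."""
    w = x.reshape(ny, nx, 2)
    misfit = np.sum(np.min(np.sum((w[:, :, None, :] - candidates) ** 2, axis=-1), axis=-1))
    smooth = np.sum(np.diff(w, axis=0) ** 2) + np.sum(np.diff(w, axis=1) ** 2)
    return misfit + lam * smooth

x0 = candidates[:, :, 0, :].ravel()          # start from the first ambiguity
res = minimize(S, x0, method="L-BFGS-B")
print("S at start:", S(x0), " S at minimum:", res.fun)
```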
Effects of computer-based training on procedural modifications to standard functional analyses.
Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.
Davis, Barbara J; Kahng, Sungwoo; Schmidt, Jonathan; Bowman, Lynn G; Boelter, Eric W
2012-01-01
Current research provides few suggestions for modifications to functional analysis procedures to accommodate low rate, high intensity problem behavior. This study examined the results of the extended duration functional analysis procedures of Kahng, Abt, and Schonbachler (2001) with six children admitted to an inpatient hospital for the treatment of severe problem behavior. Results of initial functional analyses (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994) were inconclusive for all children because of low levels of responding. The altered functional analyses, which changed multiple variables including the duration of the functional analysis (i.e., 6 or 7 hrs), yielded clear behavioral functions for all six participants. These results add additional support for the utility of an altered analysis of low rate, high intensity problem behavior when standard functional analyses do not yield differentiated results.
The interval testing procedure: A general framework for inference in functional data analysis.
Pini, Alessia; Vantini, Simone
2016-09-01
We introduce in this work the Interval Testing Procedure (ITP), a novel inferential technique for functional data. The procedure can be used to test different functional hypotheses, e.g., distributional equality between two or more functional populations, or equality of the mean function of a functional population to a reference. ITP involves three steps: (i) the representation of data on a (possibly high-dimensional) functional basis; (ii) the test of each possible set of consecutive basis coefficients; (iii) the computation of the adjusted p-values associated with each basis component, by means of a new strategy proposed here. We define a new type of error control, the interval-wise control of the family-wise error rate, particularly suited for functional data, and show that the ITP provides this control. A simulation study comparing the ITP with other testing procedures is reported. The ITP is then applied to the analysis of hemodynamic features involved in cerebral aneurysm pathology. The ITP is implemented in the fdatest R package. © 2016, The International Biometric Society.
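A minimal sketch of the interval-wise adjustment at the heart of the ITP, assuming a two-sample permutation test on basis coefficients. The test statistic, permutation scheme, and simulated coefficients are simplifications for illustration, not the fdatest implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 15                      # curves per group, basis components
A = rng.standard_normal((n, p))    # basis coefficients, group A
B = rng.standard_normal((n, p))
B[:, 5:9] += 1.0                   # group difference on components 5..8

def perm_pvalue(a, b, n_perm=500):
    """Two-sample permutation test on the norm of the mean difference."""
    pooled = np.vstack([a, b])
    obs = np.linalg.norm(a.mean(0) - b.mean(0))
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        pa, pb = pooled[idx[:len(a)]], pooled[idx[len(a):]]
        count += np.linalg.norm(pa.mean(0) - pb.mean(0)) >= obs
    return (count + 1) / (n_perm + 1)

# Step (ii): a p-value for every interval of consecutive components.
p_int = {(i, j): perm_pvalue(A[:, i:j + 1], B[:, i:j + 1])
         for i in range(p) for j in range(i, p)}

# Step (iii): interval-wise adjustment: each component inherits the largest
# p-value among all intervals that contain it.
p_adj = [max(pv for (i, j), pv in p_int.items() if i <= k <= j) for k in range(p)]
print(np.round(p_adj, 3))          # small only for the truly different components
```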
Evaluation of the Utility of a Discrete-Trial Functional Analysis in Early Intervention Classrooms
ERIC Educational Resources Information Center
Kodak, Tiffany; Fisher, Wayne W.; Paden, Amber; Dickes, Nitasha
2013-01-01
We evaluated a discrete-trial functional analysis implemented by regular classroom staff in a classroom setting. The results suggest that the discrete-trial functional analysis identified a social function for each participant and may require fewer staff than standard functional analysis procedures.
ERIC Educational Resources Information Center
O'Neill, Robert E.; Bundock, Kaitlin; Kladis, Kristin; Hawken, Leanne S.
2015-01-01
This survey study assessed the acceptability of a variety of functional behavioral assessment (FBA) procedures (i.e., functional assessment interviews, rating scales/questionnaires, systematic direct observations, functional analysis manipulations) to a national sample of 123 special educators and a state sample of 140 school psychologists.…
Item Purification in Differential Item Functioning Using Generalized Linear Mixed Models
ERIC Educational Resources Information Center
Liu, Qian
2011-01-01
For this dissertation, four item purification procedures were implemented onto the generalized linear mixed model for differential item functioning (DIF) analysis, and the performance of these item purification procedures was investigated through a series of simulations. Among the four procedures, forward and generalized linear mixed model (GLMM)…
Pelios, L; Morren, J; Tesch, D; Axelrod, S
1999-01-01
Self-injurious behavior (SIB) and aggression have been the concern of researchers because of the serious impact these behaviors have on individuals' lives. Despite the plethora of research on the treatment of SIB and aggressive behavior, the reported findings have been inconsistent regarding the effectiveness of reinforcement-based versus punishment-based procedures. We conducted a literature review to determine whether a trend could be detected in researchers' selection of reinforcement-based procedures versus punishment-based procedures, particularly since the introduction of functional analysis to behavioral assessment. The data are consistent with predictions made in the past regarding the potential impact of functional analysis methodology. Specifically, the findings indicate that, once maintaining variables for problem behavior are identified, experimenters tend to choose reinforcement-based procedures rather than punishment-based procedures as treatment for both SIB and aggressive behavior. Results indicated an increased interest in studies on the treatment of SIB and aggressive behavior, particularly since 1988. PMID:10396771
Cognition and procedure representational requirements for predictive human performance models
NASA Technical Reports Server (NTRS)
Corker, K.
1992-01-01
Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables, and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable, and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods, including procedural backtracking with concurrent search, temporal reasoning, and constraint checking for partial ordering of procedures. Finally, the representation is being linked to models of human decision-making processes that include heuristic, propositional, and prescriptive judgment models that are sensitive to the procedural context in which the evaluative functions are performed.
Comparison of Traditional and Trial-Based Methodologies for Conducting Functional Analyses
ERIC Educational Resources Information Center
LaRue, Robert H.; Lenard, Karen; Weiss, Mary Jane; Bamond, Meredith; Palmieri, Mark; Kelley, Michael E.
2010-01-01
Functional analysis represents a sophisticated and empirically supported functional assessment procedure. While these procedures have garnered considerable empirical support, they are often underused in clinical practice. Safety risks resulting from the evocation of maladaptive behavior and the length of time required to conduct functional…
DIF Trees: Using Classification Trees to Detect Differential Item Functioning
ERIC Educational Resources Information Center
Vaughn, Brandon K.; Wang, Qiu
2010-01-01
A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative procedure to detect differential item functioning other than the use of traditional Mantel-Haenszel and logistic regression analysis. A nonparametric…
ERIC Educational Resources Information Center
Axelrod, Saul
1987-01-01
Emerging approaches for dealing with inappropriate behaviors of the disabled involve conducting a functional or structural behavior analysis to isolate the factors responsible for the aberrant behavior and implementing corrective procedures (often alternatives to punishment) relevant to the function of the inappropriate behavior. (Author/DB)
Assessing the Social Acceptability of the Functional Analysis of Problem Behavior
ERIC Educational Resources Information Center
Langthorne, Paul; McGill, Peter
2011-01-01
Although the clinical utility of the functional analysis is well established, its social acceptability has received minimal attention. The current study assessed the social acceptability of functional analysis procedures among 10 parents and 3 teachers of children who had recently received functional analyses. Participants completed a 9-item…
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has been broadly used in diagnosing brain disorders through clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images with standard structural medical images. Then the standard 3D brain model, which shows well-defined brain regions, was used to replace the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score, with less than 3% error on average. In summary, the method obtains precise VOI information automatically from the well-defined standard 3D brain model, sparing the traditional procedure of manually drawing ROIs slice by slice on structural medical images. That is, the method not only provides precise analysis results but also improves the processing rate for mass medical images in clinical practice.
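The mutual information criterion used for the registration step can be computed from a joint intensity histogram. The sketch below is a generic illustration under that assumption, not the authors' pipeline; in practice the score is maximized over candidate rigid-body transforms.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two images from their joint histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                                   # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

# Registration would maximize this score over rotations/translations of the
# moving image; here we only show the score itself on synthetic data.
rng = np.random.default_rng(2)
fixed = rng.random((64, 64))
print(mutual_information(fixed, fixed))                  # self-alignment: high MI
print(mutual_information(fixed, rng.random((64, 64))))   # unrelated: near zero
```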
Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John N.
1997-01-01
A multidisciplinary design optimization procedure which couples formal multiobjective techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained, multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced for each objective function during the transformation process. This enhanced procedure provides the designer the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
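The K-S transformation itself is compact enough to state in code. The sketch below implements the standard, numerically stable envelope form; the rho value, objective values, and weight factors are placeholders, not the paper's configuration.

```python
import numpy as np

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of several objectives/constraints:
    KS(g) = g_max + (1/rho) * log(sum(exp(rho * (g_i - g_max)))).
    A smooth, conservative approximation to max(g); larger rho hugs the max
    more tightly, letting one unconstrained solve drive all objectives."""
    g = np.asarray(values, dtype=float)
    g_max = g.max()
    return g_max + np.log(np.sum(np.exp(rho * (g - g_max)))) / rho

# Weight factors emphasize particular objectives before aggregation.
objectives = np.array([0.8, 1.2, 0.5])     # e.g. normalized drag, boom loudness, ...
weights = np.array([1.0, 2.0, 1.0])
print(ks(weights * objectives), (weights * objectives).max())  # KS >= max
```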
NASA Technical Reports Server (NTRS)
Stein, M.; Housner, J. D.
1978-01-01
A numerical analysis developed for the buckling of rectangular orthotropic layered panels under combined shear and compression is described. This analysis uses a central finite difference procedure based on trigonometric functions instead of using the conventional finite differences which are based on polynomial functions. Inasmuch as the buckle mode shape is usually trigonometric in nature, the analysis using trigonometric finite differences can be made to exhibit a much faster convergence rate than that using conventional differences. Also, the trigonometric finite difference procedure leads to difference equations having the same form as conventional finite differences; thereby allowing available conventional finite difference formulations to be converted readily to trigonometric form. For two-dimensional problems, the procedure introduces two numerical parameters into the analysis. Engineering approaches for the selection of these parameters are presented and the analysis procedure is demonstrated by application to several isotropic and orthotropic panel buckling problems. Among these problems is the shear buckling of stiffened isotropic and filamentary composite panels in which the stiffener is broken. Results indicate that a break may degrade the effect of the stiffener to the extent that the panel will not carry much more load than if the stiffener were absent.
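The essence of a trigonometric finite difference can be illustrated by rescaling the conventional central stencil so that it differentiates sin(kx) exactly, with the wavenumber k playing the role of the numerical parameter the abstract mentions. This is a one-dimensional illustration under that assumption, not the panel-buckling code.

```python
import numpy as np

def second_derivative(f, h, k=None):
    """Central second difference; when k is given, rescale the stencil so it
    is exact for sin(kx) and cos(kx): a trigonometric finite difference."""
    stencil = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / h**2
    if k is not None:
        stencil *= (k * h / 2.0)**2 / np.sin(k * h / 2.0)**2
    return stencil

k = 3.0                                   # assumed buckle-mode wavenumber
x = np.linspace(0.0, np.pi, 15)           # deliberately coarse grid
h = x[1] - x[0]
f, exact = np.sin(k * x), -k**2 * np.sin(k * x[1:-1])
print(np.abs(second_derivative(f, h) - exact).max())        # polynomial stencil
print(np.abs(second_derivative(f, h, k=k) - exact).max())   # trig stencil ~ 0
```

Because the rescaled stencil keeps the same three-point form, an existing polynomial-based difference code can be converted by changing only the coefficient, which mirrors the conversion property claimed in the abstract.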
ERIC Educational Resources Information Center
LaRue, Robert H.; Sloman, Kimberly N.; Weiss, Mary Jane; Delmolino, Lara; Hansford, Amy; Szalony, Jill; Madigan, Ryan; Lambright, Nathan M.
2011-01-01
Functional analysis procedures have been effectively used to determine the maintaining variables for challenging behavior and subsequently develop effective interventions. However, fear of evoking dangerous topographies of maladaptive behavior and concerns for reinforcing infrequent maladaptive behavior present challenges for people working in…
Functional Analyses and Treatment of Precursor Behavior
ERIC Educational Resources Information Center
Najdowski, Adel C.; Wallace, Michele D.; Ellsworth, Carrie L.; MacAleese, Alicia N.; Cleveland, Jackie
2008-01-01
Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe…
ERIC Educational Resources Information Center
Zwick, Rebecca
2012-01-01
Differential item functioning (DIF) analysis is a key component in the evaluation of the fairness and validity of educational tests. The goal of this project was to review the status of ETS DIF analysis procedures, focusing on three aspects: (a) the nature and stringency of the statistical rules used to flag items, (b) the minimum sample size…
Multivariate Cluster Analysis.
ERIC Educational Resources Information Center
McRae, Douglas J.
Procedures for grouping students into homogeneous subsets have long interested educational researchers. The research reported in this paper is an investigation of a set of objective grouping procedures based on multivariate analysis considerations. Four multivariate functions that might serve as criteria for adequate grouping are given and…
Classwide Functional Analysis and Treatment of Preschoolers' Disruptive Behavior
ERIC Educational Resources Information Center
Poole, Veena Y.; Dufrene, Brad A.; Sterling, Heather E.; Tingstrom, Daniel H.; Hardy, Christina M.
2012-01-01
Relatively few functional assessment and intervention studies have been conducted in preschool classrooms with children of typical development who engage in high incidence problem behaviors. Moreover, limited studies have used functional assessment procedures with the class as the unit of analysis. This study included functional analyses and a…
Brief functional analysis and treatment of a vocal tic.
Watson, T S; Sterling, H E
1998-01-01
This study sought to extend functional methodology to the assessment and treatment of habits. After a descriptive assessment indicated that coughing occurred while eating, a brief functional analysis suggested that social attention was the maintaining variable. Results demonstrated that treatment, derived from the assessment and analysis data, rapidly eliminated the cough. We discuss the appropriateness of using functional analysis procedures for deriving treatments for habits in a clinical setting.
Structural tailoring of engine blades (STAEBL)
NASA Technical Reports Server (NTRS)
Platt, C. E.; Pratt, T. K.; Brown, K. W.
1982-01-01
A mathematical optimization procedure was developed for the structural tailoring of engine blades and was used to structurally tailor two engine fan blades constructed of composite materials without midspan shrouds. The first was a solid blade made from superhybrid composites, and the second was a hollow blade with metal matrix composite inlays. Three major computerized functions were needed to complete the procedure: approximate analysis with the established input variables, optimization of an objective function, and refined analysis for design verification.
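A skeleton of that three-function loop follows, with deliberately toy algebra standing in for the blade analyses; the surrogate model, the 95 Hz frequency floor, and the variable bounds are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def approximate_analysis(x):
    """Cheap surrogate of the blade analyses (illustrative algebra only)."""
    thickness, chord = x
    weight = 4.0 * thickness * chord              # stand-in weight model
    frequency = 120.0 * thickness / chord**2      # stand-in first bending mode
    return weight, frequency

def objective(x):
    """Minimize weight; penalize violating an assumed frequency floor."""
    weight, frequency = approximate_analysis(x)
    return weight + 100.0 * max(0.0, 95.0 - frequency) ** 2

res = minimize(objective, x0=[0.5, 1.0], bounds=[(0.1, 1.0), (0.5, 2.0)])

def refined_analysis(x):
    """Stand-in for the expensive verification run (e.g., full finite elements).
    Here it reuses the surrogate for brevity; in STAEBL it would be refined."""
    _, frequency = approximate_analysis(x)
    return frequency >= 95.0

print("tailored design:", res.x, "verified:", refined_analysis(res.x))
```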
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.
1979-01-01
A recent modification of the methodology of profile analysis, which allows testing for differences between two functions as a whole with a single test rather than point by point with multiple tests, is discussed. The modification is applied to the examination of the issue of motion/no-motion conditions as shown by the lateral deviation curve as a function of engine cut speed of a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.
[The structural functional analysis of functioning of day-hospitals of the Russian Federation].
2012-01-01
The article presents the results of a structural functional analysis of the functioning of day-hospitals in the Russian Federation. A dynamic analysis is given of the day-hospital network and capacity, financial support, bed stock structure, structure of treated patients, and volumes of diagnostic tests and curative procedures. The need to develop medical care for the population in day-hospital settings is demonstrated.
Escape-to-Attention as a Potential Variable for Maintaining Problem Behavior in the School Setting
ERIC Educational Resources Information Center
Sarno, Jana M.; Sterling, Heather E.; Mueller, Michael M.; Dufrene, Brad; Tingstrom, Daniel H.; Olmi, D. Joe
2011-01-01
Mueller, Sterling-Turner, and Moore (2005) reported a novel escape-to-attention (ETA) functional analysis condition in a school setting with one child. The current study replicates Mueller et al.'s functional analysis procedures with three elementary school-age boys referred for problem behavior. Functional analysis verified the participant's…
Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei
2016-03-01
We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function and additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs describe one's incomplete knowledge of correction factors, which are called nuisance parameters. We use the extended likelihood function to make point and interval estimations of parameters in basically the same way as in the conventional least-squares method. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study of a linear regression analysis with a common component of Type B uncertainty. In this example we compare the analysis results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
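A minimal sketch of the described case study, under stated assumptions: a straight-line fit whose observations share one common offset (the Type B component) with a Gaussian PDF, eliminated by joint minimization as in profiling. The noise scales and data are illustrative, not the paper's example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 12)
sigma_a, sigma_b = 0.3, 0.5               # Type A noise, Type B offset scale
y = 2.0 + 0.7 * x + rng.normal(0, sigma_a, x.size) + rng.normal(0, sigma_b)

def neg_log_ext_likelihood(params):
    """-log of the extended likelihood (constants dropped): Gaussian
    measurement term times a Gaussian PDF for the common-offset nuisance
    parameter delta."""
    a, b, delta = params
    resid = y - (a + b * x + delta)
    return 0.5 * np.sum(resid**2) / sigma_a**2 + 0.5 * delta**2 / sigma_b**2

# Profiling: minimize over all parameters; delta is then not reported.
fit = minimize(neg_log_ext_likelihood, x0=[0.0, 0.0, 0.0])
a_hat, b_hat, _ = fit.x
print("intercept:", a_hat, "slope:", b_hat)
```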
Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems
1985-10-01
[Garbled table-of-contents excerpt; recoverable topics: statistical energy analysis, finite element models, transfer-function procedures, a summary of procedures for the modal analysis method (Sec. 8.4), and procedures for the statistical energy analysis method.]
Functional Analysis in Virtual Environments
ERIC Educational Resources Information Center
Vasquez, Eleazar, III; Marino, Matthew T.; Donehower, Claire; Koch, Aaron
2017-01-01
Functional analysis (FA) is an assessment procedure involving the systematic manipulation of an individual's environment to determine why a target behavior is occurring. An analog FA provides practitioners the opportunity to manipulate variables in a controlled environment and formulate a hypothesis for the function of a behavior. In previous…
Functional Behavioral Assessment State Policies and Procedures. Quick Turn Around (QTA).
ERIC Educational Resources Information Center
National Association of State Directors of Special Education, Alexandria, VA.
This brief report provides an analysis of survey data collected from 45 states and territories about policies, procedures, and guidelines related to Functional Behavioral Assessment (FBA), plans to develop or revise policy in this area, and technical assistance needs related to FBA, especially for students who exhibit behavior that interferes with…
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Nahas, Samar; Yi, Johnny; Magrina, Javier
2013-01-01
To evaluate the surgical outcome and the anatomic and sexual function in 10 women with Rokitansky syndrome who underwent the laparoscopic Vecchietti procedure at our center. Retrospective analysis. Data were analyzed on the basis of short-term and long-term surgical outcome and sexual function. All patients underwent clinical follow-up at 1, 2, and 6 months after surgery. In all 10 patients, the procedure produced anatomic and functional success. The laparoscopic Vecchietti technique is safe, simple, and effective for treatment of vaginal agenesis. Results are comparable to those of all European studies, and the procedure should gain more popularity in North America. Copyright © 2013 AAGL. All rights reserved.
2014-10-01
This work concerns the analysis of nonlinear and non-stationary signals. It aims at decomposing a signal, via an iterative sifting procedure, into several intrinsic mode functions (IMFs). [Remainder of excerpt garbled; recoverable keywords: intrinsic mode function, optimization. Introduction fragment: nonlinear and non-stationary signal analysis is important and difficult.]
NASA Astrophysics Data System (ADS)
Obracaj, Piotr; Fabianowski, Dariusz
2017-10-01
Implementations concerning the adaptation of historic facilities into public utility objects are associated with the necessity of solving many complex, often conflicting expectations of future users. This mainly concerns the function, which includes construction, technology, and aesthetic issues. The list of issues is completed by proper protection of historic values, different in each case. The procedure leading to the expected solution is a multicriteria one, usually difficult to define accurately and requiring large experience on the designer's part. An innovative approach has been used for the analysis, namely the modified EA FAHP (Extent Analysis Fuzzy Analytic Hierarchy Process) Chang's method of multicriteria analysis for the assessment of complex functional and spatial issues. The selection of the optimal spatial form of an adapted historic building intended for a multi-functional public utility facility was analysed. The assumed functional flexibility covered education, conferences, and chamber performances, such as drama and concerts, in different stage-audience layouts.
ERIC Educational Resources Information Center
White, Pamela; O'Reilly, Mark; Fragale, Christina; Kang, Soyeon; Muhich, Kimberly; Falcomata, Terry; Lang, Russell; Sigafoos, Jeff; Lancioni, Giulio
2011-01-01
Two children with autism who engaged in aggression and stereotypy were assessed using common analogue functional analysis procedures. Aggression was maintained by access to specific preferred items. Data on the rates of stereotypy and appropriate play were collected during an extended functional analysis tangible condition. These data reveal that…
Kodak, Tiffany; Grow, Laura; Northup, John
2004-01-01
We conducted a functional analysis of elopement in an outdoor setting for a child with a diagnosis of attention deficit hyperactivity disorder. A subsequent treatment consisting of noncontingent attention and time-out was demonstrated to be effective in eliminating elopement. Modifications of functional analysis procedures associated with the occurrence of elopement in a natural setting are demonstrated.
The Effect of Endovascular Revascularization of Common Iliac Artery Occlusions on Erectile Function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gur, Serkan, E-mail: mserkangur@yahoo.com; Ozkan, Ugur; Onder, Hakan
To determine the incidence of erectile dysfunction in patients with common iliac artery (CIA) occlusive disease and the effect of revascularization on erectile function using the sexual health inventory for males (SHIM) questionnaire. All patients (35 men; mean age 57 ± 5 years; range 42-67 years) were asked to recall their sexual function before and 1 month after iliac recanalization. Univariate and multivariate analyses were performed to determine variables affecting improvement of impotence. The incidence of impotence in patients with CIA occlusion was 74% (26 of 35) preoperatively. Overall, 16 (46%) of 35 patients reported improved erectile function after iliac recanalization. The rate of improvement of impotence was 61.5% (16 of 26 impotent patients). Sixteen patients (46%), including seven with normal erectile function before the procedure, had no change. Three patients (8%) reported deterioration of their sexual function, two of whom (6%) had normal erectile function before the procedure. The median SHIM score increased from 14 (range 4-25) before the procedure to 20 (range 1-25) after the procedure (P = 0.005). The type of recanalization, the age of the patients, and the length of occlusion were related to erectile function improvement in univariate analysis. However, these factors were not independent factors for improvement of erectile dysfunction in multivariate analysis (P > 0.05). Endovascular recanalization of CIA occlusions clearly improves sexual function. More than half of the patients with erectile dysfunction who underwent endovascular recanalization of the CIA experienced improvement.
Effect of Purification Procedures on DIF Analysis in IRTPRO
ERIC Educational Resources Information Center
Fikis, David R. J.; Oshima, T. C.
2017-01-01
Purification of the test has been a well-accepted procedure in enhancing the performance of tests for differential item functioning (DIF). As defined by Lord, purification requires reestimation of ability parameters after removing DIF items before conducting the final DIF analysis. IRTPRO 3 is a recently updated program for analyses in item…
Automatic Coding of Dialogue Acts in Collaboration Protocols
ERIC Educational Resources Information Center
Erkens, Gijsbert; Janssen, Jeroen
2008-01-01
Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…
FGWAS: Functional genome wide association analysis.
Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-10-01
Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs. Copyright © 2017 Elsevier Inc. All rights reserved.
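The screening component, taken in isolation, reduces to ranking SNPs by a marginal association score aggregated over the functional phenotype and keeping the top candidates for formal testing. The sketch below shows only that step, on simulated data; the score and cutoff are simplifications of the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
n, n_snp, n_vert = 200, 5000, 100
G = rng.integers(0, 3, size=(n, n_snp)).astype(float)   # SNP dosages 0/1/2
Y = rng.standard_normal((n, n_vert))                    # functional phenotype
Y += 0.4 * G[:, [42]] * rng.standard_normal(n_vert)     # plant one true signal

# Marginal screening: squared correlation of each SNP with each vertex,
# aggregated over the surface; keep the top-d SNPs.
Gz = (G - G.mean(0)) / G.std(0)
Yz = (Y - Y.mean(0)) / Y.std(0)
score = ((Gz.T @ Yz) / n) ** 2                          # (n_snp, n_vert) r^2
keep = np.argsort(score.sum(axis=1))[::-1][:20]
print("SNP 42 retained by screening:", 42 in keep)
```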
Adjoint-Based, Three-Dimensional Error Prediction and Grid Adaptation
NASA Technical Reports Server (NTRS)
Park, Michael A.
2002-01-01
Engineering computational fluid dynamics (CFD) analysis and design applications focus on output functions (e.g., lift, drag). Errors in these output functions are generally unknown and conservatively accurate solutions may be computed. Computable error estimates can offer the possibility to minimize computational work for a prescribed error tolerance. Such an estimate can be computed by solving the flow equations and the linear adjoint problem for the functional of interest. The computational mesh can be modified to minimize the uncertainty of a computed error estimate. This robust mesh-adaptation procedure automatically terminates when the simulation is within a user specified error tolerance. This procedure for estimating and adapting to error in a functional is demonstrated for three-dimensional Euler problems. An adaptive mesh procedure that links to a Computer Aided Design (CAD) surface representation is demonstrated for wing, wing-body, and extruded high lift airfoil configurations. The error estimation and adaptation procedure yielded corrected functions that are as accurate as functions calculated on uniformly refined grids with ten times as many grid points.
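The adjoint-weighted residual estimate can be demonstrated on a linear model problem. In the sketch below, a coarse 1D Poisson solution is injected into a nested fine grid and the fine-grid adjoint weights its residual; for a linear problem and linear functional the estimate matches the actual functional error to roundoff, which is what makes it a useful correction and adaptation indicator in the nonlinear CFD setting. The grid sizes and the functional are illustrative, not the thesis configuration.

```python
import numpy as np

def poisson(n):
    """1D Poisson problem -u'' = 1 on (0,1), u(0)=u(1)=0, n interior nodes."""
    h = 1.0 / (n + 1)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A, np.ones(n), h

nc, nf = 15, 31                        # nested grids: hf = hc / 2
Ac, fc, hc = poisson(nc)
Af, ff, hf = poisson(nf)
uc = np.linalg.solve(Ac, fc)           # coarse primal solution
uf = np.linalg.solve(Af, ff)           # fine "truth" for comparison

# Inject the coarse solution into the fine grid: coincident nodes copy over,
# midpoints are filled by linear interpolation (boundary values are zero).
xc = np.concatenate(([0.0], np.arange(1, nc + 1) * hc, [1.0]))
vc = np.concatenate(([0.0], uc, [0.0]))
u_inj = np.interp(np.arange(1, nf + 1) * hf, xc, vc)

g = np.full(nf, hf)                    # output functional J(u) ~ integral of u
psi = np.linalg.solve(Af.T, g)         # fine-grid adjoint solution
estimate = psi @ (ff - Af @ u_inj)     # adjoint-weighted residual
actual = g @ uf - g @ u_inj            # true functional error of the injection
print(estimate, actual)                # identical here (linear problem)
```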
Yuki, Koichi; Koutsogiannaki, Sophia; Lee, Sandra; DiNardo, James A
2018-05-18
An increasing number of surgical and nonsurgical procedures are being performed on an ambulatory basis in children. Analysis of a large group of pediatric patients with congenital heart disease undergoing ambulatory procedures has not been undertaken. The objective of this study was to characterize the profile of children with congenital heart disease who underwent noncardiac procedures on an ambulatory basis at our institution, to determine the incidence of cardiovascular and respiratory adverse events, and to determine the risk factors for unscheduled hospital admission. This is a retrospective study of children with congenital heart disease who underwent noncardiac procedures on an ambulatory basis in a single center. Using the electronic preoperative anesthesia evaluation form, we identified 3010 patients with congenital heart disease who underwent noncardiac procedures, of which 1028 (34.1%) were scheduled to occur on an ambulatory basis. Demographic, echocardiographic and functional status data, cardiovascular and respiratory adverse events, and reasons for postprocedure admission were recorded. Univariable analysis was conducted. The unplanned hospital admission rate was 2.7%, and univariable analysis demonstrated that performance of an echocardiogram within 6 months of the procedure and procedures performed in radiology were associated with postoperative admission. Cardiovascular adverse event incidence was 3.9%. Respiratory adverse event incidence was 1.8%. Ambulatory, noncomplex procedures can be performed in pediatric patients with congenital heart disease and good functional status with a relatively low unanticipated hospital admission rate. © 2018 John Wiley & Sons Ltd.
Functional analysis screening for multiple topographies of problem behavior.
Bell, Marlesha C; Fahmie, Tara A
2018-04-23
The current study evaluated a screening procedure for multiple topographies of problem behavior in the context of an ongoing functional analysis. Experimenters analyzed the function of a topography of primary concern while collecting data on topographies of secondary concern. We used visual analysis to predict the function of secondary topographies and a subsequent functional analysis to test those predictions. Results showed that a general function was accurately predicted for five of six (83%) secondary topographies. A specific function was predicted and supported for a subset of these topographies. The experimenters discuss the implication of these results for clinicians who have limited time for functional assessment. © 2018 Society for the Experimental Analysis of Behavior.
Mixture-based gatekeeping procedures in adaptive clinical trials.
Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji
2018-01-01
Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
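The combination-function device can be illustrated with the common inverse-normal rule for two stages, where the weights must be fixed before any stage-2 adaptation. This is a generic sketch, not the paper's mixture-based gatekeeping machinery.

```python
import numpy as np
from scipy.stats import norm

def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
    """Combine stage-wise p-values with pre-specified weights (w1 + w2 = 1).
    Validity is preserved under data-driven stage-2 changes (e.g., a sample
    size increase) precisely because the weights are fixed in advance."""
    z = np.sqrt(w1) * norm.isf(p1) + np.sqrt(w2) * norm.isf(p2)
    return norm.sf(z)

# A hypothesis is rejected only if its combined p-value clears the
# gatekeeping-adjusted significance level (here shown without adjustment).
print(inverse_normal_combination(0.04, 0.03))   # ~ 0.005
```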
A Functional Varying-Coefficient Single-Index Model for Functional Response Data
Li, Jialiang; Huang, Chao; Zhu, Hongtu
2016-01-01
Motivated by the analysis of imaging data, we propose a novel functional varying-coefficient single index model (FVCSIM) to carry out the regression analysis of functional response data on a set of covariates of interest. FVCSIM represents a new extension of varying-coefficient single index models for scalar responses collected from cross-sectional and longitudinal studies. An efficient estimation procedure is developed to iteratively estimate varying coefficient functions, link functions, index parameter vectors, and the covariance function of individual functions. We systematically examine the asymptotic properties of all estimators including the weak convergence of the estimated varying coefficient functions, the asymptotic distribution of the estimated index parameter vectors, and the uniform convergence rate of the estimated covariance function and their spectrum. Simulation studies are carried out to assess the finite-sample performance of the proposed procedure. We apply FVCSIM to investigating the development of white matter diffusivities along the corpus callosum skeleton obtained from Alzheimer’s Disease Neuroimaging Initiative (ADNI) study. PMID:29200540
Training Head Start Teachers to Conduct Trial-Based Functional Analysis of Challenging Behavior
ERIC Educational Resources Information Center
Rispoli, Mandy; Burke, Mack D.; Hatton, Heather; Ninci, Jennifer; Zaini, Samar; Sanchez, Lisa
2015-01-01
Trial-based functional analysis (TBFA) is a procedure for experimentally identifying the function of challenging behavior within applied settings. The purpose of this study was to examine the effects of a TBFA teacher-training package in the context of two Head Start centers implementing programwide positive behavior support (PWPBS). Four Head…
Inverse Thermal Analysis of Titanium GTA Welds Using Multiple Constraints
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.; Shabaev, A.; Huang, L.
2015-06-01
Inverse thermal analysis of titanium gas-tungsten-arc welds using multiple constraint conditions is presented. This analysis employs a methodology that is in terms of numerical-analytical basis functions for inverse thermal analysis of steady-state energy deposition in plate structures. The results of this type of analysis provide parametric representations of weld temperature histories that can be adopted as input data to various types of computational procedures, such as those for prediction of solid-state phase transformations. In addition, these temperature histories can be used to construct parametric function representations for inverse thermal analysis of welds corresponding to other process parameters or welding processes whose process conditions are within similar regimes. The present study applies an inverse thermal analysis procedure that provides for the inclusion of constraint conditions associated with both solidification and phase transformation boundaries.
A Framework for Creating a Function-based Design Tool for Failure Mode Identification
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Knowledge of potential failure modes during design is critical for prevention of failures. Industries currently use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, or Failure Modes, Effects and Criticality Analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed, there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology, which acts as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration, as well as enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.
Simple procedure for phase-space measurement and entanglement validation
NASA Astrophysics Data System (ADS)
Rundle, R. P.; Mills, P. W.; Tilma, Todd; Samson, J. H.; Everitt, M. J.
2017-08-01
It has recently been shown that it is possible to represent the complete quantum state of any system as a phase-space quasiprobability distribution (Wigner function) [Phys. Rev. Lett. 117, 180401 (2016), 10.1103/PhysRevLett.117.180401]. Such functions take the form of expectation values of an observable that has a direct analogy to displaced parity operators. In this work we give a procedure for the measurement of the Wigner function that should be applicable to any quantum system. We have applied our procedure to IBM's Quantum Experience five-qubit quantum processor to demonstrate that we can measure and generate the Wigner functions of two different Bell states as well as the five-qubit Greenberger-Horne-Zeilinger state. Because Wigner functions for spin systems are not unique, we define, compare, and contrast two distinct examples. We show how the use of these Wigner functions leads to an optimal method for quantum state analysis especially in the situation where specific characteristic features are of particular interest (such as for spin Schrödinger cat states). Furthermore we show that this analysis leads to straightforward, and potentially very efficient, entanglement test and state characterization methods.
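For a single qubit, the displaced-parity construction reduces to the expectation of a rotated kernel. The sketch below assumes the standard spin-1/2 kernel Δ(θ, φ) = (I + √3 n·σ)/2 used in this line of work; treat that normalization as an assumption rather than the paper's exact convention.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def wigner_qubit(rho, theta, phi):
    """W(theta, phi) = Tr[rho * Delta(theta, phi)], with the rotated parity
    kernel Delta = (I + sqrt(3) * n.sigma) / 2 for spin-1/2."""
    n_sigma = (np.sin(theta) * np.cos(phi) * sx
               + np.sin(theta) * np.sin(phi) * sy
               + np.cos(theta) * sz)
    delta = 0.5 * (I2 + np.sqrt(3.0) * n_sigma)
    return np.real(np.trace(rho @ delta))

# |0> state: positive peak at the north pole, negative region at the south,
# the characteristic-feature reading the abstract refers to.
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
for theta in (0.0, np.pi / 2, np.pi):
    print(theta, wigner_qubit(rho0, theta, 0.0))   # ~1.37, 0.5, ~-0.37
```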
An evaluation of generalization of mands during functional communication training.
Falcomata, Terry S; Wacker, David P; Ringdahl, Joel E; Vinquist, Kelly; Dutt, Anuradha
2013-01-01
The primary purpose of this study was to evaluate the generalization of mands during functional communication training (FCT) and sign language training across functional contexts (i.e., positive reinforcement, negative reinforcement). A secondary purpose was to evaluate a training procedure based on stimulus control to teach manual signs. During the treatment evaluation, we implemented sign language training in 1 functional context (e.g., positive reinforcement by attention) while continuing the functional analysis conditions in 2 other contexts (e.g., positive reinforcement by tangible item; negative reinforcement by escape). During the generalization evaluation, we tested for the generalization of trained mands across functional contexts (i.e., positive reinforcement; negative reinforcement) by implementing extinction in the 2 nontarget contexts. The results suggested that the stimulus control training procedure effectively taught manual signs and treated destructive behavior. Specific patterns of generalization of trained mands and destructive behavior also were observed. © Society for the Experimental Analysis of Behavior.
Functional Analysis and Treatment of Noncompliance by Preschool Children
ERIC Educational Resources Information Center
Wilder, David A.; Harris, Carelle; Reagan, Renee; Rasey, Amy
2007-01-01
A functional analysis showed that noncompliance occurred most often for 2 preschoolers when it resulted in termination of a preferred activity, suggesting that noncompliance was maintained by positive reinforcement. A differential reinforcement procedure, which involved contingent access to coupons that could be exchanged for uninterrupted access…
Sun, Jianguo; Feng, Yanqin; Zhao, Hui
2015-01-01
Interval-censored failure time data occur in many fields, including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). One drawback of most of these procedures, however, is that they require estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not require estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study indicates that they work well in practical situations.
Cosmetic surgery procedures as luxury goods: measuring price and demand in facial plastic surgery.
Alsarraf, Ramsey; Alsarraf, Nicole W; Larrabee, Wayne F; Johnson, Calvin M
2002-01-01
To evaluate the relationship between cosmetic facial plastic surgery procedure price and demand, and to test the hypothesis that these procedures function as luxury goods in the marketplace, with an upward-sloping demand curve. Data were derived from a survey sent to every (N = 1727) active fellow, member, or associate of the American Academy of Facial Plastic and Reconstructive Surgery, assessing the costs and frequency of 4 common cosmetic facial plastic surgery procedures (face-lift, brow-lift, blepharoplasty, and rhinoplasty) for 1999 and 1989. An economic analysis was performed to assess the relationship of price and demand for these procedures. A significant association was found between increasing surgeons' fees and total charges for cosmetic facial plastic surgery procedures and increasing demand for these procedures, as measured by their annual frequency (P=.003). After a multiple regression analysis correcting for confounding variables, this association of increased price with increased demand holds for each of the 4 procedures studied, across all US regions, and for both periods surveyed. Cosmetic facial plastic surgery procedures do appear to function as luxury goods in the marketplace, with an upward-sloping demand curve. This stands in contrast to traditional goods, for which demand typically declines as price increases. It appears that economic methods can be used to evaluate cosmetic procedure trends; however, these methods must be founded on the appropriate economic theory.
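In its simplest form, the economic test described above reduces to estimating a price elasticity of demand and checking its sign. The sketch below, with synthetic numbers rather than the survey data, fits a log-log regression; a positive slope indicates the upward-sloping demand curve characteristic of luxury goods.

```python
import numpy as np

# Hypothetical (price, annual procedure count) pairs, e.g., per-region averages.
price = np.array([4200., 4800., 5500., 6100., 7000.])
demand = np.array([110., 128., 150., 171., 205.])

# Log-log OLS: log(Q) = a + e*log(P); the slope e is the price elasticity.
e, a = np.polyfit(np.log(price), np.log(demand), 1)
print(f"elasticity = {e:.2f}")  # e > 0 -> upward-sloping demand (luxury-good pattern)
```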
A Functional Analysis of Non-Vocal Verbal Behavior of a Young Child with Autism
ERIC Educational Resources Information Center
Normand, M. P.; Severtson, E. S.; Beavers, G. A.
2008-01-01
The functions of an American Sign Language response were experimentally evaluated with a young boy diagnosed with autism. A functional analysis procedure based on that reported by Lerman et al. (2005) was used to evaluate whether the target sign response would occur under mand, tact, mimetic, or control conditions. The target sign was observed…
Model-Free Feature Screening for Ultrahigh Dimensional Discriminant Analysis
Cui, Hengjian; Li, Runze
2014-01-01
This work is concerned with marginal sure independence feature screening for ultra-high dimensional discriminant analysis. The response variable is categorical in discriminant analysis. This enables us to use conditional distribution function to construct a new index for feature screening. In this paper, we propose a marginal feature screening procedure based on empirical conditional distribution function. We establish the sure screening and ranking consistency properties for the proposed procedure without assuming any moment condition on the predictors. The proposed procedure enjoys several appealing merits. First, it is model-free in that its implementation does not require specification of a regression model. Second, it is robust to heavy-tailed distributions of predictors and the presence of potential outliers. Third, it allows the categorical response having a diverging number of classes in the order of O(nκ) with some κ ≥ 0. We assess the finite sample property of the proposed procedure by Monte Carlo simulation studies and numerical comparison. We further illustrate the proposed methodology by empirical analyses of two real-life data sets. PMID:26392643
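As an illustration of screening with empirical conditional distribution functions, the following sketch computes a simplified version of the MV-type index for a single feature: the class-proportion-weighted mean squared deviation of the class-conditional empirical CDF from the pooled CDF. The simulated data are assumptions for demonstration only, not the paper's exact estimator.

```python
import numpy as np

def mv_index(x, y):
    """Empirical MV(X|Y)-style screening index:
    sum_r p_r * mean_x[(F_r(x) - F(x))^2], with the CDFs evaluated
    at the observed sample points. Model-free and moment-free."""
    n = len(x)
    grid = x  # evaluate CDFs at the observations themselves
    F_all = np.array([np.mean(x <= t) for t in grid])
    mv = 0.0
    for r in np.unique(y):
        xr = x[y == r]
        F_r = np.array([np.mean(xr <= t) for t in grid])
        mv += (len(xr) / n) * np.mean((F_r - F_all) ** 2)
    return mv

rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=300)                      # categorical response, 3 classes
x_active = rng.normal(loc=y.astype(float), size=300)  # class-dependent feature
x_noise = rng.standard_cauchy(300)                    # heavy-tailed irrelevant feature
print(mv_index(x_active, y), mv_index(x_noise, y))    # active feature scores higher
```

Ranking all candidate features by this index and keeping the top few implements the marginal screening step; the heavy-tailed noise feature illustrates the robustness the abstract claims.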
The computational structural mechanics testbed procedures manual
NASA Technical Reports Server (NTRS)
Stewart, Caroline B. (Compiler)
1991-01-01
The purpose of this manual is to document the standard high level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure including its function, commands, data interface, and use is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis in the CSM Testbed User's Manual and the CSM Testbed Data Library Description.
Correlation functional in screened-exchange density functional theory procedures.
Chan, Bun; Kawashima, Yukio; Hirao, Kimihiko
2017-10-15
In the present study, we have explored several prospects for the further development of screened-exchange density functional theory (SX-DFT) procedures. Using the performance of HSE06 as our measure, we find that the use of alternative correlation functionals (as opposed to PBEc in HSE06) also yields adequate results for a diverse set of thermochemical properties. We have further examined the performance of new SX-DFT procedures (termed HSEB-type methods) that comprise the HSEx exchange and a (near-optimal) reparametrized B97c (cOS,0 = cSS,0 = 1, cOS,1 = -1.5, cOS,2 = -0.644, cSS,1 = -0.5, and cSS,2 = 1.10) correlation functional. The different variants of HSEB all perform comparably to or slightly better than the original HSE-type procedures. These results, together with our fundamental analysis of correlation functionals, point toward various directions for advancing SX-DFT methods. © 2017 Wiley Periodicals, Inc.
Functional Analyses and Treatment of Precursor Behavior
Najdowski, Adel C; Wallace, Michele D; Ellsworth, Carrie L; MacAleese, Alicia N; Cleveland, Jackie M
2008-01-01
Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe problem behavior (precursor behavior) and evaluated treatments based on the outcomes of the functional analyses of precursor behavior. Responding for all participants was differentiated during the functional analyses, and individualized treatments eliminated precursor behavior. These results suggest that functional analysis of precursor behavior may offer an alternative, indirect method to assess the operant function of severe problem behavior. PMID:18468282
Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K
2016-05-01
The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis; it describes the expected remaining lifetime of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help identify factors, such as age and gender, that influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd.
Risk prediction for myocardial infarction via generalized functional regression models.
Ieva, Francesca; Paganoni, Anna M
2016-08-01
In this paper, we propose a generalized functional linear regression model for a binary outcome indicating the presence/absence of a cardiac disease with multivariate functional data among the relevant predictors. In particular, the motivating aim is the analysis of electrocardiographic traces of patients whose pre-hospital electrocardiogram (ECG) has been sent to the 118 Dispatch Center of Milan (the Italian toll-free number for emergencies) by life support personnel of the basic rescue units. The statistical analysis starts with preprocessing of the ECGs, treated as multivariate functional data. The signals are reconstructed from noisy observations, and the biological variability is then removed by a nonlinear registration procedure based on landmarks. To perform a data-driven dimension reduction, a multivariate functional principal component analysis is carried out on the variance-covariance matrix of the reconstructed and registered ECGs and their first derivatives. We use the scores of the principal component decomposition as covariates in a generalized linear model to predict the presence of the disease in a new patient. Hence, a new semi-automatic diagnostic procedure is proposed to estimate the risk of infarction (in the case of interest, the probability of being affected by Left Bundle Branch Block). The performance of this classification method is evaluated and compared with other methods proposed in the literature. Finally, the robustness of the procedure is checked via leave-j-out techniques. © The Author(s) 2013.
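A minimal sketch of the pipeline's final two stages follows: with densely sampled, registered curves, functional PCA reduces to ordinary PCA on the discretized signals, and the resulting scores enter a logistic (generalized linear) model. The synthetic curves and model settings are illustrative assumptions, not the ECG data or the authors' exact preprocessing.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, t = 200, 120                       # 200 patients, 120 samples per registered curve
time = np.linspace(0, 1, t)
labels = rng.integers(0, 2, n)        # 1 = disease present, 0 = absent (synthetic)
# Smooth curves whose shape depends weakly on the label
curves = (np.sin(2 * np.pi * np.outer(np.ones(n), time))
          + 0.4 * labels[:, None] * np.cos(4 * np.pi * time)
          + 0.2 * rng.standard_normal((n, t)))

# "FPCA" on densely sampled curves reduces to PCA on the discretized data
pca = PCA(n_components=5)
scores = pca.fit_transform(curves)    # per-patient principal component scores

# The scores feed a generalized linear model (logistic link) for disease risk
clf = LogisticRegression().fit(scores, labels)
print("in-sample risk of first patient:", clf.predict_proba(scores[:1])[0, 1])
```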
ERIC Educational Resources Information Center
Olive, Melissa L.; Lang, Russell B.; Davis, Tonya N.
2008-01-01
The purpose of this study was to examine the effects of Functional Communication Training (FCT) and a Voice Output Communication Aid (VOCA) on the challenging behavior and language development of a 4-year-old girl with autism spectrum disorder. The participant's mother implemented modified functional analysis (FA) and intervention procedures in…
Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording
ERIC Educational Resources Information Center
Mayer, Kimberly L.; DiGennaro Reed, Florence D.
2013-01-01
Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…
Detection of laryngeal function using speech and electroglottographic data.
Childers, D G; Bae, K S
1992-01-01
The purpose of this research was to develop quantitative measures for the assessment of laryngeal function using speech and electroglottographic (EGG) data. We developed two procedures for the detection of laryngeal pathology: 1) a spectral distortion measure using pitch-synchronous and asynchronous methods with linear predictive coding (LPC) vectors and vector quantization (VQ), and 2) analysis of the EGG signal using time-interval and amplitude-difference measures. The VQ procedure was conjectured to circumvent the need to estimate the glottal volume velocity waveform by inverse filtering techniques. The EGG procedure evaluates data that are "nearly" a direct measure of vocal fold vibratory motion and thus was conjectured to offer the potential for an excellent assessment of laryngeal function. A threshold-based procedure gave 75.9% and 69.0% probabilities of detecting pathology using procedures 1) and 2), respectively, for 29 patients with pathological voices and 52 normal subjects. The false-alarm probability was 9.6% for the normal subjects.
Failure Mode Identification Through Clustering Analysis
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Research has shown that nearly 80% of costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as FMEA (Failure Modes and Effects Analysis), FMECA (Failure Modes, Effects and Criticality Analysis) and FTA (Fault Tree Analysis)) and design of experiments are used for quality control and for the detection of potential failure modes during the detail design stage or after product launch. Though all of these methods have their advantages, they do not indicate the predominant failures that a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, hypothesizing that similarities exist between different failure modes based on the functionality of the product or component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
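The core of the proposed methodology, grouping product functions by the similarity of their observed failure modes, can be sketched as clustering a function-by-failure-mode count matrix. The toy matrix and the k-means choice below are illustrative assumptions; the paper's own statistical clustering procedure may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

# Rows: elemental product functions; columns: counts of failure modes
# observed against each function in a (hypothetical) failure database.
functions = ["transfer torque", "store energy", "seal fluid", "guide motion"]
failure_counts = np.array([
    [12, 3, 0, 1],   # columns: fatigue, corrosion, leakage, wear
    [2, 9, 1, 0],
    [0, 2, 14, 1],
    [8, 1, 0, 11],
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(failure_counts)
for f, c in zip(functions, km.labels_):
    print(f"{f}: cluster {c}")  # functions in a cluster share dominant failure modes
```

A designer of a new product can then look up the cluster of each intended function to see which failure modes to prioritize in FMEA.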
Lageos assembly operation plan
NASA Technical Reports Server (NTRS)
Brueger, J.
1975-01-01
Guidelines and constraints procedures for LAGEOS assembly, operation, and design performance are given. Special attention was given to thermal, optical, and dynamic analysis and testing. The operation procedures illustrate the interrelation and sequence of tasks in a flow diagram. The diagram also includes quality assurance functions for verification of operation tasks.
Efficient sensitivity analysis and optimization of a helicopter rotor
NASA Technical Reports Server (NTRS)
Lim, Joon W.; Chopra, Inderjit
1989-01-01
Aeroelastic optimization of a system essentially consists of determining the optimum values of design variables that minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine the steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For a reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and aeroelastic stability constraints. For this, the derivatives of steady response, hub loads, and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis, and the constrained optimization code CONMIN.
Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.
ERIC Educational Resources Information Center
Muraki, Eiji
1999-01-01
Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…
ERIC Educational Resources Information Center
Machalicek, Wendy; O'Reilly, Mark F.; Rispoli, Mandy; Davis, Tonya; Lang, Russell; Franco, Jessica Hetlinger; Chan, Jeffrey M.
2010-01-01
We examined the effects of performance feedback provided via video tele-conferencing (VTC) on the acquisition of functional analysis procedures by six teachers. A university supervisor used VTC equipment (i.e., computers equipped with web cameras and Internet) to provide feedback to teachers learning to implement functional analysis conditions…
Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.
2009-10-01
The study was performed during 6 to 8 weeks. The experimental procedure consisted of collecting (i) psychological data (resilience, well-being, anxiety) and (ii) 12-h night urines. To assess cardiovascular regulation, spectral analysis of heart rate variability (HRV) is usually proposed as a method to assess vagal tone [7,2,8].
Analysis Of Laryngeal Biomechanics Of Deaf Speakers Utilizing High-Speed Cinematography
NASA Astrophysics Data System (ADS)
Metz, Dale E.; Whitehead, Robert L.
1982-02-01
Since the formalization of the myoelastic-aerodynamic theory of vocal fold vibration, it has been generally accepted that biomechanical and aerodynamic forces determine the nature of vocal fold vibration patterns, speaking fundamental frequency and vocal intensity. The speech of the deaf is frequently characterized by abnormal voice qualities and aberrant frequency and intensity variations suggesting mismanagement of the biomechanical and aerodynamic forces acting on the larynx. Unfortunately, efforts to remediate these abnormal laryngeal activities are frequently ineffective. It is reasonable to suggest that more effective remedial strategies could be developed if we had a better understanding of the underlying nature of the problems deaf persons experience when trying to control laryngeal functioning for speech purposes. Toward this end, we are employing high speed laryngeal filming procedures in conjunction with glottal impedance, respiratory kinematic and acoustical measurement procedures to assess abnormal laryngeal functioning of deaf speakers. All data are collected simultaneously and are time-locked to facilitate analysis of specific laryngeal events. This unique combination of instrumentation has provided important insights regarding laryngeal functioning of the deaf. For example, we have observed that deaf speakers may assume abnormal glottal configurations during phonation that prohibit normal laryngeal functioning and disturb upper airway dynamics. Also, normal vibratory patterns are frequently disturbed. Instrumentation, data collection protocols, analysis procedures and selected findings will be discussed.
Functional Analysis and Treatment of Multiply Controlled Inappropriate Mealtime Behavior
ERIC Educational Resources Information Center
Bachmeyer, Melanie H.; Piazza, Cathleen C.; Fredrick, Laura D.; Reed, Gregory K.; Rivas, Kristi D.; Kadey, Heather J.
2009-01-01
Functional analyses identified children whose inappropriate mealtime behavior was maintained by escape and adult attention. Function-based extinction procedures were tested individually and in combination. Attention extinction alone did not result in decreases in inappropriate mealtime behavior or a significant increase in acceptance. By contrast,…
ERIC Educational Resources Information Center
Lambert, Joseph M.; Lloyd, Blair P.; Staubitz, Johanna L.; Weaver, Emily S.; Jennings, Chelsea M.
2014-01-01
The trial-based functional analysis (FA) is a useful alternative to the traditional FA in contexts in which it is challenging to establish environmental control for extended periods of time. Previous researchers have demonstrated that others can be trained to conduct trial-based FAs with high procedural fidelity by providing a didactic…
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including prediction of energy demands and consumables requirements of the shuttle electric power system, along with parametric and special-case studies of the system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.
Aerodynamic design optimization using sensitivity analysis and computational fluid dynamics
NASA Technical Reports Server (NTRS)
Baysal, Oktay; Eleshaky, Mohamed E.
1991-01-01
A new and efficient method is presented for aerodynamic design optimization, which is based on a computational fluid dynamics (CFD)-sensitivity analysis algorithm. The method is applied to design a scramjet-afterbody configuration for an optimized axial thrust. The Euler equations are solved for the inviscid analysis of the flow, which in turn provides the objective function and the constraints. The CFD analysis is then coupled with the optimization procedure that uses a constrained minimization method. The sensitivity coefficients, i.e., gradients of the objective function and the constraints, needed for the optimization are obtained using a quasi-analytical method rather than the traditional brute-force method of finite difference approximations. During the one-dimensional search of the optimization procedure, an approximate flow analysis (predicted flow) based on a first-order Taylor series expansion is used to reduce the computational cost. Finally, the sensitivity of the optimum objective function to various design parameters, which are kept constant during the optimization, is computed to predict new optimum solutions. The flow analysis results for the demonstrative example are compared with experimental data. It is shown that the method is more efficient than the traditional methods.
Applying behavior analysis to clinical problems: review and analysis of habit reversal.
Miltenberger, R G; Fuqua, R W; Woods, D W
1998-01-01
This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583
Classical Testing in Functional Linear Models
Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab
2016-01-01
We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis, that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications. PMID:28955155
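A reduced Python sketch of the testing idea follows: project the functional covariate onto its first k principal components, fit the resulting standard linear model, and apply a classical F test of no association. The truncation level and simulated data are assumptions for illustration, not the paper's asymptotic regime.

```python
import numpy as np
from scipy import stats

def fpca_f_test(curves, y, k=3):
    """F test of 'no association' in a functional linear model:
    project curves onto their first k principal components and test
    the joint nullity of the score coefficients in ordinary regression."""
    n = len(y)
    xc = curves - curves.mean(axis=0)
    # PCA via SVD; scores = projections on the leading eigenfunctions
    u, s, vt = np.linalg.svd(xc, full_matrices=False)
    scores = xc @ vt[:k].T
    X = np.column_stack([np.ones(n), scores])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss1 = np.sum((y - X @ beta) ** 2)          # full model
    rss0 = np.sum((y - y.mean()) ** 2)          # null (intercept-only) model
    f = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
    return f, stats.f.sf(f, k, n - k - 1)       # statistic and p-value

rng = np.random.default_rng(2)
curves = rng.standard_normal((100, 50))         # 100 densely observed curves
y = curves[:, :10].mean(axis=1) + 0.5 * rng.standard_normal(100)
print(fpca_f_test(curves, y))                   # small p-value: association detected
```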
Véliz, Pedro L; Berra, Esperanza M; Jorna, Ana R
2015-07-01
INTRODUCTION Medical specialties' core curricula should take into account functions to be carried out, positions to be filled and populations to be served. The functions in the professional profile for specialty training of Cuban intensive care and emergency medicine specialists do not include all the activities that they actually perform in professional practice. OBJECTIVE Define the specific functions and procedural skills required of Cuban specialists in intensive care and emergency medicine. METHODS The study was conducted from April 2011 to September 2013. A three-stage methodological strategy was designed using qualitative techniques. By purposive maximum variation sampling, 82 professionals were selected. Documentary analysis and key informant criteria were used in the first stage. Two expert groups were formed in the second stage: one used various group techniques (focus group, oral and written brainstorming) and the second used a three-round Delphi method. In the final stage, a third group of experts was questioned in semistructured in-depth interviews, and a two-round Delphi method was employed to assess priorities. RESULTS Ultimately, 78 specific functions were defined: 47 (60.3%) patient care, 16 (20.5%) managerial, 6 (7.7%) teaching, and 9 (11.5%) research. Thirty-one procedural skills were identified. The specific functions and procedural skills defined relate to the profession's requirements in clinical care of the critically ill, management of patient services, teaching and research at the specialist's different occupational levels. CONCLUSIONS The specific functions and procedural skills required of intensive care and emergency medicine specialists were precisely identified by a scientific method. This product is key to improving the quality of teaching, research, administration and patient care in this specialty in Cuba. The specific functions and procedural skills identified are theoretical, practical, methodological and social contributions to inform future curricular reform and to help intensive care specialists enhance their performance in comprehensive patient care. KEYWORDS Intensive care, urgent care, emergency medicine, continuing medical education, curriculum, diagnostic techniques and procedures, medical residency, Cuba.
Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.
Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily
2018-05-01
Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.
Impact of Procedure-Related Complications on Long-term Islet Transplantation Outcome.
Caiazzo, Robert; Vantyghem, Marie-Christine; Raverdi, Violeta; Bonner, Caroline; Gmyr, Valery; Defrance, Frederique; Leroy, Clara; Sergent, Geraldine; Hubert, Thomas; Ernst, Oliver; Noel, Christian; Kerr-Conte, Julie; Pattou, François
2015-05-01
Pancreatic islet transplantation offers a promising biotherapy for the treatment of type 1 diabetes, but this procedure has met significant challenges over the years. One such challenge is to address why primary graft function still remains inconsistent after islet transplantation. Several variables have been shown to affect graft function, but the impact of procedure-related complications on primary and long-term graft function has not yet been explored. Twenty-six patients with established type 1 diabetes were included in this study. Each patient had two to three intraportal islet infusions to obtain 10,000 islet equivalents (IEQ)/kg body weight, for a total of 68 islet infusions. Islet transplantation consisted of three sequential fresh islet infusions within 3 months. Islet infusions were performed surgically or under ultrasound guidance, depending on patient morphology, availability of the radiology suite, and patient medical history. Prospective assessment of adverse events was recorded and graded using the "Common Terminology Criteria for Adverse Events in Trials of Adult Pancreatic Islet Transplantation." There were no deaths or patient dropouts. Early complications occurred in nine of 68 procedures. The β score 1 month after the last graft and the rate of optimal graft function (β score ≥7) were significantly lower in cases of procedure-related complications (P = 0.02, P = 0.03). Procedure-related complications negatively impacted graft function (P = 0.009) and were an independent predictive factor of long-term graft survival (P = 0.033) in multivariate analysis. Complications occurring during radiologic or surgical intraportal islet transplantation significantly impair primary graft function and graft survival regardless of their severity.
ERIC Educational Resources Information Center
Lloyd, Blair P.; Wehby, Joseph H.; Weaver, Emily S.; Goldman, Samantha E.; Harvey, Michelle N.; Sherlock, Daniel R.
2015-01-01
Although functional analysis (FA) remains the standard for identifying the function of problem behavior for students with developmental disabilities, traditional FA procedures are typically costly in terms of time, resources, and perceived risks. Preliminary research suggests that trial-based FA may be a less costly alternative. The purpose of…
Functional Analysis Identified Habit Reversal Components for the Treatment of Motor Tics
ERIC Educational Resources Information Center
Dufrene, Brad A.; Harpole, Lauren Lestremau; Sterling, Heather E.; Perry, Erin J.; Burton, Britney; Zoder-Martell, Kimberly
2013-01-01
This study included brief functional analyses and treatment for motor tics exhibited by two children with Tourette Syndrome. Brief functional analyses were conducted in an outpatient treatment center and results were used to develop individualized habit reversal procedures. Treatment data were collected in clinic for one child and in clinic and…
Ryu, Yasuhiko; Akagi, Yoshito; Yagi, Minoru; Sasatomi, Teruo; Kinugasa, Tetsushi; Yamaguchi, Keizo; Oka, Yousuke; Fukahori, Suguru; Shiratsuchi, Ichitaro; Yoshida, Takefumi; Gotanda, Yukito; Tanaka, Natsuki; Ohchi, Takafumi; Romeo, Kansakar; Shirouzu, Kazuo
2015-01-01
The aim of this study was to elucidate whether fecoflowmetry (FFM) can evaluate evacuative function in more detail than anorectal manometry, by comparing FFM and anorectal manometric findings with clinical questionnaires and the type of surgical procedure in patients who underwent anal-preserving surgery. Fifty-three patients who underwent anal-preserving surgery for low rectal cancer were enrolled. The relationships between FFM or manometric findings and the clinical questionnaires and the types of anal-preserving surgical procedure were evaluated. There were significant differences between FFM markers and the clinical questionnaire and the types of surgical procedure, whereas no significant relationship was observed between the manometric findings and the clinical questionnaire and the types of surgical procedure. FFM might be feasible and useful for the objective assessment of evacuative function and may be superior to manometry for patients undergoing anal-preserving surgery. PMID:25594637
Self-Critical, and Robust, Procedures for the Analysis of Multivariate Normal Data.
1982-06-01
Influence Functions. The influence function is the most important tool of qualitative robustness, since many other robustness characteristics of an estimator may be derived from it. The influence function characterizes the (asymptotic) response of an estimator to an additional observation, as a function of that observation. It is desirable that the influence function be bounded; it is also advantageous, in our opinion, if the influence function redescends to zero.
ERIC Educational Resources Information Center
Orr, Brandon
2013-01-01
This is a pilot study of a proposed model for examining the main and interactionist effects of achievement goal orientations on moral function and the role of perceived ability as a potential moderator of sport morality levels through cluster analysis procedures. One hundred three (103) elite athletes participating in Division I wrestling…
Bell, Marshall T; Puskas, Ferenc; Bennett, Daine T; Cleveland, Joseph C; Herson, Paco S; Mares, Joshua M; Meng, Xainzhong; Weyant, Michael J; Fullerton, David A; Brett Reece, T
2015-08-27
Detection of paraplegia following complex aortic intervention relies on crude evaluation of lower extremity strength, such as whether the patient can lift the legs or flex the ankles. Little attention has been given to possible long-term neurologic sequelae following these procedures in patients who appear functionally normal. We hypothesized that mice subjected to minimal ischemic time would have functional and histological changes despite the gross appearance of normal function. Male mice underwent 3 min of aortic occlusion (n=14) or sham surgery (n=4) via a median sternotomy. Neurologic function was graded by the Basso Motor Score (BMS) preoperatively and at 24-h intervals after reperfusion. Mice appearing functionally normal and sham mice were placed on a walking beam and recorded in high definition for single-frame motion analysis. After 96 hours, spinal cords were removed for histological analysis. Following 3 min of ischemia, functional outcomes were split evenly, with mice displaying either almost normal function (n=7) or near-complete paraplegia (n=7). Additionally, single-frame motion analysis revealed significant changes in gait. Histologically, there was a significant stepwise reduction of neuronal viability, with even the normal-function ischemic group demonstrating significant loss of neurons. Despite the appearance of normal function, temporary ischemia induced marked cytoarchitectural changes and neuronal degeneration. Furthermore, high-definition gait analysis revealed significant changes in gait and activity following thoracic aortic occlusion. These data suggest that patients undergoing such procedures, even with short ischemic times, may have spinal cord injury that is not evident clinically. Copyright © 2015 Elsevier B.V. All rights reserved.
Procedures to develop a computerized adaptive test to assess patient-reported physical functioning.
McCabe, Erin; Gross, Douglas P; Bulut, Okan
2018-06-07
The purpose of this paper is to demonstrate the procedures to develop and implement a computerized adaptive patient-reported outcome (PRO) measure using secondary analysis of a dataset and items from fixed-format legacy measures. We conducted secondary analysis of a dataset of responses from 1429 persons with work-related lower extremity impairment. We calibrated three measures of physical functioning on the same metric, based on item response theory (IRT). We evaluated efficiency and measurement precision of various computerized adaptive test (CAT) designs using computer simulations. IRT and confirmatory factor analyses support combining the items from the three scales for a CAT item bank of 31 items. The item parameters for IRT were calculated using the generalized partial credit model. CAT simulations show that reducing the test length from the full 31 items to a maximum test length of 8 items, or 20 items is possible without a significant loss of information (95, 99% correlation with legacy measure scores). We demonstrated feasibility and efficiency of using CAT for PRO measurement of physical functioning. The procedures we outlined are straightforward, and can be applied to other PRO measures. Additionally, we have included all the information necessary to implement the CAT of physical functioning in the electronic supplementary material of this paper.
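The sketch below illustrates the two ingredients such a CAT needs at each step under the generalized partial credit model: category probabilities for a candidate item and its Fisher information at the provisional ability estimate, with the most informative item administered next. The item bank and selection rule are illustrative assumptions; the study's simulations may use different stopping and selection criteria.

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """Generalized partial credit model category probabilities.
    b is the vector of step difficulties; category 0's cumulative sum is 0."""
    cum = np.concatenate([[0.0], np.cumsum(a * (theta - np.asarray(b)))])
    e = np.exp(cum - cum.max())
    return e / e.sum()

def item_information(theta, a, b, h=1e-4):
    """Fisher information I = sum_k P_k (d log P_k / d theta)^2,
    with the derivative taken by central differences."""
    p = gpcm_probs(theta, a, b)
    dlogp = (np.log(gpcm_probs(theta + h, a, b))
             - np.log(gpcm_probs(theta - h, a, b))) / (2 * h)
    return np.sum(p * dlogp ** 2)

# Hypothetical 31-item bank: (slope, step difficulties) per item
bank = [(0.8 + 0.05 * i, [-1.0 + 0.05 * i, 0.2, 1.1]) for i in range(31)]
theta_hat = 0.3  # provisional ability estimate after some responses
best = max(range(len(bank)), key=lambda i: item_information(theta_hat, *bank[i]))
print("next item to administer:", best)
```

Iterating this select-administer-reestimate loop until a precision or length criterion is met is what allows an 8-item adaptive test to approximate the 31-item score.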
Johnson, Thomas W; Mumford, Andrew D; Scott, Lauren J; Mundell, Stuart; Butler, Mark; Strange, Julian W; Rogers, Chris A; Reeves, Barnaby C; Baumbach, Andreas
2015-01-01
Rapid coronary recanalization following ST-elevation myocardial infarction (STEMI) requires effective anti-platelet and anti-thrombotic therapies. This study tested the impact of door to end of procedure ('door-to-end') time and baseline platelet activity on platelet inhibition within 24 hours post-STEMI. 108 patients, treated with prasugrel and procedural bivalirudin, underwent Multiplate® platelet function testing at baseline, 0, 1, 2 and 24 hours post-procedure. Major adverse cardiac events (MACE), bleeding and stent thrombosis (ST) were recorded. Baseline ADP activity was high (88.3U [71.8-109.0]), and procedural time and consequently bivalirudin infusion duration were short (median door-to-end time 55 minutes [40-70] and infusion duration 30 minutes [20-42]). Baseline ADP was observed to influence all subsequent measurements of ADP activity, whereas door-to-end time only influenced ADP immediately post-procedure. High residual platelet reactivity (HRPR, ADP > 46.8U) was observed in 75% of patients immediately post-procedure and persisted in 24% of patients at 2 hours. Five patients suffered in-hospital MACE (4.6%). Acute ST occurred in 4 patients, all were <120 minutes post-procedure and had HRPR. No significant bleeding was observed. In a post-hoc analysis, pre-procedural morphine use was associated with significantly higher ADP activity following intervention.
Token Reinforcement: A Review and Analysis
ERIC Educational Resources Information Center
Hackenberg, Timothy D.
2009-01-01
Token reinforcement procedures and concepts are reviewed and discussed in relation to general principles of behavior. The paper is divided into four main parts. Part I reviews and discusses previous research on token systems in relation to common behavioral functions--reinforcement, temporal organization, antecedent stimulus functions, and…
NASA Astrophysics Data System (ADS)
Bizyuk, S. A.; Istomin, Yu. P.; Dzhagarov, B. M.
2006-07-01
We have developed a procedure for analysis of the functional status of blood vessels in tumor tissues using computer-assisted color scanning of tumor slices and also for a quantitative assessment of the effectiveness of photoinduced destruction of tumor tissues in animal experiments. Its major advantage is direct determination of the size of the tumor necrosis zone. The procedure has been tested in an experiment on three strains of malignant tumors with different morphologies.
Performance optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.
1991-01-01
As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip-theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general-purpose optimization program CONMIN and approximate analyses. Sensitivity analyses consisting of derivatives of the objective function and constraints are carried out by forward finite differences. The procedure is applied to a test problem which is an analytical model of a wind tunnel model of a utility rotor blade.
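A forward finite-difference sensitivity analysis of the kind used here is easy to state in code: perturb each design variable in turn and difference the objective. The quadratic stand-in for hover horsepower below is purely an illustrative assumption; in the actual procedure the objective comes from the HOVT/CAMRAD analyses.

```python
import numpy as np

def forward_diff_grad(f, x, h=1e-6):
    """Forward finite-difference approximation of the gradient of the
    objective (or a constraint) with respect to the design variables."""
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - f0) / h
    return g

# Stand-in objective: hover horsepower as a smooth function of
# (taper initiation point, root chord, taper ratio, maximum twist).
def hover_hp(x):
    return 100 + 5 * (x[0] - 0.5) ** 2 + 3 * x[1] ** 2 + 2 * (x[2] - 1) ** 2 + x[3] ** 2

x0 = np.array([0.6, 0.3, 0.8, -0.1])
print(forward_diff_grad(hover_hp, x0))  # gradients feed a CONMIN-style optimizer
```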
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.
2017-08-01
An inverse thermal analysis of Alloy 690 laser and hybrid laser-GMA welds is presented that uses numerical-analytical basis functions and boundary constraints based on measured solidification cross sections. In particular, the inverse analysis procedure uses three-dimensional constraint conditions such that two-dimensional projections of calculated solidification boundaries are constrained to map within experimentally measured solidification cross sections. Temperature histories calculated by this analysis are input data for computational procedures that predict solid-state phase transformations and mechanical response. These temperature histories can be used for inverse thermal analysis of welds corresponding to other welding processes whose process conditions are within similar regimes.
Random analysis of bearing capacity of square footing using the LAS procedure
NASA Astrophysics Data System (ADS)
Kawa, Marek; Puła, Wojciech; Suska, Michał
2016-09-01
In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure is re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing-capacity boundary problem, with the strength parameters of the medium defined by the above procedure, are solved using FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and for cohesive-frictional soils. For the latter case, the friction angle and cohesion have been assumed to be independent random variables. For these two cases, random bearing-capacity results for the square footing have been obtained for fluctuation scales ranging from 0.5 m to 10 m, with 1000 Monte Carlo realizations performed each time. The results obtained allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to a reliability calculation is presented in the final part of the paper.
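Since the LAS generator itself is involved, the sketch below substitutes a plainly named stand-in: a one-dimensional stationary Gaussian field sampled through a Cholesky factor of an exponential covariance, fed into a Monte Carlo loop. The bearing_capacity function is a hypothetical placeholder for the FLAC3D solve, and all soil parameters are illustrative assumptions.

```python
import numpy as np

def gaussian_field_1d(z, mean, std, theta, rng):
    """Stand-in for LAS: a 1D stationary Gaussian field sampled via a
    Cholesky factor of an exponential covariance with fluctuation scale theta."""
    cov = std ** 2 * np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(z)))
    return mean + L @ rng.standard_normal(len(z))

def bearing_capacity(c_profile):
    """Hypothetical placeholder for the FLAC3D solve: a crude
    N_c * (depth-averaged cohesion) estimate for a purely cohesive soil."""
    return 6.17 * c_profile.mean()   # N_c ~ 6.17 assumed for a square footing

rng = np.random.default_rng(3)
z = np.linspace(0.0, 5.0, 32)                       # depth discretization [m]
samples = [bearing_capacity(gaussian_field_1d(z, 40.0, 8.0, 2.0, rng))
           for _ in range(1000)]                    # 1000 Monte Carlo realizations
print(np.mean(samples), np.std(samples))            # estimate mean, variance, pdf
```

Repeating the loop for several fluctuation scales reproduces the kind of scale study described in the abstract; a histogram of the samples estimates the probability density used in reliability calculations.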
Minjares-Fuentes, Rafael; Rodríguez-González, Víctor Manuel; González-Laredo, Rubén Francisco; Eim, Valeria; González-Centeno, María Reyes; Femenia, Antoni
2017-07-15
The main effects of different drying procedures: spray-, industrial freeze-, refractance window- and radiant zone-drying, on acemannan, the main bioactive polysaccharide from Aloe vera gel, were investigated. All the drying procedures caused a considerable decrease in the acemannan yield (∼40%). Degradation affected not only the backbone, as indicated by the important losses of (1→4)-linked mannose units, but also the side-chains formed by galactose. In addition, methylation analysis suggested the deacetylation of mannose units (>60%), which was confirmed by 1H NMR analysis. Interestingly, all these changes were reflected in the functional properties, which were severely affected. Thus, water retention capacity values of processed samples decreased ∼50%, and a reduction greater than 80% was determined in swelling and fat adsorption capacity values. Therefore, these important modifications should be taken into consideration, since not only the functionality but also the physiological effects attributed to many Aloe vera-based products could be affected. Copyright © 2017 Elsevier Ltd. All rights reserved.
Noncoding sequence classification based on wavelet transform analysis: part II
NASA Astrophysics Data System (ADS)
Paredes, O.; Strojnik, M.; Romo-Vázquez, R.; Vélez-Pérez, H.; Ranta, R.; Garcia-Torales, G.; Scholl, M. K.; Morales, J. A.
2017-09-01
DNA sequences in human genome can be divided into the coding and noncoding ones. We hypothesize that the characteristic periodicities of the noncoding sequences are related to their function. We describe the procedure to identify these characteristic periodicities using the wavelet analysis. Our results show that three groups of noncoding sequences, each one with different biological function, may be differentiated by their wavelet coefficients within specific frequency range.
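A minimal version of the periodicity-identification step can be sketched with the PyWavelets package: map the sequence to a numeric signal, take a continuous wavelet transform, and look for ridges in the mean power across scales. The G/C indicator mapping and the planted period are assumptions for demonstration, not the authors' exact preprocessing.

```python
import numpy as np
import pywt

# Numeric mapping of a (synthetic) noncoding sequence: indicator of G/C.
rng = np.random.default_rng(5)
seq = rng.choice(list("ACGT"), size=1024)
signal = np.isin(seq, ["G", "C"]).astype(float)
signal[::10] += 0.5                      # planted period-10 component

# Continuous wavelet transform; peaks in mean power across scales
# expose characteristic periodicities of the sequence.
scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0)
power = (np.abs(coeffs) ** 2).mean(axis=1)
print("dominant period ~", 1.0 / freqs[np.argmax(power)], "bases")
```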
Expanded function allied dental personnel and dental practice productivity and efficiency.
Beazoglou, Tryfon J; Chen, Lei; Lazar, Vickie F; Brown, L Jackson; Ray, Subhash C; Heffley, Dennis R; Berg, Rob; Bailit, Howard L
2012-08-01
This study examined the impact of expanded function allied dental personnel on the productivity and efficiency of general dental practices. Detailed practice financial and clinical data were obtained from a convenience sample of 154 general dental practices in Colorado. In this state, expanded function dental assistants can provide a wide range of reversible dental services/procedures, and dental hygienists can give local anesthesia. The survey identified practices that currently use expanded function allied dental personnel and the specific services/procedures delegated. Practice productivity was measured using patient visits, gross billings, and net income. Practice efficiency was assessed using a multivariate linear program, Data Envelopment Analysis. Sixty-four percent of the practices were found to use expanded function allied dental personnel, and on average they delegated 31.4 percent of delegatable services/procedures. Practices that used expanded function allied dental personnel treated more patients and had higher gross billings and net incomes than those practices that did not; the more services they delegated, the higher was the practice's productivity and efficiency. The effective use of expanded function allied dental personnel has the potential to substantially expand the capacity of general dental practices to treat more patients and to generate higher incomes for dental practices.
Kholeif, S A
2001-06-01
A new method belonging to the differential category for determining end points from potentiometric titration curves is presented. It uses a preprocessing step to find first-derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method applies generally to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves are also compared against equivalence-point methods such as those of Gran or Fortuin.
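The end-point location step can be reduced to a few lines: pick the three samples bracketing the extremum of the first derivative and return the vertex of the parabola through them, which has a closed-form (inverse parabolic interpolation) solution. This sketch omits the paper's nonlinear four-point preprocessing and uses a synthetic titration curve.

```python
import numpy as np

def endpoint_by_parabola(v, dpH):
    """Locate a titration end point as the extremum of the first derivative:
    fit a parabola through the three points bracketing the largest |dpH/dV|
    and return its vertex analytically (inverse parabolic interpolation)."""
    k = np.argmax(np.abs(dpH))
    (x0, x1, x2), (y0, y1, y2) = v[k - 1:k + 2], dpH[k - 1:k + 2]
    num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
    den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
    return x1 - 0.5 * num / den          # vertex = end-point volume

v = np.linspace(9.0, 11.0, 21)           # titrant volume [mL]
pH = 7 + 3 * np.tanh(4 * (v - 10.02))    # synthetic sigmoid titration curve
dpH = np.gradient(pH, v)                 # numerical first derivative
print(endpoint_by_parabola(v, dpH))      # close to the true 10.02 mL
```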
Huang, Shou-Guo; Chen, Bo; Lv, Dong; Zhang, Yong; Nie, Feng-Feng; Li, Wei; Lv, Yao; Zhao, Huan-Li; Liu, Hong-Mei
2017-01-01
Purpose Using a network meta-analysis approach, our study aims to develop a ranking of six surgical procedures, that is, Plate, titanium elastic nail (TEN), tension band wire (TBW), hook plate (HP), reconstruction plate (RP) and Knowles pin, by comparing post-surgery Constant shoulder scores in patients with clavicular fracture (CF). Methods A comprehensive search of electronic scientific literature databases was performed to retrieve publications investigating surgical procedures in CF; with stringent eligibility criteria, clinical experimental studies of high quality and relevance to our area of interest were selected for network meta-analysis. Statistical analyses were conducted using Stata 12.0. Results A total of 19 studies meeting our inclusion criteria were enrolled into our network meta-analysis, representing 1164 patients who had undergone surgical procedures for CF (TEN group = 240; Plate group = 164; TBW group = 180; RP group = 168; HP group = 245; Knowles pin group = 167). The network meta-analysis results revealed that RP significantly improved the Constant shoulder score in patients with CF when compared with TEN, and the post-operative Constant shoulder scores after Plate, TBW, HP, Knowles pin and TEN were similar, with no statistically significant differences. The relative treatment ranking based on predictive probabilities revealed that the surface under the cumulative ranking curve (SUCRA) value was highest for RP. Conclusion The current network meta-analysis suggests that RP may be the optimal surgical treatment among the six interventions for patients with CF and can improve the shoulder scores of these patients. Implications for Rehabilitation RP improves shoulder joint function after the surgical procedure. RP achieves stability with minimal complications after surgery. RP may be the optimal surgical treatment for rehabilitation of patients with CF.
Equation of state for detonation product gases
NASA Astrophysics Data System (ADS)
Nagayama, Kunihito; Kubota, Shiro
2003-03-01
A thermodynamic analysis procedure for the detonation product equation of state (EOS), together with an experimental data set of detonation velocity as a function of initial density, has been formulated. The Chapman-Jouguet (CJ) state [W. Ficket and W. C. Davis, Detonation: Theory and Experiment (University of California Press, Berkeley, 1979)] on the p-ν plane is found to be well approximated by the envelope function formed by the collection of Rayleigh lines for many different initial density states. The Jones-Stanyukovich-Manson relation [W. Ficket and W. C. Davis, Detonation: Theory and Experiment (University of California Press, Berkeley, 1979)] is used to estimate the error introduced by this approximation. Based on this analysis, a simplified integration method is presented for calculating the Grüneisen parameter along the CJ state curve for different initial densities, utilizing cylinder expansion data. The procedure gives a simple way of obtaining an EOS function compatible with the detonation velocity data. A theoretical analysis of the precision of the estimated EOS function has been performed. The EOS of the explosive pentaerythritol tetranitrate (PETN) is calculated and compared with experimental data such as CJ pressure data and cylinder expansion data.
Shen, Yi; Kern, Allison B.
2018-01-01
Individual differences in the recognition of monosyllabic words, either in isolation (NU6 test) or in sentence context (SPIN test), were investigated under the theoretical framework of the speech intelligibility index (SII). An adaptive psychophysical procedure, namely the quick-band-importance-function procedure, was developed to enable the fitting of the SII model to individual listeners. Using this procedure, the band importance function (i.e., the relative weights of speech information across the spectrum) and the link function relating the SII to recognition scores can be simultaneously estimated while requiring only 200 to 300 trials of testing. Octave-frequency band importance functions and link functions were estimated separately for NU6 and SPIN materials from 30 normal-hearing listeners who were naïve to speech recognition experiments. For each type of speech material, considerable individual differences in the spectral weights were observed in some but not all frequency regions. At frequencies where the greatest intersubject variability was found, the spectral weights were correlated between the two speech materials, suggesting that the variability in spectral weights reflected listener-originated factors. PMID:29532711
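The SII framework being fitted here has a simple computational core: an importance-weighted sum of per-band audibility, passed through a listener-specific link function to predict recognition. The logistic link and all weights below are hypothetical placeholders for the quantities the quick-band-importance-function procedure estimates per listener.

```python
import numpy as np

def sii(audibility, weights):
    """Speech intelligibility index: the importance-weighted sum of
    per-band audibility (both in [0, 1]; the weights sum to 1)."""
    return float(np.dot(weights, audibility))

def link(s, midpoint=0.3, slope=0.1):
    """Hypothetical logistic link from SII to proportion correct; in the
    adaptive procedure both the band weights and this curve are fitted
    to the individual listener's trial-by-trial responses."""
    return 1.0 / (1.0 + np.exp(-(s - midpoint) / slope))

# Octave-band importance weights (hypothetical listener estimate)
weights = np.array([0.05, 0.15, 0.30, 0.30, 0.15, 0.05])
audibility = np.array([1.0, 1.0, 0.8, 0.5, 0.2, 0.0])  # e.g., low-pass filtered speech
print(link(sii(audibility, weights)))                  # predicted recognition score
```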
Communications network design and costing model users manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.
Determining stocking, forest type and stand-size class from forest inventory data
Mark H. Hansen; Jerold T. Hahn
1992-01-01
This paper describes the procedures used by North Central Forest Experiment Station's Forest Inventory and Analysis Work Unit (NCFIA) in determining stocking, forest type, and stand-size class. The stocking procedure assigns a portion of the stocking to individual trees measured on NCFIA 10-point field plots. Stand size and forest type are determined as functions...
USDA-ARS?s Scientific Manuscript database
Fractions of soil organic matter (SOM) are usually extracted from soil by either physical (e.g., size, density) or chemical (e.g., base, acid) procedures. Integrated procedures that combine both of these types promise greater insights into SOM chemistry and function. For a corn-soybean soil in Iowa,...
Strickland, Justin C.; Feinstein, Max A.; Lacy, Ryan T.; Smith, Mark A.
2016-01-01
Impulsive choice is a diagnostic feature and/or complicating factor for several psychological disorders and may be examined in the laboratory using delay-discounting procedures. Recent investigators have proposed using quantitative measures of analysis to examine the behavioral processes contributing to impulsive choice. The purpose of this study was to examine the effects of physical activity (i.e., wheel running) on impulsive choice in a single-response, discrete-trial procedure using two quantitative methods of analysis. To this end, rats were assigned to physical activity or sedentary groups and trained to respond in a delay-discounting procedure. In this procedure, one lever always produced one food pellet immediately, whereas a second lever produced three food pellets after a 0, 10, 20, 40, or 80-second delay. Estimates of sensitivity to reinforcement amount and sensitivity to reinforcement delay were determined using (1) a simple linear analysis and (2) an analysis of logarithmically transformed response ratios. Both analyses revealed that physical activity decreased sensitivity to reinforcement amount and sensitivity to reinforcement delay. These findings indicate that (1) physical activity has significant but functionally opposing effects on the behavioral processes that contribute to impulsive choice and (2) both quantitative methods of analysis are appropriate for use in single-response, discrete-trial procedures. PMID:26964905
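The logarithmic response-ratio analysis mentioned above can be sketched as a straight-line fit: the log choice ratio declines with delay, the slope indexing sensitivity to delay and the intercept indexing sensitivity to amount (bias toward the larger reinforcer). The choice proportions below are hypothetical, not the study's data.

```python
import numpy as np

# Proportion of larger-later (3-pellet) choices at each delay for one group
delays = np.array([0., 10., 20., 40., 80.])
p_ll = np.array([0.95, 0.80, 0.60, 0.35, 0.15])     # hypothetical sedentary rats

# Log response-ratio analysis: log(B_LL / B_SS) = log(bias) - s_D * delay;
# the intercept reflects sensitivity to amount, the slope sensitivity to
# delay (a steeper slope indicates more impulsive choice).
log_ratio = np.log(p_ll / (1 - p_ll))
s_D, log_bias = np.polyfit(delays, log_ratio, 1)
print(f"delay sensitivity = {-s_D:.3f}, amount bias (log) = {log_bias:.3f}")
```

Comparing the fitted slope and intercept between exercising and sedentary groups separates the two behavioral processes the abstract describes.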
Salas, Rosa Ana; Pleite, Jorge
2013-01-01
We propose a specific procedure to compute the inductance of a toroidal ferrite core as a function of the excitation current. The study includes the linear, intermediate and saturation regions. The procedure combines the use of Finite Element Analysis in 2D and experimental measurements. Through the two dimensional (2D) procedure we are able to achieve convergence, a reduction of computational cost and equivalent results to those computed by three dimensional (3D) simulations. The validation is carried out by comparing 2D, 3D and experimental results. PMID:28809283
DIFAS: Differential Item Functioning Analysis System. Computer Program Exchange
ERIC Educational Resources Information Center
Penfield, Randall D.
2005-01-01
Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
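One of the classical procedures for dichotomous items alluded to above is the Mantel-Haenszel method. The sketch below computes the MH common odds ratio for a studied item, stratified on total score, and converts it to the ETS delta scale; the simulated responses and group sizes are invented.

    import numpy as np

    rng = np.random.default_rng(0)
    n_items = 20
    ref = (rng.uniform(size=(500, n_items)) < 0.7).astype(int)   # reference group
    foc = (rng.uniform(size=(500, n_items)) < 0.6).astype(int)   # focal group

    def mh_dif(ref, foc, item):
        """Mantel-Haenszel odds ratio for one item, stratified on total score."""
        tr, tf = ref.sum(axis=1), foc.sum(axis=1)
        num = den = 0.0
        for k in range(n_items + 1):
            r, f = ref[tr == k, item], foc[tf == k, item]
            n = len(r) + len(f)
            if len(r) == 0 or len(f) == 0:
                continue
            a, b = r.sum(), len(r) - r.sum()   # reference: right, wrong
            c, d = f.sum(), len(f) - f.sum()   # focal: right, wrong
            num += a * d / n
            den += b * c / n
        alpha = num / den
        return alpha, -2.35 * np.log(alpha)    # MH D-DIF on the ETS delta scale

    alpha, d_dif = mh_dif(ref, foc, item=0)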
NASA Astrophysics Data System (ADS)
Ma'rufi, Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
The aim of this study was to analyze mathematics teachers' teaching of the limit of algebraic functions in terms of differences in teaching experience. The analysis focused on teachers' Pedagogical Content Knowledge (PCK) of limits of algebraic functions as it relates to pedagogical knowledge. A teacher's PCK of the limit of an algebraic function is a type of specialized knowledge about how to teach the topic so that students can understand it. The subjects were two high school mathematics teachers who differed in teaching experience: one novice teacher (NT) and one experienced teacher (ET). Data were collected through classroom observation and video recordings of lessons and then examined using qualitative analysis. Pedagogical knowledge is defined here as a teacher's knowledge and understanding of planning and organizing instruction and of applying learning strategies. The results showed that, in teaching the limit of an algebraic function, subject NT tended to explain procedurally without giving reasons for the steps used, asked questions that were monotonous rather than guiding or probing, and used little variety in learning strategies. Subject ET, in contrast, gave students limited guidance and opportunities to find their own answers, exploited students' potential to answer questions, provided opportunities for students to interact and work in groups, and tended to combine conceptual and procedural explanations.
Characterization of technical surfaces by structure function analysis
NASA Astrophysics Data System (ADS)
Kalms, Michael; Kreis, Thomas; Bergmann, Ralf B.
2018-03-01
The structure function is a tool for characterizing technical surfaces that exhibits a number of advantages over Fourier-based analysis methods. It is thus optimally suited for analyzing the height distributions of surfaces measured by full-field non-contacting methods, and it is a useful means of extracting global or local criteria such as periodicities, waviness, lay, or roughness to analyze and evaluate technical surfaces. After defining the line and area structure functions and presenting effective procedures for their calculation, this paper gives examples using simulated and measured data of technical surfaces, including aircraft parts.
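For a sampled one-dimensional height profile z(x), the line structure function is commonly defined as S(tau) = mean of [z(x + tau) - z(x)]^2. A minimal numpy sketch with a synthetic profile follows; a real application would use measured height data.

    import numpy as np

    def line_structure_function(z, dx):
        """S(tau): mean squared height difference at each lag tau."""
        lags = np.arange(1, len(z) // 2)
        s = np.array([np.mean((z[l:] - z[:-l]) ** 2) for l in lags])
        return lags * dx, s

    # Synthetic profile: periodic surface plus roughness. S(tau) dips near
    # multiples of the period, exposing periodicity without a Fourier transform.
    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 10.0, 4000)
    z = 0.5 * np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=x.size)
    tau, s = line_structure_function(z, x[1] - x[0])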
Evaluation of flaws in carbon steel piping. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahoor, A.; Gamble, R.M.; Mehta, H.S.
1986-10-01
The objective of this program was to develop flaw evaluation procedures and allowable flaw sizes for ferritic piping used in light water reactor (LWR) power generation facilities. The program results provide relevant ASME Code groups with the information necessary to define flaw evaluation procedures, allowable flaw sizes, and their associated bases for Section XI of the code. Because there are several possible flaw-related failure modes for ferritic piping over the LWR operating temperature range, three analysis methods were employed to develop the evaluation procedures. These include limit load analysis for plastic collapse, elastic-plastic fracture mechanics (EPFM) analysis for ductile tearing, and linear elastic fracture mechanics (LEFM) analysis for nonductile crack extension. To ensure the appropriate analysis method is used in an evaluation, a step-by-step procedure also is provided to identify the relevant acceptance standard or procedure on a case-by-case basis. The tensile strength and toughness properties required to complete the flaw evaluation for any of the three analysis methods are included in the evaluation procedure. The flaw evaluation standards are provided in tabular form for the plastic collapse and ductile tearing modes, where the allowable part-through flaw depth is defined as a function of load and flaw length. For nonductile crack extension, linear elastic fracture mechanics analysis methods, similar to those in Appendix A of Section XI, are defined. Evaluation flaw sizes and procedures are developed for both longitudinal and circumferential flaw orientations and normal/upset and emergency/faulted operating conditions. The tables are based on margins on load of 2.77 and 1.39 for circumferential flaws and 3.0 and 1.5 for longitudinal flaws for normal/upset and emergency/faulted conditions, respectively.
Some computational techniques for estimating human operator describing functions
NASA Technical Reports Server (NTRS)
Levison, W. H.
1986-01-01
Computational procedures for improving the reliability of human operator describing functions are described. Special attention is given to the estimation of standard errors associated with mean operator gain and phase shift as computed from an ensemble of experimental trials. This analysis pertains to experiments using sum-of-sines forcing functions. Both open-loop and closed-loop measurement environments are considered.
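A sketch of the core computation under assumed values: with sum-of-sines forcing, the describing function is estimated only at the forcing frequencies, and ensemble statistics across trials yield standard errors for mean gain and phase. The simulated operator (gain 2 with a 0.2 s delay plus remnant noise) is purely illustrative.

    import numpy as np

    fs = 100.0
    t = np.arange(0.0, 60.0, 1 / fs)
    forcing = np.array([0.2, 0.5, 1.1, 2.3])      # sum-of-sines components (Hz)
    rng = np.random.default_rng(0)

    def describing_function(u, y):
        """Gain and phase of y relative to u at the forcing frequencies."""
        U, Y = np.fft.rfft(u), np.fft.rfft(y)
        f = np.fft.rfftfreq(u.size, 1 / fs)
        idx = [int(np.argmin(np.abs(f - fk))) for fk in forcing]
        h = Y[idx] / U[idx]
        return np.abs(h), np.angle(h)

    gains, phases = [], []
    for _ in range(8):                            # ensemble of trials
        u = sum(np.sin(2 * np.pi * fk * t + rng.uniform(0, 2 * np.pi))
                for fk in forcing)
        y = 2.0 * np.interp(t - 0.2, t, u) + 0.1 * rng.normal(size=t.size)
        g, p = describing_function(u, y)
        gains.append(g)
        phases.append(p)

    gains, phases = np.array(gains), np.array(phases)
    se_gain = gains.std(axis=0, ddof=1) / np.sqrt(len(gains))     # SE, mean gain
    se_phase = phases.std(axis=0, ddof=1) / np.sqrt(len(phases))  # SE, mean phase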
Report from a quality assurance program on patients undergoing the MILD procedure.
Durkin, Brian; Romeiser, Jamie; Shroyer, A Laurie W; Schiller, Robin; Bae, Jin; Davis, Raphael P; Peyster, Robert; Benveniste, Helene
2013-05-01
To characterize trends in pain and functional outcomes and identify risk factors in patients with lumbar spinal stenosis (LSS) and neurogenic claudication undergoing the "Minimally Invasive Lumbar Decompression" (MILD) procedure. Retrospective observational cohort study. Academic multidisciplinary pain center at Stony Brook Medicine. Patients undergoing the MILD procedure from October 2010 to November 2012. De-identified perioperative, pain and function related data for 50 patients undergoing MILD were extracted from the Center for Pain Management's quality assessment database. Data included numerical rating scale (NRS), symptom severity and physical function (Zurich Claudication Questionnaire), functional status (Oswestry Disability Index [ODI]), pain interference scores (National Institutes of Health Patient-Reported Outcomes Measurement Information System [PROMIS]), and patients' self-reported low back and lower extremity pain distribution. No MILD patient incurred procedure-related complications. Average NRS scores decreased postoperatively and 64.3% of patients reported less pain at 3 months. Clinically meaningful functional ODI improvements of at least 20% from baseline were present in 25% of the patients at 6 months. Preliminary analysis of changes in PROMIS scores at 3 months revealed that pre-MILD "severe" lumbar canal stenosis may be associated with high risk of "no improvement." No such impact was observed for NRS or ODI outcomes. Overall, pain is reduced and functional status improved in LSS patients following the MILD procedure at 3 and 6 months. Given the small sample size, it is not yet possible to identify patient subgroups at risk for "no improvement." Continued follow-up of longer-term outcomes appears warranted to develop evidence-based patient selection criteria. Wiley Periodicals, Inc.
Li, Hongliang; Dai, Jiewen; Si, Jiawen; Zhang, Jianfei; Wang, Minjiao; Shen, Steve Guofang; Yu, Hongbo
2015-01-01
Anterior maxillary segmental distraction (AMSD) is an effective surgical procedure in the treatment of maxillary hypoplasia secondary to cleft lip and palate. Its unique advantage of preserving velopharyngeal function makes this procedure widely applied. In this study, the application of AMSD was described and its long-term stability was explored. Eight patients with severe maxillary hypoplasia secondary to CLP were included in this study. They were treated with AMSD using rigid external distraction (RED) device. Cephalometric analysis was performed twice at three time points for evaluation: before surgery (T1), after distraction (T2), and 2 years after treatment (T3). One-way analysis of variance was used to assess the differences statistically. All the distractions completed smoothly, and maxilla was distracted efficiently. The value of SNA, NA-FH, Ptm-A, U1-PP, overjet and PP (ANS-PNS) increased significantly after the AMSD procedure (P < 0.05), with the mean overjet increased by 14.28 mm. However, comparison of cephalometric analysis between T2 and T3 showed no significant difference (P > 0.05). Changes of palatopharyngeal depth and soft palatal length were insignificant. AMSD with RED device provided an effective way to correct maxillary hypoplasia secondary to CLP, extended the palatal and arch length, avoided damage on velopharyngeal closure function and reduced the relapse rate. It is a promising and valuable technique in this potentially complicated procedure.
POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.
Peña, Edsel A; Habiger, Joshua D; Wu, Wensong
2011-02-01
Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions which are not limited to simply using the individual p-values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures could be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p-value-based procedures whose theoretical validity is contingent on each of these p-value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. Proposed procedures are relevant in the analysis of high-dimensional "large M, small n" data sets arising in the natural, physical, medical, economic and social sciences, whose generation and creation is accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.
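For reference, the two baseline procedures named above, which the paper improves on by exploiting differences in the powers of the individual tests, take only a few lines each; this sketch implements the standard forms, not the authors' power-enhanced decision functions.

    import numpy as np

    def sidak_reject(pvals, alpha=0.05):
        """Sidak FWER control: reject p_i <= 1 - (1 - alpha)^(1/m)."""
        p = np.asarray(pvals)
        return p <= 1 - (1 - alpha) ** (1 / p.size)

    def bh_reject(pvals, q=0.05):
        """Benjamini-Hochberg step-up procedure controlling FDR at level q."""
        p = np.asarray(pvals)
        m = p.size
        order = np.argsort(p)
        below = p[order] <= q * np.arange(1, m + 1) / m
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = int(np.max(np.nonzero(below)[0]))   # largest i: p_(i) <= q*i/m
            reject[order[: k + 1]] = True
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
    print(sidak_reject(pvals), bh_reject(pvals))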
High-frequency health data and spline functions.
Martín-Rodríguez, Gloria; Murillo-Fort, Carlos
2005-03-30
Seasonal variations are highly relevant for health service organization. In general, short-run movements of medical magnitudes are important features that managers in this field need for making adequate decisions, so analyzing the seasonal pattern in high-frequency health data is an appealing task. The aim of this paper is to propose procedures for analyzing the seasonal component of this kind of data by means of spline functions embedded into a structural model. In the proposed method, useful adaptations of the traditional spline formulation are developed, and the resulting procedures are capable of capturing periodic variations, whether deterministic or stochastic, in a parsimonious way. Finally, these methodological tools are applied to a series of daily emergency service demand in order to capture simultaneous seasonal variations with different periods.
Synthesis of aircraft structures using integrated design and analysis methods
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Goetz, R. C.
1978-01-01
Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate structural sizing and the associated active control system that is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
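A compact sketch of the relative-power-change computation; the band, the stimulus timing, and the noise-only trace are all stand-ins, and the authors' exact windowing and statistics are not given in the abstract.

    import numpy as np
    from scipy.signal import spectrogram

    fs = 1000.0
    rng = np.random.default_rng(0)
    x = rng.normal(size=int(5 * fs))        # stand-in for one 5 s SEEG epoch

    f, t, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
    band = (f >= 8) & (f <= 12)             # example frequency band
    power = Sxx[band].mean(axis=0)          # band power over time

    baseline = power[t < 2.0].mean()        # pre-stimulus reference period
    rel = 100.0 * (power - baseline) / baseline   # % change: ERD < 0, ERS > 0

Across an ensemble of epochs, the post-stimulus values of rel in each time-frequency bin would then be tested against baseline, which is the statistical step used for the functional classification of brain areas.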
Levitt, Heidi M; Pomerville, Andrew; Surace, Francisco I; Grabowski, Lauren M
2017-11-01
A metamethod study is a qualitative meta-analysis focused upon the methods and procedures used in a given research domain. These studies are rare in psychological research. They permit both the documentation of the informal standards within a field of research and recommendations for future work in that area. This paper presents a metamethod analysis of a substantial body of qualitative research that focused on clients' experiences in psychotherapy (109 studies). This review examined the ways that methodological integrity has been established across qualitative research methods. It identified the numbers of participants recruited and the form of data collection used (e.g., semistructured interviews, diaries). As well, it examined the types of checks employed to increase methodological integrity, such as participant counts, saturation, reflexivity techniques, participant feedback, or consensus and auditing processes. Central findings indicated that the researchers quite flexibly integrated procedures associated with one method into studies using other methods in order to strengthen their rigor. It appeared normative to adjust procedures to advance methodological integrity. These findings encourage manuscript reviewers to assess the function of procedures within a study rather than to require researchers to adhere to the set of procedures associated with a method. In addition, when epistemological approaches were mentioned they were overwhelmingly constructivist in nature, despite the increasing use of procedures traditionally associated with objectivist perspectives. It is recommended that future researchers do more to explicitly describe the functions of their procedures so that they are coherently situated within the epistemological approaches in use. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Dal Moro, Giancarlo; Moustafa, Sayed S. R.; Al-Arifi, Nassir S.
2018-01-01
Rayleigh waves often propagate according to complex mode excitation so that the proper identification and separation of specific modes can be quite difficult or, in some cases, just impossible. Furthermore, the analysis of a single component (i.e., an inversion procedure based on just one objective function) necessarily prevents solving the problems related to the non-uniqueness of the solution. To overcome these issues and define a holistic analysis of Rayleigh waves, we implemented a procedure to acquire data that are useful to define and efficiently invert the three objective functions defined from the three following "objects": the velocity spectra of the vertical- and radial-components and the Rayleigh-wave particle motion (RPM) frequency-offset data. Two possible implementations are presented. In the first case we consider classical multi-offset (and multi-component) data, while in a second possible approach we exploit the data recorded by a single three-component geophone at a fixed offset from the source. Given the simple field procedures, the method could be particularly useful for the unambiguous geotechnical exploration of large areas, where more complex acquisition procedures, based on the joint acquisition of Rayleigh and Love waves, would not be economically viable. After illustrating the different kinds of data acquisition and the data processing, the results of the proposed methodology are illustrated in a case study. Finally, a series of theoretical and practical aspects are discussed to clarify some issues involved in the overall procedure (data acquisition and processing).
ERIC Educational Resources Information Center
Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.
Current interest in the assessment of measurement equivalence emphasizes two methods of analysis: linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item functioning or IRT-based…
THE EDUCATIONAL INSTITUTION AS A SYSTEM--A PROPOSED GENERALIZED PROCEDURE FOR ANALYSIS.
ERIC Educational Resources Information Center
REISMAN, ARNOLD; TAFT, MARTIN I.
A unified approach to the analysis and synthesis of the functions and operations in educational institutions is presented. Systems analysis techniques used in other areas, such as CRAFT, PERT, CERBS, and operations research, are suggested as potentially adaptable for use in higher education. The major objective of a school is to allocate available…
Palazón, L; Navas, A
2017-06-01
Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm sediment fraction from the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the amounts of source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option (3), the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions from the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved with option (2), the two-step process of Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications.
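A minimal sketch of the two-step selection judged most reliable here, Kruskal-Wallis screening followed by discriminant function analysis, using invented tracer data:

    import numpy as np
    from scipy.stats import kruskal
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # Hypothetical dataset: 3 land-use source groups, 10 samples each, 8 tracers.
    shift = np.repeat(rng.normal(0, 2, size=(3, 8)), 10, axis=0)
    X = rng.normal(size=(30, 8)) + shift
    y = np.repeat([0, 1, 2], 10)

    # Step 1: Kruskal-Wallis H-test keeps tracers that differ between sources.
    keep = [j for j in range(X.shape[1])
            if kruskal(*[X[y == g, j] for g in (0, 1, 2)]).pvalue < 0.05]

    # Step 2: discriminant function analysis on the screened tracers yields
    # the composite fingerprint passed to the mixing model.
    dfa = LinearDiscriminantAnalysis().fit(X[:, keep], y)
    print("tracers kept:", keep, "accuracy:", dfa.score(X[:, keep], y))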
Marui, Akira; Saji, Yoshiaki; Nishina, Takeshi; Tadamura, Eiji; Kanao, Shotaro; Shimamoto, Takeshi; Sasahashi, Nozomu; Ikeda, Tadashi; Komeda, Masashi
2008-06-01
Left atrial geometry and mechanical functions exert a profound effect on left ventricular filling and overall cardiovascular performance. We sought to investigate the perioperative factors that influence left atrial geometry and mechanical functions after the Maze procedure in patients with refractory atrial fibrillation and left atrial enlargement. Seventy-four patients with atrial fibrillation and left atrial enlargement (diameter ≥ 60 mm) underwent the Maze procedure in association with mitral valve surgery. The maximum left atrial volume and left atrial mechanical functions (booster pump, reservoir, and conduit function [%]) were calculated from the left atrial volume-cardiac cycle curves obtained by magnetic resonance imaging. A stepwise multiple regression analysis was performed to determine the independent variables that influenced the postoperative left atrial geometry and function. The multivariate analysis showed that left atrial reduction surgery concomitant with the Maze procedure and the postoperative maintenance of sinus rhythm were the predominant independent variables for postoperative left atrial geometry and mechanical functions. Among the 58 patients who recovered sinus rhythm, the postoperative left atrial geometry and function were compared between patients with (VR group) and without (control group) left atrial volume reduction. At a mean follow-up period of 13.8 months, the sinus rhythm recovery rate was better (85% vs 68%, P < .05) in the VR group and the maximum left atrial volume was less (116 ± 25 mL vs 287 ± 73 mL, P < .001) than in the control group. The maximum left atrial volume decreased with time only in the VR group (reverse remodeling). Postoperative booster pump and reservoir function in the VR group were better than in the control group (25% ± 6% vs 11% ± 4% and 34% ± 7% vs 16% ± 4%, respectively, P < .001), whereas the conduit function in the VR group was lower than in the control group, indicating that the improvement of the booster pump and reservoir function compensated for the conduit function in left ventricular filling. Left atrial reduction concomitant with the Maze procedure helped restore both contraction (booster pump) and compliance (reservoir) of the left atrium and facilitated left atrial reverse remodeling. Left atrial volume reduction and postoperative maintenance of sinus rhythm may be desirable in patients with refractory AF and left atrial enlargement.
Test functions for three-dimensional control-volume mixed finite-element methods on irregular grids
Naff, R.L.; Russell, T.F.; Wilson, J.D.; ,; ,; ,; ,; ,
2000-01-01
Numerical methods based on unstructured grids, with irregular cells, usually require discrete shape functions to approximate the distribution of quantities across cells. For control-volume mixed finite-element methods, vector shape functions are used to approximate the distribution of velocities across cells and vector test functions are used to minimize the error associated with the numerical approximation scheme. For a logically cubic mesh, the lowest-order shape functions are chosen in a natural way to conserve intercell fluxes that vary linearly in logical space. Vector test functions, while somewhat restricted by the mapping into the logical reference cube, admit a wider class of possibilities. Ideally, an error minimization procedure to select the test function from an acceptable class of candidates would be the best procedure. Lacking such a procedure, we first investigate the effect of possible test functions on the pressure distribution over the control volume; specifically, we look for test functions that allow for the elimination of intermediate pressures on cell faces. From these results, we select three forms for the test function for use in a control-volume mixed method code and subject them to an error analysis for different forms of grid irregularity; errors are reported in terms of the discrete L2 norm of the velocity error. Of these three forms, one appears to produce optimal results for most forms of grid irregularity.
Lee, Sandra; Reddington, Elise; Koutsogiannaki, Sophia; Hernandez, Michael R; Odegard, Kirsten C; DiNardo, James A; Yuki, Koichi
2018-04-27
While mortality and adverse perioperative events after noncardiac surgery in children with a broad range of congenital cardiac lesions have been investigated using large multiinstitutional databases, to date single-center studies addressing adverse outcomes in children with congenital heart disease (CHD) undergoing noncardiac surgery have only included small numbers of patients with significant heart disease. The primary objective of this study was to determine the incidences of perioperative cardiovascular and respiratory events in a large cohort of patients from a single institution with a broad range of congenital cardiac lesions undergoing noncardiac procedures and to determine risk factors for these events. We identified 3010 CHD patients presenting for noncardiac procedures in our institution over a 5-year period. We collected demographic information, including procedure performed, cardiac diagnosis, ventricular function as assessed by echocardiogram within 6 months of the procedure, and classification of CHD into 3 groups (minor, major, or severe CHD) based on residual lesion burden and cardiovascular functional status. Characteristics related to conduct of anesthesia care were also collected. The primary outcome variables for our analysis were the incidences of intraoperative cardiovascular and respiratory events. Univariable and multivariable logistic regressions were used to determine risk factors for these 2 outcomes. The incidence of cardiovascular events was 11.5% and of respiratory events was 4.7%. Univariate analysis and multivariable analysis demonstrated that American Society of Anesthesiologists (≥3), emergency cases, major and severe CHD, single-ventricle physiology, ventricular dysfunction, orthopedic surgery, general surgery, neurosurgery, and pulmonary procedures were associated with perioperative cardiovascular events. Respiratory events were associated with American Society of Anesthesiologists (≥4) and otolaryngology, gastrointestinal, general surgery, and maxillofacial procedures. Intraoperative cardiovascular events and respiratory events in patients with CHD were relatively common. While cardiovascular events were highly associated with cardiovascular status, respiratory events were not associated with cardiovascular status.
The Generality of Interview-Informed Functional Analyses: Systematic Replications in School and Home
ERIC Educational Resources Information Center
Santiago, Joana L.; Hanley, Gregory P.; Moore, Keira; Jin, C. Sandy
2016-01-01
Behavioral interventions preceded by a functional analysis have been proven efficacious in treating severe problem behavior associated with autism. There is, however, a lack of research showing socially validated outcomes when assessment and treatment procedures are conducted by ecologically relevant individuals in typical settings. In this study,…
ERIC Educational Resources Information Center
Lewis, Virginia Vimpeny
2011-01-01
Number Concepts; Measurement; Geometry; Probability; Statistics; and Patterns, Functions and Algebra. Procedural Errors were further categorized into the following content categories: Computation; Measurement; Statistics; and Patterns, Functions, and Algebra. The results of the analysis showed the main sources of error for 6th, 7th, and 8th…
Self-Organizing Maps and Parton Distribution Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
K. Holcomb, Simonetta Liuti, D. Z. Perry
2011-05-01
We present a new method to extract parton distribution functions from high energy experimental data based on a specific type of neural network, Self-Organizing Maps. We illustrate the features of our new procedure that are particularly useful for an analysis directed at extracting generalized parton distributions from data. We show quantitative results of our initial analysis of the parton distribution functions from inclusive deep inelastic scattering.
Cheungpasitporn, Wisit; Thongprayoon, Charat; Brabec, Brady A; Edmonds, Peter J; O'Corragain, Oisin A; Erickson, Stephen B
2014-12-01
The reports on the efficacy of oral hydration treatment for the prevention of contrast-induced acute kidney injury (CIAKI) in elective radiological procedures and cardiac catheterization remain controversial. The objective of this meta-analysis was to assess the use of an oral hydration regimen for prevention of CIAKI. Comprehensive literature searches for randomized controlled trials (RCTs) of outpatient oral hydration treatment were performed using MEDLINE, EMBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials Systematic Reviews, and clinicaltrials.gov from inception until July 4, 2014. The primary outcome was the incidence of CIAKI. Six prospective RCTs were included in our analysis. Of 513 patients undergoing elective procedures with contrast exposures, 45 patients (8.8%) had CIAKI. Of 241 patients on an oral hydration regimen, 23 (9.5%) developed CIAKI. Of 272 patients on an intravenous (IV) fluid regimen, 22 (8.1%) had CIAKI. Study populations in all included studies had relatively normal kidney function to chronic kidney disease (CKD) stage 3. There was no significantly increased risk of CIAKI in the oral fluid regimen group compared to the IV fluid regimen group (RR = 0.94, 95% confidence interval, CI = 0.38-2.31). According to our analysis, there is no evidence that an oral fluid regimen is associated with a greater risk of CIAKI in patients undergoing elective procedures with contrast exposures compared with an IV fluid regimen. This finding suggests that the oral fluid regimen might be considered as a possible outpatient treatment option for CIAKI prevention in patients with normal to moderately reduced kidney function.
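The pooled risk ratio above comes from a standard meta-analytic combination of the per-trial 2x2 tables. Purely to illustrate the mechanics, the sketch below pools six invented trials with inverse-variance weighting on the log risk ratio; the counts are not those of the included RCTs.

    import numpy as np

    # Hypothetical per-trial counts: CIAKI events / patients in each arm.
    oral_e = np.array([4, 6, 3, 5, 2, 3]); oral_n = np.array([40, 45, 38, 42, 36, 40])
    iv_e   = np.array([3, 5, 4, 4, 3, 3]); iv_n   = np.array([45, 48, 40, 46, 45, 48])

    log_rr = np.log((oral_e / oral_n) / (iv_e / iv_n))
    var = 1 / oral_e - 1 / oral_n + 1 / iv_e - 1 / iv_n   # var of log RR per trial

    w = 1 / var                                           # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    rr, lo, hi = np.exp([pooled, pooled - 1.96 * se, pooled + 1.96 * se])
    print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")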
Protein arginine methylation: Cellular functions and methods of analysis.
Pahlich, Steffen; Zakaryan, Rouzanna P; Gehring, Heinz
2006-12-01
During the last few years, new members of the growing family of protein arginine methyltransferases (PRMTs) have been identified and the role of arginine methylation in manifold cellular processes like signaling, RNA processing, transcription, and subcellular transport has been extensively investigated. In this review, we describe recent methods and findings that have yielded new insights into the cellular functions of arginine-methylated proteins, and we evaluate the currently used procedures for the detection and analysis of arginine methylation.
NASA Astrophysics Data System (ADS)
Schmitt, Kara Anne
This research aims to show that strict adherence to procedures and rigid compliance with process in the US nuclear industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, rooted in people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation, and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged in thinking because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now put on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) model was created, verified, and validated through both peer review and application in real-world scenarios in active nuclear power plant simulators. These experiments supported the claim that strict adherence and rigid compliance to procedures may not increase safety, by studying the industry's propensity for following incorrect procedures and when doing so directly affects the outcome of safety or security of the plant. The findings of this research indicate that the younger generations of operators rely heavily on procedures, and the organizational pressures of required compliance may lead to incidents within the plant because operators feel pressured into following the rules and policy rather than performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill-of-the-craft matrices. The solution to the problems facing the industry includes in-depth, multiple-fault failure training that tests the operator's knowledge of the situation. This builds operator collaboration, competence, and confidence to know what to do, and when to do it, in response to an emergency situation. Strict adherence to procedures and rigid compliance with process may not prevent incidents or increase safety; building operators' fundamental skills of collaboration, competence, and confidence will.
Smoothing spline ANOVA frailty model for recurrent event data.
Du, Pang; Jiang, Yihua; Wang, Yuedong
2011-12-01
Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of parameter update and/or increasing the MCMC sample size along iterations. Model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate its use through the analysis of bladder tumor data. © 2011, The International Biometric Society.
NASA Astrophysics Data System (ADS)
Khondok, Piyoros; Sakulkalavek, Aparporn; Suwansukho, Kajpanya
2018-03-01
A simplified and powerful image processing procedure to separate paddy of the KHAW DOK MALI 105 (Thai jasmine rice) variety from paddy of the RD6 sticky rice variety is proposed. The procedure consists of image thresholding, image chain coding, and curve fitting using a polynomial function. From the fitting, three parameters of each grain (perimeter, area, and eccentricity) were calculated. Finally, overall separation parameters were determined using principal component analysis. The results show that these procedures can effectively separate the two varieties.
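A rough scikit-image rendering of this pipeline: thresholding, region extraction, the three shape parameters, then PCA. The random array stands in for a real grain photograph, and regionprops replaces the chain-coding and polynomial-fitting steps the authors describe.

    import numpy as np
    from skimage import filters, measure

    rng = np.random.default_rng(0)
    img = rng.uniform(size=(200, 200))          # placeholder for a paddy image

    binary = img > filters.threshold_otsu(img)  # image thresholding
    labels = measure.label(binary)              # connected grain regions

    # Perimeter, area, and eccentricity for each (non-trivial) region.
    feats = np.array([[p.perimeter, p.area, p.eccentricity]
                      for p in measure.regionprops(labels) if p.area >= 5])

    # Principal component analysis combines the three descriptors into a
    # single separation score per grain.
    z = (feats - feats.mean(axis=0)) / feats.std(axis=0)
    _, _, Vt = np.linalg.svd(z, full_matrices=False)
    pc1 = z @ Vt[0]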
Integrated flight/propulsion control - Subsystem specifications for performance
NASA Technical Reports Server (NTRS)
Neighbors, W. K.; Rock, Stephen M.
1993-01-01
A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.
The time series approach to short term load forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagan, M.T.; Behr, S.M.
The application of time series analysis methods to load forecasting is reviewed. It is shown that Box and Jenkins time series models, in particular, are well suited to this application. The logical and organized procedures for model development using the autocorrelation function make these models particularly attractive. One of the drawbacks of these models is the inability to accurately represent the nonlinear relationship between load and temperature. A simple procedure for overcoming this difficulty is introduced, and several Box and Jenkins models are compared with a forecasting procedure currently used by a utility company.
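A small statsmodels sketch of the Box-Jenkins workflow: the autocorrelation function guides model identification, and a seasonal model produces the short-term forecast. The simulated hourly load (daily cycle plus AR(1) noise) and the chosen orders are illustrative, and nothing here addresses the nonlinear load-temperature correction the authors introduce.

    import numpy as np
    from statsmodels.tsa.stattools import acf
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 24 * 60                                  # 60 days of hourly load
    noise = np.zeros(n)
    for i in range(1, n):                        # AR(1) disturbance
        noise[i] = 0.7 * noise[i - 1] + rng.normal()
    load = 100 + 20 * np.sin(2 * np.pi * np.arange(n) / 24) + noise

    r = acf(load, nlags=48)                      # identification: inspect the ACF
    model = ARIMA(load, order=(1, 0, 0), seasonal_order=(0, 1, 1, 24)).fit()
    forecast = model.forecast(steps=24)          # next-day hourly forecast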
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Peters, Jeanne M.
1989-01-01
A computational procedure is presented for the nonlinear dynamic analysis of unsymmetric structures on vector multiprocessor systems. The procedure is based on a novel hierarchical partitioning strategy in which the response of the unsymmetric structure is approximated by a combination of symmetric and antisymmetric response vectors (modes), each obtained by using only a fraction of the degrees of freedom of the original finite element model. The three key elements of the procedure, which result in a high degree of concurrency throughout the solution process, are: (1) a mixed (or primitive variable) formulation with independent shape functions for the different fields; (2) operator splitting or restructuring of the discrete equations at each time step to delineate the symmetric and antisymmetric vectors constituting the response; and (3) a two-level iterative process for generating the response of the structure. An assessment is made of the effectiveness of the procedure on the CRAY X-MP/4 computer.
Wave drag as the objective function in transonic fighter wing optimization
NASA Technical Reports Server (NTRS)
Phillips, P. S.
1984-01-01
The original computational method for determining wave drag in a three dimensional transonic analysis method was replaced by a wave drag formula based on the loss in momentum across an isentropic shock. This formula was used as the objective function in a numerical optimization procedure to reduce the wave drag of a fighter wing at transonic maneuver conditions. The optimization procedure minimized wave drag through modifications to the wing section contours defined by a wing profile shape function. A significant reduction in wave drag was achieved while maintaining a high lift coefficient. Comparisons of the pressure distributions for the initial and optimized wing geometries showed significant reductions in the leading-edge peaks and shock strength across the span.
Structure-function analysis of genetically defined neuronal populations.
Groh, Alexander; Krieger, Patrik
2013-10-01
Morphological and functional classification of individual neurons is a crucial aspect of the characterization of neuronal networks. Systematic structural and functional analysis of individual neurons is now possible using transgenic mice with genetically defined neurons that can be visualized in vivo or in brain slice preparations. Genetically defined neurons are useful for studying a particular class of neurons and also for more comprehensive studies of the neuronal content of a network. Specific subsets of neurons can be identified by fluorescence imaging of enhanced green fluorescent protein (eGFP) or another fluorophore expressed under the control of a cell-type-specific promoter. The advantages of such genetically defined neurons are not only their homogeneity and suitability for systematic descriptions of networks, but also their tremendous potential for cell-type-specific manipulation of neuronal networks in vivo. This article describes a selection of procedures for visualizing and studying the anatomy and physiology of genetically defined neurons in transgenic mice. We provide information about basic equipment, reagents, procedures, and analytical approaches for obtaining three-dimensional (3D) cell morphologies and determining the axonal input and output of genetically defined neurons. We exemplify with genetically labeled cortical neurons, but the procedures are applicable to other brain regions with little or no alterations.
A network of automatic atmospherics analyzer
NASA Technical Reports Server (NTRS)
Schaefer, J.; Volland, H.; Ingmann, P.; Eriksson, A. J.; Heydt, G.
1980-01-01
The design and function of an atmospheric analyzer which uses a computer are discussed. Mathematical models which show the method of measurement are presented. The data analysis and recording procedures of the analyzer are discussed.
Childers, A B; Walsh, B
1996-07-23
Preharvest food safety is essential for the protection of our food supply. The production and transport of livestock and poultry play an integral part in the safety of these food products. The goals of this safety assurance include freedom from pathogenic microorganisms, disease, and parasites, and from potentially harmful residues and physical hazards. Its functions should be based on hazard analysis and critical control points from producer to slaughter plant, with emphasis on prevention of identifiable hazards rather than on removal of contaminated products. The production goal is to minimize infection and ensure freedom from potentially harmful residues and physical hazards. The marketing goal is control of exposure to pathogens and stress. Both groups should have functional hazard analysis and critical control points management programs which include personnel training and certification of producers. These programs must cover production procedures, chemical usage, feeding, treatment practices, drug usage, assembly and transportation, and animal identification. Plans must use risk assessment principles, and the procedures must be defined. Other elements would include preslaughter certification, environmental protection, control of chemical hazards, live-animal drug-testing procedures, and identification of physical hazards.
Data and Analysis Center for Software.
1980-06-01
can make use of it in their day-to-day activities of developing, maintaining, and managing software. The bibliographic collection is composed of...which refer to development, design, or programming approaches which view a software system, component, or module in terms of its required or intended... practices" are also included in this group. PROCEDURES (1 keyword) Procedures is a term used ambiguously in the literature to refer to functions
The construction of control chart for PM10 functional data
NASA Astrophysics Data System (ADS)
Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd
2014-06-01
In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices that represent the daily PM10 curves was obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. These data were then used as a reference for monitoring future data. The procedure was applied to a seven-year (2004-2010) period of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormalities in the process system. In the case of the Klang station, the results showed that, with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.
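A bare-bones version of the charting idea, with FPCA scores as daily indices followed by a Hotelling-type T-squared chart on simulated curves; the authors' iterative reference-set construction and their exact control limits are not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)
    hours = np.arange(24)
    # Hypothetical smoothed daily PM10 curves: 100 in-control reference days.
    curves = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, (100, 24))

    mean_curve = curves.mean(axis=0)
    _, _, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
    scores = (curves - mean_curve) @ Vt[:2].T    # daily FPCA indices (2 PCs)

    # T^2 statistic per day against the covariance of the reference scores.
    cov_inv = np.linalg.inv(np.cov(scores.T))
    t2 = np.einsum("ij,jk,ik->i", scores, cov_inv, scores)
    limit = np.percentile(t2, 99)                # simple empirical control limit

    # A future day is flagged when its T^2 exceeds the limit.
    new_day = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 15, 24)
    s = (new_day - mean_curve) @ Vt[:2].T
    print("out of control:", bool(s @ cov_inv @ s > limit))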
Martin, Mario; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Contreras-Hernández, Enrique; Glusman, Silvio; Cortés, Ulises; Rudomín, Pablo
2017-01-01
In a previous study we developed a machine learning procedure for the automatic identification and classification of spontaneous cord dorsum potentials (CDPs). That study further supported the proposal that in the anesthetized cat, the spontaneous CDPs recorded from different lumbar spinal segments are generated by a distributed network of dorsal horn neurons with structured (non-random) patterns of functional connectivity, and that these configurations can be changed to other non-random and stable configurations after the nociceptive stimulation produced by the intradermic injection of capsaicin. Here we present a study showing that the sequence of identified forms of the spontaneous CDPs follows a Markov chain of at least order one. That is, the system has memory, in the sense that the spontaneous activation of the dorsal horn neuronal ensembles producing the CDPs is not independent of the most recent activity. We used this Markovian property to build a procedure to identify portions of signals as belonging to a specific functional state of connectivity among the neuronal networks involved in the generation of the CDPs. We tested this procedure during acute nociceptive stimulation produced by the intradermic injection of capsaicin in intact as well as spinalized preparations. Altogether, our results indicate that CDP sequences cannot be generated by a renewal stochastic process. Moreover, it is possible to describe some functional features of activity in the cord dorsum by modeling the CDP sequences as generated by a Markov order-one stochastic process. Finally, these Markov models make it possible to determine the functional state which produced a CDP sequence. The proposed identification procedures appear to be useful for the analysis of the sequential behavior of the ongoing CDPs recorded from different spinal segments in response to a variety of experimental procedures, including the changes produced by acute nociceptive stimulation. They are envisaged as a useful tool for examining alterations of the patterns of functional connectivity between dorsal horn neurons under normal and different pathological conditions, an issue of potential clinical concern.
A vertical-energy-thresholding procedure for data reduction with multiple complex curves.
Jung, Uk; Jeong, Myong K; Lu, Jye-Chyi
2006-10-01
Due to the development of sensing and computer technology, measurements of many process variables are available in current manufacturing processes. It is very challenging, however, to process a large amount of information in a limited time in order to make decisions about the health of the processes and products. This paper develops a "preprocessing" procedure for multiple sets of complicated functional data in order to reduce the data size for supporting timely decision analyses. The data type studied has been used for fault detection, root-cause analysis, and quality improvement in such engineering applications as automobile and semiconductor manufacturing and nanomachining processes. The proposed vertical-energy-thresholding (VET) procedure balances the reconstruction error against data-reduction efficiency so that it is effective in capturing key patterns in the multiple data signals. The selected wavelet coefficients are treated as the "reduced-size" data in subsequent analyses for decision making. This enhances the ability of the existing statistical and machine-learning procedures to handle high-dimensional functional data. A few real-life examples demonstrate the effectiveness of our proposed procedure compared to several ad hoc techniques extended from single-curve-based data modeling and denoising procedures.
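The energy-balancing idea can be sketched with PyWavelets. The rule below keeps the largest-magnitude coefficients until 99% of the total energy is retained; the actual VET procedure chooses its threshold by trading reconstruction error against reduction efficiency across the multiple curves, which this simplification does not reproduce.

    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1024)
    signal = np.sin(2 * np.pi * 5 * t) + 0.2 * rng.normal(size=t.size)

    coeffs = pywt.wavedec(signal, "db4", level=5)
    flat = np.concatenate(coeffs)

    # Keep the smallest coefficient set carrying >= 99% of the energy.
    order = np.argsort(np.abs(flat))[::-1]
    energy = np.cumsum(flat[order] ** 2) / np.sum(flat ** 2)
    k = int(np.searchsorted(energy, 0.99)) + 1
    kept = np.zeros(flat.size, dtype=bool)
    kept[order[:k]] = True

    # Reconstruct the "reduced-size" signal from the surviving coefficients.
    splits = np.cumsum([c.size for c in coeffs])[:-1]
    recon = pywt.waverec(np.split(np.where(kept, flat, 0.0), splits), "db4")
    print(f"kept {k} of {flat.size} coefficients")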
Uncertainty analysis of signal deconvolution using a measured instrument response function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartouni, E. P.; Beeman, B.; Caggiano, J. A.
2016-10-05
A common analysis procedure minimizes the ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). For the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers investigated here, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
NASA Technical Reports Server (NTRS)
Louis, Pascal; Gokhale, Arun M.
1995-01-01
A number of microstructural processes are sensitive to the spatial arrangements of features in a microstructure. However, very little attention has been given in the past to experimental measurements of the descriptors of microstructural distance distributions, due to the lack of practically feasible methods. We present a digital image analysis procedure to estimate microstructural distance distributions. The application of the technique is demonstrated via estimation of the K function, the radial distribution function, and the nearest-neighbor distribution function of hollow spherical carbon particulates in a polymer matrix composite, observed on a metallographic section.
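Estimating these descriptors from extracted particle centers is direct; a numpy sketch with simulated points in a unit window follows (edge corrections, which a careful analysis requires, are omitted).

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 1, size=(300, 2))     # stand-in for particle centers

    d = squareform(pdist(pts))
    np.fill_diagonal(d, np.inf)
    r = np.linspace(0.005, 0.15, 40)

    # Nearest-neighbor distance distribution function G(r).
    nn = d.min(axis=1)
    G = (nn[:, None] <= r).mean(axis=0)

    # Naive Ripley K estimate; the radial distribution function follows
    # from g(r) = K'(r) / (2 * pi * r).
    n, area = len(pts), 1.0
    K = np.array([(d <= ri).sum() for ri in r]) * area / (n * (n - 1))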
Analysis of positron lifetime spectra in polymers
NASA Technical Reports Server (NTRS)
Singh, Jag J.; Mall, Gerald H.; Sprinkle, Danny R.
1988-01-01
A new procedure for analyzing multicomponent positron lifetime spectra in polymers was developed. It requires initial estimates of the lifetimes and the intensities of various components, which are readily obtainable by a standard spectrum stripping process. These initial estimates, after convolution with the timing system resolution function, are then used as the inputs for a nonlinear least squares analysis to compute the estimates that conform to a global error minimization criterion. The convolution integral uses the full experimental resolution function, in contrast to the previous studies where analytical approximations of it were utilized. These concepts were incorporated into a generalized Computer Program for Analyzing Positron Lifetime Spectra (PAPLS) in polymers. Its validity was tested using several artificially generated data sets. These data sets were also analyzed using the widely used POSITRONFIT program. In almost all cases, the PAPLS program gives closer fit to the input values. The new procedure was applied to the analysis of several lifetime spectra measured in metal ion containing Epon-828 samples. The results are described.
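The heart of the procedure, a multicomponent decay convolved with the full measured resolution function and refined by nonlinear least squares from stripping-based starting values, can be sketched as follows; the Gaussian IRF, the two components, and all parameter values are invented for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0.0, 10.0, 500)                 # time axis (ns)
    irf = np.exp(-0.5 * ((t - 1.0) / 0.15) ** 2)    # stand-in for the measured
    irf /= irf.sum()                                # resolution function

    def spectrum(t, i1, tau1, i2, tau2, bg):
        """Two lifetime components convolved with the full resolution function."""
        decay = i1 * np.exp(-t / tau1) + i2 * np.exp(-t / tau2)
        return np.convolve(decay, irf)[: t.size] + bg

    rng = np.random.default_rng(0)
    counts = rng.poisson(spectrum(t, 1000, 0.4, 300, 1.8, 5.0)).astype(float)

    # Starting values as would come from spectrum stripping, then global
    # least-squares refinement of lifetimes and intensities.
    popt, pcov = curve_fit(spectrum, t, counts, p0=(800, 0.3, 200, 1.5, 1.0))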
NASA Technical Reports Server (NTRS)
Martinovic, Zoran N.; Cerro, Jeffrey A.
2002-01-01
This is an interim user's manual for current procedures used in the Vehicle Analysis Branch at NASA Langley Research Center, Hampton, Virginia, for launch vehicle structural subsystem weight estimation based on finite element modeling and structural analysis. The process is intended to complement traditional methods of conceptual and early preliminary structural design such as the application of empirical weight estimation or application of classical engineering design equations and criteria on one dimensional "line" models. Functions of two commercially available software codes are coupled together. Vehicle modeling and analysis are done using SDRC/I-DEAS, and structural sizing is performed with the Collier Research Corp. HyperSizer program.
ERIC Educational Resources Information Center
Kelley, Michael E.; Shillingsburg, M. Alice; Castro, M. Jicel; Addison, Laura R.; LaRue, Robert H., Jr.; Martins, Megan P.
2007-01-01
Although experimental analysis methodologies have been useful for identifying the function of a wide variety of target behaviors (e.g., Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994), only recently have such procedures been applied to verbal operants (Lerman et al., 2005). In the current study, we conducted a systematic replication of the…
MSFC Skylab operations support summary
NASA Technical Reports Server (NTRS)
Martin, J. R.
1974-01-01
A summary of the actions and problems involved in preparing the Skylab-one vehicle is presented. The subjects discussed are: (1) flight operations support functions and organization, (2) launch operations and booster flight support functions and organization, (3) Skylab launch vehicle support teams, (4) Skylab orbital operations support performance analysis, (5) support manning and procedures, and (6) data support and facilities.
Data preparation for functional data analysis of PM10 in Peninsular Malaysia
NASA Astrophysics Data System (ADS)
Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd
2014-07-01
The use of curves or functional data in study analysis is increasingly gaining momentum in various fields of research. The statistical method used to analyze such data is known as functional data analysis (FDA). The first step in FDA is to convert the observed data points, which are repeatedly recorded over a period of time or space, into either a rough (raw) or smooth curve. In the case of the smooth curve, basis function expansion is one of the methods used for the data conversion. The data can be converted into a smooth curve using either the regression smoothing or the roughness penalty smoothing approach. With the regression smoothing approach, the degree of the curve's smoothness depends strongly on the number k of basis functions; for the roughness penalty approach, the smoothness depends on a roughness coefficient given by the parameter λ. Based on previous studies, researchers often used the rather time-consuming trial-and-error or cross-validation method to estimate the appropriate number of basis functions. Thus, this paper proposes a statistical procedure to construct functional data or curves for hourly and daily recorded data. The Bayesian Information Criterion is used to determine the number of basis functions, while the Generalized Cross Validation criterion is used to identify the parameter λ. The proposed procedure is then applied to a ten-year (2001-2010) period of PM10 data from 30 air quality monitoring stations located in Peninsular Malaysia. It was found that the number of basis functions required for the construction of the PM10 daily curve in Peninsular Malaysia was in the interval between 14 and 20 with an average value of 17; the first percentile is 15 and the third percentile is 19. Meanwhile, the initial value of the roughness coefficient was in the interval between 10⁻⁵ and 10⁻⁷ and the mode was 10⁻⁶. An example of the functional descriptive analysis is also shown.
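The basis-count selection step might look like the following sketch, which uses a Fourier basis as a stand-in for the paper's basis expansion (the paper likely used a different basis) and picks k by minimizing BIC over candidate values.

```python
import numpy as np

def fourier_basis(t, k):
    """Columns: 1, sin, cos, ... with t rescaled onto [0, 2*pi]."""
    w = 2 * np.pi * (t - t.min()) / (t.max() - t.min())
    cols = [np.ones_like(w)]
    for j in range(1, (k - 1) // 2 + 1):
        cols += [np.sin(j * w), np.cos(j * w)]
    return np.column_stack(cols)[:, :k]

def bic_for_k(t, y, k):
    B = fourier_basis(t, k)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    rss = np.sum((y - B @ coef) ** 2)
    n = y.size
    return n * np.log(rss / n) + k * np.log(n)

t = np.arange(24.0)                                # hourly records for one day
y = 60 + 20 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(2).normal(0, 3, 24)
ks = range(3, 22, 2)                               # odd counts: pairs of sin/cos
best = min(ks, key=lambda k: bic_for_k(t, y, k))
print("BIC-selected number of basis functions:", best)
```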
NASA Technical Reports Server (NTRS)
Parks, Kelsey
2010-01-01
Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick-figure videos to represent the astronauts as they perform the Functional Task Tests. This project will benefit NASA by allowing its scientists to present data from their neurological studies without revealing the identities of the astronauts.
Primer for the Transportable Applications Executive
NASA Technical Reports Server (NTRS)
Carlson, P. A.; Emmanuelli, C. A.; Harris, E. L.; Perkins, D. C.
1984-01-01
The Transportable Applications Executive (TAE), an interactive multipurpose executive that provides commonly required functions for scientific analysis systems, is discussed. The concept of an executive is discussed and the various components of TAE are presented. These include on-line help information, the use of menus or commands to access analysis programs, and TAE command procedures.
Nonlinear, discrete flood event models, 1. Bayesian estimation of parameters
NASA Astrophysics Data System (ADS)
Bates, Bryson C.; Townley, Lloyd R.
1988-05-01
In this paper (Part 1), a Bayesian procedure for parameter estimation is applied to discrete flood event models. The essence of the procedure is the minimisation of a sum of squares function for models in which the computed peak discharge is nonlinear in terms of the parameters. This objective function depends on the observed and computed peak discharges for several storms on the catchment, information on the structure of observation error, and prior information on parameter values. The posterior covariance matrix gives a measure of the precision of the estimated parameters. The procedure is demonstrated using rainfall and runoff data from seven Australian catchments. It is concluded that the procedure is a powerful alternative to conventional parameter estimation techniques in situations where a number of floods are available for parameter estimation. Parts 2 and 3 (Bates, this volume; Bates and Townley, this volume) discuss the application of statistical nonlinearity measures and prediction uncertainty analysis to calibrated flood models.
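A minimal sketch of the Part 1 idea, with a placeholder rainfall-runoff model rather than the authors' model: the Gaussian prior enters as extra residuals in the sum of squares, and the Gauss-Newton approximation to the posterior covariance gives the precision of the estimated parameters.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)
rain = np.array([20.0, 35.0, 50.0, 80.0, 110.0])       # storm depths (mm)
theta_true = np.array([0.28, 1.08])
q_obs = theta_true[0] * rain ** theta_true[1] + rng.normal(0, 0.5, rain.size)

sigma_obs = 0.5                                        # observation error scale
theta_prior = np.array([0.3, 1.1])                     # prior means
sigma_prior = np.array([0.2, 0.3])                     # prior std devs

def q_model(theta):
    a, b = theta
    return a * rain ** b                               # nonlinear in parameters

def residuals(theta):
    r_data = (q_model(theta) - q_obs) / sigma_obs
    r_prior = (theta - theta_prior) / sigma_prior      # prior as extra "data"
    return np.concatenate([r_data, r_prior])

fit = least_squares(residuals, theta_prior)
J = fit.jac                                            # Jacobian at the optimum
post_cov = np.linalg.inv(J.T @ J)                      # Gauss-Newton posterior cov.
print(np.round(fit.x, 3), np.round(np.sqrt(np.diag(post_cov)), 3))
```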
GET electronics samples data analysis
NASA Astrophysics Data System (ADS)
Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G. F.; Pancin, J.; Pedroza, J. L.; Pibernat, J.; Pollacco, E.; Rebii, A.; Roger, T.; Sizun, P.
2016-12-01
The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the time distribution of the input signal needs to be reconstructed from the registered output samples.
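A hedged sketch of the two steps described, with synthetic data standing in for GET records: estimate the channel response by averaging repeated pulser injections, then reconstruct an input's time distribution by regularized frequency-domain deconvolution. The response shape and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 512
true_resp = np.exp(-np.arange(n) / 30.0)
true_resp /= true_resp.sum()                   # unknown channel response

# 1) Estimate the response by averaging many noisy pulser (delta-like) records.
delta = np.zeros(n); delta[0] = 1.0
records = [np.convolve(delta, true_resp)[:n] + rng.normal(0, 1e-3, n)
           for _ in range(200)]
resp_est = np.mean(records, axis=0)

# 2) Recover an input's time distribution by regularized FFT deconvolution.
signal_in = np.zeros(n); signal_in[100:140] = 1.0        # unknown input
out = np.convolve(signal_in, true_resp)[:n] + rng.normal(0, 1e-3, n)
H = np.fft.rfft(resp_est)
eps = 1e-4 * np.max(np.abs(H)) ** 2                      # Wiener-style damping
recov = np.fft.irfft(np.fft.rfft(out) * np.conj(H) / (np.abs(H) ** 2 + eps), n)
print(int(np.argmax(recov)))                 # lands inside the true window [100, 140)
```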
A procedure for automated land use mapping using remotely sensed multispectral scanner data
NASA Technical Reports Server (NTRS)
Whitley, S. L.
1975-01-01
A system for processing remotely sensed multispectral scanner data with computer programs to produce color-coded land use maps for large areas is described. The procedure is explained, the software and hardware are described, and an example of the procedure is presented. Detailed descriptions of the multispectral scanners currently in use are provided, together with a summary of the background of current land use mapping techniques. The data analysis system and the pattern recognition software used in the procedure are functionally described. Current efforts by the NASA Earth Resources Laboratory to operationally evaluate a less complex and less costly system are discussed in a separate section.
Some Integrated Squared Error Procedures for Multivariate Normal Data
1986-01-01
a linear regression or experimental design model). Our procedures have also been used widely on non-linear models but we do not address non-linear… of fit, outliers, influence functions, experimental design, cluster analysis, robustness… structured data such as multivariate experimental designs. Several illustrations are provided.
Optimization of reinforced concrete slabs
NASA Technical Reports Server (NTRS)
Ferritto, J. M.
1979-01-01
Reinforced concrete cells composed of concrete slabs and used to limit the effects of accidental explosions during hazardous explosives operations are analyzed. An automated design procedure which considers the dynamic nonlinear behavior of the reinforced concrete of arbitrary geometrical and structural configuration subjected to dynamic pressure loading is discussed. The optimum design of the slab is examined using an interior penalty function. The optimization procedure is presented and the results are discussed and compared with finite element analysis.
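An interior penalty (barrier) method of the kind mentioned can be sketched on a toy problem; the slab model itself is not reproduced here. The constrained problem is replaced by a sequence of unconstrained minimizations with a shrinking barrier parameter.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):                               # stand-in cost (e.g., slab weight)
    return x[0] ** 2 + 2.0 * x[1] ** 2

def g(x):                               # feasible iff all components negative
    return np.array([1.0 - x[0] - x[1]])        # encodes x0 + x1 >= 1

def barrier_obj(x, r):
    gx = g(x)
    if np.any(gx >= 0.0):
        return np.inf                   # outside the interior: reject the point
    return f(x) - r * np.sum(np.log(-gx))

x = np.array([2.0, 2.0])                # strictly feasible starting design
for r in [1.0, 0.1, 0.01, 1e-3]:        # shrink the barrier parameter
    x = minimize(barrier_obj, x, args=(r,), method="Nelder-Mead").x
print(np.round(x, 3))                   # approaches the optimum (2/3, 1/3)
```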
Nonlinear viscoelastic characterization of structural adhesives
NASA Technical Reports Server (NTRS)
Rochefort, M. A.; Brinson, H. F.
1983-01-01
Measurements of the nonlinear viscoelastic behavior of two adhesives, FM-73 and FM-300, are presented and discussed. Analytical methods to quantify the measurements are given and fitted into the framework of an accelerated testing and analysis procedure. The single integral model used is shown to function well and is analogous to a time-temperature stress-superposition procedure (TTSSP). Advantages and disadvantages of the creep power law method used in this study are given.
Cheungpasitporn, Wisit; Thongprayoon, Charat; Brabec, Brady A.; Edmonds, Peter J.; O'Corragain, Oisin A.; Erickson, Stephen B.
2014-01-01
Background: The reports on efficacy of oral hydration treatment for the prevention of contrast-induced acute kidney injury (CIAKI) in elective radiological procedures and cardiac catheterization remain controversial. Aims: The objective of this meta-analysis was to assess the use of an oral hydration regimen for prevention of CIAKI. Materials and Methods: Comprehensive literature searches for randomized controlled trials (RCTs) of outpatient oral hydration treatment were performed using MEDLINE, EMBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials Systematic Reviews, and clinicaltrials.gov from inception until July 4th, 2014. The primary outcome was the incidence of CIAKI. Results: Six prospective RCTs were included in our analysis. Of 513 patients undergoing elective procedures with contrast exposures, 45 patients (8.8%) had CIAKI. Of 241 patients with an oral hydration regimen, 23 (9.5%) developed CIAKI. Of 272 patients with an intravenous (IV) fluid regimen, 22 (8.1%) had CIAKI. Study populations in all included studies had relatively normal kidney function to chronic kidney disease (CKD) stage 3. There was no significant increased risk of CIAKI in the oral fluid regimen group compared to the IV fluid regimen group (RR = 0.94, 95% confidence interval, CI = 0.38-2.31). Conclusions: According to our analysis, there is no evidence that an oral fluid regimen is associated with more risk of CIAKI in patients undergoing elective procedures with contrast exposures compared to an IV fluid regimen. This finding suggests that the oral fluid regimen might be considered as a possible outpatient treatment option for CIAKI prevention in patients with normal to moderately reduced kidney function. PMID:25599049
Double Modification of Polymer End Groups through Thiolactone Chemistry.
Driessen, Frank; Martens, Steven; Meyer, Bernhard De; Du Prez, Filip E; Espeel, Pieter
2016-06-01
A straightforward synthetic procedure for the double modification and polymer-polymer conjugation of telechelic polymers is performed through amine-thiol-ene conjugation. Thiolactone end-functionalized polymers are prepared via two different methods: controlled radical polymerization of a thiolactone-containing initiator, or modification of available end-functionalized polymers. Next, these different linear polymers are treated with a variety of amine/acrylate combinations in a one-pot procedure, creating a library of tailored end-functionalized polymers. End group conversions are monitored via SEC, NMR, and MALDI-TOF analysis, confirming the quantitative modification after each step. Finally, this strategy is applied to the synthesis of block copolymers via polymer-polymer conjugation, and the successful outcome is analyzed via LCxSEC measurements. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Froud, Tatiana; Venkat, Shree R; Barbery, Katuzka J; Gunjan, Arora; Narayanan, Govindarajan
2015-09-01
Irreversible electroporation (IRE) is a relatively new ablation modality that uses electric currents to cause cell death. It is commonly used to treat primary and secondary liver tumors in patients with normal liver function and preexisting cirrhosis. A retrospective analysis of 205 procedures sought to evaluate changes in liver function after IRE. Liver function test (LFT) results before and after IRE were evaluated from 174 procedures in 124 patients. Aspartate aminotransferase, alanine aminotransferase, alkaline phosphatase (ALKP), and total bilirubin levels were analyzed. The study was Health Insurance Portability and Accountability Act compliant and institutional review board approved. Informed consent was waived. Changes in LFT results after IRE were compared with baseline and were followed up over time to see if they resolved. Changes were also compared with the volume of ablation. The greatest perturbations were in transaminase levels. The levels increased sharply within 24 hours after IRE in 129 (74.1%) procedures to extreme levels (more than 20 times the upper limit of normal in one-third of cases). Resolution occurred in 95% of cases and was demonstrated to have occurred by a mean of approximately 10 weeks, many documented as early as 7 days after the procedure. ALKP levels were elevated in 10% of procedures, were slower to increase, and were less likely to resolve. Total bilirubin levels demonstrated two different patterns of elevation, early and late, and, like ALKP, were more likely to remain elevated. There was no increased risk in patients with cirrhosis or cholangiocarcinoma. There was no correlation of levels with volume of ablation. IRE results in significant abnormalities in LFT results, but in most cases these are self-limiting, do not preclude treatment, and are similar to the changes seen after radiofrequency and cryoablation in the liver. Copyright © 2015. Published by Elsevier Inc.
Evaluation of tricuspid valve regurgitation following laser lead extraction†.
Pecha, Simon; Castro, Liesa; Gosau, Nils; Linder, Matthias; Vogler, Julia; Willems, Stephan; Reichenspurner, Hermann; Hakmi, Samer
2017-06-01
The objective of this study was to examine the effect of laser lead extraction (LLE) on the development of post-procedural tricuspid regurgitation (TR). Some reports have suggested an increase in TR associated with LLE. We present a series of patients who underwent both LLE and complete echocardiographic evaluation for TR, in a single-centre analysis of consecutive patients referred for LLE between January 2012 and August 2015. One hundred and three patients had tricuspid valve function evaluated before the procedure with transthoracic echocardiography (TTE), during the procedure using transoesophageal echocardiography (TEE) and postoperatively using TTE. TR was graded from 0 (none) to 4 (severe). We treated 235 leads in 103 patients, including 118 ventricular leads. Seventy-seven patients were male (74.8%) and 26 female (25.2%), with a mean age of 65.6 ± 15.4 years. Mean time from initial lead implantation was 98.0 ± 67.3 months. Twenty-one patients (20.4%) had an ejection fraction below 30%. No intra-procedural worsening of tricuspid valve function was seen with TEE in any of the patients. Ten patients (9.7%) were found to have TR before LLE that returned to normal valve function after the procedure. Two patients (1.9%) experienced mild TR after the procedure (both with tricuspid valve endocarditis). Ninety-one patients (88.3%) did not experience any significant change in tricuspid valve function after LLE. Transthoracic and transoesophageal echocardiography findings showed that laser lead extraction was not associated with a significant increase in the incidence of tricuspid valve regurgitation. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Self-constrained inversion of potential fields
NASA Astrophysics Data System (ADS)
Paoletti, V.; Ialongo, S.; Florio, G.; Fedi, M.; Cella, F.
2013-11-01
We present a potential-field-constrained inversion procedure based on a priori information derived exclusively from the analysis of the gravity and magnetic data (self-constrained inversion). The procedure is designed to be applied to underdetermined problems and involves scenarios where the source distribution can be assumed to be of simple character. To set up effective constraints, we first estimate, through analysis of the gravity or magnetic field, some or all of the following source parameters: the source depth-to-the-top, the structural index, the horizontal position of the source body edges and their dip. The second step incorporates the information related to these constraints into the objective function as depth and spatial weighting functions. We show, through 2-D and 3-D synthetic and real data examples, that potential-field-based constraints, for example structural index, source boundaries and others, are usually enough to obtain substantial improvement in the density and magnetization models.
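The weighting-function step might look like the following sketch on a 1-D linear toy problem, not the authors' 3-D code. A depth weighting built from an assumed depth estimate and power-law exponent weakens the regularization penalty at depth, counteracting the kernel's natural decay.

```python
import numpy as np

n_data, n_cells = 20, 40
z = np.linspace(0.5, 20.0, n_cells)                  # cell depths
x = np.linspace(-10.0, 10.0, n_data)                 # observation locations
A = 1.0 / (x[:, None] ** 2 + z[None, :] ** 2)        # toy kernel, decays w/ depth

m_true = np.zeros(n_cells)
m_true[12:16] = 1.0                                  # buried source cells
d = A @ m_true + np.random.default_rng(5).normal(0, 1e-5, n_data)

beta, z0 = 2.0, 1.0                # e.g., from structural index / depth estimates
w = (z + z0) ** (-beta / 2.0)      # depth weighting: weaker penalty at depth
alpha = 1e-6                       # regularization strength
A_aug = np.vstack([A, np.sqrt(alpha) * np.diag(w)])  # stacked Tikhonov system
d_aug = np.concatenate([d, np.zeros(n_cells)])
m_est = np.linalg.lstsq(A_aug, d_aug, rcond=None)[0]
print(np.argsort(m_est)[-4:])      # strongest cells; compare with indices 12-15
```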
Collision Avoidance Functional Requirements for Step 1. Revision 6
NASA Technical Reports Server (NTRS)
2006-01-01
This Functional Requirements Document (FRD) describes the flow of requirements from the high level operational objectives down to the functional requirements specific to cooperative collision avoidance for high altitude, long endurance unmanned aircraft systems. These are further decomposed into performance and safety guidelines that are backed up by analysis or references to various documents or research findings. The FRD should be considered when establishing future policies, procedures, and standards pertaining to cooperative collision avoidance.
Design and Stress Analysis of Low-Noise Adjusted Bearing Contact Spiral Bevel Gears
NASA Technical Reports Server (NTRS)
Fuentes, A.; Litvin, F. L.; Mullins, B. R.; Woods, R.; Handschuh, R. F.; Lewicki, David G.
2002-01-01
An integrated computerized approach for design and stress analysis of low-noise spiral bevel gear drives with adjusted bearing contact is proposed. The computational procedure is an iterative process that requires four separate procedures and provides: (a) a parabolic function of transmission errors that is able to reduce the effect of errors of alignment on noise and vibration, and (b) reduction of the shift of bearing contact caused by misalignment. Application of finite element analysis enables us to determine the contact and bending stresses and investigate the formation of the bearing contact. The design of finite element models and boundary conditions is automated and does not require intermediate CAD programs for application of a general-purpose finite element analysis program.
Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2003-01-01
This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators, and a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface-mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.
NASA Astrophysics Data System (ADS)
Dobronets, Boris S.; Popova, Olga A.
2018-05-01
The paper considers a new approach to regression modeling that uses aggregated data presented in the form of density functions. Approaches to improving the reliability of aggregation of empirical data are considered, namely improving accuracy and estimating errors. We discuss data aggregation procedures as a preprocessing stage for subsequent regression modeling. An important feature of the study is a demonstration of how to represent the aggregated data: piecewise polynomial models, including spline aggregate functions, are proposed. We show that the proposed data aggregation can be interpreted as a frequency distribution, whose properties are studied using the density function concept. Various types of mathematical models of data aggregation are discussed. For the construction of regression models, data representation procedures based on piecewise polynomial models are proposed, along with new approaches to modeling functional dependencies based on spline aggregations.
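A minimal sketch of the aggregation-plus-representation idea, with illustrative names and synthetic data: a histogram serves as the estimated frequency distribution, and a cubic smoothing spline serves as the piecewise-polynomial density model.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(6)
raw = rng.normal(10.0, 2.0, 5000)                 # raw empirical observations

# Aggregation step: reduce the sample to an estimated frequency distribution.
freq, edges = np.histogram(raw, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Representation step: a smooth piecewise-polynomial (cubic spline) density.
density = UnivariateSpline(centers, freq, k=3, s=1e-4)

grid = np.linspace(raw.min(), raw.max(), 7)
print(np.round(density(grid), 4))                 # the spline aggregate function
```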
Intercorrelation of P and Pn Recordings for the North Korean Nuclear Tests
NASA Astrophysics Data System (ADS)
Lay, T.; Voytan, D.; Ohman, J.
2017-12-01
The relative waveform analysis procedure called Intercorrelation is applied to Pn and P waveforms at regional and teleseismic distances, respectively, for the 5 underground nuclear tests at the North Korean nuclear test site. Intercorrelation is a waveform equalization procedure that parameterizes the effective source function for a given explosion, including the reduced velocity potential convolved with a simplified Green's function that accounts for the free surface reflections (pPn and pP), and possibly additional arrivals such as spall. The source function for one event is convolved with the signal at a given station for a second event, and the recording at the same station for the first event is convolved with the source function for the second event. This procedure eliminates the need to predict the complex receiver function effects at the station, which are typically not well-known for short-period response. The parameters of the source function representation are yield and burial depth, and an explosion source model is required. Here we use the Mueller-Murphy representation of the explosion reduced velocity potential, which explicitly depends on yield and burial depth. We then search over yield and burial depth ranges for both events, constrained by a priori information about reasonable ranges of parameters, to optimize the simultaneous match of multiple station signals for the two events. This procedure, applied to the apparently overburied North Korean nuclear tests (no indications of spall complexity), assuming simple free surface interactions (elastic reflection from a flat surface), provides excellent waveform equalization for all combinations of 5 nuclear tests.
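The equalization identity at the heart of the procedure can be checked with synthetic traces; the exponential source shapes below are crude placeholders for the Mueller-Murphy parameterization. Because convolution commutes, event A's source function convolved with event B's record equals event B's source function convolved with event A's record whenever the source models are right, since the common path and receiver terms cancel.

```python
import numpy as np

n = 400
t = np.arange(n) * 0.01
green = np.exp(-t / 0.5) * np.sin(40 * t)           # unknown shared path term

def source(yield_scale, depth_lag):
    """Direct pulse plus a delayed, inverted free-surface (pP-like) arrival."""
    s = np.zeros(n)
    s[:50] = yield_scale * np.exp(-t[:50] / 0.05)
    s[depth_lag:depth_lag + 50] -= yield_scale * np.exp(-t[:50] / 0.05)
    return s

srcA, srcB = source(1.0, 12), source(1.8, 20)
recA = np.convolve(srcA, green)[:n]                 # station record, event A
recB = np.convolve(srcB, green)[:n]                 # station record, event B

lhs = np.convolve(srcA, recB)[:n]                   # the intercorrelation pair
rhs = np.convolve(srcB, recA)[:n]
corr = np.dot(lhs, rhs) / (np.linalg.norm(lhs) * np.linalg.norm(rhs))
print(round(corr, 6))        # 1.0 to float precision: the pair is equalized
```

In practice the yields and depths are unknown, so one grid-searches the source parameters to maximize this match simultaneously at multiple stations.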
ERIC Educational Resources Information Center
McKenney, Elizabeth L. W.; Waldron, Nancy; Conroy, Maureen
2013-01-01
This study describes the integrity with which 3 general education middle school teachers implemented functional analyses (FA) of appropriate behavior for students who typically engaged in disruption. A 4-step model consistent with behavioral consultation was used to support the assessment process. All analyses were conducted during ongoing…
Foltran, Fabiana A; Silva, Luciana C C B; Sato, Tatiana O; Coury, Helenice J C G
2013-01-01
The recording of human movement is an essential requirement for biomechanical, clinical, and occupational analysis, allowing assessment of postural variation, occupational risks, and preventive programs in physical therapy and rehabilitation. The flexible electrogoniometer (EGM), considered a reliable and accurate device, is used for dynamic recordings of different joints. Despite these advantages, the EGM is susceptible to measurement errors, known as crosstalk. There are two known types of crosstalk: crosstalk due to sensor rotation and inherent crosstalk. Correction procedures have been proposed to correct these errors; however, no study has used both procedures in clinical measures for wrist movements with the aim of optimizing the correction. The aim of this study was to evaluate the effects of mathematical correction procedures on: 1) crosstalk due to forearm rotation; 2) inherent sensor crosstalk; and 3) the combination of these two procedures. Forty-three healthy subjects had their maximum range of motion of wrist flexion/extension and ulnar/radial deviation recorded by EGM. The results were analyzed descriptively, and procedures were compared by differences. There was no significant difference in measurements before and after the application of the correction procedures at the 0.05 significance level. Furthermore, the differences between the correction procedures were less than 5° in most cases, having little impact on the measurements. Considering the time-consuming data analysis, the specific technical knowledge involved, and the inefficient results, the correction procedures are not recommended for wrist recordings by EGM.
Methods for scalar-on-function regression.
Reiss, Philip T; Goldsmith, Jeff; Shang, Han Lin; Ogden, R Todd
2017-08-01
Recent years have seen an explosion of activity in the field of functional data analysis (FDA), in which curves, spectra, images, etc. are considered as basic functional data units. A central problem in FDA is how to fit regression models with scalar responses and functional data points as predictors. We review some of the main approaches to this problem, categorizing the basic model types as linear, nonlinear and nonparametric. We discuss publicly available software packages, and illustrate some of the procedures by application to a functional magnetic resonance imaging dataset.
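A minimal sketch of the linear case, assuming a cosine basis: both the functional predictor and the coefficient function are expanded in the basis, which reduces the fit to ordinary least squares on basis scores.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
n, k = 80, 5
basis = np.column_stack([np.cos(np.pi * j * t) for j in range(k)])  # cosine basis

scores = rng.normal(size=(n, k))
X = scores @ basis.T                              # n functional predictors x_i(t)
beta_t = 2 * np.cos(np.pi * t) - np.cos(3 * np.pi * t)   # true coefficient fn
y = X @ beta_t * dt + rng.normal(0.0, 0.1, n)     # y_i = <x_i, beta> + noise

# Reduce to OLS on basis scores: S[i, j] ~ integral of x_i(t) * phi_j(t) dt.
S = X @ basis * dt
design = np.column_stack([np.ones(n), S])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
beta_hat = basis @ coef[1:]                       # estimated coefficient function
print(np.round(coef[1:], 2))                      # close to [0, 2, 0, -1, 0]
```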
Wigner functions from the two-dimensional wavelet group.
Ali, S T; Krasowska, A E; Murenzi, R
2000-12-01
Following a general procedure developed previously [Ann. Henri Poincaré 1, 685 (2000)], here we construct Wigner functions on a phase space related to the similitude group in two dimensions. Since the group space in this case is topologically homeomorphic to the phase space in question, the Wigner functions so constructed may also be considered as being functions on the group space itself. Previously the similitude group was used to construct wavelets for two-dimensional image analysis; we discuss here the connection between the wavelet transform and the Wigner function.
ERIC Educational Resources Information Center
Choi, Jinnie
2017-01-01
This article reviews PROC IRT, which was added to Statistical Analysis Software in 2014. We provide an introductory overview of a free version of SAS, describe what PROC IRT offers for item response theory (IRT) analysis and how one can use PROC IRT, and discuss how other SAS macros and procedures may complement the IRT functionality of PROC IRT.
Token Economy: A Systematic Review of Procedural Descriptions.
Ivy, Jonathan W; Meindl, James N; Overley, Eric; Robson, Kristen M
2017-09-01
The token economy is a well-established and widely used behavioral intervention. A token economy comprises six procedural components: the target response(s), a token that functions as a conditioned reinforcer, backup reinforcers, and three interconnected schedules of reinforcement. Despite decades of applied research, the extent to which the procedures of a token economy are described in complete and replicable detail has not been evaluated. Given the inherent complexity of a token economy, an analysis of the procedural descriptions may benefit future token economy research and practice. Articles published between 2000 and 2015 that included implementation of a token economy within an applied setting were identified and reviewed with a focus on evaluating the thoroughness of procedural descriptions. The results show that token economy components are regularly omitted or described in vague terms. Of the articles included in this analysis, only 19% (18 of 96 articles reviewed) included replicable and complete descriptions of all primary components. Missing or vague component descriptions could negatively affect future research or applied practice. Recommendations are provided to improve component descriptions.
A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.
Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L
2018-05-16
During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical action of these tasks. The aim of this work was to describe a method for determining whether a simulator has sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into the simulator's clinical accuracy and suitability. The unexpected positive FTA ratings despite functional deficits suggest that further revision of the survey method is required.
The Effect of the Extinction Procedure in Function-Based Intervention
ERIC Educational Resources Information Center
Janney, Donna M.; Umbreit, John; Ferro, Jolenea B.; Liaupsin, Carl J.; Lane, Kathleen L.
2013-01-01
In this study, we examined the contribution of the extinction procedure in function-based interventions implemented in the general education classrooms of three at-risk elementary-aged students. Function-based interventions included antecedent adjustments, reinforcement procedures, and function-matched extinction procedures. Using a combined ABC…
NASA Technical Reports Server (NTRS)
Cassarino, S.; Sopher, R.
1982-01-01
User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are described, and interfaces with external programs are discussed. The procedure for synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented, and detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the available plotting capabilities. Instructions to install and execute SIMVIB on the CDC computer system are provided.
Software technology testbed softpanel prototype
NASA Technical Reports Server (NTRS)
1991-01-01
The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; and analysis of the simulation architecture.
Analyzing coastal environments by means of functional data analysis
NASA Astrophysics Data System (ADS)
Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.
2017-07-01
Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
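The FPCA-then-cluster workflow can be sketched on synthetic discretized grain-size curves (rows are samples, columns are size classes). The SVD of the centered curves yields the principal component functions, and the grouping acts on the sample scores; a median split on the first score stands in for a full FCA step.

```python
import numpy as np

rng = np.random.default_rng(8)
grid = np.linspace(0.0, 4.0, 60)                 # log grain-size classes
facies1 = np.exp(-(grid - 1.5) ** 2 / 0.2)       # two synthetic PSD shapes
facies2 = np.exp(-(grid - 2.5) ** 2 / 0.3)
curves = np.vstack([f + rng.normal(0, 0.02, grid.size)
                    for f in [facies1] * 25 + [facies2] * 25])

centered = curves - curves.mean(axis=0)          # FPCA via SVD of centered data
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U[:, :2] * s[:2]                        # sample scores
pc_curves = Vt[:2]                               # principal component functions

# Crude two-group split on the first score, standing in for clustering.
labels = (scores[:, 0] > np.median(scores[:, 0])).astype(int)
print(labels[:25].sum(), labels[25:].sum())      # the two facies separate cleanly
```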
Probabilistic seismic vulnerability and risk assessment of stone masonry structures
NASA Astrophysics Data System (ADS)
Abo El Ezz, Ahmad
Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes prompted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in old urban centers in Eastern Canada, such as Old Quebec City, are built of stone masonry and represent architectural and cultural heritage of inestimable value. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g., spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated through a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
NASA Astrophysics Data System (ADS)
Michel, Clotaire; Hobiger, Manuel; Edwards, Benjamin; Poggi, Valerio; Burjanek, Jan; Cauzzi, Carlo; Kästli, Philipp; Fäh, Donat
2016-04-01
The Swiss Seismological Service operates one of the densest national seismic networks in the world, and it is still rapidly expanding (see http://www.seismo.ethz.ch/monitor/index_EN). Since 2009, every newly instrumented site has been characterized following an established procedure to derive realistic 1D VS velocity profiles. In addition, empirical Fourier spectral modeling is performed on the whole network for each recorded event with sufficient signal-to-noise ratio. Besides the source characteristics of the earthquakes, statistical real-time analyses of the residuals of the spectral modeling provide a seamlessly updated amplification function with respect to Swiss rock conditions at every station. Our site characterization procedure is mainly based on the analysis of surface waves from passive experiments and includes cross-checks of the derived amplification functions with those obtained through spectral modeling. The systematic use of three-component surface-wave analysis, allowing the derivation of both Rayleigh and Love wave dispersion curves, also contributes to the improved quality of the retrieved profiles. The results of site characterisation activities at recently installed strong-motion stations depict the large variety of possible effects of surface geology on ground motion in the Alpine context. Such effects range from de-amplification at hard-rock sites to amplification up to a factor of 15 in lacustrine sediments with respect to the Swiss reference rock velocity model. The derived velocity profiles are shown to reproduce observed amplification functions from empirical spectral modeling. Although many sites are found to exhibit 1D behavior, our procedure allows the detection and qualification of 2D and 3D effects. All data collected during the site characterization procedures in the last 20 years are gathered in a database implementing a data model proposed for community use at the European scale through NERA and EPOS (www.epos-eu.org). A web stationbook derived from it can be accessed at www.stations.seismo.ethz.ch.
How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.
Levitt, Heidi M
2018-05-01
Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Throughout the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.
NASA Technical Reports Server (NTRS)
1982-01-01
The integrated application of active controls (IAAC) technology to an advanced subsonic transport is reported. Supplementary technical data on the following topics are included: (1) 1990's avionics technology assessment; (2) function criticality assessment; (3) flight deck system for total control and functional features list; (4) criticality and reliability assessment of units; (5) crew procedural function task analysis; and (6) recommendations for simulation mechanization.
Economic Analysis of Redesign Alternatives for the RESFMS Information System
1992-09-01
input parameters to produce the variations of output (Pressman, 1992). Thus, if system maintenance dictates that the procedure needs to be modified, it… easily counted, and that "a large body of literature and data predicated on LOC already exists" (Pressman, 1992). Another term for LOC, used by Boehm… (Pressman, 1992). D. FUNCTION POINTS. An alternative to size metrics such as LOC is the measurement of software "functionality" or "utility." Function
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Leifker, Daniel B.
1991-01-01
Current qualitative device and process models represent only the structure and behavior of physical systems. However, systems in the real world include goal-oriented activities that generally cannot be easily represented using current modeling techniques. An extension of a qualitative modeling system, known as functional modeling, which captures goal-oriented activities explicitly, is proposed, and it is shown how such models may be used to support intelligent automation and fault management.
Hagen, M D; Eckman, M H; Pauker, S G
1989-01-01
A previous decision analysis examined a patient with severe CAD, diminished ventricular function, and an abdominal aortic aneurysm, and also concluded that CABG followed by aneurysm repair was optimal. This patient, who had well-preserved cardiac function but severely compromised pulmonary status, stood to gain less from CABG than would a patient with more severe coronary disease, thus accounting for the "close call" between the CABG-AAA and AAA-only strategies. Nevertheless, the analysis did emphasize the benefit of aneurysm repair, whether done alone or after CABG. The analysis also highlighted the significant risk of aneurysm rupture to which the patient is exposed while recovering from CABG surgery. The operative mortality risks of the two procedures are similar; thus, the patient's total operative risk is approximately doubled if he undergoes both procedures rather than aneurysm repair alone. The key question raised by the analysis is whether this double jeopardy is more than compensated by the degree to which prior CABG reduces both short-term cardiac risk at subsequent aneurysm repair and long-term cardiac mortality. For this patient, who had good cardiac function, the gains appeared sufficient to offset the interval risk of aneurysm rupture and the additional risk associated with a second surgical procedure. THE REAL WORLD: The patient indeed underwent and tolerated CABG, although he had a stormy, prolonged postoperative course due to pulmonary failure. After discharge from the hospital, he declined readmission for repair of the aneurysm. We did not model that possibility, clearly an inadequacy in our tree. Some six months later, the patient was still alive and was, reluctantly, readmitted for aneurysmorrhaphy. At that time, however, his pulmonary function had deteriorated, and both the anesthesiologist and the pulmonary consultant stated unequivocally that further surgery was now impossible. In retrospect, the expected utility of CABG without aneurysm repair (thus providing only a decrease in the long-term mortality risk from his CAD) would have been 1.95 (DEALE) or 2.06 (Markov) years. Sensitivity analysis revealed that, even if long-term cardiac risk were completely eliminated by CABG, immediate aneurysm repair would have been a better approach had the patient's physicians known he would be likely to refuse or not be a candidate for the second operation. In summary, although the patient's comorbidities did indeed place him at significant operative risk for either aneurysmorrhaphy alone or two sequential procedures, the benefits to be gained were shown to far outweigh the risks when compared with expectant observation. (ABSTRACT TRUNCATED AT 400 WORDS)
Estimating Mixture of Gaussian Processes by Kernel Smoothing
Huang, Mian; Li, Runze; Wang, Hansheng; Yao, Weixin
2014-01-01
When the functional data are not homogeneous, e.g., there exist multiple classes of functional curves in the dataset, traditional estimation methods may fail. In this paper, we propose a new estimation procedure for the Mixture of Gaussian Processes, to incorporate both functional and inhomogeneous properties of the data. Our method can be viewed as a natural extension of high-dimensional normal mixtures. However, the key difference is that smoothed structures are imposed for both the mean and covariance functions. The model is shown to be identifiable, and can be estimated efficiently by a combination of the ideas from EM algorithm, kernel regression, and functional principal component analysis. Our methodology is empirically justified by Monte Carlo simulations and illustrated by an analysis of a supermarket dataset. PMID:24976675
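A stripped-down sketch of the estimation idea for two components, omitting the covariance smoothing and functional principal component refinements of the paper: an EM loop whose M-step replaces each component mean curve by a kernel-smoothed weighted average, so smoothness is imposed on the mean functions. The noise scale is fixed for simplicity.

```python
import numpy as np

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 50)
n = 60
true_means = [np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]
z = rng.integers(0, 2, n)
Y = np.array([true_means[zi] + rng.normal(0, 0.3, t.size) for zi in z])

def kernel_smooth(grid, y, h=0.07):
    """Nadaraya-Watson smoothing of y observed on grid."""
    K = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / h) ** 2)
    return (K @ y) / K.sum(axis=1)

pi = np.array([0.5, 0.5])
sigma = 0.3                                        # noise scale, held fixed
mu = [Y[np.argmin(Y[:, 0])], Y[np.argmax(Y[:, 0])]]    # crude initialization
for _ in range(30):
    # E-step: responsibilities from a pointwise Gaussian error model.
    ll = np.stack([-0.5 * np.sum((Y - m) ** 2, axis=1) / sigma ** 2 for m in mu])
    ll += np.log(pi)[:, None]
    r = np.exp(ll - ll.max(axis=0))
    r /= r.sum(axis=0)
    # M-step: mixing weights and kernel-smoothed weighted mean curves.
    pi = r.mean(axis=1)
    mu = [kernel_smooth(t, (r[k][:, None] * Y).sum(axis=0) / r[k].sum())
          for k in range(2)]
print(np.round(pi, 2))                             # near the true class proportions
```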
1979-01-31
LT) sector, distinct and repeatable electron… distributions were observed as a function of substorm… of surface a second time by using the same procedure. Next they were grouped together… optical albedo and east-west nonuniformities in precipitation…
Vanderstichele, Hugo Marcel Johan; Janelidze, Shorena; Demeyer, Leentje; Coart, Els; Stoops, Erik; Herbst, Victor; Mauroo, Kimberley; Brix, Britta; Hansson, Oskar
2016-05-31
Reduced cerebrospinal fluid (CSF) concentration of amyloid-β1-42 (Aβ1-42) reflects the presence of amyloidopathy in brains of subjects with Alzheimer's disease (AD). The aim was to qualify the use of the Aβ1-42/Aβ1-40 ratio for improvement of standard operating procedures (SOP) for measurement of CSF Aβ, with a focus on CSF collection, storage, and analysis. Euroimmun ELISAs for CSF Aβ isoforms were used to set up a SOP with respect to recipient properties (low binding, polypropylene), volume of tubes, freeze/thaw cycles, and addition of detergents (Triton X-100, Tween-20) in collection or storage tubes or during CSF analysis. Data were analyzed with linear repeated measures and mixed effects models. Optimization of CSF analysis included a pre-wash of recipients (e.g., tubes, 96-well plates) before sample analysis. Using the Aβ1-42/Aβ1-40 ratio, in contrast to Aβ1-42 alone, eliminated effects of tube type, additional freeze/thaw cycles, or CSF volume for polypropylene storage tubes. 'Low binding' tubes reduced the loss of Aβ when aliquoting CSF or as a function of additional freeze/thaw cycles. Addition of detergent in CSF collection tubes resulted in an almost complete absence of variation as a function of collection procedures, but affected the concentration of Aβ isoforms in the immunoassay. The Aβ1-42/Aβ1-40 ratio is a more robust biomarker than Aβ1-42 toward (pre-)analytical interfering factors. Further, 'low binding' recipients and addition of detergent in collection tubes are able to remove effects of SOP-related confounding factors. Integration of the Aβ1-42/Aβ1-40 ratio and 'low binding' tubes into guidance criteria may speed up worldwide standardization of CSF biomarker analysis.
Functional Interaction Network Construction and Analysis for Disease Discovery.
Wu, Guanming; Haw, Robin
2017-01-01
Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, thereby providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of data via network modules and increasing statistical power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of total human genes, and an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human-curated pathway databases, building a machine learning classifier called a naïve Bayesian classifier, predicting interactions based on the trained classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz to perform network-based data analysis for a list of genes.
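The classifier step alone might look like the following sketch. The binary evidence features (e.g., co-expression, shared annotation, presence in an external interaction database) and the data are hypothetical, not Reactome's actual feature set; curated pathway pairs would supply the positive labels.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(10)
n = 2000
y = rng.integers(0, 2, n)                     # 1 = curated functional interaction
# Positive pairs carry each evidence feature with higher probability.
p = np.where(y[:, None] == 1, [0.7, 0.6, 0.5], [0.2, 0.3, 0.1])
X = rng.random((n, 3)) < p                    # binary evidence matrix

clf = BernoulliNB().fit(X, y)
candidate = np.array([[1, 0, 1]])             # a new protein pair's evidence
print(clf.predict_proba(candidate)[0, 1])     # predicted interaction probability
```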
NASA Astrophysics Data System (ADS)
Jha, Ratneshwar
Multidisciplinary design optimization (MDO) procedures have been developed for smart composite wings and turbomachinery blades. The analysis and optimization methods used are computationally efficient and sufficiently rigorous; the developed MDO procedures are therefore well suited for actual design applications. The optimization procedure for the conceptual design of composite aircraft wings with surface-bonded piezoelectric actuators involves the coupling of structural mechanics, aeroelasticity, aerodynamics and controls. The load-carrying member of the wing is represented as a single-celled composite box beam. Each wall of the box beam is analyzed as a composite laminate using a refined higher-order displacement field to account for the variations in transverse shear stresses through the thickness. Therefore, the model is applicable to the analysis of composite wings of arbitrary thickness. Detailed structural modeling issues associated with piezoelectric actuation of composite structures are considered. The governing equations of motion are solved using the finite element method to analyze practical wing geometries. Three-dimensional aerodynamic computations are performed using a panel code based on the constant-pressure lifting surface method to obtain steady and unsteady forces. The Laplace domain method of aeroelastic analysis produces root loci of the system, which give insight into the physical phenomena leading to flutter/divergence and can be efficiently integrated within an optimization procedure. The significance of the refined higher-order displacement field for the aeroelastic stability of composite wings has been established. The effect of composite ply orientations on flutter and divergence speeds has been studied. The Kreisselmeier-Steinhauser (K-S) function approach is used to efficiently integrate the objective functions and constraints into a single envelope function. The resulting unconstrained optimization problem is solved using the Broyden-Fletcher-Goldfarb-Shanno algorithm. The optimization problem is formulated with the objective of simultaneously minimizing wing weight and maximizing aerodynamic efficiency. Design variables include composite ply orientations, ply thicknesses, wing sweep, piezoelectric actuator thickness and actuator voltage. Constraints are placed on the flutter/divergence dynamic pressure, wing root stresses and the maximum electric field applied to the actuators. Numerical results are presented showing significant improvements, after optimization, compared with reference designs. The multidisciplinary optimization procedure for the design of turbomachinery blades integrates aerodynamic and heat transfer design objectives along with various mechanical and geometric constraints on the blade geometry. The airfoil shape is represented by Bezier-Bernstein polynomials, which results in a relatively small number of design variables for the optimization. The thin-shear-layer approximation of the Navier-Stokes equations is used for the viscous flow calculations. Grid generation is accomplished by solving Poisson equations. The maximum and average blade temperatures are obtained through a finite element analysis. Total pressure and exit kinetic energy losses are minimized, with constraints on blade temperatures and geometry. The constrained multiobjective optimization problem is solved using the K-S function approach. The results for the numerical example show significant improvements after optimization.
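The Kreisselmeier-Steinhauser envelope used above has a standard closed form; a minimal sketch in its numerically stable, max-shifted version, where rho controls how tightly the envelope hugs the maximum of the folded values:

```python
import numpy as np

def ks(g, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of the values g_i (max-shifted form)."""
    g = np.asarray(g, dtype=float)
    m = g.max()
    return m + np.log(np.sum(np.exp(rho * (g - m)))) / rho

g = [0.12, -0.40, 0.05]        # e.g., normalized constraint values
print(ks(g))                   # smooth upper bound slightly above max(g) = 0.12
```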
ERIC Educational Resources Information Center
Beltyukova, Svetlana A.; Stone, Gregory M.; Ellis, Lee W.
2008-01-01
Purpose: Speech intelligibility research typically relies on traditional evidence of reliability and validity. This investigation used Rasch analysis to enhance understanding of the functioning and meaning of scores obtained with 2 commonly used procedures: word identification (WI) and magnitude estimation scaling (MES). Method: Narrative samples…
Efficient Integrative Multi-SNP Association Analysis via Deterministic Approximation of Posteriors.
Wen, Xiaoquan; Lee, Yeji; Luca, Francesca; Pique-Regi, Roger
2016-06-02
With the increasing availability of functional genomic data, incorporating genomic annotations into genetic association analysis has become a standard procedure. However, the existing methods often lack rigor and/or computational efficiency and consequently do not maximize the utility of functional annotations. In this paper, we propose a rigorous inference procedure to perform integrative association analysis incorporating genomic annotations for both traditional GWASs and emerging molecular QTL mapping studies. In particular, we propose an algorithm, named deterministic approximation of posteriors (DAP), which enables highly efficient and accurate joint enrichment analysis and identification of multiple causal variants. We use a series of simulation studies to highlight the power and computational efficiency of our proposed approach and further demonstrate it by analyzing the cross-population eQTL data from the GEUVADIS project and the multi-tissue eQTL data from the GTEx project. In particular, we find that genetic variants predicted to disrupt transcription factor binding sites are enriched in cis-eQTLs across all tissues. Moreover, the enrichment estimates obtained across the tissues are correlated with the cell types for which the annotations are derived. Copyright © 2016 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
TOKEN REINFORCEMENT: A REVIEW AND ANALYSIS
Hackenberg, Timothy D
2009-01-01
Token reinforcement procedures and concepts are reviewed and discussed in relation to general principles of behavior. The paper is divided into four main parts. Part I reviews and discusses previous research on token systems in relation to common behavioral functions—reinforcement, temporal organization, antecedent stimulus functions, and aversive control—emphasizing both the continuities with other contingencies and the distinctive features of token systems. Part II describes the role of token procedures in the symmetrical law of effect, the view that reinforcers (gains) and punishers (losses) can be measured in conceptually analogous terms. Part III considers the utility of token reinforcement procedures in cross-species analysis of behavior more generally, showing how token procedures can be used to bridge the methodological gulf separating research with humans from that with other animals. Part IV discusses the relevance of token systems to the field of behavioral economics. Token systems have the potential to significantly advance research and theory in behavioral economics, permitting both a more refined analysis of the costs and benefits underlying standard economic models, and a common currency more akin to human monetary systems. Some implications for applied research and for broader theoretical integration across disciplines will also be considered. PMID:19794838
NASA Astrophysics Data System (ADS)
Takatsuji, Toshiyuki; Tanaka, Ken-ichi
1996-06-01
A procedure is derived by which sensory attributes can be scaled as a function of various physical and/or chemical properties of the object to be tested. This procedure consists of four successive steps: (i) design of the experiment, (ii) fabrication of specimens according to the design parameters, (iii) assessment of a sensory attribute using sensory evaluation and (iv) derivation of the relationship between the parameters and the sensory attribute. In these steps an experimental design using orthogonal arrays, analysis of variance and regression analyses are used strategically. When a specimen with the design parameters cannot be physically fabricated, an alternative specimen having parameters closest to the design is selected from a group of specimens which can be physically made. The influence of the deviation of the actual parameters from the desired ones is also discussed. A method of confirming the validity of the regression equation is also investigated. The procedure is applied to scale the sensory sharpness of kitchen knives as a function of the edge angle and the roughness of the cutting edge.
Dengiz, Ramazan; Haytoğlu, Süheyl; Görgülü, Orhan; Doğru, Mehmet; Arıkan, Osman Kürşat
2015-03-01
Septorhinoplasty (SRP), one of the most commonly performed rhinologic surgery procedures, can affect olfactory function; however, the findings of studies investigating smell following SRP are controversial. We used a culturally adapted modified Brief Smell Identification Test (B-SIT) to investigate the long- and short-term effects of SRP on olfactory function. We enrolled 59 patients admitted to the Ear-Nose-Throat Clinic, who were complaining of external nasal deformity and nasal obstruction. Functional SRP was performed on all cases. The B-SIT was administered prior to surgery and at 4 and 12 weeks post-surgery. The smell identification score (SIS) reflected the number of correct answers. In addition, we investigated the effects of gender and smoking on olfactory function and whether the SRP procedure changed these associations. The mean preoperative, 4-week, and 12-week postoperative SISs were 10.15±1.30, 10.21±1.52, and 10.92±0.95, respectively. The difference between the preoperative and 4-week postoperative SISs was not statistically significant; however, the 12-week postoperative score was significantly different from the preoperative and 4-week postoperative scores. Furthermore, the repeated measures analysis according to gender and smoking habit revealed a significant difference between the 4- and 12-week postoperative SISs. One patient developed postoperative anosmia; however, the patient recovered in the 12-week postoperative period. SRP surgery is a safe procedure in terms of olfactory function. In addition, olfactory function may increase following surgery as a result of improved nasal airflow.
Bioprocessing feasibility analysis. [thymic hormone bioassay and electrophoresis
NASA Technical Reports Server (NTRS)
1978-01-01
The biology and pathophysiology of the thymus gland are discussed and a clinical procedure for thymic hormone assay is described. The separation of null lymphocytes from mouse spleens and the functional characteristics of the cells after storage and transportation were investigated to develop a clinical procedure for thymic hormone assay, and to determine whether a ground-based approach will provide the desired end-product in sufficient quantities, or whether the microgravity of space should be exploited for more economical preparation of the hormone.
Spectral and textural processing of ERTS imagery. [Kansas
NASA Technical Reports Server (NTRS)
Haralick, R. M.; Bosley, R. J.
1974-01-01
A procedure is developed to simultaneously extract textural features from all bands of ERTS multispectral scanner imagery for automatic analysis. Multi-images lead to excessively large grey-tone N-tuple co-occurrence matrices; therefore, neighboring grey-tone N-tuple differences are measured and an ellipsoidally symmetric functional form is assumed for the co-occurrence distribution of multi-image grey-tone N-tuple differences. On the basis of past data, the ellipsoidally symmetric approximation is shown to be reasonable. Initial evaluation of the procedure is encouraging.
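The difference-based statistic the abstract describes can be illustrated compactly. The following Python sketch computes neighboring grey-tone N-tuple differences for a toy two-band image and summarizes them by their mean and covariance, the parameters an ellipsoidally symmetric model of the difference distribution would be built from; the image data are random stand-ins, not ERTS imagery.

    # Sketch: neighboring grey-tone N-tuple differences for a multi-band image,
    # the statistic used in place of a full co-occurrence matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.integers(0, 16, size=(64, 64, 2))  # rows x cols x bands (toy data)

    # Differences between horizontally adjacent pixels, taken jointly over bands,
    # give one N-tuple difference vector per pixel pair.
    diffs = img[:, 1:, :].astype(int) - img[:, :-1, :].astype(int)
    d = diffs.reshape(-1, img.shape[2])

    # Sample mean and covariance of the difference vectors; an ellipsoidally
    # symmetric model of the difference distribution is parameterized by these.
    print("mean difference:", d.mean(axis=0))
    print("covariance of band differences:\n", np.cov(d, rowvar=False))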
NASA Astrophysics Data System (ADS)
Maciejewska, Beata; Błasiak, Sławomir; Piasecka, Magdalena
This work discusses the mathematical model for laminar-flow heat transfer in a minichannel. The boundary conditions in the form of temperature distributions on the outer sides of the channel walls were determined from experimental data. The data were collected from an experimental stand, the essential part of which is a vertical minichannel 1.7 mm deep, 16 mm wide and 180 mm long, asymmetrically heated by a Haynes-230 alloy plate. Infrared thermography was used to determine temperature changes on the outer side of the minichannel walls. The problem was analysed numerically with either ANSYS CFX software or special calculation procedures based on the Finite Element Method and Trefftz functions in the thermal boundary layer. The Trefftz functions were used to construct the basis functions. Solutions to the governing differential equations were approximated with a linear combination of Trefftz-type basis functions, and the unknown coefficients of the linear combination were calculated by minimising the functional. The results of the comparative analysis were presented in graphical form and discussed.
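The approximation step (expanding the unknown field in basis functions and choosing coefficients to minimize a misfit functional) can be sketched generically. In the Python sketch below, simple polynomials stand in for the Trefftz functions, the "measured" wall temperatures are invented, and the functional is assumed to be a least-squares misfit.

    # Generic sketch of the approximation step: expand the unknown field in basis
    # functions and pick coefficients minimizing a least-squares functional that
    # penalizes misfit to the boundary temperatures.
    import numpy as np

    x_b = np.linspace(0.0, 1.0, 50)          # boundary coordinate (normalized)
    T_b = 300.0 + 40.0 * x_b**2              # "measured" wall temperature (toy data)

    # Basis functions evaluated on the boundary: 1, x, x^2, x^3.
    Phi = np.column_stack([x_b**k for k in range(4)])

    # Coefficients of the linear combination minimizing ||Phi c - T_b||^2.
    c, *_ = np.linalg.lstsq(Phi, T_b, rcond=None)
    T_fit = Phi @ c
    print("coefficients:", c, " max misfit:", np.abs(T_fit - T_b).max())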
Multidisciplinary design optimization - An emerging new engineering discipline
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1993-01-01
A definition of multidisciplinary design optimization (MDO) is introduced, and the functionality and relationship of the MDO conceptual components are examined. The latter include design-oriented analysis, approximation concepts, mathematical system modeling, design space search, an optimization procedure, and a human interface.
Analysis of Two Advanced Smoothing Algorithms.
1985-09-01
[Table-of-contents residue from the scanned report; recoverable content: the report develops a methodology for testing smoothing algorithms, including tests against a linear underlying function, and notes that there is no established procedure to follow in testing the efficiency and effectiveness of a smoothing algorithm.]
Code of Federal Regulations, 2010 CFR
2010-04-01
..., including, but not limited to, those that both before and after the transaction remain under the functional...) Transactions that do not require an Appendix A analysis; and Inquiry Concerning the Commission's Merger...
Automated Orbit Determination System (AODS) requirements definition and analysis
NASA Technical Reports Server (NTRS)
Waligora, S. R.; Goorevich, C. E.; Teles, J.; Pajerski, R. S.
1980-01-01
The requirements definition for the prototype version of the automated orbit determination system (AODS) is presented including the AODS requirements at all levels, the functional model as determined through the structured analysis performed during requirements definition, and the results of the requirements analysis. Also specified are the implementation strategy for AODS and the AODS-required external support software system (ADEPT), input and output message formats, and procedures for modifying the requirements.
Binder, Harald; Sauerbrei, Willi; Royston, Patrick
2013-06-15
In observational studies, many continuous or categorical covariates may be related to an outcome. Various spline-based procedures or the multivariable fractional polynomial (MFP) procedure can be used to identify important variables and functional forms for continuous covariates. This is the main aim of an explanatory model, as opposed to a model only for prediction. The type of analysis often guides the complexity of the final model. Spline-based procedures and MFP have tuning parameters for choosing the required complexity. To compare model selection approaches, we perform a simulation study in the linear regression context based on a data structure intended to reflect realistic biomedical data. We vary the sample size, variance explained, and complexity parameters for model selection. We consider 15 variables. A sample size of 200 (1000) and R² = 0.2 (0.8) is the scenario with the smallest (largest) amount of information. For assessing performance, we consider prediction error, correct and incorrect inclusion of covariates, qualitative measures for judging selected functional forms, and further novel criteria. From limited information, a suitable explanatory model cannot be obtained. Prediction performance from all types of models is similar. With a medium amount of information, MFP performs better than splines on several criteria. MFP better recovers simpler functions, whereas splines better recover more complex functions. For a large amount of information and no local structure, MFP and the spline procedures often select similar explanatory models. Copyright © 2012 John Wiley & Sons, Ltd.
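One building block of MFP can be shown concretely: selecting the best first-degree fractional polynomial (FP1) power for a single covariate from the standard power set {-2, -1, -0.5, 0, 0.5, 1, 2, 3}, where power 0 denotes log(x). The Python sketch below performs only this step on simulated data; it is not the full MFP selection algorithm, and the data are invented.

    # Sketch: choosing the best FP1 power for one covariate by residual sum of
    # squares. Not the full MFP algorithm; toy data.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0.5, 5.0, size=200)
    y = 2.0 + 1.5 * np.log(x) + rng.normal(0, 0.3, size=200)  # true form: log

    def fp_transform(x, p):
        return np.log(x) if p == 0 else x**p

    best = None
    for p in [-2, -1, -0.5, 0, 0.5, 1, 2, 3]:
        X = np.column_stack([np.ones_like(x), fp_transform(x, p)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[1]:
            best = (p, rss)
    print("selected FP1 power:", best[0])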
Shen, Yi
2013-05-01
A subject's sensitivity to a stimulus variation can be studied by estimating the psychometric function. Generally speaking, three parameters of the psychometric function are of interest: the performance threshold, the slope of the function, and the rate at which attention lapses occur. In the present study, three psychophysical procedures were used to estimate the three-parameter psychometric function for an auditory gap detection task. These were an up-down staircase (up-down) procedure, an entropy-based Bayesian (entropy) procedure, and an updated maximum-likelihood (UML) procedure. Data collected from four young, normal-hearing listeners showed that while all three procedures provided similar estimates of the threshold parameter, the up-down procedure performed slightly better in estimating the slope and lapse rate for 200 trials of data collection. When the lapse rate was increased by mixing in random responses for the three adaptive procedures, the larger lapse rate was especially detrimental to the efficiency of the up-down procedure, and the UML procedure provided better estimates of the threshold and slope than did the other two procedures.
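A three-parameter psychometric function of the kind estimated here can be fit by maximum likelihood in a few lines. The Python sketch below assumes a logistic shape and a fixed 0.5 guess rate (a two-interval task); neither assumption is a detail taken from the study, and the trial counts are invented.

    # Sketch: maximum-likelihood fit of a three-parameter psychometric function
    # (threshold alpha, slope beta, lapse rate lam). Logistic shape and 0.5 guess
    # rate are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    gaps = np.array([1, 2, 3, 4, 6, 8], dtype=float)   # gap durations, ms (toy)
    n_trials = np.full(gaps.shape, 40)
    n_correct = np.array([21, 24, 30, 35, 38, 39])

    def psi(x, alpha, beta, lam, gamma=0.5):
        core = 1.0 / (1.0 + np.exp(-beta * (x - alpha)))
        return gamma + (1.0 - gamma - lam) * core

    def neg_log_lik(params):
        alpha, beta, lam = params
        p = np.clip(psi(gaps, alpha, beta, lam), 1e-9, 1 - 1e-9)
        return -np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))

    fit = minimize(neg_log_lik, x0=[3.0, 1.0, 0.02],
                   bounds=[(0.1, 10), (0.1, 10), (0.0, 0.1)])
    print("threshold, slope, lapse:", fit.x)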
Quantum random oracle model for quantum digital signature
NASA Astrophysics Data System (ADS)
Shang, Tao; Lei, Qi; Liu, Jianwei
2016-10-01
The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.
Comparison of Optimum Interpolation and Cressman Analyses
NASA Technical Reports Server (NTRS)
Baker, W. E.; Bloom, S. C.; Nestler, M. S.
1984-01-01
The objective of this investigation is to develop a state-of-the-art optimum interpolation (O/I) objective analysis procedure for use in numerical weather prediction studies. A three-dimensional multivariate O/I analysis scheme has been developed. Some characteristics of the GLAS O/I compared with those of the NMC and ECMWF systems are summarized. Some recent enhancements of the GLAS scheme include a univariate analysis of water vapor mixing ratio, a geographically dependent model prediction error correlation function and a multivariate oceanic surface analysis.
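The core O/I update itself is compact: the analysis equals the background plus a gain matrix times the observation increments, with the gain built from the background and observation error covariances. The Python sketch below shows this update on a toy one-dimensional grid; none of the GLAS system's specifics (covariance models, multivariate coupling) are represented.

    # Sketch of the optimum-interpolation update on a 3-point grid with one
    # observation. B, R, and all values are invented for illustration.
    import numpy as np

    xb = np.array([280.0, 282.0, 284.0])       # background at 3 grid points
    H = np.array([[0.0, 1.0, 0.0]])             # one obs located at grid point 2
    y = np.array([283.5])                       # observed value
    idx = np.arange(3)
    B = 2.0 * np.exp(-np.abs(np.subtract.outer(idx, idx)))  # correlated bg errors
    R = np.array([[0.5]])                       # obs error variance

    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # gain matrix
    xa = xb + (K @ (y - H @ xb)).ravel()            # analysis = bg + weighted increment
    print("analysis:", xa)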
Comparison of Optimum Interpolation and Cressman Analyses
NASA Technical Reports Server (NTRS)
Baker, W. E.; Bloom, S. C.; Nestler, M. S.
1985-01-01
The development of a state-of-the-art optimum interpolation (O/I) objective analysis procedure for use in numerical weather prediction studies was investigated. A three-dimensional multivariate O/I analysis scheme was developed. Some characteristics of the GLAS O/I compared with those of the NMC and ECMWF systems are summarized. Some recent enhancements of the GLAS scheme include a univariate analysis of water vapor mixing ratio, a geographically dependent model prediction error correlation function and a multivariate oceanic surface analysis.
ERIC Educational Resources Information Center
Tan, Xuan; Xiang, Bihua; Dorans, Neil J.; Qu, Yanxuan
2010-01-01
The nature of the matching criterion (usually the total score) in the study of differential item functioning (DIF) has been shown to impact the accuracy of different DIF detection procedures. One of the topics related to the nature of the matching criterion is whether the studied item should be included. Although many studies exist that suggest…
Framework for adaptive multiscale analysis of nonhomogeneous point processes.
Helgason, Hannes; Bartroff, Jay; Abry, Patrice
2011-01-01
We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
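The generalized likelihood ratio statistic for a rate template can be illustrated on binned counts. The Python sketch below compares a two-segment piecewise-constant rate against a constant rate for simulated events; the paper's template family, dynamic program, and multiple-testing scheme are not reproduced.

    # Sketch: GLR statistic for a nonhomogeneous Poisson process, comparing a
    # two-segment piecewise-constant rate template to a constant rate.
    import numpy as np

    rng = np.random.default_rng(2)
    T = 100.0
    # Toy events: rate 1.0 on [0, 50), rate 2.0 on [50, 100).
    events = np.sort(np.concatenate([rng.uniform(0, 50, rng.poisson(50)),
                                     rng.uniform(50, 100, rng.poisson(100))]))

    edges = np.array([0.0, 50.0, 100.0])        # candidate template partition
    counts, _ = np.histogram(events, bins=edges)
    widths = np.diff(edges)

    # Log-likelihood with MLE rates per segment versus a single global rate.
    rates = counts / widths
    ll_template = np.sum(counts * np.log(rates) - rates * widths)
    global_rate = counts.sum() / T
    ll_constant = counts.sum() * np.log(global_rate) - global_rate * T
    print("GLR statistic:", 2.0 * (ll_template - ll_constant))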
Two models for evaluating landslide hazards
Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.
2006-01-01
Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
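For a single predictor, the empirical likelihood ratio is simply the ratio of its frequency distribution among landslide cells to that among stable cells; under conditional independence, the per-variable ratios are combined multiplicatively. A Python sketch with invented slope-angle data:

    # Sketch: empirical likelihood ratio for one predictor (slope angle) from
    # class-conditional histograms. All values are illustrative.
    import numpy as np

    rng = np.random.default_rng(3)
    slope_landslide = rng.normal(18.0, 4.0, 500)    # slope angles at landslide cells
    slope_stable = rng.normal(8.0, 5.0, 5000)       # slope angles elsewhere

    bins = np.linspace(0, 35, 15)
    f1, _ = np.histogram(slope_landslide, bins=bins, density=True)
    f0, _ = np.histogram(slope_stable, bins=bins, density=True)
    likelihood_ratio = f1 / np.maximum(f0, 1e-12)   # guard against empty bins

    centers = 0.5 * (bins[:-1] + bins[1:])
    for c, lr in zip(centers, likelihood_ratio):
        print(f"slope {c:5.1f} deg  LR {lr:6.2f}")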
Quantitative fluorescence correlation spectroscopy on DNA in living cells
NASA Astrophysics Data System (ADS)
Hodges, Cameron; Kafle, Rudra P.; Meiners, Jens-Christian
2017-02-01
FCS is a fluorescence technique conventionally used to study the kinetics of fluorescent molecules in a dilute solution. Being a non-invasive technique, it is now drawing increasing interest for the study of more complex systems like the dynamics of DNA or proteins in living cells. Unlike an ordinary dye solution, the dynamics of macromolecules like proteins or entangled DNA in crowded environments is often slow and subdiffusive in nature. This in turn leads to longer residence times of the attached fluorophores in the excitation volume of the microscope, and artifacts from photobleaching abound that can easily obscure the signature of the molecular dynamics of interest and make quantitative analysis challenging. We discuss methods and procedures to make FCS applicable to quantitative studies of the dynamics of DNA in live prokaryotic and eukaryotic cells. The intensity autocorrelation function is computed from weighted arrival times of the photons on the detector in a way that maximizes the information content while simultaneously correcting for the effect of photobleaching, yielding an autocorrelation function that reflects only the underlying dynamics of the sample. This autocorrelation function in turn is used to calculate the mean square displacement of the fluorophores attached to DNA. The displacement data are more amenable to further quantitative analysis than the raw correlation functions. By using a suitable integral transform of the mean square displacement, we can then determine the viscoelastic moduli of the DNA in its cellular environment. The entire analysis procedure is extensively calibrated and validated using model systems and computational simulations.
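The uncorrected version of the first step (estimating the normalized intensity autocorrelation from a binned photon record) is easy to sketch; the paper's arrival-time weighting and photobleaching correction are exactly the parts not shown here. Toy Poisson counts stand in for detector data.

    # Sketch: normalized intensity autocorrelation g(tau) from binned photon
    # counts, without the paper's weighting or photobleaching correction.
    import numpy as np

    rng = np.random.default_rng(4)
    counts = rng.poisson(3.0, size=20000).astype(float)   # toy binned photon counts

    def autocorr(I, max_lag):
        mean = I.mean()
        g = np.empty(max_lag)
        for k in range(1, max_lag + 1):
            g[k - 1] = np.mean(I[:-k] * I[k:]) / mean**2 - 1.0
        return g

    g = autocorr(counts, max_lag=50)
    print("g(tau) at first 5 lags:", g[:5])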
Oak Ridge Environmental Information System (OREIS) functional system design document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birchfield, T.E.; Brown, M.O.; Coleman, P.R.
1994-03-01
The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1-System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the data base structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.
Alternative Strategies in Assessing Special Education Needs
ERIC Educational Resources Information Center
Dykeman, Bruce F.
2006-01-01
The conventional use of standardized testing within a discrepancy analysis model is reviewed. The Response-to-Intervention (RTI) process is explained, along with descriptions of assessment procedures within RTI: functional assessment, authentic assessment, curriculum-based measurement, and play-based assessment. Psychometric issues relevant to RTI…
Optimization of flexible wing structures subject to strength and induced drag constraints
NASA Technical Reports Server (NTRS)
Haftka, R. T.
1977-01-01
An optimization procedure for designing wing structures subject to stress, strain, and drag constraints is presented. The optimization method utilizes an extended penalty function formulation for converting the constrained problem into a series of unconstrained ones. Newton's method is used to solve the unconstrained problems. An iterative analysis procedure is used to obtain the displacements of the wing structure including the effects of load redistribution due to the flexibility of the structure. The induced drag is calculated from the lift distribution. Approximate expressions for the constraints used during major portions of the optimization process enhance the efficiency of the procedure. A typical fighter wing is used to demonstrate the procedure. Aluminum and composite material designs are obtained. The tradeoff between weight savings and drag reduction is investigated.
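The penalty-plus-Newton structure can be illustrated on a one-variable toy problem. The sketch below uses a simple exterior penalty (the paper uses an extended penalty function, a related but more refined formulation) and numerical derivatives for the Newton step; all numbers are invented.

    # Sketch: a constrained minimum approached by solving a sequence of
    # unconstrained penalized problems, each solved by Newton's method.
    def phi(x, r):
        # objective (x-3)^2 plus penalty on violating the constraint x <= 1
        g = max(x - 1.0, 0.0)
        return (x - 3.0) ** 2 + r * g**2

    def newton_min(f, x0, h=1e-4, iters=20):
        x = x0
        for _ in range(iters):
            d1 = (f(x + h) - f(x - h)) / (2 * h)           # numerical gradient
            d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h**2   # numerical Hessian
            x -= d1 / d2
        return x

    x = 0.0
    for r in [1.0, 10.0, 100.0, 1000.0]:   # increasing penalty weight
        x = newton_min(lambda z: phi(z, r), x)
        print(f"r = {r:7.1f}  x* = {x:.4f}")  # approaches the constrained optimum x = 1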
Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
This paper describes a novel approach based on coherence functions and statistical theory for sensor validation in a harsh environment. Using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, demonstrating the robustness of the technique.
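The aligned/unaligned contrast can be reproduced with standard spectral tools. In the Python sketch below, two channels share a broadband common signal; computing magnitude-squared coherence on the aligned pair and on a pair misaligned by more than one analysis segment shows the collapse in estimated coherence that the method exploits. The signals are synthetic, not combustor data.

    # Sketch: magnitude-squared coherence for an aligned and a deliberately
    # misaligned ("unaligned") pair of sensor records.
    import numpy as np
    from scipy.signal import coherence

    fs = 1000.0
    N = 60000
    rng = np.random.default_rng(5)
    common = rng.normal(size=N)                 # shared broadband signal
    x = common + 0.5 * rng.normal(size=N)
    y = common + 0.5 * rng.normal(size=N)

    f, C_aligned = coherence(x, y, fs=fs, nperseg=256)
    shift = 2048                                # misalignment larger than one segment
    f, C_unaligned = coherence(x[:-shift], y[shift:], fs=fs, nperseg=256)

    print("mean coherence, aligned:  ", C_aligned.mean())
    print("mean coherence, unaligned:", C_unaligned.mean())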
Command Process Modeling & Risk Analysis
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2011-01-01
Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes for making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.
ERIC Educational Resources Information Center
Angrist, Joshua; Pischke, Jorn-Steffen
2010-01-01
This essay reviews progress in empirical economics since Leamer's (1983) critique. Leamer highlighted the benefits of sensitivity analysis, a procedure in which researchers show how their results change with changes in specification or functional form. Sensitivity analysis has had a salutary but not a revolutionary effect on econometric practice.…
Healy, Sean; Nacario, Adam; Braithwaite, Rock E; Hopper, Chris
2018-06-01
The purpose of this meta-analysis was to examine the effect of physical activity interventions on youth diagnosed with autism spectrum disorder. Standard meta-analytical procedures determining inclusion criteria, literature searches in electronic databases, coding procedures, and statistical methods were used to identify and synthesize articles retained for analysis. Hedges' g (1988) was utilized to interpret effect sizes and quantify research findings. Moderator and outcome variables were assessed using coding procedures. A total of 29 studies with 30 independent samples (N = 1009) were utilized in this analysis. Results from meta-analyses indicated an overall moderate effect (g = 0.62). Several outcomes indicated moderate-to-large effects (g ≥ 0.5); specifically, moderate to large positive effects were revealed for participants exposed to interventions targeting the development of manipulative skills, locomotor skills, skill-related fitness, social functioning, and muscular strength and endurance. Moderator analyses were conducted to explain variance between groups; environment was the only subgrouping variable (intervention characteristics) to produce a significant difference (QB = 5.67, P < 0.05) between moderators. While no significant differences were found between other moderators, several trends were apparent within groups in which experimental groups outperformed control groups. Autism Res 2018, 11: 818-833. © 2018 International Society for Autism Research, Wiley Periodicals, Inc. Results of the meta-analysis (a method for synthesizing research) showed physical activity interventions to have a moderate or large effect on a variety of outcomes, including the development of manipulative skills, locomotor skills, skill-related fitness, social functioning, and muscular strength and endurance. The authors conclude that physical activity's standing as an evidence-based strategy for youth with ASD is reinforced.
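Hedges' g for a single study is the standardized mean difference scaled by the small-sample correction J = 1 - 3/(4·df - 1). A minimal Python sketch with invented group summaries:

    # Sketch: Hedges' g for one study (standardized mean difference with
    # small-sample correction). Numbers are illustrative only.
    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / s_pooled            # Cohen's d
        df = n1 + n2 - 2
        J = 1.0 - 3.0 / (4.0 * df - 1.0)    # small-sample correction factor
        return J * d

    # Hypothetical intervention vs. control scores:
    print(hedges_g(m1=75.0, sd1=10.0, n1=20, m2=68.0, sd2=11.0, n2=20))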
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.
2018-04-01
Inverse thermal analysis of Ti-6Al-4V friction stir welds is presented that demonstrates application of a methodology using numerical-analytical basis functions and temperature-field constraint conditions. This analysis provides parametric representation of friction-stir-weld temperature histories that can be adopted as input data to computational procedures for prediction of solid-state phase transformations and mechanical response. These parameterized temperature histories can be used for inverse thermal analysis of friction stir welds having process conditions similar to those considered here. Case studies are presented for inverse thermal analysis of friction stir welds that use three-dimensional constraint conditions on calculated temperature fields, which are associated with experimentally measured transformation boundaries and weld-stir-zone cross sections.
An analysis of maintenance following functional communication training.
Durand, V M; Carr, E G
1992-01-01
The multiple and long-term effects of functional communication training relative to a common reductive procedure (time-out from positive reinforcement) were evaluated. Twelve children participated in a functional analysis of their challenging behaviors (Study 1), which implicated adult attention as a maintaining variable. The children were then matched for chronological age, mental age, and language age and assigned to two groups. One group received functional communication training as an intervention for their challenging behavior, and the second group received time-out as a contrast. Both interventions were initially successful (Study 2), but durable results were achieved only with the group that received functional communication training across different stimulus conditions (Study 3). Students whose challenging behaviors were previously reduced with time-out resumed these behaviors in the presence of naive teachers unaware of the children's intervention history. The value of teaching communicative responses to promote maintenance is discussed as it relates to the concept of functional equivalence. PMID:1478902
Efficient Site-Specific Labeling of Proteins via Cysteines
Kim, Younggyu; Ho, Sam O.; Gassman, Natalie R.; Korlann, You; Landorf, Elizabeth V.; Collart, Frank R.; Weiss, Shimon
2011-01-01
Methods for chemical modifications of proteins have been crucial for the advancement of proteomics. In particular, site-specific covalent labeling of proteins with fluorophores and other moieties has permitted the development of a multitude of assays for proteome analysis. A common approach for such a modification is solvent-accessible cysteine labeling using thiol-reactive dyes. Cysteine is very attractive for site-specific conjugation due to its relative rarity throughout the proteome and the ease of its introduction into a specific site along the protein's amino acid chain. This is achieved by site-directed mutagenesis, most often without perturbing the protein's function. Bottlenecks in this reaction, however, include the maintenance of reactive thiol groups without oxidation before the reaction, and the effective removal of unreacted molecules prior to fluorescence studies. Here, we describe an efficient, specific, and rapid procedure for cysteine labeling starting from well-reduced proteins in the solid state. The efficacy and specificity of the improved procedure are estimated using a variety of single-cysteine proteins and thiol-reactive dyes. Based on UV/vis absorbance spectra, coupling efficiencies are typically in the range 70–90%, and specificities are better than ~95%. The labeled proteins are evaluated using fluorescence assays, proving that the covalent modification does not alter their function. In addition to maleimide-based conjugation, this improved procedure may be used for other thiol-reactive conjugations such as haloacetyl, alkyl halide, and disulfide interchange derivatives. This facile and rapid procedure is well suited for high throughput proteome analysis. PMID:18275130
Comparing preference assessments: selection- versus duration-based preference assessment procedures.
Kodak, Tiffany; Fisher, Wayne W; Kelley, Michael E; Kisamore, April
2009-01-01
In the current investigation, the results of a selection- and a duration-based preference assessment procedure were compared. A Multiple Stimulus With Replacement (MSW) preference assessment [Windsor, J., Piché, L. M., & Locke, P. A. (1994). Preference testing: A comparison of two presentation methods. Research in Developmental Disabilities, 15, 439-455] and a variation of a Free-Operant (FO) preference assessment procedure [Roane, H. S., Vollmer, T. R., Ringdahl, J. E., & Marcus, B. A. (1998). Evaluation of a brief stimulus preference assessment. Journal of Applied Behavior Analysis, 31, 605-620] were conducted with four participants. A reinforcer assessment was conducted to determine which preference assessment procedure identified the item that produced the highest rates of responding. The items identified as most highly preferred were different across preference assessment procedures for all participants. Results of the reinforcer assessment showed that the MSW identified the item that functioned as the most effective reinforcer for two participants.
Patient Preferences Regarding Surgical Interventions for Knee Osteoarthritis
Moorman, Claude T; Kirwan, Tom; Share, Jennifer; Vannabouathong, Christopher
2017-01-01
Surgical interventions for knee osteoarthritis (OA) have markedly different procedure attributes and may have dramatic differences in patient desirability. A total of 323 patients with knee OA were included in a dual response, choice-based conjoint analysis to identify the relative preference of 9 different procedure attributes. A model was also developed to simulate how patients might respond if presented with the real-world knee OA procedures, based on conservative assumptions regarding their attributes. The “amount of cutting and removal of the existing bone” required for a procedure had the highest preference score, indicating that these patients considered it the most important attribute. More specifically, a procedure that requires the least amount of bone cutting or removal would be expected to be the most preferred surgical alternative. The model also suggested that patients who are younger and report the highest pain levels and greatest functional limitations would be more likely to opt for surgical intervention. PMID:28974919
Brown, Angus M
2006-04-01
The objective of the present study was to demonstrate a method for fitting complex electrophysiological data with multiple functions using the SOLVER add-in of the ubiquitous spreadsheet Microsoft Excel. SOLVER minimizes the sum of the squared differences between the data to be fit and the function(s) describing the data using an iterative generalized reduced gradient method. While it is a straightforward procedure to fit data with linear functions, and we have previously demonstrated a method of non-linear regression analysis of experimental data based upon a single function, it is more complex to fit data with multiple functions, usually requiring specialized, expensive computer software. In this paper we describe an easily understood program for fitting experimentally acquired data, in this case the stimulus-evoked compound action potential from the mouse optic nerve, with multiple Gaussian functions. The program is flexible and can be applied to describe data with a wide variety of user-input functions.
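The same multi-Gaussian fitting task can be done outside Excel with standard least-squares tools. The Python sketch below fits a sum of two Gaussians to synthetic data; it is an analogue of the paper's SOLVER approach, not its implementation.

    # Sketch: least-squares fit of a sum of two Gaussians to a toy waveform.
    import numpy as np
    from scipy.optimize import curve_fit

    def two_gaussians(t, a1, mu1, s1, a2, mu2, s2):
        return (a1 * np.exp(-((t - mu1) ** 2) / (2 * s1**2)) +
                a2 * np.exp(-((t - mu2) ** 2) / (2 * s2**2)))

    t = np.linspace(0, 10, 400)
    rng = np.random.default_rng(6)
    y = two_gaussians(t, 1.0, 3.0, 0.5, 0.6, 6.0, 0.8) + 0.02 * rng.normal(size=t.size)

    p0 = [0.8, 2.5, 0.4, 0.5, 6.5, 1.0]             # rough initial guesses
    popt, pcov = curve_fit(two_gaussians, t, y, p0=p0)
    print("fitted parameters:", popt)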
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT), and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions, and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms add to the risk analysis the knowledge of experts on the investigated system or environment.
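The fuzzy ingredients can be made concrete with a triangular membership function and one Mamdani-style rule. In the Python sketch below, the linguistic variables, membership breakpoints, and the rule itself are all invented for illustration; they are not taken from the paper.

    # Sketch: triangular membership functions and one fuzzy rule of the kind such
    # a risk analysis might use ("if field strength is HIGH and coupling is HIGH,
    # risk is HIGH"), with min as the AND operator.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    field_high = tri(18.0, a=10.0, b=25.0, c=40.0)   # kV/m, invented scale
    coupling_high = tri(0.7, a=0.3, b=0.8, c=1.0)    # coupling coefficient, invented

    risk_high = min(field_high, coupling_high)        # Mamdani-style AND
    print("degree of membership in HIGH risk:", risk_high)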
Rapid analytical and preparative isolation of functional endosomes by free flow electrophoresis.
Marsh, M; Schmid, S; Kern, H; Harms, E; Male, P; Mellman, I; Helenius, A
1987-04-01
Endosomes are prelysosomal organelles that serve as an intracellular site for the sorting, distribution, and processing of receptors, ligands, fluid phase components, and membrane proteins internalized by endocytosis. Whereas the overall functions of endosomes are increasingly understood, little is known about endosome structure, composition, or biogenesis. In this paper, we describe a rapid procedure that permits analytical and preparative isolation of endosomes from a variety of tissue culture cells. The procedure relies on a combination of density gradient centrifugation and free flow electrophoresis. It yields a fraction of highly purified, functionally intact organelles. As markers for endosomes in Chinese hamster ovary cells, we used endocytosed horseradish peroxidase, FITC-conjugated dextran, and [35S]methionine-labeled Semliki Forest virus. Total postnuclear supernatants, crude microsomal pellets, or partially purified Golgi fractions were subjected to free flow electrophoresis. Endosomes and lysosomes migrated together as a single anodally deflected peak separated from most other organelles (plasma membrane, mitochondria, endoplasmic reticulum, and Golgi). The endosomes and lysosomes were then resolved by centrifugation in Percoll density gradients. Endosomes prepared in this way were enriched up to 70-fold relative to the initial homogenate and were still capable of ATP-dependent acidification. By electron microscopy, the isolated organelles were found to consist of electron lucent vacuoles and tubules, many of which could be shown to contain an endocytic tracer (e.g., horseradish peroxidase). SDS PAGE analysis of integral and peripheral membrane proteins (separated from each other by condensation in Triton X-114) revealed a unique and restricted subset of proteins when compared with lysosomes, the unshifted free flow electrophoresis peak, and total cell protein. Altogether, the purification procedure takes 5-6 h and yields amounts of endosomes (150-200 micrograms protein) sufficient for biochemical, immunological, and functional analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
This document defines technology interface requirements for Contingency Management, derived through a review of Contingency Management-related HSI requirements documents, standards, and recommended practices. Technology concepts in use by the Contingency Management Work Package were considered. Beginning with HSI high-level functional requirements for Contingency Management, and Contingency Management technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of system failures and associated contingency procedures, and (2) the control capability needed by the pilot to obtain system status and procedure information. Fundamentally, these requirements provide the candidate Contingency Management technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Contingency Management operations and functions should interface with the pilot to provide the necessary Contingency Management functionality to the UA-pilot system. Requirements and guidelines for Contingency Management are partitioned into two categories: (1) Health and Status and (2) Contingency Management. Each requirement is stated and is supported with a rationale and associated reference(s).
Management of pediatric patients with refractory constipation who fail cecostomy.
Bonilla, Silvana F; Flores, Alejandro; Jackson, Carl-Christian A; Chwals, Walter J; Orkin, Bruce A
2013-09-01
Antegrade continence enema (ACE) is a recognized therapeutic option in the management of pediatric refractory constipation. Data on the long-term outcome of patients who fail to improve after an ACE procedure are lacking. We describe the rate of ACE bowel-management failure in pediatric refractory constipation, and the management and long-term outcome of these patients. We performed a retrospective analysis of a cohort of patients who underwent an ACE procedure and had at least 3 years of follow-up, with detailed analysis of the subsequent treatment and outcome of those patients with a poor functional outcome. 76 patients were included; 12 (16%) failed bowel management after ACE and required additional intervention. Mean follow-up was 66.3 months (range 35-95) after the ACE procedure. Colonic motility studies demonstrated colonic neuropathy in 7 patients (58%), abnormal motility in 4 patients (33%), and abnormal left-sided colonic motility in 1 patient (9%). All 12 patients were ultimately treated surgically. Nine patients (75%) had marked clinical improvement, whereas 3 patients (25%) continued to have poor functional outcomes at long-term follow-up. Colonic resection, either segmental or total, led to improvement or resolution of symptoms in the majority of patients who failed cecostomy. However, this is a complex and heterogeneous group and some patients will have continued issues. Copyright © 2013 Elsevier Inc. All rights reserved.
Minimization of a Class of Matrix Trace Functions by Means of Refined Majorization.
ERIC Educational Resources Information Center
Kiers, Henk A. L.; ten Berge, Jos M. F.
1992-01-01
A procedure is described for minimizing a class of matrix trace functions, which is a refinement of an earlier procedure for minimizing the class of matrix trace functions using majorization. Several trial analyses demonstrate that the revised procedure is more efficient than the earlier majorization-based procedure. (SLD)
NASA Technical Reports Server (NTRS)
Aghazadeh, Fred
2005-01-01
The objective of the planned summer research was to develop a procedure to determine the isokinetic functional strength of suited and unsuited participants in order to estimate the effect of the micro-gravity suit on human strength. To accomplish this objective, the Anthropometry and Biomechanics Facility's Multipurpose, Multiaxial Isokinetic Dynamometer (MMID) was used. Development of the procedure involved selection and testing of seven routines on the MMID. We conducted the related experiments and collected the data for 12 participants. In addition to the above objective, we developed a procedure to assess the fatiguing characteristics of suited and unsuited participants using the EMG technique. We collected EMG data on 10 participants while they performed a programmed routine on the MMID. EMG data, along with information on the exerted forces, effector speed, number of repetitions, and duration of each routine, were recorded for further analysis. Finally, the gathering and tabulation of data on various human strengths for updating the MSIS (HSIS) strength requirements, an effort that started in summer 2003, also continued.
Censored data treatment using additional information in intelligent medical systems
NASA Astrophysics Data System (ADS)
Zenkova, Z. N.
2015-11-01
Statistical procedures are an important and significant part of modern intelligent medical systems. They are used for processing, mining, and analysis of different types of data about patients and their diseases; they help in making various decisions regarding diagnosis, treatment, medication, surgery, etc. In many cases the data can be censored or incomplete. It is well known that censoring considerably reduces the efficiency of statistical procedures. In this paper the author makes a brief review of approaches that allow improvement of the procedures using additional information, and describes a modified estimation of an unknown cumulative distribution function involving additional information about a quantile which is known exactly. The additional information is used by applying a projection of a classical estimator to a set of estimators with certain properties. The Kaplan-Meier estimator is considered as an estimator of the unknown cumulative distribution function, and the properties of the modified estimator are investigated for the case of a single right censorship by means of simulations.
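The baseline estimator being modified is the Kaplan-Meier product-limit estimator. The Python sketch below computes it for a small right-censored sample; the quantile-based projection step proposed in the paper is not reproduced.

    # Sketch: Kaplan-Meier product-limit estimator for right-censored data.
    import numpy as np

    # (time, event) pairs; event = 1 for observed, 0 for right-censored (toy data).
    times  = np.array([2.0, 3.0, 3.0, 5.0, 7.0, 8.0, 11.0, 12.0])
    events = np.array([1,   1,   0,   1,   0,   1,   1,    0  ])

    order = np.argsort(times)
    times, events = times[order], events[order]

    S = 1.0
    for t in np.unique(times[events == 1]):
        d = np.sum((times == t) & (events == 1))   # deaths at time t
        n = np.sum(times >= t)                      # at risk just before t
        S *= 1.0 - d / n
        print(f"t = {t:5.1f}  S(t) = {S:.3f}")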
Dynamic Stiffness Transfer Function of an Electromechanical Actuator Using System Identification
NASA Astrophysics Data System (ADS)
Kim, Sang Hwa; Tahk, Min-Jea
2018-04-01
In the aeroelastic analysis of flight vehicles with electromechanical actuators (EMAs), an accurate prediction of flutter requires dynamic stiffness characteristics of the EMA. The dynamic stiffness transfer function of the EMA with brushless direct current (BLDC) motor can be obtained by conducting complicated mathematical calculations of control algorithms and mechanical/electrical nonlinearities using linearization techniques. Thus, system identification approaches using experimental data, as an alternative, have considerable advantages. However, the test setup for system identification is expensive and complex, and experimental procedures for data collection are time-consuming tasks. To obtain the dynamic stiffness transfer function, this paper proposes a linear system identification method that uses information obtained from a reliable dynamic stiffness model with a control algorithm and nonlinearities. The results of this study show that the system identification procedure is compact, and the transfer function is able to describe the dynamic stiffness characteristics of the EMA. In addition, to verify the validity of the system identification method, the simulation results of the dynamic stiffness transfer function and the dynamic stiffness model were compared with the experimental data for various external loads.
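A common nonparametric precursor to fitting such a transfer function is the H1 frequency-response estimate: the cross-spectrum between input and output divided by the input auto-spectrum. The Python sketch below applies it to a toy second-order plant; the plant parameters are invented, and none of the EMA's control loop or nonlinearities are modeled.

    # Sketch: H1 frequency-response estimation from input/output records,
    # a typical first step before fitting a transfer-function model.
    import numpy as np
    from scipy.signal import csd, welch, lsim, TransferFunction

    fs = 1000.0
    t = np.arange(20000) / fs
    rng = np.random.default_rng(7)
    u = rng.normal(size=t.size)                    # broadband excitation

    # Toy second-order plant standing in for the actuator dynamics.
    plant = TransferFunction([1.0], [1.0, 8.0, 400.0])
    _, y, _ = lsim(plant, U=u, T=t)
    y = y + 0.01 * rng.normal(size=t.size)         # measurement noise

    f, Puy = csd(u, y, fs=fs, nperseg=1024)
    f, Puu = welch(u, fs=fs, nperseg=1024)
    H1 = Puy / Puu                                 # H1 frequency-response estimate
    print("gain at lowest bins:", np.abs(H1[:3]))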
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
Convergent close coupling versus the generalized Sturmian function approach: Wave-function analysis
NASA Astrophysics Data System (ADS)
Ambrosio, M.; Mitnik, D. M.; Gasaneo, G.; Randazzo, J. M.; Kadyrov, A. S.; Fursa, D. V.; Bray, I.
2015-11-01
We compare the physical information contained in the Temkin-Poet (TP) scattering wave function representing electron-impact ionization of hydrogen, calculated by the convergent close-coupling (CCC) and generalized Sturmian function (GSF) methodologies. The idea is to show that the ionization cross section can be extracted from the wave functions themselves. Using two different procedures based on hyperspherical Sturmian functions we show that the transition amplitudes contained in both GSF and CCC scattering functions lead to similar single-differential cross sections. The single-continuum channels were also a subject of the present studies, and we show that the elastic and excitation amplitudes are essentially the same as well.
Diffusion of Super-Gaussian Profiles
ERIC Educational Resources Information Center
Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.
2007-01-01
The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…
Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A
2011-09-26
The Bidirectional Reflectance Distribution Function (BRDF) is essential to characterize an object's reflectance properties. This function depends both on the illumination-observation geometry and on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariable analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample. © 2011 Optical Society of America
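Applied to BRDF data arranged as a geometry-by-wavelength matrix, PCA reduces each measured spectrum to a few component scores. A Python sketch with random stand-in values (the ceramic-standard measurements are not reproduced):

    # Sketch: PCA of spectral BRDF data via SVD, one row per
    # illumination-observation geometry and one column per wavelength.
    import numpy as np

    rng = np.random.default_rng(8)
    n_geometries, n_wavelengths = 120, 31
    brdf = rng.lognormal(mean=-1.0, sigma=0.3, size=(n_geometries, n_wavelengths))

    X = brdf - brdf.mean(axis=0)                 # center each wavelength channel
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)

    print("variance explained by first 3 components:", explained[:3])
    scores = U[:, :3] * s[:3]                    # geometry-wise component scores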
Cohen, Wendy; Wynne, David McGregor
2015-07-01
A single case study is reported of a child who underwent several surgical procedures as a result of congenital grade III subglottic stenosis. The anterior aspect of the right vocal cord was damaged and underwent atrophy during one of these procedures. Now an active 10-year-old, the patient has become increasingly aware of his vocal limitations in functional activities. Injection of hyaluronic acid into the vocal folds has been known to provide improved voice quality in adults, although there are no known cases reported of this procedure in children. This article reports voice outcomes after injection of hyaluronic acid into Reinke's space in a single case study. Voice recordings were made before, after, and 1 month after injection, and were subjected to acoustic and perceptual analysis. Post-injection and follow-up voice recordings demonstrate decreased jitter, shimmer, and harmonics-to-noise ratio. Perceptual evaluation indicates improved voice quality. Injection of hyaluronic acid in children who require voice augmentation is possible and may contribute to increased vocal function and improved voice outcomes. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Computer-aided operations engineering with integrated models of systems and operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.
Shelton, Larry R.
1997-01-01
For many years, stream samples for analysis of volatile organic compounds have been collected without specific guidelines or a sampler designed to avoid analyte loss. In 1996, the U.S. Geological Survey's National Water-Quality Assessment Program began aggressively monitoring urban stream-water for volatile organic compounds. To assure representative samples and consistency in collection procedures, a specific sampler was designed to collect samples for analysis of volatile organic compounds in stream water. This sampler, and the collection procedures, were tested in the laboratory and in the field for compound loss, contamination, sample reproducibility, and functional capabilities. This report describes that sampler and its use, and outlines field procedures specifically designed to provide contaminant-free, reproducible volatile organic compound data from stream-water samples. These guidelines and the equipment described represent a significant change in U.S. Geological Survey instructions for collecting and processing stream-water samples for analysis of volatile organic compounds. They are intended to produce data that are both defensible and interpretable, particularly for concentrations below the microgram-per-liter level. The guidelines also contain detailed recommendations for quality-control samples.
Innovative Use of Thighplasty to Improve Prosthesis Fit and Function in a Transfemoral Amputee.
Kuiken, Todd A; Fey, Nicholas P; Reissman, Timothy; Finucane, Suzanne B; Dumanian, Gregory A
2018-01-01
Excess residual limb fat is a common problem that can impair prosthesis control and negatively impact gait. In the general population, thighplasty and liposuction are commonly performed for cosmetic reasons but not specifically to improve function in amputees. The objective of this study was to determine if these procedures could enhance prosthesis fit and function in an overweight above-knee amputee. We evaluated the use of these techniques on a 50-year-old transfemoral amputee who was overweight. The patient underwent presurgical imaging and tests to measure her residual limb tissue distribution, socket-limb interface stiffness, residual femur orientation, lower-extremity function, and prosthesis satisfaction. A medial thighplasty procedure with circumferential liposuction was performed, during which 2,812 g (6.2 lbs.) of subcutaneous fat and skin was removed from her residual limb. Imaging was repeated 5 months postsurgery; functional assessments were repeated 9 months postsurgery. The patient demonstrated notable improvements in socket fit and in performing most functional and walking tests. Her comfortable walking speed increased 13.3%, and her scores for the Sit-to-Stand and Four Square Step tests improved over 20%. Femur alignment in her socket changed from 8.13 to 4.14 degrees, and analysis showed a marked increase in the socket-limb interface stiffness. This study demonstrates the potential of using a routine plastic surgery procedure to modify the intrinsic properties of the limb and to improve functional outcomes in overweight or obese transfemoral amputees. This technique is a potentially attractive option compared with multiple reiterations of sockets, which can be time-consuming and costly.
Sumer, Huseyin; Craig, Jeffrey M.; Sibson, Mandy; Choo, K.H. Andy
2003-01-01
Human neocentromeres are fully functional centromeres that arise at previously noncentromeric regions of the genome. We have tested a rapid procedure of genomic array analysis of chromosome scaffold/matrix attachment regions (S/MARs), involving the isolation of S/MAR DNA and hybridization of this DNA to a genomic BAC/PAC array. Using this procedure, we have defined a 2.5-Mb domain of S/MAR-enriched chromatin that fully encompasses a previously mapped centromere protein-A (CENP-A)-associated domain at a human neocentromere. We have independently verified this procedure using a previously established fluorescence in situ hybridization method on salt-treated metaphase chromosomes. In silico sequence analysis of the S/MAR-enriched and surrounding regions has revealed no outstanding sequence-related predisposition. This study defines the S/MAR-enriched domain of a higher eukaryotic centromere and provides a method that has broad application for the mapping of S/MAR attachment sites over large genomic regions or throughout a genome. PMID:12840048
Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.
Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter
2016-04-01
Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step consisted of a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in the social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as the input classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.
A Statistical Analysis of Brain Morphology Using Wild Bootstrapping
Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.
2008-01-01
Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correcting for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
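To make the resampling step concrete, here is a minimal sketch, in Python with simulated data, of a wild bootstrap test for one voxel-level regression; the covariate, sample size, noise model, and Rademacher multipliers are illustrative assumptions rather than the authors' implementation.

```python
# Wild bootstrap for a single voxel-level regression (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n = 60
age = rng.uniform(20, 60, n)                    # hypothetical covariate
X = np.column_stack([np.ones(n), age])
beta_true = np.array([1.0, 0.0])                # null holds: no age effect
noise = rng.normal(0, 0.5 + 0.02 * age)         # heteroscedastic errors
y = X @ beta_true + noise

def t_stat(X, y):
    """OLS t-statistic for the last coefficient."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])

t_obs = t_stat(X, y)

# Refit under the null (intercept only), then perturb the restricted
# residuals with Rademacher multipliers to build the reference distribution.
X0 = X[:, :1]
beta0, *_ = np.linalg.lstsq(X0, y, rcond=None)
fit0, resid0 = X0 @ beta0, y - X0 @ beta0
B, t_boot = 2000, []
for _ in range(B):
    w = rng.choice([-1.0, 1.0], size=n)         # wild multipliers
    t_boot.append(t_stat(X, fit0 + w * resid0))
p_value = np.mean(np.abs(t_boot) >= abs(t_obs))
print(f"t = {t_obs:.2f}, wild-bootstrap p = {p_value:.3f}")
```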
Isolation of Tonoplast Vesicles from Tomato Fruit Pericarp
Snowden, Christopher J.; Thomas, Benjamin; Baxter, Charles J.; Smith, J. Andrew C.; Sweetlove, Lee J.
2017-01-01
This protocol describes the isolation of tonoplast vesicles from tomato fruit. The vesicles isolated using this procedure are of sufficiently high purity for downstream proteomic analysis whilst remaining transport-competent for functional assays. The methodology was used to study the transport of amino acids during tomato fruit ripening (Snowden et al., 2015) and is based on the procedure of Bettey and Smith (1993). Such vesicles may be useful in further studies into the dynamic transfer of metabolites across the tonoplast for storage and metabolism during tomato fruit development. PMID:29085859
A Bayesian approach to parameter and reliability estimation in the Poisson distribution.
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1972-01-01
For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
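The gamma prior is conjugate to the Poisson likelihood, so with a Gamma(a, b) prior (shape a, rate b) and n observations summing to S, the posterior is Gamma(a + S, b + n) and the Bayes estimator under squared-error loss is the posterior mean (a + S)/(b + n). A small Monte Carlo sketch of the mean-squared-error comparison described above follows; the prior parameters and sample size are arbitrary choices, not those of the report.

```python
# Monte Carlo comparison of Bayes vs. maximum-likelihood estimators of a
# Poisson rate, averaging the squared error over rates drawn from the prior.
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 1.0                                  # gamma prior: shape a, rate b
n, reps = 10, 20000

lam = rng.gamma(a, 1.0 / b, size=reps)           # true rates from the prior
x_sum = rng.poisson(np.broadcast_to(lam[:, None], (reps, n))).sum(axis=1)

mle = x_sum / n                                  # also the MVU estimator here
bayes = (a + x_sum) / (b + n)                    # posterior mean

print("MSE (MLE):  ", np.mean((mle - lam) ** 2))
print("MSE (Bayes):", np.mean((bayes - lam) ** 2))
```

As the abstract states, the Bayes estimator's mean-squared error comes out appreciably smaller when averaged over the prior.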
Blinowska, Katarzyna J; Rakowski, Franciszek; Kaminski, Maciej; De Vico Fallani, Fabrizio; Del Percio, Claudio; Lizio, Roberta; Babiloni, Claudio
2017-04-01
This exploratory study provided a proof of concept of a new procedure using multivariate electroencephalographic (EEG) topographic markers of cortical connectivity to discriminate normal elderly (Nold) and Alzheimer's disease (AD) individuals. The new procedure was tested on an existing database formed by resting state eyes-closed EEG data (19 exploring electrodes of the 10-20 system referenced to linked-ear reference electrodes) recorded in 42 AD patients with dementia (age: 65.9 years ± 8.5 standard deviation, SD) and 42 Nold non-consanguineous caregivers (age: 70.6 years ± 8.5 SD). In this procedure, spectral EEG coherence estimated reciprocal functional connectivity while non-normalized directed transfer function (NDTF) estimated effective connectivity. Principal component analysis and computation of Mahalanobis distance integrated and combined these EEG topographic markers of cortical connectivity. The area under the receiver operating curve (AUC) indexed the classification accuracy. A good classification of Nold and AD individuals was obtained by combining the EEG markers derived from NDTF and coherence (AUC=86%, sensitivity=0.85, specificity=0.70). These encouraging results motivate a cross-validation study of the new procedure in age- and education-matched Nold, stable and progressing mild cognitive impairment individuals, and de novo AD patients with dementia. If cross-validated, the new procedure will provide cheap, broadly available, repeatable over time, and entirely non-invasive EEG topographic markers reflecting abnormal cortical connectivity in AD patients diagnosed by direct or indirect measurement of cerebral amyloid β and hyperphosphorylated tau peptides. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
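A rough sketch of the marker-integration and classification chain (PCA, Mahalanobis distance from the normal-elderly centroid, AUC); the feature values are simulated stand-ins, not coherence or NDTF estimates from real EEG.

```python
# Combine connectivity markers via PCA + Mahalanobis distance, score with AUC.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n_per_group, n_markers = 42, 40
nold = rng.normal(0.0, 1.0, (n_per_group, n_markers))
ad = rng.normal(0.6, 1.2, (n_per_group, n_markers))   # shifted, noisier

X = np.vstack([nold, ad])
y = np.r_[np.zeros(n_per_group), np.ones(n_per_group)]

# Integrate the markers: keep the leading principal components...
Z = PCA(n_components=5).fit_transform(X)

# ...then score each subject by Mahalanobis distance from the Nold centroid.
mu = Z[y == 0].mean(axis=0)
cov_inv = np.linalg.inv(np.cov(Z[y == 0], rowvar=False))
d = np.sqrt(np.einsum('ij,jk,ik->i', Z - mu, cov_inv, Z - mu))

print(f"AUC = {roc_auc_score(y, d):.2f}")
```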
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dag, Serkan; Yildirim, Bora; Sabuncuoglu, Baris
The objective of this study is to develop crack growth analysis methods for functionally graded materials (FGMs) subjected to mode I cyclic loading. The study presents finite-element-based computational procedures for both two- and three-dimensional problems to examine fatigue crack growth in functionally graded materials. The developed methods allow the computation of crack length and generation of the crack front profile for a graded medium subjected to fluctuating stresses. The results, presented for an elliptical crack embedded in a functionally graded medium, illustrate the competing effects of ellipse aspect ratio and material property gradation on the fatigue crack growth behavior.
NASA Astrophysics Data System (ADS)
Morandi, V.; Galli, M.; Marabelli, F.; Comoretto, D.
2010-04-01
In this work, we combined an experimental technique and a detailed data analysis to investigate the influence of applied pressure on the anisotropic dielectric functions of highly oriented poly(p-phenylene vinylene) (PPV). The dielectric constants were derived from polarized reflectance spectra recorded through a diamond anvil cell up to 50 kbar. The presence of the diamond anvils strongly affects the measured spectra, requiring the development of an optical model able to take all spurious effects into account. A parametric procedure was then applied to derive the complex dielectric constants for both polarizations as a function of pressure. A detailed analysis of their pressure dependence allows addressing the role of intermolecular interactions and electron-phonon coupling in highly oriented PPV.
Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.
Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H
2016-12-01
Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
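For readers unfamiliar with the indices, the following minimal sketch computes %REC, %DET, and MAXLINE from a one-dimensional kinematic series after time-delay embedding; the embedding parameters, radius, and test signal are illustrative choices, not those of the study.

```python
# Minimal recurrence quantification sketch (%REC, %DET, MAXLINE).
import numpy as np

def rqa(x, dim=3, tau=5, radius=0.2, lmin=2):
    # Time-delay embedding of the scalar series.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    # Recurrence matrix: pairs of embedded points closer than the radius.
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    rec = dists < radius * dists.max()
    np.fill_diagonal(rec, False)                 # exclude the main diagonal
    prec = rec.sum() / (n * n - n)               # %REC
    # Scan diagonals for line structures (%DET, MAXLINE).
    det_pts, maxline = 0, 0
    for k in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diag(rec, k), False):   # sentinel ends last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_pts += run
                maxline = max(maxline, run)
                run = 0
    pdet = det_pts / rec.sum() if rec.sum() else 0.0
    return 100 * prec, 100 * pdet, maxline

t = np.linspace(0, 4 * np.pi, 400)
signal = np.sin(t) + 0.05 * np.random.default_rng(3).normal(size=t.size)
print("%%REC=%.1f  %%DET=%.1f  MAXLINE=%d" % rqa(signal))
```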
The combined effects of noncontingent reinforcement and punishment on the reduction of rumination.
DeRosa, Nicole M; Roane, Henry S; Bishop, Jamie R; Silkowski, Erica L
2016-09-01
The current study extends the literature on the assessment and treatment of rumination through the evaluation of a combined reinforcement- and punishment-based intervention. The study included a single participant with a history of rumination maintained by automatic reinforcement, as identified via a functional analysis. Both noncontingent reinforcement (NCR) with preferred edible items and punishment, in the form of a facial screen, were implemented separately to evaluate their independent effects on the occurrence of rumination. The final treatment package included both NCR and punishment procedures. Implementation of the combined treatment resulted in a 96.5% reduction in rumination relative to baseline. Procedural modifications and integrity errors also were evaluated. © 2016 Society for the Experimental Analysis of Behavior.
Phylogenetic and Protein Sequence Analysis of Bacterial Chemoreceptors.
Ortega, Davi R; Zhulin, Igor B
2018-01-01
Identifying chemoreceptors in sequenced bacterial genomes, revealing their domain architecture, inferring their evolutionary relationships, and comparing them to chemoreceptors of known function have become important steps in genome annotation and chemotaxis research. Here, we describe bioinformatics procedures that enable such analyses, using two closely related bacterial genomes as examples.
Basic Laboratory Skills for Water and Wastewater Analysis. Report No. 125.
ERIC Educational Resources Information Center
Clark, Douglas W.
Designed for individuals wanting to acquire an introductory knowledge of basic skills necessary to function in a water or wastewater laboratory, this handbook emphasizes current use of routine equipment and proper procedures. Explanations and illustrations focus on underlying techniques and principles rather than processes for conducting specific…
Chaparral & Fire Ecology: Role of Fire in Seed Germination.
ERIC Educational Resources Information Center
Steele, Nancy L. C.; Keeley, Jon E.
1991-01-01
An activity that incorporates the concepts of plant structure and function and ecology is described. Students investigate the reasons why some California chaparral seeds germinate only after a fire has burned the surrounding chaparral. The procedure, discussion and analysis questions, expected results, potential problems, and additional activities…
Computer Simulation of Human Behavior: Assessment of Creativity.
ERIC Educational Resources Information Center
Greene, John F.
The major purpose of this study is to further the development of procedures which minimize current limitations of creativity instruments, thus yielding a reliable and functional means for assessing creativity. Computerized content analysis and multiple regression are employed to simulate the creativity ratings of trained judges. The computerized…
Antecedent-Based Interventions for Young Children at Risk for Emotional and Behavioral Disorders
ERIC Educational Resources Information Center
Park, Kristy L.; Scott, Terrance M.
2009-01-01
Following descriptive functional assessment procedures, a brief structural analysis was used to confirm the hypothesized antecedent conditions that preceded problem behavior across three children enrolled in Head Start classrooms. A withdrawal design investigated the effectiveness of antecedent-based interventions to reduce disruptive behaviors…
Averaging Models: Parameters Estimation with the R-Average Procedure
ERIC Educational Resources Information Center
Vidotto, G.; Massidda, D.; Noventa, S.
2010-01-01
The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…
1990-07-01
replacing "logic diagrams" or "flow charts") to aid in coordinating the functions to be performed by a computer program and its associated Inputs...ADDRESS (City, State, and ZIP Code) 10. SOURCE OF FUNDING NUMBERS PROGRAM PROJECT ITASK IWORK UNIT ELEMENT NO. NO. NO. ACCESSION NO. 11. TITLE...the analysis. Both the logical model and detailed procedures are used to develop the application software programs which will be provided to Government
Saif, A M; Farboud, A; Delfosse, E; Pope, L; Adke, M
2016-10-01
Local anaesthetics and vasoconstrictors are essential for pain control and to aid intra-operative haemostasis in nasal procedures. They also improve access, and reduce discomfort when performing nasal endoscopy. There are no clear guidelines on preparing the nose despite ever more diagnostic and therapeutic procedures utilising the nose as a point of access. This review aims to identify nasal preparations used in diagnostic and therapeutic nasal procedures and to examine their safety and efficacy. Systematic review. A search was carried out using PubMed, MEDLINE, Ovid EMBASE, the Cochrane library and references from the included articles. The inclusion criteria included: full-text English language articles with regard to nasal preparation for surgery. Case reports, systematic reviews, meta-analyses, double-blind placebo-controlled randomised trials (RCTs) and case series were included. A total of 53 articles were retrieved: 13 articles on nasal preparation for operative procedures, six on functional endoscopic sinus surgery and 22 on nasendoscopy as well as six case reports. Cocaine was the most widely used topical preparation for operative procedures but was associated with more side-effects; thus, topical tetracaine and levobupivacaine infiltration are alternatives with equivalent efficacy but reduced adverse effects. All articles reviewed for functional endoscopic sinus surgery used a mixture containing lidocaine, adrenaline or both. Flexible nasendoscopy causes minimal patient discomfort and preparation is only recommended in selected patients, in contrast to rigid nasendoscopy, which requires preparation. For operative procedures, such as septorhinoplasty, a single agent, tetracaine or levobupivacaine, provides an improved surgical field. In functional endoscopic sinus surgery, lidocaine-adrenaline preparations have resulted in significantly better surgical and patient outcomes. There is little evidence to support the routine use of pre-procedural nasal preparation for flexible nasendoscopy. Those undergoing rigid endoscopy, conversely, always require the use of a vasoconstrictor and local anaesthetic. Pre-procedure assessment of patients is recommended, with agents being reserved for those with low pain thresholds, high anxiety and small nasal apertures presenting resistance to the insertion of the endoscope. © 2015 John Wiley & Sons Ltd.
Organisational Pattern Driven Recovery Mechanisms
NASA Astrophysics Data System (ADS)
Giacomo, Valentina Di; Presenza, Domenico; Riccucci, Carlo
The process of reaction to system failures and security attacks is strongly influenced by its infrastructural, procedural and organisational settings. Analysis of reaction procedures and practices from different domains (Air Traffic Management, Computer Security Incident Response, Emergency Response, recovery in the Chemical Process Industry) highlights three key requirements for this activity: smooth collaboration and coordination among responders, accurate monitoring and management of resources, and the ability to adapt pre-established reaction plans to the actual context. The SERENITY Reaction Mechanisms (SRM) is the subsystem of the SERENITY Run-time Framework aimed at providing SERENITY-aware AmI settings (i.e. socio-technical systems with highly distributed dynamic services) with functionalities to implement application-specific reaction strategies. The SRM uses SERENITY Organisational S&D Patterns as run-time models to drive these three key functionalities.
Analysis of truss, beam, frame, and membrane components. [composite structures
NASA Technical Reports Server (NTRS)
Knoell, A. C.; Robinson, E. Y.
1975-01-01
Truss components are considered, taking into account composite truss structures, truss analysis, column members, and truss joints. Beam components are discussed, giving attention to composite beams, laminated beams, and sandwich beams. Composite frame components and composite membrane components are examined. Examples of flat membrane components and of curved membrane elements are described. It is pointed out that composite structural design and analysis is a highly interactive, iterative procedure which does not lend itself readily to characterization by design or analysis function only.
Conway, Aaron; Page, Karen; Rolley, John; Fulbrook, Paul
2013-08-01
Side effects of the medications used for procedural sedation and analgesia in the cardiac catheterisation laboratory are known to cause impaired respiratory function. Impaired respiratory function poses considerable risk to patient safety as it can lead to inadequate oxygenation. Having knowledge about the conditions that predict impaired respiratory function prior to the procedure would enable nurses to identify at-risk patients and selectively implement intensive respiratory monitoring. This would reduce the possibility of inadequate oxygenation occurring. To identify pre-procedure risk factors for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Retrospective matched case-control. 21 cases of impaired respiratory function were identified and matched to 113 controls from a consecutive cohort of patients over 18 years of age. Conditional logistic regression was used to identify risk factors for impaired respiratory function. With each additional indicator of acute illness, case patients were nearly two times more likely than their controls to experience impaired respiratory function (OR 1.78; 95% CI 1.19-2.67; p = 0.005). Indicators of acute illness included emergency admission, being transferred from a critical care unit for the procedure or requiring respiratory or haemodynamic support in the lead up to the procedure. Several factors that predict the likelihood of impaired respiratory function were identified. The results from this study could be used to inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory.
NASA Astrophysics Data System (ADS)
Fukushima, Toshio
2018-02-01
In order to accelerate the spherical harmonic synthesis and/or analysis of an arbitrary function on the unit sphere, we developed a pair of procedures to transform between a truncated spherical harmonic expansion and the corresponding two-dimensional Fourier series. First, we obtained an analytic expression for the sine/cosine series coefficients of the 4π fully normalized associated Legendre function in terms of the rectangle values of the Wigner d function. Then, we elaborated the existing method to transform the coefficients of the surface spherical harmonic expansion to those of the double Fourier series so as to be applicable to arbitrarily high degree and order. Next, we created a new method to transform inversely a given double Fourier series to the corresponding surface spherical harmonic expansion. The key of the new method is a couple of new recurrence formulas to compute the inverse transformation coefficients: a decreasing-order, fixed-degree, and fixed-wavenumber three-term formula for general terms, and an increasing-degree-and-order and fixed-wavenumber two-term formula for diagonal terms. Meanwhile, the two seed values are analytically prepared. Both the forward and inverse transformation procedures are confirmed to be sufficiently accurate and applicable to extremely high degrees, orders, and wavenumbers, as high as 2^30 ≈ 10^9. The developed procedures will be useful not only in the synthesis and analysis of spherical harmonic expansions of arbitrarily high degree and order, but also in the evaluation of the derivatives and integrals of the spherical harmonic expansion.
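As a purely numerical illustration of the first step (not the paper's analytic Wigner d expressions), the sketch below recovers the finite sine/cosine series of a 4π fully normalized associated Legendre function by FFT, after extending it to [0, 2π) with parity (-1)^m; the degree, order, and FFT length are arbitrary choices.

```python
# Pbar_{n,m}(cos theta) is a trigonometric polynomial of degree n in theta
# (cosine terms for even m, sine terms for odd m), so an FFT of samples on a
# full period recovers its series coefficients exactly (up to rounding and
# SciPy's Condon-Shortley sign convention).
import numpy as np
from math import factorial
from scipy.special import lpmv

def pbar(n, m, theta):
    """4*pi fully normalized associated Legendre function of cos(theta)."""
    norm = np.sqrt((2 - (m == 0)) * (2 * n + 1)
                   * factorial(n - m) / factorial(n + m))
    return norm * lpmv(m, n, np.cos(theta))

n, m, N = 6, 3, 64                           # example degree/order, FFT length
theta = 2 * np.pi * np.arange(N) / N
theta_fold = np.minimum(theta, 2 * np.pi - theta)   # reflect into [0, pi]
sign = np.where(theta <= np.pi, 1.0, (-1.0) ** m)   # parity (-1)^m extension
f = sign * pbar(n, m, theta_fold)

c = np.fft.rfft(f) / N                       # complex Fourier coefficients
for k in range(n + 1):
    a_k = (2 if k else 1) * c[k].real        # cos(k*theta) coefficient
    b_k = -2 * c[k].imag                     # sin(k*theta) coefficient
    if abs(a_k) > 1e-10 or abs(b_k) > 1e-10:
        print(f"k={k}: cos {a_k:+.6f}  sin {b_k:+.6f}")
# For odd m only sine terms survive; for even m only cosine terms.
```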
A function-based approach to cockpit procedure aids
NASA Technical Reports Server (NTRS)
Phatak, Anil V.; Jain, Parveen; Palmer, Everett
1990-01-01
The objective of this research is to develop and test a cockpit procedural aid that can compose and present procedures that are appropriate for the given flight situation, as indicated by the status of the aircraft engineering systems and the environmental conditions. Prescribed procedures already exist for normal as well as for a number of non-normal and emergency situations, and can be presented to the crew using an interactive cockpit display. However, no procedures are prescribed or recommended for a host of plausible flight situations involving multiple malfunctions compounded by adverse environmental conditions. Under these circumstances, the cockpit procedural aid must review the prescribed procedures for the individual malfunctions (when available), evaluate the alternatives or options, and present one or more composite procedures (prioritized or unprioritized) in response to the given situation. A top-down, function-based conceptual approach towards composing and presenting cockpit procedures is being investigated. This approach is based upon the thought process that an operating crew must go through while attempting to meet the flight objectives given the current flight situation. In order to accomplish the flight objectives, certain critical functions must be maintained during each phase of the flight, using the appropriate procedures or success paths. The viability of these procedures depends upon the availability of required resources. If the resources available are not sufficient to meet the requirements, alternative procedures (success paths) using the available resources must be constructed to maintain the critical functions and the corresponding objectives. If no success path exists that can satisfy the critical functions/objectives, then the next level of critical functions/objectives must be selected and the process repeated. Information is given in viewgraph form.
Ghamari, M; Soltanpur, C; Cabrera, S; Romero, R; Martinek, R; Nazeran, H
2016-08-01
Heart Rate Variability (HRV) signal analysis provides a quantitative marker of the Autonomic Nervous System (ANS) function. A wristband-type wireless photoplethysmographic (PPG) device was custom-designed to collect and analyze the arterial pulse in the wrist. The proposed device is comprised of an optical sensor to monitor arterial pulse, a signal conditioning unit to filter and amplify the analog PPG signal, a microcontroller to digitize the analog PPG signal, and a Bluetooth module to transfer the data to a smart device. This paper proposes a novel model to represent the PPG signal as the summation of two Gaussian functions. The paper concludes with a verification procedure for HRV signal analysis during sedentary activities.
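A minimal sketch of fitting the two-Gaussian pulse model to one synthetic PPG beat with nonlinear least squares; the waveform, amplitudes, centers, and widths are invented for illustration and do not reproduce the authors' model parameters.

```python
# Fit one cardiac cycle of a synthetic PPG waveform with the sum of two
# Gaussians (systolic peak plus dicrotic component).
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, m1, s1, a2, m2, s2):
    return (a1 * np.exp(-((t - m1) ** 2) / (2 * s1 ** 2))
            + a2 * np.exp(-((t - m2) ** 2) / (2 * s2 ** 2)))

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 200)                       # one beat, normalized time
pulse = two_gaussians(t, 1.0, 0.30, 0.08, 0.45, 0.62, 0.12)
noisy = pulse + rng.normal(0, 0.01, t.size)      # add measurement noise

p0 = [1, 0.3, 0.1, 0.5, 0.6, 0.1]                # rough initial guesses
params, _ = curve_fit(two_gaussians, t, noisy, p0=p0)
print("a1, m1, s1, a2, m2, s2 =", np.round(params, 3))
```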
Nunziante Cesaro, Stella; Lemorini, Cristina
2012-02-01
The application of combined use-wear analysis and FTIR microspectroscopy to the investigation of flint and obsidian tools from the archaeological sites of Masseria Candelaro (Foggia, Italy) and Sant'Anna di Oria (Brindisi, Italy), aimed at clarifying their functional use, is described. A very high percentage of the tools excavated at the former site showed spectroscopically detectable residues on their working edges. The identification of the micro-deposits is based on comparison with a great number of replicas studied under the same experimental conditions. The FTIR data confirmed the use-wear analysis suggestions in almost all cases and added details about the material processed and about the working procedures. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mokhtar, Nurkhairany Amyra; Zubairi, Yong Zulina; Hussin, Abdul Ghapor
2017-05-01
Outlier detection has been used extensively in data analysis to detect anomalous observations in data and has important applications in fraud detection and robust analysis. In this paper, we propose a method for detecting multiple outliers for circular variables in the linear functional relationship model. Using the residual values of the Caires and Wyatt model, we apply a hierarchical clustering procedure. With the use of a tree diagram, we illustrate the graphical detection of outliers. A simulation study is done to verify the accuracy of the proposed method. Also, an application to a real data set is given to show its practical applicability.
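A small sketch of the clustering step, assuming simulated circular residuals rather than the Caires and Wyatt model residuals: single-linkage hierarchical clustering on circular distances, with the tree cut so that a small, distant cluster is flagged as outliers (the cut height is an arbitrary choice).

```python
# Flag outliers among circular residuals via hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
resid = rng.vonmises(0.0, 20.0, size=50)         # well-behaved residuals
resid = np.append(resid, [2.8, -3.0, 2.9])       # three planted outliers

# Circular distance between residuals, wrapped to [0, pi].
d = np.abs(resid[:, None] - resid[None, :])
d = np.minimum(d, 2 * np.pi - d)
cond = d[np.triu_indices(len(resid), 1)]         # condensed distance vector

Z = linkage(cond, method='single')               # the "tree diagram"
labels = fcluster(Z, t=1.0, criterion='distance')
main = np.bincount(labels).argmax()              # largest = clean cluster
print("flagged outliers:", np.where(labels != main)[0])
```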
Local linear regression for function learning: an analysis based on sample discrepancy.
Cervellera, Cristiano; Macciò, Danilo
2014-11-01
Local linear regression models, a kind of nonparametric structure that locally performs a linear estimation of the target function, are analyzed in the context of empirical risk minimization (ERM) for function learning. The analysis is carried out with emphasis on geometric properties of the available data. In particular, the discrepancy of the observation points used both to build the local regression models and to compute the empirical risk is considered. This makes it possible to treat in the same way the case in which the samples come from a random external source and the one in which the input space can be freely explored. Both consistency of the ERM procedure and the approximating capabilities of the estimator are analyzed, proving conditions that ensure convergence. Since the theoretical analysis shows that the estimation improves as the discrepancy of the observation points becomes smaller, low-discrepancy sequences, a family of sampling methods commonly employed for efficient numerical integration, are also analyzed. Simulation results involving two different examples of function learning are provided.
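For concreteness, a minimal local linear regression estimator with a Gaussian kernel, an arbitrary bandwidth, and a random design; per the discrepancy discussion above, a low-discrepancy design (e.g. a Sobol sequence) could be substituted for the random draws.

```python
# Local linear regression: at each query point, fit a weighted straight line.
import numpy as np

def local_linear(x_train, y_train, x_query, h=0.15):
    preds = []
    for x0 in np.atleast_1d(x_query):
        w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)   # Gaussian weights
        X = np.column_stack([np.ones_like(x_train), x_train - x0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_train)
        preds.append(beta[0])                          # intercept = local fit
    return np.array(preds)

rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 200)                             # random design points
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)
xq = np.linspace(0.05, 0.95, 5)
print(np.round(local_linear(x, y, xq), 3))
print(np.round(np.sin(2 * np.pi * xq), 3))             # ground truth
```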
NASA Astrophysics Data System (ADS)
Cai, Jianhua
2017-05-01
The time-frequency analysis method represents a signal as a function of time and frequency, and it is considered a powerful tool for handling arbitrary non-stationary time series by using instantaneous frequency and instantaneous amplitude. It also provides a possible alternative for the analysis of the non-stationary magnetotelluric (MT) signal. Based on the Hilbert-Huang transform (HHT), a time-frequency analysis method is proposed to obtain stable estimates of the magnetotelluric response function. In contrast to conventional methods, the response function estimation is performed in the time-frequency domain using instantaneous spectra rather than in the frequency domain, which allows imaging the response parameter content as a function of time and frequency. The theory of the method is presented, and the mathematical model and calculation procedure used to estimate the response function from the HHT time-frequency spectrum are discussed. To evaluate the results, response function estimates are compared with estimates from a standard MT data processing method based on the Fourier transform. All results show that apparent resistivities and phases calculated with the HHT time-frequency method are generally more stable and reliable than those determined from simple Fourier analysis. The proposed method overcomes the drawbacks of the traditional Fourier methods, and the resulting estimates minimise the bias caused by the non-stationary characteristics of the MT data.
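The sketch below shows only the Hilbert step of the HHT, extracting instantaneous amplitude and frequency from a single-component test signal; in the full method, each intrinsic mode function produced by empirical mode decomposition (via a third-party EMD package, assumed here and not shown) would be processed this way.

```python
# Instantaneous amplitude and frequency via the analytic signal.
import numpy as np
from scipy.signal import hilbert

fs = 200.0                                       # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * (1.0 + 0.1 * t) * t)      # chirp-like test signal

analytic = hilbert(x)                            # x + i * Hilbert{x}
inst_amp = np.abs(analytic)
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(inst_phase, 1 / fs) / (2 * np.pi)

i = int(1.0 * fs)                                # sample index at t = 1 s
print(f"instantaneous amplitude {inst_amp[i]:.2f}, "
      f"frequency {inst_freq[i]:.2f} Hz (expected ~1.2 Hz)")
```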
Evaluation of aesthetic and functional outcomes in rhinoplasty surgery: a prospective study.
Sena Esteves, Sara; Gonçalves Ferreira, Miguel; Carvalho Almeida, João; Abrunhosa, José; Almeida E Sousa, Cecília
Evaluation of surgery outcomes measured by patient satisfaction or quality of life is very important, especially in plastic surgery, and there is increasing interest in self-reported outcome evaluation in this field. The aim of our study was to determine patient satisfaction with regard to nose appearance and function, using a validated questionnaire before and after rhinoplasty surgery. A prospective study was carried out at a tertiary centre. All rhinoplasty surgeries performed in adults between February 2013 and August 2014 were included. Many patients underwent additional nasal surgery such as septoplasty or turbinoplasty. The surgical procedures and patients' characteristics were also recorded. Among 113 patients, 107 completed the questionnaires and the follow-up period. Analysis of pre-operative and post-operative Rhinoplasty Outcome Evaluation scores showed a significant improvement after 3 and 6 months in functional and aesthetic questions (p<0.01). Pre-operatively, anxious and insecure patients had worse scores (p<0.05). The difference in improvement of scores was not significant when groups were divided on the basis of other nasal procedures, primary or revision surgery, and open versus closed approach. We found that patients with a lower level of education were more satisfied with the procedure. Rhinoplasty surgery significantly improved patient quality of life regarding nose function and appearance. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
[Education for patients with fibromyalgia. A systematic review of randomised clinical trials].
Elizagaray-Garcia, Ignacio; Muriente-Gonzalez, Jorge; Gil-Martinez, Alfonso
2016-01-16
To analyse the effectiveness of education about pain, quality of life and functionality in patients with fibromyalgia. The search for articles was carried out in electronic databases. Eligibility criteria were: randomised controlled clinical trials (RCTs), published in English or Spanish, conducted on patients with fibromyalgia, in which the therapeutic procedure was based on patient education. Two independent reviewers analysed the methodological quality using the PEDro scale. Five RCTs were selected, of which four offered good methodological quality. In three of the studies, patient education, in combination with another intervention based on therapeutic exercise, improved the outcomes on the variables assessing pain and quality of life as compared with the same procedures performed separately. Moreover, an RCT with a high-quality methodology showed that patient education activated inhibitory neural pathways capable of lowering the level of pain. The quantitative analysis yields strong to moderate evidence that patient education, in combination with other therapeutic exercise procedures, offers positive results on the variables of pain, quality of life and functionality. Patient education by itself has not proved to be effective for pain, quality of life or functionality in patients with fibromyalgia. There is strong evidence, however, of the effectiveness of combining patient education with exercise and active pain-coping strategies for pain, quality of life and functionality in the short, medium and long term in patients with fibromyalgia.
NASA Astrophysics Data System (ADS)
Fisichella, M.; Shotter, A. C.; Di Pietro, A.; Figuera, P.; Lattuada, M.; Marchetta, C.; Privitera, V.; Romano, L.; Ruiz, C.; Zadro, M.
2015-12-01
For low-energy reaction studies involving radioactive ion beams, the experimental reaction yields are generally small due to the low intensity of the beams. For this reason, the stacked-target technique has often been used to measure excitation functions. This technique offers considerable advantages since the reaction cross-section at several energies can be measured simultaneously. In a further effort to increase yields, thick targets are also employed. The main disadvantage of the method is the degradation of the beam quality as it passes through the stack, due to the statistical nature of energy-loss processes and any nonuniformity of the stacked targets. This degradation can lead to ambiguities in associating effective beam energies with reaction-product yields for the targets within the stack and, as a consequence, to errors in the determination of the excitation function for the reaction under study. A thorough investigation of these ambiguities is reported, and a best-practice procedure for analyzing data obtained using the stacked-target technique with radioactive ion beams is recommended. Using this procedure, a re-evaluation of some previously published sub-barrier fusion data is reported in order to demonstrate the possibility of misinterpretation of derived excitation functions. In addition, this best-practice procedure has been used to evaluate, from a new data set, the sub-barrier fusion excitation function for the reaction 6Li+120Sn.
Frieben, Ryan W; Lin, Hao-Cheng; Hinh, Peter P; Berardinelli, Francesco; Canfield, Steven E; Wang, Run
2010-07-01
A systematic review of randomized controlled trials and cohort studies was conducted to evaluate data for the effects of minimally invasive procedures for treatment of symptomatic benign prostatic hyperplasia (BPH) on male sexual function. The studies searched were trials that enrolled men with symptomatic BPH who were treated with laser surgeries, transurethral microwave therapy (TUMT), transurethral needle ablation of the prostate (TUNA), transurethral ethanol ablation of the prostate (TEAP) and high-intensity focused ultrasound (HIFU), in comparison with traditional transurethral resection of the prostate (TURP) or sham operations. A total of 72 studies were identified, of which 33 met the inclusion criteria. Of the 33 studies, 21 were concerned with laser surgeries, six with TUMT, four with TUNA and two with TEAP containing information regarding male sexual function. No study is available regarding the effect of HIFU for BPH on male sexual function. Our analysis shows that minimally invasive surgeries for BPH have comparable effects to those of TURP on male erectile function. Collectively, less than 15.4% or 15.2% of patients will have either decrease or increase, respectively, of erectile function after laser procedures, TUMT and TUNA. As observed with TURP, a high incidence of ejaculatory dysfunction (EjD) is common after treatment of BPH with holmium, potassium-titanyl-phosphate and thulium laser therapies (> 33.6%). TUMT, TUNA and neodymium:yttrium aluminum garnet visual laser ablation or interstitial laser coagulation for BPH has less incidence of EjD, but these procedures are considered less effective for BPH treatment when compared with TURP.
Fernández, Marcela T; Gómez, Adrián R; Santojanni, Américo M; Cancio, Alfredo H; Luna, Daniel R; Benítez, Sonia E
2015-01-01
Electronic Health Record system downtimes may have a great impact on patient care continuity. This paper describes the analysis and actions taken to redesign the Contingency Plan Procedure for the Electronic Health Record System of Hospital Italiano de Buenos Aires. After conducting a thorough analysis of the data gathered at post-contingency meetings, weaknesses were identified in the procedure; thus, strategic actions were recommended to redesign the Contingency Plan to secure an effective communications channel, as well as a formal structure for functions that may support the decision-making process. The main actions were: 1) to incorporate the IT Contingencies Committee (Plan management); 2) to incorporate the Coordinator (general supervision of the procedure); and 3) to redefine the role of the Clinical Informatics Resident, who will be responsible for managing communication between the technical team and Electronic Health Record users. As users need the information for continuity of care, key users evaluated the impact of the new strategy with an adapted survey.
NASA Technical Reports Server (NTRS)
Wong, K. W.
1974-01-01
In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.
1995-05-01
A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.
Point-of-care instrument for monitoring tissue health during skin graft repair
NASA Astrophysics Data System (ADS)
Gurjar, R. S.; Seetamraju, M.; Zhang, J.; Feinberg, S. E.; Wolf, D. E.
2011-06-01
We have developed the necessary theoretical framework and the basic instrumental design parameters to enable mapping of subsurface blood dynamics and tissue oxygenation for patients undergoing skin graft procedures. This analysis forms the basis for developing a simple patch geometry that can be used with diffuse optical techniques to map blood flow velocity and tissue oxygenation as a function of depth in subsurface tissue. Keywords: skin graft, diffuse correlation analysis, oxygen saturation.
Status of nuclear PDFs after the first LHC p-Pb run
NASA Astrophysics Data System (ADS)
Paukkunen, Hannu
2017-11-01
In this talk, I overview the recent progress on the global analysis of nuclear parton distribution functions (nuclear PDFs). After first introducing the contemporary fits, the analysis procedures are quickly recalled and the ambiguities in the use of experimental data outlined. Various nuclear-PDF parametrizations are compared and the main differences explained. The effects of nuclear PDFs in the LHC p-Pb hard-process observables are discussed and some future prospects sketched.
The Analysis of Riboflavin in Urine Using Fluorescence
NASA Astrophysics Data System (ADS)
Henderleiter, Julie A.; Hyslop, Richard M.
1996-06-01
To become functional as scientists, chemistry students must integrate concepts learned in their classes and apply them to novel, "real life" situations. The laboratory provides an important place for the students to practice integrating concepts. This laboratory experiment, designed for undergraduate biochemistry students, requires each student to determine the amount of riboflavin excreted by his/her body following oral administration of riboflavin contained in a multi-vitamin tablet. The experimental procedure describes a protocol for the analysis of riboflavin concentration in urine using a fluorometric assay. The students must draw upon their knowledge of solution preparation, construction of a standard curve, and back-calculation procedures to determine the concentration of riboflavin in their urine. Students need to combine knowledge from general and analytical chemistry with that learned in biochemistry to complete this analysis, thus providing an opportunity to integrate knowledge while answering a novel question.
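A minimal sketch of the standard-curve back-calculation the students perform; all concentrations, fluorescence readings, and the dilution factor are made-up numbers for illustration.

```python
# Fit fluorescence vs. known riboflavin standards, then invert the line for
# an unknown sample and correct for dilution.
import numpy as np

conc_std = np.array([0.0, 0.1, 0.2, 0.4, 0.8])        # ug/mL standards
fluor_std = np.array([2.0, 11.5, 21.8, 41.0, 82.3])   # arbitrary units

slope, intercept = np.polyfit(conc_std, fluor_std, 1)  # standard curve

fluor_sample, dilution = 30.4, 10                      # diluted urine reading
conc_sample = (fluor_sample - intercept) / slope * dilution
print(f"riboflavin ~ {conc_sample:.2f} ug/mL in the original urine")
```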
Conjoint Analysis for New Service Development on Electricity Distribution in Indonesia
NASA Astrophysics Data System (ADS)
Widaningrum, D. L.; Chynthia; Astuti, L. D.; Seran, M. A. B.
2017-07-01
Illegal use of electricity is still rampant in Indonesia, especially for activities where no power source is available, such as at street vendors' locations. It not only harms the state, but also the perpetrators of electricity theft and the surrounding communities. The purpose of this study is to create a New Service Development (NSD) to provide a new electricity source for street vendors' activity based on their preferences. The methods applied in the NSD are Conjoint Analysis, Cluster Analysis, Quality Function Deployment (QFD), Service Blueprint, Process Flow Diagrams and Quality Control Plan. The results of this study are the attributes and their importance for the new electricity service based on street vendors' preferences as customers, customer segmentation, the design of the new service, the technical response design, operational procedures, and the quality control plan for the operational procedures.
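To illustrate the conjoint analysis step, a minimal rating-based part-worth estimation via dummy-coded least squares; the attributes (price level, payment scheme), levels, and ratings are invented and do not reflect the study's actual design.

```python
# Estimate part-worth utilities from preference ratings of service profiles.
import numpy as np

# Profiles: (price level, payment scheme), rated by a respondent on 1-9.
profiles = [("low", "prepaid"), ("low", "monthly"),
            ("high", "prepaid"), ("high", "monthly")]
ratings = np.array([9.0, 7.0, 4.0, 3.0])

# Dummy coding: intercept, 1 if price == "low", 1 if payment == "prepaid".
X = np.array([[1, p == "low", s == "prepaid"] for p, s in profiles], float)
beta, *_ = np.linalg.lstsq(X, ratings, rcond=None)
base, pw_price, pw_prepaid = beta
print(f"part-worths: price(low)={pw_price:.2f}, prepaid={pw_prepaid:.2f}")

# Relative importance = each attribute's utility range over the total range.
ranges = np.array([abs(pw_price), abs(pw_prepaid)])
print("importance:", np.round(ranges / ranges.sum(), 2))
```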
Typification of cider brandy on the basis of cider used in its manufacture.
Rodríguez Madrera, Roberto; Mangas Alonso, Juan J
2005-04-20
A study of the typification of cider brandies on the basis of the origin of the raw material used in their manufacture was conducted using chemometric techniques (principal component analysis, linear discriminant analysis, and Bayesian analysis) together with their composition in volatile compounds, as analyzed by gas chromatography with flame ionization detection for the major volatiles and mass spectrometry for the minor ones. Significant principal components computed by a double cross-validation procedure allowed the structure of the database to be visualized as a function of the raw material, that is, cider made from fresh apple juice versus cider made from apple juice concentrate. Feasible and robust discriminant rules were computed and validated by a cross-validation procedure that allowed the authors to classify fresh and concentrate cider brandies, obtaining classification hits of >92%. The most discriminating variables for typifying cider brandies according to their raw material were 1-butanol and ethyl hexanoate.
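A rough sketch of the chemometric workflow, with a simulated volatile-compound matrix standing in for the GC measurements: standardization, PCA scores, a linear discriminant classifier, and cross-validated classification hits.

```python
# PCA + LDA classification of two raw-material classes with cross-validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n, p = 60, 12                                    # samples x volatile compounds
fresh = rng.normal(0.0, 1.0, (n // 2, p))
concentrate = rng.normal(0.8, 1.0, (n // 2, p))  # shifted composition
X = np.vstack([fresh, concentrate])
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]

clf = make_pipeline(StandardScaler(), PCA(n_components=4),
                    LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)        # cross-validated hit rate
print(f"classification hits: {scores.mean():.0%}")
```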
Performance analysis of a generalized upset detection procedure
NASA Technical Reports Server (NTRS)
Blough, Douglas M.; Masson, Gerald M.
1987-01-01
A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
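As a back-of-the-envelope illustration of the three measures (not the paper's derivations, which do not rest on these simplifications), suppose each captured block is analyzed independently, with per-block detection probability p for an upset-bearing block and false-alarm probability q for an upset-free block.

```python
# Toy computation of the three performance measures under an independence
# assumption across captured blocks; p, q, and the capture rate are made up.
p, q = 0.90, 0.001          # assumed per-block probabilities
blocks_per_hour = 3600      # assumed one block captured per second

prob_detect_within_3 = 1 - (1 - p) ** 3          # detection within 3 blocks
expected_false_alarms = q * blocks_per_hour      # per hour of monitoring
expected_latency_blocks = 1 / p                  # mean blocks until detection

print(f"P(detect within 3 blocks) = {prob_detect_within_3:.4f}")
print(f"expected false alarms/hour = {expected_false_alarms:.1f}")
print(f"expected detection latency = {expected_latency_blocks:.2f} blocks")
```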
An improved procedure for detection and enumeration of walrus signatures in airborne thermal imagery
Burn, Douglas M.; Udevitz, Mark S.; Speckman, Suzann G.; Benter, R. Bradley
2009-01-01
In recent years, application of remote sensing to marine mammal surveys has been a promising area of investigation for wildlife managers and researchers. In April 2006, the United States and Russia conducted an aerial survey of Pacific walrus (Odobenus rosmarus divergens) using thermal infrared sensors to detect groups of animals resting on pack ice in the Bering Sea. The goal of this survey was to estimate the size of the Pacific walrus population. An initial analysis of the U.S. data using previously established methods resulted in lower detectability of walrus groups in the imagery and higher variability in calibration models than was expected based on pilot studies. This paper describes an improved procedure for detection and enumeration of walrus groups in airborne thermal imagery. Thermal images were first subdivided into smaller 200 x 200 pixel "tiles." We calculated three statistics to represent characteristics of walrus signatures from the temperature histogram for each tile. Tiles that exhibited one or more of these characteristics were examined further to determine whether walrus signatures were present. We used cluster analysis on tiles that contained walrus signatures to determine which pixels belonged to each group. We then calculated a thermal index value for each walrus group in the imagery and used generalized linear models to estimate detection functions (the probability of a group having a positive index value) and calibration functions (the size of a group as a function of its index value) based on counts from matched digital aerial photographs. The new method described here improved our ability to detect walrus groups at both 2 m and 4 m spatial resolution. In addition, the resulting calibration models have lower variance than the original method. We anticipate that the use of this new procedure will greatly improve the quality of the population estimate derived from these data. This procedure may also have broader applicability to thermal infrared surveys of other wildlife species. Published by Elsevier B.V.
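A schematic version of the tile-screening step, with an implanted warm region standing in for a walrus group; the tile size matches the paper, but the histogram statistics and thresholds are illustrative stand-ins for those developed in the study.

```python
# Cut a thermal image into 200 x 200 pixel tiles, compute simple histogram
# statistics per tile, and flag tiles whose warm tail suggests signatures.
import numpy as np

rng = np.random.default_rng(8)
image = rng.normal(-10.0, 0.5, (1000, 1200))     # cold sea-ice background, C
image[410:425, 630:660] += 12.0                  # implanted warm "group"

TILE = 200
flagged = []
for i in range(0, image.shape[0], TILE):
    for j in range(0, image.shape[1], TILE):
        tile = image[i:i + TILE, j:j + TILE]
        stats = (tile.max() - np.median(tile),       # warm-tail excursion
                 np.percentile(tile, 99.9) - np.percentile(tile, 50),
                 tile.std())
        if stats[0] > 5.0 or stats[1] > 3.0:         # illustrative thresholds
            flagged.append((i, j, np.round(stats, 2)))

for i, j, s in flagged:
    print(f"tile at ({i}, {j}) flagged, stats = {s}")
```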
Szidarovszky, Tamás; Fábri, Csaba; Császár, Attila G
2012-05-07
Approximate rotational characterization of variational rovibrational wave functions via the rigid rotor decomposition (RRD) protocol is developed for Hamiltonians based on arbitrary sets of internal coordinates and axis embeddings. An efficient and general procedure is given that allows employing the Eckart embedding with arbitrary polyatomic Hamiltonians through a fully numerical approach. RRD tables formed by projecting rotational-vibrational wave functions into products of rigid-rotor basis functions and previously determined vibrational eigenstates yield rigid-rotor labels for rovibrational eigenstates by selecting the largest overlap. Embedding-dependent RRD analyses are performed, up to high energies and rotational excitations, for the H₂¹⁶O isotopologue of the water molecule. Irrespective of the embedding chosen, the RRD procedure proves effective in providing unambiguous rotational assignments at low energies and J values. Rotational labeling of rovibrational states of H₂¹⁶O proves to be increasingly difficult beyond about 10,000 cm⁻¹, close to the barrier to linearity of the water molecule. For medium energies and excitations the Eckart embedding yields the largest RRD coefficients, thus providing the largest number of unambiguous rotational labels.
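A toy sketch of the RRD labeling idea, with random orthonormal matrices standing in for actual variational wave functions: project the rovibrational eigenvector onto vibrational-state x rigid-rotor products and take the largest squared overlap as the label.

```python
# Rigid rotor decomposition (RRD) labeling by largest squared overlap.
import numpy as np

rng = np.random.default_rng(9)
n_basis, n_vib, n_rot = 40, 5, 8

# Columns: vibrational eigenstates (orthonormal) and rigid-rotor functions.
vib, _ = np.linalg.qr(rng.normal(size=(n_basis, n_vib)))
rot = np.eye(n_rot)

# A rovibrational eigenvector in the product basis (n_basis * n_rot).
psi = rng.normal(size=n_basis * n_rot)
psi /= np.linalg.norm(psi)

# RRD table: squared overlaps with each |vib v> x |rot r> product.
C = psi.reshape(n_basis, n_rot)                  # coefficient matrix
rrd = (vib.T @ C @ rot) ** 2                     # (n_vib, n_rot) table
v, r = np.unravel_index(rrd.argmax(), rrd.shape)
print(f"assigned labels: vibrational state {v}, rigid-rotor state {r}; "
      f"overlap^2 = {rrd[v, r]:.3f}")
```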
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
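A compact sketch of the functional Nadaraya-Watson estimator with the asymmetrical quadratic kernel on [0, 1]; the daily curves are simulated, the semi-metric is a plain L2 distance (FPCA- or derivative-based semi-metrics could be substituted), and the bandwidth is an arbitrary choice.

```python
# Functional kernel regression: predict a scalar from a daily curve.
import numpy as np

rng = np.random.default_rng(10)
days, hours = 300, 24
phase = rng.uniform(0, 1, days)
grid = np.arange(hours) / hours
curves = np.sin(2 * np.pi * grid[None, :] + phase[:, None])
curves += rng.normal(0, 0.1, curves.shape)       # noisy daily profiles
response = phase + rng.normal(0, 0.05, days)     # scalar to predict

def semimetric(a, b):
    """L2 distance between two curves."""
    return np.sqrt(np.mean((a - b) ** 2))

def predict(new_curve, h):
    d = np.array([semimetric(new_curve, c) for c in curves])
    u = d / h
    w = np.where((u >= 0) & (u <= 1), 1 - u ** 2, 0.0)   # quadratic kernel
    return (w @ response) / w.sum() if w.sum() > 0 else np.nan

test = np.sin(2 * np.pi * grid + 0.4)
print(f"predicted {predict(test, h=0.3):.3f} (truth ~ 0.4)")
```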
Saving the Best for Last? A Cross-Species Analysis of Choices between Reinforcer Sequences
ERIC Educational Resources Information Center
Andrade, Leonardo F.; Hackenberg, Timothy D.
2012-01-01
Two experiments were conducted to compare choices between sequences of reinforcers in pigeon (Experiment 1) and human (Experiment 2) subjects, using functionally analogous procedures. The subjects made pairwise choices among 3 sequence types, all of which provided the same overall reinforcement rate, but differed in their temporal patterning.…
The Effects of a Brushing Procedure on Stereotypical Behavior
ERIC Educational Resources Information Center
Davis, Tonya N.; Durand, Shannon; Chan, Jeffrey M.
2011-01-01
In this study we analyzed the effects of a brushing protocol on stereotyped behavior of a young boy with autism. First, a functional analysis was conducted which showed that the participant's stereotypy was maintained by automatic reinforcement. Next, the Wilbarger Protocol, a brushing intervention, was implemented. An ABA design was implemented…
ERIC Educational Resources Information Center
Baehr, Melany E.
1984-01-01
An empirical procedure to determine areas of required development for personnel in three management hierarchies (line, professional, and sales) involves a job analysis of nine key positions in these hierarchies, determination of learning needs for each job function, and development of program curricula for each need. (SK)
ERIC Educational Resources Information Center
Dave, Eshan V.
2009-01-01
Asphalt concrete pavements are inherently graded viscoelastic structures. Oxidative aging of asphalt binder and temperature cycling due to climatic conditions are the major causes of this non-homogeneity. Current pavement analysis and simulation procedures rely on a layered approach to account for these non-homogeneities. The conventional…
Treatment of Challenging Behavior Exhibited by Children with Prenatal Drug Exposure
ERIC Educational Resources Information Center
Kurtz, Patricia F.; Chin, Michelle D.; Rush, Karena S.; Dixon, Dennis R.
2008-01-01
A large body of literature exists describing the harmful effects of prenatal drug exposure on infant and child development. However, there is a paucity of research examining strategies to ameliorate sequelae such as externalizing behavior problems. In the present study, functional analysis procedures were used to assess challenging behavior…
Modelling Systems of Classical/Quantum Identical Particles by Focusing on Algorithms
ERIC Educational Resources Information Center
Guastella, Ivan; Fazio, Claudio; Sperandeo-Mineo, Rosa Maria
2012-01-01
A procedure modelling ideal classical and quantum gases is discussed. The proposed approach is mainly based on the idea that modelling and algorithm analysis can provide a deeper understanding of particularly complex physical systems. Appropriate representations and physical models able to mimic possible pseudo-mechanisms of functioning and having…
APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System
NASA Technical Reports Server (NTRS)
Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren
2004-01-01
The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: (a) Flight Data Importation and (b) Flight Data Analysis.
Collection Development Organization and Committees. SPEC Kit 11.
ERIC Educational Resources Information Center
Association of Research Libraries, Washington, DC. Office of Management Studies.
This kit focuses on information that is useful for starting a collection development program. It contains 7 position descriptions, 10 documents on the role of committees, 4 organization charts, 5 documents on the organization of functions, and an analysis of a Systems and Procedures Exchange Center (SPEC) collection development survey. The survey,…
Light duty utility arm phase 2 qualification test procedure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnes, G.A.
1997-01-16
This Acceptance Test Procedure (ATP) will test and verify that the Exhauster meets the specified functional requirements, safety requirements, and operating requirements, and will provide a record of the functional test results. The systems/functions that will be tested are listed in the scope section of the Acceptance Test Procedure.
NASA Astrophysics Data System (ADS)
Stoykova, Boyka; Chochkova, Maya; Ivanova, Galya; Markova, Nadezhda; Enchev, Venelin; Tsvetkova, Iva; Najdenski, Hristo; Štícha, Martin; Milkova, Tsenka
2017-05-01
N-phenylpropenoyl amino acid amides have been brominated using two alternative sonochemically activated green chemistry procedures. The first synthetic procedure involved an ultrasound-assisted bromination in an aqueous medium using an ionic liquid as catalyst, whereas the second used in situ formation of Br2 via oxidation of HBr by H2O2. For comparison, the conventional bromination procedure was also used. The newly brominated compounds were characterized by appropriate analytical techniques. A detailed NMR spectroscopic analysis and quantum chemical calculations using Density Functional Theory (DFT) methods were used to define the stereochemistry of the products. The results confirmed the physicochemical identity and similar yields of the products obtained by the three synthetic procedures employed, and revealed the co-existence of two diastereoisomeric forms of the newly synthesized products. The antibacterial and antifungal activities of the dibrominated amides were evaluated.
Numerical simulation of aerothermal loads in hypersonic engine inlets due to shock impingement
NASA Technical Reports Server (NTRS)
Ramakrishnan, R.
1992-01-01
The effect of shock impingement on an axial corner simulating the inlet of a hypersonic vehicle engine is modeled using a finite-difference procedure. A three-dimensional dynamic grid adaptation procedure is utilized to move the grids to regions with strong flow gradients. The adaptation procedure uses a grid relocation stencil that is valid at both the interior and boundary points of the finite-difference grid. A linear combination of spatial derivatives of specific flow variables, calculated with finite-element interpolation functions, is used as the adaptation measure. This computational procedure is used to study laminar and turbulent Mach 6 flows in the axial corner. The description of flow physics and qualitative measures of heat transfer distributions on cowl and strut surfaces obtained from the analysis are compared with experimental observations. Conclusions are drawn regarding the capability of the numerical scheme for enhanced modeling of high-speed compressible flows.
Bifurcation Analysis of a Predator-Prey System with Ratio-Dependent Functional Response
NASA Astrophysics Data System (ADS)
Jiang, Xin; She, Zhikun; Feng, Zhaosheng; Zheng, Xiuliang
2017-12-01
In this paper, we are concerned with the structural stability of a density dependent predator-prey system with ratio-dependent functional response. Starting with the geometrical analysis of hyperbolic curves, we obtain that the system has one or two positive equilibria under various conditions. Inspired by the S-procedure and semi-definite programming, we use the sum of squares decomposition based method to ensure the global asymptotic stability of the positive equilibrium through the associated polynomial Lyapunov functions. By exploring the monotonic property of the trace of the Jacobian matrix with respect to r under the given different conditions, we analytically verify that there is a corresponding unique r∗ such that the trace is equal to zero and prove the existence of Hopf bifurcation, respectively.
Analog computation of auto and cross-correlation functions
NASA Technical Reports Server (NTRS)
1974-01-01
For analysis of the data obtained from the cross beam systems it was deemed desirable to compute the auto- and cross-correlation functions by both digital and analog methods to provide a cross-check of the analysis methods and an indication as to which of the two methods would be most suitable for routine use in the analysis of such data. It is the purpose of this appendix to provide a concise description of the equipment and procedures used for the electronic analog analysis of the cross beam data. A block diagram showing the signal processing and computation set-up used for most of the analog data analysis is provided. The data obtained at the field test sites were recorded on magnetic tape using wide-band FM recording techniques. The data as recorded were band-pass filtered by electronic signal processing in the data acquisition systems.
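A digital counterpart of the analog computation described here is straightforward; the following numpy sketch estimates biased auto- and cross-correlation functions. Function names, the normalization, and the toy signals are assumptions for illustration, not the appendix's equipment or procedure.

```python
import numpy as np

def xcorr(x, y, max_lag):
    """Biased cross-correlation estimate R_xy(tau), tau = 0..max_lag:
    R_xy(tau) = (1/N) * sum_t x(t) * y(t + tau).
    With y = x this gives the autocorrelation function."""
    n = len(x)
    return np.array([np.dot(x[:n - tau], y[tau:]) / n for tau in range(max_lag + 1)])

# Toy signals standing in for two band-pass-filtered beam records.
rng = np.random.default_rng(1)
s = rng.normal(size=2048)
x = s + 0.3 * rng.normal(size=2048)
y = np.roll(s, 5) + 0.3 * rng.normal(size=2048)  # y lags x by 5 samples

acf = xcorr(x, x, 50)
ccf = xcorr(x, y, 50)
print("peak of cross-correlation at lag", int(np.argmax(ccf)))  # expect ~5
```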
Meshfree truncated hierarchical refinement for isogeometric analysis
NASA Astrophysics Data System (ADS)
Atri, H. R.; Shojaee, S.
2018-05-01
In this paper truncated hierarchical B-spline (THB-spline) is coupled with the reproducing kernel particle method (RKPM) to blend advantages of isogeometric analysis and meshfree methods. Since, under certain conditions, the isogeometric B-spline and NURBS basis functions are exactly represented by reproducing kernel meshfree shape functions, the recursive process of producing isogeometric bases can be omitted. More importantly, a seamless link between meshfree methods and isogeometric analysis can be easily defined, which provides an authentic meshfree approach to refining the model locally in isogeometric analysis. This procedure can be accomplished using truncated hierarchical B-splines to construct new bases and adaptively refine them. It is also shown that the THB-RKPM method can provide efficient approximation schemes for numerical simulations and shows promising performance in adaptive refinement of partial differential equations via isogeometric analysis. The proposed approach for adaptive local refinement is presented in detail and its effectiveness is investigated through well-known benchmark examples.
Svider, Peter F; Keeley, Brieze R; Zumba, Osvaldo; Mauro, Andrew C; Setzen, Michael; Eloy, Jean Anderson
2013-08-01
Malpractice litigation has increased in recent decades, contributing to higher health-care costs. Characterization of complications leading to litigation is of special interest to practitioners of facial plastic surgery procedures because of the higher proportion of elective cases relative to other subspecialties. In this analysis, we comprehensively examine malpractice litigation in facial plastic surgery procedures and characterize factors important in determining legal responsibility, as this information may be of great interest and use to practitioners in several specialties. Retrospective analysis. The Westlaw legal database was examined for court records pertaining to facial plastic surgery procedures. The term "medical malpractice" was searched in combination with numerous procedures obtained from the American Academy of Facial Plastic and Reconstructive Surgery website. Of the 88 cases included, 62.5% were decided in the physician's favor, 9.1% were resolved with an out-of-court settlement, and 28.4% ended in a jury awarding damages for malpractice. The mean settlement was $577,437 and mean jury award was $352,341. The most litigated procedures were blepharoplasties and rhinoplasties. Alleged lack of informed consent was noted in 38.6% of cases; other common complaints were excessive scarring/disfigurement, functional considerations, and postoperative pain. This analysis characterized factors in determining legal responsibility in facial plastic surgery cases. Several factors were identified as potential targets for minimizing liability. Informed consent was the most reported entity in these malpractice suits. This finding emphasizes the importance of open communication between physicians and their patients regarding expectations as well as documentation of specific risks, benefits, and alternatives. © 2013 The American Laryngological, Rhinological, and Otological Society, Inc.
Real-time MRI guidance of cardiac interventions.
Campbell-Washburn, Adrienne E; Tavallaei, Mohammad A; Pop, Mihaela; Grant, Elena K; Chubb, Henry; Rhode, Kawal; Wright, Graham A
2017-10-01
Cardiac magnetic resonance imaging (MRI) is appealing to guide complex cardiac procedures because it is ionizing radiation-free and offers flexible soft-tissue contrast. Interventional cardiac MR promises to improve existing procedures and enable new ones for complex arrhythmias, as well as congenital and structural heart disease. Guiding invasive procedures demands faster image acquisition, reconstruction and analysis, as well as intuitive intraprocedural display of imaging data. Standard cardiac MR techniques such as 3D anatomical imaging, cardiac function and flow, parameter mapping, and late-gadolinium enhancement can be used to gather valuable clinical data at various procedural stages. Rapid intraprocedural image analysis can extract and highlight critical information about interventional targets and outcomes. In some cases, real-time interactive imaging is used to provide a continuous stream of images displayed to interventionalists for dynamic device navigation. Alternatively, devices are navigated relative to a roadmap of major cardiac structures generated through fast segmentation and registration. Interventional devices can be visualized and tracked throughout a procedure with specialized imaging methods. In a clinical setting, advanced imaging must be integrated with other clinical tools and patient data. In order to perform these complex procedures, interventional cardiac MR relies on customized equipment, such as interactive imaging environments, in-room image display, audio communication, hemodynamic monitoring and recording systems, and electroanatomical mapping and ablation systems. Operating in this sophisticated environment requires coordination and planning. This review provides an overview of the imaging technology used in MRI-guided cardiac interventions. Specifically, this review outlines clinical targets, standard image acquisition and analysis tools, and the integration of these tools into clinical workflow. Level of Evidence: 1. Technical Efficacy: Stage 5. J. Magn. Reson. Imaging 2017;46:935-950. © 2017 International Society for Magnetic Resonance in Medicine.
Initial Data Analysis Results for ATD-2 ISAS HITL Simulation
NASA Technical Reports Server (NTRS)
Lee, Hanbong
2017-01-01
To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. With respect to the different runway configurations and metering values in the tactical surface scheduler, various airport performance metrics were analyzed and compared. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting the airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.
A Biomechanical Analysis Of Craniofacial Form And Function
NASA Astrophysics Data System (ADS)
Oyen, Ordean J.
1989-04-01
In vivo measures of bite force and bone strain obtained in growing African green monkeys (Cercopithecus aethiops) are being used to study skull biology and geometry. Strain values and distributional patterns seen in association with forceful jaw elevation are inconsistent with conventional explanations linking upper facial morphology with masticatory function and/or using beam models of craniofacial architecture. These results mandate careful use of notions about skeletal geometry based on static analyses that have not been experimentally verified using in vivo procedures.
NASA Astrophysics Data System (ADS)
Simoni, Daniele; Lengani, Davide; Guida, Roberto
2016-09-01
The transition process of the boundary layer growing over a flat plate, with a pressure gradient simulating the suction side of a low-pressure turbine blade and an elevated free-stream turbulence intensity level, has been analyzed by means of PIV and hot-wire measurements. A detailed view of the instantaneous flow field in the wall-normal plane highlights the physics of the complex process leading to the formation of large-scale coherent structures during breakdown of the ordered motion of the flow, thus generating randomized oscillations (i.e., turbulent spots). This analysis provides the basis for the development of a new procedure aimed at determining the intermittency function describing (statistically) the transition process. To this end, a wavelet-based method has been employed for the identification of the large-scale structures created during the transition process. Subsequently, a probability density function of these events has been defined so that an intermittency function can be deduced. The latter corresponds closely to the intermittency function of the transitional flow computed through a classic procedure based on hot-wire data. The agreement between the two procedures in the intermittency shape and spot production rate proves the capability of the method to provide a statistical representation of the transition process. The main advantages of the proposed procedure are that it is applicable to PIV data; it does not require a threshold level to discriminate the first- and/or second-order time-derivatives of hot-wire time traces (which makes the method independent of the operator); and it provides clear evidence of the connection between the flow physics and the statistical representation of transition based on the theory of turbulent spot propagation.
A critical analysis of the surgical outcomes for the treatment of Peyronie’s disease
Mandava, Sree H.; Trost, Landon W.; Hellstrom, Wayne J.G.
2013-01-01
Peyronie’s disease (PD) is a relatively common condition, which can impair sexual function and result in emotional and psychological distress. Despite an abundance of minimally invasive treatments, few have confirmed efficacy for improving penile curvature and function. Surgical therapies include many different techniques and are reserved for patients with stable disease of ⩾12 months’ duration. We searched PubMed for all articles from 1990 to the present relating to the surgical management of PD. Preference was given to recent articles, larger series, and those comparing various techniques and/or materials. Outcomes were subsequently analysed and organised by surgical technique and the graft material used. Available surgical techniques include plication/corporoplasty procedures, incision and grafting (I&G), and placing a penile prosthesis with or without adjunctive procedures. Although several surgical algorithms have been reported, in general, plication/corporoplasty procedures are reserved for patients with adequate erectile function, simple curvatures of <60°, and with no deformities (hour-glass, hinge). I&G are reserved for complex curvatures of >60° and those with deformities. Penile prostheses are indicated for combined erectile dysfunction and PD. Overall outcomes show high rates of improved curvature and patient satisfaction, with mildly decreased erectile function with both plication and the I&G procedure (I&G >plication) and decreases in penile length (plication >I&G). Surgical management of PD remains an excellent treatment option for patients with penile curvature precluding or impairing sexual activity. Surgical algorithms are available to assist treating clinicians in appropriately stratifying surgical candidates. Additional research is needed to identify optimal surgical techniques and materials based on patient and disease characteristics. PMID:26558094
Response of discrete linear systems to forcing functions with inequality constraints.
NASA Technical Reports Server (NTRS)
Michalopoulos, C. D.; Riley, T. A.
1972-01-01
An analysis is made of the maximum response of discrete, linear mechanical systems to arbitrary forcing functions which lie within specified bounds. Primary attention is focused on the complete determination of the forcing function which will engender maximum displacement to any particular mass element of a multi-degree-of-freedom system. In general, the desired forcing function is found to be a bang-bang type function, i.e., a function which switches from the maximum to the minimum bound and vice-versa at certain instants of time. Examples of two-degree-of-freedom systems, with and without damping, are presented in detail. Conclusions are drawn concerning the effect of damping on the switching times and the general procedure for finding these times is discussed.
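The bang-bang result can be made concrete with a single-degree-of-freedom sketch. By the convolution integral x(T) = ∫₀ᵀ h(T−τ) f(τ) dτ, a force bounded by ±F maximizes x(T) when f(τ) = F·sign(h(T−τ)), so the switching times fall at the zeros of the impulse response. The oscillator parameters below are illustrative, and the symmetric bound is an assumption (the paper allows general bounds).

```python
import numpy as np

# Damped single-degree-of-freedom oscillator: x'' + 2*zeta*wn*x' + wn^2 * x = f(t)
wn, zeta, F = 2.0 * np.pi, 0.05, 1.0       # illustrative parameters
wd = wn * np.sqrt(1.0 - zeta ** 2)

def impulse_response(t):
    # Unit-impulse displacement response h(t) of the oscillator.
    return np.exp(-zeta * wn * t) * np.sin(wd * t) / wd

T = 5.0                                    # instant at which x is maximized
t = np.linspace(0.0, T, 5001)

# x(T) = integral_0^T h(T - tau) f(tau) dtau, so with |f| <= F the maximizing
# force is bang-bang: f(tau) = F * sign(h(T - tau)), switching at the zeros
# of the impulse response (about every half damped period).
f_opt = F * np.sign(impulse_response(T - t))
x_max = np.trapz(impulse_response(T - t) * f_opt, t)
print(f"maximum displacement at T = {T}: {x_max:.4f}")
```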
Numerical and experimental study of a hydrodynamic cavitation tube
NASA Astrophysics Data System (ADS)
Hu, H.; Finch, J. A.; Zhou, Z.; Xu, Z.
1998-08-01
A numerical analysis of hydrodynamics in a cavitation tube used for activating fine particle flotation is described. Using numerical procedures developed for solving the turbulent k-ɛ model with boundary fitted coordinates, the stream function, vorticity, velocity, and pressure distributions in a cavitation tube were calculated. The calculated pressure distribution was found to be in excellent agreement with experimental results. The requirement of a pressure drop below approximately 10 m water for cavitation to occur was observed experimentally and confirmed by the model. The use of the numerical procedures for cavitation tube design is discussed briefly.
NASA Astrophysics Data System (ADS)
Peng, Guoyi; Cao, Shuliang; Ishizuka, Masaru; Hayama, Shinji
2002-06-01
This paper is concerned with the design optimization of axial flow hydraulic turbine runner blade geometry. In order to obtain a better design plan with good performance, a new comprehensive performance optimization procedure has been presented by combining a multi-variable multi-objective constrained optimization model with a Q3D inverse computation and a performance prediction procedure. With careful analysis of the inverse design of the axial hydraulic turbine runner, the total hydraulic loss and the cavitation coefficient are taken as optimization objectives and a comprehensive objective function is defined using weight factors. Parameters of a newly proposed blade bound circulation distribution function and parameters describing the positions of the blade leading and trailing edges in the meridional flow passage are taken as optimization variables. The optimization procedure has been applied to the design optimization of a Kaplan runner with a specific speed of 440 kW. Numerical results show that the performance of the designed runner is successfully improved through optimization computation. The optimization model is validated and shows good convergence. With the multi-objective optimization model, it is possible to control the performance of the designed runner by adjusting the values of the weight factors defining the comprehensive objective function.
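The weighted-sum construction of the comprehensive objective can be sketched generically as below. The surrogate loss and cavitation models, weight values, variable bounds, and optimizer are placeholders; the paper instead couples a Q3D inverse computation with a performance prediction procedure.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder surrogates -- the paper evaluates these with a Q3D inverse
# computation and a performance prediction procedure instead.
def hydraulic_loss(v):
    return (v[0] - 1.0) ** 2 + 0.5 * (v[1] - 0.3) ** 2

def cavitation_coeff(v):
    return 0.2 + (v[0] - 0.8) ** 2 + (v[1] - 0.5) ** 2

w_loss, w_cav = 0.7, 0.3  # weight factors of the comprehensive objective

def comprehensive_objective(v):
    # Weighted sum of the two optimization objectives.
    return w_loss * hydraulic_loss(v) + w_cav * cavitation_coeff(v)

# v stands in for the design variables (circulation-distribution parameters,
# leading/trailing-edge positions), here reduced to two bounded scalars.
res = minimize(comprehensive_objective, x0=[0.5, 0.5],
               bounds=[(0.0, 2.0), (0.0, 1.0)])
print("optimal variables:", res.x, "objective:", res.fun)
```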
Steerable Principal Components for Space-Frequency Localized Images
Landa, Boris; Shkolnisky, Yoel
2017-01-01
As modern scientific image datasets typically consist of a large number of images of high resolution, devising methods for their accurate and efficient processing is a central research task. In this paper, we consider the problem of obtaining the steerable principal components of a dataset, a procedure termed “steerable PCA” (steerable principal component analysis). The output of the procedure is the set of orthonormal basis functions which best approximate the images in the dataset and all of their planar rotations. To derive such basis functions, we first expand the images in an appropriate basis, for which the steerable PCA reduces to the eigen-decomposition of a block-diagonal matrix. If we assume that the images are well localized in space and frequency, then such an appropriate basis is the prolate spheroidal wave functions (PSWFs). We derive a fast method for computing the PSWFs expansion coefficients from the images' equally spaced samples, via a specialized quadrature integration scheme, and show that the number of required quadrature nodes is similar to the number of pixels in each image. We then establish that our PSWF-based steerable PCA is both faster and more accurate than existing methods, and more importantly, provides us with rigorous error bounds on the entire procedure. PMID:29081879
Behavioral economic analysis of drug preference using multiple choice procedure data.
Greenwald, Mark K
2008-01-11
The multiple choice procedure (MCP) has been used to evaluate preference for psychoactive drugs, relative to money amounts (price), in human subjects. The present re-analysis shows that MCP data are compatible with behavioral economic analysis of drug choices. Demand curves were constructed from studies with intravenous fentanyl, intramuscular hydromorphone and oral methadone in opioid-dependent individuals; oral d-amphetamine, oral MDMA alone and during fluoxetine treatment, and smoked marijuana alone or following naltrexone pretreatment in recreational drug users. For each participant and dose, the MCP crossover point was converted into unit price (UP) by dividing the money value ($) by the drug dose (mg/70 kg). At the crossover value, the dose ceases to function as a reinforcer, so "0" was entered for this and higher UPs to reflect lack of drug choice. At lower UPs, the dose functions as a reinforcer and "1" was entered to reflect drug choice. Data for UP vs. average percent choice were plotted in log-log space to generate demand functions. Rank order of opioid inelasticity (slope of non-linear regression) was: fentanyl > hydromorphone (continuing heroin users) > methadone > hydromorphone (heroin abstainers). Rank order of psychostimulant inelasticity was d-amphetamine > MDMA > MDMA + fluoxetine. Smoked marijuana was more inelastic with high-dose naltrexone. These findings show this method translates individuals' drug preferences into estimates of population demand, which has the potential to yield insights into pharmacotherapy efficacy, abuse liability assessment, and individual differences in susceptibility to drug abuse.
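A minimal sketch of the unit-price conversion and log-log demand fit described above, with hypothetical crossover values; the dose, the price ladder, and the straight-line fit in log-log space are illustrative assumptions rather than the study's exact regression.

```python
import numpy as np

dose = 20.0                                  # hypothetical dose, mg/70 kg
prices = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # money values ($)
# Hypothetical crossover values ($) for this dose across five subjects.
crossover = np.array([4.0, 8.0, 2.0, 16.0, 8.0])

unit_prices = prices / dose                  # UP = $ per mg/70 kg
# "1" while the dose still functions as a reinforcer (price below the
# subject's crossover), "0" at and above the crossover.
choice = (prices[None, :] < crossover[:, None]).astype(float)
pct_choice = choice.mean(axis=0) * 100.0     # average percent choice per UP

# Demand function: straight-line fit in log-log space over the points with
# nonzero choice; a steeper negative slope indicates more elastic demand.
mask = pct_choice > 0
slope, intercept = np.polyfit(np.log10(unit_prices[mask]),
                              np.log10(pct_choice[mask]), 1)
print("demand slope (elasticity proxy):", round(slope, 3))
```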
NASA Technical Reports Server (NTRS)
Knapp, Charles F.; Evans, J. M.; Patwardhan, A.; Levenhagen, D.; Wang, M.; Charles, John B.
1991-01-01
A major focus of our research program is to develop noninvasive procedures for determining changes in cardiovascular function associated with the null gravity environment. We define changes in cardiovascular function to be (1) the result of the regulatory system operating at values different from 'normal' but with an overall control system basically unchanged by the null gravity exposure, or (2) the result of operating with a control system that has significantly different regulatory characteristics after an exposure. To this end, we have used a model of weightlessness that consisted of exposing humans to 2 hrs. in the launch position, followed by 20 hrs. of 6 deg head down bedrest. Our principal objective was to use this model to measure cardiovascular responses to the 6 deg head down bedrest protocol and to develop the most sensitive 'systems identification' procedure for indicating change. A second objective, related to future experiments, is to use the procedure in combination with experiments designed to determine the degree to which a regulatory pathway has been altered and to determine the mechanisms responsible for the changes.
Thoracoscopic laser pneumoplasty in the treatment of diffuse bullous emphysema.
Wakabayashi, A
1995-10-01
Thoracoscopic laser pneumoplasty in the treatment of diffuse bullous emphysema by means of a contact neodymium:yttrium-aluminum garnet laser was evaluated by a retrospective analysis of the first 500 consecutive procedures in 443 patients. The indication for thoracoscopic laser pneumoplasty was intractable dyspnea. Advanced age (mean age, 67 years), high oxygen dependency (70%), steroid use (46%), and markedly diminished physical capacity (2% bedridden and 27% wheelchair-bound) were noted. Thoracoscopic laser pneumoplasty was carried out under general anesthesia and one-lung ventilation. Type 3 bullae (381 procedures) were contracted by contact neodymium:yttrium-aluminum garnet laser and type 4 bullae (199 procedures) excised. The operative mortality rate was 4.8%. Subjective improvement was reported by 87% of the patients. Follow-up functional evaluation was available in 229 patients, which showed highly significant improvement. A comparison of preoperative and postoperative functional tests between type 3 and 4 bullae patients showed no significant difference, except that the latter showed a greater decrease in airway resistance, residual volume, and total lung capacity. Thoracoscopic laser pneumoplasty is an effective treatment for both type 3 and 4 bullous emphysema with an acceptable risk.
Assessing the extent, stability, purity and properties of silanised detonation nanodiamond
NASA Astrophysics Data System (ADS)
Duffy, Emer; Mitev, Dimitar P.; Thickett, Stuart C.; Townsend, Ashley T.; Paull, Brett; Nesterenko, Pavel N.
2015-12-01
The functionalisation of nanodiamond is a key step in furthering its application in areas such as surface coatings, drug delivery, bio imaging and other biomedical avenues. Accordingly, analytical methods for the detailed characterisation of functionalised nano-material are of great importance. This work presents an alternative approach for the elemental analysis of zero-dimensional nanocarbons, specifically detonation nanodiamond (DND) following purification and functionalisation procedures. There is a particular emphasis on the presence of silicon, both for the purified DND and after its functionalisation with silanes. Five different silylation procedures for purified DND were explored and assessed quantitatively using inductively coupled plasma-mass spectrometry (ICP-MS) for analysis of dilute suspensions. A maximum Si loading of 29,300 μg g⁻¹ on the DND was achieved through a combination of silylating reagents. The presence of 28 other elements in the DND materials was also quantified by ICP-MS. The characterisation of Si-bond formation was supported by FTIR and XPS evaluation of relevant functional groups. The thermal stability of the silylated DND was examined by thermogravimetric analysis. Improved particle size distribution and dispersion stability resulted from the silylation procedure, as confirmed by dynamic light scattering and capillary zone electrophoresis.
González-Álvarez, Mariana; Noguerol-Pato, Raquel; González-Barreiro, Carmen; Cancho-Grande, Beatriz; Simal-Gándara, Jesús
2014-02-15
The effect of winemaking procedures on the sensory modification of sweet wines was investigated. Garnacha Tintorera-based sweet wines were obtained by two different processes: by using raisins for vinification to obtain a naturally sweet wine, and by using freshly harvested grapes with fermentation stopped by the addition of alcohol. Eight international sweet wines were also subjected to sensory analysis for comparative description purposes. Wines were described with a sensory profile by 12 trained panellists on 70 sensory attributes by employing the frequency of citation method. Analysis of variance of the descriptive data confirmed the existence of subtle sensory differences among Garnacha Tintorera-based sweet wines depending on the procedure used for their production. Cluster analysis emphasised discriminating attributes between the Garnacha Tintorera-based and the commercial groups of sweet wines, both for those obtained by raisining and by fortification. Several kinds of discriminant functions were used to separate groups of sweet wines (obtained by botrytisation, raisining and fortification) to show the key descriptors that contribute to their separation and define the sensory perception of each type of wine. Copyright © 2013 Elsevier Ltd. All rights reserved.
Structural tailoring of advanced turboprops
NASA Technical Reports Server (NTRS)
Brown, K. W.; Hopkins, Dale A.
1988-01-01
The Structural Tailoring of Advanced Turboprops (STAT) computer program was developed to perform numerical optimization on highly swept propfan blades. The optimization procedure seeks to minimize an objective function defined either as (1) the direct operating cost of a full-scale blade or (2) the aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analysis system includes an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution forced response life prediction capability. STAT includes all relevant propfan design constraints.
Pyridylamination as a means of analyzing complex sugar chains
Hase, Sumihiro
2010-01-01
Herein, I describe pyridylamination for versatile analysis of sugar chains. The reducing ends of the sugar chains are tagged with 2-aminopyridine and the resultant chemically stable fluorescent derivatives are used for structural/functional analysis. Pyridylamination is an effective “operating system” for increasing sensitivity and simplifying the analytical procedures including mass spectrometry and NMR. Excellent separation of isomers is achieved by reversed-phase HPLC. However, separation is further improved by two-dimensional HPLC, which involves a combination of reversed-phase HPLC and size-fractionation HPLC. Moreover, a two-dimensional HPLC map is also useful for structural analysis. I describe a simple procedure for preparing homogeneous pyridylamino sugar chains that is less laborious than existing techniques and can be used for functional analysis (e.g., sugar-protein interaction). This novel approach was applied and some of the results are described: i) a glucosyl-serine type sugar chain found in blood coagulation factors; ii) discovery of endo-β-mannosidase (EC 3.2.1.152) and a new type plant α1,2-l-fucosidase; and iii) novel substrate specificity of a cytosolic α-mannosidase. Moreover, using homogeneous sugar chains of a size similar to in vivo substrates we were able to analyze interactions between sugar chains and proteins such as enzymes and lectins in detail. Interestingly, our studies reveal that some enzymes recognize a wider region of the substrate than anticipated. PMID:20431262
NASA Technical Reports Server (NTRS)
Trosset, Michael W.
1999-01-01
Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
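A minimal numpy analogue of the idea: treat the objective as one realization of a stationary Gaussian process and sample it on a grid. The squared-exponential covariance and its hyperparameters are assumptions for illustration; the report's own implementation is in S-PLUS.

```python
import numpy as np

def sample_gp_objective(x, length_scale=0.2, variance=1.0, seed=0):
    """Draw one realization of a stationary Gaussian process on the grid x,
    to serve as a pseudorandom nonlinear objective function."""
    rng = np.random.default_rng(seed)
    d = x[:, None] - x[None, :]
    # Squared-exponential covariance: one common stationary choice.
    K = variance * np.exp(-0.5 * (d / length_scale) ** 2)
    # Small jitter keeps the matrix numerically positive definite.
    L = np.linalg.cholesky(K + 1e-6 * np.eye(len(x)))
    return L @ rng.standard_normal(len(x))

x = np.linspace(0.0, 1.0, 200)
f = sample_gp_objective(x, seed=42)          # one test objective function
print("minimizer of this realization:", x[np.argmin(f)], f.min())
```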
Forward and backward uncertainty propagation: an oxidation ditch modelling example.
Abusam, A; Keesman, K J; van Straten, G
2003-01-01
In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. However, in backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by a working example for an oxidation ditch wastewater treatment plant. Results obtained have demonstrated that essential information can be achieved by carrying out backward uncertainty propagation analysis.
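Forward propagation is easy to sketch with Monte Carlo sampling: draw from the assumed parameter subspace and push the samples through the model to obtain the output distribution. The toy model and parameter distributions below are illustrative stand-ins, not the oxidation-ditch model of the note.

```python
import numpy as np

def model_output(k_la, mu_max):
    # Placeholder response standing in for the oxidation-ditch model
    # (e.g., predicted effluent concentration).
    return 12.0 / (k_la * mu_max + 0.1)

rng = np.random.default_rng(3)
n = 10_000
# Forward propagation: sample the given (or assumed) parameter subspace ...
k_la = rng.uniform(2.0, 4.0, n)              # oxygen transfer coefficient
mu_max = rng.normal(0.5, 0.05, n)            # maximum growth rate
# ... and push the samples through the model to get the output distribution.
y = model_output(k_la, mu_max)
lo, hi = np.percentile(y, [2.5, 97.5])
print(f"output mean {y.mean():.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```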
Advancing Autonomous Operations for Deep Space Vehicles
NASA Technical Reports Server (NTRS)
Haddock, Angie T.; Stetson, Howard K.
2014-01-01
Starting in January 2012, the Advanced Exploration Systems (AES) Autonomous Mission Operations (AMO) Project began to investigate the ability to create and execute "single button" crew-initiated autonomous activities [1]. NASA Marshall Space Flight Center (MSFC) designed and built a fluid transfer hardware test-bed to use as a sub-system target for the investigation of intelligent procedures that would command and control a fluid transfer test-bed, perform self-monitoring during fluid transfers, detect anomalies and faults, isolate the fault, and recover the function of the procedure being executed, all without operator intervention. In addition to the development of intelligent procedures, the team is also exploring various methods for autonomous activity execution, where a planned timeline of activities is executed autonomously, and has begun an initial analysis of crew procedure development. This paper will detail the development of intelligent procedures for the NASA MSFC Autonomous Fluid Transfer System (AFTS) as well as the autonomous plan execution capabilities being investigated. Manned deep space missions, with extreme communication delays with Earth-based assets, present significant challenges for what the on-board procedure content will encompass as well as the planned execution of the procedures.
On contact modelling in isogeometric analysis
NASA Astrophysics Data System (ADS)
Cardoso, R. P. R.; Adetoro, O. B.
2017-11-01
IsoGeometric Analysis (IGA) has proved to be a reliable numerical tool for the simulation of structural behaviour and fluid mechanics. The main reasons for this popularity are essentially: (i) the possibility of using higher order polynomials for the basis functions; (ii) the high convergence rates that can be achieved; (iii) the possibility of operating directly on CAD geometry without the need to resort to a mesh of elements. The major drawback of IGA is the non-interpolatory character of the basis functions, which adds difficulty in handling essential boundary conditions and makes contact analysis particularly challenging. In this work, IGA is expanded to include frictionless contact procedures for sheet metal forming analyses. Non-Uniform Rational B-Splines (NURBS) are used for the modelling of rigid tools as well as for the modelling of the deformable blank sheet. The contact methods developed are based on a two-step contact search scheme: in the first step, a global search algorithm allocates contact knots to potential contact faces; in the second (local) step, point inversion techniques are used to calculate the contact penetration gap. For completeness, elastoplastic procedures are also included for a proper description of the entire IGA of sheet metal forming processes.
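Point inversion, the kernel of the local search step, can be illustrated on a simple parametric curve: Newton iteration on g(u) = C'(u)·(C(u) − p) = 0 returns the closest-point parameter, from which the penetration gap follows. The quadratic Bézier curve and the slave point below are assumptions standing in for the NURBS tool surface and a contact knot.

```python
import numpy as np

# Quadratic Bezier curve standing in for a section of a NURBS tool surface.
P0, P1, P2 = np.array([0.0, 0.0]), np.array([1.0, 2.0]), np.array([2.0, 0.0])

def C(u):   return (1 - u) ** 2 * P0 + 2 * u * (1 - u) * P1 + u ** 2 * P2
def dC(u):  return 2 * (1 - u) * (P1 - P0) + 2 * u * (P2 - P1)
def d2C(u): return 2 * (P2 - 2 * P1 + P0)

def point_inversion(p, u=0.5, tol=1e-10, max_iter=50):
    """Newton iteration for the closest-point parameter on the curve:
    solve g(u) = C'(u) . (C(u) - p) = 0."""
    for _ in range(max_iter):
        r = C(u) - p
        g = dC(u) @ r
        dg = d2C(u) @ r + dC(u) @ dC(u)
        step = g / dg
        u = float(np.clip(u - step, 0.0, 1.0))
        if abs(step) < tol:
            break
    return u

slave = np.array([1.0, 1.5])                # a contact knot on the blank sheet
u_star = point_inversion(slave)
gap = np.linalg.norm(C(u_star) - slave)     # penetration gap magnitude
print("closest parameter:", u_star, "gap:", gap)
```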
Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C
2014-12-01
It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched by the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of the methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.
Esque, Jérémy; Urbain, Aurélie; Etchebest, Catherine; de Brevern, Alexandre G
2015-11-01
Transmembrane proteins (TMPs) are major drug targets, but knowledge of their precise topology remains highly limited compared with globular proteins. In spite of the difficulties in obtaining their structures, an important effort has been made in recent years to increase their number from both an experimental and a computational point of view. In view of this emerging challenge, the development of computational methods to extract knowledge from these data is crucial for a better understanding of their functions and for improving the quality of structural models. Here, we revisit an efficient unsupervised learning procedure, called Hybrid Protein Model (HPM), which is applied to the analysis of transmembrane proteins belonging to the all-α structural class. HPM is an original classification procedure that efficiently combines sequence and structure learning. The procedure was initially applied to the analysis of globular proteins. In the present case, HPM classifies a set of overlapping protein fragments, extracted from a non-redundant databank of TMP 3D structures. After fine-tuning of the learning parameters, the optimal classification results in 65 clusters. They best represent the relationships between sequence and local structure properties of TMPs. Interestingly, HPM distinguishes among the resulting clusters two helical regions with distinct hydrophobic patterns. This underlines the complexity of the topology of these proteins. The HPM classification reveals unusual relationships between amino acids in TMP fragments, which can be useful for elaborating new amino acid substitution matrices. Finally, two challenging applications are described: the first aims at annotating protein functions (channel or not), the second intends to assess the quality of the structures (X-ray or models) via a new scoring function deduced from the HPM classification.
Song, Ji Youn; Kang, Hyun A; Kim, Mi-Yeon; Park, Young Min; Kim, Hyung Ok
2004-03-01
Superficial chemical peeling and microdermabrasion have become increasingly popular methods for producing facial rejuvenation. However, there are few studies reporting the skin barrier function changes after these procedures. To evaluate objectively the degree of damage visually and the time needed for the skin barrier function to recover after glycolic acid peeling and aluminum oxide crystal microdermabrasion using noninvasive bioengineering methods. Superficial chemical peeling using 30%, 50%, and 70% glycolic acid and aluminum oxide crystal microdermabrasion were used on the volar forearm of 13 healthy women. The skin response was measured by a visual observation and using an evaporimeter, corneometer, and colorimeter before and after peeling at set time intervals. Both glycolic acid peeling and aluminum oxide crystal microdermabrasion induced significant damage to the skin barrier function immediately after the procedure, and the degree of damage was less severe after the aluminum oxide crystal microdermabrasion compared with glycolic acid peeling. The damaged skin barrier function had recovered within 24 hours after both procedures. The degree of erythema induction was less severe after the aluminum oxide crystal microdermabrasion compared with the glycolic acid peeling procedure. The degree of erythema induced after the glycolic acid peeling procedure was not proportional to the peeling solution concentration used. The erythema subsided within 1 day after the aluminum oxide crystal microdermabrasion procedure and within 4 days after the glycolic acid peeling procedure. These results suggest that the skin barrier function is damaged after the glycolic acid peeling and aluminum oxide crystal microdermabrasion procedure but recovers within 1 to 4 days. Therefore, repeating the superficial peeling procedure at 2-week intervals will allow sufficient time for the damaged skin to recover its barrier function.
Mathematical modelling of the growth of human fetus anatomical structures.
Dudek, Krzysztof; Kędzia, Wojciech; Kędzia, Emilia; Kędzia, Alicja; Derkowski, Wojciech
2017-09-01
The goal of this study was to present a procedure that would enable mathematical analysis of the increase of linear sizes of human anatomical structures, estimate mathematical model parameters and evaluate their adequacy. Section material consisted of 67 foetuses (rectus abdominis muscle) and 75 foetuses (biceps femoris muscle). The following methods were incorporated into the study: preparation and anthropologic methods, digital image acquisition, Image J computer system measurements and statistical analysis methods. We used an anthropologic method based on age determination with the use of crown-rump length (CRL, V-TUB) by Scammon and Calkins. The choice of mathematical function should be based on the real course of the curve describing the growth of an anatomical structure's linear size y over subsequent weeks t of pregnancy. Size changes can be described with a segmental-linear model or a one-function model with accuracy adequate for clinical purposes. The interdependence of size and age is described by many functions. However, the following functions are most often considered: linear, polynomial, spline, logarithmic, power, exponential, power-exponential, log-logistic I and II, Gompertz I and II and von Bertalanffy functions. With the use of the procedures described above, mathematical model parameters were assessed for V-PL (the total length of body) and CRL body length increases, rectus abdominis total length h, its segments hI, hII, hIII, hIV, as well as biceps femoris length and width of the long head (LHL and LHW) and of the short head (SHL and SHW). The best adjustments to measurement results were observed for the exponential and Gompertz models.
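A minimal sketch of fitting one of the candidate models, the Gompertz function, with scipy; the measurement values and starting parameters are hypothetical, and the study's actual parameter estimation and adequacy evaluation are richer than this.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Gompertz growth curve: size = a * exp(-b * exp(-c * t))."""
    return a * np.exp(-b * np.exp(-c * t))

# Hypothetical data: gestational age (weeks) vs muscle length (mm).
t = np.array([15.0, 17.0, 19.0, 21.0, 23.0, 25.0, 27.0, 29.0])
y = np.array([11.0, 16.0, 22.0, 28.0, 35.0, 41.0, 47.0, 52.0])

# Reasonable starting values matter for convergence of the fit.
popt, pcov = curve_fit(gompertz, t, y, p0=[80.0, 5.0, 0.05])
residuals = y - gompertz(t, *popt)
r2 = 1.0 - residuals.var() / y.var()
print("a, b, c =", np.round(popt, 3), "; R^2 =", round(r2, 4))
```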
ERIC Educational Resources Information Center
Geri, George A.; Hubbard, David C.
Two adaptive psychophysical procedures (tracking and "yes-no" staircase) for obtaining human visual contrast sensitivity functions (CSF) were evaluated. The procedures were chosen based on their proven validity and the desire to evaluate the practical effects of stimulus transients, since tracking procedures traditionally employ gradual…
Individual differences in long-range time representation.
Agostino, Camila S; Caetano, Marcelo S; Balci, Fuat; Claessens, Peter M E; Zana, Yossi
2017-04-01
On the basis of experimental data, long-range time representation has been proposed to follow a highly compressed power function, which has been hypothesized to explain the time inconsistency found in financial discount rate preferences. The aim of this study was to evaluate how well linear and power function models explain empirical data from individual participants tested in different procedural settings. The line paradigm was used in five different procedural variations with 35 adult participants. Data aggregated over the participants showed that fitted linear functions explained more than 98% of the variance in all procedures. A linear regression fit also outperformed a power model fit for the aggregated data. An individual-participant-based analysis showed better fits of a linear model to the data of 14 participants; better fits of a power function with an exponent β > 1 to the data of 12 participants; and better fits of a power function with β < 1 to the data of the remaining nine participants. Of the 35 volunteers, the null hypothesis β = 1 was rejected for 20. The dispersion of the individual β values was approximated well by a normal distribution. These results suggest that, on average, humans perceive long-range time intervals not in a highly compressed, biased manner, but rather in a linear pattern. However, individuals differ considerably in their subjective time scales. This contribution sheds new light on the average and individual psychophysical functions of long-range time representation, and suggests that any attribution of deviation from exponential discount rates in intertemporal choice to the compressed nature of subjective time must entail the characterization of subjective time on an individual-participant basis.
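A sketch of the individual-participant model comparison: fit a linear and a power function to one participant's data and compare the explained variance, reading the time-compression exponent β off a log-log regression. The toy responses below are hypothetical, not the study's data.

```python
import numpy as np

def compare_time_models(t, y):
    """Fit linear (y = a*t + b) and power (y = k * t**beta) models to one
    participant's data; return beta and each model's explained variance."""
    a, b = np.polyfit(t, y, 1)
    lin_r2 = 1.0 - np.var(y - (a * t + b)) / np.var(y)
    beta, log_k = np.polyfit(np.log(t), np.log(y), 1)  # log-log regression
    pow_r2 = 1.0 - np.var(y - np.exp(log_k) * t ** beta) / np.var(y)
    return beta, lin_r2, pow_r2

# Hypothetical participant: probed intervals (years) vs line-length responses.
t = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
y = np.array([0.6, 1.1, 1.9, 4.6, 8.7, 17.0, 31.0])
beta, lin_r2, pow_r2 = compare_time_models(t, y)
print(f"beta = {beta:.2f}; R^2 linear = {lin_r2:.4f}, power = {pow_r2:.4f}")
```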
Adding results to a meta-analysis: Theory and example
NASA Astrophysics Data System (ADS)
Willson, Victor L.
Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research in science education attitude correlation with achievement has been published after a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
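Under a normal-normal model, the Bayesian update of a meta-analytic estimate with new studies reduces to precision weighting, as in this sketch. The effect sizes and standard errors are hypothetical, and the abstract's further questions (study weighting schemes, update frequency) are not addressed here.

```python
import numpy as np

def bayes_update(prior_mean, prior_se, new_effects, new_ses):
    """Precision-weighted (normal-normal) update of a meta-analytic
    effect estimate with newly published study results."""
    precision = 1.0 / prior_se ** 2
    mean = prior_mean
    for y, se in zip(new_effects, new_ses):
        w = 1.0 / se ** 2
        mean = (precision * mean + w * y) / (precision + w)
        precision += w
    return mean, np.sqrt(1.0 / precision)

# Hypothetical numbers: a prior meta-analytic attitude-achievement
# correlation (Fisher-z scale) updated with two later studies.
post_mean, post_se = bayes_update(0.18, 0.03, [0.25, 0.12], [0.06, 0.05])
print(f"updated estimate: {post_mean:.3f} (SE {post_se:.3f})")
```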
Assessment and Treatment of Foot-Shoe Fetish Displayed by a Man with Autism
ERIC Educational Resources Information Center
Dozier, Claudia L.; Iwata, Brian A.; Worsdell, April S.
2011-01-01
Results of a functional analysis indicated that a man diagnosed with autism engaged in bizarre sexual behavior in the presence of women wearing sandals. Several treatments proved to be ineffective or impractical. By contrast, a response-interruption/time-out procedure quickly eliminated the problem behavior in multiple settings. (Contains 1…
Application of Shuttle EVA Systems to Payloads. Volume 2: Payload EVA Task Completion Plans
NASA Technical Reports Server (NTRS)
1976-01-01
Candidate payload tasks for EVA application were identified and selected, based on an analysis of four representative space shuttle payloads, and typical EVA scenarios with supporting crew timelines and procedures were developed. The EVA preparations and post EVA operations, as well as the timelines emphasizing concurrent payload support functions, were also summarized.
Annual Forest Inventories for the North Central Region of the United States
Ronald E. McRoberts; Mark H. Hansen
1999-01-01
The primary objective in developing procedures for annual forest inventories for the north central region of the United States is to establish the capability of producing standard forest inventory and analysis estimates on an annual basis. The inventory system developed to accomplish this objective features several primary functions, including (1) an annual sample of...
NASA Technical Reports Server (NTRS)
Mattson, H. L.; Gianformaggio, A.; Anderson, N. R.
1972-01-01
The activities of the structural and mechanical activity group of the orbital operations study project are discussed. Element interfaces, alternate approaches, design concepts, operational procedures, functional requirements, design influences, and approach selection are presented. The following areas are considered: (1) mating, (2) orbital assembly, (3) separation, EOS payload deployment, and EOS payload retraction.
ERIC Educational Resources Information Center
LeBlanc, Judith M.
To gain some insight into the problem of deviant speech development in low income populations, this study investigated the environmental factors that encourage the development of normal speech. Two specific questions were examined in this study: (1) If specific vocalized environmental sounds are presented contiguously with reinforcement, will…
ERIC Educational Resources Information Center
Scaramella-Nowinski, Valerie L.
The paper presents a discussion of human mental processes as they relate to learning disabilities. Pathognomonic symptoms associated with disturbances to brain areas or functional systems are discussed, as well as treatment procedures. This brain behavior relationship is offered as a basis for a classification system that is seen to more clearly…
A Monte Carlo Study of an Iterative Wald Test Procedure for DIF Analysis
ERIC Educational Resources Information Center
Cao, Mengyang; Tay, Louis; Liu, Yaowu
2017-01-01
This study examined the performance of a proposed iterative Wald approach for detecting differential item functioning (DIF) between two groups when preknowledge of anchor items is absent. The iterative approach utilizes the Wald-2 approach to identify anchor items and then iteratively tests for DIF items with the Wald-1 approach. Monte Carlo…
Function Flow Analysis and Comparison of Doctrinal and Applied Operations Planning Process
2005-05-16
human decision-making is intuitive, that is, it proceeds by a less analytical, less formal process. This implies that there can be a...Tactical System. The Land Force wants to develop new procedures that capitalize on the strengths of digitization. Project Minerva will focus on the
ROC and Loss Function Analysis in Sequential Testing
ERIC Educational Resources Information Center
Muijtjens, Arno M. M.; Van Luijk, Scheltus J.; Van Der Vleuten, Cees P. M.
2006-01-01
Sequential testing is applied to reduce costs in SP-based tests (OSCEs). Initially, all candidates take a screening test consisting of a part of the OSCE. Candidates who fail the screen sit the complete test, whereas those who pass the screen are qualified as passing the complete test. The procedure may result in a reduction of testing…
NASA Astrophysics Data System (ADS)
Hirabayashi, Mieko; Mehta, Beejal; Vahidi, Nasim W.; Khosla, Ajit; Kassegne, Sam
2013-11-01
In this study, the investigation of surface-treatment of chemically inert graphitic carbon microelectrodes (derived from pyrolyzed photoresist polymer) for improving their attachment chemistry with DNA molecular wires and ropes as part of a bionanoelectronics platform is reported. Polymer microelectrodes were fabricated on a silicon wafer using standard negative lithography procedures with negative-tone photoresist. These microelectrode structures were then pyrolyzed and converted to a form of conductive carbon that is referred to as PP (pyrolyzed polymer) carbon throughout this paper. Functionalization of the resulting pyrolyzed structures was done using nitric acid, sulfuric acid, 4-aminobenzoic acid (4-ABA), and oxygen plasma etching, and the surface modifications were confirmed with Fourier transform infrared spectroscopy (FTIR), Raman spectroscopy, and energy-dispersive X-ray spectroscopy (EDS). Post-surface-treatment analysis of the microelectrodes with FTIR and Raman spectroscopy showed signature peaks characteristic of carboxyl functional groups, while EDS showed an increase in oxygen content for the surface-treatment procedures (except 4-ABA), indicating an increase in carboxyl functional groups. These functional groups form the basis for peptide bonds with aminated oligonucleotides that in turn could be used as molecular wires and interconnects in a bionanoelectronics platform. Post-pyrolysis analysis using EDS showed relatively higher oxygen concentrations at the edges and at locations of defects compared to other locations on these microelectrodes. In addition, electrochemical impedance measurements showed metal-like behavior of PP carbon with high conductivity (|Z| < 1 kΩ) and no detectable detrimental effect of oxygen plasma surface-treatment on electrical characteristics. In general, the characterization results, taken together, indicated that oxygen plasma surface-treatment produced more reliable, less damaging, and consistently repeatable generation of carboxyl functional groups than diazonium salt and strong acid treatments.
Minimizing Postsampling Degradation of Peptides by a Thermal Benchtop Tissue Stabilization Method
Segerström, Lova; Gustavsson, Jenny
2016-01-01
Enzymatic degradation is a major concern in peptide analysis. Postmortem metabolism in biological samples entails considerable risk of measurements misrepresentative of true in vivo concentrations. It is therefore vital to find reliable, reproducible, and easy-to-use procedures to inhibit enzymatic activity in fresh tissues before subjecting them to qualitative and quantitative analyses. The aim of this study was to test a benchtop thermal stabilization method to optimize measurement of endogenous opioids in brain tissue. Endogenous opioid peptides are generated from precursor proteins through multiple enzymatic steps that include conversion of one bioactive peptide to another, often with a different function. Ex vivo metabolism may, therefore, lead to erroneous functional interpretations. The efficacy of heat stabilization was systematically evaluated in a number of postmortem handling procedures. Dynorphin B (DYNB), Leu-enkephalin-Arg6 (LARG), and Met-enkephalin-Arg6-Phe7 (MEAP) were measured by radioimmunoassay in rat hypothalamus, striatum (STR), and cingulate cortex (CCX). Also, simplified extraction protocols for stabilized tissue were tested. Stabilization affected all peptide levels to varying degrees compared to those prepared by standard dissection and tissue handling procedures. Stabilization increased DYNB in hypothalamus, but not STR or CCX, whereas LARG generally decreased. MEAP increased in hypothalamus after all stabilization procedures, whereas for STR and CCX, the effect depended on the time point of stabilization. The efficacy of stabilization allowed samples to be left for 2 hours at room temperature (20°C) without changes in peptide levels. This study shows that conductive heat transfer is an easy-to-use and efficient procedure for preserving the molecular composition of biological samples. Region- and peptide-specific critical steps were identified, and stabilization enabled the optimization of tissue handling and opioid peptide analysis. The result is improved diagnostic and research value of the samples, with great benefits for basic research and clinical work. PMID:27007059
Resting-State Functional Magnetic Resonance Imaging for Language Preoperative Planning
Branco, Paulo; Seixas, Daniela; Deprez, Sabine; Kovacs, Silvia; Peeters, Ronald; Castro, São L.; Sunaert, Stefan
2016-01-01
Functional magnetic resonance imaging (fMRI) is a well-known non-invasive technique for the study of brain function. One of its most common clinical applications is preoperative language mapping, essential for the preservation of function in neurosurgical patients. Typically, fMRI is used to track task-related activity, but poor task performance and movement artifacts can be critical limitations in clinical settings. Recent advances in resting-state protocols open new possibilities for pre-surgical mapping of language, potentially overcoming these limitations. To test the feasibility of using resting-state fMRI instead of conventional active task-based protocols, we compared results from fifteen patients with brain lesions while performing a verb-to-noun generation task and while at rest. Task activity was measured using a general linear model analysis and independent component analysis (ICA). Resting-state networks were extracted using ICA and further classified in two ways: manually by an expert and by an automated template matching procedure. The results revealed that the automated classification procedure correctly identified language networks as compared to the expert manual classification. We found good overlap between task-related activity and resting-state language maps, particularly within the language regions of interest. Furthermore, resting-state language maps were as sensitive as task-related maps, and had higher specificity. Our findings suggest that resting-state protocols may be suitable to map language networks in a quick and clinically efficient way. PMID:26869899
Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo
2015-01-01
Previous studies aiming to disclose the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative-positive CDPs), thus excluding potentials that may reflect the activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method are evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered responses tending to adapt the transmission of sensory information to specific functional requirements as part of homeostatic adjustments. PMID:26379540
Optimal design application on the advanced aeroelastic rotor blade
NASA Technical Reports Server (NTRS)
Wei, F. S.; Jones, R.
1985-01-01
The vibration and performance optimization procedure using regression analysis was successfully applied to an advanced aeroelastic blade design study. The major advantage of this regression technique is that multiple optimizations can be performed to evaluate the effects of various objective functions and constraint functions. The databases obtained from the rotorcraft flight simulation program C81 and the Myklestad mode shape program are analytically determined as a function of each design variable. This approach has been verified for various blade radial ballast weight locations and blade planforms. The method can also be used to ascertain the effect of a particular cost function composed of several objective functions with different weighting factors for various mission requirements, without any additional effort.
Kelley, Michael E; Shillingsburg, M Alice; Castro, M Jicel; Addison, Laura R; LaRue, Robert H; Martins, Megan P
2007-01-01
Although experimental analysis methodologies have been useful for identifying the function of a wide variety of target behaviors (e.g., Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994), only recently have such procedures been applied to verbal operants (Lerman et al., 2005). In the current study, we conducted a systematic replication of the methodology developed by Lerman et al. Participants were 4 children who had been diagnosed with developmental disabilities and who engaged in limited vocal behavior. The function of vocal behavior was assessed by exposing target vocal responses to experimental analyses. Results showed that experimental analyses were generally useful for identifying the functions of vocal behavior across all participants.
BrainMap VBM: An environment for structural meta-analysis.
Vanasse, Thomas J; Fox, P Mickle; Barron, Daniel S; Robertson, Michaela; Eickhoff, Simon B; Lancaster, Jack L; Fox, Peter T
2018-05-02
The BrainMap database is a community resource that curates peer-reviewed, coordinate-based human neuroimaging literature. By pairing the results of neuroimaging studies with their relevant meta-data, BrainMap facilitates coordinate-based meta-analysis (CBMA) of the neuroimaging literature en masse or at the level of experimental paradigm, clinical disease, or anatomic location. Initially dedicated to the functional, task-activation literature, BrainMap is now expanding to include voxel-based morphometry (VBM) studies in a separate sector, titled: BrainMap VBM. VBM is a whole-brain, voxel-wise method that measures significant structural differences between or within groups which are reported as standardized, peak x-y-z coordinates. Here we describe BrainMap VBM, including the meta-data structure, current data volume, and automated reverse inference functions (region-to-disease profile) of this new community resource. CBMA offers a robust methodology for retaining true-positive and excluding false-positive findings across studies in the VBM literature. As with BrainMap's functional database, BrainMap VBM may be synthesized en masse or at the level of clinical disease or anatomic location. As a use-case scenario for BrainMap VBM, we illustrate a trans-diagnostic data-mining procedure wherein we explore the underlying network structure of 2,002 experiments representing over 53,000 subjects through independent components analysis (ICA). To reduce data-redundancy effects inherent to any database, we demonstrate two data-filtering approaches that proved helpful to ICA. Finally, we apply hierarchical clustering analysis (HCA) to measure network- and disease-specificity. This procedure distinguished psychiatric from neurological diseases. We invite the neuroscientific community to further exploit BrainMap VBM with other modeling approaches.
Impact of Stone Removal on Renal Function: A Review
Wood, Kyle; Keys, Tristan; Mufarrij, Patrick; Assimos, Dean G
2011-01-01
Stone removal can improve renal function by eradicating obstruction and, in certain cases, an underlying infection. Stone-removing procedures, however, may negatively impact functional integrity. Many factors may affect the latter, including the procedures used, the methods of assessing function, the time when these assessments are made, the occurrence of complications, the baseline condition of the kidney, and patient-related factors. In the majority of cases, little significant functional impairment occurs. However, there are gaps in our knowledge of this subject, including the cumulative effects of multiple procedures violating the renal parenchyma and long-term functional outcomes. PMID:21935339
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
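The core numerical step, estimating a small failure probability for a bottom event by sampling from a density shifted toward the failure region, can be sketched compactly. This is a minimal, non-adaptive stand-in for the adaptive importance sampling named above; the limit-state function and the shift vector are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # hypothetical limit-state function for one bottom event:
    # the event "fails" when g(x) < 0
    return 3.0 - x[:, 0] - 0.5 * x[:, 1]

n = 20000
# standard-normal basic variables; sample from a density shifted toward
# the failure region instead of the nominal density
shift = np.array([2.0, 1.0])
x = rng.normal(0, 1, (n, 2)) + shift
# likelihood ratio phi(x) / phi(x - shift) for unit-variance normals
w = np.exp(-x @ shift + 0.5 * shift @ shift)
pf = np.mean((g(x) < 0) * w)
print(f"estimated failure probability: {pf:.2e}")
```

Plain Monte Carlo would need millions of samples to resolve a probability of this size; re-weighting samples drawn near the failure surface is what makes the estimate cheap.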
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
From direct-space discrepancy functions to crystallographic least squares.
Giacovazzo, Carmelo
2015-01-01
Crystallographic least squares are a fundamental tool for crystal structure analysis. In this paper their properties are derived from functions estimating the degree of similarity between two electron-density maps. The new approach also leads to modifications of the standard least-squares procedures, potentially able to improve their efficiency. The role of the scaling factor between observed and model amplitudes is analysed: the concept of an unlocated model is discussed and its scattering contribution is combined with that arising from the located model. Also, the possible use of an ancillary parameter, to be associated with the classical weight related to the variance of the observed amplitudes, is studied. The crystallographic discrepancy factors, basic tools often combined with least-squares procedures in phasing approaches, are analysed. The mathematical approach described here includes, as a special case, the so-called vector refinement, used when accurate estimates of the target phases are available.
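As context for the discussion of scale factors and weights, a standard weighted crystallographic least-squares target has the form below; the notation is generic rather than taken from the paper.

```latex
% Weighted least squares over reflections h, with the overall scale
% factor k between observed and model amplitudes made explicit:
\begin{equation}
  M(k,\mathbf{p}) \;=\; \sum_{\mathbf{h}}
    w_{\mathbf{h}}
    \bigl(\,|F_{\mathbf{h}}^{\mathrm{obs}}| - k\,|F_{\mathbf{h}}^{\mathrm{calc}}(\mathbf{p})|\,\bigr)^{2},
  \qquad
  w_{\mathbf{h}} \approx 1/\sigma^{2}\!\bigl(|F_{\mathbf{h}}^{\mathrm{obs}}|\bigr),
\end{equation}
```

where p collects the structural parameters; the paper's ancillary parameter would enter alongside the classical variance-based weights w_h.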
NASA Astrophysics Data System (ADS)
Hawdon, Aaron; McJannet, David; Wallace, Jim
2014-06-01
The cosmic-ray probe (CRP) provides continuous estimates of soil moisture over an area of ~30 ha by counting fast neutrons produced from cosmic rays which are predominantly moderated by water molecules in the soil. This paper describes the setup, measurement correction procedures, and field calibration of CRPs at nine locations across Australia with contrasting soil type, climate, and land cover. These probes form the inaugural Australian CRP network, which is known as CosmOz. CRP measurements require neutron count rates to be corrected for effects of atmospheric pressure, water vapor pressure changes, and variations in incoming neutron intensity. We assess the magnitude and importance of these corrections and present standardized approaches for network-wide analysis. In particular, we present a new approach to correct for incoming neutron intensity variations and test its performance against existing procedures used in other studies. Our field calibration results indicate that a generalized calibration function for relating neutron counts to soil moisture is suitable for all soil types, with the possible exception of very sandy soils with low water content. Using multiple calibration data sets, we demonstrate that the generalized calibration function only applies after accounting for persistent sources of hydrogen in the soil profile. Finally, we demonstrate that by following standardized correction procedures and scaling neutron counting rates of all CRPs to a single reference location, differences in calibrations between sites are related to site biomass. This observation provides a means for estimating biomass at a given location or for deriving coefficients for the calibration function in the absence of field calibration data.
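The correction and calibration chain described above can be sketched as follows. The functional forms follow those commonly used in the cosmic-ray neutron sensing literature (barometric, water-vapor, and incoming-intensity factors, plus a Desilets-type calibration function); the coefficients and input values here are illustrative, not the CosmOz network's.

```python
import numpy as np

def correct_counts(n_raw, p, p_ref, beta, h, h_ref, i, i_ref):
    """Apply the three standard CRP count corrections (forms follow the
    common CRNS literature; coefficients are site-dependent)."""
    f_pressure = np.exp(beta * (p - p_ref))   # barometric pressure, hPa
    f_vapour = 1 + 0.0054 * (h - h_ref)       # absolute humidity, g/m^3
    f_intensity = i_ref / i                   # incoming neutron flux scaling
    return n_raw * f_pressure * f_vapour * f_intensity

def counts_to_moisture(n, n0):
    """Generalized (Desilets-type) calibration function; n0 is the
    site-calibrated count rate over dry soil. Returns gravimetric
    water content."""
    return 0.0808 / (n / n0 - 0.372) - 0.115

# hypothetical hourly record
n_corr = correct_counts(n_raw=1450.0, p=1003.0, p_ref=1013.25, beta=0.0077,
                        h=12.0, h_ref=0.0, i=155.0, i_ref=150.0)
print(counts_to_moisture(n_corr, n0=2400.0))
```

The paper's contribution sits largely in the f_intensity term (how to scale all probes to one reference location) and in showing that a single calibration function works across sites once lattice water and other persistent hydrogen pools are accounted for.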
Handbook of clinical nursing practice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asheervath, J.; Blevins, D.R.
Written in outline format, this reference will help nurses further their understanding of advanced nursing procedures. Information is provided on the physiological, psychological, environmental, and safety considerations of nursing activities associated with diagnostic and therapeutic procedures. Special consideration is given to the areas of pediatric nursing, nursing assessment, and selected radiologic and nuclear medicine procedures for each system. Contents: Clinical Introduction. Clinical Nursing Practice: Focus on Basics. Focus on Cardiovascular Function. Focus on Respiratory Function. Focus on Gastrointestinal Function. Focus on Renal and Genito-Urological Function. Focus on Neuro-Skeletal and Muscular Function. Appendices.
Niu, Sheng-Yong; Yang, Jinyu; McDermaid, Adam; Zhao, Jing; Kang, Yu; Ma, Qin
2017-05-08
Metagenomic and metatranscriptomic sequencing approaches are more frequently being used to link microbiota to important diseases and ecological changes. Many analyses have been used to compare the taxonomic and functional profiles of microbiota across habitats or individuals. While a large portion of metagenomic analyses focus on species-level profiling, some studies use strain-level metagenomic analyses to investigate the relationship between specific strains and certain circumstances. Metatranscriptomic analysis provides another important insight into activities of genes by examining gene expression levels of microbiota. Hence, combining metagenomic and metatranscriptomic analyses will help understand the activity or enrichment of a given gene set, such as drug-resistant genes among microbiome samples. Here, we summarize existing bioinformatics tools of metagenomic and metatranscriptomic data analysis, the purpose of which is to assist researchers in deciding the appropriate tools for their microbiome studies. Additionally, we propose an Integrated Meta-Function mapping pipeline to incorporate various reference databases and accelerate functional gene mapping procedures for both metagenomic and metatranscriptomic analyses.
Bayesian function-on-function regression for multilevel functional data.
Meyer, Mark J; Coull, Brent A; Versace, Francesco; Cinciripini, Paul; Morris, Jeffrey S
2015-09-01
Medical and public health research increasingly involves the collection of complex and high-dimensional data. In particular, functional data, where the unit of observation is a curve or set of curves finely sampled over a grid, is frequently obtained. Moreover, researchers often sample multiple curves per person, resulting in repeated functional measures. A common question is how to analyze the relationship between two functional variables. We propose a general function-on-function regression model for repeatedly sampled functional data on a fine grid, presenting a simple model as well as a more extensive mixed model framework, and introducing various functional Bayesian inferential procedures that account for multiple testing. We examine these models via simulation and a data analysis with data from a study that used event-related potentials to examine how the brain processes various types of images.
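A heavily simplified, non-Bayesian sketch of the same regression structure: discretize the linear function-on-function model Y_i(t) = ∫ X_i(s) β(s,t) ds on fixed grids and estimate the coefficient surface by ridge regression. All sizes and the penalty below are arbitrary; the paper's mixed-model and multiple-testing machinery is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n, S, T, lam = 80, 50, 40, 1e-3       # subjects, s-grid, t-grid, ridge penalty
s = np.linspace(0, 1, S)
t = np.linspace(0, 1, T)

X = rng.normal(size=(n, S))                     # predictor curves sampled on s
beta_true = np.outer(np.sin(np.pi * s), np.cos(np.pi * t))
ds = s[1] - s[0]
Y = X @ beta_true * ds + 0.1 * rng.normal(size=(n, T))

# ridge estimate of the coefficient surface beta(s, t), discretizing the
# integral by a Riemann sum with quadrature weight ds
Xs = X * ds
A = Xs.T @ Xs + lam * np.eye(S)
beta_hat = np.linalg.solve(A, Xs.T @ Y)
print(np.mean((beta_hat - beta_true) ** 2))     # recovery error of the surface
```

In the Bayesian version, the ridge penalty is replaced by priors on basis-space coefficients, which is what yields posterior-based inference with multiplicity control.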
Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL
NREL develops standard laboratory analytical procedures that have been validated and allow for reliable bio-oil analysis, including determination of the different hydroxyl groups (-OH) in pyrolysis bio-oil: aliphatic-OH, phenolic-OH, and carboxylic-OH.
NASA Astrophysics Data System (ADS)
Yi, Dake; Wang, TzuChiang
2018-06-01
In the paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure includes a new analytical method and highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed in order to derive three-dimensional crack tip fields. Based on the theoretical analysis, an equation is first derived that describes the relationship among the three-dimensional J-integral J(z), the stress intensity factor K(z), and the tri-axial stress constraint level Tz(z). In the finite element simulations, a fine mesh comprising 153,360 elements is constructed to compute the stress field near the crack front, J(z), and Tz(z). Numerical results show that in the plane very close to the free surface, the K field solution is still valid for in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.
Dynamic variational asymptotic procedure for laminated composite shells
NASA Astrophysics Data System (ADS)
Lee, Chang-Yong
Unlike published shell theories, the main two parts of this thesis are devoted to the asymptotic construction of a refined theory for composite laminated shells valid over a wide range of frequencies and wavelengths. The resulting theory is applicable to shells each layer of which is made of materials with monoclinic symmetry. It enables one to analyze shell dynamic responses within both long-wavelength, low- and high-frequency vibration regimes. It also leads to energy functionals that possess both positive definiteness and sufficient simplicity for all wavelengths. This whole procedure was first performed analytically. From the insight gained from the procedure, a finite element version of the analysis was then developed, and a corresponding computer program, DVAPAS, was written. DVAPAS can obtain the generalized 2-D constitutive law and accurately recover the 3-D results for stress and strain in composite shells. Some independent work will be needed to develop the corresponding 2-D surface analysis associated with the present theory and to continue towards full verification and validation of the present process by comparison with available published works.
Effect of the image resolution on the statistical descriptors of heterogeneous media.
Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime
2018-02-01
The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction due to progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented, and their consequences for the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not retain statistical evidence of the original one, the normalized correlation function diverges from the reference function. Moreover, the equally weighted sum of the average of the squared difference between the discrete correlation functions of the decimated images and the reference functions leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintain the statistical quality of the digitized sample.
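The kind of comparison the study performs can be reproduced in miniature: compute the two-point correlation function of a binary image before and after one decimation step and compare on matching physical lags. The sketch below uses plain subsampling as a stand-in for the three decimation procedures studied, and a thresholded random field as the medium.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)

def two_point(img):
    """Two-point probability S2 via FFT autocorrelation (periodic BCs)."""
    f = np.fft.fft2(img)
    s2 = np.fft.ifft2(f * np.conj(f)).real / img.size
    return np.fft.fftshift(s2)

def radial_profile(s2):
    """Average S2 over rings of equal lag distance."""
    c = np.array(s2.shape) // 2
    y, x = np.indices(s2.shape)
    r = np.hypot(y - c[0], x - c[1]).astype(int)
    return np.bincount(r.ravel(), s2.ravel()) / np.bincount(r.ravel())

# synthetic binary medium: thresholded smoothed noise
img = (gaussian_filter(rng.normal(size=(256, 256)), 4) > 0).astype(float)

s2_full = radial_profile(two_point(img))
s2_deci = radial_profile(two_point(img[::2, ::2]))  # one subsampling step

# compare on the same physical distances: lag r in the decimated image
# corresponds to lag 2r in the original
err = np.mean((s2_full[:64:2] - s2_deci[:32]) ** 2)
print(err)
```

Repeating the subsampling and tracking `err` against the correlation length of `s2_full` reproduces, qualitatively, the threshold behavior the abstract describes.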
Kokaly, Raymond F.
2011-01-01
This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.
Bagging Voronoi classifiers for clustering spatial functional data
NASA Astrophysics Data System (ADS)
Secchi, Piercesare; Vantini, Simone; Vitelli, Valeria
2013-06-01
We propose a bagging strategy based on random Voronoi tessellations for the exploration of geo-referenced functional data, suitable for different purposes (e.g., classification, regression, dimensional reduction, …). Motivated by an application to environmental data contained in the Surface Solar Energy database, we focus in particular on the problem of clustering functional data indexed by the sites of a finite spatial lattice. We thus illustrate our strategy by implementing a specific algorithm whose rationale is to (i) replace the original data set with a reduced one, composed of local representatives of neighborhoods covering the entire investigated area; (ii) analyze the local representatives; (iii) repeat the previous analysis many times for different reduced data sets associated with randomly generated sets of neighborhoods, thus obtaining many different weak formulations of the analysis; and (iv) finally, bag together the weak analyses to obtain a conclusive strong analysis. Through an extensive simulation study, we show that this new procedure, which does not require an explicit model for spatial dependence, is statistically and computationally efficient.
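A minimal sketch of steps (i)-(iv) on a synthetic lattice. A co-association matrix is used here for the final bagging step, which is one simple way to aggregate weak clusterings; the original algorithm's aggregation rule may differ, and all sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n_side, T, K, B = 20, 30, 2, 50            # lattice side, curve length, clusters, bags
sites = np.array([(i, j) for i in range(n_side) for j in range(n_side)], float)
labels_true = (sites[:, 0] < n_side // 2).astype(int)   # two spatial regimes
t = np.linspace(0, 1, T)
curves = np.sin(2 * np.pi * t) * (1 + labels_true[:, None]) \
    + 0.3 * rng.normal(size=(len(sites), T))

def kmeans(X, k, iters=20):
    """Tiny k-means, enough for the weak analyses."""
    cent = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - cent) ** 2).sum(-1), axis=1)
        cent = np.array([X[lab == j].mean(0) if np.any(lab == j) else cent[j]
                         for j in range(k)])
    return lab

co = np.zeros((len(sites), len(sites)))     # co-association across bags
for _ in range(B):
    seeds = rng.choice(len(sites), 40, replace=False)    # (i) random tessellation
    cell = np.argmin(((sites[:, None] - sites[seeds]) ** 2).sum(-1), axis=1)
    reps = np.array([curves[cell == c].mean(0) for c in range(len(seeds))])
    lab = kmeans(reps, K)[cell]              # (ii)-(iii) weak analysis, mapped to sites
    co += lab[:, None] == lab[None, :]
final = kmeans(co / B, K)                    # (iv) bag the weak analyses

acc = (final == labels_true).mean()
print(max(acc, 1 - acc))                     # labels are defined up to permutation
```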
NASA Technical Reports Server (NTRS)
Scoggins, J. R.; Clark, T. L.; Possiel, N. C.
1975-01-01
Procedures for forecasting clear air turbulence in the stratosphere over the western United States from rawinsonde data are described and results presented. Approaches taken to relate meteorological parameters to regions of turbulence and nonturbulence encountered by the XB-70 during 46 flights at altitudes between 12 and 20 km include: empirical probabilities, discriminant function analysis, and mountain-wave theory. Results from these techniques were combined into a procedure to forecast regions of clear air turbulence with an accuracy of 70-80 percent. A computer program was developed to provide an objective forecast directly from the rawinsonde sounding data.
Expression of Plant Receptor Kinases in Tobacco BY-2 Cells.
Shinohara, Hidefumi; Matsubayashi, Yoshikatsu
2017-01-01
Although more than 600 single-transmembrane receptor kinase genes have been found in the Arabidopsis genome, only a few of them have known physiological functions, and even fewer plant receptor kinases have known specific ligands. Ligand-binding analysis must be performed using a functionally expressed form of the receptor. However, the relative abundance of native receptor kinase molecules in the plasma membrane is often quite low. Here, we present a method for stable and functional expression of plant receptor kinases in tobacco BY-2 cells that allows preparation of microsomal fractions containing the receptor. This procedure provides a sufficient amount of receptor protein while maintaining its ligand-binding activity.
Rainoldi, Giulia; Begnini, Fabio; de Munnik, Mariska; Lo Presti, Leonardo; Vande Velde, Christophe M L; Orru, Romano; Lesma, Giordano; Ruijter, Eelco; Silvani, Alessandra
2018-02-12
We developed two Ugi-type three-component reactions of spirooxindole-fused 3-thiazolines, isocyanides, and either carboxylic acids or trimethylsilyl azide, to give highly functionalized spirooxindole-fused thiazolidines. Two diverse libraries were generated using practical and robust procedures affording the products in typically good yields. The obtained thiazolidines proved to be suitable substrates for further transformations. Notably, both the Ugi-Joullié and the azido-Ugi reactions proved highly diastereoselective, affording predominantly the trans-configured products, as confirmed by X-ray crystallographic analysis.
NASA Astrophysics Data System (ADS)
Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina
2018-01-01
In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
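Numerically, a combined distribution function of this kind reduces to a joint histogram of two coordinates extracted frame-by-frame from the trajectory, for example an ion-ligand distance paired with a ligand-ion-ligand angle. A sketch with synthetic stand-in data follows; the paper's actual coordinate pairs and binning are not specified here.

```python
import numpy as np

rng = np.random.default_rng(7)
# stand-ins for per-frame MD quantities: ion-oxygen distances (angstrom)
# and O-ion-O angles (degrees)
dist = rng.normal(2.4, 0.1, 100_000)
angle = rng.normal(90.0, 8.0, 100_000)

# combined distribution function: the joint histogram whose peak pattern
# serves as the fingerprint of the coordination polyhedron
H, d_edges, a_edges = np.histogram2d(dist, angle, bins=(80, 90),
                                     range=[[2.0, 3.0], [40.0, 180.0]])
H /= H.sum()
i, j = np.unravel_index(H.argmax(), H.shape)
print(d_edges[i], a_edges[j])                # location of the dominant peak
```

An octahedral complex, for instance, would show angle peaks near 90° and 180° at the first-shell distance, while a tetrahedral one peaks near 109°, which is the kind of unambiguous assignment to a reference polyhedron the abstract describes.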
Development of non-linear finite element computer code
NASA Technical Reports Server (NTRS)
Becker, E. B.; Miller, T.
1985-01-01
Recent work has shown that the use of separable symmetric functions of the principal stretches can adequately describe the response of certain propellant materials and, further, that a data reduction scheme gives a convenient way of obtaining the values of the functions from experimental data. Based on this representation of the energy, a computational scheme was developed that allows finite element analysis of boundary value problems of arbitrary shape and loading. The computational procedure was implemented in a three-dimensional finite element code, TEXLESP-S, which is documented herein.
NASA Astrophysics Data System (ADS)
Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.; Nocera, Emanuele R.; Rojo, Juan
2017-08-01
We present NNFF1.0, a new determination of the fragmentation functions (FFs) of charged pions, charged kaons, and protons/antiprotons from an analysis of single-inclusive hadron production data in electron-positron annihilation. This determination, performed at leading, next-to-leading, and next-to-next-to-leading order in perturbative QCD, is based on the NNPDF methodology, a fitting framework designed to provide a statistically sound representation of FF uncertainties and to minimise any procedural bias. We discuss novel aspects of the methodology used in this analysis, namely an optimised parametrisation of FFs and a more efficient χ² minimisation strategy, and validate the FF fitting procedure by means of closure tests. We then present the NNFF1.0 sets, and discuss their fit quality, their perturbative convergence, and their stability upon variations of the kinematic cuts and the fitted dataset. We find that the systematic inclusion of higher-order QCD corrections significantly improves the description of the data, especially in the small-z region. We compare the NNFF1.0 sets to other recent sets of FFs, finding in general a reasonable agreement, but also important differences. Together with existing sets of unpolarised and polarised parton distribution functions (PDFs), FFs and PDFs are now available from a common fitting framework for the first time.
Discriminant analysis in wildlife research: Theory and applications
Williams, B.K.; Capen, D.E.
1981-01-01
Discriminant analysis, a method of analyzing grouped multivariate data, is often used in ecological investigations. It has both a predictive and an explanatory function, the former aiming at classification of individuals of unknown group membership. The goal of the latter function is to exhibit group separation by means of linear transforms, and the corresponding method is called canonical analysis. This discussion focuses on the application of canonical analysis in ecology. In order to clarify its meaning, a parametric approach is taken instead of the usual data-based formulation. For certain assumptions the data-based canonical variates are shown to result from maximum likelihood estimation, thus ensuring consistency and asymptotic efficiency. The distorting effects of covariance heterogeneity are examined, as are certain difficulties which arise in interpreting the canonical functions. A 'distortion metric' is defined, by means of which distortions resulting from the canonical transformation can be assessed. Several sampling problems which arise in ecological applications are considered. It is concluded that the method may prove valuable for data exploration, but is of limited value as an inferential procedure.
Response surface method in geotechnical/structural analysis, phase 1
NASA Astrophysics Data System (ADS)
Wong, F. S.
1981-02-01
In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
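The workflow (fit a cheap approximating function to a handful of code runs, then run the Monte Carlo on the surrogate) can be sketched in a few lines. The "expensive code", its two parameters, and their distributions below are hypothetical; the four-run fit mirrors the abstract's "as few as four code calculations".

```python
import numpy as np

rng = np.random.default_rng(8)

def expensive_code(c, phi):
    """Stand-in for the long-running geotechnical code: a safety
    factor as a function of two random soil parameters."""
    return 0.8 + 0.02 * c + 0.01 * phi - 1e-4 * c * phi

# a handful of code calculations at chosen design points
pts = np.array([[20, 25], [40, 25], [20, 35], [40, 35]], float)
fs = np.array([expensive_code(c, p) for c, p in pts])

# fit a linear-plus-interaction response surface to the four runs
A = np.column_stack([np.ones(4), pts[:, 0], pts[:, 1], pts[:, 0] * pts[:, 1]])
coef = np.linalg.solve(A, fs)

# cheap Monte Carlo on the surface instead of the code
c = rng.normal(30, 5, 100_000)
phi = rng.normal(30, 3, 100_000)
surf = coef[0] + coef[1] * c + coef[2] * phi + coef[3] * c * phi
print((surf < 1.0).mean())      # estimated probability that the safety factor < 1
```

Each surrogate evaluation is a few multiplications, so the 100,000-sample Monte Carlo that would be infeasible against the real code becomes instantaneous.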
Ezekiel, Fredrick; Bosma, Rachael; Morton, J Bruce
2013-07-01
The Dimensional Change Card Sort (DCCS) is a standard procedure for assessing executive functioning early in development. In the task, participants switch from sorting cards one way (e.g., by color) to sorting them a different way (e.g., by shape). Traditional accounts associate age-related changes in DCCS performance with circumscribed changes in lateral prefrontal cortex (lPFC) functioning, but evidence of age-related differences in the modulation of lPFC activity by switching is mixed. The current study therefore tested for possible age-related differences in functional connectivity of lPFC with regions that comprise a larger cognitive control network. Functional magnetic resonance imaging (fMRI) data collected from children and adults performing the DCCS were analyzed by means of independent components analysis (ICA). The analysis revealed several important age-related differences in functional connectivity of lPFC. In particular, lPFC was more strongly connected with the anterior cingulate, inferior parietal cortex, and the ventral tegmental area in adults than in children. Theoretical implications are discussed.
Optimizing cost-efficiency in mean exposure assessment - cost functions reconsidered
Mathiassen, Svend Erik; Bolin, Kristian
2011-01-01
Reliable exposure data is a vital concern in medical epidemiology and intervention studies. The present study addresses the needs of the medical researcher to spend monetary resources devoted to exposure assessment with optimal cost-efficiency, i.e., obtain the best possible statistical performance at a specified budget. A few previous studies have suggested mathematical optimization procedures based on very simple cost models; this study extends the methodology to cover even non-linear cost scenarios. Statistical performance, i.e., efficiency, was assessed in terms of the precision of an exposure mean value, as determined in a hierarchical, nested measurement model with three stages. Total costs were assessed using a corresponding three-stage cost model, allowing costs at each stage to vary non-linearly with the number of measurements according to a power function. Using these models, procedures for identifying the optimally cost-efficient allocation of measurements under a constrained budget were developed, and applied to 225 scenarios combining different sizes of unit costs, cost function exponents, and exposure variance components. Explicit mathematical rules for identifying optimal allocation could be developed when cost functions were linear, while non-linear cost functions implied that parts of or the entire optimization procedure had to be carried out using numerical methods. For many of the 225 scenarios, the optimal strategy consisted in measuring on only one occasion from each of as many subjects as allowed by the budget. Significant deviations from this principle occurred if costs for recruiting subjects were large compared to costs for setting up measurement occasions and, at the same time, the between-subjects to within-subject variance ratio was small. In these cases, non-linearities had a profound influence on the optimal allocation and on the eventual size of the exposure data set. The analysis procedures developed in the present study can be used for informed design of exposure assessment strategies, provided that data are available on exposure variability and the costs of collecting and processing data. The present shortage of empirical evidence on costs and appropriate cost functions, however, impedes general conclusions on optimal exposure measurement strategies in different epidemiologic scenarios. PMID:21600023
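When no closed-form rule applies, the numerical search the authors describe can be as simple as enumerating feasible allocations. The sketch below is reduced to two stages for brevity (the study uses a three-stage model); unit costs, exponents, and variance components are invented.

```python
import numpy as np

# hypothetical unit costs, cost-function exponents, and variance components
c_subj, c_occ, budget = 100.0, 40.0, 5000.0
a_subj, a_occ = 1.0, 1.2            # power-model exponents (non-linear costs)
var_b, var_w = 1.0, 4.0             # between-subject / within-subject variance

best = None
for n in range(1, 200):             # number of subjects
    for m in range(1, 50):          # measurement occasions per subject
        cost = c_subj * n ** a_subj + c_occ * (n * m) ** a_occ
        if cost > budget:
            break                   # cost grows with m; no larger m is feasible
        prec = 1.0 / (var_b / n + var_w / (n * m))   # precision of the mean
        if best is None or prec > best[0]:
            best = (prec, n, m, cost)

prec, n, m, cost = best
print(f"n={n} subjects, m={m} occasions each, cost={cost:.0f}, precision={prec:.2f}")
```

With a_occ > 1, adding occasions gets progressively more expensive, which is the mechanism by which non-linearity can pull the optimum away from the "one occasion per subject" rule.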
Models and analysis for multivariate failure time data
NASA Astrophysics Data System (ADS)
Shih, Joanna Huang
The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al., and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood: at stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood; at stage 2, we estimate the dependency structure with the margins fixed at the estimated ones. The third procedure is two-stage partially parametric maximum likelihood. It is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness-of-fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the performance of these two methods using actual and computer-generated data.
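For concreteness, one copula family of the kind compared in this work is the Clayton model, written here for bivariate survival data with generic notation not taken from the thesis:

```latex
% Clayton-family bivariate survival model with marginal survival
% functions S_1, S_2 and dependence parameter \theta > 0:
\begin{equation}
  S(t_1, t_2) \;=\;
  \bigl( S_1(t_1)^{-\theta} + S_2(t_2)^{-\theta} - 1 \bigr)^{-1/\theta}.
\end{equation}
```

In the two-stage procedures described above, S_1 and S_2 are estimated first (parametrically, or by Kaplan-Meier in the partially parametric variant) and then held fixed while θ alone is estimated by maximizing the copula likelihood.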
Arterial input function derived from pairwise correlations between PET-image voxels.
Schain, Martin; Benjaminsson, Simon; Varnäs, Katarina; Forsberg, Anton; Halldin, Christer; Lansner, Anders; Farde, Lars; Varrone, Andrea
2013-07-01
A metabolite-corrected arterial input function is a prerequisite for quantification of positron emission tomography (PET) data by compartmental analysis. This quantitative approach is also necessary for radioligands without suitable reference regions in the brain. The measurement is laborious and requires cannulation of a peripheral artery, a procedure that can be associated with patient discomfort and potential adverse events. A non-invasive procedure for obtaining the arterial input function is thus preferable. In this study, we present a novel method to obtain image-derived input functions (IDIFs). The method is based on calculation of the Pearson correlation coefficient between the time-activity curves of voxel pairs in the PET image to localize voxels displaying blood-like behavior. The method was evaluated using data obtained in human studies with the radioligands [(11)C]flumazenil and [(11)C]AZ10419369, and its performance was compared with three previously published methods. The distribution volumes (VT) obtained using IDIFs were compared with those obtained using traditional arterial measurements. Overall, the agreement in VT was good (~3% difference) for input functions obtained using the pairwise correlation approach. This approach performed similarly or even better than the other methods, and could be considered in applied clinical studies. Applications to other radioligands are needed for further verification.
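A crude reading of the voxel-pair correlation idea: compute Pearson correlations between all voxel time-activity curves, rank voxels by how strongly they correlate with the rest, and average the top-ranked curves. Sizes, the ranking rule, and the cut-off below are assumptions for illustration, not the published method.

```python
import numpy as np

rng = np.random.default_rng(5)
n_vox, n_frames = 2000, 30
tac = rng.normal(size=(n_vox, n_frames))    # stand-in voxel time-activity curves

r = np.corrcoef(tac)                        # Pearson r between all voxel pairs
np.fill_diagonal(r, 0.0)

# voxels whose curves correlate strongly with many others form a coherent,
# blood-like cluster; average the top-ranked curves as the image-derived input
score = r.mean(axis=1)
blood_like = np.argsort(score)[-50:]        # arbitrary top-50 cut-off
idif = tac[blood_like].mean(axis=0)
print(idif.shape)
```

In real data the selected voxels would then still need metabolite correction before the curve can serve as an input function for compartmental modeling.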
Physiological responses to environmental factors related to space flight
NASA Technical Reports Server (NTRS)
Pace, N.; Grunbaum, B. W.; Kodama, A. M.; Mains, R. C.; Rahlmann, D. F.
1975-01-01
Physiological procedures and instrumentation developed for the measurement of hemodynamic and metabolic parameters during prolonged periods of weightlessness are described along with the physiological response of monkeys to weightlessness. Specific areas examined include: cardiovascular studies; thyroid function; blood oxygen transport; growth and reproduction; excreta analysis for metabolic balance studies; and electrophoretic separation of creatine phosphokinase isoenzymes in human blood.
ERIC Educational Resources Information Center
Steeve, Roger W.; Price, Christiana M.
2010-01-01
An empirical method for investigating differences in neural control of jaw movement across oromandibular behaviours is to compute the coherence function for electromyographic signals obtained from mandibular muscle groups. This procedure has been used with adults but has not been extended to children. This pilot study investigated whether coherence analysis…
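For two simultaneously recorded EMG channels, the coherence function is directly available in SciPy. The sketch below uses synthetic signals sharing a common 20-Hz drive; the sampling rate and segment length are chosen arbitrarily, not taken from the study.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(6)
fs = 2000.0                               # assumed EMG sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
drive = np.sin(2 * np.pi * 20 * t)        # shared 20-Hz drive to both muscles
emg1 = drive + rng.normal(0, 1, t.size)   # stand-ins for two EMG channels
emg2 = drive + rng.normal(0, 1, t.size)

f, cxy = coherence(emg1, emg2, fs=fs, nperseg=1024)
print(f[np.argmax(cxy)])                  # coherence peaks near the common drive
```

A coherence peak at a shared frequency is the signature of common neural drive to the two muscle groups, which is the quantity the study compares across behaviours and ages.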
NASA Technical Reports Server (NTRS)
Steinwachs, W. L.; Patrick, J. W.; Galvin, D. M.; Turkel, S. H.
1972-01-01
The findings of the support operations activity group of the orbital operations study are presented. Element interfaces, alternate approaches, design concepts, operational procedures, functional requirements, design influences, and approach selection are presented. The following areas are considered: (1) crew transfer, (2) cargo transfer, (3) propellant transfer, (4) attached element operations, and (5) attached element transport.
ERIC Educational Resources Information Center
Bruce, Robert; And Others
This report presents an overview of research objectives, sampling approaches, data collection procedures, and instruments and plans for analysis used in assessing the impact of the President's Council on Physical Fitness and Sports on different types of fitness programs. Surveys were conducted of: (1) community fitness programs; (2) employee…
ERIC Educational Resources Information Center
Ramirez, Matias
2017-01-01
Businesses and Human Resources professionals face the ongoing challenge of continuously upskilling and developing employees. Changes to processes or procedures, changes in technology, changes in job functions, and updates or changes to compliance laws or regulations are all reasons that employees must attend and complete employer-developed…
ERIC Educational Resources Information Center
Cohen, Joseph
This report examines the legal and regulatory structure of basic education in Indonesia beginning in 1989, when Education Law Number 2 was enacted (from which all current regulations, policies, and procedures can be traced). In 1999, two key laws (Number 22 and Number 25) were passed that required the decentralization of many government functions.…
Application of two procedures for dual-point design of transonic airfoils
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.; Campbell, Richard L.; Allison, Dennis O.
1994-01-01
Two dual-point design procedures were developed to reduce the objective function of a baseline airfoil at two design points. The first procedure to develop a redesigned airfoil used a weighted average of the shapes of two intermediate airfoils redesigned at each of the two design points. The second procedure used a weighted average of two pressure distributions obtained from an intermediate airfoil redesigned at each of the two design points. Each procedure was used to design a new airfoil with reduced wave drag at the cruise condition without increasing the wave drag or pitching moment at the climb condition. Two cycles of the airfoil shape-averaging procedure successfully designed a new airfoil that reduced the objective function and satisfied the constraints. One cycle of the target (desired) pressure-averaging procedure was used to design two new airfoils that reduced the objective function and came close to satisfying the constraints.
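The shape-averaging step of the first procedure amounts to a pointwise weighted average of two redesigned geometries on a common chordwise grid. A toy sketch follows; the thickness distributions and the weight are invented, not the paper's airfoils.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)                 # shared chordwise grid
y_cruise = 0.12 * (1 - x) * np.sqrt(x)         # stand-in: cruise-point redesign
y_climb = 0.10 * (1 - x) * np.sqrt(x)          # stand-in: climb-point redesign

w = 0.6                                        # weighting between design points
y_dual = w * y_cruise + (1 - w) * y_climb      # shape-averaged dual-point airfoil
print(y_dual.max())
```

The second procedure applies the same weighted average to target pressure distributions rather than to the ordinates themselves, and then inverts the averaged target back to a shape.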
Human (13)N-ammonia PET studies: the importance of measuring (13)N-ammonia metabolites in blood.
Keiding, Susanne; Sørensen, Michael; Munk, Ole Lajord; Bender, Dirk
2010-03-01
Dynamic (13)N-ammonia PET is used to assess ammonia metabolism in brain, liver and muscle based on kinetic modeling of metabolic pathways, using arterial blood (13)N-ammonia as input function. Rosenspire et al. (1990) introduced a solid phase extraction procedure for fractionation of (13)N-content in blood into (13)N-ammonia, (13)N-urea, (13)N-glutamine and (13)N-glutamate. Due to the 10-min radioactive half-life of (13)N, the procedure is not suitable for blood samples taken beyond 5-7 min after tracer injection. By modifying Rosenspire's method, we established a procedure enabling analysis of up to 10 blood samples in the course of 30 min. The modified procedure was validated by HPLC and by 30-min reproducibility studies in humans examined by duplicate (13)N-ammonia injections with a 60-min interval. Blood data from a (13)N-ammonia brain PET study (from Keiding et al. 2006) showed: (1) time courses of (13)N-ammonia fractions could be described adequately by double exponential functions; (2) metabolic conversion of (13)N-ammonia to (13)N-metabolites was in the order: healthy subjects > cirrhotic patients without HE > cirrhotic patients with HE; (3) kinetics of initial tracer distribution in tissue can be assessed by using total (13)N-concentration in blood as input function, whereas assessment of metabolic processes requires (13)N-ammonia measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Computer graphics for quality control in the INAA of geological samples
Grossman, J.N.; Baedecker, P.A.
1987-01-01
A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures have been developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time.
Procedural Learning and Dyslexia
ERIC Educational Resources Information Center
Nicolson, R. I.; Fawcett, A. J.; Brookes, R. L.; Needle, J.
2010-01-01
Three major "neural systems", specialized for different types of information processing, are the sensory, declarative, and procedural systems. It has been proposed ("Trends Neurosci.",30(4), 135-141) that dyslexia may be attributable to impaired function in the procedural system together with intact declarative function. We provide a brief…
Stiffness and strength of fiber reinforced polymer composite bridge deck systems
NASA Astrophysics Data System (ADS)
Zhou, Aixi
This research investigates two principal characteristics that are of primary importance in Fiber Reinforced Polymer (FRP) bridge deck applications: STIFFNESS and STRENGTH. The research was undertaken by investigating the stiffness and strength characteristics of multi-cellular FRP bridge deck systems consisting of pultruded FRP shapes. A systematic analysis procedure was developed for the stiffness analysis of multi-cellular FRP deck systems. This procedure uses the Method of Elastic Equivalence to model the cellular deck as an equivalent orthotropic plate. The procedure provides a practical method to predict the equivalent orthotropic plate properties of cellular FRP decks. Analytical solutions for the bending analysis of single span decks were developed using classical laminated plate theory. The analysis procedures can be extended to analyze continuous FRP decks, and can be further developed using higher-order plate theories. Several failure modes of the cellular FRP deck systems were recorded and analyzed through laboratory and field tests and Finite Element Analysis (FEA). Two schemes of loading patches were used in the laboratory tests: a steel patch made according to AASHTO bridge testing specifications, and a tire patch made from a real truck tire reinforced with silicone rubber. The tire patch was specially designed to simulate service loading conditions by modifying the real contact loading from a tire. Our research shows that the effects of the stiffness and contact conditions of loading patches are significant in the stiffness and strength testing of FRP decks. Due to the localization of load, a simulated tire patch yields larger deflection than the steel patch under the same loading level. The tire patch also produces a significantly different failure mode compared to the steel patch: a local bending mode with less damage for the tire patch, and a local punching-shear mode for the steel patch. A deck failure function method is proposed for predicting the failure of FRP decks. Using developed laminated composite theories and FEA techniques, a strength analysis procedure containing ply-level information was proposed and detailed for FRP deck systems. The behavior of the deck's unsupported (free) edges was also investigated using ply-level FEA.
32 CFR 989.37 - Procedures for analysis abroad.
Code of Federal Regulations, 2011 CFR
2011-07-01
Environmental Impact Analysis Process (EIAP), § 989.37, Procedures for analysis abroad: procedures for analysis of environmental actions abroad are contained in 32 CFR part 187. That directive provides…
Rani R, Hannah Jessie; Victoire T, Aruldoss Albert
2018-01-01
This paper presents an integrated hybrid optimization algorithm for training the radial basis function neural network (RBF NN). Training of neural networks remains a challenging exercise in the machine learning domain. Traditional training algorithms generally tend to become trapped in local optima and lead to premature convergence, which makes them ineffective when applied to datasets with diverse features. Training algorithms based on evolutionary computation are becoming popular due to their robustness in overcoming the drawbacks of traditional algorithms. Accordingly, this paper proposes a hybrid training procedure in which the differential search (DS) algorithm is functionally integrated with particle swarm optimization (PSO). To overcome local trapping of the search procedure, a new population initialization scheme is proposed using a logistic chaotic sequence, which enhances population diversity and aids the search capability. To demonstrate the effectiveness of the proposed RBF hybrid training algorithm, experimental analysis on 7 publicly available benchmark datasets is performed. Subsequently, experiments were conducted on a practical application case, wind speed prediction, to demonstrate the superiority of the proposed RBF training algorithm in terms of prediction accuracy.
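A minimal Python sketch of the logistic chaotic initialization idea mentioned in the abstract follows; the mapping of the chaotic sequence onto the search bounds, the seed value, and the function name are assumptions, since the paper's exact scheme is not reproduced here.

```python
import numpy as np

def logistic_chaotic_init(pop_size, dim, lower, upper, mu=4.0, seed=0.7):
    """Initialize a population with the logistic map x_{k+1} = mu*x_k*(1-x_k),
    mapped onto the search bounds, to improve population diversity."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pop = np.empty((pop_size, dim))
    x = seed                                  # seed in (0,1); avoid 0.25, 0.5, 0.75
    for i in range(pop_size):
        for j in range(dim):
            x = mu * x * (1.0 - x)            # chaotic update
            pop[i, j] = lower[j] + x * (upper[j] - lower[j])
    return pop

# Example: a 30-particle swarm over a 5-dimensional search space in [-1, 1]
swarm = logistic_chaotic_init(pop_size=30, dim=5, lower=[-1] * 5, upper=[1] * 5)
print(swarm.shape, swarm.min(), swarm.max())
```

Compared with uniform random initialization, the chaotic sequence is deterministic yet non-repeating, which is the property such schemes exploit to spread particles over the search space.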
Hormonal therapy is associated with better self-esteem, mood, and quality of life in transsexuals.
Gorin-Lazard, Audrey; Baumstarck, Karine; Boyer, Laurent; Maquigneau, Aurélie; Penochet, Jean-Claude; Pringuey, Dominique; Albarel, Frédérique; Morange, Isabelle; Bonierbale, Mireille; Lançon, Christophe; Auquier, Pascal
2013-11-01
Few studies have assessed the role of cross-sex hormones on psychological outcomes during the period of hormonal therapy preceding sex reassignment surgery in transsexuals. The objective of this study was to assess the relationship between hormonal therapy, self-esteem, depression, quality of life (QoL), and global functioning. This study incorporated a cross-sectional design. The inclusion criteria were diagnosis of gender identity disorder (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) and inclusion in a standardized sex reassignment procedure. The outcome measures were self-esteem (Social Self-Esteem Inventory), mood (Beck Depression Inventory), QoL (Subjective Quality of Life Analysis), and global functioning (Global Assessment of Functioning). Sixty-seven consecutive individuals agreed to participate. Seventy-three percent received hormonal therapy. Hormonal therapy was an independent factor in greater self-esteem, less severe depression symptoms, and greater "psychological-like" dimensions of QoL. These findings should provide pertinent information for health care providers who consider this period as a crucial part of the global sex reassignment procedure.
Kauvar, Arielle N B; Cronin, Terrence; Roenigk, Randall; Hruza, George; Bennett, Richard
2015-05-01
Basal cell carcinoma (BCC) is the most common cancer in the US population, affecting approximately 2.8 million people per year. Basal cell carcinomas are usually slow-growing and rarely metastasize, but they do cause localized tissue destruction, compromised function, and cosmetic disfigurement. To provide clinicians with guidelines for the management of BCC based on evidence from a comprehensive literature review, and consensus among the authors. An extensive review of the medical literature was conducted to evaluate the optimal treatment methods for cutaneous BCC, taking into consideration cure rates, recurrence rates, aesthetic and functional outcomes, and cost-effectiveness of the procedures. Surgical approaches provide the best outcomes for BCCs. Mohs micrographic surgery provides the highest cure rates while maximizing tissue preservation, maintenance of function, and cosmesis. Mohs micrographic surgery is an efficient and cost-effective procedure and remains the treatment of choice for high-risk BCCs and for those in cosmetically sensitive locations. Nonsurgical modalities may be used for low-risk BCCs when surgery is contraindicated or impractical, but the cure rates are lower.
Jawad, Zaynab A R; Tsim, Nicole; Pai, Madhava; Bansi, Dev; Westaby, David; Vlavianos, Panagiotis; Jiao, Long R
2016-02-01
To evaluate the short- and long-term outcomes of duodenum-preserving pancreatic head resection (DPPHR) procedures in the treatment of painful chronic pancreatitis. A systematic literature search was performed to identify all comparative studies evaluating long- and short-term postoperative outcomes (pain relief, morbidity and mortality, pancreatic exocrine and endocrine function). Five published studies fulfilled the inclusion criteria, including one randomized controlled trial comparing the Beger and Frey procedures. In total, 323 patients underwent surgical procedures for chronic pancreatitis, including the Beger (n = 138), Frey (n = 99), minimal Frey (n = 32), modified Frey (n = 25) and Berne modification (n = 29) procedures. Two studies comparing the Beger and Frey procedures were entered into a meta-analysis and showed no difference in post-operative pain (RD = -0.06; CI -0.21 to 0.09), mortality (RD = 0.01; CI -0.03 to 0.05), morbidity (RD = 0.12; CI -0.00 to 0.24), exocrine insufficiency (RD = 0.04; CI -0.10 to 0.18) or endocrine insufficiency (RD = -0.14; CI -0.28 to 0.01). All procedures are equally effective for the management of pain in chronic pancreatitis. The choice of procedure should be determined by other factors, including the presence of secondary complications of pancreatitis and intra-operative findings. Registration number CRD42015019275. Centre for Reviews and Dissemination, University of York, 2009. Copyright © 2015 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.
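For readers unfamiliar with the pooled risk differences quoted above (e.g., RD = -0.06; CI -0.21 to 0.09), here is a hedged sketch of fixed-effect inverse-variance pooling of risk differences in Python; the study counts are hypothetical, and the review's actual meta-analytic model and software are not stated in the abstract.

```python
import numpy as np
from scipy import stats

def pooled_risk_difference(events_a, n_a, events_b, n_b):
    """Fixed-effect (inverse-variance) pooled risk difference with a 95% CI,
    computed from per-study 2x2 counts."""
    pa = np.asarray(events_a) / np.asarray(n_a)
    pb = np.asarray(events_b) / np.asarray(n_b)
    rd = pa - pb                                          # per-study risk difference
    var = pa * (1 - pa) / np.asarray(n_a) + pb * (1 - pb) / np.asarray(n_b)
    w = 1.0 / var                                         # inverse-variance weights
    rd_pooled = np.sum(w * rd) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    z = stats.norm.ppf(0.975)
    return rd_pooled, (rd_pooled - z * se, rd_pooled + z * se)

# Hypothetical event counts from two studies comparing two procedures
print(pooled_risk_difference([5, 8], [40, 60], [6, 9], [45, 55]))
```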
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties; possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation, and concerning the best aggregation procedure for a given DEM, were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. In particular, the socio-economic information and monetary transfer functions required for a damage risk analysis show high uncertainty. This study therefore helps to identify the weak points of the flood risk and damage risk assessment procedure.
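A hedged sketch of the Monte Carlo step described above: perturb the DEM with Gaussian elevation error and derive per-cell flooding probabilities. FLOODMAP's actual algorithm (e.g., hydraulic connectivity to the river line) is not reproduced; the error magnitude and grid below are hypothetical.

```python
import numpy as np

def flood_probability(dem, water_level, dem_sigma, n_sims=500, rng=None):
    """Per-cell flooding probability: perturb the DEM with Gaussian elevation
    error and count how often each cell lies below the water-level plane."""
    rng = np.random.default_rng(rng)
    dem = np.asarray(dem, float)
    flooded = np.zeros_like(dem)
    for _ in range(n_sims):
        noisy = dem + rng.normal(0.0, dem_sigma, size=dem.shape)
        flooded += (noisy < water_level)          # simple planar inundation test
    return flooded / n_sims

# Hypothetical DEM tile (m a.s.l.), water level 51.2 m, 0.3 m DEM error
dem = 50.0 + np.random.default_rng(1).random((100, 100)) * 3.0
prob = flood_probability(dem, water_level=51.2, dem_sigma=0.3)
print(prob.mean())
```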
NASA Astrophysics Data System (ADS)
Ciancio, P. M.; Rossit, C. A.; Laura, P. A. A.
2007-05-01
This study is concerned with the vibration analysis of a cantilevered rectangular anisotropic plate when a concentrated mass is rigidly attached to its center point. Based on the classical theory of anisotropic plates, the Ritz method is employed to perform the analysis. The deflection of the plate is approximated by a set of beam functions in each principal coordinate direction. The influence of the mass magnitude on the natural frequencies and modal shapes of vibration is studied for a boron-epoxy plate and also in the case of a generic anisotropic material. The classical Ritz method with beam functions as the spatial approximation proved to be a suitable procedure to solve a problem of this analytical complexity.
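As a hedged, one-dimensional analog of the Ritz procedure described above, the sketch below computes natural frequencies of a cantilever beam with an attached point mass from the generalized eigenvalue problem K q = ω² M q, using polynomial admissible functions that satisfy the clamped-end conditions. The paper itself uses products of beam functions over a 2D anisotropic plate; the EI, ρA and mass values here are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def cantilever_with_mass_freqs(EI, rhoA, L, m_attached, x_mass, n_terms=8):
    """Rayleigh-Ritz natural frequencies (rad/s) of a cantilever beam carrying
    a point mass, using admissible functions phi_i(x) = (x/L)**(i+1)."""
    idx = np.arange(1, n_terms + 1)
    K = np.empty((n_terms, n_terms))
    M = np.empty((n_terms, n_terms))
    for a, i in enumerate(idx):
        for b, j in enumerate(idx):
            # closed-form integrals of EI*phi_i''*phi_j'' and rhoA*phi_i*phi_j
            K[a, b] = EI * (i + 1) * i * (j + 1) * j / ((i + j - 1) * L**3)
            M[a, b] = (rhoA * L / (i + j + 3)
                       + m_attached * (x_mass / L)**(i + 1) * (x_mass / L)**(j + 1))
    lam = eigh(K, M, eigvals_only=True)
    return np.sqrt(lam)

# Check against the classical value 3.516*sqrt(EI/(rhoA*L^4)) with no mass
print(cantilever_with_mass_freqs(EI=1.0, rhoA=1.0, L=1.0,
                                 m_attached=0.0, x_mass=1.0)[0])   # ~3.516
print(cantilever_with_mass_freqs(EI=1.0, rhoA=1.0, L=1.0,
                                 m_attached=0.5, x_mass=1.0)[0])   # lowered by the mass
```

The qualitative effect mirrors the plate study: adding mass lowers the natural frequencies, with the largest influence on modes whose shapes are large at the attachment point.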
CIRCAL-2 - General-purpose on-line circuit design.
NASA Technical Reports Server (NTRS)
Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.
1972-01-01
CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.
Development and evaluation of the impulse transfer function technique
NASA Technical Reports Server (NTRS)
Mantus, M.
1972-01-01
The development of the test/analysis technique known as the impulse transfer function (ITF) method is discussed. This technique, when implemented with proper data processing systems, should become a valuable supplement to conventional dynamic testing and analysis procedures that will be used in the space shuttle development program. The method can relieve many of the problems associated with extensive and costly testing of the shuttle for transient loading conditions. In addition, the time history information derived from impulse testing has the potential for being used to determine modal data for the structure under investigation. The technique could be very useful in determining the time-varying modal characteristics of structures subjected to thermal transients, where conventional mode surveys are difficult to perform.
NASA Technical Reports Server (NTRS)
Blue, G. D.; Moran, C. M.
1985-01-01
Corrosion rates of 304L stainless steel coupons in MON-1 oxidizer have been measured as a function of cleaning procedures employed, surface layer positions, propellant impurity levels, and short-term exposure durations (14 to 90 days). Of special interest was propellant contamination by buildup of soluble iron, which may cause flow decay. Surface treatments employed were combinations of cleaning, pickling, and passivation procedures. Propellants used were MIL-SPEC MON-1 and several types of purified NTO (i.e., low water, low chloride) which may, at a later time, be specified as spacecraft grade. Pretest coupon surface analysis by X-ray photoelectron spectroscopy (XPS-ESCA) has revealed important differences, for the different cleaning procedures, in the make-up of the surface layer, both in composition and state of chemical combination of the elements involved. Comparisons will be made of XPS/ESCA data, for different cleaning procedures, for specimens before and after propellant exposure.
Deem, J F; Manning, W H; Knack, J V; Matesich, J S
1989-09-01
A program for the automatic extraction of jitter (PAEJ) was developed for the clinical measurement of pitch perturbations using a microcomputer. The program currently includes 12 implementations of an algorithm for marking the boundary criteria for a fundamental period of vocal fold vibration. The relative sensitivity of these extraction procedures for identifying the pitch period was compared using sine waves. Data obtained to date provide information for each procedure concerning the effects of waveform peakedness and slope, sample duration in cycles, noise level of the analysis system with both direct and tape recorded input, and the influence of interpolation. Zero crossing extraction procedures provided lower jitter values regardless of sine wave frequency or sample duration. The procedures making use of positive- or negative-going zero crossings with interpolation provided the lowest measures of jitter with the sine wave stimuli. Pilot data obtained with normal-speaking adults indicated that jitter measures varied as a function of the speaker, vowel, and sample duration.
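A hedged sketch of one of the zero-crossing period-marking strategies discussed above: positive-going zero crossings with linear interpolation, with relative jitter taken as the mean absolute difference of consecutive periods. PAEJ's twelve implementations and exact jitter formula are not reproduced; this is a generic illustration in Python.

```python
import numpy as np

def jitter_from_zero_crossings(x, fs):
    """Relative jitter (%) from positive-going zero crossings, with linear
    interpolation of the exact crossing time between samples."""
    x = np.asarray(x, float)
    idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]      # negative-to-positive steps
    t_cross = (idx + (-x[idx]) / (x[idx + 1] - x[idx])) / fs
    periods = np.diff(t_cross)                          # successive period marks
    return np.mean(np.abs(np.diff(periods))) / np.mean(periods) * 100.0

# Sanity check: a pure sine should give near-zero jitter, as in the sine-wave tests
fs = 50_000
t = np.arange(0, 1.0, 1 / fs)
print(jitter_from_zero_crossings(np.sin(2 * np.pi * 120 * t), fs))
```

The interpolation step matters: without it, crossing times are quantized to the sampling grid, inflating the jitter floor of the analysis system exactly as the abstract's comparison suggests.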
Analysis of scanner data for crop inventories
NASA Technical Reports Server (NTRS)
Horvath, R. (Principal Investigator); Cicone, R. C.; Kauth, R. J.; Malila, W. A.
1981-01-01
Progress and technical issues are reported in the development of corn/soybeans area estimation procedures for use on data from South America, with particular emphasis on Argentina. Aspects related to the supporting research section of the AgRISTARS Project discussed include: (1) multisegment corn/soybean estimation; (2) through the season separability of corn and soybeans within the U.S. corn belt; (3) TTS estimation; (4) insights derived from the baseline corn and soybean procedure; (5) small fields research; and (6) simulating the spectral appearance of wheat as a function of its growth and development. To assist the foreign commodity production forecasting, the performance of the baseline corn/soybean procedure was analyzed and the procedure modified. Fundamental limitations were found in the existing guidelines for discriminating these two crops. The temporal and spectral characteristics of corn and soybeans must be determined because other crops grow with them in Argentina. The state of software technology is assessed and the use of profile techniques for estimation is considered.
Quantum computation and analysis of Wigner and Husimi functions: toward a quantum image treatment.
Terraneo, M; Georgeot, B; Shepelyansky, D L
2005-06-01
We study the efficiency of quantum algorithms which aim at obtaining phase-space distribution functions of quantum systems. Wigner and Husimi functions are considered. Different quantum algorithms are envisioned to build these functions, and compared with the classical computation. Different procedures to extract more efficiently information from the final wave function of these algorithms are studied, including coarse-grained measurements, amplitude amplification, and measure of wavelet-transformed wave function. The algorithms are analyzed and numerically tested on a complex quantum system showing different behavior depending on parameters: namely, the kicked rotator. The results for the Wigner function show in particular that the use of the quantum wavelet transform gives a polynomial gain over classical computation. For the Husimi distribution, the gain is much larger than for the Wigner function and is larger with the help of amplitude amplification and wavelet transforms. We discuss the generalization of these results to the simulation of other quantum systems. We also apply the same set of techniques to the analysis of real images. The results show that the use of the quantum wavelet transform allows one to lower dramatically the number of measurements needed, but at the cost of a large loss of information.
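As a point of reference for the classical computation against which the quantum algorithms are compared, here is a hedged Python sketch that evaluates a Husimi distribution by direct overlap of the wave function with Gaussian coherent states (ħ = 1). The grid sizes and the harmonic-oscillator test state are illustrative, and the quantum-algorithmic speedups discussed in the abstract are of course not captured.

```python
import numpy as np

def husimi(psi, x, q_grid, p_grid, sigma=1.0):
    """Classical evaluation of Q(q,p) = |<alpha|psi>|^2 / pi by direct
    overlap with Gaussian coherent states centered at (q, p)."""
    dx = x[1] - x[0]
    Q = np.empty((len(q_grid), len(p_grid)))
    norm = (np.pi * sigma**2) ** (-0.25)
    for i, q in enumerate(q_grid):
        gauss = norm * np.exp(-(x - q) ** 2 / (2 * sigma**2))
        for j, p in enumerate(p_grid):
            alpha = gauss * np.exp(1j * p * x)          # coherent state at (q, p)
            Q[i, j] = abs(np.sum(np.conj(alpha) * psi) * dx) ** 2 / np.pi
    return Q

# Harmonic-oscillator ground state: Q should peak at the phase-space origin
x = np.linspace(-10, 10, 1024)
psi = np.pi ** (-0.25) * np.exp(-x**2 / 2)
Q = husimi(psi, x, np.linspace(-3, 3, 41), np.linspace(-3, 3, 41))
print(np.unravel_index(Q.argmax(), Q.shape))            # ~center of the grid
```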
A baseline-free procedure for transformation models under interval censorship.
Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin
2005-12-01
An important property of the Cox regression model is that estimation of the regression parameters using the partial likelihood procedure does not depend on the baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure results in a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean-zero martingale is provided.
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
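A hedged, one-dimensional sketch of the calibration-interval computation described above: fit a sensor model by nonlinear least squares, then form delta-method confidence and prediction intervals at a new applied angle. The quadratic model, noise level and data below are hypothetical stand-ins for the report's multivariate attitude-sensor models.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy import stats

def model(angle, a0, a1, a2):
    # Hypothetical sensor response: quadratic in applied pitch angle
    return a0 + a1 * angle + a2 * angle**2

# Hypothetical replicated calibration data (3 replicates per applied angle)
angle = np.repeat(np.linspace(-10, 10, 11), 3)
rng = np.random.default_rng(0)
output = model(angle, 0.1, 2.0, 0.01) + rng.normal(0, 0.05, angle.size)

popt, pcov = curve_fit(model, angle, output)
resid = output - model(angle, *popt)
dof = angle.size - len(popt)
s2 = resid @ resid / dof                               # residual variance

# Delta-method intervals at a new applied angle
a_new, eps = 5.0, 1e-6
g = np.array([(model(a_new, *(popt + eps * np.eye(3)[k]))
               - model(a_new, *popt)) / eps for k in range(3)])  # gradient wrt params
var_fit = g @ pcov @ g
t = stats.t.ppf(0.975, dof)
conf = t * np.sqrt(var_fit)               # confidence interval on the mean response
pred = t * np.sqrt(var_fit + s2)          # prediction interval for a new reading
print(f"y({a_new}) = {model(a_new, *popt):.3f} +/- {conf:.3f} (conf), +/- {pred:.3f} (pred)")
```

The replication emphasized in the abstract is what makes the residual variance, and hence the prediction interval, an honest estimate rather than an artifact of a single calibration run.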
Gonzalez, Karla; Ulloa, Jesus G; Moreno, Gerardo; Echeverría, Oscar; Norris, Keith; Talamantes, Efrain
2017-10-23
Latinos in the U.S. are almost twice as likely to progress to End Stage Renal Disease (ESRD) compared to non-Latino whites. Patients with ESRD on dialysis experience high morbidity, premature mortality and receive intensive procedures at the end of life (EOL). This study explores intensive procedure preferences at the EOL in older Latino adults. Seventy-three community-dwelling Spanish- and English-speaking Latinos over the age of 60, with and without ESRD, participated in this study. Those without ESRD (n = 47) participated in one of five focus group sessions, and those with ESRD on dialysis (n = 26) participated in one-on-one semi-structured interviews. Focus group and individual participants answered questions regarding intensive procedures at the EOL. Recurring themes were identified using standard qualitative content-analysis methods. Participants also completed a brief survey that included demographics, language preference, health insurance coverage, co-morbidities, Emergency Department visits and functional limitations. The majority of participants were of Mexican origin with a mean age of 70, and there were more female participants in the non-ESRD group compared to the ESRD dialysis-dependent group. The dialysis group reported a higher number of co-morbidities and functional limitations. Nearly 69% of those in the dialysis group reported one or more Emergency Department visits in the past year, compared to 38% in the non-ESRD group. Primary themes centered on: 1) the acceptability of a "natural" versus "invasive" procedure; 2) cultural traditions and family involvement; and 3) level of trust in physicians and autonomy in decision-making. Our results highlight the need for improved patient- and family-centered approaches to better understand intensive procedure preferences at the EOL in this underserved population of older adults.
Yao, Hong; You, Zhen; Liu, Bo
2016-01-22
The number of surface water pollution accidents (abbreviated as SWPAs) has increased substantially in China in recent years. Estimation of economic losses due to SWPAs has been a focus of attention in China and is mentioned many times in the Environmental Protection Law of China promulgated in 2014. From the perspective of water bodies' functions, pollution accident damages can be divided into eight types: damage to human health, water supply suspension, fishery, recreational functions, biological diversity, environmental property loss, the accident's origin and other indirect losses. In the valuation of damage to people's life, the procedure for compensation of traffic accidents in China was used. The functional replacement cost method was used in economic estimation of the losses due to water supply suspension and loss of water's recreational functions. Damage to biological diversity was estimated by recovery cost analysis, and damage to environmental property was calculated using pollutant removal costs. As a case study, using the proposed calculation procedure, the economic losses caused by the major Songhuajiang River pollution accident that happened in China in 2005 have been estimated at 2263 billion CNY. The estimated economic losses for real accidents can sometimes be influenced by social and political factors, such as data authenticity and accuracy. Besides, one or more aspects of the method might be overestimated, underrated or even ignored. The proposed procedure may be used by decision makers for the economic estimation of losses in SWPAs. Estimates of the economic losses of pollution accidents could help quantify potential costs associated with increased risk sources along lakes/rivers but, more importantly, highlight the value of clean water to society as a whole.
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetic properties, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail, and the characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds, and for the immobilization of enzymes, were described. Finally, existing problems and possible future trends of magnetic separation techniques for biological analysis were discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
SASS wind ambiguity removal by direct minimization. [Seasat-A satellite scatterometer
NASA Technical Reports Server (NTRS)
Hoffman, R. N.
1982-01-01
An objective analysis procedure is presented which combines Seasat-A satellite scatterometer (SASS) data with other available data on wind speeds by minimizing an objective function of gridded wind speed values. The functions are defined as the loss functions for the SASS velocity data, the forecast, the SASS velocity magnitude data, and conventional wind speed data. Only aliases closest to the analysis were included, and a method for improving the first guess while using a minimization technique and slowly changing the parameters of the problem is introduced. The model is employed to predict the wind field for the North Atlantic on Sept. 10, 1978. Dealiased SASS data is compared with available ship readings, showing good agreement between the SASS dealiased winds and the winds measured at the surface. Expansion of the model to take in low-level cloud measurements, pressure data, and convergence and cloud level data correlations is discussed.
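A schematic sketch of the variational idea described above: minimize an objective function over gridded wind components that penalizes both departure from a forecast background and mismatch with magnitude-only (scatterometer-like) speed observations. SASS alias handling and the full set of loss terms are omitted; the grid, error variances and values below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def analyze_wind(u_bg, speed_obs, sigma_bg=2.0, sigma_obs=1.0):
    """Minimal 1-D variational analysis: gridded wind components u that stay
    close to the background while matching observed wind speeds."""
    def J(u):
        uu = u.reshape(-1, 2)
        jb = np.sum((uu - u_bg) ** 2) / sigma_bg**2           # background term
        speed = np.hypot(uu[:, 0], uu[:, 1])
        jo = np.sum((speed - speed_obs) ** 2) / sigma_obs**2  # speed-obs term
        return jb + jo

    res = minimize(J, u_bg.ravel(), method="L-BFGS-B")
    return res.x.reshape(-1, 2)

# Hypothetical 4-point grid: background wind vectors and observed speeds
u_bg = np.array([[3.0, 1.0], [4.0, -2.0], [0.5, 5.0], [-3.0, 2.0]])
speed_obs = np.array([4.0, 5.0, 4.5, 3.0])
print(analyze_wind(u_bg, speed_obs))
```

Because only the speed is observed, the background supplies the directional information, which is essentially how the dealiasing in the abstract resolves the SASS ambiguity.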
Hofer, Stefan O.P.; Payne, Caroline E.
2010-01-01
The foundation of head and neck reconstruction is based on two pillars: the restoration of function and the restoration of aesthetics. The objective of this article is to provide insight into how to prevent undesirable functional and aesthetic outcome after the initial procedure and also to provide solutions for enhancement of functional and aesthetic outcome with secondary procedures. Functional and aesthetic outcome enhancement is discussed in relation to the individual structures within the oral cavity, for the mandible, and for facial reconstruction. Normal prerequisites for all individual structures are described, and key points for restoration of these functional and aesthetic issues are proposed. In addition, further suggestions to improve suboptimal results after initial reconstructive surgery are presented. Understanding the function and aesthetics of the area to be reconstructed will allow appropriate planning and management of the initial reconstruction. Secondary enhancement should be attainable by minor procedures rather than a requirement to redo the initial reconstruction. PMID:22550452
NASA Astrophysics Data System (ADS)
Domino, Krzysztof; Błachowicz, Tomasz
2014-11-01
In our work, copula functions and the Hurst exponent calculated using local Detrended Fluctuation Analysis (DFA) were used to investigate the risk of investments in shares traded on the Warsaw Stock Exchange. The combination of copula functions and the Hurst exponent calculated using local DFA is a new approach. For the copula function analysis, bivariate variables composed of the share prices of the PEKAO bank (a big bank with high capitalization) and other banks (PKOBP, BZ WBK, MBANK and HANDLOWY, in decreasing capitalization order) and of companies from other branches (KGHM, mining industry; PKNORLEN, petrol industry; ASSECO, software industry) were used. Hurst exponents were calculated for daily share prices and used to predict large drops in those prices. The Hurst exponent appeared to be a valuable indicator in the copula selection procedure, since its low values pointed to heavily tailed copulas, e.g. the Clayton copula.
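A hedged sketch of the DFA estimator of the Hurst exponent underlying the analysis above; the paper computes local Hurst exponents in moving windows over daily share prices, which would amount to applying a function like this to successive data windows. The scale choices and linear detrending order are conventional assumptions.

```python
import numpy as np

def dfa_hurst(x, scales=None, order=1):
    """Hurst exponent by Detrended Fluctuation Analysis: slope of log F(s)
    vs log s, where F(s) is the RMS of polynomially detrended segments of
    the integrated series."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8),
                                       np.log10(len(x) // 4), 12).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
              for seg in segs]                        # detrended segment variance
        F.append(np.sqrt(np.mean(ms)))
    h, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return h

rng = np.random.default_rng(0)
print(dfa_hurst(rng.normal(size=4000)))              # ~0.5 for white-noise increments
```

Values well below 0.5 indicate anti-persistent behavior, which is the regime the authors associate with imminent price drops and heavy-tailed copulas.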
Inference in randomized trials with death and missingness.
Wang, Chenguang; Scharfstein, Daniel O; Colantuoni, Elizabeth; Girard, Timothy D; Yan, Ying
2017-06-01
In randomized studies involving severely ill patients, functional outcomes are often unobserved due to missed clinic visits, premature withdrawal, or death. It is well known that if these unobserved functional outcomes are not handled properly, biased treatment comparisons can be produced. In this article, we propose a procedure for comparing treatments that is based on a composite endpoint that combines information on both the functional outcome and survival. We further propose a missing data imputation scheme and sensitivity analysis strategy to handle the unobserved functional outcomes not due to death. Illustrations of the proposed method are given by analyzing data from a recent non-small cell lung cancer clinical trial and a recent trial of sedation interruption among mechanically ventilated patients. © 2016, The International Biometric Society.
Panagopoulos, G P; Angelopoulou, D; Tzirtzilakis, E E; Giannoulopoulos, P
2016-10-01
This paper presents an innovative method for the discrimination of groundwater samples into common groups representing the hydrogeological units from which they have been pumped. The method proved very efficient even in areas with complex hydrogeological regimes. It requires chemical analyses of water samples only for major ions, meaning that it is applicable to most cases worldwide. Another benefit of the method is that it gives further insight into the aquifer hydrogeochemistry, as it provides the ions that are responsible for the discrimination of each group. The procedure begins with cluster analysis of the dataset in order to classify the samples into the corresponding hydrogeological units. The feasibility of the method is demonstrated by the fact that the samples of volcanic origin were separated into two different clusters, namely the lava units and the pyroclastic-ignimbritic aquifer. The second step is discriminant analysis of the data, which provides the functions that distinguish the groups from each other and the most significant variables that define the hydrochemical composition of each aquifer. The whole procedure was highly successful, as 94.7% of the samples were classified to the correct aquifer system. Finally, the resulting functions can be safely used to categorize samples of either unknown or doubtful origin, thus improving the quality and the size of existing hydrochemical databases.
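A hedged sketch of the two-step procedure described above, using scikit-learn: hierarchical clustering of standardized major-ion data into candidate units, followed by linear discriminant analysis to obtain the discriminating functions and the ions that drive them. The ion values below are synthetic stand-ins for real analyses.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical major-ion analyses (columns: Ca, Mg, Na, K, HCO3, SO4, Cl in meq/L)
rng = np.random.default_rng(0)
lava = rng.normal([2.0, 1.0, 3.0, 0.3, 3.5, 0.8, 2.5], 0.3, size=(20, 7))
pyro = rng.normal([1.0, 0.5, 5.0, 0.6, 2.0, 0.4, 4.5], 0.3, size=(20, 7))
X = StandardScaler().fit_transform(np.vstack([lava, pyro]))

# Step 1: cluster the samples into candidate hydrogeological units
labels = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(X)

# Step 2: discriminant functions separating the units; the coefficients show
# which ions drive the discrimination, and the model can classify new samples
lda = LinearDiscriminantAnalysis().fit(X, labels)
print("resubstitution accuracy:", lda.score(X, labels))
print("discriminant coefficients:", lda.coef_.round(2))
```

The resubstitution accuracy printed here plays the role of the 94.7% correct-classification rate quoted in the abstract, and `lda.predict` would categorize samples of unknown origin.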
A Multi-Resolution Nonlinear Mapping Technique for Design and Analysis Applications
NASA Technical Reports Server (NTRS)
Phan, Minh Q.
1998-01-01
This report describes a nonlinear mapping technique where the unknown static or dynamic system is approximated by a sum of dimensionally increasing functions (one-dimensional curves, two-dimensional surfaces, etc.). These lower dimensional functions are synthesized from a set of multi-resolution basis functions, where the resolutions specify the level of details at which the nonlinear system is approximated. The basis functions also cause the parameter estimation step to become linear. This feature is taken advantage of to derive a systematic procedure to determine and eliminate basis functions that are less significant for the particular system under identification. The number of unknown parameters that must be estimated is thus reduced and compact models obtained. The lower dimensional functions (identified curves and surfaces) permit a kind of "visualization" into the complexity of the nonlinearity itself.
Allanson, Paul; Petrie, Dennis
2013-01-01
The usual starting point for understanding changes in income-related health inequality (IRHI) over time has been regression-based decomposition procedures for the health concentration index. However the reliance on repeated cross-sectional analysis for this purpose prevents both the appropriate specification of the health function as a dynamic model and the identification of important determinants of the transition processes underlying IRHI changes such as those relating to mortality. This paper overcomes these limitations by developing alternative longitudinal procedures to analyse the role of health determinants in driving changes in IRHI through both morbidity changes and mortality, with our dynamic modelling framework also serving to identify their contribution to long-run or structural IRHI. The approach is illustrated by an empirical analysis of the causes of the increase in IRHI in Great Britain between 1999 and 2004. PMID:24036199
NASA Technical Reports Server (NTRS)
Ravat, Dhananjay; Hinze, William J.
1991-01-01
Analysis of the total magnetic intensity MAGSAT data has identified and characterized the variability of ionospheric current effects as reflected in the geomagnetic field as a function of longitude, elevation, and time (daily as well as monthly variations). This analysis verifies previous observations in POGO data and provides important boundary conditions for theoretical studies of ionospheric currents. Furthermore, the observations have led to a procedure to remove these temporal perturbations from lithospheric MAGSAT magnetic anomaly data based on 'along-the-dip-latitude' averages from dawn and dusk data sets grouped according to longitudes, time (months), and elevation. Using this method, high-resolution lithospheric magnetic anomaly maps have been prepared of the earth over a plus or minus 50 deg latitude band. These maps have proven useful in the study of the structures, nature, and processes of the lithosphere.
Identifying fMRI Model Violations with Lagrange Multiplier Tests
Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor
2013-01-01
The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665
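A hedged scalar illustration of the LM-testing logic described above: after fitting a linear model, regress its residuals on the original regressors augmented with the squared fitted values and refer LM = nR² to a χ² distribution. The paper's voxelwise fMRI tests (non-stationarity, hemodynamic-response specification) use the same principle with different augmentations; this generic nonlinearity check is only an analogy.

```python
import numpy as np
from scipy import stats

def lm_nonlinearity_test(y, X):
    """Lagrange multiplier test for neglected nonlinearity: regress the
    residuals of the linear fit on the regressors plus the squared fitted
    values; LM = n * R^2 of that auxiliary regression ~ chi2(1) under H0."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    fitted = X1 @ beta
    e = y - fitted
    Xa = np.column_stack([X1, fitted**2])             # augmented regressors
    ga, *_ = np.linalg.lstsq(Xa, e, rcond=None)
    ea = e - Xa @ ga
    r2 = 1.0 - (ea @ ea) / (e @ e)
    lm = len(y) * r2
    return lm, stats.chi2.sf(lm, df=1)

rng = np.random.default_rng(0)
x = rng.normal(size=(300, 1))
y_lin = 1.0 + 2.0 * x[:, 0] + rng.normal(0, 1, 300)
y_nl = 1.0 + 2.0 * x[:, 0] + 1.5 * x[:, 0] ** 2 + rng.normal(0, 1, 300)
print(lm_nonlinearity_test(y_lin, x))   # large p-value: no violation detected
print(lm_nonlinearity_test(y_nl, x))    # small p-value: nonlinearity flagged
```

Computed voxel by voxel, such statistics are exactly what would populate the spatial anomaly maps the abstract describes.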
Generalized recursive solutions to Ornstein-Zernike integral equations
NASA Astrophysics Data System (ADS)
Rossky, Peter J.; Dale, William D. T.
1980-09-01
Recursive procedures for the solution of a class of integral equations based on the Ornstein-Zernike equation are developed; the hypernetted chain and Percus-Yevick equations are two special cases of the class considered. It is shown that certain variants of the new procedures developed here are formally equivalent to those recently developed by Dale and Friedman, if the new recursive expressions are initialized in the same way as theirs. However, the computational solution of the new equations is significantly more efficient. Further, the present analysis leads to the identification of various graphical quantities arising in the earlier study with more familiar quantities related to pair correlation functions. The analysis is greatly facilitated by the use of several identities relating simple chain sums whose graphical elements can be written as a sum of two or more parts. In particular, the use of these identities permits renormalization of the equivalent series solution to the integral equation to be directly incorporated into the recursive solution in a straightforward manner. Formulas appropriate to renormalization with respect to long and short range parts of the pair potential, as well as more general components of the direct correlation function, are obtained. To further illustrate the utility of this approach, we show that a simple generalization of the hypernetted chain closure relation for the direct correlation function leads directly to the reference hypernetted chain (RHNC) equation due to Lado. The form of the correlation function used in the exponential approximation of Andersen and Chandler is then seen to be equivalent to the first estimate obtained from a renormalized RHNC equation.
Kepler AutoRegressive Planet Search
NASA Astrophysics Data System (ADS)
Feigelson, Eric
NASA's Kepler mission is the source of more exoplanets than any other instrument, but the discovery depends on complex statistical analysis procedures embedded in the Kepler pipeline. A particular challenge is mitigating irregular stellar variability without loss of sensitivity to faint periodic planetary transits. This proposal presents a two-stage alternative analysis procedure. First, parametric autoregressive ARFIMA models, commonly used in econometrics, remove most of the stellar variations. Second, a novel matched filter is used to create a periodogram from which transit-like periodicities are identified. This analysis procedure, the Kepler AutoRegressive Planet Search (KARPS), is confirming most of the Kepler Objects of Interest and is expected to identify additional planetary candidates. The proposed research will complete application of the KARPS methodology to the prime Kepler mission light curves of 200,000 stars, and compare the results with Kepler Objects of Interest obtained with the Kepler pipeline. We will then conduct a variety of astronomical studies based on the KARPS results. Important subsamples will be extracted, including Habitable Zone planets, hot super-Earths, grazing-transit hot Jupiters, and multi-planet systems. Ground-based spectroscopy of poorly studied candidates will be performed to better characterize the host stars. Studies of stellar variability will then be pursued based on KARPS analysis. The autocorrelation function and nonstationarity measures will be used to identify spotted stars at different stages of autoregressive modeling. Periodic variables with folded light curves inconsistent with planetary transits will be identified; they may be eclipsing or mutually illuminating binary star systems. Classification of stellar variables with KARPS-derived statistical properties will be attempted. KARPS procedures will then be applied to archived K2 data to identify planetary transits and characterize stellar variability.
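A hedged sketch of the two-stage idea: prewhiten a light curve with an autoregressive model, then search the residuals for periodicities. For simplicity this uses a plain AR(2) fit (statsmodels) and an ordinary periodogram rather than KARPS's ARFIMA models and matched filter; the light curve below is synthetic.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from scipy.signal import periodogram

rng = np.random.default_rng(0)
n, cadence = 2000, 0.02                      # sampling interval in days
t = np.arange(n) * cadence

# Synthetic light curve: autocorrelated stellar variability plus box transits
star = np.zeros(n)
for k in range(1, n):
    star[k] = 0.97 * star[k - 1] + rng.normal(0, 1e-4)
transit = -5e-4 * ((t % 3.7) < 0.1)          # crude box transit, 3.7-day period
flux = 1.0 + star + transit + rng.normal(0, 5e-5, n)

# Stage 1: remove stellar variability with an autoregressive model
res = ARIMA(flux, order=(2, 0, 0)).fit()
resid = res.resid

# Stage 2: search the whitened residuals for periodicities
freq, power = periodogram(resid, fs=1.0 / cadence)
print("strongest residual period (days):", 1.0 / freq[1:][np.argmax(power[1:])])
```

The prewhitening step is what protects the periodicity search from the red noise of spotted, variable stars, which is the core difficulty the abstract identifies.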
Chung, Hyemoon; Jeon, Byunghwan; Chang, Hyuk-Jae; Han, Dongjin; Shim, Hackjoon; Cho, In Jeong; Shim, Chi Young; Hong, Geu-Ru; Kim, Jung-Sun; Jang, Yangsoo; Chung, Namsik
2015-12-01
After left atrial appendage (LAA) device closure, peri-device leakage into the LAA may persist due to incomplete occlusion. We hypothesized that pre-procedural three-dimensional (3D) geometric analysis of the interatrial septum (IAS) and LAA orifice can predict this leakage. We investigated the predictive parameters of LAA device closure obtained from baseline cardiac computerized tomography (CT) using a novel 3D analysis system. We conducted a retrospective study of 22 patients who underwent LAA device closure. We defined peri-device leakage as the presence of a Doppler signal inside the LAA after device deployment (group 2, n = 5), compared with patients without peri-device leakage (group 1, n = 17). Conventional parameters were measured by cardiac CT. Angles θ and φ were defined between the IAS plane and the line linking the LAA orifice center and the foramen ovale. Group 2 exhibited significantly better left atrial (LA) function than group 1 (p = 0.031). Pre-procedural θ was also larger in this group (41.9° vs. 52.3°, p = 0.019). The LAA cauliflower-type morphology was more common in group 2. Overall, the patients' LA reserve significantly decreased after the procedure (21.7 mm³ vs. 17.8 mm³, p = 0.035). However, we observed no significant interval changes in pre- and post-procedural values of θ and φ in either group (all p > 0.05). The angle between the IAS and the LAA orifice might be a novel anatomical parameter for predicting peri-device leakage after LAA device closure. In addition, 3D CT analysis of the LA and LAA orifice could be used to identify clinically favorable candidates for LAA device closure.
Functional Impressions in Complete Denture and Overdenture Treatment
Kršek, Hrvoje
2015-01-01
Tooth loss can cause loss of occlusal, masticatory, esthetic, physiognomic, phonetic and psychosocial function. The most frequently used treatments for completely edentulous patients and for patients with a small number of remaining teeth are complete dentures or overdentures. One of the most important clinical and laboratory procedures in their fabrication is taking the functional impression. The aim of this paper was to present procedures for taking functional impressions in the fabrication of complete dentures and overdentures, using standardized techniques and materials. An accurate functional impression, together with other correctly performed clinical and laboratory procedures, ensures good retention and stability of dentures, which is a precondition for restoring the patient's lost functions. PMID:27688385
Representing Operational Modes for Situation Awareness
NASA Astrophysics Data System (ADS)
Kirchhübel, Denis; Lind, Morten; Ravn, Ole
2017-01-01
Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data, as well as operational knowledge about the state and design of the plant, are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator's task easier, but they require knowledge about the overall system in the form of a model. While fault-tolerant control design tools based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering interactions at the plant-wide level. Alarm systems meant to support human operators in diagnosing the plant-wide situation, on the other hand, regularly fail in situations where these system interactions lead to many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way to reduce the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through the interconnected systems can be inferred, and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper, a consistent representation of possible configurations is deduced from the functional-model analysis of an exemplary start-up procedure. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of a diagnostics tool based on functional models. This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
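Among the informal methods listed, GLUE is the most easily sketched. The following hedged Python toy samples parameters, scores each run with a Nash-Sutcliffe "informal likelihood", retains behavioral runs above a threshold, and forms likelihood-weighted 5-95% prediction limits. A linear model stands in for HYMOD, and the threshold and sample sizes are arbitrary assumptions; the sketch also assumes at least some runs pass the threshold.

```python
import numpy as np

def glue(model, prior_sampler, obs, n_samples=5000, threshold=0.3, rng=None):
    """Minimal GLUE sketch: Monte Carlo parameter sampling, an informal
    likelihood (Nash-Sutcliffe efficiency), behavioral selection, and
    likelihood-weighted prediction limits."""
    rng = np.random.default_rng(rng)
    sims, likes = [], []
    for _ in range(n_samples):
        sim = model(prior_sampler(rng))
        nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
        if nse > threshold:                      # behavioral solution
            sims.append(sim)
            likes.append(nse)
    sims, likes = np.array(sims), np.array(likes)
    w = likes / likes.sum()
    lower, upper = [], []
    for tstep in range(obs.size):                # weighted 5%/95% quantiles
        srt = np.argsort(sims[:, tstep])
        cw = np.cumsum(w[srt])
        lower.append(sims[srt, tstep][np.searchsorted(cw, 0.05)])
        upper.append(sims[srt, tstep][np.searchsorted(cw, 0.95)])
    return np.array(lower), np.array(upper), len(likes)

# Toy linear "model" standing in for a rainfall-runoff simulator like HYMOD
x = np.linspace(0, 1, 50)
obs = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0, 0.1, 50)
lo, hi, n_beh = glue(lambda th: th[0] * x + th[1],
                     lambda rng: rng.uniform([0, 0], [4, 2]), obs)
print(n_beh, "behavioral runs; coverage:", np.mean((obs >= lo) & (obs <= hi)))
```

The comparison the study proposes, against formal Bayesian MCMC, would then contrast these limits and their computational cost with posterior predictive intervals from a formal likelihood.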
Contextual control using a go/no-go procedure with compound abstract stimuli.
Modenesi, Rafael Diego; Debert, Paula
2015-05-01
Contextual control has been described as (1) a five-term contingency, in which the contextual stimulus exerts conditional control over conditional discriminations, and (2) allowing one stimulus to be a member of different equivalence classes without merging them into one. Matching-to-sample is the most commonly employed procedure to produce and study contextual control. The present study evaluated whether the go/no-go procedure with compound stimuli produces equivalence classes that share stimuli. This procedure does not allow the identification of specific stimulus functions (e.g., contextual, conditional, or discriminative functions). If equivalence classes were established with this procedure, then only the latter part of the contextual control definition (2) would be met. Six undergraduate students participated in the present study. In the training phases, responses to AC, BD, and XY compounds with stimuli from the same classes were reinforced, and responses to AC, BD, and XY compounds with stimuli from different classes were not. In addition, responses to X1A1B1, X1A2B2, X2A1B2, and X2A2B1 compounds were reinforced and responses to the other combinations were not. During the tests, the participants had to respond to new combinations of stimuli compounds YCD to indicate the formation of four equivalence classes that share stimuli: X1A1B1Y1C1D1, X1A2B2Y1C2D2, X2A1B2Y2C1D2, and X2A2B1Y2C2D1. Four of the six participants showed the establishment of these classes. These results indicate that establishing contextual stimulus functions is unnecessary to produce equivalence classes that share stimuli. Therefore, these results are inconsistent with the first part of the definition of contextual control. © Society for the Experimental Analysis of Behavior.
Chodór, Piotr; Wilczek, Krzysztof; Zielińska, Teresa; Przybylski, Roman; Głowacki, Jan; Włoch, Łukasz; Zembala, Marian; Kalarus, Zbigniew
2017-01-01
Transcatheter aortic valve implantation (TAVI) is presently a recognized treatment modality for patients with severe aortic stenosis ineligible for surgery. It reduces mortality as compared to conservative treatment. This therapy is further expected to improve quality of life by improving cardiovascular function. The aim of this study was to compare patients' cardiovascular system efficiency in the 6-minute walk test (6MWT) performed before and after TAVI and at the 6-12-month follow-up. From January 2009 until February 2012, in the Silesian Center for Heart Diseases in Zabrze, TAVI was performed in 104 patients. Eighty-two patients who underwent the 6MWT before surgery were qualified for the analysis. The average age of the patients was 76.0 ± 9.17 years; women comprised 45.1%. The risk of surgical treatment according to the Logistic EuroSCORE averaged 22.76 ± 12.63%, and by the Society of Thoracic Surgeons score, 5.55 ± 3.34%. The 6MWT was performed within 1 month before the TAVI procedure, up to a month after the procedure, and during the 6-12-month follow-up. The 6-minute walk test after TAVI was performed by 64 patients, and at the 6-12-month follow-up by 46 patients. The average distance in the 6MWT increased from 268.4 ± 89.0 m before treatment to 290.0 ± 98.2 m after the procedure (p = 0.008), and from 276.1 ± 93.5 m to 343.1 ± 96.7 m after 6-12 months (p < 0.0001). Transcatheter aortic valve implantation procedures significantly improve function of the cardiovascular system as evaluated by the 6MWT at 1- and 6-12-month observations. (Cardiol J 2017; 24, 2: 167-175)
76 FR 78015 - Revised Analysis and Mapping Procedures for Non-Accredited Levees
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-15
Revised Analysis and Mapping Procedures for Non-Accredited Levees. AGENCY: Federal Emergency Management Agency. … comments on the proposed solution for Revised Analysis and Mapping Procedures for Non-Accredited Levees. This document proposes a revised procedure for the analysis and mapping of non-accredited levees on…
Crustal Structure Beneath Taiwan Using Frequency-band Inversion of Receiver Function Waveforms
NASA Astrophysics Data System (ADS)
Tomfohrde, D. A.; Nowack, R. L.
Receiver function analysis is used to determine local crustal structure beneath Taiwan. We have performed preliminary data processing and polarization analysis for the selection of stations and events and to increase overall data quality. Receiver function analysis is then applied to data from the Taiwan Seismic Network to obtain radial and transverse receiver functions. Due to the limited azimuthal coverage, only the radial receiver functions are analyzed, in terms of horizontally layered crustal structure for each station. In order to improve convergence of the receiver function inversion, frequency-band inversion (FBI) is implemented, in which an iterative inversion procedure with sequentially higher low-pass corner frequencies is used to stabilize the waveform inversion. Frequency-band inversion is applied to receiver functions at six stations of the Taiwan Seismic Network. Initial 20-layer crustal models are inverted for, using prior tomographic results for the initial models. The resulting 20-layer models are then simplified to 4- to 5-layer models and input into an alternating depth and velocity frequency-band inversion. For the six stations investigated, the resulting simplified models give an average crustal thickness estimate of 38 km surrounding the Central Range of Taiwan. The individual station estimates also compare well with recent tomographic models and with the refraction results of Rau and Wu (1995) and of Ma and Song (1997).
Code of Federal Regulations, 2013 CFR
2013-10-01
Contractor Employees Performing Acquisition Functions, 3.1103 Procedures. (a) By use of the contract clause at… employees perform acquisition functions closely associated with inherently Government functions to: (1) Have…; … information accessed through performance of a Government contract for personal gain; and (iii) Obtain a signed…
Possibilities of fractal analysis of the competitive dynamics: Approaches and procedures
NASA Astrophysics Data System (ADS)
Zagornaya, T. O.; Medvedeva, M. A.; Panova, V. L.; Isaichik, K. F.; Medvedev, A. N.
2017-11-01
The possibilities of the fractal approach are used to study the non-linear nature of the competitive dynamics of the market of trading intermediaries. Based on a statistical study of retail indicators in the region, an approach to analyzing the characteristics of the competitive behavior of market participants is developed. The authors postulate principles for studying the dynamics of competition as a result of changes in the characteristics of the vector and the competitive behavior of market agents.
Post flight analysis of NASA standard star trackers recovered from the solar maximum mission
NASA Technical Reports Server (NTRS)
Newman, P.
1985-01-01
The flight hardware returned after the Solar Maximum Mission Repair Mission was analyzed to determine the effects of 4 years in space. The NASA Standard Star Tracker would be a good candidate for such analysis because it is moderately complex and had a very elaborate calibration during the acceptance procedure. However, the recovery process extensively damaged the cathode of the image dissector detector, making proper operation of the tracker and a comparison with preflight characteristics impossible. Otherwise, the tracker functioned nominally during testing.
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
The effects of the Cox maze procedure on atrial function
Voeller, Rochus K.; Zierer, Andreas; Lall, Shelly C.; Sakamoto, Shun-ichiro; Chang, Nai-Lun; Schuessler, Richard B.; Moon, Marc R.; Damiano, Ralph J.
2010-01-01
Objective The effects of the Cox maze procedure on atrial function remain poorly defined. The purpose of this study was to investigate the effects of a modified Cox maze procedure on left and right atrial function in a porcine model. Methods After cardiac magnetic resonance imaging, 6 pigs underwent pericardiotomy (sham group), and 6 pigs underwent a modified Cox maze procedure (maze group) with bipolar radiofrequency ablation. The maze group had preablation and immediate postablation left and right atrial pressure-volume relations measured with conductance catheters. All pigs survived for 30 days. Magnetic resonance imaging was then repeated for both groups, and conductance catheter measurements were repeated for the right atrium in the maze group. Results Both groups had significantly higher left atrial volumes postoperatively. Magnetic resonance imaging-derived reservoir and booster pump functional parameters were reduced postoperatively for both groups, but there was no difference in these parameters between the groups. The maze group showed a significantly greater reduction in medial and lateral left atrial wall contraction postoperatively. There was no change in immediate left atrial elastance or in the early and 30-day right atrial elastance after the Cox maze procedure. Although the initial left atrial stiffness increased after ablation, right atrial diastolic stiffness did not change initially or at 30 days. Conclusions Performing a pericardiotomy alone had a significant effect on atrial function that can be quantified by means of magnetic resonance imaging. The effects of the Cox maze procedure on left atrial function could only be detected by analyzing segmental wall motion. Understanding the precise physiologic effects of the Cox maze procedure on atrial function will help in developing less-damaging lesion sets for the surgical treatment of atrial fibrillation. PMID:19026812
An analysis of the DuPage County Regional Office of Education physics exam
NASA Astrophysics Data System (ADS)
Muehsler, Hans
In 2009, the DuPage County Regional Office of Education (ROE) tasked volunteer physics teachers with creating a basic skills physics exam reflecting what the participants valued and shared in common across curricula. Mechanics, electricity & magnetism (E&M), and wave phenomena emerged as the primary constructs. The resulting exam was intended for first-exposure physics students. The most recently completed version was psychometrically assessed for reliability and for unidimensionality within the constructs using a robust weighted least squares (WLS) structural equation model. An item analysis using a 3-PL IRT model was performed on the mechanics items and a 2-PL IRT model was performed on the E&M and waves items; a distractor analysis was also performed on all items. Lastly, differential item functioning (DIF) and differential test functioning (DTF) analyses, using the Mantel-Haenszel procedure, were performed using gender, ethnicity, year in school, ELL status, physics level, and math level as groupings.
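The Mantel-Haenszel DIF statistic used in such analyses reduces to a common odds ratio pooled over score strata. A minimal sketch, assuming `correct` is a boolean item-score array and `group` a 0/1 reference/focal indicator:

```python
import numpy as np

def mantel_haenszel_odds_ratio(correct, group, total_score):
    """Pool 2x2 tables (group x correct) over total-score strata; an
    alpha_MH far from 1 flags an item for potential DIF."""
    num = den = 0.0
    for s in np.unique(total_score):
        m = total_score == s
        a = np.sum(correct[m] & (group[m] == 0))    # reference, correct
        b = np.sum(~correct[m] & (group[m] == 0))   # reference, incorrect
        c = np.sum(correct[m] & (group[m] == 1))    # focal, correct
        d = np.sum(~correct[m] & (group[m] == 1))   # focal, incorrect
        n = a + b + c + d
        if n > 0:
            num += a * d / n
            den += b * c / n
    return num / den if den > 0 else np.nan
```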
Quantitative architectural analysis: a new approach to cortical mapping.
Schleicher, A; Palomero-Gallagher, N; Morosan, P; Eickhoff, S B; Kowalski, T; de Vos, K; Amunts, K; Zilles, K
2005-12-01
Recent progress in anatomical and functional MRI has revived the demand for a reliable, topographic map of the human cerebral cortex. To date, interpretations of specific activations found in functional imaging studies, and their topographical analysis in a spatial reference system, are often still based on classical architectonic maps. The most commonly used reference atlas is that of Brodmann and his successors, despite its severe inherent drawbacks. One obvious weakness of traditional architectural mapping is the subjective nature of localising borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, more objective, quantitative mapping procedures have been established in recent years. The quantification of the neocortical laminar pattern by defining intensity line profiles across the cortical layers has a long tradition. In recent years, this method has been extended to enable a reliable, reproducible mapping of the cortex based on image analysis and multivariate statistics. Methodological approaches to such algorithm-based cortical mapping have been published for various architectural modalities. In our contribution, principles of algorithm-based mapping are described for cyto- and receptorarchitecture. In a cytoarchitectural parcellation of the human auditory cortex using a sliding window procedure, the classical areal pattern of the human superior temporal gyrus was modified by replacing Brodmann's areas 41, 42, 22 and parts of area 21 with a novel, more detailed map. An extension and optimisation of the sliding window procedure to the specific requirements of receptorarchitectonic mapping is also described, using the macaque central sulcus and adjacent superior parietal lobule as a second, biologically independent example. Algorithm-based mapping procedures, however, are not limited to these two architectural modalities, but can be applied to all images in which a laminar cortical pattern can be detected and quantified, e.g., myeloarchitecture and in vivo high-resolution MR imaging. Defining cortical borders based on changes in cortical lamination in high-resolution, in vivo structural MR images will result in a rapid increase of our knowledge on the structural parcellation of the human cerebral cortex.
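The core of the sliding window procedure is a distance function along the cortical ribbon: adjacent blocks of laminar intensity profiles are compared, and peaks in the distance suggest architectonic borders. A simplified sketch under stated assumptions (the published method derives feature vectors from profile shape descriptors; here raw profile features stand in):

```python
import numpy as np

def sliding_window_distance(profiles, width=10):
    """profiles: (n_positions, n_features) laminar profile features sampled
    along the cortex. For each position, compare the two adjacent blocks of
    `width` profiles with a Mahalanobis-type distance."""
    n, p = profiles.shape
    d = np.full(n, np.nan)
    for i in range(width, n - width):
        a, b = profiles[i - width:i], profiles[i:i + width]
        pooled = np.cov(np.vstack([a, b]).T) + 1e-6 * np.eye(p)  # regularized
        diff = a.mean(axis=0) - b.mean(axis=0)
        d[i] = float(diff @ np.linalg.solve(pooled, diff))
    return d  # peaks indicate candidate area borders
```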
Real and Artificial Differential Item Functioning
ERIC Educational Resources Information Center
Andrich, David; Hagquist, Curt
2012-01-01
The literature in modern test theory on procedures for identifying items with differential item functioning (DIF) among two groups of persons includes the Mantel-Haenszel (MH) procedure. Generally, it is not recognized explicitly that if there is real DIF in some items which favor one group, then as an artifact of this procedure, artificial DIF…
Nakajima, Hisato; Yano, Kouya; Nagasawa, Kaoko; Kobayashi, Eiji; Yokota, Kuninobu
2015-01-01
On the basis of Diagnosis Procedure Combination (DPC) survey data, the factors that increase the value of function evaluation coefficient II were considered. A total of 1,505 hospitals were divided into groups I, II, and III, and the following items were considered: 1. significant differences in function evaluation coefficient II and DPC survey data; 2. examination using the Mahalanobis-Taguchi (MT) method; 3. correlation between function evaluation coefficient II and each DPC survey data item. 1. Function evaluation coefficient II was highest in group II. Group I hospitals showed the highest bed capacity and the highest numbers of hospitalization days, operations, chemotherapies, radiotherapies, and general anesthesia procedures. 2. Using the MT method, we found that the number of ambulance conveyances was an effective factor in group I hospitals, the number of general anesthesia procedures was an effective factor in group II hospitals, and the bed capacity was an effective factor in group III hospitals. 3. In group I hospitals, function evaluation coefficient II significantly correlated with the numbers of ambulance conveyances and chemotherapies. In group II hospitals, function evaluation coefficient II significantly correlated with bed capacity, the numbers of ambulance conveyances, hospitalization days, operations, and general anesthesia procedures, and mean hospitalization days. In group III hospitals, function evaluation coefficient II significantly correlated with all items. The factors that improve the value of function evaluation coefficient II were increases in the numbers of ambulance conveyances, chemotherapies, and radiotherapies in group I hospitals; increases in the numbers of hospitalization days, operations, ambulance conveyances, and general anesthesia procedures in group II hospitals; and increases in the numbers of hospitalization days, operations, and ambulance conveyances in group III hospitals. These results indicate how the profit of a hospital can be increased, which can lead to medical services of good quality.
Children's self reported discomforts as participants in clinical research.
Staphorst, Mira S; Hunfeld, Joke A M; van de Vathorst, Suzanne; Passchier, Jan; van Goudoever, Johannes B
2015-10-01
There is little empirical evidence on children's subjective experiences of discomfort during clinical research procedures. Therefore, Institutional Review Boards have limited empirical information to guide their decision-making on discomforts for children in clinical research. To get more insight into children's discomforts during clinical research procedures, we interviewed a group of children on this topic and also asked for suggestions to reduce possible discomforts. Forty-six children (aged 6-18) participating in clinical research studies (including needle-related procedures, food provocation tests, MRI scans, pulmonary function tests, questionnaires) were interviewed about their experiences during the research procedures. Thematic analysis was used to analyze the interviews. The discomforts of the interviewed children could be divided into two main groups: physical and mental discomforts. The majority experienced physical discomforts during the research procedures: pain, shortness of breath, nausea, itchiness, and feeling hungry, which were often caused by needle procedures, some pulmonary procedures, and food provocation tests. Mental discomforts included anxiousness because of anticipated pain and not knowing what to expect from a research procedure, boredom and tiredness during lengthy research procedures and waiting, and embarrassment during Tanner staging. Children's suggestions to reduce the discomforts of the research procedures were providing distraction (e.g. watching a movie or listening to music), providing age-appropriate information, and shortening the duration of lengthy procedures. Our study shows that children can experience various discomforts during research procedures, and it provides information about how these discomforts can be reduced according to them. Further research with larger samples is needed to quantify how many children experience the discomforts mentioned here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report was prepared at the request of the Lawrence Livermore Laboratory (LLL) to provide background information for analyzing soil-structure interaction by the frequency-independent impedance function approach. LLL is conducting such analyses as part of its seismic review of selected operating plants under the Systematic Evaluation Program for the US Nuclear Regulatory Commission. The analytical background and basic assumptions of the impedance function theory are briefly reviewed, and the role of radiation damping in soil-structure interaction analysis is discussed. The validity of modeling soil-structure interaction by using frequency-independent functions is evaluated based on data from several field tests. Finally, the recommended procedures for performing soil-structure interaction analyses are discussed with emphasis on the modal superposition method.
Pointwise influence matrices for functional-response regression.
Reiss, Philip T; Huang, Lei; Wu, Pei-Shien; Chen, Huaihou; Colcombe, Stan
2017-12-01
We extend the notion of an influence or hat matrix to regression with functional responses and scalar predictors. For responses depending linearly on a set of predictors, our definition is shown to reduce to the conventional influence matrix for linear models. The pointwise degrees of freedom, the trace of the pointwise influence matrix, are shown to have an adaptivity property that motivates a two-step bivariate smoother for modeling nonlinear dependence on a single predictor. This procedure adapts to varying complexity of the nonlinear model at different locations along the function, and thereby achieves better performance than competing tensor product smoothers in an analysis of the development of white matter microstructure in the brain.
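The pointwise degrees of freedom can be made concrete with ridge-type smoothing standing in for the paper's penalty. A minimal sketch, assuming scalar predictors in a design matrix X and one penalty value per grid point of the functional response:

```python
import numpy as np

def pointwise_degrees_of_freedom(X, lam):
    """At each grid point t, the pointwise influence (hat) matrix is
    H(t) = X (X'X + lam[t] I)^(-1) X', and the pointwise degrees of
    freedom is trace(H(t)); larger penalties give smaller df."""
    n, p = X.shape
    dfs = np.empty(len(lam))
    for t, penalty in enumerate(lam):
        H = X @ np.linalg.solve(X.T @ X + penalty * np.eye(p), X.T)
        dfs[t] = np.trace(H)
    return dfs
```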
Analysis of Nuclear Lamina Proteins in Myoblast Differentiation by Functional Complementation.
Tapia, Olga; Gerace, Larry
2016-01-01
We describe straightforward methodology for structure-function mapping of nuclear lamina proteins in myoblast differentiation, using populations of C2C12 myoblasts in which the endogenous lamina components are replaced with ectopically expressed mutant versions of the proteins. The procedure involves bulk isolation of C2C12 cell populations expressing the ectopic proteins by lentiviral transduction, followed by depletion of the endogenous proteins using siRNA, and incubation of cells under myoblast differentiation conditions. Similar methodology may be applied to mouse embryo fibroblasts or to other cell types as well, for the identification and characterization of sequences of lamina proteins involved in functions that can be measured biochemically or cytologically.
A Dynamic Simulation of Musculoskeletal Function in the Mouse Hindlimb During Trotting Locomotion
Charles, James P.; Cappellari, Ornella; Hutchinson, John R.
2018-01-01
Mice are often used as animal models of various human neuromuscular diseases, and analysis of these models often requires detailed gait analysis. However, little is known of the dynamics of the mouse musculoskeletal system during locomotion. In this study, we used computer optimization procedures to create a simulation of trotting in a mouse, using a previously developed mouse hindlimb musculoskeletal model in conjunction with new experimental data, allowing muscle forces, activation patterns, and levels of mechanical work to be estimated. Analyzing musculotendon unit (MTU) mechanical work throughout the stride allowed a deeper understanding of their respective functions, with the rectus femoris MTU dominating the generation of positive and negative mechanical work during the swing and stance phases. This analysis also tested previous functional inferences of the mouse hindlimb made from anatomical data alone, such as the existence of a proximo-distal gradient of muscle function, thought to reflect adaptations for energy-efficient locomotion. The results do not strongly support the presence of this gradient within the mouse musculoskeletal system, particularly given relatively high negative net work output from the ankle plantarflexor MTUs, although more detailed simulations could test this further. This modeling analysis lays a foundation for future studies of the control of vertebrate movement through the development of neuromechanical simulations. PMID:29868576
A far-field radio-frequency experimental exposure system with unrestrained mice.
Hansen, Jared W; Asif, Sajid; Singelmann, Lauren; Khan, Muhammad Saeed; Ghosh, Sumit; Gustad, Tom; Doetkott, Curt; Braaten, Benjamin D; Ewert, Daniel L
2015-01-01
Many studies have been performed on exploring the effects of radio-frequency (RF) energy on biological function in vivo. In particular, gene expression results have been inconclusive due, in part, to a lack of a standardized experimental procedure. This research describes a new far field RF exposure system for unrestrained murine models that reduces experimental error. The experimental procedure includes the materials used, the creation of a patch antenna, the uncertainty analysis of the equipment, characterization of the test room, experimental equipment used and setup, power density and specific absorption rate experiment, and discussion. The result of this research is an experimental exposure system to be applied to future biological studies.
NASA Technical Reports Server (NTRS)
Rinehart, Maegan L.
2011-01-01
The purpose of this activity is to provide the Mechanical Components Test Facility (MCTF) with the capability to obtain electronic leak test and proof pressure data. Payload and Components Real-time Automated Test System (PACRATS) data acquisition software will be utilized to display real-time data. It will record leak rates and pressure/vacuum level(s) simultaneously. This added functionality will provide electronic leak test and pressure data at specified sampling frequencies. Electronically stored data will provide ES61 with increased data security, analysis capability, and accuracy. The tasks performed in this procedure verify PACRATS only and are not intended to provide verification of MCTF equipment.
Portella, Claudio Elidio; Silva, Julio Guilherme; Bastos, Victor Hugo; Machado, Dionis; Cunha, Marlo; Cagy, Maurício; Basile, Luis; Piedade, Roberto; Ribeiro, Pedro
2006-06-01
The objective of the present study was to evaluate attentional, motor and electroencephalographic (EEG) parameters during a procedural task after subjects had ingested 6 mg of bromazepam. The sample consisted of 26 healthy subjects, male or female, between 19 and 36 years of age. The control (placebo) and experimental (bromazepam 6 mg) groups were submitted to a typewriting task in a randomized, double-blind design. The findings did not show significant differences in attentional and motor measures between groups. Coherence measures (qEEG) were evaluated between scalp regions in the theta, alpha and beta bands. A first analysis (two-way ANOVA: condition versus blocks) revealed a main effect of condition. A second two-way ANOVA (condition versus scalp regions) showed a main effect of both factors. The coherence measure was not a sensitive tool for demonstrating differences between cortical areas as a function of procedural learning.
[Long-term results of the surgical treatment of chronic pancreatitis].
Padillo Ruiz, F J; Rufián, S; Varo, E; Solorzano, G; Miño, G; Pera Madrazo, C
1994-08-01
We analyzed the long-term results after surgical treatment in 41 patients with chronic pancreatitis. Twenty-one of them underwent resection: pancreaticoduodenectomy in 19 (11 Whipple procedures and 8 Traverso-Longmire procedures), total pancreatectomy in 1, and near-total pancreatectomy in 1. In the remaining 20 patients a drainage procedure was carried out: Puestow-Duval (5); Partington (7); double derivation, pancreatic and biliary (5); triple derivation, pancreatic, biliary and gastric (2); and Nardi procedure plus cystoduodenostomy in one patient. The following were evaluated: persistent pain; chronic alcoholism; nutritional status; exocrine function (symptomatic steatorrhea, use of pancreatic enzyme preparations and fecal determination of carbohydrates, proteins and lipids); and endocrine function (glucose and insulin levels and oral glucose tolerance test). Surgery failed to relieve pain in 15.6% of the patients; failures were associated with chronic alcoholism (p < 0.05); 18 patients (44%) required oral pancreatic enzymes. There were no significant differences between resection and drainage procedures regarding exocrine function. However, endocrine function was significantly worse (p < 0.05) after pancreaticoduodenectomy than after drainage procedures. Among the latter, endocrine function was better after the Partington operation than after the Puestow-Duval procedure.
Python package for model STructure ANalysis (pySTAN)
NASA Astrophysics Data System (ADS)
Van Hoey, Stijn; van der Kwast, Johannes; Nopens, Ingmar; Seuntjens, Piet
2013-04-01
The selection and identification of a suitable hydrological model structure is more than fitting parameters of a model structure to reproduce a measured hydrograph. The procedure is highly dependent on various criteria, i.e. the modelling objective, the characteristics and the scale of the system under investigation, as well as the available data. Rigorous analysis of the candidate model structures is needed to support and objectify the selection of the most appropriate structure for a specific case (or eventually justify the use of a proposed ensemble of structures). This holds both when choosing between a limited set of different structures and in the framework of flexible model structures with interchangeable components. Many different methods to evaluate and analyse model structures exist. This leads to a sprawl of available methods, all characterized by different assumptions, changing conditions of application and various code implementations. Methods typically focus on optimization, sensitivity analysis or uncertainty analysis, with backgrounds from optimization, machine learning or statistics, amongst others. These methods also need an evaluation metric (objective function) to compare the model outcome with observed data. However, for current methods described in the literature, implementations are not always transparent and reproducible (if available at all). No standard procedures exist to share code, and the popularity (and number of applications) of a method is sometimes more dependent on its availability than on its merits. Moreover, new implementations of existing methods are difficult to verify, and the different theoretical backgrounds make it difficult for environmental scientists to decide on the usefulness of a specific method. A common and open framework with a large set of methods can support users in deciding about the most appropriate method. Hence, it enables users to simultaneously apply and compare different methods on a fair basis. We developed and present pySTAN (python framework for STructure ANalysis), a python package containing a set of functions for the evaluation and analysis of (hydrological) model structures. A selected set of algorithms for optimization, uncertainty and sensitivity analysis is currently available, together with a set of evaluation (objective) functions and input distributions to sample from. The methods are implemented in a model-independent way, and the python language provides the wrapper functions to administer external model codes. Different objective functions can be considered simultaneously, with both statistical metrics and more hydrology-specific metrics. By using reStructuredText (the sphinx documentation generator) and Python documentation strings (docstrings), the generation of manual pages is semi-automated, and a specific environment is available to enhance both the readability and transparency of the code. This enables a larger group of users to apply and compare these methods and to extend the functionalities.
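A representative ingredient of such a framework is the evaluation (objective) function. As an illustration only (not the pySTAN API, which is not documented here), the widely used Nash-Sutcliffe efficiency can be written as:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 for a perfect fit, 0 when the model is
    no better than the mean of the observations, negative when worse."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)
```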
18 CFR 358.8 - Implementation requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... marketing functions. (b) Compliance measures and written procedures. (1) A transmission provider must... procedures referred to in § 358.7(d) to all its transmission function employees, marketing function employees... its Internet Web site. (d) Books and records. A transmission provider must maintain its books of...
Sun, Duanchen; Liu, Yinliang; Zhang, Xiang-Sun; Wu, Ling-Yun
2017-09-21
High-throughput experimental techniques have improved dramatically and have been widely applied in the past decades. However, biological interpretation of high-throughput experimental results, such as differential expression gene sets derived from microarray or RNA-seq experiments, is still a challenging task. Gene Ontology (GO) is commonly used in functional enrichment studies. The GO terms identified via current functional enrichment analysis tools often contain direct parent or descendant terms in the GO hierarchical structure. Highly redundant terms make it difficult for users to analyze the underlying biological processes. In this paper, a novel network-based probabilistic generative model, NetGen, was proposed to perform functional enrichment analysis. An additional protein-protein interaction (PPI) network was explicitly used to assist the identification of significantly enriched GO terms. NetGen achieved superior performance to the existing methods in the simulation studies. The effectiveness of NetGen was explored further on four real datasets. Notably, several GO terms that were not directly linked with the active gene list for each disease were identified. These terms were closely related to the corresponding diseases when checked against the curated literature. NetGen has been implemented in the R package CopTea, publicly available at GitHub ( http://github.com/wulingyun/CopTea/ ). Our procedure leads to a more reasonable and interpretable result of the functional enrichment analysis. As a novel term combination-based functional enrichment analysis method, NetGen is complementary to current individual term-based methods and can help to explore the underlying pathogenesis of complex diseases.
78 FR 21074 - Airworthiness Directives; Bombardier, Inc. Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-09
... the maintenance requirements manual (MRM) by incorporating procedures for repetitive functional tests... the new tests, removing of the existing procedures for the repetitive functional tests from the MRM...
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
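The Chapter 4 control idea, regressing target fuel temperature on thermocouple readings and simulation results, can be sketched with ordinary least squares. All names below are illustrative; the report's actual regressors and fitting details are not reproduced here.

```python
import numpy as np

def fit_control_regression(tc_readings, sim_temp, target_temp):
    """Fit target_temp ~ intercept + thermocouple readings + simulated
    temperature; the fitted relationship can then predict the unmeasured
    target temperature from new readings during a test."""
    A = np.column_stack([np.ones(len(target_temp)), tc_readings, sim_temp])
    coef, *_ = np.linalg.lstsq(A, target_temp, rcond=None)

    def predict(tc_new, sim_new):
        A_new = np.column_stack([np.ones(len(sim_new)), tc_new, sim_new])
        return A_new @ coef

    return coef, predict
```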
HUNT: launch of a full-length cDNA database from the Helix Research Institute.
Yudate, H T; Suwa, M; Irie, R; Matsui, H; Nishikawa, T; Nakamura, Y; Yamaguchi, D; Peng, Z Z; Yamamoto, T; Nagai, K; Hayashi, K; Otsuki, T; Sugiyama, T; Ota, T; Suzuki, Y; Sugano, S; Isogai, T; Masuho, Y
2001-01-01
The Helix Research Institute (HRI) in Japan is releasing 4356 HUman Novel Transcripts and related information in the newly established HUNT database. The institute is a joint research project principally funded by the Japanese Ministry of International Trade and Industry, and the clones were sequenced in the governmental New Energy and Industrial Technology Development Organization (NEDO) Human cDNA Sequencing Project. The HUNT database contains an extensive amount of annotation from advanced analysis and represents an essential bioinformatics contribution towards understanding of the gene function. The HRI human cDNA clones were obtained from full-length enriched cDNA libraries constructed with the oligo-capping method and have resulted in novel full-length cDNA sequences. A large fraction has little similarity to any proteins of known function and to obtain clues about possible function we have developed original analysis procedures. Any putative function deduced here can be validated or refuted by complementary analysis results. The user can also extract information from specific categories like PROSITE patterns, PFAM domains, PSORT localization, transmembrane helices and clones with GENIUS structure assignments. The HUNT database can be accessed at http://www.hri.co.jp/HUNT.
Kinetic model of turbulence in an incompressible fluid
NASA Technical Reports Server (NTRS)
Tchen, C. M.
1978-01-01
A statistical description of turbulence in an incompressible fluid obeying the Navier-Stokes equations is proposed, where pressure is regarded as a potential for the interaction between fluid elements. A scaling procedure divides a fluctuation into three ranks representing the three transport processes of macroscopic evolution, transport property, and relaxation. Closure is obtained by relaxation, and a kinetic equation is obtained for the fluctuation of the macroscopic rank of the distribution function. The solution gives the transfer function and eddy viscosity. When applied to the inertia subrange of the energy spectrum the analysis recovers the Kolmogorov law and its numerical coefficient.
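For reference, the inertial-subrange result recovered by the analysis is the Kolmogorov spectrum, written here in its standard form (the paper's own notation and the value of its coefficient are not reproduced):

```latex
E(k) = C_K \, \varepsilon^{2/3} \, k^{-5/3}
```

where \varepsilon is the energy dissipation rate, k the wavenumber, and C_K the Kolmogorov constant (empirically about 1.5).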
NASA Technical Reports Server (NTRS)
Atli, K. C.; Karaman, I.; Noebe, R. D.; Maier, H. J.
2010-01-01
We compare the effectiveness of a conventional thermomechanical training procedure and severe plastic deformation via equal channel angular extrusion to achieve improved functional stability in a Ti50.5Ni24.5Pd25 high-temperature shape memory alloy. Thermomechanical testing indicates that both methods result in enhanced shape memory characteristics, such as reduced irrecoverable strain and thermal hysteresis. The mechanisms responsible for the improvements are discussed in light of microstructural findings from transmission electron microscopy.
LANDSAT-D investigations in snow hydrology
NASA Technical Reports Server (NTRS)
Dozier, J.
1983-01-01
The atmospheric radiative transfer calculation program (ATRAD) and its supporting programs (setting up atmospheric profiles, making Mie tables, and an exponential-sum-fitting table) were completed. More sophisticated treatment of aerosol scattering (including the angular phase function or asymmetry factor) and multichannel analysis of results from ATRAD are being developed. Some progress was made on a Monte Carlo program for examining two-dimensional effects, specifically a surface boundary condition that varies across a scene. The MONTE program combines ATRAD and the Monte Carlo method to produce an atmospheric point spread function. Currently the procedure passes monochromatic tests and the results are reasonable.
Simple cloning strategy using GFPuv gene as positive/negative indicator.
Miura, Hiromi; Inoko, Hidetoshi; Inoue, Ituro; Tanaka, Masafumi; Sato, Masahiro; Ohtsuka, Masato
2011-09-15
Because construction of expression vectors is the first requisite in the functional analysis of genes, development of simple cloning systems is a major requirement during the postgenomic era. In the current study, we developed cloning vectors for gain- or loss-of-function studies by using the GFPuv gene as a positive/negative indicator of cloning. These vectors allow us to easily detect correct clones and obtain expression vectors from a simple procedure by means of the combined use of the GFPuv gene and a type IIS restriction enzyme.
The standard operating procedure of the DOE-JGI Microbial Genome Annotation Pipeline (MGAP v.4).
Huntemann, Marcel; Ivanova, Natalia N; Mavromatis, Konstantinos; Tripp, H James; Paez-Espino, David; Palaniappan, Krishnaveni; Szeto, Ernest; Pillay, Manoj; Chen, I-Min A; Pati, Amrita; Nielsen, Torben; Markowitz, Victor M; Kyrpides, Nikos C
2015-01-01
The DOE-JGI Microbial Genome Annotation Pipeline performs structural and functional annotation of microbial genomes that are further included into the Integrated Microbial Genomes (IMG) comparative analysis system. MGAP is applied to assembled nucleotide sequence datasets that are provided via the IMG submission site. Dataset submission for annotation first requires project and associated metadata description in GOLD. The MGAP sequence data processing consists of feature prediction, including identification of protein-coding genes, non-coding RNAs and regulatory RNA features, as well as CRISPR elements. Structural annotation is followed by assignment of protein product names and functions.
Recent development in modeling and analysis of functionally graded materials and structures
NASA Astrophysics Data System (ADS)
Gupta, Ankit; Talha, Mohammad
2015-11-01
In this article, an extensive review of the structural response of functionally graded materials (FGMs) and structures is presented. These are high-technology materials developed by a group of scientists in Japan in the late 1980s. The emphasis here is on presenting the structural characteristics of FGM plates/shells under thermo-electro-mechanical loadings and various boundary and environmental conditions. This paper also provides an overview of different fabrication procedures and of the future research directions required to implement these materials appropriately in design and analysis. The expected outcome of the present review can be treated as a milestone for future studies in the area of high-technology materials and structures, and it should be advantageous for the researchers, scientists, and designers working in this field.
The standard operating procedure of the DOE-JGI Metagenome Annotation Pipeline (MAP v.4)
Huntemann, Marcel; Ivanova, Natalia N.; Mavromatis, Konstantinos; ...
2016-02-24
The DOE-JGI Metagenome Annotation Pipeline (MAP v.4) performs structural and functional annotation for metagenomic sequences that are submitted to the Integrated Microbial Genomes with Microbiomes (IMG/M) system for comparative analysis. The pipeline runs on nucleotide sequences provided via the IMG submission site. Users must first define their analysis projects in GOLD and then submit the associated sequence datasets consisting of scaffolds/contigs with optional coverage information and/or unassembled reads in fasta and fastq file formats. The MAP processing consists of feature prediction, including identification of protein-coding genes, non-coding RNAs and regulatory RNAs, as well as CRISPR elements. Structural annotation is followed by functional annotation, including assignment of protein product names and connection to various protein family databases.
Treatment of challenging behavior exhibited by children with prenatal drug exposure.
Kurtz, Patricia F; Chin, Michelle D; Rush, Karena S; Dixon, Dennis R
2008-01-01
A large body of literature exists describing the harmful effects of prenatal drug exposure on infant and child development. However, there is a paucity of research examining strategies to ameliorate sequelae such as externalizing behavior problems. In the present study, functional analysis procedures were used to assess challenging behavior exhibited by two children who were prenatally exposed to drugs of abuse. Results for both children indicated that challenging behavior was maintained by access to positive reinforcement (adult attention and tangible items). For one child, challenging behavior was also maintained by negative reinforcement (escape from activities of daily living). Function-based interventions were effective in reducing challenging behavior for both children. Implications for utilizing methods of applied behavior analysis in research with children with prenatal drug exposure are discussed.
... of functional rhinoplasty, procedures to reshape the nostrils (ex: a “z-plasty”) or the use of sutures ... The procedure may also straighten the nose, repair post-traumatic or congenital deformities, and improve the appearance. ...
Semen analysis: a new manual and its application to the understanding of semen and its pathology
Jequier, Anne M.
2010-01-01
This article reviews the latest edition of the World Health Organization's manual on semen analysis, a comprehensive instructional guide. The methodology used in the assessment of the usual variables in semen analysis is described, as are many of the less common, but very valuable, sperm function tests. Seminal fluid preparation techniques for procedures such as in vitro fertilization and intrauterine insemination are also outlined in the manual. In addition, it details many useful techniques for the assessment of seminal fluid. It will be a very useful manual for any laboratory that carries out analyses of seminal fluid. PMID:20111075
The multicategory case of the sequential Bayesian pixel selection and estimation procedure
NASA Technical Reports Server (NTRS)
Pore, M. D.; Dennis, T. B. (Principal Investigator)
1980-01-01
A Bayesian technique for stratified proportion estimation and a sampling procedure based on minimizing the mean squared error of this estimator were developed and tested on LANDSAT multispectral scanner data, using the beta density function to model the prior distribution in the two-class case. An extension of this procedure to the k-class case is considered. A generalization of the beta function is shown to be a density function for the general case, which allows the procedure to be extended.
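In the two-class case, the beta prior is conjugate to the binomial likelihood, so the proportion update has a closed form. A minimal sketch of that update (the k-class generalization replaces the beta with its Dirichlet-type extension):

```python
def beta_posterior(alpha, beta, successes, trials):
    """Beta(alpha, beta) prior on a class proportion plus a binomial
    sample of `successes` out of `trials` gives a
    Beta(alpha + successes, beta + trials - successes) posterior."""
    a = alpha + successes
    b = beta + trials - successes
    posterior_mean = a / (a + b)
    return a, b, posterior_mean
```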
10 CFR 1023.3 - Principles of general applicability.
Code of Federal Regulations, 2011 CFR
2011-01-01
... functions or procedures, including ADR. (3) Decisions of the Board shall be final agency decisions and shall... (ADR) Functions. (1) Board judges and personnel shall perform ADR related functions impartially, with... judges limited to the nature, procedures, and availability of ADR through the Board are permitted and...
Modeling procedures for handling qualities evaluation of flexible aircraft
NASA Technical Reports Server (NTRS)
Govindaraj, K. S.; Eulrich, B. J.; Chalk, C. R.
1981-01-01
This paper presents simplified modeling procedures to evaluate the impact of flexible modes and the unsteady aerodynamic effects on the handling qualities of Supersonic Cruise Aircraft (SCR). The modeling procedures involve obtaining reduced order transfer function models of SCR vehicles, including the important flexible mode responses and unsteady aerodynamic effects, and conversion of the transfer function models to time domain equations for use in simulations. The use of the modeling procedures is illustrated by a simple example.
NASA Astrophysics Data System (ADS)
Zhengyong, R.; Jingtian, T.; Changsheng, L.; Xiao, X.
2007-12-01
Although adaptive finite-element (AFE) analysis is receiving increasing attention in scientific and engineering fields, its efficient implementation remains an open problem because of its complex procedures. In this paper, we propose a clear C++ framework implementation to show the powerful properties of the object-oriented philosophy (OOP) in designing such complex adaptive procedures. Using the modular facilities of an OOP language, the whole adaptive system is divided into several separate parts, such as mesh generation or refinement, the a posteriori error estimator, the adaptive strategy, and the final post-processing. After each of these separate modules is designed, a connected framework for the adaptive procedure is formed. Based on the general elliptic differential equation, little additional effort is needed within the adaptive framework to carry out practical simulations. To show the preferable properties of OOP adaptive design, two numerical examples are tested. The first is a 3D direct-current resistivity problem, in which the power of the framework is demonstrated, as only small additions are required. In the second, an induced polarization (IP) exploration case, a new adaptive procedure is easily added, which demonstrates the strong extensibility and reusability of the OOP approach. Finally, we believe that, based on this modular framework implementation of adaptive analysis using OOP methodology, more advanced adaptive analysis systems will become available in the future.
Rotation Capacity of Bolted Flush End-Plate Stiffened Beam-to-Column Connection
NASA Astrophysics Data System (ADS)
Ostrowski, Krzysztof; Kozłowski, Aleksander
2017-06-01
One of the flexibility parameters of semi-rigid joints is rotation capacity. Plastic rotation capacity is especially important in the plastic design of framed structures. Current design codes, including Eurocode 3, do not possess procedures enabling designers to obtain the value of rotation capacity. In this paper a calculation procedure for the rotation capacity of stiffened bolted flush end-plate beam-to-column connections is proposed. The theory of experimental design was applied, with the use of Hartley's PS/DS-P:Ha3 plan. The analysis was performed with the finite element method (ANSYS), based on the numerical experiment plan. The determination of the maximal rotation angle was carried out with the use of regression analysis. The main variables analyzed in the parametric study were: the pitch of the bolts "w" (120-180 mm), the distance between the bolt axis and the beam upper edge cg1 (50-90 mm), and the thickness of the end-plate tp (10-20 mm). A power function was proposed to describe the available rotation capacity of the joint. The influence of the particular components on the rotation capacity was also investigated. In the paper a general procedure for the determination of rotation capacity is proposed.
Neville, David C A; Coquard, Virginie; Priestman, David A; te Vruchte, Danielle J M; Sillence, Daniel J; Dwek, Raymond A; Platt, Frances M; Butters, Terry D
2004-08-15
Interest in cellular glycosphingolipid (GSL) function has necessitated the development of a rapid and sensitive method to both analyze and characterize the full complement of structures present in various cells and tissues. An optimized method to characterize oligosaccharides released from glycosphingolipids following ceramide glycanase digestion has been developed. The procedure uses the fluorescent compound anthranilic acid (2-aminobenzoic acid; 2-AA) to label oligosaccharides prior to analysis using normal-phase high-performance liquid chromatography. The labeling procedure is rapid, selective, and easy to perform and is based on the published method of Anumula and Dhume [Glycobiology 8 (1998) 685], originally used to analyze N-linked oligosaccharides. It is less time consuming than a previously published 2-aminobenzamide labeling method [Anal. Biochem. 298 (2001) 207] for analyzing GSL-derived oligosaccharides, as the fluorescent labeling is performed on the enzyme reaction mixture. The purification of 2-AA-labeled products has been improved to ensure recovery of oligosaccharides containing one to four monosaccharide units, which was not previously possible using the Anumula and Dhume post-derivatization purification procedure. This new approach may also be used to analyze both N- and O-linked oligosaccharides.
QUEST - A Bayesian adaptive psychometric method
NASA Technical Reports Server (NTRS)
Watson, A. B.; Pelli, D. G.
1983-01-01
An adaptive psychometric procedure that places each trial at the current most probable Bayesian estimate of threshold is described. The procedure takes advantage of the common finding that the human psychometric function is invariant in form when expressed as a function of log intensity. The procedure is simple, fast, and efficient, and may be easily implemented on any computer.
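A minimal sketch of the QUEST loop follows, assuming a gridded posterior over log threshold and a Weibull psychometric function with illustrative parameter values; the original procedure's exact parameterization is not reproduced here.

```python
import numpy as np

def quest_next_intensity(log_t_grid, posterior):
    """Place the next trial at the current most probable Bayesian
    estimate of threshold (the posterior mode)."""
    return log_t_grid[np.argmax(posterior)]

def quest_update(posterior, log_t_grid, x, correct, gamma=0.5, beta=3.5):
    """Bayes update: multiply the posterior by the likelihood of the
    observed response under a Weibull function of log intensity x."""
    p = 1.0 - (1.0 - gamma) * np.exp(-10.0 ** (beta * (x - log_t_grid)))
    posterior = posterior * (p if correct else 1.0 - p)
    return posterior / posterior.sum()
```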
ERIC Educational Resources Information Center
Tay, Louis; Vermunt, Jeroen K.; Wang, Chun
2013-01-01
We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…
Wong, Stephen; Hargreaves, Eric L; Baltuch, Gordon H; Jaggi, Jurg L; Danish, Shabbar F
2012-01-01
Microelectrode recording (MER) is necessary for precision localization of target structures such as the subthalamic nucleus during deep brain stimulation (DBS) surgery. Attempts to automate this process have produced quantitative temporal trends (feature activity vs. time) extracted from mobile MER data. Our goal was to evaluate computational methods of generating spatial profiles (feature activity vs. depth) from temporal trends that would decouple automated MER localization from the clinical procedure and enhance functional localization in DBS surgery. We evaluated two methods of interpolation (standard vs. kernel) that generated spatial profiles from temporal trends. We compared interpolated spatial profiles to true spatial profiles that were calculated with depth windows, using correlation coefficient analysis. Excellent approximation of true spatial profiles is achieved by interpolation. Kernel-interpolated spatial profiles produced superior correlation coefficient values at optimal kernel widths (r = 0.932-0.940) compared to standard interpolation (r = 0.891). The choice of kernel function and kernel width resulted in trade-offs in smoothing and resolution. Interpolation of feature activity to create spatial profiles from temporal trends is accurate and can standardize and facilitate MER functional localization of subcortical structures. The methods are computationally efficient, enhancing localization without imposing additional constraints on the MER clinical procedure during DBS surgery.
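Kernel interpolation of a temporal feature trend onto a depth grid amounts to Nadaraya-Watson weighting. A minimal sketch, with a Gaussian kernel and an illustrative width (the paper's kernel choices and optimal widths are not reproduced):

```python
import numpy as np

def kernel_spatial_profile(depths, feature, depth_grid, width=0.3):
    """Map feature activity recorded at electrode depths onto a regular
    depth grid; `width` (mm) trades smoothing against spatial resolution."""
    w = np.exp(-0.5 * ((depth_grid[:, None] - depths[None, :]) / width) ** 2)
    return (w @ feature) / w.sum(axis=1)
```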
Formal analysis and evaluation of the back-off procedure in IEEE802.11P VANET
NASA Astrophysics Data System (ADS)
Jin, Li; Zhang, Guoan; Zhu, Xiaojun
2017-07-01
The back-off procedure is one of the media access control technologies in the 802.11P communication protocol. It plays an important role in avoiding message collisions and allocating channel resources. Formal methods are effective approaches for studying the performance of communication systems. In this paper, we establish a discrete-time model for the back-off procedure. We use Markov decision processes (MDPs) to model the non-deterministic and probabilistic behaviors of the procedure, and use the probabilistic computation tree logic (PCTL) language to express different properties, which ensure that the discrete-time model performs its basic functionality. Based on the model and the PCTL specifications, we study the effect of the contention window length on the number of senders in the neighborhood of given receivers, and its effect on the station's expected cost required by the back-off procedure to successfully send packets. The variation of the window length may increase or decrease the maximum probability of correct transmissions within a time contention unit. We propose to use the PRISM model checker to describe our proposed back-off procedure for the IEEE802.11P protocol in vehicle networks, and define different probabilistic property formulas to automatically verify the model and derive numerical results. The obtained results are helpful for justifying the values of the time contention unit.
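The contention-window dynamics being modeled are those of binary exponential back-off. A minimal behavioral sketch (parameter values illustrative, not taken from the paper's model):

```python
import random

def backoff_slots(attempt, cw_min=15, cw_max=1023):
    """After the `attempt`-th failed transmission the contention window
    doubles (capped at cw_max) and the station waits a uniformly drawn
    number of slots before retrying."""
    cw = min((cw_min + 1) * 2 ** attempt - 1, cw_max)
    return random.randint(0, cw)
```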
NASA Astrophysics Data System (ADS)
Galassi, S.
2018-05-01
In this paper a mechanical model of masonry arches strengthened with fibre-reinforced composite materials, and the relevant numerical procedure for their analysis, are proposed. The arch is modelled as an assemblage of rigid blocks connected to one another, and to the supporting structures, by mortar joints. The presence of the reinforcement, usually a sheet placed at the intrados or the extrados, prevents the occurrence of cracks that could activate possible collapse mechanisms due to tensile failure of the mortar joints. Therefore, a reinforced arch generally fails in a different way from the URM arch. The proposed numerical procedure checks, as a function of an external incremental load, the inner stress state in the arch, in the reinforcement and in the adhesive layer. In so doing, it provides a prediction of failure modes. Results obtained from experimental tests, carried out on four in-scale models in a laboratory, have been compared with those provided by the numerical procedure, implemented in ArchiVAULT, a software package developed by the author. In this regard, the numerical procedure is an extension of previous works. Although additional experimental investigations are necessary, these initial results confirm that the proposed numerical procedure is promising.
Sass, Steffen; Pitea, Adriana; Unger, Kristian; Hess, Julia; Mueller, Nikola S.; Theis, Fabian J.
2015-01-01
MicroRNAs represent ~22 nt long endogenous small RNA molecules that have been experimentally shown to regulate gene expression post-transcriptionally. One main interest in miRNA research is the investigation of their functional roles, which can typically be accomplished by identification of mi-/mRNA interactions and functional annotation of target gene sets. We here present a novel method “miRlastic”, which infers miRNA-target interactions using transcriptomic data as well as prior knowledge and performs functional annotation of target genes by exploiting the local structure of the inferred network. For the network inference, we applied linear regression modeling with elastic net regularization on matched microRNA and messenger RNA expression profiling data to perform feature selection on prior knowledge from sequence-based target prediction resources. The novelty of miRlastic inference originates in predicting data-driven intra-transcriptome regulatory relationships through feature selection. With synthetic data, we showed that miRlastic outperformed commonly used methods and was suitable even for low sample sizes. To gain insight into the functional role of miRNAs and to determine joint functional properties of miRNA clusters, we introduced a local enrichment analysis procedure. The principle of this procedure lies in identifying regions of high functional similarity by evaluating the shortest paths between genes in the network. We can finally assign functional roles to the miRNAs by taking their regulatory relationships into account. We thoroughly evaluated miRlastic on a cohort of head and neck cancer (HNSCC) patients provided by The Cancer Genome Atlas. We inferred an mi-/mRNA regulatory network for human papilloma virus (HPV)-associated miRNAs in HNSCC. The resulting network best enriched for experimentally validated miRNA-target interaction, when compared to common methods. Finally, the local enrichment step identified two functional clusters of miRNAs that were predicted to mediate HPV-associated dysregulation in HNSCC. Our novel approach was able to characterize distinct pathway regulations from matched miRNA and mRNA data. An R package of miRlastic was made available through: http://icb.helmholtz-muenchen.de/mirlastic. PMID:26694379
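The regression core of such an approach can be sketched with scikit-learn's elastic net: for each gene, regress its expression on the expression of its sequence-predicted candidate miRNAs and keep the negative (repressive) coefficients. This is a sketch of the general strategy only, not the miRlastic package itself.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

def select_mirna_targets(mirna_expr, mrna_expr, candidate_mask):
    """mirna_expr: (samples, n_mirna); mrna_expr: (samples, n_gene);
    candidate_mask: (n_mirna, n_gene) boolean prior from sequence-based
    target prediction. Returns gene -> indices of selected miRNAs."""
    interactions = {}
    for g in range(mrna_expr.shape[1]):
        idx = np.flatnonzero(candidate_mask[:, g])   # candidate miRNAs for gene g
        if idx.size == 0:
            continue
        fit = ElasticNetCV(cv=5).fit(mirna_expr[:, idx], mrna_expr[:, g])
        hits = idx[fit.coef_ < 0]                    # repression implies a negative sign
        if hits.size:
            interactions[g] = hits
    return interactions
```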
NASA Astrophysics Data System (ADS)
Stark, Martin; Guckenberger, Reinhard; Stemmer, Andreas; Stark, Robert W.
2005-12-01
Dynamic atomic force microscopy (AFM) offers many opportunities for the characterization and manipulation of matter on the nanometer scale with high temporal resolution. The analysis of time-dependent forces is fundamental to a deeper understanding of phenomena such as friction, plastic deformation, and surface wetting. However, the dynamic characteristics of the force sensor used for such investigations are determined by various factors, such as the material and geometry of the cantilever, the detection alignment, and the transfer characteristics of the detector. Thus, for a quantitative investigation of surface properties by dynamic AFM, an appropriate system identification procedure is required that characterizes the force sensor beyond the usual parameters of spring constant, quality factor, and detection sensitivity. Measurement of the transfer function provides such a characterization, fully accounting for the dynamic properties of the force sensor. Here, we demonstrate the estimation of the transfer function in a bandwidth of 1 MHz from experimental data. To this end, we analyze the signal of the vibrations induced by snap-to-contact and snap-off-contact events. For the free cantilever, we determine both a parameter-free estimate [empirical transfer function estimate (ETFE)] and a parametric estimate of the transfer function. For the surface-coupled cantilever, the ETFE is obtained. These identification procedures provide an intrinsic calibration, as they largely dispense with a priori knowledge about the force sensor.
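For orientation only, here is a minimal sketch of an empirical transfer function estimate in the spirit described above: the ratio of the output spectrum to the input spectrum. The arrays `x` (excitation record) and `y` (deflection signal) are assumptions for illustration, not the authors' data or procedure; in practice the raw ETFE would be smoothed or averaged before any parametric fit.

```python
# Minimal empirical transfer function estimate (ETFE): ratio of output to input
# spectra, assuming x and y are equal-length records sampled at fs (hypothetical).
import numpy as np

def etfe(x, y, fs):
    """Raw ETFE H(f) = Y(f)/X(f) over frequencies up to the Nyquist limit."""
    X = np.fft.rfft(x)                            # input spectrum
    Y = np.fft.rfft(y)                            # output (deflection) spectrum
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)       # frequency axis in Hz
    return f, Y / X
```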
[Changing economic environment of hospitals: management challenges of the 1990s].
Rotstein, Z; Noy, S; Goldman, B; Shani, M
1990-12-16
The modern hospital is an organization influenced by the external environment in which it functions, and a major relevant area is the economic environment. In recent years, the western world has faced the challenge of rising health care costs and an increase in their proportion of the gross national product of most countries. Consequently, hospitals, as major providers of health care, are under pressure from governments and health insurance companies to cut costs and to "produce" more efficiently. Because hospitals worldwide are finding it hard and painful to function in this new environment, in which attitudes toward hospitals are changing, a managerial-economic crisis may be the next phase. How can the hospital adapt to these changes? First, by adopting the managerial attitudes and tools of the business sector. These include: the strategic planning process, hospital operative autonomy, creating medical-economic responsibility centers as departments, cost-accounting for medical procedures, and case-mix budgeting. Management information systems are necessary during the transition. The hospital information system should include functions at the operative level, such as outpatient visits, admissions and discharges of patients, and the clinical, diagnostic, and laboratory procedures related to the patient case-mix. The second level is a management information system that includes salaries of personnel, case-mix budgeting with variance analysis, prices of procedures, and epidemiological data. The authors believe that only a managerial approach combining medical and economic disciplines can meet the challenges of the changing modern economic environment.
Schwein, Adeline; Chinnadurai, Ponraj; Shah, Dipan J; Lumsden, Alan B; Bechara, Carlos F; Bismuth, Jean
2017-05-01
Three-dimensional image fusion of preoperative computed tomography (CT) angiography with fluoroscopy using intraoperative noncontrast cone-beam CT (CBCT) has been shown to improve endovascular procedures by reducing procedure length, radiation dose, and contrast media volume. However, patients with a contraindication to CT angiography (renal insufficiency, iodinated contrast allergy) may not benefit from this image fusion technique. The primary objective of this study was to evaluate the feasibility of magnetic resonance angiography (MRA) and fluoroscopy image fusion using noncontrast CBCT as a guidance tool during complex endovascular aortic procedures, especially in patients with renal insufficiency. All endovascular aortic procedures performed under MRA image fusion guidance at a single center were retrospectively reviewed. The patients had moderate to severe renal insufficiency and underwent diagnostic contrast-enhanced magnetic resonance imaging after gadolinium or ferumoxytol injection. Relevant vascular landmarks electronically marked in MRA images were overlaid on real-time two-dimensional fluoroscopy for image guidance, after image fusion with noncontrast intraoperative CBCT. Technical success, time for image registration, procedure time, fluoroscopy time, number of digital subtraction angiography (DSA) acquisitions before stent deployment or vessel catheterization, and renal function before and after the procedure were recorded. The image fusion accuracy was qualitatively evaluated on a binary scale by three physicians after review of image data showing virtual landmarks from MRA on fluoroscopy. Between November 2012 and March 2016, 10 patients underwent endovascular procedures for aortoiliac aneurysmal disease or aortic dissection using MRA image fusion guidance. All procedures were technically successful. A paired t-test analysis showed no difference between preoperative and postoperative renal function (P = .6). The mean time required for MRA-CBCT image fusion was 4:09 ± 01:31 min:sec. Total fluoroscopy time was 20.1 ± 6.9 minutes. Five of 10 patients (50%) underwent stent graft deployment without any predeployment DSA acquisition. Three of six vessels (50%) were cannulated under image fusion guidance without any precannulation DSA runs, and the remaining vessels were cannulated after one planning DSA acquisition. Qualitative evaluation showed that 14 of 22 virtual landmarks (63.6%) from MRA overlaid on fluoroscopy were completely accurate, without the need for adjustment. Five of eight incorrect virtual landmarks (iliac and visceral arteries) resulted from vessel deformation caused by endovascular devices. Ferumoxytol- or gadolinium-enhanced MRA and image fusion with fluoroscopy using noncontrast CBCT is feasible and allows patients with renal insufficiency to benefit from optimal guidance during complex endovascular aortic procedures, while preserving their residual renal function. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
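As a side note on the statistics reported above, the paired t-test the authors mention compares each patient's pre- and postoperative values directly. A minimal sketch with placeholder numbers (not the study's data):

```python
# Paired t-test on hypothetical pre-/postoperative eGFR values (placeholder data).
from scipy.stats import ttest_rel

pre = [38, 42, 51, 29, 44, 36, 47, 33, 40, 45]   # eGFR, mL/min/1.73 m^2 (invented)
post = [37, 43, 50, 30, 45, 35, 46, 34, 39, 46]
t_stat, p_value = ttest_rel(pre, post)            # paired: same patients measured twice
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
```

A paired test is appropriate here because the before/after measurements come from the same patients, so between-patient variability cancels out of the comparison.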
Dynamic Relaxational Behaviour of Hyperbranched Polyether Polyols
NASA Astrophysics Data System (ADS)
Navarro-Gorris, A.; Garcia-Bernabé, A.; Stiriba, S.-E.
2008-08-01
Hyperbranched polymers are highly branched, cascade-type polymers easily accessible via a one-pot procedure from ABm-type monomers. A key property of hyperbranched polymers is their molecular architecture, which allows the core-shell morphology to be manipulated for specific applications in materials and medical sciences. Since the discovery of hyperbranched polymer materials, an increasing number of reports have described synthetic procedures and technological applications of such materials, but their physical properties remained less studied until the last decade. In the present work, different esterified hyperbranched polyglycerols were prepared from polyglycerol precursors in the presence of acetic acid, yielding degrees of functionalization ranging from 0 to 94%. The thermal behavior of the obtained samples was studied by differential scanning calorimetry (DSC). Dielectric spectroscopy measurements were analyzed by combining loss-spectra deconvolution with the modulus formalism. All acetylated polyglycerols exhibited a main relaxation related to the glass transition (α process) and two sub-glassy relaxations (β and γ processes), which vanish at high degrees of functionalization.
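As background for the modulus formalism mentioned above (a standard definition, not taken from this paper): the complex electric modulus is the reciprocal of the complex permittivity ε*(ω) = ε'(ω) − iε''(ω), so that

```latex
M^{*}(\omega) = \frac{1}{\varepsilon^{*}(\omega)}
              = \frac{\varepsilon'(\omega)}{\varepsilon'(\omega)^{2} + \varepsilon''(\omega)^{2}}
              + i\,\frac{\varepsilon''(\omega)}{\varepsilon'(\omega)^{2} + \varepsilon''(\omega)^{2}}
              = M'(\omega) + i\,M''(\omega)
```

Working in M*(ω) rather than ε*(ω) suppresses conductivity and electrode-polarization contributions at low frequencies, which is why it is commonly combined with loss-spectra deconvolution.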
Nesbit, Steven C.; Van Hoof, Alexander G.; Le, Chi C.; Dearworth, James R.
2015-01-01
Few laboratory exercises have been developed using the crayfish as a model for teaching how neural processing is performed by sensory organs that detect light stimuli. This article describes the dissection procedures and methods for extracellular recording of light responses from both the optic nerve fibers found in the animal's eyestalk and the caudal photoreceptor located in the ventral nerve cord. Instructions for ADInstruments' data acquisition system are also provided for the collection and analysis of responses. The comparison provides students with a unique view of how spike activities measured from neurons code image-forming and non-image-forming processes. Results from the exercise show longer latency and a lower firing frequency in the caudal photoreceptor compared with optic nerve fibers, demonstrating evidence of different functions. After students learn the dissection, recording procedure, and functional anatomy, they can develop their own experiments to learn more about photoreceptive mechanisms and the sensory integration of modalities by these light-responsive interneurons. PMID:26557793
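Purely as an illustration of the latency and firing-rate measures compared above (performed here in Python on an exported trace, not within the ADInstruments software and not part of the published exercise), a threshold-crossing sketch; `trace`, `fs`, `stim_t`, and `threshold` are hypothetical inputs:

```python
# Threshold-crossing spike detection on a hypothetical extracellular trace.
# Assumes `trace` (volts) sampled at `fs` Hz, with light onset at t = stim_t seconds.
import numpy as np

def spike_metrics(trace, fs, stim_t, threshold):
    """Return response latency (s) and mean firing rate (Hz) after the stimulus."""
    above = trace > threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:])  # upward threshold crossings
    spike_times = crossings / fs
    post = spike_times[spike_times > stim_t]             # spikes after light onset
    if post.size == 0:
        return None, 0.0
    latency = post[0] - stim_t                           # time to first post-stimulus spike
    rate = post.size / (len(trace) / fs - stim_t)        # spikes per second post-stimulus
    return latency, rate
```

On such measures, the caudal photoreceptor's non-image-forming role would show up as a longer latency and lower rate than the optic nerve fibers, consistent with the results described above.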
Luczynski, Kevin C; Hanley, Gregory P; Rodriguez, Nicole M
2014-01-01
The preschool life skills (PLS) program (Hanley, Heal, Tiger, & Ingvarsson, 2007; Luczynski & Hanley, 2013) involves teaching social skills as a means of decreasing and preventing problem behavior. However, achieving durable outcomes as children transition across educational settings depends on the generalization and long-term maintenance of those skills. The purpose of this study was to evaluate procedures for promoting generalization and long-term maintenance of functional communication and self-control skills for 6 preschool children. When the children's social skills decreased across repeated observations during a generalization assessment, we incorporated modifications to the teaching procedures. However, the effects of the modifications were variable across skills and children. Satisfactory generalization was observed only after the teacher was informed of the target skills and teaching strategies. Maintenance of most social skills was observed 3 months after teaching was discontinued. We discuss the importance of improving child and teacher behavior to promote generalization and maintenance of important social skills. © Society for the Experimental Analysis of Behavior.